Science.gov

Sample records for addition verification studies

  1. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study, which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data for validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: ENVIROFUELS DIESEL FUEL CATALYZER FUEL ADDITIVE

    EPA Science Inventory

    EPA's Environmental Technology Verification Program has tested the EnviroFuels diesel fuel additive, called the Diesel Fuel Catalyzer. EnviroFuels has stated that heavy-duty on- and off-road diesel engines are the intended market for the catalyzer. Preliminary tests conducted indicate...

  3. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most industry-operated facilities are used for highly focused research, component development, and problem solving, and are not used for developing a generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure the coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamic loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  4. Environmental Technology Verification Report: Taconic Energy, Inc. TEA Fuel Additive

    EPA Science Inventory

    The Greenhouse Gas Technology Center (GHG Center) is one of six verification organizations operating under EPA’s ETV program. One sector of significant interest to GHG Center stakeholders is transportation - particularly technologies that result in fuel economy improvements. Taco...

  5. Needs Assessment Project: FY 82 Verification Study.

    ERIC Educational Resources Information Center

    Shively, Joe E.; O'Donnell, Phyllis

    As part of a continuing assessment of educational needs in a seven-state region, researchers conducted a verification study to check the validity of educational needs first identified in fiscal year (FY) 1980. The seven states comprise Alabama, Kentucky, Ohio, Pennsylvania, Tennessee, Virginia, and West Virginia. This report describes assessment…

  6. Expert system verification and validation study

    NASA Technical Reports Server (NTRS)

    French, Scott W.; Hamilton, David

    1992-01-01

    Five workshops on verification and validation (V&V) of expert systems (ES) were taught during this recent period of performance. Two key activities, previously performed under this contract, supported these recent workshops: (1) a survey of the state of the practice in V&V of ES, and (2) development of workshop material and the first class. The first activity involved performing an extensive survey of ES developers in order to answer several questions regarding the state of the practice in V&V of ES. These questions related to the amount and type of V&V done and the success of this V&V. The next key activity involved developing an intensive hands-on workshop in V&V of ES. This activity involved surveying a large number of V&V techniques, conventional as well as ES-specific ones. In addition to explaining the techniques, we showed how each technique could be applied on a sample problem. References were included in the workshop material, and cross-referenced to techniques, so that students would know where to go to find additional information about each technique. In addition to teaching specific techniques, we included an extensive amount of material on V&V concepts and how to develop a V&V plan for an ES project. We felt this material was necessary so that developers would be prepared to develop an orderly and structured approach to V&V. That is, they would have a process that supported the use of the specific techniques. Finally, to provide hands-on experience, we developed a set of case study exercises. These exercises were intended to provide an opportunity for the students to apply all the material (concepts, techniques, and planning material) to a realistic problem.

  7. Shuttle payload interface verification equipment study. Volume 2: Technical document, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The technical analysis performed during the shuttle payload interface verification equipment study is reported. It describes: (1) the background and intent of the study; (2) the study approach and philosophy, covering all facets of shuttle payload/cargo integration; (3) shuttle payload integration requirements; (4) preliminary design of the horizontal IVE; (5) the vertical IVE concept; and (6) IVE program development plans, schedule, and cost. Also included is a payload integration analysis task to identify potential uses in addition to payload interface verification.

  8. Verification study of an emerging fire suppression system

    DOE PAGES

    Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; Gubernatis, David C.

    2016-01-01

    Self-contained fire extinguishers are a robust, reliable and minimally invasive means of fire suppression for gloveboxes. Moreover, plutonium gloveboxes present harsh environmental conditions for polymer materials; these include radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers in plutonium gloveboxes; before deployment in a nuclear facility, verification tests must be performed. These tests include activation and mass loss calorimeter tests. In addition, compatibility issues with chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal, such as metal casting and pyro-chemistry operations.

  9. Verification study of an emerging fire suppression system

    SciTech Connect

    Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; Gubernatis, David C.

    2016-01-01

    Self-contained fire extinguishers are a robust, reliable and minimally invasive means of fire suppression for gloveboxes. Moreover, plutonium gloveboxes present harsh environmental conditions for polymer materials; these include radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers in plutonium gloveboxes; before deployment in a nuclear facility, verification tests must be performed. These tests include activation and mass loss calorimeter tests. In addition, compatibility issues with chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal, such as metal casting and pyro-chemistry operations.

  10. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    SciTech Connect

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  11. Mental Arithmetic in Children with Mathematics Learning Disabilities: The Adaptive Use of Approximate Calculation in an Addition Verification Task

    ERIC Educational Resources Information Center

    Rousselle, Laurence; Noël, Marie-Pascale

    2008-01-01

    The adaptive use of approximate calculation was examined using a verification task with 18 third graders with mathematics learning disabilities, 22 typically achieving third graders, and 21 typically achieving second graders. Participants were asked to make true-false decisions on simple and complex addition problems while the distance between the…

  12. Advanced NSTS propulsion system verification study

    NASA Technical Reports Server (NTRS)

    Wood, Charles

    1989-01-01

    The merits of propulsion system development testing are discussed. The existing data base of technical reports and specialists is utilized in this investigation. The study encompassed a review of all available test reports of propulsion system development testing for the Saturn stages, the Titan stages, and the Space Shuttle main propulsion system. The knowledge on propulsion system development and system testing available from specialists and managers was also 'tapped' for inclusion.

  13. Mental arithmetic in children with mathematics learning disabilities: the adaptive use of approximate calculation in an addition verification task.

    PubMed

    Rousselle, Laurence; Noël, Marie-Pascale

    2008-01-01

    The adaptive use of approximate calculation was examined using a verification task with 18 third graders with mathematics learning disabilities, 22 typically achieving third graders, and 21 typically achieving second graders. Participants were asked to make true-false decisions on simple and complex addition problems while the distance between the proposed and the correct answer was manipulated. Both typically achieving groups were sensitive to answer plausibility on simple problems, were faster at rejecting extremely incorrect results than at accepting correct answers on complex addition problems, and showed a reduction of the complexity effect on implausible problems, attesting to the use of approximate calculation. Conversely, children with mathematics disabilities were unaffected by answer plausibility on simple addition problems, processed implausible and correct sums with equal speed on complex problems, and exhibited a smaller reduction of the complexity effect on implausible problems. They also made more errors on implausible problems. Different hypotheses are discussed to account for these results. PMID:18443150
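    The split manipulation described above is easy to make concrete. Below is a small, hypothetical stimulus generator for such a verification task: it produces true/false addition problems whose proposed answers lie at a controlled distance ("split") from the correct sum. The operand ranges and split sizes are invented for illustration and are not taken from the study's actual materials.

      # Illustrative generator of addition verification trials with a
      # controlled "split" (distance between proposed and correct answers).
      import random

      def make_trial(rng, complex_problem=False, split=0):
          """Return (a, b, proposed, is_true); split 0 means the correct sum."""
          lo, hi = (12, 48) if complex_problem else (2, 9)
          a, b = rng.randint(lo, hi), rng.randint(lo, hi)
          proposed = a + b + split * rng.choice([-1, 1])
          return a, b, proposed, split == 0

      rng = random.Random(42)
      for split in (0, 1, 8):   # correct, plausible-false, extremely-false
          a, b, p, ok = make_trial(rng, complex_problem=True, split=split)
          print(f"{a} + {b} = {p}?  ({'true' if ok else 'false'}, split {split})")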

  14. Metrology test object for dimensional verification in additive manufacturing of metals for biomedical applications.

    PubMed

    Teeter, Matthew G; Kopacz, Alexander J; Nikolov, Hristo N; Holdsworth, David W

    2015-01-01

    Additive manufacturing continues to increase in popularity and is being used in applications such as biomaterial ingrowth that requires sub-millimeter dimensional accuracy. The purpose of this study was to design a metrology test object for determining the capabilities of additive manufacturing systems to produce common objects, with a focus on those relevant to medical applications. The test object was designed with a variety of features of varying dimensions, including holes, cylinders, rectangles, gaps, and lattices. The object was built using selective laser melting, and the produced dimensions were compared to the target dimensions. Location of the test objects on the build plate did not affect dimensions. Features with dimensions less than 0.300 mm did not build or were overbuilt to a minimum of 0.300 mm. The mean difference between target and measured dimensions was less than 0.100 mm in all cases. The test object is applicable to multiple systems and materials, tests the effect of location on the build, uses a minimum of material, and can be measured with a variety of efficient metrology tools (including measuring microscopes and micro-CT). Investigators can use this test object to determine the limits of systems and adjust build parameters to achieve maximum accuracy.

  15. Metrology test object for dimensional verification in additive manufacturing of metals for biomedical applications.

    PubMed

    Teeter, Matthew G; Kopacz, Alexander J; Nikolov, Hristo N; Holdsworth, David W

    2015-01-01

    Additive manufacturing continues to increase in popularity and is being used in applications such as biomaterial ingrowth that requires sub-millimeter dimensional accuracy. The purpose of this study was to design a metrology test object for determining the capabilities of additive manufacturing systems to produce common objects, with a focus on those relevant to medical applications. The test object was designed with a variety of features of varying dimensions, including holes, cylinders, rectangles, gaps, and lattices. The object was built using selective laser melting, and the produced dimensions were compared to the target dimensions. Location of the test objects on the build plate did not affect dimensions. Features with dimensions less than 0.300 mm did not build or were overbuilt to a minimum of 0.300 mm. The mean difference between target and measured dimensions was less than 0.100 mm in all cases. The test object is applicable to multiple systems and materials, tests the effect of location on the build, uses a minimum of material, and can be measured with a variety of efficient metrology tools (including measuring microscopes and micro-CT). Investigators can use this test object to determine the limits of systems and adjust build parameters to achieve maximum accuracy. PMID:25542613

  16. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  17. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  18. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F.; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
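    To illustrate the Method of Manufactured Solutions named above, the sketch below applies MMS to a 1D Poisson problem and recovers the expected second-order accuracy under systematic grid refinement. This is a minimal, generic Python example, not the LAVA solver's actual verification suite; the manufactured solution and discretization are chosen purely for illustration.

      # Minimal Method of Manufactured Solutions (MMS) sketch: verify a
      # second-order finite-difference solver for -u''(x) = f(x) on [0, 1]
      # by manufacturing u(x) = sin(pi*x) and checking the observed order.
      import numpy as np

      def l2_error(n):
          """Solve -u'' = pi^2*sin(pi*x), u(0)=u(1)=0, on n cells; return L2 error."""
          x = np.linspace(0.0, 1.0, n + 1)
          h = 1.0 / n
          f = np.pi**2 * np.sin(np.pi * x[1:-1])      # forcing from manufactured u
          A = (np.diag(2.0 * np.ones(n - 1))          # tridiagonal -u'' operator
               - np.diag(np.ones(n - 2), 1)
               - np.diag(np.ones(n - 2), -1)) / h**2
          u = np.zeros(n + 1)
          u[1:-1] = np.linalg.solve(A, f)
          return np.sqrt(np.mean((u - np.sin(np.pi * x))**2))

      # Systematic grid refinement: the observed order should approach 2
      e32, e64 = l2_error(32), l2_error(64)
      print("observed order of accuracy:", np.log(e32 / e64) / np.log(2.0))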

  19. Additional field verification of convective scaling for the lateral dispersion parameter

    SciTech Connect

    Sakiyama, S.K.; Davis, P.A.

    1988-07-01

    The results of a series of diffusion trials over the heterogeneous surface of the Canadian Precambrian Shield provide additional support for the convective scaling of the lateral dispersion parameter. The data indicate that under convective conditions, the lateral dispersion parameter can be scaled with the convective velocity scale and the mixing depth. 10 references.
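    As a concrete reading of the scaling claim above, the sketch below evaluates one commonly quoted convective-limit form, sigma_y / z_i ≈ 0.6 X, where X = w* x / (u z_i) is a dimensionless travel time built from the convective velocity scale w* and mixing depth z_i. The functional form and coefficient are assumptions made for illustration, not values taken from these trials.

      # Illustrative convective scaling of the lateral dispersion parameter:
      # sigma_y / z_i ~ 0.6 * X, with X = w_star * x / (u_bar * z_i).
      # The linear form and the 0.6 coefficient are assumed here.
      def sigma_y(x, w_star, z_i, u_bar, coeff=0.6):
          """Lateral dispersion parameter (m) at downwind distance x (m)."""
          X = w_star * x / (u_bar * z_i)   # dimensionless travel time
          return coeff * X * z_i

      # Example: w* = 1.5 m/s, mixing depth 1200 m, mean wind 4 m/s, x = 2 km
      print(f"sigma_y = {sigma_y(2000.0, 1.5, 1200.0, 4.0):.0f} m")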

  20. Overhead imaging for verification and peacekeeping: Three studies. Arms Control Verification Studies No. 6

    SciTech Connect

    Banner, A.V.

    1991-01-01

    This paper examines commercially available overhead remote sensing systems and their applications for international security. The paper describes the basic operating characteristics and features of commercially available systems, then uses two case studies to examine potential applications. In the first, imagery acquired during the Soviet withdrawal from Afghanistan in 1988 and 1989 is used to assess whether commercially available satellite imagery would be useful for monitoring large-scale withdrawals of conventionally armed forces. In the second case study, imagery of selected sites in Namibia and Angola is used to examine whether such imagery could have supported United Nations peacekeeping operations in those countries. Potential applications of airborne remote sensing systems are also demonstrated using previously acquired imagery to show the kinds of results which could be obtained using commercially available systems.

  21. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  22. Environmental Technology Verification Program - ETV - Case Studies: Demonstrating Program Outcomes

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This cd con...

  23. Reactive system verification case study: Fault-tolerant transputer communication

    NASA Technical Reports Server (NTRS)

    Crane, D. Francis; Hamory, Philip J.

    1993-01-01

    A reactive program is one which engages in an ongoing interaction with its environment. A system which is controlled by an embedded reactive program is called a reactive system. Examples of reactive systems are aircraft flight management systems, bank automatic teller machine (ATM) networks, airline reservation systems, and computer operating systems. Reactive systems are often naturally modeled (for logical design purposes) as a composition of autonomous processes which progress concurrently and which communicate to share information and/or to coordinate activities. Formal (i.e., mathematical) frameworks for system verification are tools used to increase the users' confidence that a system design satisfies its specification. A framework for reactive system verification includes formal languages for system modeling and for behavior specification and decision procedures and/or proof-systems for verifying that the system model satisfies the system specifications. Using the Ostroff framework for reactive system verification, an approach to achieving fault-tolerant communication between transputers was shown to be effective. The key components of the design, the decoupler processes, may be viewed as discrete-event-controllers introduced to constrain system behavior such that system specifications are satisfied. The Ostroff framework was also effective. The expressiveness of the modeling language permitted construction of a faithful model of the transputer network. The relevant specifications were readily expressed in the specification language. The set of decision procedures provided was adequate to verify the specifications of interest. The need for improved support for system behavior visualization is emphasized.

  24. Verification of multi-proxy paleoclimatic studies: a case study

    NASA Astrophysics Data System (ADS)

    McIntyre, S.; McKitrick, R.

    2004-12-01

    Multi-proxy studies have been the primary means of transmitting paleoclimatic findings to public policy. For policy use, such studies should be replicable in the sense of King (1995). The best-known and most widely applied multi-proxy study is Mann, Bradley and Hughes (1998) ("MBH98") and its 1999 extension, which claimed to have exceptional "robustness" and "skill". We attempted to replicate MBH98 results and found, among other problems, that MBH98 methodology included two important unreported steps: (1) Subtraction of the 1902-1980 mean prior to principal components (PC) calculations (rather than, say, the 1400-1980 mean in the AD1400 step); (2) Extrapolation of a duplicate version of the Gaspé tree ring series. We show that high early 15th century values occur under important variations of the analysis and that the MBH98 results are not robust to the following: (1) Presence or absence of the extrapolation of 4 years at the beginning of the Gaspé tree ring series; (2) subtraction of the 1400-1980 mean rather than subtraction of the 1902-1980 mean, while using the same number of retained PC series in each step as MBH98; (3) the presence or absence of the North American PC4, while subtracting the 1400-1980 mean and using 5 PCs in the AD1400 step; (4) presence or absence of a small subset of high-altitude tree ring sites, mostly "strip bark" bristlecone pines, mostly collected by one researcher, Donald Graybill. The subtraction of the 1902-1980 mean dramatically inflates the role of the bristlecone pine sites, which then impart a distinctive hockey stick shape to the MBH98 PC1 and then to the NH temperature reconstruction. MBH98 claimed "skill" through apparently significant Reduction of Error (RE) statistics, reporting 0.51 in the AD1400 step, as compared to a reported 99 percent significance level of 0, which they calculated through simulations using red noise with low AR1 coefficients (0.2). We benchmarked a more realistic significance level by applying MBH98 PC methods to 10
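    The centering issue described above can be demonstrated in a few lines. The sketch below builds trendless AR(1) red-noise "proxies" and compares the first principal component obtained after subtracting the full-record mean with the one obtained after subtracting only the late (calibration-period) mean; the short-centered PC1 typically shows a much larger late-segment excursion, i.e. a hockey-stick shape. The series count, record length, and persistence value are illustrative choices, not MBH98's exact configuration.

      # Sketch of the short-centering effect: PC1 of trendless AR(1) red-noise
      # proxies, centered on the full record versus only the late segment.
      import numpy as np

      rng = np.random.default_rng(0)
      n_series, n_years, n_cal, phi = 50, 581, 79, 0.9   # persistent red noise

      def pc1(X):
          """First principal component (as a time series) of the columns of X."""
          u, s, _ = np.linalg.svd(X, full_matrices=False)
          return u[:, 0] * s[0]

      proxies = np.empty((n_years, n_series))
      for j in range(n_series):
          x = np.zeros(n_years)
          for t in range(1, n_years):
              x[t] = phi * x[t - 1] + rng.standard_normal()  # no climate signal
          proxies[:, j] = x

      full_pc = pc1(proxies - proxies.mean(axis=0))            # full-record mean
      short_pc = pc1(proxies - proxies[-n_cal:].mean(axis=0))  # calibration mean only
      for name, pc in (("full centering ", full_pc), ("short centering", short_pc)):
          excursion = abs(pc[-n_cal:].mean() - pc.mean()) / pc.std()
          print(name, "late-segment excursion:", round(excursion, 2))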

  25. Students' Verification Strategies for Combinatorial Problems

    ERIC Educational Resources Information Center

    Mashiach Eizenberg, Michal; Zaslavsky, Orit

    2004-01-01

    We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…

  26. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  27. ENVIRONMENTAL TECHNOLOGY VERIFICATION--TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES, FLINT HILLS RESOURCES, LP, CCD15010 DIESEL FUEL FORMULATION WITH HITEC4121 ADDITIVE

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  28. Hands-on Verification of Mechanics Training: A Cost-Effectiveness Study of Videodisc Simulation.

    ERIC Educational Resources Information Center

    Maher, Thomas G.

    This document reports the results of a study on the feasibility of training smog check mechanics in California via hands-on verification of mechanics' ability to inspect and repair vehicles. The reviews of the research literature that compare the learning effectiveness of different delivery media tend to support the position that in learning, the…

  29. Additional EIPC Study Analysis. Final Report

    SciTech Connect

    Hadley, Stanton W; Gotham, Douglas J.; Luciani, Ralph L.

    2014-12-01

    Between 2010 and 2012 the Eastern Interconnection Planning Collaborative (EIPC) conducted a major long-term resource and transmission study of the Eastern Interconnection (EI). With guidance from a Stakeholder Steering Committee (SSC) that included representatives from the Eastern Interconnection States Planning Council (EISPC) among others, the project was conducted in two phases. Phase 1 was a long-term capacity expansion analysis that involved creation of eight major futures plus 72 sensitivities. Three scenarios were selected for more extensive transmission-focused evaluation in Phase 2. Five power flow analyses, nine production cost model runs (including six sensitivities), and three capital cost estimations were developed during this second phase. The results from Phases 1 and 2 provided a wealth of data that could be examined further to address energy-related questions. A list of 14 topics was developed for further analysis. This paper brings together the earlier interim reports of the first 13 topics plus one additional topic into a single final report.

  30. Verification of a New Biocompatible Single-Use Film Formulation with Optimized Additive Content for Multiple Bioprocess Applications

    PubMed Central

    Jurkiewicz, Elke; Husemann, Ute; Greller, Gerhard; Barbaroux, Magali; Fenge, Christel

    2014-01-01

    Single-use bioprocessing bags and bioreactors gained significant importance in the industry as they offer a number of advantages over traditional stainless steel solutions. However, there is continued concern that the plastic materials might release potentially toxic substances negatively impacting cell growth and product titers, or even compromise drug safety when using single-use bags for intermediate or drug substance storage. In this study, we have focused on the in vitro detection of potentially cytotoxic leachables originating from the recently developed new polyethylene (PE) multilayer film called S80. This new film was developed to guarantee biocompatibility for multiple bioprocess applications, for example, storage of process fluids, mixing, and cell culture bioreactors. For this purpose, we examined a protein-free cell culture medium that had been used to extract leachables from freshly gamma-irradiated sample bags in a standardized cell culture assay. We investigated sample bags from films generated to establish the operating ranges of the film extrusion process. Further, we studied sample bags of different age after gamma-irradiation and finally, we performed extended media extraction trials at cold room conditions using sample bags. In contrast to a nonoptimized film formulation, our data demonstrate no cytotoxic effect of the S80 polymer film formulation under any of the investigated conditions. The S80 film formulation is based on an optimized PE polymer composition and additive package. Full traceability alongside specifications and controls of all critical raw materials, and process controls of the manufacturing process, that is, film extrusion and gamma-irradiation, have been established to ensure lot-to-lot consistency. © 2014 American Institute of Chemical Engineers Biotechnol. Prog., 30:1171–1176, 2014 PMID:24850537

  31. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    SciTech Connect

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper will provide a description of the analysis framework and evaluation metrics, example performance-prediction results, and describe the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.

  32. Aircraft surface coatings study: Verification of selected materials

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Three liquid coatings and four films that might improve and/or maintain the smoothness of transport aircraft surfaces are considered. Laboratory tests were performed on the liquid coatings (elastomeric polyurethanes) exposed to synthetic-type hydraulic fluid, with and without a protective topcoat. Results were analyzed of a 14-month flight service evaluation of coatings applied to leading edges of an airline 727. Two additional airline service evaluations were initiated. Laboratory tests were conducted on the films, bonded to aluminum substrate with various adhesives, to determine the best film/adhesive combinations. A cost/benefits analysis was performed and recommendations made for future work toward the application of this technology to commercial transports.

  33. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    PubMed

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-01

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to the clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.
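    The sample-size calculation referenced above can be made concrete with the generic two-sample normal-approximation formula, n = 2((z_alpha + z_power) * sigma / delta)^2 per group. The sketch below is a textbook calculation offered for illustration, not necessarily the exact framework developed at the NCI/NHLBI/AACC workshop.

      # Generic two-sample sample-size calculation (normal approximation) for
      # detecting a mean biomarker difference delta with common SD sigma.
      from math import ceil
      from statistics import NormalDist

      def n_per_group(delta, sigma, alpha=0.05, power=0.90):
          """Biospecimens per group for a two-sided test of |mu1 - mu2| = delta."""
          z = NormalDist()
          z_alpha = z.inv_cdf(1.0 - alpha / 2.0)
          z_beta = z.inv_cdf(power)
          return ceil(2.0 * ((z_alpha + z_beta) * sigma / delta) ** 2)

      # Example: a 0.5-SD shift detected with 90% power at alpha = 0.05
      print(n_per_group(delta=0.5, sigma=1.0))   # -> 85 specimens per group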

  34. Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies

    NASA Technical Reports Server (NTRS)

    Shum, C. K.

    2000-01-01

    This Final Technical Report summarizes the research work conducted under NASA's Physical Oceanography Program entitled, Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies, for the time period from January 1, 2000 through June 30, 2000. This report also provides a summary of the investigation from July 1, 1997 - June 30, 2000. The primary objectives of this investigation include verification and improvement of the ERS-1 and ERS-2 radar altimeter geophysical data records for distribution of the data to the ESA-approved U.S. ERS-1/-2 investigators for global climate change studies. Specifically, the investigation is to verify and improve the ERS geophysical data record products by calibrating the instrument and assessing accuracy for the ERS-1/-2 orbital, geophysical, media, and instrument corrections. The purpose is to ensure the consistency of constants, standards, and algorithms with the TOPEX/POSEIDON radar altimeter for global climate change studies, such as the monitoring and interpretation of long-term sea level change. This investigation has provided the current best precise orbits, with the radial orbit accuracy for ERS-1 (Phases C-G) and ERS-2 estimated at the 3-5 cm rms level, a 30-fold improvement compared to the 1993 accuracy. We have finalized the production and verification of the value-added data products for the ERS-1 mission (Phases A, B, C, D, E, F, and G), in collaboration with JPL PODAAC and the University of Texas. Orbit and data verification and improvement of algorithms led to the best data product available to date. ERS-2 altimeter data have been improved, and we have been active in the Envisat (2001 launch) GDR algorithm review and improvement. The data improvement of ERS-1 and ERS-2 led to improvements in the global mean sea surface, marine gravity anomaly, and bathymetry models, and a study of Antarctic mass balance, which was published in Science in 1998.

  35. Energy management and control system verification study. Master's thesis

    SciTech Connect

    Boulware, K.E.; Williamson, G.C.

    1983-09-01

    Energy Management and Control Systems (EMCS) are being installed and operated throughout the Air Force. Millions of dollars have been spent on EMCS, but no study has conclusively proved that EMCS has actually saved the Air Force energy. This thesis used the Regression subprogram of the Statistical Package for the Social Sciences (SPSS) to determine if these systems are indeed saving the Air Force energy. Previous studies have shown that Multiple Linear Regression (MLR) is the best statistical predictor of base energy consumption. Eight bases were selected that had an operational EMCS. Two EMCS bases were compared with one control base for each of four CONUS winter heating zones. The results indicated small (less than 2%) energy savings have occurred at half of the EMCS bases studied. Therefore, this study does not conclusively prove that EMCSs have saved energy on Air Force bases. However, the methodology developed in this report could be applied on a broader scale to develop a more conclusive result.
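    Since the thesis's method is multiple linear regression of base energy consumption, a toy version is sketched below. The covariates (heating degree days and occupied floor area) and all data values are invented for illustration and do not reproduce the SPSS analysis or any Air Force data.

      # Toy multiple linear regression of monthly base energy consumption on
      # heating degree days (HDD) and occupied floor area; data are invented.
      import numpy as np

      X = np.array([[900, 410], [700, 405], [400, 400], [150, 395],
                    [50, 390], [300, 400], [650, 410], [850, 415]], float)
      y = np.array([5200, 4500, 3400, 2500, 2100, 3100, 4300, 5100], float)

      A = np.column_stack([np.ones(len(X)), X])     # add intercept column
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # ordinary least squares
      residuals = y - A @ coef

      print("intercept, HDD, area coefficients:", np.round(coef, 2))
      print("residuals (MWh):", np.round(residuals, 1))
      # EMCS savings would appear as actual consumption persistently below
      # the regression prediction, relative to a comparable control base.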

  36. The Cooking and Pneumonia Study (CAPS) in Malawi: Implementation of Remote Source Data Verification

    PubMed Central

    Weston, William; Smedley, James; Bennett, Andrew; Mortimer, Kevin

    2016-01-01

    Background: Source data verification (SDV) is a data monitoring procedure which compares the original records with the Case Report Form (CRF). Traditionally, on-site SDV relies on monitors making multiple visits to study sites, requiring extensive resources. The Cooking And Pneumonia Study (CAPS) is a 24-month village-level cluster randomized controlled trial assessing the effectiveness of an advanced cook-stove intervention in preventing pneumonia in children under five in rural Malawi (www.capstudy.org). CAPS used smartphones to capture digital images of the original records on an electronic CRF (eCRF). In the present study, descriptive statistics are used to report the experience of electronic data capture with remote SDV in a challenging research setting in rural Malawi. Methods: At three-monthly intervals, fieldworkers, who were employed by CAPS, captured pneumonia data from the original records onto the eCRF. Fieldworkers also captured digital images of the original records. Once Internet connectivity was available, the data captured on the eCRF and the digital images of the original records were uploaded to a web-based SDV application. This enabled SDV to be conducted remotely from the UK. We conducted SDV of the pneumonia data (occurrence, severity, and clinical indicators) recorded in the eCRF with the data in the digital images of the original records. Results: 664 episodes of pneumonia were recorded after 6 months of follow-up. Of these 664 episodes, 611 (92%) had a finding of pneumonia in the original records. All digital images of the original records were clear and legible. Conclusion: Electronic data capture using eCRFs on mobile technology is feasible in rural Malawi. Capturing digital images of the original records in the field allows remote SDV to be conducted efficiently and securely without requiring additional field visits. We recommend these approaches in similar settings, especially those with health endpoints. PMID:27355447

  37. Verification of a Quality Management Theory: Using a Delphi Study

    PubMed Central

    Mosadeghrad, Ali Mohammad

    2013-01-01

    Background: A model of quality management called Strategic Collaborative Quality Management (SCQM) model was developed based on the quality management literature review, the findings of a survey on quality management assessment in healthcare organisations, semi-structured interviews with healthcare stakeholders, and a Delphi study on healthcare quality management experts. The purpose of this study was to verify the SCQM model. Methods: The proposed model was further developed using feedback from thirty quality management experts using a Delphi method. Further, a guidebook for its implementation was prepared including a road map and performance measurement. Results: The research led to the development of a context-specific model of quality management for healthcare organisations and a series of guidelines for its implementation. Conclusion: A proper model of quality management should be developed and implemented properly in healthcare organisations to achieve business excellence. PMID:24596883

  38. Structure Property Studies for Additively Manufactured Parts

    SciTech Connect

    Milenski, Helen M; Schmalzer, Andrew Michael; Kelly, Daniel

    2015-08-17

    Since the invention of modern Additive Manufacturing (AM) processes, engineers and designers have worked hard to capitalize on the unique building capabilities that AM allows. By being able to customize the interior fill of parts, it is now possible to design components with a controlled density and customized internal structure. The creation of new polymers and polymer composites allows for even greater control over the mechanical properties of AM parts. One of the key reasons to explore AM is to bring about a new paradigm in part design, where materials can be strategically optimized in a way that conventional subtractive methods cannot achieve. The two processes investigated in my research were the Fused Deposition Modeling (FDM) process and the Direct Ink Write (DIW) process. The objectives of the research were to determine the impact of in-fill density and morphology on the mechanical properties of FDM parts, and to determine if DIW-printed samples could be produced where the filament diameter was varied while the overall density remained constant.

  39. Formulation verification study results for 241-AN-106 waste grout

    SciTech Connect

    Lokken, R.O.; Martin, P.F.C.; Morrison, L.C.; Palmer, S.E.; Anderson, C.M.

    1993-06-01

    Tests were conducted to determine whether the reference formulation and variations around the formulation are adequate for solidifying 241-AN-106 (106-AN) waste into a grout waste form. The reference formulation consists of 21 wt% type I/II Portland cement, 68 wt% fly ash, and 11 wt% attapulgite clay. The mix ratio is 8.4 lb/gal. Variations in dry blend component ratios, mix ratio, and waste concentration were assessed by using a statistically designed experimental matrix consisting of 44 grout compositions. Based on the results of the statistically designed variability study, the 106-AN grout formulations tested met all the formulation criteria except for the heat of hydration.

  40. Hybrid Enrichment Verification Array: Module Characterization Studies Version 2

    SciTech Connect

    Zalavadia, Mital A.; Smith, Leon E.; McDonald, Benjamin S.; Kulisek, Jonathan A.; Mace, Emily K.; Deshmukh, Nikhil S.

    2015-12-01

    The work presented in this report is focused on the characterization and refinement of the HEVA approach, which combines the traditional 186-keV 235U signature with high-energy prompt gamma rays from neutron capture in the detector and surrounding collimator material, to determine the relative enrichment and 235U mass of the cylinder. The design of the HEVA modules (hardware and software) deployed in the current field trial builds on over seven years of study and evolution by PNNL and consists of a ø3''×3'' NaI(Tl) scintillator coupled to an Osprey digital multi-channel analyzer tube base from Canberra. In comparison to previous versions, the new design boosts the high-energy prompt gamma-ray signature, provides more flexible and effective collimation, and improves count-rate management via commercially available pulse-processing electronics with a special modification prompted by PNNL.
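    For context on the first of the two signatures, the sketch below encodes the classic enrichment-meter principle: for a sample that is effectively infinitely thick to 186-keV photons, the net 186-keV count rate is proportional to 235U enrichment. The calibration constant below is hypothetical and is not a HEVA parameter.

      # Sketch of the enrichment-meter principle behind the 186-keV signature.
      # The calibration constant is hypothetical, for illustration only.
      def enrichment_percent(net_186keV_cps, cal_cps_per_percent=12.0):
          """Relative 235U enrichment (%) from net 186-keV count rate (counts/s)."""
          return net_186keV_cps / cal_cps_per_percent

      print(f"{enrichment_percent(53.0):.1f}% 235U")   # -> ~4.4%, typical LEU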

  41. Subject-specific planning of femoroplasty: an experimental verification study.

    PubMed

    Basafa, Ehsan; Murphy, Ryan J; Otake, Yoshito; Kutzer, Michael D; Belkoff, Stephen M; Mears, Simon C; Armand, Mehran

    2015-01-01

    The risk of osteoporotic hip fractures may be reduced by augmenting susceptible femora with acrylic polymethylmethacrylate (PMMA) bone cement. Grossly filling the proximal femur with PMMA has shown promise, but the augmented bones can suffer from thermal necrosis or cement leakage, among other side effects. We hypothesized that, using subject-specific planning and computer-assisted augmentation, we can minimize cement volume while increasing bone strength and reducing the risk of fracture. We mechanically tested eight pairs of osteoporotic femora, after augmenting one from each pair following patient-specific planning reported earlier, which optimized cement distribution and strength increase. An average of 9.5 (±1.7) ml of cement was injected in the augmented set. Augmentation significantly (P<0.05) increased the yield load by 33%, maximum load by 30%, yield energy by 118%, and maximum energy by 94% relative to the non-augmented controls. Also, predicted yield loads correlated well (R² = 0.74) with the experiments and, for augmented specimens, cement profiles were predicted with an average surface error of <2 mm, further validating our simulation techniques. Results of the current study suggest that subject-specific planning of femoroplasty reduces the risk of hip fracture while minimizing the amount of cement required.

  42. Subject-Specific Planning of Femoroplasty: An Experimental Verification Study

    PubMed Central

    Basafa, Ehsan; Murphy, Ryan J.; Otake, Yoshito; Kutzer, Michael D.; Belkoff, Stephen M.; Mears, Simon C.; Armand, Mehran

    2014-01-01

    The risk of osteoporotic hip fractures may be reduced by augmenting susceptible femora with acrylic polymethylmethacrylate (PMMA) bone cement. Grossly filling the proximal femur with PMMA has shown promise, but the augmented bones can suffer from thermal necrosis or cement leakage, among other side effects. We hypothesized that, using subject-specific planning and computer-assisted augmentation, we can minimize cement volume while increasing bone strength and reducing the risk of fracture. We mechanically tested eight pairs of osteoporotic femora, after augmenting one from each pair following patient-specific planning reported earlier, which optimized cement distribution and strength increase. An average of 9.5 (±1.7) ml of cement was injected in the augmented set. Augmentation significantly (P<0.05) increased the yield load by 33%, maximum load by 30%, yield energy by 118%, and maximum energy by 94% relative to the non-augmented controls. Also, predicted yield loads correlated well (R² = 0.74) with the experiments and, for augmented specimens, cement profiles were predicted with an average surface error of <2 mm, further validating our simulation techniques. Results of the current study suggest that subject-specific planning of femoroplasty reduces the risk of hip fracture while minimizing the amount of cement required. PMID:25468663

  43. PET/CT imaging for treatment verification after proton therapy: A study with plastic phantoms and metallic implants

    PubMed Central

    Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B.; Bonab, Ali A.; Alpert, Nathaniel M.; Lohmann, Kevin; Bortfeld, Thomas

    2008-01-01

    The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants. These produce x-ray CT artifacts and fluence perturbations which may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring. This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of

  44. PET/CT imaging for treatment verification after proton therapy: A study with plastic phantoms and metallic implants

    SciTech Connect

    Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B.; Bonab, Ali A.; Alpert, Nathaniel M.; Lohmann, Kevin; Bortfeld, Thomas

    2007-02-15

    The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants. These produce x-ray CT artifacts and fluence perturbations which may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring. This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of

  45. 4D offline PET-based treatment verification in scanned ion beam therapy: a phantom study

    NASA Astrophysics Data System (ADS)

    Kurz, Christopher; Bauer, Julia; Unholtz, Daniel; Richter, Daniel; Stützer, Kristin; Bert, Christoph; Parodi, Katia

    2015-08-01

    At the Heidelberg Ion-Beam Therapy Center, patient irradiation with scanned proton and carbon ion beams is verified by offline positron emission tomography (PET) imaging: the β+-activity measured within the patient is compared to a prediction calculated on the basis of the treatment planning data in order to identify potential delivery errors. Currently, this monitoring technique is limited to the treatment of static target structures. However, intra-fractional organ motion imposes considerable additional challenges to scanned ion beam radiotherapy. In this work, the feasibility and potential of time-resolved (4D) offline PET-based treatment verification with a commercial full-ring PET/CT (x-ray computed tomography) device are investigated for the first time, based on an experimental campaign with moving phantoms. Motion was monitored during the gated beam delivery as well as the subsequent PET acquisition and was taken into account in the corresponding 4D Monte-Carlo simulations and data evaluation. Under the given experimental conditions, millimeter agreement between the prediction and measurement was found. Dosimetric consequences due to the phantom motion could be reliably identified. The agreement between PET measurement and prediction in the presence of motion was found to be similar to that in static reference measurements, thus demonstrating the potential of 4D PET-based treatment verification for future clinical applications.

  46. 4D offline PET-based treatment verification in scanned ion beam therapy: a phantom study.

    PubMed

    Kurz, Christopher; Bauer, Julia; Unholtz, Daniel; Richter, Daniel; Stützer, Kristin; Bert, Christoph; Parodi, Katia

    2015-08-21

    At the Heidelberg Ion-Beam Therapy Center, patient irradiation with scanned proton and carbon ion beams is verified by offline positron emission tomography (PET) imaging: the β+-activity measured within the patient is compared to a prediction calculated on the basis of the treatment planning data in order to identify potential delivery errors. Currently, this monitoring technique is limited to the treatment of static target structures. However, intra-fractional organ motion imposes considerable additional challenges to scanned ion beam radiotherapy. In this work, the feasibility and potential of time-resolved (4D) offline PET-based treatment verification with a commercial full-ring PET/CT (x-ray computed tomography) device are investigated for the first time, based on an experimental campaign with moving phantoms. Motion was monitored during the gated beam delivery as well as the subsequent PET acquisition and was taken into account in the corresponding 4D Monte-Carlo simulations and data evaluation. Under the given experimental conditions, millimeter agreement between the prediction and measurement was found. Dosimetric consequences due to the phantom motion could be reliably identified. The agreement between PET measurement and prediction in the presence of motion was found to be similar to that in static reference measurements, thus demonstrating the potential of 4D PET-based treatment verification for future clinical applications. PMID:26237315

  7. Decoloration of Amaranth by the white-rot fungus Trametes versicolor. Part II. Verification study.

    PubMed

    Gavril, Mihaela; Hodson, Peter V

    2007-02-01

    The involvement of lignin peroxidase (LiP) in the decoloration of the mono-azo-substituted naphthalenic dye Amaranth was investigated with pure enzymes and whole cultures of Trametes versicolor. The verification study confirmed that LiP has a direct influence on the initial decoloration rate and showed that another enzyme, which does not need hydrogen peroxide to function and is not a laccase, also plays a role during decoloration. These results confirm the results of a previous statistical analysis. Furthermore, the fungal mycelium affects the performance of the decoloration process.

  8. Verification of genes differentially expressed in neuroblastoma tumours: a study of potential tumour suppressor genes

    PubMed Central

    Thorell, Kaisa; Bergman, Annika; Carén, Helena; Nilsson, Staffan; Kogner, Per; Martinsson, Tommy; Abel, Frida

    2009-01-01

    Background One of the most striking features of the childhood malignancy neuroblastoma (NB) is its clinical heterogeneity. Although there is a great need for better clinical and biological markers to distinguish between tumours with different severity and to improve treatment, no clear-cut prognostic factors have been found. Also, no major NB tumour suppressor genes have been identified. Methods In this study we performed expression analysis by quantitative real-time PCR (QPCR) on primary NB tumours divided into two groups, of favourable and unfavourable outcome respectively. Candidate genes were selected on the basis of lower expression in unfavourable tumour types compared to favourable ones in our microarray expression analysis. Selected genes were studied in two steps: (1) using TaqMan Low Density Arrays (TLDA) targeting 89 genes on a set of 12 NB tumour samples, and (2) 12 genes were selected from the TLDA analysis for verification using individual TaqMan assays in a new set of 13 NB tumour samples. Results By TLDA analysis, 81 out of 87 genes were found to be significantly differentially expressed between groups, of which 14 have previously been reported as having an altered gene expression in NB. In the second verification round, seven out of 12 transcripts showed significantly lower expression in unfavourable NB tumours, ATBF1, CACNA2D3, CNTNAP2, FUSIP1, GNB1, SLC35E2, and TFAP2B. The gene that showed the highest fold change in the TLDA analysis, POU4F2, was investigated for epigenetic changes (CpG methylation) and mutations in order to explore the cause of the differential expression. Moreover, the fragile site gene CNTNAP2 that showed the largest fold change in verification group 2 was investigated for structural aberrations by copy number analysis. However, the analyses of POU4F2 and CNTNAP2 showed no genetic alterations that could explain a lower expression in unfavourable NB tumours. Conclusion Through two steps of verification, seven transcripts were found to be significantly lower expressed in unfavourable NB tumours, and may therefore be considered candidate tumour suppressor genes or markers of NB outcome.

  9. Computer Simulations to Study Diffraction Effects of Stacking Faults in Beta-SiC: II. Experimental Verification

    NASA Technical Reports Server (NTRS)

    Pujar, Vijay V.; Cawley, James D.; Levine, S. (Technical Monitor)

    2000-01-01

    Earlier results from computer simulation studies suggest a correlation between the spatial distribution of stacking errors in the Beta-SiC structure and features observed in X-ray diffraction patterns of the material. Reported here are experimental results obtained from two types of nominally Beta-SiC specimens, which yield distinct XRD data. These samples were analyzed using high resolution transmission electron microscopy (HRTEM) and the stacking error distribution was directly determined. The HRTEM results compare well to those deduced by matching the XRD data with simulated spectra, confirming the hypothesis that the XRD data are indicative not only of the presence and density of stacking errors but can also yield information regarding their distribution. In addition, the stacking error population in both specimens is related to their synthesis conditions, in a manner that appears consistent with the relation developed by others to explain the formation of the corresponding polytypes.

  10. Open-source MFIX-DEM software for gas-solids flows: Part 1 - Verification studies

    SciTech Connect

    Garg, Rahul; Galvin, Janine; Li, Tingwen; Pannala, Sreekanth

    2012-04-01

    With rapid advancements in computer hardware, it is now possible to perform large simulations of granular flows using the Discrete Element Method (DEM). As a result, solids are increasingly treated in a discrete Lagrangian fashion in the gas–solids flow community. In this paper, the open-source MFIX-DEM software is described that can be used for simulating the gas–solids flow using an Eulerian reference frame for the continuum fluid and a Lagrangian discrete framework (Discrete Element Method) for the particles. This method is referred to as the continuum discrete method (CDM) to clearly make a distinction between the ambiguity of using a Lagrangian or Eulerian reference for either continuum or discrete formulations. This freely available CDM code for gas–solids flows can accelerate the research in computational gas–solids flows and establish a baseline that can lead to better closures for the continuum modeling (or traditionally referred to as two fluid model) of gas–solids flows. In this paper, a series of verification cases is employed which tests the different aspects of the code in a systematic fashion by exploring specific physics in gas–solids flows before exercising the fully coupled solution on simple canonical problems. It is critical to have an extensively verified code as the physics is complex with highly nonlinear coupling, and it is difficult to ascertain the accuracy of the results without rigorous verification. This series of verification tests not only sets the stage for the rigorous validation studies performed in Part II of this paper, but also serves as a procedure for testing any new developments that couple continuum and discrete formulations for gas–solids flows.
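
    As an illustration of the kind of single-physics verification case described above (a generic Python sketch, not code from MFIX-DEM itself; all parameters are invented), the following snippet integrates a single particle settling under gravity with linear drag and compares the result against the analytical solution:

        import numpy as np

        # Hypothetical settling test: m dv/dt = m g - c v, with analytical
        # solution v(t) = v_t (1 - exp(-t / tau)), v_t = m g / c, tau = m / c.
        m, g, c = 1e-3, 9.81, 0.05          # mass [kg], gravity [m/s^2], drag [kg/s]
        v_t, tau = m * g / c, m / c

        dt, steps = 1e-3, 5000
        v = 0.0
        for _ in range(steps):
            v += dt * (g - (c / m) * v)      # explicit Euler update

        t = steps * dt
        v_exact = v_t * (1.0 - np.exp(-t / tau))
        print(f"numerical v = {v:.6f} m/s, analytical v = {v_exact:.6f} m/s, "
              f"relative error = {abs(v - v_exact) / v_exact:.2e}")

    Agreement with the analytical solution, and the expected decay of the error as the time step is refined, is the evidence such a verification case provides.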

  11. Regional Field Verification -- Case Study of Small Wind Turbines in the Pacific Northwest: Preprint

    SciTech Connect

    Sinclair, K.

    2005-05-01

    The U.S. Department of Energy/National Renewable Energy Laboratory's (DOE/NREL) Regional Field Verification (RFV) project supports industry needs for gaining initial field operation experience with small wind turbines and for verifying the performance, reliability, maintainability, and cost of small wind turbines in diverse applications. In addition, RFV aims to help expand opportunities for wind energy in new regions of the United States by tailoring projects to meet unique regional requirements, and to document and communicate the experience from these projects for the benefit of others in the wind power development community and rural utilities. Between August 2003 and August 2004, six turbines were installed at different host sites. At least one year of data has been collected from five of these sites. This paper describes DOE/NREL's RFV project, reviews some of the lessons learned with regard to small wind turbine installations, summarizes operations data from these sites, and provides preliminary balance-of-system (BOS) costs.

  12. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants

    PubMed Central

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-01-01

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated. PMID:27355949

  13. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants.

    PubMed

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-06-27

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated.
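
    The S/V computation described above can be illustrated with a short Python sketch (not the authors' code): for a closed, outward-oriented triangle mesh, the surface area is the sum of triangle areas and the enclosed volume follows from the divergence theorem. The toy tetrahedron stands in for a scanned plant model:

        import numpy as np

        def mesh_area_volume(vertices, faces):
            """Surface area and enclosed volume of a closed, outward-oriented
            triangle mesh; the volume uses the divergence theorem."""
            a, b, c = (vertices[faces[:, i]] for i in range(3))
            cross = np.cross(b - a, c - a)
            area = 0.5 * np.linalg.norm(cross, axis=1).sum()
            volume = abs(np.einsum('ij,ij->i', a, np.cross(b, c)).sum()) / 6.0
            return area, volume

        # Toy stand-in for a scanned plant model: a unit right tetrahedron
        V = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
        F = np.array([[0, 2, 1], [0, 1, 3], [0, 3, 2], [1, 2, 3]])
        area, volume = mesh_area_volume(V, F)
        print(f"S = {area:.4f}, V = {volume:.4f}, S/V = {area / volume:.4f}")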

  14. ICAN/PART: Particulate composite analyzer, user's manual and verification studies

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Murthy, Pappu L. N.; Mital, Subodh K.

    1996-01-01

    A methodology for predicting the equivalent properties and constituent microstresses for particulate matrix composites, based on the micromechanics approach, is developed. These equations are integrated into a computer code developed to predict the equivalent properties and microstresses of fiber reinforced polymer matrix composites to form a new computer code, ICAN/PART. Details of the flowchart, input, and output for ICAN/PART are described, along with examples of the input and output. Only the differences between ICAN/PART and the original ICAN code are described in detail, and the user is assumed to be familiar with the structure and usage of the original ICAN code. Detailed verification studies, utilizing three-dimensional finite element and boundary element analyses, are conducted in order to verify that the micromechanics methodology accurately models the mechanics of particulate matrix composites. The equivalent properties computed by ICAN/PART fall within bounds established by the finite element and boundary element results. Furthermore, constituent microstresses computed by ICAN/PART agree in an average sense with results computed using the finite element method. The verification studies indicate that the micromechanics programmed into ICAN/PART do indeed accurately model the mechanics of particulate matrix composites.

  15. A Study of Additional Costs of Second Language Instruction.

    ERIC Educational Resources Information Center

    McEwen, Nelly

    A study was conducted whose primary aim was to identify and explain additional costs incurred by Alberta, Canada school jurisdictions providing second language instruction in 1980. Additional costs were defined as those which would not have been incurred had the second language program not been in existence. Three types of additional costs were…

  16. A study of applications scribe frame data verifications using design rule check

    NASA Astrophysics Data System (ADS)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection, and customer-specified marks. At the end, we check that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in the COT (customer owned tooling) business and in new technology development, there has been no effective verification method for scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which are used in device verification. We present the scheme of scribe frame data verification using DRC which we tried to apply. First, verification rules are created based on the specifications of the scanner, inspection, and others, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; this DRC verification includes pattern matching using the mark library. As a result, our experiments demonstrated that by use of pattern matching and DRC verification our new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
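
    A minimal Python sketch of the rule-table idea described above (the rule entries, mark records, and tolerance are invented for illustration; a production flow would operate on real layout data through a DRC tool):

        # Hypothetical rule table: mark name -> required (width_um, height_um, layer)
        RULES = {
            "SCANNER_ALIGN": (80.0, 80.0, "M1"),
            "WAFER_INSPECT": (40.0, 40.0, "M2"),
        }

        # Mark records as they might be extracted from scribe frame layout data
        marks = [
            {"name": "SCANNER_ALIGN", "width": 80.0, "height": 80.0, "layer": "M1"},
            {"name": "WAFER_INSPECT", "width": 40.0, "height": 42.0, "layer": "M2"},
        ]

        def check_marks(marks, rules, tol=0.5):
            """Flag marks whose geometry deviates from the rule table by more
            than tol (in micrometers) or which sit on the wrong layer."""
            violations = []
            for m in marks:
                w, h, layer = rules[m["name"]]
                if (m["layer"] != layer or abs(m["width"] - w) > tol
                        or abs(m["height"] - h) > tol):
                    violations.append(m["name"])
            return violations

        print(check_marks(marks, RULES))   # -> ['WAFER_INSPECT']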

  17. GFO-1 Geophysical Data Record and Orbit Verifications for Global Change Studies

    NASA Technical Reports Server (NTRS)

    Shum, C. K.

    2000-01-01

    This final report summarizes the research work conducted under NASA's Physical Oceanography Program, entitled GFO-1 Geophysical Data Record And Orbit Verifications For Global Change Studies, for the investigation time period from December 1, 1997 through November 30, 2000. The primary objectives of the investigation include providing verification and improvement for the precise orbit, media, geophysical, and instrument corrections to accurately reduce the U.S. Navy's Geosat Follow-On 1 (GFO-1) mission radar altimeter data to sea level measurements. The status of the GFO satellite (instrument and spacecraft operations, orbital tracking and altimeter) is summarized. The GFO spacecraft has been accepted by the Navy from Ball Aerospace and has been declared operational since November 2000. We have participated in four official GFO calibration/validation periods (Cal/Val I-IV), spanning from June 1999 through October 2000. Results of verification of the GFO orbit and geophysical data record measurements both from NOAA (IGDR) and from the Navy (NGDR) are reported. Our preliminary results indicate that: (1) the precise orbit (GSFC and OSU) can be determined to approx. 5 - 6 cm rms radially using SLR and altimeter crossovers; (2) estimated GFO MOE (GSFC or NRL) radial orbit accuracy is approx. 7 - 30 cm and Operational Doppler orbit accuracy is approx. 60 - 350 cm. After bias and tilt adjustment (1000 km arc), estimated Doppler orbit accuracy is approx. 1.2 - 6.5 cm rms and the MOE accuracy is approx. 1.0 - 2.3 cm; (3) the geophysical and media corrections have been validated versus in situ measurements and measurements from other operating altimeters (T/P and ERS-2). Altimeter time bias is insignificant at 0-2 ms. Sea state bias is approx. 3 - 4.5% of SWH. Wet troposphere correction has approx. 1 cm bias and approx. 3 cm rms when compared with ERS-2 data. Use of GIM and IRI95 provides ionosphere correction accurate to 2-3 cm rms during medium to high solar activities; (4

  18. A hybrid Bayesian hierarchical model combining cohort and case-control studies for meta-analysis of diagnostic tests: Accounting for partial verification bias.

    PubMed

    Ma, Xiaoye; Chen, Yong; Cole, Stephen R; Chu, Haitao

    2014-05-26

    To account for between-study heterogeneity in meta-analysis of diagnostic accuracy studies, bivariate random effects models have been recommended to jointly model the sensitivities and specificities. As study design and population vary, the definition of disease status or severity could differ across studies. Consequently, sensitivity and specificity may be correlated with disease prevalence. To account for this dependence, a trivariate random effects model has been proposed. However, the proposed approach can only include cohort studies with information estimating study-specific disease prevalence. In addition, some diagnostic accuracy studies only select a subset of samples to be verified by the reference test. It is known that ignoring unverified subjects may lead to partial verification bias in the estimation of prevalence, sensitivities, and specificities in a single study. However, the impact of this bias on a meta-analysis has not been investigated. In this paper, we propose a novel hybrid Bayesian hierarchical model combining cohort and case-control studies and correcting partial verification bias at the same time. We investigate the performance of the proposed methods through a set of simulation studies. Two case studies on assessing the diagnostic accuracy of gadolinium-enhanced magnetic resonance imaging in detecting lymph node metastases and of adrenal fluorine-18 fluorodeoxyglucose positron emission tomography in characterizing adrenal masses are presented.
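
    The partial verification bias discussed above can be demonstrated with a short simulation (illustrative only, not the authors' hybrid Bayesian model): when verification by the reference test depends on the index-test result, the naive sensitivity computed on verified subjects is biased, while inverse-probability weighting recovers the truth:

        import numpy as np

        rng = np.random.default_rng(0)
        N, prev, se, sp = 100_000, 0.2, 0.8, 0.9
        p_verify = {1: 0.9, 0: 0.3}     # verification depends on index-test result

        disease = rng.random(N) < prev
        test = np.where(disease, rng.random(N) < se, rng.random(N) < 1 - sp).astype(int)
        verified = rng.random(N) < np.where(test == 1, p_verify[1], p_verify[0])

        d, t = disease[verified], test[verified]
        se_naive = (d & (t == 1)).sum() / d.sum()

        # Weighting each verified subject by 1 / P(verified | test) removes the bias
        w = 1.0 / np.where(t == 1, p_verify[1], p_verify[0])
        se_ipw = (w * (d & (t == 1))).sum() / (w * d).sum()
        print(f"true Se = {se:.2f}, naive Se = {se_naive:.3f}, "
              f"IPW-corrected Se = {se_ipw:.3f}")

    With these invented numbers the naive estimate is roughly 0.92 against a true sensitivity of 0.80, which is the distortion the paper's model is built to correct.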

  1. A study on the factors that affect the advanced mask defect verification

    NASA Astrophysics Data System (ADS)

    Woo, Sungha; Jang, Heeyeon; Lee, Youngmo; Kim, Sangpyo; Yim, Donggyu

    2015-10-01

    Defect verification has become significantly more difficult at higher technology nodes over the years. The traditional primary method of defect (including repair point) control consists of inspection, AIMS, and repair steps. Among them, the AIMS process needs various wafer lithography conditions, such as NA, inner/outer sigma, illumination shape, and so on. Its ability to analyze every layer accurately is limited because the AIMS tool uses a physical aperture system, and it requires meticulous management of exposure conditions and CD target values, which change frequently in advanced masks. We report on the influence of several AIMS parameters on defect analysis, including repair points. Under various illumination conditions with different patterns, the analysis showed a significant correlation in defect analysis results. Defects can be analyzed within a certain error budget based on the management specification required for each layer. In addition, this provided us with one of the clues in the analysis of repeating wafer defects. Finally, we present an 'optimal specification' for defect management with a common AIMS recipe and suggest an advanced mask process flow.

  2. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM CASE STUDIES: DEMONSTRATING PROGRAM OUTCOMES, VOLUME II

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This bookle...

  4. Continuous verification using multimodal biometrics.

    PubMed

    Sim, Terence; Zhang, Sheng; Janakiraman, Rajkumar; Kumar, Sandeep

    2007-04-01

    Conventional verification systems, such as those controlling access to a secure room, do not usually require the user to reauthenticate himself for continued access to the protected resource. This may not be sufficient for high-security environments in which the protected resource needs to be continuously monitored for unauthorized use. In such cases, continuous verification is needed. In this paper, we present the theory, architecture, implementation, and performance of a multimodal biometrics verification system that continuously verifies the presence of a logged-in user. Two modalities are currently used--face and fingerprint--but our theory can be readily extended to include more modalities. We show that continuous verification imposes additional requirements on multimodal fusion when compared to conventional verification systems. We also argue that the usual performance metrics of false accept and false reject rates are insufficient yardsticks for continuous verification and propose new metrics against which we benchmark our system. PMID:17299225
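
    A minimal Python sketch of the idea that a continuous verifier must discount stale modality observations (this is a generic exponential-decay fusion written for illustration, not the authors' algorithm; the weights, half-lives, and threshold are invented):

        import time

        class ContinuousVerifier:
            """Toy continuous-verification loop: each modality's last match
            score decays with the time since it was observed, so a stale face
            or fingerprint match gradually loses its authority."""

            def __init__(self, weights, half_life_s, threshold=0.6):
                self.weights = weights            # e.g. {"face": 0.5, "fingerprint": 0.5}
                self.half_life_s = half_life_s    # decay half-life per modality [s]
                self.threshold = threshold
                self.last = {m: (0.0, None) for m in weights}   # (score, timestamp)

            def observe(self, modality, score, t=None):
                self.last[modality] = (score, time.time() if t is None else t)

            def verify(self, t=None):
                t = time.time() if t is None else t
                fused = 0.0
                for m, w in self.weights.items():
                    score, ts = self.last[m]
                    if ts is not None:
                        score *= 0.5 ** ((t - ts) / self.half_life_s[m])  # decay
                    fused += w * score
                return fused >= self.threshold, fused

        v = ContinuousVerifier({"face": 0.5, "fingerprint": 0.5},
                               {"face": 30.0, "fingerprint": 120.0})
        v.observe("face", 0.90, t=0.0)
        v.observe("fingerprint", 0.95, t=0.0)
        print(v.verify(t=60.0))   # decayed fused score; may demand re-authentication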

  5. Study on thermal effects & sulfurized additives in lubricating greases

    NASA Astrophysics Data System (ADS)

    Shah, Ami Atul

    Lithium-base grease constitutes about 50% of the market. Greases are developed to work under multiple operating conditions and to have a long working life. Greases with extreme pressure additives and anti-wear additives have been developed as a solution for many of these applications. These greases are tested under ASTM D2266 conditions to meet the requirements. The actual working conditions, however, differ from the testing conditions. The loading, speed, and temperature can be harsher, or fluctuating in nature. The cyclic nature of these parameters cannot be directly related to test performance. For this purpose, studies of performance under spectrum loading, variable speed, and fluctuating temperature must be performed. This study includes tests to understand the effect of thermal variation on some of the most commonly used grease additives that perform well under ASTM D2266 testing conditions. The studied additives include the most widely used industrial extreme pressure additive, MoS2. The performance of ZDDP, which is being positioned to replace MoS2 in industrial applications, has also been studied. The tests cover extreme pressure, anti-wear, and friction modifier additives to get a general idea of the effects of thermal variation in three areas. Sulphur is the most common extreme pressure additive, and sulphur-based MoS2 is an extensively used grease additive. A study to understand the tribological performance of this additive through wear testing and SEM/EDX analysis has been carried out. This performance is also studied for other metallic sulfides such as WS2 and a sulphur-based organic compound. The aim is to study how the type of bond that sulphur shares in the additive's structure affects its performance. MoS2 film formation is found to be based on FeS formation on the substrate and protection through sacrificial monolayer deposition of the sheared MoS2 structure. The free Mo then tends to oxidise. An attempt to

  6. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.
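
    A toy Python illustration of the partial-model idea (the states, events, and property below are invented, not the space station FDIR requirements): even an incomplete transition system can be exhaustively explored to check a property of interest:

        from collections import deque

        # Toy partial model of a fault detection/isolation/recovery loop; the
        # states, events, and property are invented for illustration.
        transitions = {
            "nominal":    {"fault": "isolating"},
            "isolating":  {"isolated": "recovering", "timeout": "safe_hold"},
            "recovering": {"done": "nominal", "fault": "isolating"},
            "safe_hold":  {},               # deliberately left unspecified (partial)
        }
        BAD = {"fault_masked"}              # safety property: never reachable

        def reachable(start="nominal"):
            seen, queue = {start}, deque([start])
            while queue:
                for nxt in transitions.get(queue.popleft(), {}).values():
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return seen

        states = reachable()
        print("safety property holds:", BAD.isdisjoint(states))
        print("reachable states:", sorted(states))

    The point, in the spirit of the paper, is that only enough of the model to test the property of interest needs to be built.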

  7. Transition Metal Complexes of Naproxen: Synthesis, Characterization, Forced Degradation Studies, and Analytical Method Verification

    PubMed Central

    Hasan, Md. Sharif; Kayesh, Ruhul; Begum, Farida; Rahman, S. M. Abdur

    2016-01-01

    The aim of our current research was to synthesize some transition metal complexes of Naproxen, determine their physical properties, and examine their relative stability under various conditions. Characterizations of these complexes were done by 1H-NMR, Differential Scanning Calorimetry (DSC), FT-IR, HPLC, and scanning electron microscope (SEM). Complexes were subjected to acidic, basic, and aqueous hydrolysis as well as oxidation, reduction, and thermal degradation. Also the reversed phase high-performance liquid chromatography (RP-HPLC) method of Naproxen outlined in USP was verified for the Naproxen-metal complexes, with respect to accuracy, precision, solution stability, robustness, and system suitability. The melting points of the complexes were higher than that of the parent drug molecule suggesting their thermal stability. In forced degradation study, complexes were found more stable than the Naproxen itself in all conditions: acidic, basic, oxidation, and reduction media. All the HPLC verification parameters were found within the acceptable value. Therefore, it can be concluded from the study that the metal complexes of Naproxen can be more stable drug entity and offer better efficacy and longer shelf life than the parent Naproxen. PMID:27034891

  8. WTS-4 system verification unit for wind/hydroelectric integration study

    NASA Technical Reports Server (NTRS)

    Watts, A. W.

    1982-01-01

    The Bureau of Reclamation (Reclamation) initiated a study to investigate the concept of integrating 100 MW of wind energy from megawatt-size wind turbines with the Federal hydroelectric system. As a part of the study, one large wind turbine was purchased through the competitive bid process and is now being installed to serve as a system verification unit (SVU). Reclamation negotiated an agreement with NASA to provide technical management of the project for the design, fabrication, installation, testing, and initial operation. Hamilton Standard was awarded a contract to furnish and install its WTS-4 wind turbine rated at 4 MW at a site near Medicine Bow, Wyoming. The purposes for installing the SVU are to fully evaluate the wind/hydro integration concept, make technical evaluation of the hardware design, train personnel in the technology, evaluate operation and maintenance aspects, and evaluate associated environmental impacts. The SVU will be operational in June 1982. Data from the WTS-4 and from a second SVU, Boeing's MOD-2, will be used to prepare a final design for a 100-MW farm if Congress authorizes the project.

  9. Transition Metal Complexes of Naproxen: Synthesis, Characterization, Forced Degradation Studies, and Analytical Method Verification.

    PubMed

    Hasan, Md Sharif; Kayesh, Ruhul; Begum, Farida; Rahman, S M Abdur

    2016-01-01

    The aim of our current research was to synthesize some transition metal complexes of Naproxen, determine their physical properties, and examine their relative stability under various conditions. Characterizations of these complexes were done by 1H-NMR, Differential Scanning Calorimetry (DSC), FT-IR, HPLC, and scanning electron microscope (SEM). Complexes were subjected to acidic, basic, and aqueous hydrolysis as well as oxidation, reduction, and thermal degradation. Also the reversed phase high-performance liquid chromatography (RP-HPLC) method of Naproxen outlined in USP was verified for the Naproxen-metal complexes, with respect to accuracy, precision, solution stability, robustness, and system suitability. The melting points of the complexes were higher than that of the parent drug molecule suggesting their thermal stability. In forced degradation study, complexes were found more stable than the Naproxen itself in all conditions: acidic, basic, oxidation, and reduction media. All the HPLC verification parameters were found within the acceptable value. Therefore, it can be concluded from the study that the metal complexes of Naproxen can be more stable drug entity and offer better efficacy and longer shelf life than the parent Naproxen. PMID:27034891

  10. WTS-4 system verification unit for wind/hydroelectric integration study

    NASA Astrophysics Data System (ADS)

    Watts, A. W.

    The Bureau of Reclamation (Reclamation) initiated a study to investigate the concept of integrating 100 MW of wind energy from megawatt-size wind turbines with the Federal hydroelectric system. As a part of the study, one large wind turbine was purchased through the competitive bid process and is now being installed to serve as a system verification unit (SVU). Reclamation negotiated an agreement with NASA to provide technical management of the project for the design, fabrication, installation, testing, and initial operation. Hamilton Standard was awarded a contract to furnish and install its WTS-4 wind turbine rated at 4 MW at a site near Medicine Bow, Wyoming. The purposes for installing the SVU are to fully evaluate the wind/hydro integration concept, make technical evaluation of the hardware design, train personnel in the technology, evaluate operation and maintenance aspects, and evaluate associated environmental impacts. The SVU will be operational in June 1982. Data from the WTS-4 and from a second SVU, Boeing's MOD-2, will be used to prepare a final design for a 100-MW farm if Congress authorizes the project.

  11. A Modeling and Verification Study of Summer Precipitation Systems Using NASA Surface Initialization Datasets

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Kumar, Sujay V.; Srikishen, Jayanthi; Jedlovec, Gary J.

    2010-01-01

    One of the most challenging weather forecast problems in the southeastern U.S. is daily summertime pulse-type convection. During the summer, atmospheric flow and forcing are generally weak in this region; thus, convection typically initiates in response to local forcing along sea/lake breezes, and other discontinuities often related to horizontal gradients in surface heating rates. Numerical simulations of pulse convection usually have low skill, even in local predictions at high resolution, due to the inherent chaotic nature of these precipitation systems. Forecast errors can arise from assumptions within parameterization schemes, model resolution limitations, and uncertainties in both the initial state of the atmosphere and land surface variables such as soil moisture and temperature. For this study, it is hypothesized that high-resolution, consistent representations of surface properties such as soil moisture, soil temperature, and sea surface temperature (SST) are necessary to better simulate the interactions between the surface and atmosphere, and ultimately improve predictions of summertime pulse convection. This paper describes a sensitivity experiment using the Weather Research and Forecasting (WRF) model. Interpolated land and ocean surface fields from a large-scale model are replaced with high-resolution datasets provided by unique NASA assets in an experimental simulation: the Land Information System (LIS) and Moderate Resolution Imaging Spectroradiometer (MODIS) SSTs. The LIS is run in an offline mode for several years at the same grid resolution as the WRF model to provide compatible land surface initial conditions in an equilibrium state. The MODIS SSTs provide detailed analyses of SSTs over the oceans and large lakes compared to current operational products. The WRF model runs initialized with the LIS+MODIS datasets result in a reduction in the overprediction of rainfall areas; however, the skill is almost equally as low in both experiments using

  12. Preventing illegal tobacco and alcohol sales to minors through electronic age-verification devices: a field effectiveness study.

    PubMed

    Krevor, Brad; Capitman, John A; Oblak, Leslie; Cannon, Joanna B; Ruwe, Mathilda

    2003-01-01

    Efforts to prohibit the sales of tobacco and alcohol products to minors are widespread. Electronic Age Verification (EAV) devices are one possible means to improve compliance with sales-to-minors laws. The purpose of this study was to evaluate the implementation and effectiveness of EAV devices in terms of the frequency and accuracy of age verification, as well as to examine the impact of EAVs on the retailer environment. Two study locations were selected: Tallahassee, Florida and Iowa City, Iowa. Retail stores were invited to participate in the study, producing a self-selected experimental group. Stores that did not elect to test the EAVs comprised the comparison group. The data sources included: 1) mystery shopper inspections: two pre- and five post-EAV installation mystery shopper inspections of tobacco and alcohol retailers; 2) retail clerk and manager interviews; and 3) customer interviews. The study found that installing EAV devices with minimal training and encouragement did not increase age verification and underage sales refusal. Surveyed clerks reported positive experiences using the electronic ID readers and customers reported almost no discomfort about being asked to swipe their IDs. Observations from this study support the need for a more comprehensive system for responsible retailing. PMID:15015859

  13. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 1: A case study in theorem prover-based verification

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1991-01-01

    The design and formal verification of a hardware system for a task that is an important component of a fault tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (Byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover Clio. A major contribution of the work is the demonstration of a significant fault tolerant hardware design that is mechanically verified by a theorem prover.
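
    The interactive consistency task implemented by the verified hardware can be illustrated with the classic oral-messages algorithm OM(1) for four processors, which tolerates one traitor (a textbook Python sketch, not the FtCayuga design itself):

        from collections import Counter

        def majority(values):
            value, count = Counter(values).most_common(1)[0]
            return value if count > len(values) / 2 else "DEFAULT"

        def om1(commander_value, faulty=None):
            """Oral-messages algorithm OM(1): one commander plus three
            lieutenants, tolerating a single traitor. Round 1: the commander
            sends its value. Round 2: each lieutenant relays what it received;
            a faulty lieutenant relays garbage. Each loyal lieutenant then
            takes a majority vote over what it holds."""
            lieutenants = [0, 1, 2]
            received = {l: commander_value for l in lieutenants}
            decisions = {}
            for l in lieutenants:
                relayed = [("X" if other == faulty else received[other])
                           for other in lieutenants if other != l]
                decisions[l] = majority([received[l]] + relayed)
            return {l: d for l, d in decisions.items() if l != faulty}

        print(om1("ATTACK", faulty=2))   # loyal lieutenants still agree on ATTACK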

  14. TEST/QA PLAN FOR THE VERIFICATION TESTING OF ALTERNATIVE OR REFORMULATED LIQUID FUELS, FUEL ADDITIVES, FUEL EMULSIONS, AND LUBRICANTS FOR HIGHWAY AND NONROAD USE HEAVY DUTY DIESEL ENGINES AND LIGHT DUTY GASOLINE ENGINES AND VEHICLES

    EPA Science Inventory

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  15. Geometric verification

    NASA Technical Reports Server (NTRS)

    Grebowsky, G. J.

    1982-01-01

    Present LANDSAT data formats are reviewed to clarify how the geodetic location and registration capabilities were defined for P-tape products and RBV data. Since there is only one geometric model used in the master data processor, geometric location accuracy of P-tape products depends on the absolute accuracy of the model and registration accuracy is determined by the stability of the model. Due primarily to inaccuracies in data provided by the LANDSAT attitude management system, desired accuracies are obtained only by using ground control points and a correlation process. The verification of system performance with regard to geodetic location requires the capability to determine pixel positions of map points in a P-tape array. Verification of registration performance requires the capability to determine pixel positions of common points (not necessarily map points) in 2 or more P-tape arrays for a given world reference system scene. Techniques for registration verification can be more varied and automated since map data are not required. The verification of LACIE extractions is used as an example.

  16. Feasibility study of patient positioning verification in electron beam radiotherapy with an electronic portal imaging device (EPID).

    PubMed

    Ramm, U; Köhn, J; Rodriguez Dominguez, R; Licher, J; Koch, N; Kara, E; Scherf, C; Rödel, C; Weiß, C

    2014-03-01

    The purpose of this study is to demonstrate the feasibility of verification and documentation in electron beam radiotherapy using the photon contamination detected with an electronic portal imaging device. For investigation of electron beam verification with an EPID, portal images were acquired while irradiating two different tissue-equivalent phantoms at different electron energies. Measurements were performed on an Elekta SL 25 linear accelerator with an amorphous-Si electronic portal imaging device (EPID: iViewGT, Elekta Oncology Systems, Crawley, UK). As measures of EPID image quality, contrast (CR) and signal-to-noise ratio (SNR) are determined. For characterisation of the imaging performance of the EPID, RW3 slabs and a Gammex 467 phantom with different material inserts are used. With increasing electron energy the intensity of photon contamination increases, yielding an increasing signal-to-noise ratio, but the images show a decreasing contrast. As the signal-to-noise ratio saturates with increasing dose, a minimum of 50 MUs is recommended. Even though image quality depends on electron energy and the diameter of the patient, the acquired results are mostly sufficient to assess the accuracy of beam positioning. In general, online EPID acquisition has been demonstrated to be an effective electron beam verification and documentation method. The results show that this procedure can be recommended for routine and reliable use in patient treatment with electron beams.
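
    The image-quality metrics used above can be computed from region-of-interest statistics; a minimal Python sketch on a synthetic portal image follows (ROI positions and the exact CNR/SNR definitions vary between studies and are chosen here purely for illustration):

        import numpy as np

        def cnr_snr(image, roi_obj, roi_bg):
            """Contrast-to-noise and signal-to-noise ratios from two ROIs; here
            CNR = |mean_obj - mean_bg| / std_bg and SNR = mean_bg / std_bg."""
            obj, bg = image[roi_obj], image[roi_bg]
            return abs(obj.mean() - bg.mean()) / bg.std(), bg.mean() / bg.std()

        # Synthetic portal image: uniform background plus a lower-signal insert
        rng = np.random.default_rng(1)
        img = rng.normal(1000.0, 20.0, (256, 256))
        img[100:140, 100:140] -= 80.0     # simulated phantom insert
        cnr, snr = cnr_snr(img, (slice(100, 140), slice(100, 140)),
                           (slice(10, 50), slice(10, 50)))
        print(f"CNR = {cnr:.1f}, SNR = {snr:.1f}")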

  17. Benchmark Study of Industrial Needs for Additive Manufacturing in Finland

    NASA Astrophysics Data System (ADS)

    Lindqvist, Markku; Piili, Heidi; Salminen, Antti

    Additive manufacturing (AM) is a modern way to produce parts for industrial use. Even though the technical knowledge and research of AM processes are strong in Finland, there are only a few industrial applications. The aim of this study is to collect practical knowledge from companies that are interested in industrial use of AM, especially in South-Eastern Finland. A further goal of this study is to investigate the demands and requirements of applications for industrial use of AM in this area of Finland. It was concluded that two of the reasons preventing wider industrial use of AM in Finland are misplaced expectations of this technology as well as a lack of basic knowledge of its possibilities. In particular, it was noticed that the strong 3D-printing hype is even causing misunderstandings. Nevertheless, the high-level industrial know-how in the area, built around the Finnish lumber industry, is a strong foundation for additive manufacturing technology.

  18. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
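
    The method of manufactured solutions mentioned above can be shown in a few lines of Python: choose an exact solution, derive the source term analytically, and confirm that the discretization error decays at the theoretical rate (a generic sketch, not one of the paper's benchmarks):

        import numpy as np

        def solve_poisson(n):
            """Second-order FD solve of -u'' = f on (0,1), u(0) = u(1) = 0,
            where f is manufactured from the chosen solution u(x) = sin(pi x).
            Returns the max-norm discretization error."""
            x = np.linspace(0.0, 1.0, n + 1)
            h = 1.0 / n
            f = np.pi**2 * np.sin(np.pi * x[1:-1])    # manufactured source term
            A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
                 - np.diag(np.ones(n - 2), -1)) / h**2
            u = np.linalg.solve(A, f)
            return np.max(np.abs(u - np.sin(np.pi * x[1:-1])))

        e1, e2 = solve_poisson(32), solve_poisson(64)
        print(f"observed order of accuracy: {np.log2(e1 / e2):.2f}")   # ~2.0

    Recovering the scheme's theoretical order of accuracy on such a manufactured problem is the evidence a code verification benchmark provides.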

  19. Verification of Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
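
    The Weibull failure statistics central to this methodology can be sketched with the standard two-parameter volume-scaling form (the numbers below are invented for illustration; real verification uses effective volumes or surfaces derived from the stress field):

        import math

        def weibull_failure_prob(sigma, m, sigma0, v, v0):
            """Two-parameter Weibull failure probability with volume scaling:
            P_f = 1 - exp(-(V / V0) * (sigma / sigma0)**m), where sigma0 is the
            characteristic strength measured on specimens of effective volume V0."""
            return 1.0 - math.exp(-(v / v0) * (sigma / sigma0) ** m)

        # Invented numbers: coupon data (sigma0 = 300 MPa at V0 = 100 mm^3,
        # Weibull modulus m = 10) transferred to a part stressed at 100 MPa
        # with a stressed volume of 1e5 mm^3.
        pf = weibull_failure_prob(sigma=100.0, m=10, sigma0=300.0, v=1e5, v0=100.0)
        print(f"predicted failure probability: {pf:.4f}")

    The volume factor is what makes the transfer from elementary test data to a full-scale structure non-trivial: a larger stressed volume samples more flaws and fails at lower stress.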

  1. Electrostatic Levitation for Studies of Additive Manufactured Materials

    NASA Technical Reports Server (NTRS)

    SanSoucie, Michael P.; Rogers, Jan R.; Tramel, Terri

    2014-01-01

    The electrostatic levitation (ESL) laboratory at NASA's Marshall Space Flight Center is a unique facility for investigators studying high temperature materials. The laboratory boasts two levitators in which samples can be levitated, heated, melted, undercooled, and resolidified. Electrostatic levitation minimizes gravitational effects and allows materials to be studied without contact with a container or instrumentation. The lab also has a high temperature emissivity measurement system, which provides normal spectral and normal total emissivity measurements at use temperature. The ESL lab has been instrumental in many pioneering materials investigations of thermophysical properties, e.g., creep measurements, solidification, triggered nucleation, and emissivity at high temperatures. Research in the ESL lab has already led to the development of advanced high temperature materials for aerospace applications, coatings for rocket nozzles, improved medical and industrial optics, metallic glasses, ablatives for reentry vehicles, and materials with memory. Modeling of additive manufacturing materials processing is necessary for the study of the resulting material properties. In addition, modeling of the selective laser melting process and its material property predictions is also underway. Unfortunately, there is very little data for the properties of these materials, especially of the materials in the liquid state. Some method to measure thermophysical properties of additive manufacturing materials is necessary. The ESL lab is ideal for these studies. The lab can provide surface tension and viscosity of molten materials, density measurements, emissivity measurements, and even creep strength measurements. The ESL lab can also determine melting temperature, surface temperatures, and phase transition temperatures of additive manufactured materials. This presentation will provide background on the ESL lab and its capabilities, provide an approach to using the ESL

  2. A Study of Additive Noise Model for Robust Speech Recognition

    NASA Astrophysics Data System (ADS)

    Awatade, Manisha H.

    2011-12-01

    A model of how speech amplitude spectra are affected by additive noise is studied. Acoustic features are extracted based on the noise-robust parts of speech spectra without losing discriminative information. Two existing non-linear processing methods, harmonic demodulation and spectral peak-to-valley ratio locking, are designed to minimize mismatch between clean and noisy speech features. Previously studied methods, including peak isolation [1], do not require noise estimation and are effective in dealing with both stationary and non-stationary noise.
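
    The additive model of noisy speech power spectra can be checked numerically (a generic Python sketch with a synthetic two-tone "speech" signal, not the paper's feature extraction): since the speech/noise cross-term has zero mean, the power spectra add in expectation:

        import numpy as np

        rng = np.random.default_rng(0)
        fs, n, sigma = 16_000, 1024, 0.3
        t = np.arange(n) / fs
        speech = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)

        S = np.abs(np.fft.rfft(speech))**2
        # The speech/noise cross-term has zero mean, so in expectation the
        # power spectra add: E|Y|^2 = |S|^2 + E|N|^2. Average over noise draws:
        Y_avg = np.mean([np.abs(np.fft.rfft(speech + sigma * rng.standard_normal(n)))**2
                         for _ in range(500)], axis=0)
        N_exp = sigma**2 * n           # expected white-noise power per rfft bin
        err = np.mean(np.abs(Y_avg - (S + N_exp))) / np.mean(Y_avg)
        print(f"mean relative deviation from the additive model: {err:.2%}")

    On a single frame the cross-term does not vanish, which is exactly why noise-robust features favor the high-energy spectral peaks where its relative effect is smallest.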

  3. Feasibility study of a dual detector configuration concept for simultaneous megavoltage imaging and dose verification in radiotherapy

    SciTech Connect

    Deshpande, Shrikant; McNamara, Aimee L.; Holloway, Lois; Metcalfe, Peter; Vial, Philip

    2015-04-15

    Purpose: To test the feasibility of a dual detector concept for comprehensive verification of external beam radiotherapy. Specifically, the authors test the hypothesis that a portal imaging device coupled to a 2D dosimeter provides a system capable of simultaneous imaging and dose verification, and that the presence of each device does not significantly detract from the performance of the other. Methods: The dual detector configuration comprised a standard radiotherapy electronic portal imaging device (EPID) positioned directly on top of an ionization-chamber array (ICA), with 2 cm of solid water buildup material (between EPID and ICA) and 5 cm of solid backscatter material. The dose response characteristics of the ICA and the imaging performance of the EPID in the dual detector configuration were compared to the performance in their respective reference clinical configurations. The reference clinical configurations were 6 cm solid water buildup material, an ICA, and 5 cm solid water backscatter material as the reference dosimetry configuration, and an EPID with no additional buildup or solid backscatter material as the reference imaging configuration. The dose response of the ICA was evaluated by measuring the detector’s response with respect to off-axis position, field size, and transit object thickness. Clinical dosimetry performance was evaluated by measuring a range of clinical intensity-modulated radiation therapy (IMRT) beams in transit and nontransit geometries. The imaging performance of the EPID was evaluated quantitatively by measuring the contrast-to-noise ratio (CNR) and spatial resolution. Images of an anthropomorphic phantom were also used for qualitative assessment. Results: The measured off-axis and field size response with the ICA in both transit and nontransit geometries for both dual detector configuration and reference dosimetry configuration agreed to within 1%. Transit dose response as a function of object thickness agreed to within 0.5%. All

  4. Fixed-point arithmetic for mobile devices: a fingerprinting verification case study

    NASA Astrophysics Data System (ADS)

    Moon, Yiu S.; Luk, Franklin T.; Ho, Ho C.; Tang, T. Y.; Chan, Kit C.; Leung, C. W.

    2002-12-01

    Mobile devices use embedded processors with low computing capabilities to reduce power consumption. Since floating-point arithmetic units are power hungry, computationally intensive jobs must be accomplished with either digital signal processors or hardware co-processors. In this paper, we propose to perform fixed-point arithmetic on an integer hardware unit. We illustrate the advantages of our approach by implementing fingerprint verification on mobile devices.
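
    The fixed-point approach can be sketched in a few lines of Python emulating Q15 arithmetic, the 16-bit fractional format common on integer-only embedded cores (the rounding and saturation policies here are one choice among several):

        # Minimal Q15 fixed-point sketch: values in [-1, 1) stored as value * 2**15
        Q = 15

        def to_q15(x: float) -> int:
            return max(-1 << Q, min((1 << Q) - 1, round(x * (1 << Q))))

        def q15_mul(a: int, b: int) -> int:
            # The 32-bit product carries 30 fractional bits; shift back to 15
            # (Python's >> floors toward negative infinity)
            return (a * b) >> Q

        def q15_to_float(a: int) -> float:
            return a / (1 << Q)

        # A correlation-style product such as a fingerprint matcher might use
        a, b = to_q15(0.7071), to_q15(-0.25)
        print(q15_to_float(q15_mul(a, b)), 0.7071 * -0.25)   # ~ -0.1768 both ways

    All operations reduce to integer multiplies and shifts, which is what makes the scheme attractive on processors without a floating-point unit.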

  5. Expert system verification and validation study. Phase 2: Requirements Identification. Delivery 2: Current requirements applicability

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The second phase of a task is described which has the ultimate purpose of ensuring that adequate Expert Systems (ESs) Verification and Validation (V and V) tools and techniques are available for Space Station Freedom Program Knowledge Based Systems development. The purpose of this phase is to recommend modifications to current software V and V requirements which will extend the applicability of the requirements to NASA ESs.

  6. A feasibility study of treatment verification using EPID cine images for hypofractionated lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Tang, Xiaoli; Lin, Tong; Jiang, Steve

    2009-09-01

    We propose a novel approach for potential online treatment verification using cine EPID (electronic portal imaging device) images for hypofractionated lung radiotherapy based on a machine learning algorithm. Hypofractionated radiotherapy requires high precision. It is essential to effectively monitor the target to ensure that the tumor is within the beam aperture. We modeled the treatment verification problem as a two-class classification problem and applied an artificial neural network (ANN) to classify the cine EPID images acquired during the treatment into corresponding classes—with the tumor inside or outside of the beam aperture. Training samples were generated for the ANN using digitally reconstructed radiographs (DRRs) with artificially added shifts in the tumor location—to simulate cine EPID images with different tumor locations. Principal component analysis (PCA) was used to reduce the dimensionality of the training samples and cine EPID images acquired during the treatment. The proposed treatment verification algorithm was tested on five hypofractionated lung patients in a retrospective fashion. On average, our proposed algorithm achieved a 98.0% classification accuracy, a 97.6% recall rate and a 99.7% precision rate. This work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
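
    A minimal sketch of this kind of pipeline (PCA for dimensionality reduction feeding a small two-class neural network) is shown below using scikit-learn, with random arrays standing in for the DRR training images and cine EPID frames; it mirrors the structure of the method, not the authors' implementation.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neural_network import MLPClassifier
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(42)
      X_train = rng.random((200, 64 * 64))   # flattened DRRs with simulated tumor shifts
      y_train = rng.integers(0, 2, 200)      # 1 = tumor inside aperture, 0 = outside
      X_test = rng.random((20, 64 * 64))     # flattened cine EPID frames

      model = make_pipeline(
          PCA(n_components=20),              # reduce dimensionality before the ANN
          MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0),
      )
      model.fit(X_train, y_train)
      print(model.predict(X_test))           # per-frame in/out classification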

  7. Microwave sanitization of color additives used in cosmetics: feasibility study.

    PubMed

    Jasnow, S B; Smith, J L

    1975-08-01

    Microwave exposure has been explored as a method of microbiologically sanitizing color additives used in cosmetic products. Selected microbiologically unacceptable cosmetic color additives, D&C red no. 7 Ca lake (certified synthetic organic color), carmine (natural organic color not subject to certification), and chromium hydroxide green (inorganic color not subject to certification), were submitted to microwave exposure. Gram-negative bacteria were eliminated, as verified by enrichment procedures, and levels of gram-positive bacteria were reduced. Generally, analytical and dermal safety studies indicated no significant alterations in physical, chemical, and toxicological properties of the colors. Sanitization was also successfully performed on other colors (D&C red no. 9 Ba lake, D&C red no. 12 Ba lake, D&C green no. 5, and FD&C red no. 4); initial physical and chemical tests were satisfactory. Results indicated that this method of sanitization is feasible and warrants further investigation.

  8. BIG FROG WILDERNESS STUDY AREA AND ADDITIONS, TENNESSEE AND GEORGIA.

    USGS Publications Warehouse

    Slack, John F.; Gazdik, Gertrude C.

    1984-01-01

    A mineral-resource survey was made of the Big Frog Wilderness Study Area and additions, Tennessee-Georgia. Geochemical sampling found traces of gold, zinc, copper, and arsenic in rocks, stream sediments, and panned concentrates, but not in sufficient quantities to indicate the presence of deposits of these metals. The results of the survey indicate that there is little promise for the occurrence of metallic mineral deposits within the study area. The only apparent resources are nonmetallic commodities including rock suitable for construction materials, and small amounts of sand and gravel; however, these commodities are found in abundance outside the study area. A potential may exist for oil and natural gas at great depths, but this cannot be evaluated by the present study.

  9. Recommended Protocol for Round Robin Studies in Additive Manufacturing

    PubMed Central

    Moylan, Shawn; Brown, Christopher U.; Slotwinski, John

    2016-01-01

    One way to improve confidence and encourage proliferation of additive manufacturing (AM) technologies and parts is by generating more high quality data describing the performance of AM processes and parts. Many in the AM community see round robin studies as a way to generate large data sets while distributing the cost among the participants, thereby reducing the cost to individual users. The National Institute of Standards and Technology (NIST) has conducted and participated in several of these AM round robin studies. While the results of these studies are interesting and informative, many of the lessons learned in conducting these studies concern the logistics and methods of the study and unique issues presented by AM. Existing standards for conducting interlaboratory studies of measurement methods, along with NIST’s experience, form the basis for recommended protocols for conducting AM round robin studies. The role of round robin studies in AM qualification, some of the limitations of round robin studies, and the potential benefit of less formal collaborative experiments where multiple factors, AM machine being only one, are varied simultaneously are also discussed. PMID:27274602

  10. Making intelligent systems team players: Additional case studies

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schreckenghost, Debra L.; Rhoads, Ron W.

    1993-01-01

    Observations from a case study of intelligent systems are reported as part of a multi-year interdisciplinary effort to provide guidance and assistance for designers of intelligent systems and their user interfaces. A series of studies was conducted to investigate issues in designing intelligent fault management systems in aerospace applications for effective human-computer interaction. The results of the initial study are documented in two NASA technical memoranda: TM 104738, Making Intelligent Systems Team Players: Case Studies and Design Issues, Volumes 1 and 2; and TM 104751, Making Intelligent Systems Team Players: Overview for Designers. The objective of this additional study was to broaden the investigation of human-computer interaction design issues beyond the initial study's focus on monitoring and fault detection. The results of this second study are documented here as a supplement to the original design guidance documents. These results should be of interest to designers of intelligent systems for use in real-time operations, and to researchers in the areas of human-computer interaction and artificial intelligence.

  11. Is flow verification necessary

    SciTech Connect

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper.
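
    For orientation, one of the widely known accountancy statistics alluded to above is the materials balance, often called MUF (material unaccounted for); the sketch below evaluates it for invented numbers. It is background context only, not the specific no-flow-verification statistic the paper defines.

      # Standard materials-balance accountancy statistic:
      # MUF = beginning inventory + receipts - shipments - ending inventory.
      # Nonzero values are then tested against measurement uncertainty.
      def muf(begin_inv, receipts, shipments, end_inv):
          return begin_inv + sum(receipts) - sum(shipments) - end_inv

      balance = muf(begin_inv=120.0, receipts=[40.0, 35.5],
                    shipments=[38.2, 41.0], end_inv=115.9)
      print(f"MUF = {balance:+.1f} kg")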

  12. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  13. RAMSEYS DRAFT WILDERNESS STUDY AREA AND ADDITION, VIRGINIA.

    USGS Publications Warehouse

    Lesure, Frank G.; Mory, Peter C.

    1984-01-01

    Mineral-resource surveys of the Ramseys Draft Wilderness Study Area and the adjoining roadless area addition in George Washington National Forest, in the western Valley and Ridge province, Augusta and Highland Counties, Virginia, were conducted. The surveys outlined three small areas containing anomalous amounts of copper, lead, and zinc related to stratabound red-bed copper mineralization, but these occurrences are not large and are not considered to have mineral-resource potential. The area contains abundant sandstone suitable for construction materials and shale suitable for making brick, tile, and other low-grade ceramic products, but these commodities occur in abundance outside the wilderness study area. Structural conditions are probably favorable for the accumulation of natural gas, but exploratory drilling has not been done sufficiently near the area to evaluate the gas potential.

  14. Characteristics of scale models of large deployable mesh reflector antennas and study on space verification test plan

    NASA Astrophysics Data System (ADS)

    Ebisui, Takashi; Iso, Akio; Orikasa, Teruaki; Sugimoto, Toshio; Okamoto, Teruki; Ueno, Miyoshi

    A large deployable antenna is essential for effective mobile communication satellites. This paper describes the characteristics of various scale models of large deployable mesh reflector antennas and a study on a space verification test plan using the scale models. Two electrical scale models of the mesh reflectors have been constructed to evaluate the electrical performance of a mesh reflector antenna. One of the models is the Hexa-Link Truss structure, and the other is the TETRUS structure. The diameter of each model is 3 m. The experimental measurements and calculations for these electrical and mechanical models are also described.

  15. Expert system verification and validation study. Phase 2: Requirements identification. Delivery 1: Updated survey report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to report the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) in current NASA and industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues is being addressed and to what extent they have impacted the development of Expert Systems.

  16. Verification of Multiphysics software: Space and time convergence studies for nonlinearly coupled applications

    SciTech Connect

    Jean C. Ragusa; Vijay Mahadevan; Vincent A. Mousseau

    2009-05-01

    High-fidelity modeling of nuclear reactors requires the solution of a nonlinear coupled multi-physics stiff problem with widely varying time and length scales that need to be resolved correctly. A numerical method that converges the implicit nonlinear terms to a small tolerance is often referred to as nonlinearly consistent (or tightly coupled). This nonlinear consistency is still lacking in the vast majority of coupling techniques today. We present a tightly coupled multiphysics framework that tackles this issue and present code-verification and convergence analyses in space and time for several models of nonlinear coupled physics.
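
    A typical building block of such convergence analyses is the observed order of accuracy, estimated from errors at successively refined step sizes; the sketch below shows the calculation for hypothetical error values, not the paper's results.

      import math

      def observed_order(errors, ratio=2.0):
          # errors[i] is the error at step size dt0 / ratio**i; consecutive
          # pairs give local estimates of the convergence order p.
          return [math.log(e1 / e2) / math.log(ratio)
                  for e1, e2 in zip(errors, errors[1:])]

      errs = [4.0e-3, 1.0e-3, 2.5e-4]   # hypothetical errors for dt, dt/2, dt/4
      print(observed_order(errs))       # ~[2.0, 2.0] -> second order in time

    A nonlinearly consistent (tightly coupled) scheme should recover the design order of the time integrator in such a study, whereas loose operator-split coupling typically degrades it toward first order.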

  17. Study of defect verification based on lithography simulation with a SEM system

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Shingo; Fujii, Nobuaki; Kanno, Koichi; Imai, Hidemichi; Hayano, Katsuya; Miyashita, Hiroyuki; Shida, Soichi; Murakawa, Tsutomu; Kuribara, Masayuki; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hara, Daisuke; Pang, Linyong

    2015-07-01

    In a Photomask manufacturing process, mask defect inspection is an increasingly important topic for 193nm optical lithography. Further extension of 193nm optical lithography to the next technology nodes, staying at a maximum numerical aperture (NA) of 1.35, pushes lithography to its utmost limits. This extension from technologies like ILT and SMO requires more complex mask patterns. In mask defect inspection, defect verification becomes more difficult because many nuisance defects are detected in aggressive mask features. One of the solutions is lithography simulation like AIMS. An issue with AIMS, however, is the low throughput of measurement, analysis etc.

  18. Requirements of Operational Verification of the NWSRFS-ESP Forecasts

    NASA Astrophysics Data System (ADS)

    Imam, B.; Werner, K.; Hartmann, H.; Sorooshian, S.; Pritchard, E.

    2006-12-01

    Forecast verification is the process of determining the quality of forecasts. This requires the utilization of quality measures that summarize one or more aspects of the relationship between forecasts and observations. Technically, the three main objectives of forecast verification are (a) monitoring, (b) improving, and (c) comparing the quality of different forecasting systems. However, users of forecast verification results range from administrators, who want to know the value of investing in forecast system improvement, to forecasters and modelers, who want to assess areas for improving their own predictions, to forecast users, who weigh their decisions based not only on the forecast but also on the perceived quality of that forecast. Our discussions with several forecasters and hydrologists in charge at various River Forecast Centers (RFCs) indicated that operational hydrologists view verification in a broader sense than their counterparts within the meteorological community. Their view encompasses verification as a possible tool in determining whether a forecast is ready for issuance as an "official" product or needs more work. In addition to the common challenges associated with verification of monthly and seasonal probabilistic forecasts, which include determining and obtaining an appropriately sized set of forecast-observation pairs, operational verification also requires the consideration of verification strategies for short-term forecasts. Under such conditions, the identification of conditional verification (i.e., similar conditions) samples, the tracking of model states, input, and output relative to their climatology, and the establishment of links between the forecast issuance, verification, and simulation components of the forecast system become important. In this presentation, we address the impacts of such a view on the potential requirements of an operational verification system for the Ensemble Streamflow Prediction (ESP) component of the
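
    As a concrete example of a quality measure of the kind discussed above, the sketch below computes the Brier score and a climatology-referenced skill score for a toy probabilistic forecast; the ESP requirements themselves do not prescribe this particular metric.

      import numpy as np

      def brier_score(p_forecast, observed):
          p = np.asarray(p_forecast, dtype=float)
          o = np.asarray(observed, dtype=float)   # 1 if the event occurred, else 0
          return np.mean((p - o) ** 2)

      p = [0.9, 0.2, 0.7, 0.1]                    # forecast event probabilities
      o = [1, 0, 1, 1]                            # observed outcomes
      bs = brier_score(p, o)
      bs_clim = brier_score([np.mean(o)] * len(o), o)   # climatological baseline
      print(f"BS = {bs:.3f}, BSS = {1 - bs / bs_clim:.3f}")   # BSS > 0 beats climatology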

  19. Experimental Study of Additives on Viscosity biodiesel at Low Temperature

    NASA Astrophysics Data System (ADS)

    Fajar, Berkah; Sukarno

    2015-09-01

    An experimental investigation was performed to determine the viscosity of additive and biodiesel fuel mixtures in the temperature range from 283 K to 318 K. One solution for reducing the viscosity of biodiesel is to blend it with additives. The viscosity was measured using a Brookfield Rheometer DV-II. The additives were a generic additive (diethyl ether, DEE) and the commercial additive Viscoplex 10-330 CFI. Each biodiesel blend had an additive concentration of 0.0, 0.25, 0.5, 0.75, 1.0, or 1.25% vol. The temperature of the biodiesel was controlled from 40°C to 0°C. The viscosity of a biodiesel and additive mixture at constant temperature can be approximated by a polynomial equation, and at constant concentration by an exponential equation. The optimum mixture is at 0.75% for diethyl ether and 0.5% for Viscoplex.
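
    The two fitted forms reported above can be reproduced schematically as follows: a polynomial in concentration at fixed temperature and an exponential (Arrhenius-like) form in temperature at fixed concentration. The viscosity values below are invented placeholders, not the measured data.

      import numpy as np
      from scipy.optimize import curve_fit

      conc = np.array([0.0, 0.25, 0.5, 0.75, 1.0, 1.25])       # % vol
      visc_c = np.array([6.1, 5.8, 5.4, 5.2, 5.3, 5.5])        # mPa*s at fixed T (hypothetical)
      poly = np.polyfit(conc, visc_c, deg=2)                   # polynomial fit vs concentration
      print("polynomial coefficients:", poly)

      T = np.array([283.0, 293.0, 303.0, 313.0, 318.0])        # K
      visc_T = np.array([9.5, 7.1, 5.6, 4.5, 4.1])             # mPa*s at fixed conc (hypothetical)
      expfit = lambda T, a, b: a * np.exp(b / T)               # exponential fit vs temperature
      (a, b), _ = curve_fit(expfit, T, visc_T, p0=(0.1, 1000.0))
      print(f"eta(T) ~ {a:.3g} * exp({b:.0f}/T)")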

  20. Satellite Power Systems (SPS) concept definition study. Volume 6: SPS technology requirements and verification

    NASA Technical Reports Server (NTRS)

    Hanley, G.

    1978-01-01

    Volume 6 of the SPS Concept Definition Study is presented and also incorporates results of NASA/MSFC in-house effort. This volume includes a supporting research and technology summary. Other volumes of the final report that provide additional detail are as follows: (1) Executive Summary; (2) SPS System Requirements; (3) SPS Concept Evolution; (4) SPS Point Design Definition; (5) Transportation and Operations Analysis; and Volume 7, SPS Program Plan and Economic Analysis.

  1. Additive Manufacturing in Production: A Study Case Applying Technical Requirements

    NASA Astrophysics Data System (ADS)

    Ituarte, Iñigo Flores; Coatanea, Eric; Salmi, Mika; Tuomi, Jukka; Partanen, Jouni

    Additive manufacturing (AM) is expanding manufacturing capabilities. However, the quality of AM-produced parts depends on a number of machine, geometry, and process parameters. The variability of these parameters affects manufacturing drastically, and therefore standardized processes and harmonized methodologies need to be developed to characterize the technology for end-use applications and enable the technology for manufacturing. This research proposes a composite methodology integrating Taguchi Design of Experiments, multi-objective optimization, and statistical process control to optimize the manufacturing process and fulfil multiple requirements imposed on an arbitrary geometry. The proposed methodology aims to characterize AM technology as a function of manufacturing process variables as well as to perform a comparative assessment of three AM technologies (Selective Laser Sintering, Laser Stereolithography, and Polyjet). Results indicate that only one machine, the laser-based stereolithography system, could simultaneously fulfil the macro- and micro-level geometrical requirements, but its mechanical properties were not at the required level. Future research will study a single AM system at a time to characterize AM machine technical capabilities and stimulate pre-normative initiatives of the technology for end-use applications.
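
    One elementary ingredient of the Taguchi portion of such a methodology is the signal-to-noise ratio computed over replicated measurements; the sketch below shows the smaller-the-better form for invented dimensional-error data, as a hedged illustration rather than the paper's actual analysis.

      import numpy as np

      def sn_smaller_the_better(y):
          # Taguchi S/N ratio for a response to be minimized (e.g., dimensional error).
          y = np.asarray(y, dtype=float)
          return -10.0 * np.log10(np.mean(y ** 2))

      # Replicated dimensional deviations (mm) for three hypothetical runs
      runs = {"run1": [0.12, 0.10, 0.14],
              "run2": [0.05, 0.07, 0.06],
              "run3": [0.20, 0.18, 0.22]}
      for name, y in runs.items():
          print(name, f"S/N = {sn_smaller_the_better(y):+.1f} dB")   # higher is better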

  2. Study for verification testing of the helmet-mounted display in the Japanese Experimental Module.

    PubMed

    Nakajima, I; Yamamoto, I; Kato, H; Inokuchi, S; Nemoto, M

    2000-02-01

    Our purpose is to propose a research and development project in the field of telemedicine. The proposed Multimedia Telemedicine Experiment for Extra-Vehicular Activity will entail experiments designed to support astronaut health management during Extra-Vehicular Activity (EVA). Experiments will have relevant applications to the Japanese Experimental Module (JEM) operated by National Space Development Agency of Japan (NASDA) for the International Space Station (ISS). In essence, this is a proposal for verification testing of the Helmet-Mounted Display (HMD), which enables astronauts to verify their own blood pressures and electrocardiograms, and to view a display of instructions from the ground station and listings of work procedures. Specifically, HMD is a device designed to project images and data inside the astronaut's helmet. We consider this R&D proposal to be one of the most suitable projects under consideration in response to NASDA's open invitation calling for medical experiments to be conducted on JEM.

  3. Simulation verification techniques study: Simulation self test hardware design and techniques report

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The final results of the hardware verification task are presented. The basic objectives of the various subtasks are reviewed, along with the ground rules under which the overall task was conducted and which impacted the approach taken in deriving techniques for hardware self test. The results of the first subtask, the definition of simulation hardware, are presented. The hardware definition is based primarily on a brief review of the simulator configurations anticipated for the shuttle training program. The results of the survey of current self test techniques are presented. The data sources that were considered in the search for current techniques are reviewed, and results of the survey are presented in terms of the specific types of tests that are of interest for training simulator applications. Specifically, these types of tests are readiness tests, fault isolation tests, and incipient fault detection techniques. The most applicable techniques were structured into software flows that are then referenced in discussions of techniques for specific subsystems.

  4. Soil moisture verification study of the ESTAR microwave radiometer - Walnut Gulch, AZ 1991

    NASA Technical Reports Server (NTRS)

    Jackson, T. J.; Le Vine, D. M.; Griffis, A.; Goodrich, D. C.; Schmugge, T. J.; Swift, C. T.; O'Neill, P. E.; Roberts, R. R.; Parry, R.

    1992-01-01

    The application of an electronically steered thinned array L-band radiometer (ESTAR) for soil moisture mapping is investigated over the arid rangeland Walnut Gulch Watershed. Antecedent rainfall and evaporation for the flights are very different and result in a wide range of soil moisture conditions. The high spatial variability of rainfall events within this region results in moisture conditions with dramatic spatial patterns. Sensor performance is verified using two approaches. Microwave data are used in conjunction with a microwave emission model to predict soil moisture. These predictions are compared to ground observations of soil moisture. A second verification is possible using an extensive data set. Both tests showed that the ESTAR is capable of providing soil moisture with the same level of accuracy as existing systems.

  5. Expert system verification and validation study: ES V/V Workshop

    NASA Technical Reports Server (NTRS)

    French, Scott; Hamilton, David

    1992-01-01

    The primary purpose of this document is to build a foundation for applying principles of verification and validation (V&V) of expert systems. To achieve this, some background on V&V as applied to conventionally implemented software is required. Part one will discuss the background of V&V from the perspective of (1) what V&V of software is and (2) V&V's role in developing software. Part one will also overview some common analysis techniques that are applied when performing V&V of software. All of these materials will be presented based on the assumption that the reader has little or no background in V&V or in developing procedural software. The primary purpose of part two is to explain the major techniques that have been developed for V&V of expert systems.

  6. Expert system verification and validation study. ES V/V guidelines/workshop conference summary

    NASA Technical Reports Server (NTRS)

    French, Scott W.; Hamilton, David

    1992-01-01

    The intent of the workshop was to start moving research on the verification and validation (V&V) of knowledge based systems (KBSs) in the direction of providing tangible 'products' that a KBS developer could use. In the near term research will focus on identifying the kinds of experiences encountered during KBS development of 'real' KBSs. These will be stored in a repository and will serve as the foundation for the rest of the activities described here. One specific approach to be pursued is 'benchmarking'. With this approach, a KBS developer can use either 'canned' KBSs with seeded errors or existing KBSs with known errors to evaluate a given tool's ability to satisfactorily identify errors.

  7. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
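
    The toy sketch below caricatures the swarm idea: many cheap, differently seeded searches run in parallel instead of one exhaustive search. It is a random-sampling stand-in for illustration only, not the SPIN-based implementation the authors describe (real swarms also diversify search strategies, not just seeds).

      from multiprocessing import Pool
      import random

      NEEDLE = 987654   # hypothetical error state in a space of one million states

      def worker(seed, steps=200_000):
          # Each worker samples a bounded slice of the state space with its own seed.
          rng = random.Random(seed)
          for _ in range(steps):
              if rng.randrange(1_000_000) == NEEDLE:
                  return seed
          return None

      if __name__ == "__main__":
          with Pool(8) as pool:
              hits = [s for s in pool.map(worker, range(8)) if s is not None]
          print("needle found by seeds:", hits)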

  8. Health studies indicate MTBE is safe gasoline additive

    SciTech Connect

    Anderson, E.V.

    1993-09-01

    Implementation of the oxygenated fuels program by EPA in 39 metropolitan areas, including Fairbanks and Anchorage, Alaska, in the winter of 1992, encountered some unexpected difficulties. Complaints of headaches, dizziness, nausea, and irritated eyes started in Fairbanks, jumped to Anchorage, and popped up in various locations in the lower 48 states. The suspected culprit behind these complaints was methyl tert-butyl ether (MTBE), the main additive used for oxygenation of gasoline. A test program, hastily organized in response to these complaints, has indicated that MTBE is a safe gasoline additive. However, official certification of the safety of MTBE is still awaited.

  9. Kinetic study of additions of dialkylmagnesium compounds to a cyclopropene

    SciTech Connect

    Watkins, E.K.; Richey, H.G. Jr.

    1992-11-01

    Reaction of Et2Mg and spiro[2.4]hept-1-ene (1) in tetrahydrofuran followed by hydrolysis furnishes mainly 1-ethylspiro[2.4]heptane (3); when hydrolysis is with D2O, ≥98% of this is (Z)-1-ethylspiro[2.4]heptane-2-d (4). Some metalation of 1 and formation of higher molecular weight products incorporating two or three molecules of 1 also take place. Formation of 3 is first order in 1 and in Et2Mg, and at 35.47°C the rate constant is 1.2 x 10^-5 L mol^-1 s^-1. Under the same conditions, the rate of addition of the Grignard reagent prepared from EtBr is similar (1.5 x 10^-5 L mol^-1 s^-1). Reactions of 1 with Me2Mg, i-Pr2Mg, and t-Bu2Mg were also examined. Added Fe(acac)3 increases the rate of formation of 3 from reactions of 1 with either Et2Mg or the Grignard reagent prepared from EtBr, but additional products also are formed. 55 refs., 2 tabs.

  10. Climate studies from satellite observations - Special problems in the verification of earth radiation balance, cloud climatology, and related climate experiments

    NASA Technical Reports Server (NTRS)

    Vonder Haar, T. H.

    1982-01-01

    A body of techniques that have been developed and planned for use during the Earth Radiation Budget Experiment (ERBE), the International Satellite Cloud Climatology Project (ISCCP), and related climate experiments of the 1980's is reviewed. Validation and verification methods must apply to systems of satellites. They include: (1) use of a normalization or intercalibration satellite, (2) special intensive observation areas located over ground-truth sites, and (3) monitoring of sun and earth by several satellites and/or several instruments at the same time. Since each climate application area has a hierarchy of user communities, validation techniques vary from very detailed methods to those that simply assure high relative accuracy in detecting space and time variations for climate studies. It is shown that climate experiments generally require more emphasis on long-term stability and internal consistency of satellite data sets than on high absolute accuracy.

  11. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements: ...

  12. A Preliminary Study of In-House Monte Carlo Simulations: An Integrated Monte Carlo Verification System

    SciTech Connect

    Mukumoto, Nobutaka; Tsujii, Katsutomo; Saito, Susumu; Yasunaga, Masayoshi; Takegawa, Hidek; Yamamoto, Tokihiro; Numasaki, Hodaka; Teshima, Teruki

    2009-10-01

    Purpose: To develop an infrastructure for the integrated Monte Carlo verification system (MCVS) to verify the accuracy of conventional dose calculations, which often fail to accurately predict dose distributions, mainly due to inhomogeneities in the patient's anatomy, for example, in lung and bone. Methods and Materials: The MCVS consists of the graphical user interface (GUI) based on a computational environment for radiotherapy research (CERR) with MATLAB language. The MCVS GUI acts as an interface between the MCVS and a commercial treatment planning system to import the treatment plan, create MC input files, and analyze MC output dose files. The MCVS consists of the EGSnrc MC codes, which include EGSnrc/BEAMnrc to simulate the treatment head and EGSnrc/DOSXYZnrc to calculate the dose distributions in the patient/phantom. In order to improve computation time without approximations, an in-house cluster system was constructed. Results: The phase-space data of a 6-MV photon beam from a Varian Clinac unit was developed and used to establish several benchmarks under homogeneous conditions. The MC results agreed with the ionization chamber measurements to within 1%. The MCVS GUI could import and display the radiotherapy treatment plan created by the MC method and various treatment planning systems, such as RTOG and DICOM-RT formats. Dose distributions could be analyzed by using dose profiles and dose volume histograms and compared on the same platform. With the cluster system, calculation time was improved in line with the increase in the number of central processing units (CPUs) at a computation efficiency of more than 98%. Conclusions: Development of the MCVS was successful for performing MC simulations and analyzing dose distributions.
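
    One of the analysis features mentioned above, the cumulative dose-volume histogram, can be sketched in a few lines; the dose grid and spherical structure mask below are synthetic stand-ins, not MCVS output.

      import numpy as np

      def cumulative_dvh(dose, mask, bins=100):
          # Fraction of the structure volume receiving at least each dose level.
          d = dose[mask]
          edges = np.linspace(0.0, d.max(), bins)
          volume = np.array([(d >= e).mean() * 100.0 for e in edges])
          return edges, volume

      rng = np.random.default_rng(1)
      dose = rng.gamma(shape=9.0, scale=0.5, size=(32, 32, 32))   # Gy, synthetic grid
      z, y, x = np.ogrid[:32, :32, :32]
      mask = (x - 16) ** 2 + (y - 16) ** 2 + (z - 16) ** 2 < 10 ** 2   # spherical "structure"
      edges, vol = cumulative_dvh(dose, mask)
      print(f"D50 ~ {edges[np.argmin(np.abs(vol - 50.0))]:.2f} Gy")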

  13. Psychiatric Residents' Attitudes toward and Experiences with the Clinical-Skills Verification Process: A Pilot Study on U.S. and International Medical Graduates

    ERIC Educational Resources Information Center

    Rao, Nyapati R.; Kodali, Rahul; Mian, Ayesha; Ramtekkar, Ujjwal; Kamarajan, Chella; Jibson, Michael D.

    2012-01-01

    Objective: The authors report on a pilot study of the experiences and perceptions of foreign international medical graduate (F-IMG), United States international medical graduate (US-IMG), and United States medical graduate (USMG) psychiatric residents with the newly mandated Clinical Skills Verification (CSV) process. The goal was to identify and…

  14. NMR relaxometry study of plaster mortar with polymer additives

    SciTech Connect

    Jumate, E.; Manea, D.; Moldovan, D.; Fechete, R.

    2013-11-13

    Cement mixed with water forms a plastic paste or slurry that stiffens over time and finally hardens into a resistant stone. The addition of sand aggregates, polymers (Walocel) and/or calcium carbonate dramatically modifies the final mechanical and thermal properties of the mortar. The hydration processes can be observed using 1D NMR measurements of transverse T2 relaxation time distributions analysed by a Laplace inversion algorithm. These distributions were obtained for mortar paste measured 2 hours after preparation and then at 3, 7, and 28 days after preparation. Multiple components are identified in the T2 distributions. These can be associated with protons bound chemically or physically to the mortar minerals, characterized by a short T2 relaxation time, and with water protons in pores of three different sizes, as observed from SEM images. The evaporation process is fastest in the first hours after preparation, while mortar hydration (bonding of water molecules to mortar minerals) can still be observed days or months after preparation. Finally, the mechanical resistance was correlated with the transverse T2 relaxation rates corresponding to the bound water.
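
    The analysis step described above, a Laplace inversion of the decay signal, is commonly implemented as regularized non-negative least squares; the sketch below recovers a two-component T2 distribution from a synthetic decay. The kernel, regularization weight, and data are assumptions for illustration, not the paper's processing chain.

      import numpy as np
      from scipy.optimize import nnls

      t = np.linspace(0.001, 1.0, 200)                  # s, echo times (synthetic)
      T2 = np.logspace(-3, 0, 50)                       # s, trial relaxation times
      K = np.exp(-t[:, None] / T2[None, :])             # exponential kernel matrix

      # Synthetic two-component decay: bound protons (10 ms) + pore water (300 ms)
      signal = 0.6 * np.exp(-t / 0.010) + 0.4 * np.exp(-t / 0.300)

      lam = 0.1                                          # Tikhonov regularization weight
      K_aug = np.vstack([K, lam * np.eye(len(T2))])
      s_aug = np.concatenate([signal, np.zeros(len(T2))])
      amplitudes, _ = nnls(K_aug, s_aug)                 # non-negative inversion
      print("peak T2 components (s):", T2[amplitudes > 0.05])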

  15. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    SciTech Connect

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers, when used with a measurement system, allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether, in a biological weapons verification regime, the presence or absence of a weapon pathogen can be determined without revealing sensitive or proprietary information contained in the genetic materials being declared. This study indicates that a verification regime could be constructed using a small number of pathogens that span the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the

  16. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  17. Monte Carlo verification of polymer gel dosimetry applied to radionuclide therapy: a phantom study

    NASA Astrophysics Data System (ADS)

    Gear, J. I.; Charles-Edwards, E.; Partridge, M.; Flux, G. D.

    2011-11-01

    This study evaluates the dosimetric performance of the polymer gel dosimeter 'Methacrylic and Ascorbic acid in Gelatin, initiated by Copper' and its suitability for quality assurance and analysis of I-131-targeted radionuclide therapy dosimetry. Four batches of gel were manufactured in-house, and sets of calibration vials and phantoms were created containing different concentrations of I-131-doped gel. Multiple dose measurements were made up to 700 h post preparation and compared to equivalent Monte Carlo simulations. In addition to uniformly filled phantoms, the cross-dose distribution from a hot insert to a surrounding phantom was measured. In this example comparisons were made with both Monte Carlo and a clinical scintigraphic dosimetry method. Dose-response curves generated from the calibration data followed a sigmoid function. The gels appeared to be stable over many weeks of internal irradiation, with a delay in gel response observed at 29 h post preparation. This was attributed to chemical inhibitors and slow reaction rates of long-chain radical species. For this reason, phantom measurements were only made after 190 h of irradiation. For uniformly filled phantoms of I-131, the accuracy of dose measurements agreed to within 10% when compared to Monte Carlo simulations. A radial cross-dose distribution measured using the gel dosimeter compared well to that calculated with Monte Carlo. Small inhomogeneities were observed in the dosimeter, attributed to non-uniform mixing of monomer during preparation. However, they were not detrimental to this study, where the quantitative accuracy and spatial resolution of polymer gel dosimetry were far superior to those obtained using scintigraphy. The difference between Monte Carlo and gel measurements was of the order of a few cGy, whilst with the scintigraphic method differences of up to 8 Gy were observed. A manipulation technique is also presented which allows 3D scintigraphic dosimetry measurements to be compared to polymer
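
    The sigmoid dose-response behaviour noted above can be captured with a four-parameter logistic fit; the sketch below does this for invented calibration points (both the functional form and the data are assumptions, not the paper's values).

      import numpy as np
      from scipy.optimize import curve_fit

      def sigmoid(dose, r_min, r_max, d50, k):
          # Four-parameter logistic: response rises from r_min to r_max around d50.
          return r_min + (r_max - r_min) / (1.0 + np.exp(-(dose - d50) / k))

      dose = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 12.0, 16.0])   # Gy, hypothetical
      r2 = np.array([1.2, 1.5, 2.4, 4.0, 5.3, 6.2, 6.4])       # 1/s, hypothetical R2 response
      params, _ = curve_fit(sigmoid, dose, r2, p0=(1.0, 6.5, 6.0, 2.0))
      print("fitted (r_min, r_max, D50, k):", np.round(params, 2))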

  18. Additional EIPC Study Analysis: Interim Report on High Priority Topics

    SciTech Connect

    Hadley, Stanton W

    2013-11-01

    Between 2010 and 2012 the Eastern Interconnection Planning Collaborative (EIPC) conducted a major long-term resource and transmission study of the Eastern Interconnection (EI). With guidance from a Stakeholder Steering Committee (SSC) that included representatives from the Eastern Interconnection States Planning Council (EISPC) among others, the project was conducted in two phases. Phase 1 involved a long-term capacity expansion analysis that involved creation of eight major futures plus 72 sensitivities. Three scenarios were selected for more extensive transmission- focused evaluation in Phase 2. Five power flow analyses, nine production cost model runs (including six sensitivities), and three capital cost estimations were developed during this second phase. The results from Phase 1 and 2 provided a wealth of data that could be examined further to address energy-related questions. A list of 13 topics was developed for further analysis; this paper discusses the first five.

  19. Additional studies for the spectrophotometric measurement of iodine in water

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Previous work in iodine spectroscopy is briefly reviewed. Continued studies of the direct spectrophotometric determination of aqueous iodine complexed with potassium iodide show that free iodine is optimally determined at the isosbestic point for these solutions. The effects of turbidity and of trace amounts of chemical substances on iodine determinations are discussed and illustrated. At the levels tested, iodine measurements are not significantly altered by such substances. A preliminary design for an on-line, automated iodine monitor, with the eventual capability of operating also as a controller, was analyzed and developed in detail with respect to a single-beam colorimeter operating at two wavelengths (using a rotating filter wheel). A flow-through sample cell allows the instrument to operate continuously, except for momentary stop flow when measurements are made. The timed automatic cycling of the system may be interrupted whenever desired for manual operation. An analog output signal permits controlling an iodine generator.
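
    The measurement principle behind such a monitor is Beer-Lambert absorbance, A = epsilon * l * c; the sketch below converts a two-wavelength reading into an iodine concentration, where the molar absorptivity and the simple reference-wavelength turbidity correction are assumed placeholders rather than values from this study.

      EPSILON_ISO = 26_900.0   # L mol^-1 cm^-1 at the isosbestic point (assumed value)
      PATH_CM = 1.0            # optical path length of the flow cell

      def iodine_molarity(a_isosbestic, a_reference):
          # Crude turbidity correction: subtract the off-band reference absorbance.
          corrected = a_isosbestic - a_reference
          return corrected / (EPSILON_ISO * PATH_CM)

      c = iodine_molarity(a_isosbestic=0.215, a_reference=0.012)
      print(f"iodine ~ {c * 1e6:.1f} umol/L")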

  20. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verification (tests and analyses) will be outlined, especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  1. Simulated sudden increase in geomagnetic activity and its effect on heart rate variability: Experimental verification of correlation studies.

    PubMed

    Caswell, Joseph M; Singh, Manraj; Persinger, Michael A

    2016-08-01

    Previous research investigating the potential influence of geomagnetic factors on human cardiovascular state has tended to converge upon similar inferences although the results remain relatively controversial. Furthermore, previous findings have remained essentially correlational without accompanying experimental verification. An exception to this was noted for human brain activity in a previous study employing experimental simulation of sudden geomagnetic impulses in order to assess correlational results that had demonstrated a relationship between geomagnetic perturbations and neuroelectrical parameters. The present study employed the same equipment in a similar procedure in order to validate previous findings of a geomagnetic-cardiovascular dynamic with electrocardiography and heart rate variability measures. Results indicated that potential magnetic field effects on frequency components of heart rate variability tended to overlap with previous correlational studies where low frequency power and the ratio between low and high frequency components of heart rate variability appeared affected. In the present study, a significant increase in these particular parameters was noted during geomagnetic simulation compared to baseline recordings. PMID:27662787

  2. Simulated sudden increase in geomagnetic activity and its effect on heart rate variability: Experimental verification of correlation studies

    NASA Astrophysics Data System (ADS)

    Caswell, Joseph M.; Singh, Manraj; Persinger, Michael A.

    2016-08-01

    Previous research investigating the potential influence of geomagnetic factors on human cardiovascular state has tended to converge upon similar inferences although the results remain relatively controversial. Furthermore, previous findings have remained essentially correlational without accompanying experimental verification. An exception to this was noted for human brain activity in a previous study employing experimental simulation of sudden geomagnetic impulses in order to assess correlational results that had demonstrated a relationship between geomagnetic perturbations and neuroelectrical parameters. The present study employed the same equipment in a similar procedure in order to validate previous findings of a geomagnetic-cardiovascular dynamic with electrocardiography and heart rate variability measures. Results indicated that potential magnetic field effects on frequency components of heart rate variability tended to overlap with previous correlational studies where low frequency power and the ratio between low and high frequency components of heart rate variability appeared affected. In the present study, a significant increase in these particular parameters was noted during geomagnetic simulation compared to baseline recordings.

  3. Columbus pressurized module verification

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Comandatore, Emanuele

    1986-01-01

    The baseline verification approach of the COLUMBUS Pressurized Module was defined during the A and B1 project phases. Peculiarities of the verification program are the testing requirements derived from the permanent manned presence in space. The model philosophy and the test program have been developed in line with the overall verification concept. Such critical areas as meteoroid protections, heat pipe radiators and module seals are identified and tested. Verification problem areas are identified and recommendations for the next development are proposed.

  4. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  5. Implementation of the Short-Term Ensemble Prediction System (STEPS) in Belgium and verification of case studies

    NASA Astrophysics Data System (ADS)

    Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent

    2014-05-01

    The Short-Term Ensemble Prediction System (STEPS) is a probabilistic precipitation nowcasting scheme developed at the Australian Bureau of Meteorology in collaboration with the UK Met Office. In order to account for the multiscaling nature of rainfall structures, the radar field is decomposed into an 8-level multiplicative cascade using a Fast Fourier Transform. The cascade is advected using the velocity field estimated with optical flow and evolves stochastically according to a hierarchy of auto-regressive processes. This reproduces the empirical observation that the small scales evolve faster in time than the large scales. The uncertainty in radar rainfall measurement and the unknown future development of the velocity field are also considered by stochastic modelling in order to reflect their typical spatial and temporal variability. Recently, a 4-year national research program was initiated by the University of Leuven, the Royal Meteorological Institute (RMI) of Belgium, and 3 other partners: PLURISK ("forecasting and management of extreme rainfall induced risks in the urban environment"). The project deals with the nowcasting of rainfall and subsequent urban inundations, as well as socio-economic risk quantification, communication, warning, and prevention. At the urban scale it is widely recognized that the uncertainty of hydrological and hydraulic models is largely driven by the input rainfall estimation and forecast uncertainty. In support of the PLURISK project, the RMI aims at integrating STEPS into the current operational deterministic precipitation nowcasting system INCA-BE (Integrated Nowcasting through Comprehensive Analysis). This contribution will illustrate examples of STEPS ensemble and probabilistic nowcasts for a few selected case studies of stratiform and convective rain in Belgium. The paper focuses on the development of STEPS products for potential hydrological users and a preliminary verification of the nowcasts
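
    A toy version of the cascade idea is sketched below: a radar-like field is split into Fourier-domain scale bands, and each band evolves under its own AR(1) process with faster decorrelation at small scales. The band edges and the simple correlation model are simplifications for illustration, not the operational STEPS code.

      import numpy as np

      rng = np.random.default_rng(3)
      field = rng.normal(size=(64, 64))              # stand-in for a radar rainfall field
      state = np.fft.fft2(field)

      kx = np.fft.fftfreq(64)[:, None]
      ky = np.fft.fftfreq(64)[None, :]
      k = np.sqrt(kx**2 + ky**2)                     # radial wavenumber
      edges = np.linspace(0.0, k.max() + 1e-9, 9)    # 8 cascade levels

      for step in range(10):                         # nowcast time steps
          noise = np.fft.fft2(rng.normal(size=(64, 64)))
          for lo, hi in zip(edges[:-1], edges[1:]):
              band = (k >= lo) & (k < hi)
              rho = 0.98 - 0.6 * (lo / k.max())      # small scales decorrelate faster
              state[band] = rho * state[band] + np.sqrt(1 - rho**2) * noise[band]
      nowcast = np.real(np.fft.ifft2(state))
      print(nowcast.std())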

  6. Hybrid plan verification for intensity-modulated radiation therapy (IMRT) using the 2D ionization chamber array I'mRT MatriXX--a feasibility study.

    PubMed

    Dobler, Barbara; Streck, Natalia; Klein, Elisabeth; Loeschel, Rainer; Haertl, Petra; Koelbl, Oliver

    2010-01-21

    The 2D ionization chamber array I'mRT MatriXX (IBA, Schwarzenbruck, Germany) has been developed for absolute 2D dosimetry and verification of intensity-modulated radiation therapy (IMRT) for perpendicular beam incidence. The aim of this study is to evaluate the applicability of I'mRT MatriXX for oblique beam incidence and hybrid plan verification of IMRT with original gantry angles. For the assessment of angular dependence, open fields with gantry angles in steps of 10 degrees were calculated on a CT scan of I'mRT MatriXX. For hybrid plan verification, 17 clinical IMRT plans and one rotational plan were used. Calculations were performed with pencil beam (PB), collapsed cone (CC) and Monte Carlo (MC) methods, which had been previously validated. Measurements were conducted on an Elekta SynergyS linear accelerator. To assess the potential and limitations of the system, gamma evaluation was performed with different dose tolerances and distances to agreement. Hybrid plan verification passed the gamma test with 4% dose tolerance and 3 mm distance to agreement in all cases, in 82-88% of the cases for tolerances of 3%/3 mm, and in 59-76% of the cases if 3%/2 mm were used. Separate evaluation of the low dose and high dose regions showed that I'mRT MatriXX can be used for hybrid plan verification of IMRT plans within 3% dose tolerance and 3 mm distance to agreement with a relaxed dose tolerance of 4% in the low dose region outside the multileaf collimator (MLC).
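
    The pass/fail criterion used above is the gamma index; a simplified 1-D, global-normalization version is sketched below for synthetic profiles. Clinical tools evaluate it in 2-D or 3-D with interpolation, so this is only a conceptual illustration.

      import numpy as np

      def gamma_1d(x, d_ref, d_eval, dose_tol=0.03, dta_mm=3.0):
          # Global gamma: for each reference point, find the minimum combined
          # dose-difference / distance-to-agreement metric over the evaluated profile.
          d_max = d_ref.max()
          gammas = []
          for xi, di in zip(x, d_ref):
              term = np.sqrt(((x - xi) / dta_mm) ** 2
                             + ((d_eval - di) / (dose_tol * d_max)) ** 2)
              gammas.append(term.min())
          return np.array(gammas)

      x = np.linspace(-20, 20, 81)                       # mm
      ref = np.exp(-x**2 / 120.0)                        # reference dose profile
      ev = 1.02 * np.exp(-(x - 0.8)**2 / 120.0)          # measured profile, slightly shifted
      g = gamma_1d(x, ref, ev)
      print(f"pass rate (gamma <= 1): {(g <= 1.0).mean() * 100:.1f}%")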

  7. Environmental Technology Verification (ETV) Program Case Studies: Demonstrating Program Outcomes, Volume III

    EPA Science Inventory

    This booklet, ETV Program Case Studies: Demonstrating Program Outcomes, Volume III, contains two case studies addressing verified environmental technologies for decentralized wastewater treatment and converting animal waste to energy. Each case study contains a brief description ...

  8. Verification of Adaptive Systems

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui; Vassev, Emil; Hinchey, Mike; Rouff, Christopher; Buskens, Richard

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance for them. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  9. A simulation study investigating a Cherenkov material for use with the prompt gamma range verification in proton therapy.

    PubMed

    Lau, Andy; Ahmad, Salahuddin; Chen, Yong

    2016-05-01

    In vivo range verification methods reveal information about the penetration depth of an incident proton beam into a patient. The prompt gamma (PG) method is a promising in vivo technique that has been shown to yield this range information by measuring the escaping MeV photons, given a suitable detector system. The majority of current simulations investigating PG detectors utilize common scintillating materials ideal for photons within a low neutron background radiation field, using complex geometries or novel designs. In this work we simulate a minimal detector system using a material, based on the Cherenkov phenomenon, that is well suited to MeV photon detection in the presence of a significant neutron field. The response of this selected material was quantified for the escaping particles commonly found in proton therapy applications, and the feasibility of using the PG technique with this detector material was studied. Our simulations found that the majority of the range information can be determined by detecting photons emitted within a timing window of less than ∼50 ns after the interaction of the proton beam with the water phantom and with an energy threshold focusing on the energy range of the 16O de-excitation photons (∼6 MeV). The Cherenkov material investigated is able to collect these photons and estimate the range on timescales of the order of tens of nanoseconds, as well as greatly suppress the signal due to neutrons. PMID:27163377
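
    The selection logic described above (a ~50 ns timing window plus an energy window around the 6 MeV oxygen de-excitation line) can be illustrated with a trivial event filter; the event tuples below are invented for illustration only.

      # Hypothetical detected photons as (time_ns, energy_MeV, depth_mm) tuples.
      events = [
          (12.0, 6.1, 101.0), (35.0, 5.9, 98.5), (220.0, 2.2, 60.0),
          (8.0, 0.5, 110.0), (44.0, 6.2, 99.8),
      ]

      # Keep only prompt, ~6 MeV photons; late or low-energy events are rejected.
      prompt = [e for e in events if e[0] < 50.0 and 5.5 <= e[1] <= 6.5]
      depths = [d for _, _, d in prompt]
      print(f"estimated emission depth ~ {sum(depths) / len(depths):.1f} mm")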

  10. Theoretical study of closed-loop recycling liquid-liquid chromatography and experimental verification of the theory.

    PubMed

    Kostanyan, Artak E; Erastov, Andrey A

    2016-09-01

    The non-ideal recycling equilibrium-cell model including the effects of extra-column dispersion is used to simulate and analyze closed-loop recycling counter-current chromatography (CLR CCC). Previously, the operating scheme with the detector located before the column was considered. In this study, analysis of the process is carried out for a more realistic and practical scheme with the detector located immediately after the column. Peak equation for individual cycles and equations describing the transport of single peaks and complex chromatograms inside the recycling closed-loop, as well as equations for the resolution between single solute peaks of the neighboring cycles, for the resolution of peaks in the recycling chromatogram and for the resolution between the chromatograms of the neighboring cycles are presented. It is shown that, unlike conventional chromatography, increasing of the extra-column volume (the recycling line length) may allow a better separation of the components in CLR chromatography. For the experimental verification of the theory, aspirin, caffeine, coumarin and the solvent system hexane/ethyl acetate/ethanol/water (1:1:1:1) were used. Comparison of experimental and simulated processes of recycling and distribution of the solutes in the closed-loop demonstrated a good agreement between theory and experiment.
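
    For orientation, the conventional chromatographic resolution between neighboring peaks is Rs = 2(t2 - t1)/(w1 + w2); the sketch below evaluates it for hypothetical retention data. The paper derives cycle-resolved analogues of this quantity for the recycling loop, which are not reproduced here.

      def resolution(t1, w1, t2, w2):
          # Rs from retention times t and base peak widths w (same time units).
          return 2.0 * (t2 - t1) / (w1 + w2)

      # Hypothetical retention times / base widths (min) for two solutes in one cycle
      print(f"Rs = {resolution(t1=12.4, w1=1.1, t2=14.0, w2=1.3):.2f}")   # Rs > 1.5 -> baseline separation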

  11. Test/QA plan for the verification testing of alternative or reformulated liquid fuels, fuel additives, fuel emulsions, and lubricants for highway and nonroad use heavy-duty diesel engines

    EPA Science Inventory

    This Environmental Technology Verification Program test/QA plan for heavy-duty diesel engine testing at the Southwest Research Institute’s Department of Emissions Research describes how the Federal Test Procedure (FTP), as listed in 40 CFR Part 86 for highway engines and 40 CFR P...

  12. Working memory mechanism in proportional quantifier verification.

    PubMed

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-12-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g., "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow dots". The second study reveals that both types of sentences are correlated with memory storage; however, only proportional sentences are associated with cognitive control. This result suggests that the cognitive mechanism underlying the verification of proportional quantifiers is crucially related to the integration process, in which an individual has to compare in memory the cardinalities of two sets. In the third study we find that the numerical distance between the two cardinalities that must be compared significantly influences verification time and accuracy. The results of our studies are discussed in the broader context of processing complex sentences. PMID:24374596

  14. Fate of surfactants in activated sludge wastewater treatment plants: A model verification study

    SciTech Connect

    Feijtel, T.; Vits, H.; Murray-Smith, R.; Wijk, R. van; Koch, V.; Schroeder, R.; Birch, R.; Ten Berge, W.

    1995-12-31

    The European Chemical Industry has commissioned a joint industry Task Force of the Association International de la Savonnerie et la Detergence (AIS) and the Comite Europeen de Agents de Surface et Intermediares Organiques (CESIO) to develop and apply specific methodology for the environmental monitoring of surfactants. The objectives of the Task Force were (1) to establish the fate, distribution and concentrations of major surfactants in relevant environmental compartments and (2) to provide the necessary data for checking the applicability of mathematical models to predict their fate and concentrations in these environmental compartments. This presentation will focus on the results of this AIS/CESIO surfactant monitoring program and how the measured removals, effluent concentrations, and sludge concentrations compare to predicted levels for different wastewater treatment plants across Europe. Mathematical models that predict the fate of chemicals in Wastewater Treatment Plants (WWTPs) are used as an integral part of the risk assessment process in many countries. In this paper, the predictive power of two mathematical models, SIMPLETREAT and WWTREAT, is checked against linear alkylbenzene sulfonate (LAS) fate data collected at five WWTPs located across Europe. Magnitudes and time-scales of the variability in WWTP streams were identified using statistical methods. In addition, the performance of these models is also checked against measured removal and effluent data for alcohol ethoxylates and alcohol ethoxysulfates.
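
    As a rough illustration of the kind of steady-state mass balance such screening models evaluate, consider the sketch below; the removal fractions are placeholders, not SIMPLETREAT or WWTREAT outputs.

        # Steady-state fate split for a surfactant load entering a WWTP.
        def wwtp_fate(load_in_kg_d, f_biodeg=0.95, f_sludge=0.02):
            degraded = f_biodeg * load_in_kg_d    # biodegraded fraction
            to_sludge = f_sludge * load_in_kg_d   # sorbed to sludge
            effluent = load_in_kg_d - degraded - to_sludge
            return {"degraded": degraded, "sludge": to_sludge,
                    "effluent": effluent}

        print(wwtp_fate(100.0))   # e.g. 100 kg/d LAS entering the plant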

  15. RESULTS OF A METHOD VERIFICATION STUDY FOR ANALYSES OF PCP IN SOIL

    EPA Science Inventory

    As a prelude to a field demonstration of the fungal treatment technology by the SITE Program, a field treatability study was performed to select optimal fungal species and loading rates using the site-specific soil matrix contaminated with wood-preserving wastes: PCP and PAHs. ur...

  16. PEPT: An invaluable tool for 3-D particle tracking and CFD simulation verification in hydrocyclone studies

    NASA Astrophysics Data System (ADS)

    Chang, Yu-Fen; Adamsen, Tom C. H.; Pisarev, Gleb I.; Hoffmann, Alex C.

    2013-05-01

    Particle tracks in a hydrocyclone generated both experimentally by positron emission particle tracking (PEPT) and numerically with Eulerian-Lagrangian CFD have been studied and compared. A hydrocyclone with a cylinder-on-cone design was used in this study, the geometries used in the CFD simulations and in the experiments being identical. It is shown that it is possible to track a fast-moving particle in a hydrocyclone using PEPT with high temporal and spatial resolutions. The numerical 3-D particle trajectories were generated using the Large Eddy Simulation (LES) turbulence model for the fluid and Lagrangian particle tracking for the particles. The behaviors of the particles were analyzed in detail and were found to be consistent between experiments and CFD simulations. The tracks of the particles are discussed and related to the fluid flow field visualized in the CFD simulations using the cross-sectional static pressure distribution.

  17. Case studies of thermal energy storage (TES) systems: Evaluation and verification of system performance

    SciTech Connect

    Akbari, H.; Sezgen, O.

    1992-01-01

    We have developed two case studies to review and analyze the energy performance of thermal energy storage (TES) systems in commercial buildings. Our case studies considered two partial ice storage systems in Northern California. For each case, we compiled historical data on TES design, installation, and operation. This information was further enhanced by data obtained through interviews with the building owners and operators. The performance and historical data of the TES systems and their components were grouped into issues related to design, installation, operation, and maintenance of the systems. Our analysis indicated that (1) almost all problems related to the operation of TES and non-TES systems could be traced back to the design of the system, and (2) the identified problems were not unique to the TES systems. There were as many original problems with "conventional" HVAC systems and components as with TES systems. Judging from the problems related to non-TES components identified in these two case studies, it is reasonable to conclude that conventional systems have as many problems as TES systems, but a failure in a TES system may have a more dramatic impact on thermal comfort and electricity charges. The objective of the designers of the TES systems in the case-study buildings was to design just-the-right-size systems so that both the initial investment and operating costs would be minimized. Given such criteria, a system is typically designed only for normal and steady-state operating conditions, which often precludes due consideration of factors such as maintenance, growth in the needed capacity, ease of operation, and modularity of the systems. Therefore, it is not surprising to find that these systems, at least initially, did not perform to the design intent and expectation and that they had to go through extended periods of troubleshooting.

  18. Case studies of thermal energy storage (TES) systems: Evaluation and verification of system performance. Final report

    SciTech Connect

    Akbari, H.; Sezgen, O.

    1992-01-01

    We have developed two case studies to review and analyze the energy performance of thermal energy storage (TES) systems in commercial buildings. Our case studies considered two partial ice storage systems in Northern California. For each case, we compiled historical data on TES design, installation, and operation. This information was further enhanced by data obtained through interviews with the building owners and operators. The performance and historical data of the TES systems and their components were grouped into issues related to design, installation, operation, and maintenance of the systems. Our analysis indicated that (1) almost all problems related to the operation of TES and non-TES systems could be traced back to the design of the system, and (2) the identified problems were not unique to the TES systems. There were as many original problems with "conventional" HVAC systems and components as with TES systems. Judging from the problems related to non-TES components identified in these two case studies, it is reasonable to conclude that conventional systems have as many problems as TES systems, but a failure in a TES system may have a more dramatic impact on thermal comfort and electricity charges. The objective of the designers of the TES systems in the case-study buildings was to design just-the-right-size systems so that both the initial investment and operating costs would be minimized. Given such criteria, a system is typically designed only for normal and steady-state operating conditions, which often precludes due consideration of factors such as maintenance, growth in the needed capacity, ease of operation, and modularity of the systems. Therefore, it is not surprising to find that these systems, at least initially, did not perform to the design intent and expectation and that they had to go through extended periods of troubleshooting.

  19. Marine induction studies based on sea surface scalar magnetic field measurements. A concept and its verification

    NASA Astrophysics Data System (ADS)

    Kuvshinov, A. V.; Poedjono, B.; Matzka, J.; Olsen, N.; Pai, S.; Samrock, F.

    2013-12-01

    Most marine EM studies are based on sea-bottom measurements, which are expensive and logistically demanding. We propose a low-cost and easy-to-deploy magnetic survey concept which exploits sea-surface measurements. It is assumed that the exciting source can be described by a plane wave. The concept is based on responses that relate variations of the scalar magnetic field at the survey sites with variations of the horizontal magnetic field at a base site. It can be shown that these scalar responses are a mixture of standard tipper responses and elements of the horizontal magnetic tensor and thus can be used to probe the electrical conductivity of the subsoil. This opens an avenue for sea-surface induction studies, which had so far been considered very difficult to conduct with conventional approaches based on vector measurements. We performed 3-D realistic model studies in which the target region was Oahu Island and its surroundings, and the USGS-operated Honolulu geomagnetic observatory was chosen as the base site. We compare the predicted responses with the responses estimated from the scalar data collected at a few locations around Oahu Island by the unmanned, autonomous, wave- and solar-powered 'Wave Glider' developed and operated by Liquid Robotics Oil and Gas/Schlumberger. The marine robotic observation platform is equipped with a towed Overhauser magnetometer (validated by USGS). The studies show an encouraging agreement between predictions and experiment in both components of the scalar response at all locations, and we consider this a proof of the suggested concept.
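
    The mixing of tipper and horizontal-tensor elements in the scalar response can be sketched as follows: the variation of the total-field magnitude is, to first order, the projection of the site field variation onto the main-field direction. All transfer-function values below are invented for illustration; the paper's exact formulation may differ.

        import numpy as np

        b_hat = np.array([0.3, 0.1, 0.95])  # assumed main-field direction
        b_hat /= np.linalg.norm(b_hat)
        M = np.array([[1.02, 0.05],         # horizontal magnetic tensor
                      [0.03, 0.98]])
        T = np.array([0.10, -0.04])         # tipper (vertical response)

        def scalar_variation(dH_base):
            # delta F = b_hat . delta B at the site, driven by the base field
            dB_xy = M @ dH_base             # horizontal variation at the site
            dB_z = T @ dH_base              # vertical variation at the site
            return b_hat @ np.array([dB_xy[0], dB_xy[1], dB_z])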

  20. Microbial ureolysis in the seawater-catalysed urine phosphorus recovery system: Kinetic study and reactor verification.

    PubMed

    Tang, Wen-Tao; Dai, Ji; Liu, Rulong; Chen, Guang-Hao

    2015-12-15

    Our previous study has confirmed the feasibility of using seawater as an economical precipitant for urine phosphorus (P) precipitation. However, we still understand very little about the ureolysis in the Seawater-based Urine Phosphorus Recovery (SUPR) system despite its being a crucial step for urine P recovery. In this study, batch experiments were conducted to investigate the kinetics of microbial ureolysis in the seawater-urine system. Indigenous bacteria from urine and seawater exhibited relatively low ureolytic activity, but they adapted quickly to the urine-seawater mixture during batch cultivation. During cultivation, both the abundance and specific ureolysis rate of the indigenous bacteria were greatly enhanced as confirmed by a biomass-dependent Michaelis-Menten model. The period for full ureolysis decreased from 180 h to 2.5 h after four cycles of cultivation. Based on the successful cultivation, a lab-scale SUPR reactor was set up to verify the fast ureolysis and efficient P recovery in the SUPR system. Nearly complete urine P removal was achieved in the reactor in 6 h without adding any chemicals. Terminal Restriction Fragment Length Polymorphism (TRFLP) analysis revealed that the predominant groups of bacteria in the SUPR reactor likely originated from seawater rather than urine. Moreover, batch tests confirmed the high ureolysis rates and high phosphorus removal efficiency induced by cultivated bacteria in the SUPR reactor under seawater-to-urine mixing ratios ranging from 1:1 to 9:1. This study has proved that the enrichment of indigenous bacteria in the SUPR system can lead to sufficient ureolytic activity for phosphate precipitation, thus providing an efficient and economical method for urine P recovery. PMID:26378727
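
    A minimal sketch of a biomass-dependent Michaelis-Menten ureolysis model of the kind the abstract describes is shown below; the rate constants and initial conditions are assumptions, not the paper's fitted values.

        from scipy.integrate import solve_ivp

        V_MAX = 0.5   # specific ureolysis rate, mmol urea/(g biomass*h)
        K_M = 2.0     # half-saturation constant, mmol/L
        MU = 0.1      # biomass growth rate, 1/h

        def rhs(t, y):
            urea, biomass = y
            rate = V_MAX * biomass * urea / (K_M + urea)
            return [-rate, MU * biomass]

        # 50 mmol/L urea and 0.1 g/L biomass at t = 0; simulate 10 h.
        sol = solve_ivp(rhs, (0.0, 10.0), [50.0, 0.1])

    As the biomass term grows over successive cultivation cycles, the time to complete ureolysis shrinks, qualitatively reproducing the 180 h to 2.5 h improvement reported above.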

  1. Microbial ureolysis in the seawater-catalysed urine phosphorus recovery system: Kinetic study and reactor verification.

    PubMed

    Tang, Wen-Tao; Dai, Ji; Liu, Rulong; Chen, Guang-Hao

    2015-12-15

    Our previous study has confirmed the feasibility of using seawater as an economical precipitant for urine phosphorus (P) precipitation. However, we still understand very little about the ureolysis in the Seawater-based Urine Phosphorus Recovery (SUPR) system despite its being a crucial step for urine P recovery. In this study, batch experiments were conducted to investigate the kinetics of microbial ureolysis in the seawater-urine system. Indigenous bacteria from urine and seawater exhibited relatively low ureolytic activity, but they adapted quickly to the urine-seawater mixture during batch cultivation. During cultivation, both the abundance and specific ureolysis rate of the indigenous bacteria were greatly enhanced as confirmed by a biomass-dependent Michaelis-Menten model. The period for full ureolysis decreased from 180 h to 2.5 h after four cycles of cultivation. Based on the successful cultivation, a lab-scale SUPR reactor was set up to verify the fast ureolysis and efficient P recovery in the SUPR system. Nearly complete urine P removal was achieved in the reactor in 6 h without adding any chemicals. Terminal Restriction Fragment Length Polymorphism (TRFLP) analysis revealed that the predominant groups of bacteria in the SUPR reactor likely originated from seawater rather than urine. Moreover, batch tests confirmed the high ureolysis rates and high phosphorus removal efficiency induced by cultivated bacteria in the SUPR reactor under seawater-to-urine mixing ratios ranging from 1:1 to 9:1. This study has proved that the enrichment of indigenous bacteria in the SUPR system can lead to sufficient ureolytic activity for phosphate precipitation, thus providing an efficient and economical method for urine P recovery.

  2. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies

    PubMed Central

    Yuksel, Mustafa; Gonul, Suat; Laleci Erturkmen, Gokce Banu; Sinaci, Ali Anil; Invernizzi, Paolo; Facchinetti, Sara; Migliavacca, Andrea; Bergvall, Tomas; Depraetere, Kristof; De Roo, Jos

    2016-01-01

    Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region containing about 1 billion records from 16 million patients and has been validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods lacking this background information. PMID:27123451

  3. An Interoperability Platform Enabling Reuse of Electronic Health Records for Signal Verification Studies.

    PubMed

    Yuksel, Mustafa; Gonul, Suat; Laleci Erturkmen, Gokce Banu; Sinaci, Ali Anil; Invernizzi, Paolo; Facchinetti, Sara; Migliavacca, Andrea; Bergvall, Tomas; Depraetere, Kristof; De Roo, Jos

    2016-01-01

    Depending mostly on voluntarily sent spontaneous reports, pharmacovigilance studies are hampered by low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources for collecting deidentified medical data sets of selected patient populations and tracing the reported incidents back to original EHRs. We have developed an ontological framework where EHR sources and target clinical research systems can continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning on formal representations of different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of our framework through one of the SALUS safety analysis tools, namely the Case Series Characterization Tool, which has been deployed on top of the regional EHR Data Warehouse of the Lombardy Region containing about 1 billion records from 16 million patients and has been validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared to traditional methods lacking this background information. PMID:27123451

  4. A kinetic study of lipase-catalyzed reversible kinetic resolution involving verification at miniplant-scale.

    PubMed

    Berendsen, W R; Gendrot, G; Freund, A; Reuss, M

    2006-12-01

    Lipase-catalyzed kinetic resolution of racemates is a popular method for synthesis of chiral synthons. Most of these resolutions are reversible, equilibrium-limited reactions. For the first time, an extensive kinetic model is proposed for kinetic resolution reactions, which takes into account the full reversibility of the reaction, substrate inhibition by an acyl donor and an acyl acceptor as well as alternative substrate inhibition by each enantiomer. For this purpose, the reversible enantioselective transesterification of (R/S)-1-methoxy-2-propanol with ethyl acetate catalyzed by Candida antarctica lipase B (CAL-B) is investigated. The detailed model presented here is valid for a wide range of substrate and product concentrations. Following model discrimination and the application of Haldane equations to reduce the degrees of freedom in parameter estimation, the 11 free parameters are successfully identified. All parameters are fitted to the complete data set simultaneously. Six types of independent initial rate studies provide a solid data basis for the model. The effect of changes in substrate and product concentration on reaction kinetics is discussed. The developed model is used for simulations to study the behavior of reaction kinetics in a fixed bed reactor. The typical plot of enantiomeric excess versus conversion of substrate and product is evaluated at various initial substrate mixtures. The model is validated by comparison with experimental results obtained with a fixed bed reactor, which is part of a fully automated state-of-the-art miniplant.
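
    For a flavour of how a Haldane relation removes a degree of freedom, here is a sketch of a generic reversible Michaelis-Menten rate law; the actual CAL-B model has 11 parameters with several inhibition terms, so this is only the simplest analogue.

        # Reversible Michaelis-Menten rate with the reverse Vmax fixed by the
        # Haldane relation Keq = (Vf * Km_P) / (Vr * Km_S).
        def reversible_mm(S, P, Vf, Km_S, Km_P, Keq):
            Vr = Vf * Km_P / (Km_S * Keq)   # no longer a free parameter
            return (Vf * S / Km_S - Vr * P / Km_P) / (1 + S / Km_S + P / Km_P)

    Constraining Vr through Keq in this way is what allows all remaining parameters to be fitted to the complete data set simultaneously without overparameterization.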

  5. Time-of-flight neutron rejection to improve prompt gamma imaging for proton range verification: a simulation study.

    PubMed

    Biegun, Aleksandra K; Seravalli, Enrica; Lopes, Patrícia Cambraia; Rinaldi, Ilaria; Pinto, Marco; Oxley, David C; Dendooven, Peter; Verhaegen, Frank; Parodi, Katia; Crespo, Paulo; Schaart, Dennis R

    2012-10-21

    Therapeutic proton and heavier ion beams generate prompt gamma photons that may escape from the patient. In principle, this allows for real-time, in situ monitoring of the treatment delivery, in particular, the hadron range within the patient, by imaging the emitted prompt gamma rays. Unfortunately, the neutrons simultaneously created with the prompt photons create a background that may obscure the prompt gamma signal. To enhance the accuracy of proton dose verification by prompt gamma imaging, we therefore propose a time-of-flight (TOF) technique to reject this neutron background, involving a shifting time window to account for the propagation of the protons through the patient. Time-resolved Monte Carlo simulations of the generation and transport of prompt gamma photons and neutrons upon irradiation of a PMMA phantom with 100, 150 and 200 MeV protons were performed using Geant4 (version 9.2.p02) and MCNPX (version 2.7.D). The influence of angular collimation and TOF selection on the prompt gamma and neutron longitudinal profiles is studied. Furthermore, the implications of the proton beam microstructure (characterized by the proton bunch width and repetition period) are investigated. The application of a shifting TOF window having a width of ΔTOF(z) = 1.0 ns appears to reduce the neutron background by more than 99%. Subsequent application of an energy threshold does not appear to sharpen the distal falloff of the prompt gamma profile but reduces the tail that is observed beyond the proton range. Investigations of the influence of the beam time structure show that TOF rejection of the neutron background is expected to be effective for typical therapeutic proton cyclotrons.
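
    A sketch of the shifting-window idea follows: a detected photon is kept only if its arrival time is consistent with a prompt gamma emitted while the proton is at depth z. The 1.0 ns window width matches the value above; the timing inputs are assumed to be available from the accelerator timing and the geometry.

        C_LIGHT = 0.3   # speed of light, m/ns

        def accept(t_detected, t_proton_at_z, dist_to_detector_m,
                   window_ns=1.0):
            # Expected arrival: proton reaches depth z, then the prompt
            # gamma flies from that depth to the detector.
            t_expected = t_proton_at_z + dist_to_detector_m / C_LIGHT
            return abs(t_detected - t_expected) <= window_ns / 2.0

    Because neutrons are slower and scatter before detection, their arrival times rarely fall inside this narrow, depth-dependent window, which is why the rejection above exceeds 99%.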

  6. A study on the estimation and verification of the blended precipitation forecast for hydrological use in Korea

    NASA Astrophysics Data System (ADS)

    Yang, H.; Jeong, J.; Nam, K.; Ko, H.; Choi, Y.

    2012-12-01

    Quantitative precipitation forecasts from nowcasting, based on the extrapolation of radar data, and from numerical weather prediction models provide crucial information for severe weather such as floods, droughts, and debris flows, for water quality, and for determining the current and future availability of water resources. Meso-scale models represent the cumulus convection process and changes in precipitation magnitude well, but they need spin-up time, defined as the time needed to evolve from the initial non-existent cloud to the actual state of cumulus cloud. The spin-up problem of meso-scale models yields low skill scores at short forecast lead times. Nowcasting models, which include the advection process of the rainfall, are one alternative that avoids this problem. The purpose of this study is to produce an optimized quantitative precipitation forecast by blending the forecasted precipitation of nowcasting and numerical weather prediction (NWP) models at catchment scale for hydrometeorological application. The Korea Meteorological Administration (KMA) has been operating the nowcasting models VSRF (Very Short Range Forecast of precipitation) and MAPLE (McGill Algorithm for Precipitation nowcasting by Lagrangian Extrapolation), and the short-term forecast models UMRG (Unified Model of Regional Grid) and KWRF (Korean WRF). The blended precipitation forecast is estimated using a weighting scheme based on the long-term average critical success index of each individual component model. The hydrological verification of the blended precipitation forecast was conducted for 117 mid-watersheds of Korea for summertime in 2011. The blended precipitation was shown to perform better than the individual forecasted precipitation.
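
    A minimal sketch of blending with weights proportional to each model's critical success index (CSI) is given below; the operational KMA weighting scheme may differ in detail.

        import numpy as np

        def blend(forecast_fields, csi_scores):
            # Weighted sum of precipitation fields; weights ~ long-term CSI.
            w = np.asarray(csi_scores, dtype=float)
            w /= w.sum()
            return sum(wi * f for wi, f in zip(w, forecast_fields))

        # e.g. a nowcast (CSI 0.55) and an NWP forecast (CSI 0.40) on a grid:
        grid = np.ones((4, 4))
        blended = blend([2.0 * grid, 1.0 * grid], [0.55, 0.40])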

  7. Geostatistical modeling of the spatial distribution of soil dioxin in the vicinity of an incinerator. 2. Verification and calibration study.

    PubMed

    Goovaerts, Pierre; Trinh, Hoa T; Demond, Avery H; Towey, Timothy; Chang, Shu-Chi; Gwinn, Danielle; Hong, Biling; Franzblau, Alfred; Garabrant, David; Gillespie, Brenda W; Lepkowski, James; Adriaens, Peter

    2008-05-15

    A key component in any investigation of cause-effect relationships between point source pollution, such as an incinerator, and human health is the availability of measurements and/or accurate models of exposure at the same scale or geography as the health data. Geostatistics allows one to simulate the spatial distribution of pollutant concentrations over various spatial supports while incorporating both field data and predictions of deterministic dispersion models. This methodology was used in a companion paper to identify the census blocks that have a high probability of exceeding a given level of dioxin TEQ (toxic equivalents) around an incinerator in Midland, MI. This geostatistical model, along with population data, provided guidance for the collection of 51 new soil samples, which permitted verification of the geostatistical predictions and calibration of the model. Each new soil measurement was compared to the set of 100 TEQ values simulated at the closest grid node. The correlation between the measured concentration and the averaged simulated value is moderate (0.44), and the actual concentrations are clearly overestimated in the vicinity of the plant property line. Nevertheless, probability intervals computed from simulated TEQ values provide an accurate model of uncertainty: the proportion of observations that fall within these intervals exceeds what is expected from the model. Simulation-based probability intervals are also narrower than the intervals derived from the global histogram of the data, which demonstrates the greater precision of the geostatistical approach. Log-normal ordinary kriging provided fairly similar estimation results for the small and well-sampled area used in this validation study; however, the model of uncertainty was not always accurate. The regression analysis and geostatistical simulation were then conducted using the combined set of 53 original and 51 new soil samples, leading to an updated model for the spatial distribution of
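
    The interval-coverage check described above can be sketched as follows; the inputs are synthetic, whereas the study compared 51 measurements against 100 simulated TEQ values per grid node.

        import numpy as np

        def coverage(measurements, node_simulations, p=0.90):
            # Fraction of measurements falling inside the central p-interval
            # of the simulated values at the nearest grid node.
            lo, hi = (1 - p) / 2, 1 - (1 - p) / 2
            hits = sum(np.quantile(sims, lo) <= m <= np.quantile(sims, hi)
                       for m, sims in zip(measurements, node_simulations))
            return hits / len(measurements)

    A coverage at or above the nominal p for every probability level is what qualifies the simulation-based intervals as an accurate model of uncertainty.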

  8. A simulation study of a C-shaped in-beam PET system for dose verification in carbon ion therapy

    NASA Astrophysics Data System (ADS)

    Jung An, Su; Beak, Cheol-Ha; Lee, Kisung; Hyun Chung, Yong

    2013-01-01

    The application of hadrons such as carbon ions is being developed for the treatment of cancer. The effectiveness of such a technique is due to the ability of charged particles to deliver most of their energy near the end of their range, at the Bragg peak. However, accurate verification of dose delivery is required since misalignment of the hadron beam can cause serious damage to normal tissue. PET scanners can be utilized to track the carbon beam to the tumor by imaging the trail of the hadron-induced positron emitters in the irradiated volume. In this study, we designed and evaluated (through Monte Carlo simulations) an in-beam PET scanner for monitoring patient dose in carbon beam therapy. A C-shaped PET and a partial-ring PET were designed to avoid interference between the PET detectors and the therapeutic carbon beam delivery. Their performance was compared with that of a full-ring PET scanner. The C-shaped, partial-ring, and full-ring scanners consisted of 14, 12, and 16 detector modules, respectively, with a 30.2 cm inner diameter for brain imaging. Each detector module was composed of a 13×13 array of 4.0 mm×4.0 mm×20.0 mm LYSO crystals and four round 25.4 mm diameter PMTs. To estimate the production yield of positron emitters such as 10C, 11C, and 15O, a cylindrical PMMA phantom (diameter, 20 cm; thickness, 20 cm) was irradiated with 170, 290, and 350 AMeV 12C beams using the GATE code. Phantom images of the three types of scanner were evaluated by comparing the longitudinal profile of the positron emitters, measured along the carbon beam, against a simulated positron emitter distribution. The results demonstrated that the development of a C-shaped PET scanner to characterize carbon dose distribution for therapy planning is feasible.

  9. Dynamic model verification studies for the thermal response of the Fort St. Vrain HTGR Core

    SciTech Connect

    Ball, S J

    1980-01-01

    The safety research program for high-temperature gas-cooled reactors at ORNL is directed primarily at addressing licensing questions on the Fort St. Vrain reactor near Denver, CO. An important part of the program is to make use of experimental data from the reactor to at least partially verify the dynamic simulations that are used to predict the effects of postulated accident sequences. Comparisons were made of predictions with data from four different reactor scram (trip) events from operating power levels between 30 and 50%. An optimization program was used to rationalize the differences between predictions and measurements, and, in general, excellent agreement can be obtained by adjustment of models and parameters within their uncertainty ranges. Although the optimized models are not necessarily unique, results of the study have identified areas in which some of the models were deficient.

  10. High-Dose-Rate 192Ir Brachytherapy Dose Verification: A Phantom Study

    PubMed Central

    Nikoofar, Alireza; Hoseinpour, Zohreh; Rabi Mahdavi, Seied; Hasanzadeh, Hadi; Rezaei Tavirani, Mostafa

    2015-01-01

    Background: High-dose-rate (HDR) brachytherapy might be an effective tool for palliation of dysphagia. Because of concerns about adverse effects due to absorbed radiation dose, it is important to estimate the absorbed dose in at-risk organs during this treatment. Objectives: This study aimed to measure the absorbed dose in the parotid, thyroid, and submandibular glands, eye, trachea, spinal cord, and manubrium of the sternum during brachytherapy in an anthropomorphic phantom. Materials and Methods: To measure the radiation dose to the eye, parotid, thyroid, and submandibular glands, spine, and sternum, an anthropomorphic phantom was fitted with applicators holding thermoluminescence dosimeters (TLDs). A specific target volume of about 23 cm3 in the upper thoracic esophagus was considered as the target; the phantom underwent computed tomography (CT) planning for HDR brachytherapy and was then treated with a micro-Selectron HDR (192Ir) remote after-loading unit. Results: Absorbed doses were measured with calibrated TLDs and were expressed in centi-Gray (cGy). In regions far from the target (≥ 16 cm), such as the submandibular, parotid and thyroid glands, the mean measured dose ranged from 1.65 to 5.5 cGy. In closer regions (≤ 16 cm), the absorbed dose could be as high as 113 cGy. Conclusions: Our study showed similar depth and surface doses in distant regions; in closer regions, the surface and depth doses differed significantly owing to primary radiation, which imposed a high dose gradient there. The difference between the plan and the measurement was also more severe in these regions because of the simplified treatment of tissue inhomogeneity in the TPS relative to the phantom. PMID:26413250

  11. Health Checkup and Telemedical Intervention Program for Preventive Medicine in Developing Countries: Verification Study

    PubMed Central

    Kai, Eiko; Ghosh, Partha Pratim; Islam, Rafiqul; Ahmed, Ashir; Kuroda, Masahiro; Inoue, Sozo; Hiramatsu, Tatsuo; Kimura, Michio; Shimizu, Shuji; Kobayashi, Kunihisa; Baba, Yukino; Kashima, Hisashi; Tsuda, Koji; Sugiyama, Masashi; Blondel, Mathieu; Ueda, Naonori; Kitsuregawa, Masaru; Nakashima, Naoki

    2015-01-01

    Background The prevalence of non-communicable diseases is increasing throughout the world, including developing countries. Objective The intent was to conduct a study of a preventive medical service in a developing country, combining eHealth checkups and teleconsultation, as well as to assess stratification rules and the short-term effects of intervention. Methods We developed an eHealth system that comprises a set of sensor devices in an attaché case, a data transmission system linked to a mobile network, and a data management application. We provided eHealth checkups for the populations of five villages and the employees of five factories/offices in Bangladesh. Individual health condition was automatically categorized into four grades based on international diagnostic standards: green (healthy), yellow (caution), orange (affected), and red (emergent). We provided teleconsultation for orange- and red-grade subjects and we provided teleprescription for these subjects as required. Results The first checkup was provided to 16,741 subjects. After one year, 2361 subjects participated in the second checkup, and the systolic blood pressure of these subjects was significantly decreased from an average of 121 mmHg to an average of 116 mmHg (P<.001). Based on these results, we propose a cost-effective method using a machine learning technique (the random forest method) with the medical interview, subject profiles, and checkup results as predictors, to avoid costly measurements of blood sugar and to ensure sustainability of the program in developing countries. Conclusions The results of this study demonstrate the benefits of an eHealth checkup and teleconsultation program as an effective health care system in developing countries. PMID:25630348
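
    A minimal sketch of the proposed stratification step follows; the feature set and values are illustrative assumptions, not the study's actual variables.

        from sklearn.ensemble import RandomForestClassifier

        # Predict the four-grade health category from interview/profile/
        # checkup features so costly blood-sugar measurement can be skipped.
        X = [[45, 121, 23.1], [60, 145, 28.4],
             [35, 110, 21.0], [55, 160, 30.2]]   # age, systolic BP, BMI
        y = ["green", "orange", "green", "red"]  # grades from prior checkups

        model = RandomForestClassifier(n_estimators=100,
                                       random_state=0).fit(X, y)
        print(model.predict([[52, 130, 25.0]]))  # grade for a new subject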

  12. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  13. Theory for noise of propellers in angular inflow with parametric studies and experimental verification

    NASA Technical Reports Server (NTRS)

    Hanson, Donald B.; Parzych, David J.

    1993-01-01

    This report presents the derivation of a frequency domain theory and working equations for radiation of propeller harmonic noise in the presence of angular inflow. In applying the acoustic analogy, integration over the tangential coordinate of the source region is performed numerically, permitting the equations to be solved without approximation for any degree of angular inflow. Inflow angle is specified in terms of yaw, pitch, and roll angles of the aircraft. Since these can be arbitrarily large, the analysis applies with equal accuracy to propellers and helicopter rotors. For thickness and loading, the derivation is given in complete detail with working equations for near and far field. However, the quadrupole derivation has been carried only far enough to show feasibility of the numerical approach. Explicit formulas are presented for computation of source elements, evaluation of Green's functions, and location of observer points in various visual and retarded coordinate systems. The resulting computer program, called WOBBLE, has been written in FORTRAN and follows the notation of this report very closely. The new theory is explored to establish the effects of varying inflow angle on axial and circumferential directivity. Also, parametric studies were performed to evaluate various phenomena outside the capabilities of earlier theories, such as an unsteady thickness effect. Validity of the theory was established by comparison with test data from conventional propellers and Prop Fans in flight and in wind tunnels under a variety of operating conditions and inflow angles.

  14. Verification of equations for incipient motion studies for a rigid rectangular channel.

    PubMed

    Bong, Charles Hin Joo; Lau, Tze Liang; Ghani, Aminuddin Ab

    2013-01-01

    The current study aims to verify the existing equations for incipient motion in a rigid rectangular channel. Data from experimental work on incipient motion from a rectangular flume with two different widths, namely 0.3 and 0.6 m, were compared with the critical velocity values predicted by the equations of Novak & Nalluri and El-Zaemey. The equation by El-Zaemey performed better, with an average discrepancy ratio of 1.06, compared with the equation by Novak & Nalluri with an average discrepancy ratio of 0.87. However, as the sediment deposit thickness increased, the equation by El-Zaemey became less accurate. A plot on the Shields Diagram using the experimental data showed the significant effect of the sediment deposit thickness: as the deposit became thicker, the dimensionless shear stress θ also increased. A new equation is proposed that incorporates the sediment deposit thickness. The new equation gave improved predictions, with an average discrepancy ratio of 1.02.
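
    The discrepancy-ratio metric used above is simply the mean ratio of predicted to observed critical velocity; a sketch with invented values:

        import numpy as np

        v_observed = np.array([0.42, 0.35, 0.50])    # measured, m/s
        v_predicted = np.array([0.45, 0.36, 0.52])   # candidate equation, m/s
        discrepancy_ratio = (v_predicted / v_observed).mean()
        print(round(discrepancy_ratio, 2))   # 1.0 would mean no average bias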

  15. First in situ TOF-PET study using digital photon counters for proton range verification

    NASA Astrophysics Data System (ADS)

    Cambraia Lopes, P.; Bauer, J.; Salomon, A.; Rinaldi, I.; Tabacchini, V.; Tessonnier, T.; Crespo, P.; Parodi, K.; Schaart, D. R.

    2016-08-01

    Positron emission tomography (PET) is the imaging modality most extensively tested for treatment monitoring in particle therapy. Optimal use of PET in proton therapy requires in situ acquisition of the relatively strong 15O signal due to its relatively short half-life (~2 min) and high oxygen content in biological tissues, enabling shorter scans that are less sensitive to biological washout. This paper presents the first performance tests of a scaled-down in situ time-of-flight (TOF) PET system based on digital photon counters (DPCs) coupled to Cerium-doped Lutetium Yttrium Silicate (LYSO:Ce) crystals, providing quantitative results representative of a dual-head tomograph that complies with spatial constraints typically encountered in clinical practice (2 × 50° of 360° transaxial angular acceptance). The proton-induced activity inside polymethylmethacrylate (PMMA) and polyethylene (PE) phantoms was acquired within beam pauses (in-beam) and immediately after irradiation by an actively-delivered synchrotron pencil-beam, with clinically relevant 125.67 MeV/u, 4.6 × 10^8 protons s^-1, and 10^10 total protons. 3D activity maps reconstructed with and without TOF information are compared to FLUKA simulations, demonstrating the benefit of TOF-PET to reduce limited-angle artefacts using a 382 ps full width at half maximum coincidence resolving time. The time-dependent contributions from different radionuclides to the total count-rate are investigated. We furthermore study the impact of the acquisition time window on the laterally integrated activity depth-profiles, with emphasis on 2 min acquisitions starting at different time points. The results depend on phantom composition and reflect the differences in relative contributions from the radionuclides originating from carbon and oxygen. We observe very good agreement between the shapes of the simulated and measured activity depth-profiles for post-beam protocols. However, our results also

  16. First in situ TOF-PET study using digital photon counters for proton range verification.

    PubMed

    Cambraia Lopes, P; Bauer, J; Salomon, A; Rinaldi, I; Tabacchini, V; Tessonnier, T; Crespo, P; Parodi, K; Schaart, D R

    2016-08-21

    Positron emission tomography (PET) is the imaging modality most extensively tested for treatment monitoring in particle therapy. Optimal use of PET in proton therapy requires in situ acquisition of the relatively strong 15O signal due to its relatively short half-life (~2 min) and high oxygen content in biological tissues, enabling shorter scans that are less sensitive to biological washout. This paper presents the first performance tests of a scaled-down in situ time-of-flight (TOF) PET system based on digital photon counters (DPCs) coupled to Cerium-doped Lutetium Yttrium Silicate (LYSO:Ce) crystals, providing quantitative results representative of a dual-head tomograph that complies with spatial constraints typically encountered in clinical practice (2 × 50° of 360° transaxial angular acceptance). The proton-induced activity inside polymethylmethacrylate (PMMA) and polyethylene (PE) phantoms was acquired within beam pauses (in-beam) and immediately after irradiation by an actively-delivered synchrotron pencil-beam, with clinically relevant 125.67 MeV/u, 4.6 × 10^8 protons s^-1, and 10^10 total protons. 3D activity maps reconstructed with and without TOF information are compared to FLUKA simulations, demonstrating the benefit of TOF-PET to reduce limited-angle artefacts using a 382 ps full width at half maximum coincidence resolving time. The time-dependent contributions from different radionuclides to the total count-rate are investigated. We furthermore study the impact of the acquisition time window on the laterally integrated activity depth-profiles, with emphasis on 2 min acquisitions starting at different time points. The results depend on phantom composition and reflect the differences in relative contributions from the radionuclides originating from carbon and oxygen. We observe very good agreement between the shapes of the simulated and measured activity depth-profiles for post-beam protocols. However, our results

  17. First in situ TOF-PET study using digital photon counters for proton range verification

    NASA Astrophysics Data System (ADS)

    Cambraia Lopes, P.; Bauer, J.; Salomon, A.; Rinaldi, I.; Tabacchini, V.; Tessonnier, T.; Crespo, P.; Parodi, K.; Schaart, D. R.

    2016-08-01

    Positron emission tomography (PET) is the imaging modality most extensively tested for treatment monitoring in particle therapy. Optimal use of PET in proton therapy requires in situ acquisition of the relatively strong 15O signal due to its relatively short half-life (~2 min) and high oxygen content in biological tissues, enabling shorter scans that are less sensitive to biological washout. This paper presents the first performance tests of a scaled-down in situ time-of-flight (TOF) PET system based on digital photon counters (DPCs) coupled to Cerium-doped Lutetium Yttrium Silicate (LYSO:Ce) crystals, providing quantitative results representative of a dual-head tomograph that complies with spatial constraints typically encountered in clinical practice (2 × 50° of 360° transaxial angular acceptance). The proton-induced activity inside polymethylmethacrylate (PMMA) and polyethylene (PE) phantoms was acquired within beam pauses (in-beam) and immediately after irradiation by an actively-delivered synchrotron pencil-beam, with clinically relevant 125.67 MeV/u, 4.6 × 10^8 protons s^-1, and 10^10 total protons. 3D activity maps reconstructed with and without TOF information are compared to FLUKA simulations, demonstrating the benefit of TOF-PET to reduce limited-angle artefacts using a 382 ps full width at half maximum coincidence resolving time. The time-dependent contributions from different radionuclides to the total count-rate are investigated. We furthermore study the impact of the acquisition time window on the laterally integrated activity depth-profiles, with emphasis on 2 min acquisitions starting at different time points. The results depend on phantom composition and reflect the differences in relative contributions from the radionuclides originating from carbon and oxygen. We observe very good agreement between the shapes of the simulated and measured activity depth-profiles for post-beam protocols. However, our results also

  18. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
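
    The flavor of the records such a database manages can be sketched with a simple data structure; the field names are assumptions for illustration, not the ISWE project's actual schema.

        from dataclasses import dataclass, field

        @dataclass
        class Requirement:
            req_id: str
            text: str
            verification_method: str        # e.g. test, analysis, inspection
            success_criteria: str
            compliance_status: str = "open"
            trace_children: list = field(default_factory=list)  # derived reqs

        r = Requirement("ISWE-123", "The welder shall ...", "test",
                        "Weld passes pull test")

    Storing traceability links alongside verification and compliance fields is what lets team members query status directly instead of maintaining it in separate documents.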

  19. VEG-01: Veggie Hardware Verification Testing

    NASA Technical Reports Server (NTRS)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware; microbial samples were taken, and plants were harvested, frozen, stored, and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  20. Programmable RET Mask Layout Verification

    NASA Astrophysics Data System (ADS)

    Beale, Daniel F.; Mayhew, Jeffrey P.; Rieger, Michael L.; Tang, Zongwu

    2002-12-01

    Emerging resolution enhancement techniques (RET) and OPC are dramatically increasing the complexity of mask layouts and, in turn, mask verification. Mask shapes needed to achieve required results on the wafer diverge significantly from corresponding shapes in the physical design, and in some cases a single chip layer may be decomposed into two masks used in multiple exposures. The mask verification challenge is to certify that a RET-synthesized mask layout will produce an acceptable facsimile of the design intent expressed in the design layout. Furthermore, tradeoffs among mask complexity, design intent, targeted process latitude, and other factors are playing a growing role in helping to control rising mask costs. All of these considerations must in turn be incorporated into the mask layout verification strategy needed for data prep sign-off. In this paper we describe a technique for assessing the lithographic quality of mask layouts for diverse RET methods while effectively accommodating various manufacturing objectives and specifications. It leverages the familiar DRC paradigm for identifying errors and producing DRC-like error shapes in its output layout. It integrates a unique concept of "check figures" - layer-based geometries that dictate where and how simulations of shapes on the wafer are to be compared to the original desired layout. We will show how this provides a highly programmable environment that makes it possible to engage in "compound" check strategies that vary based on design intent and adaptive simulation with multiple checks. Verification may be applied at the "go/no go" level or can be used to build a body of data for quantitative analysis of lithographic behavior at multiple process conditions or for specific user-defined critical features. In addition, we will outline automated methods that guide the selection of input parameters controlling specific verification strategies.
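
    A toy version of a check-figure comparison is sketched below: at each check location the simulated wafer edge is compared against the design target with a locally specified tolerance. All names and values are illustrative, not the paper's implementation.

        def check_edges(target_nm, simulated_nm, tolerance_nm):
            # Return DRC-like error records wherever |sim - target| > tol.
            errors = []
            for t, s, tol in zip(target_nm, simulated_nm, tolerance_nm):
                if abs(s - t) > tol:
                    errors.append({"target": t, "simulated": s, "tol": tol})
            return errors

        # Tight tolerance on a critical gate edge, loose on a dummy fill edge:
        print(check_edges([100, 200], [104, 215], [5, 10]))

    Varying the tolerance per location is the essence of a "compound" check strategy: the same simulation output is judged strictly where design intent is critical and loosely elsewhere.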

  1. Practical mask inspection system with printability and pattern priority verification

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Hideo; Ozaki, Fumio; Takahara, Kenichi; Inoue, Takafumi; Kikuiri, Nobutaka

    2011-05-01

    Through four years of study in the Association of Super-Advanced Electronics Technologies (ASET) on reducing mask manufacturing Turn Around Time (TAT) and cost, we have been able to establish a technology that improves the efficiency of the review process by applying a printability verification function that utilizes computational lithography simulations to analyze defects detected by a high-resolution mask inspection system. With the advent of Source-Mask Optimization (SMO) and other technologies that extend the life of existing optical lithography, it is becoming extremely difficult to judge a defect only by the shape of a mask pattern while avoiding pseudo-defects. Thus, printability verification is indispensable for filtering out nuisance defects from high-resolution mask inspection results. When using computational lithography simulations to verify printability with high precision, the image captured by the inspection system must be prepared with extensive care. However, for practical applications, this preparation process needs to be simplified. In addition, utilizing Mask Data Rank (MDR) to vary the defect detection sensitivity according to the patterns is also useful for simultaneously inspecting minute patterns and avoiding pseudo-defects. Combining these two technologies, we believe practical mask inspection for next-generation lithography is achievable. We have been improving the estimation accuracy of the printability verification function through discussion with several customers and evaluation of their masks. In this report, we describe the progress of these practical mask verification functions developed through customers' evaluations.

  2. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study, a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications. PMID:17365425
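
    A sketch of a filter-bank representation in the spirit described above: convolve the image with oriented Gabor filters and keep block-wise statistics as the feature vector. Parameters are assumptions; the published FingerCode-style schemes differ in detail.

        import numpy as np
        from scipy.signal import convolve2d

        def gabor_kernel(theta, freq=0.1, sigma=4.0, size=17):
            half = size // 2
            y, x = np.mgrid[-half:half + 1, -half:half + 1]
            xr = x * np.cos(theta) + y * np.sin(theta)
            yr = -x * np.sin(theta) + y * np.cos(theta)
            return (np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
                    * np.cos(2 * np.pi * freq * xr))

        def ridge_features(img, n_orient=8, block=16):
            feats = []
            for k in range(n_orient):
                resp = convolve2d(img, gabor_kernel(np.pi * k / n_orient),
                                  mode="same")
                h, w = resp.shape
                for i in range(0, h - block + 1, block):
                    for j in range(0, w - block + 1, block):
                        blk = resp[i:i + block, j:j + block]
                        # block-wise average absolute deviation
                        feats.append(np.abs(blk - blk.mean()).mean())
            return np.array(feats)   # compared by Euclidean distance

    Unlike minutiae matching, two such fixed-length vectors can be compared in constant time, which is what makes the representation attractive for online verification.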

  3. Study of wood plastic composite in the presence of nitrogen containing additives

    NASA Astrophysics Data System (ADS)

    Ali, K. M. Idriss; Khan, Mubarak A.; Husain, M. M.

    1994-10-01

    The effect of nitrogen-containing additives on wood plastic composites of MMA with simul and mango wood of Bangladesh has been investigated. Nine different additives were used; the additives containing a carboamide group induced the highest tensile strength in the composite.

  4. Enhanced Cancelable Biometrics for Online Signature Verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. In this paper, we propose a scheme to improve the performance of a cancelable approach for online signature verification. Our scheme generates two cancelable datasets from one raw dataset and uses both for verification. Preliminary experiments were performed using a distance-based online signature verification algorithm. The experimental results show that our proposed scheme is promising.
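
    One common way to realize a cancelable transform is a user-specific random projection; the sketch below illustrates that general idea and is not necessarily the authors' transform. The paper's scheme derives two cancelable datasets from one raw dataset; here a second matrix generated from a second seed would play that role.

        import numpy as np

        rng = np.random.default_rng(seed=42)  # seed acts as the revocable token
        P = rng.standard_normal((32, 64))     # re-issuable projection matrix

        def enroll(feature_vec):
            return P @ feature_vec            # only the transformed template is stored

        def verify(template, probe_vec, threshold=5.0):
            # Distance-based decision entirely in the transformed space.
            return np.linalg.norm(template - P @ probe_vec) < threshold

    If a template leaks, a new seed yields a new projection, so the enrolled biometric can be "cancelled" and re-issued without exposing the raw signature data.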

  5. National Energy Efficiency Evaluation, Measurement and Verification (EM&V) Standard: Scoping Study of Issues and Implementation Requirements

    SciTech Connect

    Schiller Consulting, Inc.; Schiller, Steven R.; Goldman, Charles A.; Galawish, Elsia

    2011-02-04

    This report is a scoping study that identifies issues associated with developing a national evaluation, measurement and verification (EM&V) standard for end-use, non-transportation, energy efficiency activities. The objectives of this study are to identify the scope of such a standard and define EM&V requirements and issues that will need to be addressed in a standard. To explore these issues, we provide and discuss: (1) a set of definitions applicable to an EM&V standard; (2) a literature review of existing guidelines, standards, and 'initiatives' relating to EM&V standards as well as a review of 'bottom-up' versus 'top-down' evaluation approaches; (3) a summary of EM&V related provisions of two recent federal legislative proposals (Congressmen Waxman and Markey's American Clean Energy and Security Act of 2009 and Senator Bingaman's American Clean Energy Leadership Act of 2009) that include national efficiency resource requirements; (4) an annotated list of issues that are likely to be central to, and need to be considered when developing, a national EM&V standard; and (5) a discussion of the implications of such issues. There are three primary reasons for developing a national efficiency EM&V standard. First, some policy makers, regulators and practitioners believe that a national standard would streamline EM&V implementation, reduce costs and complexity, and improve comparability of results across jurisdictions, although there are benefits associated with each jurisdiction setting its own EM&V requirements based on its specific portfolio, evaluation budgets, and objectives. Second, if energy efficiency is determined by the US Environmental Protection Agency to be a Best Available Control Technology (BACT) for avoiding criteria pollutant and/or greenhouse gas emissions, then a standard can be required for documenting the emission reductions resulting from efficiency actions. The third reason for a national EM&V standard is that such a standard is

  6. Verification and arms control

    SciTech Connect

    Potter, W.C.

    1985-01-01

    Recent years have witnessed an increased stress upon the verification of arms control agreements, both as a technical problem and as a political issue. As one contribution here points out, the middle ground has shrunk between those who are persuaded that the Soviets are ''cheating'' and those who are willing to take some verification risks for the sake of achieving arms control. One angle, according to a Lawrence Livermore physicist who served as a member of the delegation to the various test-ban treaty negotiations, is the limited effectiveness of on-site inspection as compared to other means of verification.

  7. The 2014 Sandia Verification and Validation Challenge: Problem statement

    DOE PAGES

    Hu, Kenneth; Orient, George

    2016-01-18

    This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and the resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.

  8. Study on the Tritium Behaviors in the VHTR System. Part 1: Development of Tritium Analysis Code for VHTR and Verification

    SciTech Connect

    Eung Soo Kim; Chang Ho Oh; Mike Patterson

    2010-07-01

    A tritium permeation analysis code (TPAC) has been developed at Idaho National Laboratory (INL) using the MATLAB SIMULINK package for analysis of tritium behaviors in VHTRs integrated with hydrogen production and process heat application systems. The modeling is based on the mass balance of tritium-containing species and hydrogen (i.e., HT, H2, HTO, HTSO4, and TI) coupled with a variety of tritium source, sink, and permeation models. The code includes (1) tritium sources from ternary fission and neutron reactions with 6Li, 7Li, 10B, and 3He, (2) a tritium purification system, (3) leakage of tritium with the coolant, (4) permeation through pipes, vessels, and heat exchangers, (5) an electrolyzer for high temperature steam electrolysis (HTSE), and (6) isotope exchange for the SI process. Verification of the code has been performed by comparison with analytical solutions, experimental data, and benchmark code results based on the Peach Bottom reactor design. The results showed that all the governing equations are well implemented into the code and correctly solved. This paper summarizes the background, the theory, the code structures, and some verification results related to the TPAC code development at Idaho National Laboratory (INL).
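
    The single-species mass balance underlying such a code can be illustrated compactly. The following Python sketch is not TPAC itself (TPAC is a MATLAB SIMULINK package); it is a minimal one-compartment balance in which radioactive decay, purification, coolant leakage, and permeation are lumped into first-order rate constants, and every parameter value is an assumption chosen for illustration.

```python
import math

# Hypothetical single-compartment tritium balance (illustration only):
#   dN/dt = S - (lambda_decay + k_purif + k_leak + k_perm) * N
# where N is the tritium inventory in the primary circuit.

S = 1.0e12          # tritium source rate (fission + activation), atoms/s (assumed)
half_life = 12.32 * 365.25 * 24 * 3600   # tritium half-life, s
lam = math.log(2) / half_life            # radioactive decay constant, 1/s
k_purif = 3.0e-6    # purification removal constant, 1/s (assumed)
k_leak = 1.0e-8     # coolant leakage constant, 1/s (assumed)
k_perm = 5.0e-7     # permeation through pipes/heat exchangers, 1/s (assumed)

k_total = lam + k_purif + k_leak + k_perm

def inventory(t, n0=0.0):
    """Closed-form solution of the linear balance:
    N(t) = S/k + (N0 - S/k) * exp(-k t)."""
    n_eq = S / k_total
    return n_eq + (n0 - n_eq) * math.exp(-k_total * t)

for days in (1, 30, 365, 3650):
    t = days * 86400.0
    print(f"day {days:5d}: N = {inventory(t):.3e} atoms "
          f"(equilibrium {S / k_total:.3e})")
```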

  9. Verification of RADTRAN

    SciTech Connect

    Kanipe, F.L.; Neuhauser, K.S.

    1995-12-31

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes.

  10. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  11. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  12. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  13. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 11 2011-01-01 2011-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  14. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 11 2013-01-01 2013-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  15. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 11 2014-01-01 2014-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  16. 7 CFR 1710.253 - Engineering and cost studies-addition of generation capacity.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 11 2012-01-01 2012-01-01 false Engineering and cost studies-addition of generation... TO ELECTRIC LOANS AND GUARANTEES Construction Work Plans and Related Studies § 1710.253 Engineering... engineering and cost studies as specified by RUS. The studies shall cover a period from the beginning of...

  17. A patient-specific quality assurance study on absolute dose verification using ionization chambers of different volumes in RapidArc treatments

    SciTech Connect

    Syam Kumar, S.A.; Sukumar, Prabakar; Sriram, Padmanaban; Rajasekaran, Dhanabalan; Aketi, Srinu; Vivekanandan, Nagarajan

    2012-01-01

    The recalculation of 1 fraction from a patient treatment plan on a phantom and subsequent measurements have become the norm for measurement-based verification, which combines the quality assurance recommendations that deal with the treatment planning system and the beam delivery system. This type of evaluation has prompted attention to measurement equipment and techniques. Ionization chambers are considered the gold standard because of their precision, availability, and relative ease of use. This study evaluates and compares 5 different ionization chamber-phantom combinations for verification in routine patient-specific quality assurance of RapidArc treatments. Fifteen different RapidArc plans conforming to the clinical standards were selected for the study. Verification plans were then created for each treatment plan with different chamber-phantom combinations scanned by computed tomography. These include the Medtec intensity-modulated radiation therapy (IMRT) phantom with a micro-ionization chamber (0.007 cm³) and a pinpoint chamber (0.015 cm³), the PTW-Octavius phantom with a semiflex chamber (0.125 cm³) and a 2D array (0.125 cm³), and an indigenously made circular wax phantom with a 0.6 cm³ chamber. The measured isocenter absolute dose was compared with the treatment planning system (TPS) plan. The micro-ionization chamber shows more deviation than the semiflex and 0.6 cm³ chambers, with maximum variations of -4.76%, -1.49%, and 2.23% for the micro-ionization, semiflex, and farmer chambers, respectively. The positive variations indicate that the chamber with larger volume overestimates. The farmer chamber shows higher deviation than the 0.125 cm³ chamber. In general the deviation was found to be <1% with the semiflex and farmer chambers. A maximum variation of 2% was observed for the 0.007 cm³ ionization chamber, except in a few cases. The pinpoint chamber underestimates the calculated isocenter dose by a maximum of 4.8%. Absolute dose
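
    The core acceptance check described above reduces to a percent deviation between measured and TPS-calculated isocenter doses. A minimal sketch follows; the chamber labels, dose values, and the 3% action level are illustrative assumptions, not data from the study.

```python
# Minimal sketch of the percent-deviation check used in patient-specific QA:
#   deviation (%) = 100 * (measured - planned) / planned
# Chamber names and dose values below are illustrative, not from the study.

measurements = {
    # chamber: (planned dose [Gy], measured dose [Gy])
    "micro (0.007 cc)":    (2.000, 1.905),
    "pinpoint (0.015 cc)": (2.000, 1.910),
    "semiflex (0.125 cc)": (2.000, 1.985),
    "farmer (0.6 cc)":     (2.000, 2.045),
}

TOLERANCE = 3.0  # percent; an assumed clinical action level

for chamber, (planned, measured) in measurements.items():
    dev = 100.0 * (measured - planned) / planned
    status = "PASS" if abs(dev) <= TOLERANCE else "FAIL"
    print(f"{chamber:22s} deviation = {dev:+.2f}%  -> {status}")
```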

  18. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data, employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is the critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is also included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  19. Speaker Verification in Realistic Noisy Environment in Forensic Science

    NASA Astrophysics Data System (ADS)

    Kamada, Toshiaki; Minematsu, Nobuaki; Osanai, Takashi; Makinae, Hisanori; Tanimoto, Masumi

    In forensic telephone-speech speaker verification, we may be requested to identify a speaker in a very noisy environment, unlike the conditions in general research. In a noisy environment, we process speech first by clarifying it. However, a previous study of speaker verification from clarified speech did not yield satisfactory results. In this study, we experimented with speaker verification using clarified speech recorded in a noisy environment, and we examined the relationship between improved acoustic quality and speaker verification results. Moreover, experiments with realistic noise, such as a crime prevention alarm and power supply noise, were conducted, and speaker verification accuracy in a realistic environment was examined. We confirmed the validity of speaker verification with clarification of speech in a realistic noisy environment.

  20. Studies of jet fuel additives using the quartz crystal microbalance and pressure monitoring at 140 C

    SciTech Connect

    Zabarnick, S.; Grinstead, R.R. (Aerospace Mechanics Div./KL-463)

    1994-11-01

    Recent advances in jet aircraft and engine technology have placed an ever-increasing heat load on the aircraft. The bulk of this excess heat is absorbed by the aircraft fuel, as jet fuel is used as the primary coolant for the numerous heat sources. The quartz crystal microbalance (QCM) and pressure monitoring are used for the evaluation of jet fuel additives for the improvement of jet fuel thermal stability. The mechanisms of additive behavior are determined by measuring time-dependent deposition with the QCM and oxidation by pressure measurements. Studies at various additive concentrations permit the determination of optimum additive concentrations. Additive packages made of mixtures of antioxidants, detergents/dispersants, and metal deactivators are shown to yield good improvements in thermal stability over a wide range of jet fuel types.

  1. SU-E-J-82: Intra-Fraction Proton Beam-Range Verification with PET Imaging: Feasibility Studies with Monte Carlo Simulations and Statistical Modeling

    SciTech Connect

    Lou, K; Mirkovic, D; Sun, X; Zhu, X; Poenisch, F; Grosshans, D; Shao, Y; Clark, J

    2014-06-01

    Purpose: To study the feasibility of intra-fraction proton beam-range verification with PET imaging. Methods: Two homogeneous cylindrical PMMA phantoms (290 mm axial length; 38 mm and 200 mm diameter, respectively) were studied using PET imaging: the small phantom with a mouse-sized PET scanner (61 mm diameter field of view (FOV)) and the larger phantom with a human brain-sized PET scanner (300 mm FOV). Monte Carlo (MC) simulations (MCNPX and GATE) were used to simulate 179.2 MeV proton pencil beams irradiating the two phantoms and imaged by the two PET systems. A total of 50 simulations were conducted to generate 50 positron activity distributions and correspondingly 50 measured activity-ranges. The accuracy and precision of these activity-ranges were calculated under different conditions (including count statistics and other factors, such as crystal cross-section). Separate from the MC simulations, an activity distribution measured from a simulated PET image was modeled as a noiseless positron activity distribution corrupted by Poisson counting noise. The results from these two approaches were compared to assess the impact of count statistics on the accuracy and precision of activity-range calculations. Results: MC simulations show that the accuracy and precision of an activity-range are dominated by the number (N) of coincidence events in the reconstructed image: the uncertainty decreases in proportion to 1/sqrt(N), which can be understood from the statistical modeling. MC simulations also indicate that the coincidence events acquired within the first 60 seconds with 10^9 protons (small phantom) and 10^10 protons (large phantom) are sufficient to achieve both sub-millimeter accuracy and precision. Conclusion: Under the current MC simulation conditions, the initial study indicates that the accuracy and precision of beam-range verification are dominated by count statistics, and intra-fraction PET image-based beam-range verification is
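
    The reported 1/sqrt(N) behavior can be reproduced with a toy model: draw Poisson-noisy counts from an idealized activity profile with a sigmoid distal falloff, estimate the falloff position, and watch the spread of the estimates shrink as the number of events grows. The sketch below makes no attempt to model the actual phantoms or scanners; the profile shape, the estimator, and the count levels are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Idealized 1-D positron-activity profile with a sigmoid distal falloff.
z = np.linspace(0.0, 200.0, 401)          # depth, mm
true_range = 150.0                        # assumed falloff position, mm
profile = 1.0 / (1.0 + np.exp((z - true_range) / 2.0))
profile /= profile.sum()                  # normalize to probability per bin

def estimate_range(counts):
    """Locate the distal falloff as the last bin above half of the
    smoothed maximum (a crude but serviceable estimator)."""
    smooth = np.convolve(counts, np.ones(9) / 9.0, mode="same")
    half = 0.5 * smooth.max()
    distal = np.where(smooth >= half)[0].max()
    return z[distal]

for n_events in (1e3, 1e4, 1e5, 1e6):
    estimates = []
    for _ in range(200):                      # repeated noisy realizations
        counts = rng.poisson(profile * n_events)
        estimates.append(estimate_range(counts))
    print(f"N = {int(n_events):>8d}: range std = {np.std(estimates):6.3f} mm")
```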

  2. Monte Carlo patient study on the comparison of prompt gamma and PET imaging for range verification in proton therapy.

    PubMed

    Moteabbed, M; España, S; Paganetti, H

    2011-02-21

    The purpose of this work was to compare the clinical adaptation of prompt gamma (PG) imaging and positron emission tomography (PET) as independent tools for non-invasive proton beam range verification and treatment validation. The PG range correlation and its differences with PET have been modeled for the first time in a highly heterogeneous tissue environment, using different field sizes and configurations. Four patients with different tumor locations (head and neck, prostate, spine and abdomen) were chosen to compare the site-specific behaviors of the PG and PET images, using both passive scattered and pencil beam fields. Accurate reconstruction of dose, PG and PET distributions was achieved by using the planning computed tomography (CT) image in a validated GEANT4-based Monte Carlo code capable of modeling the treatment nozzle and patient anatomy in detail. The physical and biological washout phenomenon and decay half-lives for PET activity for the most abundant isotopes such as ¹¹C, ¹⁵O, ¹³N, ³⁰P and ³⁸K were taken into account in the data analysis. The attenuation of the gamma signal after traversing the patient geometry and respective detection efficiencies were estimated for both methods to ensure proper comparison. The projected dose, PG and PET profiles along many lines in the beam direction were analyzed to investigate the correlation consistency across the beam width. For all subjects, the PG method showed on average approximately 10 times higher gamma production rates than the PET method before, and 60 to 80 times higher production after including the washout correction and acquisition time delay. This rate strongly depended on tissue density and elemental composition. For broad passive scattered fields, it was demonstrated that large differences exist between PG and PET signal falloff positions and the correlation with the dose distribution for different lines in the beam direction. These variations also depended on the treatment site and the

  3. Development of Genetic Markers for Triploid Verification of the Pacific Oyster, Crassostrea gigas

    PubMed Central

    Kang, Jung-Ha; Lim, Hyun Jeong; Kang, Hyun-Soek; Lee, Jung-Mee; Baby, Sumy; Kim, Jong-Joo

    2013-01-01

    The triploid Pacific oyster, which is produced by mating tetraploid and diploid oysters, is favored by the aquaculture industry because of its better flavor and firmer texture, particularly during the summer. However, tetraploid oyster production is not feasible in all oysters; the development of tetraploid oysters is ongoing in some oyster species. Thus, a method for ploidy verification is necessary for this endeavor, in addition to ploidy verification in aquaculture farms and in the natural environment. In this study, a method for ploidy verification of triploid and diploid oysters was developed using multiplex polymerase chain reaction (PCR) panels containing primers for molecular microsatellite markers. Two microsatellite multiplex PCR panels consisting of three markers each were developed using previously developed microsatellite markers that were optimized for performance. Both panels were able to verify the ploidy levels of 30 triploid oysters with 100% accuracy, illustrating the utility of microsatellite markers as a tool for verifying the ploidy of individual oysters. PMID:25049868

  4. Verification of heterogeneous multi-agent system using MCMAS

    NASA Astrophysics Data System (ADS)

    Choi, Jiyoung; Kim, Seungkeun; Tsourdos, Antonios

    2015-03-01

    The focus of the paper is how to model the autonomous behaviours of heterogeneous multi-agent systems such that it can be verified that they will always operate within predefined mission requirements and constraints. This is done using formal methods: an abstraction of the behaviours for modelling, and model checking for their verification. Three case studies are presented to verify the decision-making behaviours of a heterogeneous multi-agent system using a convoy mission scenario. The multi-agent system in the case studies has been extended by gradually increasing the number of agents and the function complexity. For automatic verification, the model checker for multi-agent systems (MCMAS) is adopted due to its novel capability to accommodate multi-agent systems, and it successfully verifies the targeted behaviours of the team-level autonomous systems. The verification results retrospectively help improve the design of the decision-making algorithms by considering additional agents and behaviours during three steps of scenario modification. Consequently, the last scenario deals with a system composed of a ground control system, two unmanned aerial vehicles, and four unmanned ground vehicles with fault-tolerant and communication relay capabilities.

  5. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  6. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Deney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
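
    The mechanics of VC construction can be illustrated with a toy weakest-precondition calculator for straight-line assignments, carrying a human-readable label at each step in the spirit of the labeled rules described above. This is a minimal sketch, not the authors' system; the textual substitution and the example program are assumptions made for illustration.

```python
import re

def wp_assign(var, expr, post):
    """Weakest precondition of `var := expr` w.r.t. postcondition `post`:
    substitute `expr` for free occurrences of `var` (whole-word, textual)."""
    return re.sub(rf"\b{re.escape(var)}\b", f"({expr})", post)

def vc_for_straightline(assignments, post):
    """Push a postcondition backwards through a list of assignments,
    recording an explanatory label at each step (cf. labeled Hoare rules)."""
    labels = []
    cond = post
    for var, expr in reversed(assignments):
        cond = wp_assign(var, expr, cond)
        labels.append(f"after substituting {var} := {expr}, obligation is: {cond}")
    return cond, labels

# Example: {?} x := x + 1; y := 2 * x {y > 2}
vc, trace = vc_for_straightline([("x", "x + 1"), ("y", "2 * x")], "y > 2")
print("verification condition:", vc)   # -> (2 * (x + 1)) > 2
for step in trace:
    print("  ", step)
```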

  7. Wind gust warning verification

    NASA Astrophysics Data System (ADS)

    Primo, Cristina

    2016-07-01

    Operational meteorological centres around the world increasingly include warnings as one of their regular forecast products. Warnings are issued to alert the public to extreme weather situations that might occur and lead to damage and losses. In forecasting these extreme events, meteorological centres help their users prevent the damage or losses they might suffer. However, verifying these warnings requires specific methods. This is due not only to the fact that such events happen rarely, but also to the new temporal dimension added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that can appear when dealing with warning verification. It also proposes some new verification approaches that can be applied to wind warnings. These new techniques are then applied to a real-life example, the verification of wind gust warnings at the German Meteorological Centre ("Deutscher Wetterdienst"). Finally, the results obtained from the latter are discussed.
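
    The categorical scores typically used in warning verification can be sketched from a 2x2 contingency table, with the time-window aspect handled by a simple interval-overlap matching rule. The matching rule, the warning windows, and the event times below are illustrative assumptions, not the approaches proposed in the paper.

```python
# Sketch of warning verification via a 2x2 contingency table.
# A warning "hits" if an observed gust event falls inside its time window;
# this simple interval-overlap matching is an illustrative assumption.

def contingency(warnings, events):
    """warnings: list of (start, end) windows; events: list of event times."""
    hits = sum(any(s <= t <= e for s, e in warnings) for t in events)
    misses = len(events) - hits
    false_alarms = sum(not any(s <= t <= e for t in events) for s, e in warnings)
    return hits, misses, false_alarms

def scores(hits, misses, false_alarms):
    pod = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
    csi = hits / (hits + misses + false_alarms) if hits + misses + false_alarms else float("nan")
    return pod, far, csi

warnings = [(0, 6), (10, 16), (20, 26)]   # warning windows (hours), illustrative
events = [3.0, 12.5, 30.0]                # observed gust events (hours)

h, m, f = contingency(warnings, events)
pod, far, csi = scores(h, m, f)
print(f"hits={h} misses={m} false alarms={f}")
print(f"POD={pod:.2f} FAR={far:.2f} CSI={csi:.2f}")
```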

  8. Complementary technologies for verification of excess plutonium

    SciTech Connect

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-12-31

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630-670 keV region of the emitted gamma-ray spectrum to determine the ratio of ²⁴⁰Pu to ²³⁹Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime.
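
    The template comparison in the low-resolution technique can be illustrated with a toy check: sum the counts in a few spectral regions and compare the ratios against a stored reference. The region boundaries, the tolerance, and the synthetic spectra below are assumptions made for illustration, not the regions or thresholds of an actual verification regime.

```python
import numpy as np

# Sketch of a template-style confirmation measurement: compare region sums
# of a newly measured low-resolution spectrum against a stored template.
# Region boundaries, tolerance, and spectra are all illustrative.

rng = np.random.default_rng(1)
channels = 256
template = 1000.0 * np.exp(-np.linspace(0, 5, channels))   # stored reference
measured = rng.poisson(template).astype(float)             # later re-measurement

regions = [(10, 40), (60, 100), (140, 200)]  # channel windows (assumed)

def region_ratios(spec, ref, regions):
    """Ratio of measured to reference counts in each spectral region."""
    return [spec[a:b].sum() / ref[a:b].sum() for a, b in regions]

ratios = region_ratios(measured, template, regions)
consistent = all(abs(r - 1.0) < 0.05 for r in ratios)  # 5% tolerance, assumed
print("region ratios:", [f"{r:.3f}" for r in ratios])
print("item consistent with template:", consistent)
```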

  9. Verification of COSMO model over Poland

    NASA Astrophysics Data System (ADS)

    Linkowska, Joanna; Mazur, Andrzej; Wyszogrodzki, Andrzej

    2014-05-01

    The Polish National Weather Service and Institute of Meteorology and Water Management - National Research Institute (IMWM-NRI, Warsaw, Poland) joined the Consortium for Small-Scale Modeling (COSMO) in 2002. Thanks to cooperation within the consortium, the meteorological model COSMO is run operationally at IMWM-NRI at both 2.8 km and 7 km horizontal resolutions. In research mode, data assimilation tests have been carried out using a 6-hourly cycle nudging scheme. We present verification results for the COSMO model, comparing model-generated surface temperature, wind, and rainfall rates with Synop measurements. In addition, verification results of vertical profiles for chosen variables are analyzed and presented. The verification is divided into the following areas: i) assessing the impact of data assimilation on the quality of 2.8 km resolution model forecasts by switching data assimilation on and off, ii) spatio-temporal verification of model results at 7 km resolution, and iii) conditional verification of selected parameters against chosen meteorological condition(s).
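
    Point verification of surface parameters against Synop observations typically starts from per-station bias and RMSE. The sketch below illustrates that aggregation only; the station names and forecast-observation pairs are invented for illustration.

```python
import math

# Minimal sketch of point verification against Synop observations:
# bias and RMSE of forecast 2 m temperature, aggregated per station.
# Station names and values are illustrative.

pairs = {
    # station: list of (forecast, observed) 2 m temperatures [degC]
    "Warszawa": [(12.1, 11.4), (13.0, 13.2), (9.8, 10.5)],
    "Krakow":   [(11.5, 12.0), (12.2, 12.9), (8.9, 9.1)],
}

for station, fo in pairs.items():
    errors = [f - o for f, o in fo]
    bias = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    print(f"{station:9s} bias = {bias:+.2f} K, rmse = {rmse:.2f} K")
```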

  10. Evaluating Drugs and Food Additives for Public Use: A Case Studies Approach.

    ERIC Educational Resources Information Center

    Merritt, Sheridan V.

    1980-01-01

    Described is a case study used in an introductory college biology course that provides a basis for generating debate on an issue concerning the regulation of controversial food additives and prescription drugs. The case study contained within this article deals with drug screening, specifically with information related to thalidomide. (CS)

  11. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "Did the product meet the stated specification, performance, or design documentation?" In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts", which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the system engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  12. Study raises questions about measurement of 'additionality,'or maintaining domestic health spending amid foreign donations.

    PubMed

    Garg, Charu C; Evans, David B; Dmytraczenko, Tania; Izazola-Licea, José-Antonio; Tangcharoensathien, Viroj; Ejeder, Tessa Tan-Torres

    2012-02-01

    Donor nations and philanthropic organizations increasingly require that funds provided for a specific health priority such as HIV should supplement domestic spending on that priority-a concept known as "additionality." We investigated the "additionality" concept using data from Honduras, Rwanda, and Thailand, and we found that the three countries increased funding for HIV in response to increased donor funding. In contrast, the study revealed that donors, faced with increased Global Fund resources for HIV in certain countries, tended to decrease their funding for HIV or shift funds for use in non-HIV health areas. More broadly, we found many problems in the measurement and interpretation of additionality. These findings suggest that it would be preferable for donors and countries to agree on how best to use available domestic and external funds to improve population health, and to develop better means of tracking outcomes, than to try to develop more sophisticated methods to track additionality.

  13. SHEEP MOUNTAIN WILDERNESS STUDY AREA AND CUCAMONGA WILDERNESS AND ADDITIONS, CALIFORNIA.

    USGS Publications Warehouse

    Evans, James G.; Ridenour, James

    1984-01-01

    The Sheep Mountain Wilderness Study Area and Cucamonga Wilderness and additions encompass approximately 104 sq mi of the eastern San Gabriel Mountains, Los Angeles and San Bernardino Counties, California. A mineral survey indicates areas of probable and substantiated tungsten and gold resource potential for parts of the Sheep Mountain Wilderness Study Area and an area of probable tungsten and gold resource potential in the Cucamonga Wilderness and additions. The rugged topography, withdrawal of lands from mineral entry to protect watershed, and restricted entry of lands during periods of high fire danger have contributed to the continuing decline in mineral exploration. The geologic setting precludes the presence of energy resources.

  14. Influence of Polarization on Carbohydrate Hydration: A Comparative Study Using Additive and Polarizable Force Fields.

    PubMed

    Pandey, Poonam; Mallajosyula, Sairam S

    2016-07-14

    Carbohydrates are known to closely modulate their surrounding solvent structures and influence solvation dynamics. Spectroscopic investigations studying far-IR regions (below 1000 cm⁻¹) have observed spectral shifts in the libration band (around 600 cm⁻¹) of water in the presence of monosaccharides and polysaccharides. In this paper, we use molecular dynamics simulations to gain atomistic insight into carbohydrate-water interactions and to specifically highlight the differences between additive (nonpolarizable) and polarizable simulations. A total of six monosaccharide systems, the α and β anomers of glucose, galactose, and mannose, were studied using the additive and polarizable Chemistry at HARvard Macromolecular Mechanics (CHARMM) carbohydrate force fields. Solvents were modeled using the three additive water models TIP3P, TIP4P, and TIP5P in additive simulations and the polarizable water model SWM4 in polarizable simulations. The presence of carbohydrate has a significant effect on the microscopic water structure, with the effects being pronounced for proximal water molecules. Notably, disruption of the tetrahedral arrangement of proximal water molecules was observed due to the formation of strong carbohydrate-water hydrogen bonds in both additive and polarizable simulations. However, the inclusion of polarization resulted in significant water-bridge occupancies, improved ordered water structures (tetrahedral order parameter), and longer carbohydrate-water H-bond correlations as compared to those for additive simulations. Additionally, polarizable simulations allowed the calculation of power spectra from the dipole-dipole autocorrelation function, which corresponds to the IR spectra. From the power spectra, we could identify spectral signatures differentiating the proximal and bulk water structures, which could not be captured from additive simulations. PMID:27266974
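
    The tetrahedral order parameter mentioned above has a standard definition (often attributed to Errington and Debenedetti): q = 1 - (3/8) * sum over the six pairs of a molecule's four nearest neighbors of (cos psi + 1/3)^2, so that q = 1 for a perfect tetrahedron. The sketch below evaluates it for an ideal and a perturbed neighbor arrangement; it is a geometric illustration, not the analysis pipeline used in the paper.

```python
import numpy as np
from itertools import combinations

def tetrahedral_order(center, neighbors):
    """Errington-Debenedetti tetrahedral order parameter:
    q = 1 - (3/8) * sum over neighbor pairs of (cos(psi) + 1/3)^2,
    computed from the four nearest neighbors of `center`. q = 1 for a
    perfect tetrahedron; q ~ 0 for an uncorrelated arrangement."""
    vecs = [(n - center) / np.linalg.norm(n - center) for n in neighbors]
    total = sum((np.dot(a, b) + 1.0 / 3.0) ** 2 for a, b in combinations(vecs, 2))
    return 1.0 - (3.0 / 8.0) * total

# Perfect tetrahedral arrangement (alternate cube vertices) -> q = 1.
center = np.zeros(3)
perfect = [np.array(v, float) for v in
           [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]]
print(f"perfect tetrahedron: q = {tetrahedral_order(center, perfect):.3f}")

# A slightly distorted arrangement (illustrative) -> q < 1.
rng = np.random.default_rng(2)
distorted = [v + 0.15 * rng.standard_normal(3) for v in perfect]
print(f"distorted arrangement: q = {tetrahedral_order(center, distorted):.3f}")
```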

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  16. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is verification acceleration possible? Increasing the visibility of the internal nodes of the FPGA results in much faster debug time, and forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? No; this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  17. Telescope performance verification

    NASA Astrophysics Data System (ADS)

    Swart, Gerhard P.; Buckley, David A. H.

    2004-09-01

    While Systems Engineering appears to be widely applied on the very large telescopes, it is lacking in the development of many of the medium and small telescopes currently in progress. The latter projects rely heavily on the experience of the project team, verbal requirements and conjecture based on the successes and failures of other telescopes. Furthermore, it is considered an unaffordable luxury to "close-the-loop" by carefully analysing and documenting the requirements and then verifying the telescope's compliance with them. In this paper the authors contend that a Systems Engineering approach is a keystone in the development of any telescope and that verification of the telescope's performance is not only an important management tool but also forms the basis upon which successful telescope operation can be built. The development of the Southern African Large Telescope (SALT) has followed such an approach and is now in the verification phase of its development. Parts of the SALT verification process will be discussed in some detail to illustrate the suitability of this approach, including oversight by the telescope shareholders, recording of requirements and results, design verification and performance testing. Initial test results will be presented where appropriate.

  18. Verification and Validation of RADTRAN 5.5.

    SciTech Connect

    Osborn, Douglas.; Weiner, Ruth F.; Mills, George Scott; Hamp, Steve C.

    2005-02-01

    This document contains a description of the verification and validation process used for the RADTRAN 5.5 code. The verification and validation process ensured that the proper calculational models and mathematical and numerical methods were used in the RADTRAN 5.5 code for the determination of risk and consequence assessments. The differences between RADTRAN 5 and RADTRAN 5.5 are the addition of tables, an expanded isotope library, and an additional user-defined meteorological option for accident dispersion.

  19. Comparative study of electrolyte additives using electrochemical impedance spectroscopy on symmetric cells

    NASA Astrophysics Data System (ADS)

    Petibon, R.; Sinha, N. N.; Burns, J. C.; Aiken, C. P.; Ye, Hui; VanElzen, Collette M.; Jain, Gaurav; Trussler, S.; Dahn, J. R.

    2014-04-01

    The effect of various electrolyte additives and additive combinations added to a 1 M LiPF6 EC:EMC electrolyte on the positive and negative electrode surfaces of 1-year-old wound LiCoO2/graphite cells and Li[Ni0.4Mn0.4Co0.2]O2/graphite cells was studied using electrochemical impedance spectroscopy (EIS) on symmetric cells. The additives tested were vinylene carbonate (VC), trimethoxyboroxine (TMOBX), fluoroethylene carbonate (FEC), lithium bis(trifluoromethanesulfonyl)imide (LiTFSI), and H2O, alone or in combination. In general, compared to the control electrolyte, the additives tested reduced the impedance of the positive electrode and increased the impedance of the negative electrode, with the exception of LiTFSI in Li[Ni0.4Mn0.4Co0.2]O2/graphite wound cells. A higher charge voltage led to higher positive electrode impedance, with the exceptions of 2% VC + 2% FEC and 2% LiTFSI. In some cases, an additive controlled the formation of the SEI at one electrode when mixed with one additive, and shared the formation of the SEI at that electrode when mixed with a different additive.

  20. A phantom study of an in vivo dosimetry system using plastic scintillation detectors for real-time verification of 192Ir HDR brachytherapy

    PubMed Central

    Therriault-Proulx, Francois; Briere, Tina M.; Mourtada, Firas; Aubin, Sylviane; Beddar, Sam; Beaulieu, Luc

    2011-01-01

    detected by the PSD system from 78% to 100% of the time depending on the acceptable range value. The implementation of a stem effect removal technique was shown to be necessary, particularly when calculating doses at specific dwell positions, and allowed decreasing the number of false-error detections—the detection of an error when it should not be the case—from 19 to 1 for a 5% threshold out of 43 measurements. The use of the PSD system to perform temporal verification of elapsed time by the source in each catheter—generally on the order of minutes—was shown to be in agreement within a couple of seconds with the treatment plan. Conclusions: We showed that the PSD system used in this study, which was capable of stem effect removal, can perform accurate dosimetry during 192Ir HDR brachytherapy treatment in a water phantom. The system presented here shows some clear advantages over previously proposed dosimetry systems for HDR brachytherapy, and it has the potential for various online verifications of treatment delivery quality. PMID:21776789

  1. Generating Scenarios of Addition and Subtraction: A Study of Japanese University Students

    ERIC Educational Resources Information Center

    Kinda, Shigehiro

    2013-01-01

    Students are presented with problems involving three scenario types of addition and subtraction in elementary mathematics: one dynamic ("Change") and two static ("Combine, Compare"). Previous studies have indicated that the dynamic type is easier for school children, whereas the static types are more difficult and comprehended only gradually…

  2. Experimental study of combustion of decane, dodecane and hexadecane with polymeric and nano-particle additives

    NASA Astrophysics Data System (ADS)

    Ghamari, Mohsen; Ratner, Albert

    2015-11-01

    Recent studies have shown that adding combustible nano-particles can increase the burning rate of liquid fuels, as combustible nano-particles can enhance heat conduction and mixing within the droplet. Polymers also have a higher burning rate than regular hydrocarbon fuels because the flame sits closer to the droplet surface, so adding a polymeric additive has the potential to increase the burning rate. In this study, the combustion of stationary fuel droplets of n-Decane, n-Dodecane, and n-Hexadecane doped with different percentages of a long-chain polymer and also a very fine nano carbon was examined and compared with pure hydrocarbon behavior. In contrast with hydrocarbon droplets with no polymer addition, several zones of combustion were detected, including a slow and steady burning zone, a strong swelling zone, and a final fast and fairly steady combustion zone. In addition, increasing the polymer percentage resulted in a more extended swelling zone, a shorter slow-burning zone, and a shorter total burning time. Addition of nano-particles also resulted in an overall increased burning rate and shortened burning time, which is attributed to enhanced heat conduction within the droplet.

  3. Mental addition in bilinguals: an FMRI study of task-related and performance-related activation.

    PubMed

    Lin, Jo-Fu Lotus; Imada, Toshiaki; Kuhl, Patricia K

    2012-08-01

    Behavioral studies show that bilinguals are slower and less accurate when performing mental calculation in their nondominant (second; L2) language than in their dominant (first; L1) language. However, little is known about the neural correlates associated with the performance differences observed between bilinguals' 2 languages during arithmetic processing. To address the cortical activation differences between languages, the current study examined task-related and performance-related brain activation during mental addition when problems were presented auditorily in participants' L1 and L2. Eleven Chinese-English bilinguals heard 2-digit addition problems that required exact or approximate calculations. Functional magnetic resonance imaging results showed that auditorily presented multidigit addition in bilinguals activates bilateral inferior parietal and inferior frontal regions in both L1 and L2. Language differences were observed in the form of greater activation for L2 exact addition in the left inferior frontal area. A negative correlation between brain activation and behavioral performance during mental addition in L2 was observed in the left inferior parietal area. Current results provide further evidence for the effects of language-specific experience on arithmetic processing in bilinguals at the cortical level.

  4. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten-week project on flexible multibody modeling, verification, and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted to gather information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  5. Code Verification of the HIGRAD Computational Fluid Dynamics Solver

    SciTech Connect

    Van Buren, Kendra L.; Canfield, Jesse M.; Hemez, Francois M.; Sauer, Jeremy A.

    2012-05-04

    The purpose of this report is to outline code and solution verification activities applied to HIGRAD, a Computational Fluid Dynamics (CFD) solver of the compressible Navier-Stokes equations developed at the Los Alamos National Laboratory and used to simulate various phenomena such as the propagation of wildfires and atmospheric hydrodynamics. Code verification efforts, as described in this report, are an important first step in establishing the credibility of numerical simulations. They provide evidence that the mathematical formulation is properly implemented without significant mistakes that would adversely impact the application of interest. Highly accurate analytical solutions are derived for four code verification test problems that exercise different aspects of the code. These test problems are referred to as: (i) the quiet start, (ii) the passive advection, (iii) the passive diffusion, and (iv) the piston-like problem. These problems are simulated using HIGRAD with different levels of mesh discretization, and the numerical solutions are compared to their analytical counterparts. In addition, the rates of convergence are estimated to verify the numerical performance of the solver. The first three test problems produce numerical approximations as expected. The fourth test problem (piston-like) indicates the extent to which the code is able to simulate a 'mild' discontinuity, which is a condition that would typically be better handled by a Lagrangian formulation. The current investigation concludes that the numerical implementation of the solver performs as expected. The quality of solutions is sufficient to provide credible simulations of fluid flows around wind turbines. The main caveat associated with these findings is the low coverage provided by these four problems and the somewhat limited verification activities. A more comprehensive evaluation of HIGRAD may be beneficial for future studies.
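
    The convergence-rate estimation mentioned above commonly uses errors measured on two mesh levels: for a refinement ratio r, the observed order is p = ln(e_coarse/e_fine) / ln(r). The sketch below applies this formula to assumed error values, not to HIGRAD output.

```python
import math

def observed_order(e_coarse, e_fine, refinement_ratio):
    """Observed order of accuracy from errors on two meshes:
    p = ln(e_coarse / e_fine) / ln(r)."""
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

# Illustrative errors against an exact solution (values assumed):
# halving the mesh spacing (r = 2) should reduce the error by ~4x
# for a second-order scheme.
e_h = 4.1e-3      # error with spacing h
e_h2 = 1.05e-3    # error with spacing h/2

p = observed_order(e_h, e_h2, 2.0)
print(f"observed order of convergence: p = {p:.2f}")  # ~1.97
```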

  6. Sequential neural processes in abacus mental addition: an EEG and FMRI case study.

    PubMed

    Ku, Yixuan; Hong, Bo; Zhou, Wenjing; Bodner, Mark; Zhou, Yong-Di

    2012-01-01

    Abacus experts are able to mentally calculate multi-digit numbers rapidly. Some behavioral and neuroimaging studies have suggested a visuospatial and visuomotor strategy during abacus mental calculation. However, no study up to now has attempted to dissociate temporally the visuospatial neural process from the visuomotor neural process during abacus mental calculation. In the present study, an abacus expert performed the mental addition tasks (8-digit and 4-digit addends presented in visual or auditory modes) swiftly and accurately. The 100% correct rates in this expert's task performance were significantly higher than those of ordinary subjects performing 1-digit and 2-digit addition tasks. ERPs, EEG source localizations, and fMRI results taken together suggested visuospatial and visuomotor processes were sequentially arranged during the abacus mental addition with visual addends and could be dissociated from each other temporally. The visuospatial transformation of the numbers, in which the superior parietal lobule was most likely involved, might occur first (around 380 ms) after the onset of the stimuli. The visuomotor processing, in which the superior/middle frontal gyri were most likely involved, might occur later (around 440 ms). Meanwhile, fMRI results suggested that neural networks involved in the abacus mental addition with auditory stimuli were similar to those in the visual abacus mental addition. The most prominently activated brain areas in both conditions included the bilateral superior parietal lobules (BA 7) and bilateral middle frontal gyri (BA 6). These results suggest a supra-modal brain network in abacus mental addition, which may develop from normal mental calculation networks.

  7. Numerical study of water entry supercavitating flow around a vertical circular cylinder influenced by turbulent drag-reducing additives

    NASA Astrophysics Data System (ADS)

    Jiang, C. X.; Cheng, J. P.; Li, F. C.

    2015-01-01

    This paper introduces a numerical simulation procedure for water-entry problems influenced by turbulent drag-reducing additives in a viscous incompressible medium. First, we performed a numerical investigation of water-entry supercavities in water and in turbulent drag-reducing solution at an impact velocity of 28.4 m/s to confirm the accuracy of the numerical method. Based on this verification, a projectile entering water and turbulent drag-reducing solution at a relatively high velocity of 142.7 m/s (with phase transition considered) was simulated. The Cross viscosity equation was adopted to represent the shear-thinning characteristic of the aqueous solution of drag-reducing additives. The configuration and dynamic characteristics of the water-entry supercavity and the flow resistance are discussed. The numerical simulation results are consistent with experimental data. Numerical results show that the supercavity length in drag-reducing solution is larger than that in water and that the velocity attenuates faster at high velocity than at low velocity; the influence of the drag-reducing solution is more pronounced at high impact velocity. Turbulent drag-reducing additives thus have great potential for the enhancement of supercavities.
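
    The Cross viscosity equation referred to above has the standard form eta(gamma') = eta_inf + (eta_0 - eta_inf) / (1 + (lambda * gamma')^m). The sketch below simply evaluates it across shear rates; the parameter values are assumed for illustration and are not those used in the paper.

```python
def cross_viscosity(shear_rate, eta_0, eta_inf, lam, m):
    """Cross model for a shear-thinning fluid:
    eta(gamma') = eta_inf + (eta_0 - eta_inf) / (1 + (lam * gamma')**m)."""
    return eta_inf + (eta_0 - eta_inf) / (1.0 + (lam * shear_rate) ** m)

# Parameters for a dilute drag-reducing polymer solution (values assumed):
eta_0 = 0.0045    # zero-shear viscosity, Pa*s
eta_inf = 0.0010  # infinite-shear viscosity, Pa*s
lam = 0.01        # time constant, s
m = 0.8           # rate index (dimensionless)

for rate in (1e0, 1e2, 1e4, 1e6):   # shear rate, 1/s
    eta = cross_viscosity(rate, eta_0, eta_inf, lam, m)
    print(f"gamma' = {rate:8.0e} 1/s -> eta = {eta:.4e} Pa*s")
```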

  8. Microstructural Study Of Zinc Hot Dip Galvanized Coatings with Titanium Additions In The Zinc Melt

    NASA Astrophysics Data System (ADS)

    Konidaris, S.; Pistofidis, N.; Vourlias, G.; Pavlidou, E.; Stergiou, A.; Stergioudis, G.; Polychroniadis, E. K.

    2007-04-01

    Zinc hot-dip galvanizing is a method for protecting iron and steel against corrosion. Galvanizing with pure Zn or Zn with additions like Ni, Al, Pb and Bi has been extensively studied, but there is a lack of scientific information about other additions. The present work examines the effect of a 0.5 wt% Ti addition in the Zn melt. The samples were exposed to accelerated corrosion in a salt spray chamber (SSC). The microstructure and chemical composition of the coatings were determined by Optical Microscopy, XRD and SEM associated with an EDS Analyzer. The results indicate that the coatings have a typical morphology, while Zn-Ti phases were also detected.

  9. A synchrotron study of microstructure gradient in laser additively formed epitaxial Ni-based superalloy.

    PubMed

    Xue, Jiawei; Zhang, Anfeng; Li, Yao; Qian, Dan; Wan, Jingchun; Qi, Baolu; Tamura, Nobumichi; Song, Zhongxiao; Chen, Kai

    2015-01-01

    Laser additive forming is considered to be one of the promising techniques to repair single crystal Ni-based superalloy parts to extend their life and reduce the cost. Preservation of the single crystalline nature and prevention of thermal mechanical failure are two of the most essential issues for the application of this technique. Here we employ synchrotron X-ray microdiffraction to evaluate the quality in terms of crystal orientation and defect distribution of a Ni-based superalloy DZ125L directly formed by a laser additive process rooted from a single crystalline substrate of the same material. We show that a disorientation gradient caused by a high density of geometrically necessary dislocations and resultant subgrains exists in the interfacial region between the epitaxial and stray grains. This creates a potential relationship of stray grain formation and defect accumulation. The observation offers new directions on the study of performance control and reliability of the laser additive manufactured superalloys. PMID:26446425

  10. A synchrotron study of microstructure gradient in laser additively formed epitaxial Ni-based superalloy

    SciTech Connect

    Xue, Jiawei; Zhang, Anfeng; Li, Yao; Qian, Dan; Wan, Jingchun; Qi, Baolu; Tamura, Nobumichi; Song, Zhongxiao; Chen, Kai

    2015-10-08

    Laser additive forming is considered to be one of the promising techniques to repair single crystal Ni-based superalloy parts to extend their life and reduce the cost. Preservation of the single crystalline nature and prevention of thermal mechanical failure are two of the most essential issues for the application of this technique. Here we employ synchrotron X-ray microdiffraction to evaluate the quality in terms of crystal orientation and defect distribution of a Ni-based superalloy DZ125L directly formed by a laser additive process rooted from a single crystalline substrate of the same material. We show that a disorientation gradient caused by a high density of geometrically necessary dislocations and resultant subgrains exists in the interfacial region between the epitaxial and stray grains. This creates a potential relationship of stray grain formation and defect accumulation. In conclusion, the observation offers new directions on the study of performance control and reliability of the laser additive manufactured superalloys.

  11. A synchrotron study of microstructure gradient in laser additively formed epitaxial Ni-based superalloy

    PubMed Central

    Xue, Jiawei; Zhang, Anfeng; Li, Yao; Qian, Dan; Wan, Jingchun; Qi, Baolu; Tamura, Nobumichi; Song, Zhongxiao; Chen, Kai

    2015-01-01

    Laser additive forming is considered to be one of the promising techniques to repair single crystal Ni-based superalloy parts to extend their life and reduce the cost. Preservation of the single crystalline nature and prevention of thermal mechanical failure are two of the most essential issues for the application of this technique. Here we employ synchrotron X-ray microdiffraction to evaluate the quality in terms of crystal orientation and defect distribution of a Ni-based superalloy DZ125L directly formed by a laser additive process rooted from a single crystalline substrate of the same material. We show that a disorientation gradient caused by a high density of geometrically necessary dislocations and resultant subgrains exists in the interfacial region between the epitaxial and stray grains. This creates a potential relationship of stray grain formation and defect accumulation. The observation offers new directions on the study of performance control and reliability of the laser additive manufactured superalloys. PMID:26446425

  12. Improved Verification for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Powell, Mark A.

    2008-01-01

    Aerospace systems are subject to many stringent performance requirements to be verified with low risk. This report investigates verification planning using conditional approaches vice the standard classical statistical methods, and usage of historical surrogate data for requirement validation and in verification planning. The example used in this report to illustrate the results of these investigations is a proposed mission assurance requirement with the concomitant maximum acceptable verification risk for the NASA Constellation Program Orion Launch Abort System (LAS). This report demonstrates the following improvements: 1) verification planning using conditional approaches vice classical statistical methods results in plans that are more achievable and feasible; 2) historical surrogate data can be used to bound validation of performance requirements; and, 3) incorporation of historical surrogate data in verification planning using conditional approaches produces even less costly and more reasonable verification plans. The procedures presented in this report may produce similar improvements and cost savings in verification for any stringent performance requirement for an aerospace system.

  13. Summary of Previous Chamber or Controlled Anthrax Studies and Recommendations for Possible Additional Studies

    SciTech Connect

    Piepel, Gregory F.; Amidan, Brett G.; Morrow, Jayne B.

    2010-12-29

    This report and an associated Excel file summarize the investigations and results of previous chamber and controlled studies to characterize the performance of methods for collecting, storing and/or transporting, extracting, and analyzing samples from surfaces contaminated by Bacillus anthracis (BA) or related simulants. The report and the Excel file are the joint work of the Pacific Northwest National Laboratory (PNNL) and the National Institute of Standards and Technology (NIST) for the Department of Homeland Security, Science and Technology Directorate. The report was originally released as PNNL-SA-69338, Rev. 0 in November 2009 with limited distribution, but was subsequently cleared for release with unlimited distribution in this Rev. 1. Only minor changes were made to Rev. 0 to yield Rev. 1. A more substantial update (including summarizing data from other studies and more condensed summary tables of data) is underway.

  14. Studies of levels of biogenic amines in meat samples in relation to the content of additives.

    PubMed

    Jastrzębska, Aneta; Kowalska, Sylwia; Szłyk, Edward

    2016-01-01

    The impact of meat additives on the concentration of biogenic amines and the quality of meat was studied. Fresh white and red meat samples were fortified with the following food additives: citric and lactic acids, disodium diphosphate, sodium nitrite, sodium metabisulphite, potassium sorbate, sodium chloride, ascorbic acid, α-tocopherol, propyl 3,4,5-trihydroxybenzoate (propyl gallate) and butylated hydroxyanisole. The content of spermine, spermidine, putrescine, cadaverine, histamine, tyramine, tryptamine and 2-phenylethylamine was determined by capillary isotachophoretic methods in meat samples (fresh and fortified) during four days of storage at 4°C. The results were applied to estimate the impact of the tested additives on the formation of biogenic amines in white and red meat. For all tested meats, sodium nitrite, sodium chloride and disodium diphosphate showed the best inhibition. However, cadaverine and putrescine showed the largest changes in concentration during storage across all the additives. Based on the presented data for the content of biogenic amines in meat samples analysed as a function of storage time and additives, we suggest that cadaverine and putrescine have a significant impact on meat quality. PMID:26515667

  16. A Study of Aluminum Combustion in Solids, Powders, Foams, Additively-Manufactured Lattices, and Composites

    NASA Astrophysics Data System (ADS)

    Black, James; Trammell, Norman; Batteh, Jad; Curran, Nicholas; Rogers, John; Littrell, Donald

    2015-06-01

    This study examines the fireball characteristics, blast parameters, and combustion efficiency of explosively-shocked aluminum-based materials. The materials included structural and non-structural aluminum forms - such as solid cylinders, foams, additively-manufactured lattices, and powders - and some polytetrafluoroethylene-aluminum (PTFE-Al) composites. The materials were explosively dispersed in a small blast chamber, and the blast properties and products were measured with pressure transducers, thermocouples, slow and fast ultraviolet/visible spectrometers, and high-speed video.

  17. Feasibility study on using fast calorimetry technique to measure a mass attribute as part of a treaty verification regime

    SciTech Connect

    Hauck, Danielle K; Bracken, David S; Mac Arthur, Duncan W; Santi, Peter A; Thron, Jonathan

    2010-01-01

    The attribute measurement technique provides a method for determining whether or not an item containing special nuclear material (SNM) possesses attributes that fall within an agreed-upon range of values. One potential attribute is whether the mass of an SNM item is larger than some threshold value that has been negotiated as part of a nonproliferation treaty. While the historical focus for measuring mass attributes has been on neutron measurements, calorimetry may be a viable alternative for measuring mass attributes of plutonium-bearing items. Traditionally, calorimetry measurements have provided a highly precise and accurate determination of the thermal power generated by an item. To achieve this high level of precision and accuracy, the item must reach thermal equilibrium inside the calorimeter before its thermal power is determined. Because the approach to thermal equilibrium is exponential in nature, a large portion of the time spent approaching equilibrium is spent with the measurement within approximately 10% of its final equilibrium value. Since a mass attribute measurement only needs to positively determine whether the mass of a given SNM item is greater than a threshold value, a short calorimetry measurement that characterizes how the system approaches thermal equilibrium may provide sufficient information to determine whether an item has a larger mass than the agreed-upon threshold. In previous research into a fast calorimetry attribute technique, a two-dimensional heat flow model of a calorimeter was used to investigate the possibility of determining a mass attribute for plutonium-bearing items in this way. While the results of this study looked favorable for developing a fast calorimetry attribute technique, additional work was needed to determine the accuracy of the model used to make the calculations. In this paper, the results from the current work investigating
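
    Because the approach to equilibrium is exponential, the equilibrium reading can in principle be extrapolated from a short early segment of the measurement by fitting the approach curve. The sketch below only illustrates that idea; the model form, numbers, and threshold are invented and do not come from the study's two-dimensional heat-flow model:

      # Fit a short, noisy early segment of an exponential approach to
      # equilibrium, then compare the extrapolated power to a threshold.
      import numpy as np
      from scipy.optimize import curve_fit

      def approach(t, p_eq, p0, tau):
          # p(t) = p_eq + (p0 - p_eq) * exp(-t / tau)
          return p_eq + (p0 - p_eq) * np.exp(-t / tau)

      rng = np.random.default_rng(0)
      t = np.linspace(0, 60, 31)                       # minutes: short measurement
      true = approach(t, p_eq=6.2, p0=0.0, tau=90.0)   # watts; tau >> window
      noisy = true + rng.normal(0.0, 0.02, t.size)     # sensor noise

      (p_eq_hat, _, _), _ = curve_fit(approach, t, noisy, p0=(5.0, 0.0, 60.0))

      threshold_watts = 4.0   # hypothetical negotiated threshold (thermal power)
      print(f"extrapolated equilibrium power: {p_eq_hat:.2f} W")
      print("attribute satisfied:", p_eq_hat > threshold_watts)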

  18. Spectra-temporal patterns underlying mental addition: an ERP and ERD/ERS study.

    PubMed

    Ku, Yixuan; Hong, Bo; Gao, Xiaorong; Gao, Shangkai

    2010-03-12

    Functional neuroimaging data have shown that mental calculation involves fronto-parietal areas that are composed of different subsystems shared with other cognitive functions such as working memory and language. Event-related potential (ERP) analysis has also indicated sequential information changes during the calculation process. However, little is known about the dynamic properties of oscillatory networks in this process. In the present study, we applied both ERP and event-related (de-)synchronization (ERS/ERD) analyses to EEG data recorded from normal human subjects performing tasks for sequential visual/auditory mental addition. Results in the study indicate that the late positive components (LPCs) can be decomposed into two separate parts. The earlier element LPC1 (around 360 ms) reflects the computing attribute and is more prominent in calculation tasks. The later element LPC2 (around 590 ms) indicates an effect of number size and appears larger only in a more complex 2-digit addition task. The theta ERS and alpha ERD show modality-independent frontal and parietal differential patterns between the mental addition and control groups, and discrepancies are noted in the beta ERD between the 2-digit and 1-digit mental addition groups. The 2-digit addition (both visual and auditory) results in similar beta ERD patterns to the auditory control, which may indicate a reliance on auditory-related resources in mental arithmetic, especially with increasing task difficulty. These results coincide with the theory of simple calculation relying on the visuospatial process and complex calculation depending on the phonological process. PMID:20105450
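
    For readers unfamiliar with ERD/ERS, it is conventionally computed as the percentage change of band power relative to a pre-stimulus reference interval. A minimal sketch on synthetic data (the sampling rate, band, and windows are assumptions, not the study's settings):

      # ERD/ERS as percent band-power change relative to a reference period.
      import numpy as np
      from scipy.signal import butter, filtfilt

      fs = 250.0                                       # Hz (assumed)
      rng = np.random.default_rng(1)
      epochs = rng.normal(size=(40, int(2.0 * fs)))    # 40 trials x 2 s, fake EEG
      epochs[:, int(1.0 * fs):] *= 0.6                 # simulate alpha ERD after 1 s

      b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")
      power = filtfilt(b, a, epochs, axis=1) ** 2      # alpha-band power
      mean_power = power.mean(axis=0)                  # average over trials

      ref = mean_power[: int(0.5 * fs)].mean()         # 0-0.5 s reference period
      erd = (mean_power - ref) / ref * 100.0           # ERD < 0, ERS > 0 (percent)
      print(f"post-stimulus mean: {erd[int(1.0 * fs):].mean():.1f} %")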

  19. Data verification in the residue laboratory.

    PubMed

    Ault, J A; Cassidy, P S; Crawford, C J; Jablonski, J E; Kenyon, R G

    1994-12-01

    Residue analysis frequently presents a challenge to the quality assurance (QA) auditor due to the sheer volume of data to be audited. In the face of multiple boxes of raw data, some process must be defined that assures the scientist and the QA auditor of the quality and integrity of the data. A program that ensures complete and appropriate verification of data before they reach the Quality Assurance Unit (QAU) is presented. The "Guidelines for Peer Review of Data" were formulated by the Residue Analysis Business Center at Ricerca, Inc. to accommodate efficient use of review time and to resolve any uncertainties concerning what constitutes acceptable data. The core of this program centers on five elements: study initiation (definitional) meetings, calculations, verification, approval, and the use of a verification checklist.

  20. TFE Verification Program

    SciTech Connect

    Not Available

    1990-03-01

    The objective of this semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein includes evaluated test data, design evaluations, the results of analyses, and the significance of the results. The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the thermionic concept was judged attractive, but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown in Fig. 1-1. Five prior programs form the basis for the TFE Verification Program: (1) the AEC/NASA program of the 1960s and early 1970s; (2) the SP-100 concept development program; (3) the SP-100 thermionic technology program; (4) the thermionic irradiations program in TRIGA in FY-86; and (5) the Thermionic Technology Program in 1986 and 1987. 18 refs., 64 figs., 43 tabs.

  1. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is concerned with the formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented, along with a theory of generic interpreters that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.
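
    As a loose illustration of the generic-interpreter idea (a toy sketch under invented names, not the paper's formal theory): an interpreter can be modeled as a state-transition loop that is generic over a table of instruction semantics, so the same machine model can be instantiated at different abstraction levels.

      # A generic interpreter: decode-and-apply loop parameterized by a
      # semantics table. Instruction set and program are illustrative only.
      from dataclasses import dataclass, field
      from typing import Callable, Dict, List, Tuple

      @dataclass
      class State:
          pc: int = 0
          regs: Dict[str, int] = field(default_factory=dict)

      Instr = Tuple[str, tuple]
      Semantics = Dict[str, Callable[[State, tuple], State]]

      def interpret(program: List[Instr], sem: Semantics, s: State) -> State:
          while 0 <= s.pc < len(program):
              op, args = program[s.pc]
              s = sem[op](s, args)
          return s

      def _load(s, a): s.regs[a[0]] = a[1]; s.pc += 1; return s
      def _add(s, a):  s.regs[a[0]] = s.regs[a[1]] + s.regs[a[2]]; s.pc += 1; return s
      def _halt(s, a): s.pc = -1; return s

      sem: Semantics = {"LOAD": _load, "ADD": _add, "HALT": _halt}
      prog: List[Instr] = [("LOAD", ("r1", 2)), ("LOAD", ("r2", 3)),
                           ("ADD", ("r0", "r1", "r2")), ("HALT", ())]
      print(interpret(prog, sem, State()).regs["r0"])   # -> 5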

  2. SU-E-T-600: Patient Specific IMRT Verification Using a Phosphor-Screen Based Geometric QA System: A Preliminary Study

    SciTech Connect

    Lee, M; Hu, E; Yi, B

    2015-06-15

    Purpose: Raven QA (JPLC, MD) is a unified and comprehensive quality assurance system for TG-142 QA, which uses a phosphor screen, a mirror system and a camera. The aim is to test whether this device can be used for IMRT QA dosimetry. Methods: A lung IMRT case is used to deliver dose to the Raven QA. The accuracy of the dose distribution in a 5 cm slab phantom calculated with the Eclipse planning system (Varian) has been confirmed both by Monte Carlo simulation and by a MapCheck (SunNuclear) measurement. Geometric distortion and variation of spatial dose response are corrected after background subtraction. A pin-hole grid plate is designed and used to determine the light scatter in the Raven QA box and the spatial dose response. An optic scatter model was not applied in this preliminary study. Dose is normalized to the response of the 10×10 field, and the TMR at 5 cm depth was taken into account. Results: Setting up the device for IMRT QA takes less than 5 minutes, as with other commercially available devices. It shows excellent dose linearity and dose-rate independence, with deviations of less than 1%. The background signal, however, changes with field size; this is believed to be due to inaccurate correction of optic scatter. The absolute gamma (5%, 5 mm) passing rate was higher than 95%. Conclusion: This study shows that the Raven QA can be used for patient-specific IMRT verification. Part of this study is supported by the Maryland Industrial Partnership Grant.
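
    For context, a gamma passing rate combines a dose-difference tolerance with a distance-to-agreement tolerance. The brute-force 2D sketch below assumes a 1 mm grid and the abstract's 5%/5 mm criteria; it is a generic illustration, not the Raven QA implementation:

      # Global 2D gamma passing rate by exhaustive search (slow but simple).
      import numpy as np

      def gamma_pass_rate(ref, meas, spacing_mm=1.0, dose_tol=0.05, dta_mm=5.0):
          ny, nx = ref.shape
          norm = dose_tol * ref.max()                  # global dose criterion
          yy, xx = np.mgrid[0:ny, 0:nx].astype(float)
          passed = 0
          for i in range(ny):
              for j in range(nx):
                  dist2 = ((yy - i) ** 2 + (xx - j) ** 2) * spacing_mm ** 2
                  dose2 = (meas - ref[i, j]) ** 2
                  if (dist2 / dta_mm ** 2 + dose2 / norm ** 2).min() <= 1.0:
                      passed += 1
          return passed / (ny * nx)

      ref = np.fromfunction(
          lambda i, j: np.exp(-((i - 20) ** 2 + (j - 20) ** 2) / 200.0), (41, 41))
      meas = ref * 1.02                                # 2%-off "measurement"
      print(f"gamma passing rate: {gamma_pass_rate(ref, meas):.3f}")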

  3. A study on the relationship between the protein supplements intake satisfaction level and repurchase intention: Verification of mediation effects of word-of-mouth intention.

    PubMed

    Kim, Ill-Gwang

    2016-05-18

    The purpose of this study is to examine the relationship between the protein supplement intake satisfaction level and repurchase intention of university students majoring in physical education, and to verify the mediation effects of word-of-mouth intention. To achieve this purpose, 700 university students majoring in physical education from 10 universities in Korea were selected from October 2013 to December 2013 through cluster random sampling, and data from the 228 of them who had experience of taking protein supplements were analyzed. The composite reliability of each factor was between 0.869 and 0.958, and convergent validity and discriminant validity were verified. SPSS 18.0 and Amos 22.0 were utilized for data processing, and the verification of the significance of the mediation and indirect effects of word-of-mouth intention was carried out using frequency analysis, correlation analysis, CFA, SEM, and Amos bootstrapping. The results are as follows. The protein supplement intake satisfaction level had a positive effect on word-of-mouth intention, and word-of-mouth intention had a positive effect on repurchase intention. Also, it was shown that word-of-mouth intention played a full mediation role between the intake satisfaction level and the repurchase intention.

  4. Runtime Verification with State Estimation

    NASA Technical Reports Server (NTRS)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

    We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM model of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of a state sequence, given an observation sequence) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
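
    The classic forward algorithm referenced here propagates state probabilities one observation at a time to obtain the likelihood of an observation sequence. A minimal sketch for a generic discrete HMM (the matrices are invented, not the rover model):

      # Forward algorithm: P(observations) and filtered state distribution.
      import numpy as np

      A = np.array([[0.9, 0.1],        # transition probabilities
                    [0.2, 0.8]])
      B = np.array([[0.7, 0.3],        # emission probabilities P(obs | state)
                    [0.1, 0.9]])
      pi = np.array([0.5, 0.5])        # initial state distribution

      def forward(obs):
          alpha = pi * B[:, obs[0]]
          for o in obs[1:]:
              alpha = (alpha @ A) * B[:, o]
          return alpha.sum(), alpha / alpha.sum()

      likelihood, filtered = forward([0, 0, 1, 0])
      print(f"likelihood {likelihood:.4f}, state estimate {filtered}")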

  5. A Verification Method for MASOES.

    PubMed

    Perozo, N; Aguilar Perozo, J; Terán, O; Molina, H

    2013-02-01

    MASOES is an agent architecture for designing and modeling self-organizing and emergent systems. This architecture describes the elements, relationships, and mechanisms, at both the individual and the collective levels, that favor the analysis of self-organizing and emergent phenomena without mathematically modeling the system. In this paper, a method is proposed for verifying MASOES from the point of view of design, in order to study the self-organizing and emergent behaviors of the modeled systems. The verification criteria are set according to what MASOES proposes for modeling self-organizing and emergent systems, the principles of the wisdom-of-crowds paradigm, and fuzzy cognitive map (FCM) theory. The verification method for MASOES has been implemented in a tool called FCM Designer and has been tested by modeling a community of free software developers that works in the bazaar style, as well as a Wikipedia community, in order to study their behavior and determine their self-organizing and emergent capacities.
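
    For background, a fuzzy cognitive map evolves concept activations by repeatedly squashing weighted sums of the other concepts' activations. One common update variant with toy weights (not MASOES's maps or the FCM Designer tool's output):

      # Iterate a small fuzzy cognitive map to a (near) fixed point.
      import numpy as np

      W = np.array([[ 0.0, 0.6, -0.4],   # W[i, j]: influence of concept i on j
                    [ 0.5, 0.0,  0.3],
                    [-0.2, 0.4,  0.0]])
      a = np.array([0.5, 0.1, 0.9])      # initial activations

      sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
      for _ in range(20):
          a = sigmoid(W.T @ a)           # each concept: squashed weighted input
      print(np.round(a, 3))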

  6. Effect of Exogenous Phytase Addition on Soil Phosphatase Activities: a Fluorescence Spectroscopy Study.

    PubMed

    Yang, Xiao-zhu; Chen, Zhen-hua; Zhang, Yu-lan; Chen, Li-jun

    2015-05-01

    The utilization of organic phosphorus (P) improves, directly or indirectly, after exogenous phytase is added to soil. However, the mechanism by which exogenous phytase affects soil phosphatase (phosphomonoesterase and phosphodiesterase) activities was not clear. The present work aimed to study the acid and alkaline phosphomonoesterase (AcP and AlP) and phosphodiesterase (PD) activities of red, brown and cinnamon soils in response to the addition of exogenous phytase (1 g phytase/50 g air-dry soil sample), based on measurements performed via a fluorescence detection method in 96-well microplates using a TECAN Infinite 200 Multi-Mode Microplate Reader. The results indicated that, after the addition of exogenous phytase, acid phosphomonoesterase activity was significantly enhanced in red soil (p ≤ 0.01) but significantly reduced in cinnamon soil; alkaline phosphomonoesterase activity was significantly enhanced in cinnamon soil (p ≤ 0.01) but significantly reduced in red soil; and phosphodiesterase activity increased in all three soils, significantly so in brown soil (p ≤ 0.01). The activities remained strong after eight days in the different soils, which indicated that exogenous phytase addition can effectively enhance soil phosphatase activities. This effect was related not only to soil properties, such as pH and phosphorus forms, but possibly also to the amount of enzyme excreted by stimulated microorganisms. This is the first study, domestically or internationally, to use fluorescence spectroscopy to examine the influence of exogenous phytase addition on soil phosphatase activities. Compared with the conventional spectrophotometric method, the fluorescence microplate method is an accurate, fast and simple way to determine the relationships among soil phosphatase activities.

  7. Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
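
    For contrast with EVA, the Method of Manufactured Solutions mentioned above chooses an analytic solution, substitutes it into the governing equations, and adds the leftover residual as a source term. A toy sketch for 1D linear advection (my example; the CAA code verified here solves the Navier-Stokes equations):

      # MMS source term for u_t + c*u_x = S(x, t) with u = sin(x - a*t).
      import sympy as sp

      x, t = sp.symbols("x t")
      c, a = 2, 1                         # advection speed; manufactured wave speed
      u = sp.sin(x - a * t)               # chosen (manufactured) solution

      # Whatever the PDE leaves over becomes the source term S.
      source = sp.diff(u, t) + c * sp.diff(u, x)
      print(sp.simplify(source))          # cos(x - t), since c - a = 1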

  9. Using a multi-scale approach to identify and quantify oil and gas emissions: a case study for GHG emissions verification

    NASA Astrophysics Data System (ADS)

    Sobel, A. H.; Daleu, C. L.; Woolnough, S. J.; Plant, R.; Raymond, D. J.; Sessions, S. L.; Wang, S.; Bellon, G.

    2014-12-01

    Along with a boom in oil and natural gas production in the US, there has been a substantial effort to understand the true environmental impact of these operations on air and water quality, as well as net radiation balance. This multi-institution effort, funded by both governmental and non-governmental agencies, has provided a case study for identification and verification of emissions using a multi-scale, top-down approach. This approach leverages a combination of remote sensing to identify areas that need specific focus and airborne in-situ measurements to quantify both regional and large- to mid-size single-point emitters. Ground-based networks of mobile and stationary measurements provide the bottom tier of measurements from which process-level information can be gathered to better understand the specific sources and temporal distribution of the emitters. The motivation for this type of approach is largely driven by recent work in the Barnett Shale region in Texas as well as the San Juan Basin in New Mexico and Colorado; these studies suggest that relatively few single-point emitters dominate the regional emissions of CH4.

  10. Bone Marrow Stromal Antigen 2 Is a Novel Plasma Biomarker and Prognosticator for Colorectal Carcinoma: A Secretome-Based Verification Study

    PubMed Central

    Chiang, Sum-Fu; Kan, Chih-Yen; Hsiao, Yung-Chin; Tang, Reiping; Hsieh, Ling-Ling; Chiang, Jy-Ming; Tsai, Wen-Sy; Yeh, Chien-Yuh; Hsieh, Pao-Shiu; Liang, Ying; Chen, Jinn-Shiun; Yu, Jau-Song

    2015-01-01

    Background. The cancer cell secretome has been recognized as a valuable reservoir for identifying novel serum/plasma biomarkers for different cancers, including colorectal cancer (CRC). This study aimed to verify four CRC cell-secreted proteins (tumor-associated calcium signal transducer 2/trophoblast cell surface antigen 2 (TACSTD2/TROP2), tetraspanin-6 (TSPAN6), bone marrow stromal antigen 2 (BST2), and tumor necrosis factor receptor superfamily member 16 (NGFR)) as potential plasma CRC biomarkers. Methods. The study population comprises 152 CRC patients and 152 controls. Target protein levels in plasma and tissue samples were assessed by ELISA and immunohistochemistry, respectively. Results. Among the four candidate proteins examined by ELISA in a small sample set, only BST2 showed significantly elevated plasma levels in CRC patients versus controls. Immunohistochemical analysis revealed the overexpression of BST2 in CRC tissues, and higher BST2 expression levels correlated with poorer 5-year survival (46.47% versus 65.57%; p = 0.044). Further verification confirmed the elevated plasma BST2 levels in CRC patients (2.35 ± 0.13 ng/mL) versus controls (1.04 ± 0.03 ng/mL) (p < 0.01), with an area under the ROC curve (AUC) of 0.858, comparable to that of CEA (0.867). Conclusion. BST2, a membrane protein selectively detected in the CRC cell secretome, may be a novel plasma biomarker and prognosticator for CRC. PMID:26494939

  11. Using a multi-scale approach to identify and quantify oil and gas emissions: a case study for GHG emissions verification

    NASA Astrophysics Data System (ADS)

    Sweeney, C.; Kort, E. A.; Rella, C.; Conley, S. A.; Karion, A.; Lauvaux, T.; Frankenberg, C.

    2015-12-01

    Along with a boom in oil and natural gas production in the US, there has been a substantial effort to understand the true environmental impact of these operations on air and water quality, as well as net radiation balance. This multi-institution effort, funded by both governmental and non-governmental agencies, has provided a case study for identification and verification of emissions using a multi-scale, top-down approach. This approach leverages a combination of remote sensing to identify areas that need specific focus and airborne in-situ measurements to quantify both regional and large- to mid-size single-point emitters. Ground-based networks of mobile and stationary measurements provide the bottom tier of measurements from which process-level information can be gathered to better understand the specific sources and temporal distribution of the emitters. The motivation for this type of approach is largely driven by recent work in the Barnett Shale region in Texas as well as the San Juan Basin in New Mexico and Colorado; these studies suggest that relatively few single-point emitters dominate the regional emissions of CH4.

  12. Cone beam CT imaging with limited angle of projections and prior knowledge for volumetric verification of non-coplanar beam radiation therapy: a proof of concept study

    NASA Astrophysics Data System (ADS)

    Meng, Bowen; Xing, Lei; Han, Bin; Koong, Albert; Chang, Daniel; Cheng, Jason; Li, Ruijiang

    2013-11-01

    Non-coplanar beams are important for treatment of both cranial and noncranial tumors. Treatment verification of such beams with couch rotation/kicks, however, is challenging, particularly for the application of cone beam CT (CBCT). In this situation, only limited and unconventional imaging angles are feasible to avoid collision between the gantry, couch, patient, and on-board imaging system. The purpose of this work is to develop a CBCT verification strategy for patients undergoing non-coplanar radiation therapy. We propose an image reconstruction scheme that integrates a prior image constrained compressed sensing (PICCS) technique with image registration. Planning CT or CBCT acquired at the neutral position is rotated and translated according to the nominal couch rotation/translation to serve as the initial prior image. Here, the nominal couch movement is chosen to have a rotational error of 5° and translational error of 8 mm from the ground truth in one or more axes or directions. The proposed reconstruction scheme alternates between two major steps. First, an image is reconstructed using the PICCS technique implemented with total-variation minimization and simultaneous algebraic reconstruction. Second, the rotational/translational setup errors are corrected and the prior image is updated by applying rigid image registration between the reconstructed image and the previous prior image. The PICCS algorithm and rigid image registration are alternated iteratively until the registration results fall below a predetermined threshold. The proposed reconstruction algorithm is evaluated with an anthropomorphic digital phantom and physical head phantom. The proposed algorithm provides useful volumetric images for patient setup using projections with an angular range as small as 60°. It reduced the translational setup errors from 8 mm to generally <1 mm and the rotational setup errors from 5° to <1°. Compared with the PICCS algorithm alone, the integration of rigid

  13. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge; studies are based on correlations, and strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may be only a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence -> questionable decisions to deploy; availability -> inability to conceive critical tests; representativeness -> overinterpretation of results; positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  14. Using color for face verification

    NASA Astrophysics Data System (ADS)

    Leszczynski, Mariusz

    2009-06-01

    This paper presents research on the importance of color information in a face verification system. The four most popular color representations were used: RGB, YIQ, YCbCr, and luminance; they were compared using four types of discriminant classifiers. Experiments conducted on facial databases with complex backgrounds, different poses and lighting conditions show that color information can improve verification accuracy compared with the traditionally used luminance information. To achieve the best performance, we recommend multi-frame verification with images encoded in the YIQ color space.

  15. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, yet none of the earlier quantum money constructions is known to possess it.

  16. A synchrotron study of microstructure gradient in laser additively formed epitaxial Ni-based superalloy

    DOE PAGES

    Xue, Jiawei; Zhang, Anfeng; Li, Yao; Qian, Dan; Wan, Jingchun; Qi, Baolu; Tamura, Nobumichi; Song, Zhongxiao; Chen, Kai

    2015-10-08

    Laser additive forming is considered to be one of the promising techniques for repairing single crystal Ni-based superalloy parts to extend their life and reduce cost. Preservation of the single crystalline nature and prevention of thermal mechanical failure are two of the most essential issues for the application of this technique. Here we employ synchrotron X-ray microdiffraction to evaluate the quality, in terms of crystal orientation and defect distribution, of a Ni-based superalloy DZ125L directly formed by a laser additive process rooted in a single crystalline substrate of the same material. We show that a disorientation gradient, caused by a high density of geometrically necessary dislocations and the resultant subgrains, exists in the interfacial region between the epitaxial and stray grains. This suggests a relationship between stray grain formation and defect accumulation. In conclusion, the observation offers new directions for the study of performance control and reliability of laser additively manufactured superalloys.

  17. Couples Counseling in Alzheimer's Disease: Additional Clinical Findings from a Novel Intervention Study.

    PubMed

    Auclair, Ursula; Epstein, Cynthia; Mittelman, Mary

    2009-04-01

    This article describes the clinical findings of a study designed to assess the benefit of counseling for couples, one of whom is in the early stage of Alzheimer's disease (AD). We previously reported our findings based on the first 12 couples that enrolled in the study. Based on the treatment of 30 additional couples, we have refined our treatment strategy to include concepts of Gestalt Therapy and Transactional Analysis and identified prevalent issues of concern to this cohort. The study design has remained as described in the earlier article (Epstein et al., 2006), and has proven to be appropriate to meet the goals of this intervention as indicated by our clinical experience and feedback from the participating couples. Case vignettes demonstrate how to conduct the sessions so that the experience of each member of the dyad is validated, while acknowledging the differential impact of the disease on them. PMID:19865591

  18. Addition of fluoride to pit and fissure sealants--a feasibility study.

    PubMed

    Swartz, M L; Phillips, R W; Norman, R D; Elliason, S; Rhodes, B F; Clark, H E

    1976-01-01

    The data obtained in this in vitro study indicate that contact with pit and fissure sealants to which NaF has been added in amounts ranging from 2 to 5% substantially increases the fluoride content of the enamel and reduces its solubility in acid. The properties of the materials do not seem to be impaired by the addition of fluoride in these amounts. It thus appears that this approach to providing a backup anticariogenic mechanism may, indeed, be feasible. However, further investigation must be done to confirm the anticariogenic effect and to establish the most efficacious means of fluoride incorporation in the materials.

  19. Verification and validation studies of the time-averaged velocity field in the very near-wake of a finite elliptical cylinder

    NASA Astrophysics Data System (ADS)

    Flynn, Michael R.; Eisner, Alfred D.

    2004-04-01

    This paper presents verification and validation results for the time-averaged, three-dimensional velocity field immediately downstream of a finite elliptic cylinder at a Reynolds number of 1.35 × 10^4. Numerical simulations were performed with the finite element package Fidap, using the steady-state, standard k-epsilon model. The ratio of the cylinder height to the major axis of the elliptical cross section is 5.0; the aspect ratio of the cross section is 0.5625. This particular geometry is selected as a crude surrogate for the human form in consideration of further applied occupational and environmental health studies. Predictions of the velocity and turbulence kinetic energy fields in the very near-wake are compared to measurements taken in a wind tunnel using laser Doppler anemometry. Results show that at all locations where a reliable grid convergence index can be calculated, there is not a demonstrable difference between simulated and measured values. The overall topology of the time-averaged flow field is reasonably well predicted, although the simulated near-wake is narrower than the measured one.
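
    The grid convergence index mentioned here is typically computed from solutions on three systematically refined grids via Richardson extrapolation. A sketch of the standard recipe with invented sample values:

      # Observed order, extrapolated value, and fine-grid GCI.
      import math

      def gci_fine(f_coarse, f_medium, f_fine, r=2.0, safety=1.25):
          p = math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)
          f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
          e_fine = abs((f_medium - f_fine) / f_fine)
          return p, f_exact, safety * e_fine / (r ** p - 1.0)

      p, f_exact, gci = gci_fine(0.970, 0.988, 0.994, r=2.0)
      print(f"order ~{p:.2f}, extrapolated ~{f_exact:.4f}, GCI ~{100 * gci:.2f}%")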

  20. NES++: number system for encryption based privacy preserving speaker verification

    NASA Astrophysics Data System (ADS)

    Xu, Lei; Feng, Tao; Zhao, Xi; Shi, Weidong

    2014-05-01

    As speech-based operation becomes a main hands-free interaction solution between humans and mobile devices (e.g., smartphones, Google Glass), privacy-preserving speaker verification is receiving much attention nowadays. Privacy-preserving speaker verification can be achieved in many different ways, such as fuzzy vaults and encryption. Encryption-based solutions are promising, as cryptography rests on solid mathematical foundations and its security properties can be analyzed in a well-established framework. Most current asymmetric encryption schemes work on finite algebraic structures, such as finite groups and finite fields. However, the encryption scheme for privacy-preserving speaker verification must handle floating point numbers. This gap must be filled to make the overall scheme practical. In this paper, we propose a number system that meets the requirements of both speaker verification and the encryption scheme used in the process. It also supports the additive homomorphic property of Paillier's encryption, which is crucial for privacy-preserving speaker verification. As asymmetric encryption is expensive, we propose a method of packing several numbers into one plaintext, which greatly reduces the computation overhead. To evaluate the performance of this method, we implement Paillier's encryption scheme over the proposed number system together with the packing technique. Our findings show that the proposed solution fills the gap between speaker verification and the encryption scheme very well, and the packing technique improves the overall performance. Furthermore, our solution is a building block of encryption-based privacy-preserving speaker verification; the privacy protection and accuracy rate are not affected.
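
    To make the additive homomorphism concrete, here is a minimal textbook Paillier sketch (my own toy implementation with an unsafe key size, not the paper's NES++ number system):

      # Toy Paillier: Dec(Enc(m1) * Enc(m2) mod n^2) = m1 + m2. Never use
      # such tiny keys or this weak primality test in practice.
      import math, random

      def keygen(bits=32):
          def prime(b):
              while True:
                  p = random.getrandbits(b) | (1 << (b - 1)) | 1
                  if all(pow(a, p - 1, p) == 1 for a in (2, 3, 5, 7)):  # weak test
                      return p
          p, q = prime(bits), prime(bits)
          n = p * q
          lam = math.lcm(p - 1, q - 1)
          g = n + 1                                    # standard simple choice
          mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)
          return (n, g), (lam, mu, n)

      def encrypt(pk, m):
          n, g = pk
          r = random.randrange(1, n)
          return pow(g, m, n * n) * pow(r, n, n * n) % (n * n)

      def decrypt(sk, c):
          lam, mu, n = sk
          return (pow(c, lam, n * n) - 1) // n * mu % n

      pk, sk = keygen()
      c1, c2 = encrypt(pk, 123), encrypt(pk, 456)
      print(decrypt(sk, c1 * c2 % (pk[0] ** 2)))       # -> 579

    The packing idea described in the abstract then amounts to placing several fixed-width values in disjoint bit ranges of one plaintext, so that a single homomorphic operation adds all of them at once.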

  1. A water soluble additive to suppress respirable dust from concrete-cutting chainsaws: a case study.

    PubMed

    Summers, Michael P; Parmigiani, John P

    2015-01-01

    Respirable dust is of particular concern in the construction industry because it contains crystalline silica. Respirable forms of silica are a severe health threat because they heighten the risk of numerous respiratory diseases. Concrete cutting, a common work practice in the construction industry, is a major contributor to dust generation. No studies have been found that focus on dust suppression for concrete-cutting chainsaws, presumably because, during normal operation, water is supplied continuously and copiously to the dust generation points. However, there is a desire to better understand dust creation at low water flow rates. In this case study, a water-soluble surfactant additive was used in the chainsaw's water supply. Cutting was performed on a free-standing concrete wall in a covered outdoor lab with a hand-held, gas-powered, concrete-cutting chainsaw. Air was sampled at the operator's lapel and around the concrete wall to simulate nearby personnel. Two additive concentrations were tested (2.0% and 0.2%), across a range of fluid flow rates (0.38-3.8 Lpm [0.1-1.0 gpm] at 0.38 Lpm [0.1 gpm] increments). Results indicate that when a lower concentration of additive is used, exposure levels increase. However, all exposure levels, once adjusted for 3 hours of continuous cutting in an 8-hour work shift, are below the Occupational Safety and Health Administration (OSHA) permissible exposure limit (PEL) of 5 mg/m^3. Estimates were made using trend lines to predict the fluid flow rates that would cause respirable dust exposure to exceed both the OSHA PEL and the American Conference of Governmental Industrial Hygienists (ACGIH®) threshold limit value (TLV).
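
    The 8-hour adjustment mentioned above is a standard time-weighted average: the concentration measured while cutting is scaled by the fraction of the shift spent cutting, with exposure assumed zero otherwise. A toy calculation (the measured concentration is invented):

      # 8-hour TWA adjustment for a partial-shift exposure.
      measured_mg_m3 = 9.5      # respirable dust concentration while cutting
      cutting_hours = 3.0       # continuous cutting per shift
      shift_hours = 8.0         # zero exposure assumed for the remainder

      twa = measured_mg_m3 * cutting_hours / shift_hours
      print(f"8-h TWA: {twa:.2f} mg/m^3; below 5 mg/m^3 PEL: {twa < 5.0}")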

  2. Magneto-optical study of uranium additions to amorphous TbxFe1-x

    NASA Astrophysics Data System (ADS)

    Dillon, J. F., Jr.; van Dover, R. B.; Hong, M.; Gyorgy, E. M.; Albiston, S. D.

    1987-02-01

    Recent reports of huge magneto-optical Kerr rotations in certain crystalline metallic uranium compounds prompted a study of the magnetic and magneto-optical effects of uranium additions to a rare-earth transition metal amorphous alloy. Using variable composition samples, the polar Kerr effect at a small spot (e.g., 0.5 mm diam) was measured as field, temperature, and composition were varied. Points on the Curie line and the edges of the compensation region were determined from these observations. The compositions studied included (TbxFe1-x)1-yUy with 0.125≤x≤0.550 and y=0.0, 0.04, 0.07, 0.16. The addition of uranium to TbxFe1-x depresses the TC of Tb-rich material much more strongly than that of Tb-poor material. The compensation region does not shift at all with increasing y. It appears that uranium does not contribute to the magnetization of these amorphous alloys, nor does it significantly affect the magneto-optical effects.

  3. Characterization studies on the additives mixed L-arginine phosphate monohydrate (LAP) crystals

    NASA Astrophysics Data System (ADS)

    Haja Hameed, A. S.; Karthikeyan, C.; Ravi, G.; Rohani, S.

    2011-04-01

    L-arginine phosphate monohydrate (LAP), potassium thiocyanate (KSCN) mixed LAP (LAP:KSCN) and sodium sulfite (Na2SO3) mixed LAP (LAP:Na2SO3) single crystals were grown by the slow cooling technique. The effect of microbial contamination and coloration on the growth solutions was studied. The crystalline powders of the grown crystals were examined by X-ray diffraction and the lattice parameters of the crystals were estimated. From the FTIR spectroscopic analysis, the various functional group frequencies associated with the crystals were assigned. Vickers microhardness studies were performed on the {1 0 0} faces of the pure and additive-mixed LAP crystals. From preliminary surface second harmonic generation (SHG) results, it was found that the SHG intensity at the (1 0 0) face of the LAP:KSCN crystal was much stronger than that of pure LAP.

  4. The guanidine and maleic acid (1:1) complex. The additional theoretical and experimental studies

    NASA Astrophysics Data System (ADS)

    Drozd, Marek; Dudzic, Damian

    2012-04-01

    On the basis of experimental literature data, theoretical studies of the guanidinium and maleic acid complex were performed using the DFT method. In these studies, the experimental X-ray data for two different forms of the investigated crystal were used. During the geometry optimization process, only one equilibrium structure was found. On the basis of this result, the infrared spectrum for the one theoretical molecule was calculated. On the basis of potential energy distribution (PED) analysis, clear-cut assignments of the observed bands were performed. For the calculated molecule at the energy minimum, the highest occupied molecular orbital (HOMO) and the lowest unoccupied molecular orbital (LUMO) were obtained and graphically illustrated. The energy difference (GAP) between the HOMO and LUMO was analyzed. Additionally, the nonlinear properties of this molecule were calculated, and the α and β (first- and second-order) hyperpolarizability values were obtained. On the basis of these results, the title crystal was classified as a new second-order NLO generator.

  5. Prazosin addition to fluvoxamine: A preclinical study and open clinical trial in OCD.

    PubMed

    Feenstra, Matthijs G P; Klompmakers, André; Figee, Martijn; Fluitman, Sjoerd; Vulink, Nienke; Westenberg, Herman G M; Denys, Damiaan

    2016-02-01

    The efficacy of selective serotonin reuptake inhibitors (SRIs) in psychiatric disorders may be "augmented" through the addition of atypical antipsychotic drugs. A synergistic increase in dopamine (DA) release in the prefrontal cortex has been suggested to underlie this augmentation effect, though the mechanism of action is not yet clear. We used in vivo microdialysis in rats to study DA release following the administration of combinations of fluvoxamine (10 mg/kg) and quetiapine (10 mg/kg) with various monoamine-related drugs. The results confirmed that the selective 5-HT1A antagonist WAY-100635 (0.05 mg/kg) partially blocked the fluvoxamine-quetiapine synergistic effect (the maximum DA increase dropped from 325% to 214%). A novel finding is that the α1-adrenergic blocker prazosin (1 mg/kg), combined with fluvoxamine, partially mimicked the effect of augmentation (maximum DA increase 205%; area under the curve 163%). As this suggested that prazosin augmentation might be tested in a clinical study, we performed an open clinical trial of prazosin 20 mg addition to an SRI in therapy-resistant patients with obsessive-compulsive disorder applying for neurosurgery. A small, non-significant reduction in Yale-Brown Obsessive Compulsive Scale (Y-BOCS) scores was observed in 10 patients, and one patient was classified as a responder with a reduction in Y-BOCS scores of more than 25%. We suggest that future clinical studies augmenting SRIs with an α1-adrenergic blocker in less treatment-resistant cases should be considered. The clinical trial "Prazosin in combination with a serotonin reuptake inhibitor for patients with Obsessive Compulsive disorder: an open label study" was registered on 24/05/2011 under trial number ISRCTN61562706: http://www.controlled-trials.com/ISRCTN61562706. PMID:26712326

  6. Verification watermarks on fingerprint recognition and retrieval

    NASA Astrophysics Data System (ADS)

    Yeung, Minerva M.; Pankanti, Sharatchandra

    2000-10-01

    Current 'invisible' watermarking techniques aim at producing watermarked data that suffer no or little quality degradation and are perceptually identical to the original versions. The most common utility of a watermarked image is (1) for image viewing and display, and (2) for extracting the embedded watermark in subsequent copy protection applications. The issue is often centered on the robustness of the watermark for detection and extraction. In addition to robustness studies, a fundamental question will center on the utilization value of the watermarked images beyond perceptual quality evaluation. Essentially we have to study how the watermarks inserted affect the subsequent processing and utility of images, and what watermarking schemes we can develop that will cater to these processing tasks. This work focuses on the study of watermarking on images used in automatic personal identification technology based on fingerprints. We investigate the effects of watermarking fingerprint images on the recognition and retrieval accuracy using a proposed invisible fragile watermarking technique for image verification applications on a specific fingerprint recognition system. We shall also describe the watermarking scheme, fingerprint recognition and feature extraction techniques used. We believe that watermarking of images will provide value-added protection, as well as copyright notification capability, to the fingerprint data collection processes and subsequent usage.

  7. Verification watermarks on fingerprint recognition and retrieval

    NASA Astrophysics Data System (ADS)

    Pankanti, Sharatchandra; Yeung, Minerva M.

    1999-04-01

    Current 'invisible' watermarking techniques aim at producing watermarked data that suffer no or little quality degradation and are perceptually identical to the original versions. The most common utility of a watermarked image is (1) for image viewing and display, and (2) for extracting the embedded watermark in subsequent copy protection applications. The issue is often centered on the robustness of the watermark for detection and extraction. In addition to robustness studies, a fundamental question will center on the utilization value of the watermarked images beyond perceptual quality evaluation. Essentially we have to study how the watermarks inserted affect the subsequent processing and utility of images, and what watermarking schemes we can develop that will cater to these processing tasks. This work focuses on the study of watermarking on images used in automatic personal identification technology based on fingerprints. We investigate the effects of watermarking fingerprint images on the recognition and retrieval accuracy using a proposed invisible fragile watermarking technique for image verification applications on a specific fingerprint recognition system. We shall also describe the watermarking scheme, fingerprint recognition and feature extraction techniques used. We believe that watermarking of images will provide value-added protection, as well as copyright notification capability, to the fingerprint data collection processes and subsequent usage.

  8. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for the normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the Chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
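
    A check of this kind is easy to reproduce: draw one-dimensional Latin hypercube samples, map them through the target distribution's inverse CDF, and apply a Kolmogorov-Smirnov test. A sketch of that procedure (my own minimal LHS, not Sandia's code):

      # Verify that LHS samples follow the intended normal distribution.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      n = 1000

      # One uniform draw per equal-probability stratum, shuffled, then pushed
      # through the normal inverse CDF.
      u = (np.arange(n) + rng.uniform(size=n)) / n
      rng.shuffle(u)
      samples = stats.norm.ppf(u, loc=10.0, scale=2.0)

      ks = stats.kstest(samples, "norm", args=(10.0, 2.0))
      print(f"KS statistic {ks.statistic:.4f}, p-value {ks.pvalue:.3f}")
      print(f"mean {samples.mean():.3f}, std {samples.std(ddof=1):.3f}")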

  9. A combined toxicity study of zinc oxide nanoparticles and vitamin C in food additives

    NASA Astrophysics Data System (ADS)

    Wang, Yanli; Yuan, Lulu; Yao, Chenjie; Ding, Lin; Li, Chenchen; Fang, Jie; Sui, Keke; Liu, Yuanfang; Wu, Minghong

    2014-11-01

    At present, safety evaluation standards for nanofood additives are made based on the toxic effects of a single additive. Since the size, surface properties and chemical nature influence the toxicity of nanomaterials, the toxicity may have dramatically changed when nanomaterials are used as food additives in a complex system. Herein, we investigated the combined toxicity of zinc oxide nanoparticles (ZnO NPs) and vitamin C (Vc, ascorbic acid). The results showed that Vc increased the cytotoxicity significantly compared with that of the ZnO only NPs. When the cells were exposed to ZnO NPs at a concentration less than 15 mg L-1, or to Vc at a concentration less than 300 mg L-1, there was no significant cytotoxicity, both in the case of gastric epithelial cell line (GES-1) and neural stem cells (NSCs). However, when 15 mg L-1 of ZnO NPs and 300 mg L-1 of Vc were introduced to cells together, the cell viability decreased sharply indicating significant cytotoxicity. Moreover, the significant increase in toxicity was also shown in the in vivo experiments. The dose of the ZnO NPs and Vc used in the in vivo study was calculated according to the state of food and nutrition enhancer standard. After repeated oral exposure to ZnO NPs plus Vc, the injury of the liver and kidneys in mice has been indicated by the change of these indices. These findings demonstrate that the synergistic toxicity presented in a complex system is essential for the toxicological evaluation and safety assessment of nanofood.

  10. Impact of contacting study authors to obtain additional data for systematic reviews: diagnostic accuracy studies for hepatic fibrosis

    PubMed Central

    2014-01-01

    Background Seventeen of 172 included studies in a recent systematic review of blood tests for hepatic fibrosis or cirrhosis reported diagnostic accuracy results discordant from 2 × 2 tables, and 60 studies reported inadequate data to construct 2 × 2 tables. This study explores the yield of contacting authors of diagnostic accuracy studies and impact on the systematic review findings. Methods Sixty-six corresponding authors were sent letters requesting additional information or clarification of data from 77 studies. Data received from the authors were synthesized with data included in the previous review, and diagnostic accuracy sensitivities, specificities, and positive and likelihood ratios were recalculated. Results Of the 66 authors, 68% were successfully contacted and 42% provided additional data for 29 out of 77 studies (38%). All authors who provided data at all did so by the third emailed request (ten authors provided data after one request). Authors of more recent studies were more likely to be located and provide data compared to authors of older studies. The effects of requests for additional data on the conclusions regarding the utility of blood tests to identify patients with clinically significant fibrosis or cirrhosis were generally small for ten out of 12 tests. Additional data resulted in reclassification (using median likelihood ratio estimates) from less useful to moderately useful or vice versa for the remaining two blood tests and enabled the calculation of an estimate for a third blood test for which previously the data had been insufficient to do so. We did not identify a clear pattern for the directional impact of additional data on estimates of diagnostic accuracy. Conclusions We successfully contacted and received results from 42% of authors who provided data for 38% of included studies. Contacting authors of studies evaluating the diagnostic accuracy of serum biomarkers for hepatic fibrosis and cirrhosis in hepatitis C patients

  11. A study of pyrazines in cigarettes and how additives might be used to enhance tobacco addiction

    PubMed Central

    Alpert, Hillel R; Agaku, Israel T; Connolly, Gregory N

    2016-01-01

    Background Nicotine is known as the drug that is responsible for the addicted behaviour of tobacco users, but it has poor reinforcing effects when administered alone. Tobacco product design features enhance abuse liability by (A) optimising the dynamic delivery of nicotine to central nervous system receptors, and affecting smokers' withdrawal symptoms, mood and behaviour; and (B) effecting conditioned learning, through sensory cues, including aroma, touch and visual stimulation, to create perceptions of pending nicotine reward. This study examines the use of additives called 'pyrazines', which may enhance abuse potential, their introduction in 'lights' and subsequently in the highly market-successful Marlboro Lights (Gold) cigarettes, and eventually in many major brands. Methods We searched internal tobacco industry documents using online databases, in conjunction with published scientific literature, in an iterative feedback process. Results Tobacco manufacturers developed the use of a range of compounds, including pyrazines, in order to enhance 'light' cigarette products' acceptance and sales. Pyrazines with chemosensory and pharmacological effects were incorporated in the first 'full-flavour, low-tar' product to achieve high market success. Such additives may enhance dependence by helping to optimise nicotine delivery and dosing, and through cueing and learned behaviour. Conclusions Cigarette additives and ingredients with chemosensory effects that promote addiction by acting synergistically with nicotine, increasing product appeal, easing smoking initiation, discouraging cessation or promoting relapse should be regulated by the US Food and Drug Administration. Current models of tobacco abuse liability could be revised to include more explicit roles for non-nicotine constituents that enhance abuse potential. PMID:26063608

  12. A combined toxicity study of zinc oxide nanoparticles and vitamin C in food additives.

    PubMed

    Wang, Yanli; Yuan, Lulu; Yao, Chenjie; Ding, Lin; Li, Chenchen; Fang, Jie; Sui, Keke; Liu, Yuanfang; Wu, Minghong

    2014-12-21

    At present, safety evaluation standards for nanofood additives are made based on the toxic effects of a single additive. Since the size, surface properties and chemical nature influence the toxicity of nanomaterials, the toxicity may have dramatically changed when nanomaterials are used as food additives in a complex system. Herein, we investigated the combined toxicity of zinc oxide nanoparticles (ZnO NPs) and vitamin C (Vc, ascorbic acid). The results showed that Vc increased the cytotoxicity significantly compared with that of the ZnO only NPs. When the cells were exposed to ZnO NPs at a concentration less than 15 mg L-1, or to Vc at a concentration less than 300 mg L-1, there was no significant cytotoxicity, both in the case of gastric epithelial cell line (GES-1) and neural stem cells (NSCs). However, when 15 mg L-1 of ZnO NPs and 300 mg L-1 of Vc were introduced to cells together, the cell viability decreased sharply indicating significant cytotoxicity. Moreover, the significant increase in toxicity was also shown in the in vivo experiments. The dose of the ZnO NPs and Vc used in the in vivo study was calculated according to the state of food and nutrition enhancer standard. After repeated oral exposure to ZnO NPs plus Vc, the injury of the liver and kidneys in mice has been indicated by the change of these indices. These findings demonstrate that the synergistic toxicity presented in a complex system is essential for the toxicological evaluation and safety assessment of nanofood.

  13. Rate constants of hydroperoxyl radical addition to cyclic nitrones: a DFT study.

    PubMed

    Villamena, Frederick A; Merle, John K; Hadad, Christopher M; Zweier, Jay L

    2007-10-01

    Nitrones are potential synthetic antioxidants for reducing radical-mediated oxidative damage in cells and analytical reagents for the identification of HO2* and other such transient species. In this work, the PCM/B3LYP/6-31+G(d,p)//B3LYP/6-31G(d) and PCM/mPW1K/6-31+G(d,p) density functional theory (DFT) methods were employed to predict the reactivity of HO2* with various functionalized nitrones as spin traps. The calculated second-order rate constants and free energies of reaction at both levels of theory were in the ranges of 10^0-10^3 M^-1 s^-1 and +1 to -12 kcal mol^-1, respectively, and the rate constants for some nitrones are of the same order of magnitude as those observed experimentally. The trend in HO2* reactivity toward nitrones could not be explained solely on the basis of the relationship of the theoretical positive charge densities on the nitronyl-C with their respective ionization potentials, electron affinities, rate constants, or free energies of reaction. However, various modes of intramolecular H-bonding interaction were observed in the transition state (TS) structures of HO2* addition to nitrones. These intramolecular H-bonding interactions in the transition states may play a significant role in facilitating the addition of HO2* to nitrones. In general, HO2* addition to ethoxycarbonyl- and spirolactam-substituted nitrones, as well as to nitrones without electron-withdrawing substituents, such as 5,5-dimethyl-pyrroline N-oxide (DMPO) and 5-spirocyclopentyl-pyrroline N-oxide (CPPO), is most preferred, compared with the methylcarbamoyl-substituted nitrones. This study suggests that the use of specific spin traps for efficient trapping of HO2* could pave the way toward improved radical detection and antioxidant protection. PMID:17845014
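
    As a reminder of how computed activation free energies map onto rate constants of the magnitudes quoted here, transition-state theory's Eyring equation gives k = (kB·T/h)·exp(-ΔG‡/RT). A small illustration with an invented barrier (not a value from the paper):

      # Eyring-equation rate constant from an activation free energy.
      import math

      kB = 1.380649e-23      # J/K
      h = 6.62607015e-34     # J*s
      R = 8.314462618        # J/(mol*K)

      def eyring_rate(dg_kcal_per_mol, T=298.15):
          dg = dg_kcal_per_mol * 4184.0          # kcal/mol -> J/mol
          return kB * T / h * math.exp(-dg / (R * T))

      # A 15 kcal/mol barrier gives roughly 6e1 s^-1 at 298 K.
      print(f"k = {eyring_rate(15.0):.2e} s^-1")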

  14. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study with which to experiment with and test current formal methods on intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  15. Additional follow-up telephone counselling and initial smoking relapse: a longitudinal, controlled study

    PubMed Central

    Wu, Lei; He, Yao; Jiang, Bin; Zuo, Fang; Liu, Qinghui; Zhang, Li; Zhou, Changxi

    2016-01-01

    Objectives Smoking cessation services can help smokers to quit; however, many smoking relapse cases occur over time. Initial relapse prevention should play an important role in achieving the goal of long-term smoking cessation. Several studies have focused on the effect of extended telephone support in relapse prevention, but the conclusions remain conflicting. Design and setting From October 2008 to August 2013, a longitudinal, controlled study was performed in a large general hospital in Beijing. Participants The smokers who sought treatment at our smoking cessation clinic were divided, without randomisation, into 2 groups: a face-to-face individual counselling group (FC group) and a face-to-face individual counselling plus telephone follow-up counselling group (FCF group). No pharmacotherapy was offered. Outcomes The timing of initial smoking relapse was compared between the FC and FCF groups. Predictors of initial relapse during the first 180 days were investigated using the Cox proportional hazards model. Results Of 547 eligible male smokers who volunteered to participate, 457 participants (117 in the FC group and 340 in the FCF group) achieved at least 24 h abstinence. The majority of the lapse episodes occurred during the first 2 weeks after the quit date. Smokers who did not receive the follow-up telephone counselling (FC group) tended to relapse to smoking earlier than those who received the additional follow-up telephone counselling (FCF group), and the log-rank test was statistically significant (p=0.003). A Cox regression model showed that, in the FCF group, being married, and having a lower Fagerström test score, normal body mass index and doctor-diagnosed tobacco-related chronic diseases, were significantly independent protective predictors of smoking relapse. Conclusions Within the limitations of this study, it can be concluded that additional follow-up telephone counselling might be an effective strategy in preventing relapse. Further research is still needed.
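
    For readers who want to reproduce this kind of analysis, a minimal sketch of the reported log-rank comparison and Cox regression, using the Python lifelines package with hypothetical file and column names (not the study's actual data or code), might look like:

```python
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

# Hypothetical data: one row per participant, with time to relapse (days),
# a relapse indicator, group label and candidate predictors.
df = pd.read_csv("relapse.csv")

# Compare relapse timing between groups, as in the reported log-rank test.
fc = df[df["group"] == "FC"]
fcf = df[df["group"] == "FCF"]
result = logrank_test(fc["days"], fcf["days"],
                      event_observed_A=fc["relapsed"],
                      event_observed_B=fcf["relapsed"])
print(f"log-rank p = {result.p_value:.3f}")

# Cox proportional hazards model for predictors of relapse (first 180 days).
cph = CoxPHFitter()
cph.fit(df[["days", "relapsed", "married", "fagerstrom_score",
            "normal_bmi", "chronic_disease"]],
        duration_col="days", event_col="relapsed")
cph.print_summary()
```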

  16. Study of mandible reconstruction using a fibula flap with application of additive manufacturing technology

    PubMed Central

    2014-01-01

    Background This study aimed to establish surgical guiding techniques for completing mandible lesion resection and reconstruction of the mandible defect area with fibula sections in one surgery by applying additive manufacturing technology, which can reduce the surgical duration and enhance surgical accuracy and the success rate. Methods A computer-assisted mandible reconstruction planning (CAMRP) program was used to calculate the optimal cutting length and number of fibula pieces and to design the fixtures for mandible cutting, registration, and arrangement of the fibula segments. The mandible cutting and registering fixtures were then generated using an additive manufacturing system. The CAMRP calculated the optimal fibula cutting length and number of segments based on the location and length of the defective portion of the mandible. The mandible cutting jig was generated according to the boundary surface of the lesion resection on the mandible STL model. The fibular cutting fixture was based on the length of each segment, and the registration fixture was used to quickly arrange the fibula pieces into the shape of the defect area. In this study, the mandibular lesion was reconstructed using registered fibular sections in one step, and the method is very easy to perform. Results and conclusion The application of additive manufacturing technology provided customized models and the cutting and registration fixtures, which can improve the efficiency of clinical application. This study showed that the cutting fixture helped to rapidly complete lesion resection and fibula cutting, and the registration fixture enabled arrangement of the fibula pieces and completion of the mandible reconstruction in a timely manner. Our method can overcome the disadvantages of traditional surgery, which requires a long and difficult course of treatment and is liable to cause error. With the help of optimal cutting planning by the CAMRP and the 3D printed mandible resection jig and
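
    The segment-planning step is, at its core, a small optimization over the number and length of fibula pieces. A minimal sketch of the idea, with hypothetical length bounds and function names (the CAMRP program itself is not described at this level of detail), is:

```python
def plan_fibula_segments(defect_length_mm, min_seg_mm=20.0, max_seg_mm=40.0):
    """Fewest equal-length fibula segments whose length fits surgical bounds."""
    count = 1
    while defect_length_mm / count > max_seg_mm:
        count += 1
    segment = defect_length_mm / count
    if segment < min_seg_mm:
        raise ValueError("defect too short for the given segment bounds")
    return count, segment

print(plan_fibula_segments(95.0))   # -> 3 segments of ~31.7 mm each
```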

  17. Toxicogenomics concepts and applications to study hepatic effects of food additives and chemicals

    SciTech Connect

    Stierum, Rob. E-mail: stierum@voeding.tno.nl; Heijne, Wilbert; Kienhuis, Anne; Ommen, Ben van; Groten, John

    2005-09-01

    Transcriptomics, proteomics and metabolomics are genomics technologies with great potential in toxicological sciences. Toxicogenomics involves the integration of conventional toxicological examinations with gene, protein or metabolite expression profiles. An overview together with selected examples of the possibilities of genomics in toxicology is given. The expectations raised by toxicogenomics are earlier and more sensitive detection of toxicity. Furthermore, toxicogenomics will provide a better understanding of the mechanism of toxicity and may facilitate the prediction of toxicity of unknown compounds. Mechanism-based markers of toxicity can be discovered and improved interspecies and in vitro-in vivo extrapolations will drive model developments in toxicology. Toxicological assessment of chemical mixtures will benefit from the new molecular biological tools. In our laboratory, toxicogenomics is predominantly applied for elucidation of mechanisms of action and discovery of novel pathway-supported mechanism-based markers of liver toxicity. In addition, we aim to integrate transcriptome, proteome and metabolome data, supported by bioinformatics to develop a systems biology approach for toxicology. Transcriptomics and proteomics studies on bromobenzene-mediated hepatotoxicity in the rat are discussed. Finally, an example is shown in which gene expression profiling together with conventional biochemistry led to the discovery of novel markers for the hepatic effects of the food additives butylated hydroxytoluene, curcumin, propyl gallate and thiabendazole.

  18. Resources allocation in healthcare for cancer: a case study using generalised additive mixed models.

    PubMed

    Musio, Monica; Sauleau, Erik A; Augustin, Nicole H

    2012-11-01

    Our aim is to develop a method for helping resource re-allocation in healthcare linked to cancer, in order to replan the allocation of providers. Ageing of the population has a considerable impact on the use of health resources, because aged people require more specialised medical care, due notably to cancer. We propose a method for monitoring changes of cancer incidence in space and time, taking into account two age categories, according to the general organisation of healthcare. We use generalised additive mixed models with a Poisson response, according to the methodology presented in Wood, Generalised Additive Models: An Introduction with R. Chapman and Hall/CRC, 2006. Besides one-dimensional smooth functions accounting for non-linear effects of covariates, the space-time interaction can be modelled using scale-invariant smoothers. Incidence data collected by a general cancer registry between 1992 and 2007 in a specific area of France are studied. Our best model exhibits a strong increase of cancer incidence over time and a clear spatial pattern for people over 70 years, with a higher incidence in the central band of the region. This is a strong argument for re-allocating resources for cancer care of older people in this sub-region. PMID:23242683
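
    For readers unfamiliar with the model class, a generalised additive mixed model of the kind described has the general form (notation illustrative; the paper's exact specification may differ):

```latex
y_{it} \sim \mathrm{Poisson}(\mu_{it}), \qquad
\log \mu_{it} = \log E_{it} + \beta_0 + f_1(x_{it}) + f_2(t)
              + f_3(\mathbf{s}_i, t) + b_i
```

    where E_{it} is the expected count entering as an offset, f_1 and f_2 are one-dimensional smooths of covariates and time, f_3 is the scale-invariant space-time interaction smoother, and b_i is a region-level random effect.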

  19. Covalent binding of aniline to humic substances. 2. 15N NMR studies of nucleophilic addition reactions

    USGS Publications Warehouse

    Thorn, K.A.; Pettigrew, P.J.; Goldenberg, W.S.; Weber, E.J.

    1996-01-01

    Aromatic amines are known to undergo covalent binding with humic substances in the environment. Although previous studies have examined reaction conditions and proposed mechanisms, there has been no direct spectroscopic evidence for the covalent binding of the amines to the functional groups in humic substances. In order to further elucidate the reaction mechanisms, the Suwannee River and IHSS soil fulvic and humic acids were reacted with 15N-labeled aniline at pH 6 and analyzed using 15N NMR spectrometry. Aniline underwent nucleophilic addition reactions with the quinone and other carbonyl groups in the samples and became incorporated in the form of anilinohydroquinone, anilinoquinone, anilide, imine, and heterocyclic nitrogen, the latter comprising 50% or more of the bound amine. The anilide and anilinohydroquinone nitrogens were determined to be susceptible to chemical exchange by ammonia. In the case of Suwannee River fulvic acid, reaction under anoxic conditions and pretreatment with sodium borohydride or hydroxylamine prior to reaction under oxic conditions resulted in a decrease in the proportion of anilinohydroquinone nitrogen incorporated. The relative decrease in the incorporation of anilinohydroquinone nitrogen with respect to anilinoquinone nitrogen under anoxic conditions suggested that inter- or intramolecular redox reactions accompanied the nucleophilic addition reactions.

  20. Studies on the reuse of waste printed circuit board as an additive for cement mortar.

    PubMed

    Ban, Bong-Chan; Song, Jong-Yoon; Lim, Joong-Yeon; Wang, Soo-Kyoon; An, Kwang-Guk; Kim, Dong-Su

    2005-01-01

    The recent development in electronic industries has generated a drastic increase in production of printed circuit boards (PCB). Accordingly, the amount of waste PCB from electronic productions and waste electronics and its environmental impact such as soil and groundwater contamination have become a great concern. This study aims to propose a method for reuse of waste PCB as an additive for cement mortar. Although the expansibility of waste PCB powder finer than 0.08 mm in water was observed to be greater than 2.0%, the maximum expansion rates in water for 0.08 to approximately 0.15 and 0.15 to approximately 0.30 mm sized PCB powders were less than 2.0%, which satisfied the necessary condition as an alternative additive for cement mortar in place of sand. The difference in the compressive strength of standard mortar and waste PCB added mortar was observed to be less than 10% and their difference was expected to be smaller after prolonged aging. The durability of waste PCB added cement mortar was also examined through dry/wet conditioning cyclic tests and acidic/alkaline conditioning tests. From the tests, both weight and compressive strength of cement mortar were observed to be recovered with aging. The leaching test for heavy metals from waste PCB added mortar showed that no heavy metal ions such as copper, lead, or cadmium were detected in the leachate, which resulted from fixation effect of the cement hydrates.

  1. Assessment of Nano Cellulose from Peach Palm Residue as Potential Food Additive: Part II: Preliminary Studies.

    PubMed

    Andrade, Dayanne Regina Mendes; Mendonça, Márcia Helena; Helm, Cristiane Vieira; Magalhães, Washington L E; de Muniz, Graciela Ines Bonzon; Kestur, Satyanarayana G

    2015-09-01

    High consumption of dietary fibers in the diet is related to the reduction of the risk of non-transmitting of chronic diseases, prevention of the constipation etc. Rich diets in dietary fibers promote beneficial effects for the metabolism. Considering the above and recognizing the multifaceted advantages of nano materials, there have been many attempts in recent times to use the nano materials in the food sector including as food additive. However, whenever new product for human and animal consumption is developed, it has to be tested for their effectiveness regarding improvement in the health of consumers, safety aspects and side effects. However, before it is tried with human beings, normally such materials would be assessed through biological tests on a living organism to understand its effect on health condition of the consumer. Accordingly, based on the authors' finding reported in a previous paper, this paper presents body weight, biochemical (glucose, cholesterol and lipid profile in blood, analysis of feces) and histological tests carried out with biomass based cellulose nano fibrils prepared by the authors for its possible use as food additive. Preliminary results of the study with mice have clearly brought out potential of these fibers for the said purpose. PMID:26344977

  3. Mechanisms on electrical breakdown strength increment of polyethylene by aromatic carbonyl compounds addition: a theoretical study.

    PubMed

    Zhang, Hui; Shang, Yan; Wang, Xuan; Zhao, Hong; Han, Baozhong; Li, Zesheng

    2013-12-01

    A theoretical investigation of the mechanisms of electrical breakdown strength increment of polyethylene is presented at the atomic and molecular levels. It is found that the addition of aromatic carbonyl compounds as voltage stabilizers is one of the important factors for increasing the electrical breakdown strength of polyethylene, as the additives can trap hot electrons, absorb the energy of hot electrons, and transform the aliphatic cation into a relatively stable aromatic cation to prevent degradation of the polyethylene matrix. The HOMO-LUMO energy gaps (E(g)), ionization potentials (IPs), and electron affinities (EAs) at the ground states of a series of aromatic carbonyl compounds were obtained at the B3LYP/6-311+G(d,p) level. The theoretical results are in good agreement with the available experimental findings and show that 2,4-dioctyloxybenzophenone (Bzo) and 4,4'-didodecyloxybenzil (Bd) molecules can effectively increase the electrical breakdown strength when doped into polyethylene, because of their much smaller E(g) values compared with all the other studied aromatic carbonyl molecules and their excellent compatibility with the polymer matrix.
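
    As a concrete, minimal illustration of the kind of quantity computed here, the following PySCF sketch evaluates a B3LYP HOMO-LUMO gap for formaldehyde, a small carbonyl stand-in (the paper used B3LYP/6-311+G(d,p) on much larger substituted benzophenones; this is not the authors' setup):

```python
from pyscf import gto, dft

# Formaldehyde geometry (angstrom); a small carbonyl chosen only for speed.
mol = gto.M(
    atom="""O 0 0 1.208
            C 0 0 0
            H 0 0.943 -0.587
            H 0 -0.943 -0.587""",
    basis="6-31+g(d,p)")

mf = dft.RKS(mol)
mf.xc = "b3lyp"
mf.kernel()

n_occ = mol.nelectron // 2                 # doubly occupied orbitals (RKS)
homo, lumo = mf.mo_energy[n_occ - 1], mf.mo_energy[n_occ]
print(f"HOMO-LUMO gap: {(lumo - homo) * 27.2114:.2f} eV")  # hartree -> eV
```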

  4. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  5. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the implementation of the RC4 stream cipher available in the OpenSSL library, and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
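
    Self-composition reduces a relational, two-run property (such as "the visible output does not depend on the secret input") to an ordinary safety property of a single program, which a deductive verifier can then discharge. A toy sketch of the transformation (illustrative only; the paper works on verified C code and proves, rather than tests, the assertion):

```python
def program(low, high):
    # Toy computation whose visible (low) result must not depend on `high`.
    scratch = high * high              # secret-dependent intermediate
    return 2 * low + (scratch - scratch)

def composed(low, high_1, high_2):
    # The program conjoined with a renamed copy of itself, sharing `low`.
    return program(low, high_1), program(low, high_2)

# The two-run property becomes a plain assertion over `composed`; a
# deductive verifier would prove it for all inputs instead of sampling.
for low in range(10):
    for h1 in range(-3, 4):
        for h2 in range(-3, 4):
            out_1, out_2 = composed(low, h1, h2)
            assert out_1 == out_2, "high input influenced low output"
print("non-interference holds on the sampled inputs")
```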

  6. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designers' confidence in the correctness of higher-level behavioral models.
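
    The translation concept can be pictured with a toy netlist-to-HOL printer (hypothetical code, not the actual translator): each gate instance becomes a relation over its signals, instances are conjoined, and internal wires are existentially quantified, following the usual HOL hardware-verification style:

```python
def netlist_to_hol(name, ports, gates):
    """Render a structural netlist as a HOL-style relational definition."""
    body = " /\\ ".join(f"{gate}({', '.join(sigs)})" for gate, sigs in gates)
    internal = sorted({s for _, sigs in gates for s in sigs} - set(ports))
    quantifier = "".join(f"?{wire}. " for wire in internal)
    return f"|- {name}({', '.join(ports)}) = ({quantifier}{body})"

# A full adder built from two half adders and an OR gate; the internal
# wires s1, c1 and c2 are existentially quantified in the definition.
print(netlist_to_hol(
    "FULL_ADDER", ["a", "b", "cin", "sum", "cout"],
    [("HALF_ADDER", ["a", "b", "s1", "c1"]),
     ("HALF_ADDER", ["s1", "cin", "sum", "c2"]),
     ("OR", ["c1", "c2", "cout"])]))
```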

  7. Study on Friction and Wear Properties of Silver Matrix Brush Material with Different Additives

    NASA Astrophysics Data System (ADS)

    Chen, Xiaoli; Wang, Wenfang; Hong, Yu; Wu, Yucheng

    2013-07-01

    The friction and wear processes of AgCuX (X = G, CF or AlN) composite-CuAgV alloy friction pairs, and the effects of different additive contents in the silver-based composites on friction and wear behavior, are studied in this paper. The microstructure of the brush wear surface is observed by SEM. The results show that with a graphite content of 9 wt.%, the Ag-Cu-CF-G composite exhibits the best wear properties, and with an aluminum nitride content of 0.5 wt.%, the Ag-Cu-AlN-G composite has the best overall performance. The wear loss of both composites increases with both pressure and speed, but once speed reaches a critical value, the rate of increase in wear loss levels off.

  8. Comparative study of dimensional accuracy of different impression techniques using addition silicone impression material.

    PubMed

    Penaflor, C F; Semacio, R C; De Las Alas, L T; Uy, H G

    1998-01-01

    This study compared the dimensional accuracy of the single, double with spacer, double with cut-out and double mix impression techniques using an addition silicone impression material. A typhodont Ivorine teeth model with six (6) full-crown tooth preparations was used as the positive control. Two stone replication models for each impression technique were made as test materials. Accuracy of the techniques was assessed by measuring four dimensions on the stone dies poured from the impressions of the Ivorine teeth model. Results indicated that most of the measurements of height, width and diameter slightly decreased, and a few increased, compared with the Ivorine teeth model. The double with cut-out and double mix techniques presented the least difference from the master model compared with the other two impression techniques. PMID:10202524

  9. Study of cadmium, zinc and lead biosorption by orange wastes using the subsequent addition method.

    PubMed

    Pérez-Marín, A B; Ballester, A; González, F; Blázquez, M L; Muñoz, J A; Sáez, J; Zapata, V Meseguer

    2008-11-01

    The biosorption of several metals (Cd2+, Zn2+ and Pb2+) by orange wastes has been investigated in binary systems. Multicomponent sorption isotherms were obtained using an original procedure, similar to that proposed by Pagnanelli et al. [Pagnanelli, F., Petrangeli, M.P., Toro, L., Trifoni, M., Veglio, F., 2001a. Biosorption of metal ions on Arthrobacter sp.: biomass characterization and biosorption modelling. Environ. Sci. Technol. 34, 2773-2778] for monoelement systems, known as the subsequent addition method (SAM). Experimental sorption data were analysed using an extended multicomponent Langmuir equation. The maximum sorption uptake was approximately 0.25 mmol/g for the three binary systems studied. The reliability of the proposed procedure for obtaining the equilibrium data in binary systems was verified by means of a statistical F-test. PMID:18440805
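
    For reference, the extended multicomponent Langmuir equation takes, for a binary system, the general form (notation illustrative):

```latex
q_i = \frac{q_{\max,i}\, b_i\, C_i}{1 + b_1 C_1 + b_2 C_2}, \qquad i \in \{1, 2\}
```

    where q_i is the equilibrium uptake of metal i, C_i its equilibrium concentration, and q_{max,i} and b_i are the fitted capacity and affinity parameters; competition between the two metals enters through the shared denominator.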

  10. Spectroscopic studies of nucleic acid additions during seed-mediated growth of gold nanoparticles

    PubMed Central

    Tapp, Maeling; Sullivan, Rick; Dennis, Patrick; Naik, Rajesh R.

    2015-01-01

    The effect of adding nucleic acids to gold seeds during the growth stage of either nanospheres or nanorods was investigated using UV-Vis spectroscopy to reveal any oligonucleotide base- or structure-specific effects on nanoparticle growth kinetics or plasmonic signatures. Spectral data indicate that the presence of DNA duplexes during seed ageing drastically accelerates nanosphere growth, while the addition of single-stranded polyadenine at any point during seed ageing induces nanosphere aggregation. For seeds added to a gold nanorod growth solution, single-stranded polythymine induces a modest blue-shift in the longitudinal peak wavelength. Moreover, a particular sequence composed of 50% thymine bases was found to induce a faster, more dramatic blue-shift in the longitudinal peak wavelength than any of the homopolymer incubation cases. Monomeric forms of the nucleic acids, however, do not yield discernible spectral differences in any of the gold suspensions studied. PMID:25960601

  11. Rapid identification of color additives, using the C18 cartridge: collaborative study.

    PubMed

    Young, M L

    1988-01-01

    Nine laboratories collaboratively studied a method for the separation and identification of the 7 permitted FD&C color additives (Red Nos. 3 and 40; Blue Nos. 1 and 2; Yellow Nos. 5 and 6; Green No. 3) and the banned FD&C Red No. 2 in foods. The method is based on the use of a commercial C18 cartridge and spectrophotometry or thin-layer chromatography. Collaborators analyzed 5 commercial products (noodles, candy, carbonated soda, flavored gelatin, and powdered drink) and 2 dye mixtures (one containing FD&C Red Nos. 2, 3, and 40; the other containing FD&C Green No. 3 and Red No. 3). All of the colors were identified with little or no difficulty by 8 collaborators. The method has been adopted as official first action.

  12. Genetic assessment of additional endophenotypes from the Consortium on the Genetics of Schizophrenia Family Study.

    PubMed

    Greenwood, Tiffany A; Lazzeroni, Laura C; Calkins, Monica E; Freedman, Robert; Green, Michael F; Gur, Raquel E; Gur, Ruben C; Light, Gregory A; Nuechterlein, Keith H; Olincy, Ann; Radant, Allen D; Seidman, Larry J; Siever, Larry J; Silverman, Jeremy M; Stone, William S; Sugar, Catherine A; Swerdlow, Neal R; Tsuang, Debby W; Tsuang, Ming T; Turetsky, Bruce I; Braff, David L

    2016-01-01

    The Consortium on the Genetics of Schizophrenia Family Study (COGS-1) has previously reported our efforts to characterize the genetic architecture of 12 primary endophenotypes for schizophrenia. We now report the characterization of 13 additional measures derived from the same endophenotype test paradigms in the COGS-1 families. Nine of the measures were found to discriminate between schizophrenia patients and controls, were significantly heritable (31 to 62%), and were sufficiently independent of previously assessed endophenotypes, demonstrating utility as additional endophenotypes. Genotyping via a custom array of 1536 SNPs from 94 candidate genes identified associations for CTNNA2, ERBB4, GRID1, GRID2, GRIK3, GRIK4, GRIN2B, NOS1AP, NRG1, and RELN across multiple endophenotypes. An experiment-wide p value of 0.003 suggested that the associations across all SNPs and endophenotypes collectively exceeded chance. Linkage analyses performed using a genome-wide SNP array further identified significant or suggestive linkage for six of the candidate endophenotypes, with several genes of interest located beneath the linkage peaks (e.g., CSMD1, DISC1, DLGAP2, GRIK2, GRIN3A, and SLC6A3). While the partial convergence of the association and linkage likely reflects differences in density of gene coverage provided by the distinct genotyping platforms, it is also likely an indication of the differential contribution of rare and common variants for some genes and methodological differences in detection ability. Still, many of the genes implicated by COGS through endophenotypes have been identified by independent studies of common, rare, and de novo variation in schizophrenia, all converging on a functional genetic network related to glutamatergic neurotransmission that warrants further investigation. PMID:26597662

  14. A randomized trial of the effect of a plant-based dietary pattern on additional breast cancer events and survival: the Women's Healthy Eating and Living (WHEL) Study.

    PubMed

    Pierce, John P; Faerber, Susan; Wright, Fred A; Rock, Cheryl L; Newman, Vicky; Flatt, Shirley W; Kealey, Sheila; Jones, Vicky E; Caan, Bette J; Gold, Ellen B; Haan, Mary; Hollenbach, Kathryn A; Jones, Lovell; Marshall, James R; Ritenbaugh, Cheryl; Stefanick, Marcia L; Thomson, Cynthia; Wasserman, Linda; Natarajan, Loki; Thomas, Ronald G; Gilpin, Elizabeth A

    2002-12-01

    The Women's Healthy Eating and Living (WHEL) Study is a multisite randomized controlled trial of the effectiveness of a high-vegetable, low-fat diet, aimed at markedly raising circulating carotenoid concentrations from food sources, in reducing additional breast cancer events and early death in women with early-stage invasive breast cancer (within 4 years of diagnosis). The study randomly assigned 3088 such women to an intensive diet intervention or to a comparison group between 1995 and 2000 and is expected to follow them through 2006. Two-thirds of these women were under 55 years of age at randomization. This research study has a coordinating center and seven clinical sites. Randomization was stratified by age, stage of tumor and clinical site. A comprehensive intervention program that includes intensive telephone counseling, cooking classes and print materials helps shift the dietary pattern of women in the intervention group. Through an innovative telephone counseling program, dietary counselors encourage women in the intervention group to meet the following daily behavioral targets: five vegetable servings, 16 ounces of vegetable juice, three fruit servings, 30 g of fiber and 15-20% energy from fat. Adherence assessments occur at baseline, 6, 12, 24 or 36, 48 and 72 months. These assessments can include dietary intake (repeated 24-hour dietary recalls and food frequency questionnaire), circulating carotenoid concentrations, physical measures and questionnaires about health symptoms, quality of life, personal habits and lifestyle patterns. Outcome assessments are completed by telephone interview every 6 months with medical record verification. We will assess evidence of effectiveness by the length of the breast cancer event-free interval and by overall survival, both in all women in the study and specifically in women under and over the age of 55 years.

  15. Study of triallyl phosphate as an electrolyte additive for high voltage lithium-ion cells

    NASA Astrophysics Data System (ADS)

    Xia, J.; Madec, L.; Ma, L.; Ellis, L. D.; Qiu, W.; Nelson, K. J.; Lu, Z.; Dahn, J. R.

    2015-11-01

    The role of triallyl phosphate as an electrolyte additive in Li(Ni0.42Mn0.42Co0.16)O2/graphite pouch cells was studied using ex-situ gas measurements, ultra-high precision coulometry, automated storage experiments, electrochemical impedance spectroscopy, long-term cycling and X-ray photoelectron spectroscopy. Cells containing triallyl phosphate produced less gas during formation, cycling and storage than control cells. The use of triallyl phosphate led to higher coulombic efficiency and smaller charge endpoint capacity slippage during ultra-high precision charger testing. Cells containing triallyl phosphate showed a smaller potential drop during 500 h storage at 40 °C and 60 °C, and the voltage drop decreased as the triallyl phosphate content in the electrolyte increased. However, large amounts of triallyl phosphate (>3% by weight in the electrolyte) led to large impedance after cycling and storage. Symmetric cell studies showed that large amounts of triallyl phosphate (5% or more) led to significant impedance increases at both the negative and positive electrodes. X-ray photoelectron spectroscopy studies suggested that the high impedance came from the polymerization of triallyl phosphate molecules, which formed thick solid electrolyte interphase films at the surfaces of both the negative and positive electrodes. An optimal amount of 2%-3% triallyl phosphate led to better capacity retention during long-term cycling.

  16. Synthesis, verification, and optimization of systolic arrays

    SciTech Connect

    Rajopadhye, S.V.

    1986-01-01

    This dissertation addresses the issue of providing a sound theoretical basis for three important issues relating to systolic arrays, namely synthesis, verification, and optimization. Prior research has concentrated on analysis of the dependency structure of the computation, and there have been numerous approaches to mapping this dependency structure onto a locally interconnected network. This study pursues a similar approach, but with a major generalization of the class of problems analyzed. In earlier research, it was essential that the dependencies were expressible as constant vectors (from a point in the domain to the points that it depended on); here they are permitted to be arbitrary linear functions of the point. Theory for synthesizing systolic architectures from such generalized specifications is developed. Also, a systematic (mechanizable) approach to the synthesis of systolic architectures that have control signals is presented. In the areas of verification and optimization, a rigorous mathematical framework is presented that permits reasoning about the behavior of systolic arrays as functions on streams of data. Using this approach, the verification of such architectures reduces to the problem of verification of functional programs.

  17. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed in consideration of the limited computational capacity of WSNs. The proposed scheme introduces a virtual force model to determine locations by incremental refinement. Aiming to solve the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using a localization reliability model is proposed to re-locate drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
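
    The virtual force idea can be sketched generically (this is an illustrative reconstruction, not the paper's exact algorithm): each anchor exerts a force on the position estimate proportional to the disagreement between measured and estimated distances, and the estimate is refined incrementally until the forces balance:

```python
import math

def refine_position(estimate, anchors, measured, steps=200, gain=0.1):
    """anchors: list of (x, y); measured: ranged distances to each anchor."""
    x, y = estimate
    for _ in range(steps):
        fx = fy = 0.0
        for (ax, ay), d in zip(anchors, measured):
            dx, dy = ax - x, ay - y
            dist = math.hypot(dx, dy) or 1e-9
            err = dist - d            # > 0: too far away, pulled toward anchor
            fx += gain * err * dx / dist
            fy += gain * err * dy / dist
        x, y = x + fx, y + fy         # one incremental refinement step
    return x, y

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (4.0, 3.0)
measured = [math.dist(true_pos, a) for a in anchors]
print(refine_position((5.0, 5.0), anchors, measured))   # converges near (4, 3)
```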

  18. Associations Between Reasons for Dating, Orientation, Commitment and Behavior: Verification of a Study by McDaniel.

    ERIC Educational Resources Information Center

    Gittman, Elizabeth

    A study of dating behavior by McDaniel was replicated, using a population of single adults instead of college undergraduates. The hypothesis stated that recreational dating was associated with peer orientation, low commitment and assertive behavior, and that mate selection dating was associated with family orientation, medium commitment and…

  19. The impact of bismuth addition to sequential treatment on Helicobacter pylori eradication: A pilot study.

    PubMed

    Basyigit, Sebahat; Kefeli, Ayse; Sapmaz, Ferdane; Yeniova, Abdullah Ozgür; Asilturk, Zeliha; Hokkaomeroglu, Murat; Uzman, Metin; Nazligul, Yasar

    2015-10-25

    The success of current anti-Helicobacter pylori (H. pylori) treatment protocols is reported to be decreasing over the years, and research is needed to strengthen H. pylori eradication treatment. Sequential treatment (ST), one of the treatment modalities for H. pylori eradication, comprises amoxicillin 1 g b.i.d. and a proton pump inhibitor b.i.d. for the first 5 days, followed by clarithromycin 500 mg b.i.d., metronidazole 500 mg b.i.d. and a proton pump inhibitor b.i.d. for the remaining 5 days. In this study, we investigated the efficacy and tolerability of adding bismuth to ST. We included patients who underwent upper gastrointestinal endoscopy in which H. pylori infection was diagnosed by histological examination of antral and corporal gastric mucosa biopsies. Participants were randomly administered the ST or bismuth-containing ST (BST) protocol as first-line H. pylori eradication therapy. Participants were tested for eradication success by urea breath test 6 weeks after the completion of treatment. One hundred and fifty patients (93 female, 57 male) were enrolled. There were no significant differences in eradication rates in either the intention-to-treat population (70.2%, 95% confidence interval [CI]: 66.3-74.1% vs. 71.8%, 95% CI: 61.8-81.7%, for ST and BST, respectively, p>0.05) or the per-protocol population (74.6%, 95% CI: 63.2-85.8% vs. 73.7%, 95% CI: 63.9-83.5% for ST and BST, respectively, p>0.05). Despite the undeniable effect of bismuth, there may be several possible reasons for the unsatisfactory eradication success. Drug administration time, coadministration of other drugs, and possible H. pylori resistance to bismuth may affect eradication success. The addition of bismuth subcitrate to the ST regimen does not provide a significant increase in eradication rates.

  20. Experimental Study of Disruption of Columnar Grains During Rapid Solidification in Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Manogharan, Guha; Yelamanchi, Bharat; Aman, Ronald; Mahbooba, Zaynab

    2016-03-01

    Over the years, many studies have been conducted to analyze the grain structures of metal alloys during additive manufacturing in order to improve mechanical properties. In particular, columnar grains are observed predominantly during rapid solidification of molten metal. This leads to lower mechanical properties and requires expensive secondary heat-treatment processes. This study is aimed at disrupting the formation of columnar grain growth during rapid solidification using ultrasonic vibration, and analyzes the effects on grain structure and mechanical properties. A gas-metal arc welder mounted on a RepRap-based low-cost metal 3D printer was used to deposit ER70S-6 mild steel layers on a plate. A contact-type ultrasonic transducer with a control system to vary the frequency and power of the vibration was used. The effects of ultrasonic vibration were determined from statistical analysis of the microstructure and from micro-indentation on the deposited layer and heat-affected zone. It was found that both frequency and the interaction between frequency and power had a significant impact on grain refinement, reducing the average grain size by up to 10.64% and increasing the number of grains by approximately 41.78%. Analysis of the micro-indentation tests showed an increase of approximately 14.30% in micro-hardness due to the applied frequency during rapid solidification. A pole diagram shows that application of vibration randomizes grain orientation. Along with the results from this study, further efforts in modeling and experimentation with multi-directional vibrations would lead to a better understanding of disrupting columnar grains in applications that use mechanical vibrations, such as welding, directed energy deposition, brazing, etc.

  1. Analytical and experimental studies of ventilation systems subjected to simulated tornado conditions: Verification of the TVENT computer code

    SciTech Connect

    Martin, R.A.; Gregory, W.S.; Ricketts, C.I.; Smith, P.R.; Littleton, P.E.; Talbott, D.V.

    1988-04-01

    Analytical and experimental studies of ventilation systems have been conducted to verify the Los Alamos National Laboratory TVENT accident analysis computer code for simulated tornado conditions. This code was developed as a user-friendly analysis tool for designers and regulatory personnel and predicts pressure and flow transients in arbitrary ventilation systems. The experimental studies used two relatively simple, yet sensitive, physical systems designed using similitude analysis. These physical models were instrumented end-to-end for pressure and volumetric flow rate and then subjected to the worst credible tornado conditions using a special blowdown apparatus. We verified TVENT by showing that it successfully predicted our experimental results. By comparing experimental results from both physical models with TVENT results, we showed that we have derived the proper similitude relations (governed by compressibility effects) for all sizes of ventilation systems. As a by-product of our studies, we determined the need for fan speed variation modeling in TVENT. This modification was made and resulted in a significant improvement in our comparisons of analytical and experimental results.

  2. Evaluating the addition of positive reinforcement for learning a frightening task: a pilot study with horses.

    PubMed

    Heleski, Camie; Bauson, Laura; Bello, Nora

    2008-01-01

    Horse training often relies upon negative reinforcement (NR). This study tested the hypothesis that adding positive reinforcement (PR) to NR would enhance learning in horses (n = 34) being taught to walk over a tarp (a novel, typically frightening task). Subjects were Arabians, and the same person handled all of them. Half were handled "traditionally" (NR only): the halter/lead was pulled; when the horse stepped forward, pressure was released; the process was repeated until the criterion was met (horse crossed the tarp with little/no obvious anxiety). The other half were handled traditionally but with the addition of PR (food + verbal praise) (NR + PR). Subjects "failed" the task if they refused to walk onto the tarp after 10 min. Nine horses failed; 6 of the 9 failures were from NR only, with no significant difference detected (p = .41). The study detected no difference in time to first crossing of the tarp (p = .30) or in total time to achieve the calmness criterion (p = .67). Overall, adding PR did not significantly enhance learning of this task. However, there were practical implications: adding PR made the task safer and less fatiguing for the handler. PMID:18569217

  4. Synthesis, Characterization, Molecular Modeling, and DNA Interaction Studies of Copper Complex Containing Food Additive Carmoisine Dye.

    PubMed

    Shahabadi, Nahid; Akbari, Alireza; Jamshidbeigi, Mina; Khodarahmi, Reza

    2016-06-01

    A copper complex of carmoisine dye, [Cu(carmoisine)2(H2O)2], was synthesized and characterized using physico-chemical and spectroscopic methods. The binding of this complex to calf thymus (ct) DNA was investigated by circular dichroism, absorption studies, emission spectroscopy, and viscosity measurements. UV-vis results confirmed that the Cu complex interacted with DNA to form a ground-state complex, and the observed binding constant (2 × 10(4) M(-1)) is more consistent with groove binding to DNA. Furthermore, the viscosity measurements showed that addition of the complex caused no significant change in DNA viscosity, indicating that an intercalation mode is ruled out. The thermodynamic parameters were calculated by the van't Hoff equation, which demonstrated that hydrogen bonds and van der Waals interactions played major roles in the reaction. The circular dichroism (CD) results suggested that the complex can change the conformation of DNA from a B-like form toward an A-like conformation. The cytotoxicity studies of the carmoisine dye and its copper complex indicated that both had anticancer effects on the HT-29 (colon cancer) cell line and that they may be new candidates for the treatment of colon cancer. PMID:27152751
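
    The van't Hoff analysis mentioned here extracts the binding enthalpy and entropy from binding constants K_b measured at several temperatures:

```latex
\ln K_b = -\frac{\Delta H^{\circ}}{R\,T} + \frac{\Delta S^{\circ}}{R},
\qquad \Delta G^{\circ} = \Delta H^{\circ} - T\,\Delta S^{\circ}
```

    so a linear fit of ln K_b against 1/T gives the enthalpy change from the slope and the entropy change from the intercept; negative values of both are conventionally interpreted as hydrogen bonding and van der Waals interactions dominating the binding, as concluded above.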

  5. Density functional theory study of the effects of alloying additions on sulfur adsorption on nickel surfaces

    NASA Astrophysics Data System (ADS)

    Malyi, Oleksandr I.; Chen, Zhong; Kulish, Vadym V.; Bai, Kewu; Wu, Ping

    2013-01-01

    Reactions of hydrogen sulfide (H2S) with Nickel/Yttria-doped zirconia (Ni/YDZ) anode materials can cause degradation of the performance of solid oxide fuel cells when S-containing fuels are used. In this paper, we employ density functional theory to investigate S adsorption on metal (M)-doped and undoped Ni(0 0 1) and Ni(1 1 1) surfaces. Based on the performed calculations, we analyze the effects of 12 alloying additions (Ag, Au, Al, Bi, Cd, Co, Cu, Fe, Sn, Sb, V, and Zn) on the temperature of transition between clean (S atoms do not adsorb on the surfaces) and contaminated (S atoms adsorb on the surfaces spontaneously) M-doped Ni surfaces for different concentrations of H2S in the fuel. The predicted results are consistent with many experimental studies relevant to S poisoning of both Ni/YDZ and M-doped Ni/YDZ anode materials. This study is important for understanding S poisoning phenomena and for developing new S-tolerant anode materials.
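
    A common thermodynamic framework for such clean/contaminated transition predictions (shown in illustrative notation; the paper's details may differ) references sulfur adsorption to the gas-phase reaction H2S + * -> S* + H2 and locates the transition temperature where the adsorption free energy crosses zero:

```latex
E_{\mathrm{ads}} = E_{\mathrm{slab+S}} - E_{\mathrm{slab}}
                 - \bigl(E_{\mathrm{H_2S}} - E_{\mathrm{H_2}}\bigr), \qquad
\Delta G_{\mathrm{ads}}(T) \approx E_{\mathrm{ads}} + \Delta\mu_{\mathrm{S}}(T)
                 + k_{\mathrm{B}} T \ln \frac{p_{\mathrm{H_2}}}{p_{\mathrm{H_2S}}}
```

    where Δμ_S(T) collects the temperature-dependent gas-phase corrections; the clean/contaminated transition temperature is the T at which ΔG_ads = 0 for a given p_H2S/p_H2 ratio, i.e. a given H2S concentration in the fuel.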

  7. Percutaneous Dorsal Instrumentation of Vertebral Burst Fractures: Value of Additional Percutaneous Intravertebral Reposition—Cadaver Study

    PubMed Central

    Krüger, Antonio; Schmuck, Maya; Noriega, David C.; Ruchholtz, Steffen; Baroud, Gamal; Oberkircher, Ludwig

    2015-01-01

    Purpose. The treatment of vertebral burst fractures is still controversial. The aim of this study is to evaluate the value of additional percutaneous intravertebral reduction when combined with dorsal instrumentation. Methods. In this biomechanical cadaver study, twenty-eight spine segments (T11-L3) were used (male donors, mean age 64.9 ± 6.5 years). Burst fractures of L1 were generated using a standardised protocol. After fracture, all spines were allocated to four similar groups and randomised according to surgical technique (posterior instrumentation; posterior instrumentation + intravertebral reduction device + cement augmentation; posterior instrumentation + intravertebral reduction device without cement; and intravertebral reduction device + cement augmentation). After treatment, 100,000 cycles (100-600 N, 3 Hz) were applied using a servohydraulic loading frame. Results. Overall anatomical restoration was better in all groups where the intravertebral reduction device was used (p < 0.05). In particular, it was possible to restore the central endplates (p > 0.05). All techniques decreased narrowing of the spinal canal. After loading, clearance could be maintained in all groups fitted with the intravertebral reduction device. Narrowing increased in the group treated with dorsal instrumentation alone. Conclusions. For height and anatomical restoration, the combination of an intravertebral reduction device with dorsal instrumentation showed significantly better results than dorsal instrumentation alone. PMID:26137481

  8. Dosimetric Study and Verification of Total Body Irradiation Using Helical Tomotherapy and its Comparison to Extended SSD Technique

    SciTech Connect

    Zhuang, Audrey H.; Liu An; Schultheiss, Timothy E.; Wong, Jeffrey Y.C.

    2010-01-01

    The American College of Radiology practice guideline for total body irradiation (TBI) requires a back-up treatment delivery system. This study investigates the development of helical tomotherapy (HT) for delivering TBI and compares it with the conventional extended source-to-surface distance (X-SSD) technique. Four patients' head-to-thigh computed tomographic images were used in this study, with the target defined as the body volume minus the left and right lungs. HT treatment plans with the standard TBI prescription (1.2 Gy/fx, 10 fractions) were generated and verified on phantoms. To compare HT plans with X-SSD treatment, the dose distribution of the X-SSD technique was simulated using the Eclipse software. The average dose received by 90% of the target volume was 12.3 Gy (range, 12.2-12.4 Gy) for HT plans and 10.3 Gy (range, 10.08-10.58 Gy) for X-SSD plans (p < 0.001). The left and right lung median doses were 5.44 Gy and 5.40 Gy, respectively, for HT plans and 8.34 Gy and 8.95 Gy, respectively, for X-SSD treatment. The treatment planning time was comparable between the two methods. The beam delivery time of HT treatment was longer than that of X-SSD treatment. In conclusion, HT-based TBI plans have better dose coverage of the target and better dose sparing of the lungs compared with the X-SSD technique, which applies dose compensators, lung blocks, and electron boosts. This study demonstrates that delivering TBI with HT is possible. Clinical validation of the feasibility of this approach would be of interest in the future.

  9. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    SciTech Connect

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.; Gastelum, Zoe N.; Kreyling, Sean J.; West, Curtis L.

    2014-05-13

    Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty-relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity. The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation

  10. Automated radiotherapy treatment plan integrity verification

    SciTech Connect

    Yang Deshan; Moore, Kevin L.

    2012-03-15

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.
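
    The rule-driven structure described here is easy to picture with a toy sketch (illustrative only; the clinical tool runs inside the Pinnacle TPS via its scripting layer and PERL, not standalone Python): plan data become a record, checks become predicates, and results are rendered as an HTML report:

```python
# Hypothetical plan record and rules; field names are invented for the sketch.
plan = {"rx_dose": 60.0, "dose_per_fx": 2.0, "fractions": 30, "grid_mm": 3.0}

rules = [
    ("Rx dose = dose/fraction x fractions",
     lambda p: abs(p["dose_per_fx"] * p["fractions"] - p["rx_dose"]) < 1e-6),
    ("dose grid resolution <= 4 mm",
     lambda p: p["grid_mm"] <= 4.0),
]

# Evaluate each rule against the plan and emit a small HTML summary table.
rows = "".join(
    f"<tr><td>{name}</td><td>{'PASS' if check(plan) else 'FAIL'}</td></tr>"
    for name, check in rules)
print(f"<table><tr><th>Check</th><th>Result</th></tr>{rows}</table>")
```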

  11. Experimental study of enhanced heat transfer by addition of CuO nanoparticle

    NASA Astrophysics Data System (ADS)

    Jesumathy, Stella; Udayakumar, M.; Suresh, S.

    2012-06-01

    An energy storage system has been designed to study the thermal characteristics of paraffin wax with embedded nano-sized copper oxide (CuO) particles. This paper presents studies of the phase transition times, heat fraction and heat transfer characteristics of paraffin wax as a phase change material (PCM) embedded with CuO nanoparticles. CuO particles of 40 nm mean size, at 2, 5 and 10% by weight, were dispersed in the PCM for this study. Experiments were performed on a heat exchanger with 1.5-10 l/min of heat transfer fluid (HTF) flow. Time-based variations of the temperature distributions are revealed by the melting and solidification curves. The results strongly suggest that thermal conductivity is enhanced by 6, 6.7 and 7.8% in the liquid state, and dynamic viscosity by 5, 14 and 30%, with increasing nanoparticle mass fraction. The thermal conductivity ratio of the composites can be augmented by a factor of up to 1.3. The heat transfer coefficient during solidification increased by about 78% for the maximum flow rate. The analysis of the experimental results reveals that the addition of copper oxide nanoparticles to paraffin wax enhances both conduction and natural convection very effectively in the composites. Paraffin wax-based composites with an optimal fraction of copper oxide nanoparticles have great potential for energy storage applications such as industrial waste heat recovery, solar thermal applications and solar-based dynamic space power generation.

  12. Increased Risk of Additional Cancers Among Patients with Gastrointestinal Stromal Tumors: A Population-Based Study

    PubMed Central

    Murphy, James D.; Ma, Grace L.; Baumgartner, Joel M.; Madlensky, Lisa; Burgoyne, Adam M.; Tang, Chih-Min; Martinez, Maria Elena; Sicklick, Jason K.

    2015-01-01

    Purpose Most gastrointestinal stromal tumors (GIST) are considered non-hereditary or sporadic. However, single-institution studies suggest that GIST patients develop additional malignancies with increased frequencies. We hypothesized that we could gain greater insight into possible associations between GIST and other malignancies using a national cancer database inquiry. Methods Patients diagnosed with GIST (2001–2011) in the Surveillance, Epidemiology, and End Results database were included. Standardized prevalence ratios (SPRs) and standardized incidence ratios (SIRs) were used to quantify cancer risks incurred by GIST patients before and after GIST diagnoses, respectively, when compared with the general U.S. population. Results Of 6,112 GIST patients, 1,047 (17.1%) had additional cancers. There were significant increases in overall cancer rates: 44% (SPR=1.44) before diagnosis and 66% (SIR=1.66) after GIST diagnoses. Malignancies with significantly increased occurrence both before/after diagnoses included other sarcomas (SPR=5.24/SIR=4.02), neuroendocrine-carcinoid tumors (SPR=3.56/SIR=4.79), non-Hodgkin’s lymphoma (SPR=1.69/SIR=1.76), and colorectal adenocarcinoma (SPR=1.51/SIR=2.16). Esophageal adenocarcinoma (SPR=12.0), bladder adenocarcinoma (SPR=7.51), melanoma (SPR=1.46), and prostate adenocarcinoma (SPR=1.20) were significantly more common only before GIST. Ovarian carcinoma (SIR=8.72), small intestine adenocarcinoma (SIR=5.89), papillary thyroid cancer (SIR=5.16), renal cell carcinoma (SIR=4.46), hepatobiliary adenocarcinomas (SIR=3.10), gastric adenocarcinoma (SIR=2.70), pancreatic adenocarcinoma (SIR=2.03), uterine adenocarcinoma (SIR=1.96), non-small cell lung cancer (SIR=1.74), and transitional cell carcinoma of the bladder (SIR=1.65) were significantly more common only after GIST. Conclusion This is the first population-based study to characterize the associations and temporal relationships between GIST and other cancers, both by site and
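
    The SPR and SIR figures quoted above are observed-to-expected ratios against general-population rates. A minimal sketch of the SIR computation follows; the person-years, reference rates and case count are made up for illustration (not SEER values), and scipy is assumed for the exact Poisson confidence interval.

    ```python
    # Illustrative standardized incidence ratio (SIR): observed cases divided by
    # the count expected under general-population rates. Numbers are invented.
    from scipy.stats import chi2

    # Age-stratified person-years accrued by the cohort after GIST diagnosis,
    # paired with matching population incidence rates (cases per person-year).
    strata = [
        (12_000, 4.1e-4),   # ages 40-59 (hypothetical)
        (18_500, 9.8e-4),   # ages 60-74 (hypothetical)
        (6_300, 1.6e-3),    # ages 75+   (hypothetical)
    ]
    observed = 58           # additional cancers actually seen (hypothetical)

    # Expected count under general-population rates.
    expected = sum(py * rate for py, rate in strata)
    sir = observed / expected

    # Exact Poisson 95% CI for the observed count, scaled by the expected count.
    lo = chi2.ppf(0.025, 2 * observed) / 2 / expected
    hi = chi2.ppf(0.975, 2 * (observed + 1)) / 2 / expected
    print(f"SIR = {sir:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```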

  13. Improved fluid dynamics similarity, analysis and verification. Part 5: Analytical and experimental studies of thermal stratification phenomena

    NASA Technical Reports Server (NTRS)

    Winter, E. R. F.; Schoenhals, R. J.; Haug, R. I.; Libby, T. L.; Nelson, R. N.; Stevenson, W. H.

    1968-01-01

    The stratification behavior of a contained fluid subjected to transient free convection heat transfer was studied. A rectangular vessel was employed with heat transfer from two opposite walls of the vessel to the fluid. The wall temperature was increased suddenly to initiate the process and was then maintained constant throughout the transient stratification period. Thermocouples were positioned on a post at the center of the vessel. They were adjusted so that temperatures could be measured at the fluid surface and at specific depths beneath the surface. The predicted values of the surface temperature and the stratified layer thickness were found to agree reasonably well with the experimental measurements. The experiments also provided information on the transient centerline temperature distribution and the transient flow distribution.

  14. Dosimetric study and in-vivo dose verification for conformal avoidance treatment of anal adenocarcinoma using helical tomotherapy

    SciTech Connect

    Han Chunhui . E-mail: chan@coh.org; Chen Yijen; Liu An; Schultheiss, Timothy E.; Wong, Jeffrey Y.C.

    2007-04-01

    This study evaluated the efficacy of using helical tomotherapy for conformal avoidance treatment of anal adenocarcinoma. We retrospectively generated step-and-shoot intensity-modulated radiotherapy (sIMRT) plans and helical tomotherapy plans for two anal cancer patients, one male and one female, who were treated by the sIMRT technique. Dose parameters for the planning target volume (PTV) and the organs-at-risk (OARs) were compared between the sIMRT and the helical tomotherapy plans. The helical tomotherapy plans showed better dose homogeneity in the PTV, better dose conformity around the PTV, and, therefore, better sparing of nearby OARs compared with the sIMRT plans. In-vivo skin dose measurements were performed during conformal avoidance helical tomotherapy treatment of an anal cancer patient to verify adequate delivery of skin dose and sparing of OARs.

  15. Standardized verification of fuel cycle modeling

    DOE PAGES

    Feng, B.; Dixon, B.; Sunny, E.; Cuadra, A.; Jacobson, J.; Brown, N. R.; Powers, J.; Worrall, A.; Passerini, S.; Gregg, R.

    2016-04-05

    A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis code matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.
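
    The spreadsheet cross-check described above amounts to a transparent year-by-year recomputation of mass flows and facility profiles. The toy Python sketch below illustrates that kind of hand-verifiable transition calculation and a tolerance-based comparison between two codes' outputs; fleet sizes, build rates and fuel demands are invented, not the study's test-case specification.

    ```python
    # A toy year-by-year LWR-to-SFR transition calculation in the spirit of the
    # verification spreadsheets. All parameters below are hypothetical.
    YEARS = range(2020, 2101)
    LWR_RETIRE_PER_YEAR = 2        # LWRs retiring annually
    SFR_BUILD_PER_YEAR = 2         # SFRs replacing them
    LWR_FUEL_T_PER_UNIT = 20.0     # tonnes enriched U per LWR-year
    SFR_FUEL_T_PER_UNIT = 8.0      # tonnes TRU-bearing fuel per SFR-year

    def fleet_profile():
        """Yield (year, n_lwr, n_sfr, lwr_fuel_t, sfr_fuel_t) for each year."""
        n_lwr, n_sfr = 100, 0
        for year in YEARS:
            yield (year, n_lwr, n_sfr,
                   n_lwr * LWR_FUEL_T_PER_UNIT, n_sfr * SFR_FUEL_T_PER_UNIT)
            n_lwr = max(0, n_lwr - LWR_RETIRE_PER_YEAR)
            n_sfr += SFR_BUILD_PER_YEAR

    def compare(series_a, series_b, tol=1e-6):
        """Flag years where two codes' LWR fuel flows disagree beyond a tolerance."""
        return [(ya[0], ya[3], yb[3]) for ya, yb in zip(series_a, series_b)
                if abs(ya[3] - yb[3]) > tol * max(1.0, abs(ya[3]))]

    ref = list(fleet_profile())
    print(compare(ref, ref))   # identical inputs -> no discrepancies: []
    ```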

  16. New method for detection of complex 3D fracture motion - Verification of an optical motion analysis system for biomechanical studies

    PubMed Central

    2012-01-01

    Background Fracture-healing depends on interfragmentary motion. For improved osteosynthesis and fracture-healing, the micromotion between fracture fragments is undergoing intensive research. The detection of 3D micromotions at the fracture gap still presents a challenge for conventional tactile measurement systems. Optical measurement systems may be easier to use than conventional systems, but, as yet, cannot guarantee accuracy. The purpose of this study was to validate the optical measurement system PONTOS 5M for use in biomechanical research, including measurement of micromotion. Methods A standardized transverse fracture model was created to detect interfragmentary motions under axial loadings of up to 200 N. Measurements were performed using the optical measurement system and compared with a conventional high-accuracy tactile system consisting of 3 standard digital dial indicators (1 μm resolution; 5 μm error limit). Results We found that the deviation in mean average motion detection between the systems was at most 5.3 μm, indicating that detection of micromotion was possible with the optical measurement system. Furthermore, the optical measurement system showed two considerable advantages: only with the optical system could interfragmentary motion be analyzed directly at the fracture gap, and calibration of the optical system could be performed faster, more safely and more easily than that of the tactile system. Conclusion The PONTOS 5M optical measurement system appears to be a favorable alternative to previously used tactile measurement systems for biomechanical applications. Easy handling, combined with high accuracy for 3D detection of micromotions (≤5 μm), suggests the likelihood of high user acceptance. This study was performed in the context of the deployment of a new implant (dynamic locking screw; Synthes, Oberdorf, Switzerland). PMID:22405047

  17. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building lying in the path of the plume can be modeled, or a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR properly executes all algorithms and transfers data. Hand calculations were also performed to ensure proper application of the methodologies.
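
    For context, a building-free Gaussian plume estimate of downwind concentration at user-specified distances looks like the sketch below. This is a generic textbook formulation with assumed dispersion coefficients, not VENTSAR's building-effects algorithm, and is the kind of result a hand calculation would spot-check.

    ```python
    # Generic ground-reflected Gaussian plume concentration estimate.
    # The sigma fits below are rough, assumed stability-class values.
    import math

    def concentration(q, u, x, y=0.0, z=0.0, h=10.0):
        """Concentration at (x, y, z) downwind of a release of rate q
        (quantity/s) at effective height h (m) with wind speed u (m/s)."""
        sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)   # assumed dispersion fit
        sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)   # assumed dispersion fit
        lateral = math.exp(-y**2 / (2 * sigma_y**2))
        vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                    + math.exp(-(z + h)**2 / (2 * sigma_z**2)))  # ground reflection
        return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

    # Concentrations at incremental downwind distances, as in the code's output.
    for x in (50, 100, 200, 400):
        print(x, "m:", concentration(q=1.0, u=3.0, x=x))
    ```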

  18. 78 FR 68461 - Guidance for Industry: Studies To Evaluate the Utility of Anti-Salmonella Chemical Food Additives...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-14

    ... Anti- Salmonella Chemical Food Additives in Feeds; Request for Comments AGENCY: Food and Drug... Chemical Food Additives in Feeds,'' and is seeking comments on this guidance before revisions are made... Guidance for Industry: Studies to Evaluate the Utility of Anti-Salmonella Chemical Food Additives in...

  19. Verification of Legal Knowledge-base with Conflictive Concept

    NASA Astrophysics Data System (ADS)

    Hagiwara, Shingo; Tojo, Satoshi

    In this paper, we propose a verification methodology for large-scale legal knowledge. With a revision of a legal code, we are forced to also revise other affected code to keep the law consistent. Thus, our task is to revise the affected area properly and to investigate its adequacy. In this study, we extend the notion of inconsistency beyond ordinary logical inconsistency to include conceptual conflicts. We obtain these conflicts from taxonomy data, and thus can avoid tedious manual declaration of opposing terms. In the verification process, we adopt extended disjunctive logic programming (EDLP) to tolerate multiple consequences for a given set of antecedents. In addition, we employ abductive logic programming (ALP), regarding the situations to which the rules are applied as premises. Also, we restrict the legal knowledge-base to an acyclic program to avoid circular definitions and to justify the relevance of verdicts; detecting cyclic parts of the legal knowledge is therefore one of our objectives. The system is composed of two subsystems: a preprocessor implemented in Ruby to facilitate string manipulation, and a verifier implemented in Prolog to exert the logical inference. We also employ the XML format in the system to retain readability. In this study, we verify actual ordinances of Toyama prefecture and present the experimental results.
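
    One concrete piece of the methodology, restricting the knowledge-base to an acyclic program, reduces to cycle detection over the rule dependency graph (head predicate pointing to the predicates in its body). A minimal sketch follows; the three-rule graph is invented, and the actual system processes EDLP rules in Prolog rather than a Python dictionary.

    ```python
    # Depth-first search for a cycle in a rule dependency graph.
    def find_cycle(graph):
        """Return one cycle as a list of nodes, or None if the graph is acyclic."""
        WHITE, GRAY, BLACK = 0, 1, 2
        color = {v: WHITE for v in graph}
        stack = []

        def dfs(v):
            color[v] = GRAY
            stack.append(v)
            for w in graph.get(v, ()):
                if color.get(w, WHITE) == GRAY:           # back edge: cycle found
                    return stack[stack.index(w):] + [w]
                if color.get(w, WHITE) == WHITE:
                    found = dfs(w)
                    if found:
                        return found
            stack.pop()
            color[v] = BLACK
            return None

        for v in list(graph):
            if color[v] == WHITE:
                cycle = dfs(v)
                if cycle:
                    return cycle
        return None

    # A circular chain of definitions (hypothetical predicates).
    rules = {"permitted": ["licensed"], "licensed": ["registered"],
             "registered": ["permitted"]}
    print(find_cycle(rules))  # ['permitted', 'licensed', 'registered', 'permitted']
    ```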

  20. Beyond the Call of Duty: A Qualitative Study of Teachers' Additional Responsibilities Related to Sexuality Education

    ERIC Educational Resources Information Center

    Eisenberg, Marla E.; Madsen, Nikki; Oliphant, Jennifer A.; Resnick, Michael

    2011-01-01

    Seven focus groups were conducted with sexuality educators in Minnesota to explore ways that teaching sexuality education differs from teaching other health education content and to determine if additional supports or resources are needed for sexuality educators. Teachers described many specific additional responsibilities or concerns related to…

  1. Real time bolt preload monitoring using piezoceramic transducers and time reversal technique—a numerical study with experimental verification

    NASA Astrophysics Data System (ADS)

    Parvasi, Seyed Mohammad; Ho, Siu Chun Michael; Kong, Qingzhao; Mousavi, Reza; Song, Gangbing

    2016-08-01

    Bolted joints are ubiquitous structural elements and form critical connections in mechanical and civil structures. As such, loosened bolted joints may lead to catastrophic failures of these structures, inspiring a growing interest in the monitoring of bolted joints. A novel energy-based wave method is proposed in this study to monitor the axial load of bolted joint connections. In this method, the time reversal technique was used to focus the energy of a piezoelectric (PZT)-generated ultrasound wave from one side of the interface to be measured as a signal peak by another PZT transducer on the other side of the interface. A tightness index (TI) was defined and used to correlate the peak amplitude to the bolt axial load. The TI bypasses the need for more complex signal processing required in other energy-based methods. A coupled electro-mechanical analysis with an elasto-plastic finite element method was used to simulate and analyze PZT-based ultrasonic wave propagation through the interface of two steel plates connected by a single nut and bolt connection. Numerical results, backed by experimental results from testing on a bolted connection between two steel plates, revealed that the peak amplitude of the focused signal increases as the bolt preload (torque level) increases, due to the enlarging true contact area of the steel plates. The amplitude of the focused peak saturates, and the TI reaches unity, as the bolt axial load reaches a threshold value. These conditions are associated with the maximum possible true contact area between the surfaces of the bolted connection.
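
    The tightness index described above normalizes the focused peak amplitude against its value at saturating preload. A sketch of that bookkeeping on synthetic focused signals follows; the amplitudes, torque levels and noise model are all invented for illustration.

    ```python
    # Tightness index (TI) from time-reversal-focused signal peaks.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1e-3, 2000)           # 1 ms record

    def focused_signal(amplitude):
        """Synthetic refocused wave packet at the receiving PZT, plus noise."""
        packet = amplitude * np.exp(-((t - 5e-4) / 5e-5) ** 2)
        return packet + 0.01 * rng.standard_normal(t.size)

    def tightness_index(signal, saturated_signal):
        """TI in [0, 1]: focused peak normalized by the saturated-preload peak."""
        peak = np.max(np.abs(signal))
        ref = np.max(np.abs(saturated_signal))
        return float(min(1.0, peak / ref))

    saturated = focused_signal(1.0)            # reference beyond the preload threshold
    for torque_nm, amp in [(10, 0.35), (20, 0.60), (40, 0.85), (60, 0.99)]:
        ti = tightness_index(focused_signal(amp), saturated)
        print(f"{torque_nm:3d} N·m -> TI = {ti:.2f}")
    ```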

  2. Addition reaction of alkyl radical to C60 fullerene: Density functional theory study

    NASA Astrophysics Data System (ADS)

    Tachikawa, Hiroto; Kawabata, Hiroshi

    2016-02-01

    Functionalized fullerenes are known as high-performance molecules. In this study, alkyl-functionalized fullerenes (denoted by R-C60) have been investigated by means of the density functional theory (DFT) method to elucidate the effects of functionalization on the electronic states of fullerene. The reaction mechanism of alkyl radicals with C60 was also investigated. The methyl, ethyl, propyl, and butyl radicals (denoted by n = 1-4, where n is the number of carbon atoms in the alkyl radical) were examined. The DFT calculations showed that the alkyl radical binds to a carbon atom of C60 at the on-top site, forming a strong C-C single bond. The binding energies of the alkyl radicals to C60 were distributed in the range of 31.8-35.1 kcal mol-1 at the CAM-B3LYP/6-311G(d,p) level. It was found that an activation barrier exists before alkyl addition; the barrier heights were calculated to be 2.1-2.8 kcal mol-1. The electronic states of the R-C60 complexes are discussed on the basis of the theoretical results.

  3. Experimental study of combustion characteristics of nanoscale metal and metal oxide additives in biofuel (ethanol).

    PubMed

    Jones, Matthew; Li, Calvin H; Afjeh, Abdollah; Peterson, Gp

    2011-01-01

    An experimental investigation of the combustion behavior of nano-aluminum (n-Al) and nano-aluminum oxide (n-Al2O3) particles stably suspended in biofuel (ethanol) as a secondary energy carrier was conducted. The heat of combustion (HoC) was studied using a modified static bomb calorimeter system. Combustion element composition and surface morphology were evaluated using a SEM/EDS system. N-Al and n-Al2O3 particles of 50- and 36-nm diameters, respectively, were utilized in this investigation. Combustion experiments were performed with volume fractions of 1, 3, 5, 7, and 10% for n-Al, and 0.5, 1, 3, and 5% for n-Al2O3. The results indicate that the amount of heat released from ethanol combustion increases almost linearly with n-Al concentration. N-Al volume fractions of 1 and 3% did not show enhancement in the average volumetric HoC, but higher volume fractions of 5, 7, and 10% increased the volumetric HoC by 5.82, 8.65, and 15.31%, respectively. N-Al2O3 and heavily passivated n-Al additives did not participate reactively in combustion, and there was no contribution from Al2O3 to the HoC in the tests. A combustion model utilizing the Chemical Equilibrium with Applications code was also run and shown to be in good agreement with the experimental results. PMID:21711760

  6. Meiofaunal and bacterial community response to diesel additions in a microcosm study.

    PubMed

    Lindgren, J Fredrik; Hassellöv, Ida-Maja; Dahllöf, Ingela

    2012-03-01

    Effects of low-PAH-containing diesel were studied in a 60-day microcosm experiment at PAH concentrations of 130, 1300 and 13,000 μg/kg sediment. Nutrient fluxes, potential nitrification and meiofaunal community composition were analysed at three time points. Changed ∑NOx fluxes indicated reduced sediment nitrification in the Medium and High treatments over time, in agreement with lowered potential nitrification rates in all treatments. Reduction in silicate and phosphate fluxes over time suggested severe effects on the activity of meiofauna. Reduced activity increased the anoxic sediment layer, which could have contributed to the changed ∑NOx fluxes. There were significant differences in meiofaunal community composition after 30 and 60 days in the Medium and High treatments. Changes were due to increasing numbers of harpacticoids and the foraminiferan group Rotaliina, as well as decreasing numbers of nematodes and the foraminiferan group Reophax. In spite of the low PAH level, small additions of this diesel can still have pronounced effects on meiofaunal and bacterial communities.

  7. A theoretical study of wave dispersion and thermal conduction for HMX/additive interfaces

    NASA Astrophysics Data System (ADS)

    Long, Yao; Chen, Jun

    2014-04-01

    The wave dispersion rule for non-uniform materials is useful for ultrasonic inspection and engine life prediction, and is also key to understanding the energy dissipation and thermal conduction properties of solid materials. On the basis of linear response theory and molecular dynamics, we derive a set of formulas for calculating the wave dispersion rate of interface systems, and study four kinds of interfaces inside plastic bonded explosives: HMX/{HMX, TATB, F2312, F2313} (HMX: octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine; TATB: 1,3,5-triamino-2,4,6-trinitrobenzene; F2312, F2313: fluoropolymers). The wave dispersion rate is obtained over a wide frequency range from kHz to PHz. We find that at low frequency the rate is proportional to the square of the frequency, while at high frequency the rate couples with the molecular vibration modes at the interface. Using these results, the thermal conductivities of HMX/additive interfaces are derived, and a physical model is built to describe the total thermal conductivity of mixture explosives, including HMX multi-particle systems and {TATB, F2312, F2313}-coated HMX.

  8. Age and gender dependent heart rate circadian model development and performance verification on the proarrhythmic drug case study

    PubMed Central

    2013-01-01

    Background There are two main reasons for drug withdrawals at the various levels of the development path: hepatic and cardiac toxicity. The latter is mainly connected with proarrhythmic potency and, according to present practice, is supposed to be recognized at the pre-clinical (in vitro and animal in vivo) or clinical level (human in vivo studies). There are, however, some limitations to all the above-mentioned methods, which have led to the introduction of novel in vitro-in vivo extrapolation methods. With the use of mathematical and statistical modelling implemented in silico, it is possible to translate in vitro findings into the human in vivo situation at the population level. Human physiology is influenced by many parameters, and one that needs to be properly accounted for is heart rate, which follows a circadian rhythm. We described this phenomenon statistically, which enabled improved assessment of drug proarrhythmic potency. Methods A publicly available data set describing the circadian changes of the heart rate of 18 healthy subjects, 5 males (average age 36, range 26-45) and 13 females (average age 34, range 20-50), was used for the heart rate model development. External validation was done with the use of a clinical research database containing heart rate measurements derived from 67 healthy subjects, 34 males and 33 females (average age 33, range 17-72). The developed heart rate model was then incorporated into the ToxComp platform to simulate the impact of circadian variation in heart rate on the QTc interval. The usability of the combined models was assessed with moxifloxacin (MOXI) as a model drug. Results The developed heart rate model fitted well both the training data set (RMSE = 128 ms and MAPE = 12.3%) and the validation data set (RMSE = 165 ms and MAPE = 17.1%). Simulations performed at the population level proved that the combination of the IVIVE platform and the population variability description
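
    The RMSE and MAPE figures quoted above are standard goodness-of-fit measures. For reference, a minimal computation over placeholder RR-interval data (the arrays below are not the study's data):

    ```python
    # Root-mean-square error and mean absolute percentage error.
    import numpy as np

    def rmse(observed, predicted):
        obs, pred = np.asarray(observed, float), np.asarray(predicted, float)
        return float(np.sqrt(np.mean((obs - pred) ** 2)))

    def mape(observed, predicted):
        obs, pred = np.asarray(observed, float), np.asarray(predicted, float)
        return float(np.mean(np.abs((obs - pred) / obs)) * 100)

    obs = [820, 890, 1010, 950, 780]    # RR intervals in ms (hypothetical)
    pred = [840, 860, 980, 990, 800]    # model predictions (hypothetical)
    print(f"RMSE = {rmse(obs, pred):.0f} ms, MAPE = {mape(obs, pred):.1f}%")
    ```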

  9. Automating engineering verification in ALMA subsystems

    NASA Astrophysics Data System (ADS)

    Ortiz, José; Castillo, Jorge

    2014-08-01

    The Atacama Large Millimeter/submillimeter Array is an interferometer comprising 66 individual high precision antennas located at over 5000 meters altitude in the north of Chile. Several complex electronic subsystems need to be meticulously tested at different stages of an antenna commissioning, both independently and when integrated together. The first subsystem integration takes place at the Operations Support Facilities (OSF), at an altitude of 3000 meters. The second integration occurs at the high altitude Array Operations Site (AOS), where combined performance with the Central Local Oscillator (CLO) and Correlator is also assessed. In addition, there are several other events requiring complete or partial verification of instrument specifications compliance, such as parts replacements, calibration, relocation within the AOS, preventive maintenance and troubleshooting due to poor performance in scientific observations. Restricted engineering time allocation and the constant pressure of minimizing downtime in a 24/7 astronomical observatory impose the need to complete (and report) the aforementioned verifications in the least possible time. Array-wide disturbances, such as global power interruptions and the following recovery, generate the added challenge of executing this checkout on multiple antenna elements at once. This paper presents the outcome of the automation of engineering verification setup, execution, notification and reporting in ALMA, and how these efforts have resulted in a dramatic reduction of both the time and the operator training required. Signal Path Connectivity (SPC) checkout is introduced as a notable case of such automation.

  10. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  11. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  12. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  13. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  14. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  15. Comparative study of trimethyl phosphite and trimethyl phosphate as electrolyte additives in lithium ion batteries

    NASA Astrophysics Data System (ADS)

    Yao, X. L.; Xie, S.; Chen, C. H.; Wang, Q. S.; Sun, J. H.; Li, Y. L.; Lu, S. X.

    Safety concerns about lithium ion batteries have been a key problem in their practical application. Trimethyl phosphite (TMP(i)) and trimethyl phosphate (TMP(a)) were used as electrolyte additives to improve the safety and electrochemical performance of lithium cells. Galvanostatic cell cycling, flammability tests and thermal stability measurements by means of an accelerated rate calorimeter (ARC) and a micro calorimeter were performed. It is found that both TMP(i) and TMP(a) reduce the flammability of the electrolyte. The TMP(i) additive not only enhances the thermal stability of the electrolyte, but also improves its electrochemical performance. The TMP(a) additive can improve the thermal stability of the electrolyte at the expense of some degradation of its electrochemical performance. Therefore, TMP(i) is a better flame retardant additive for the electrolyte than TMP(a).

  16. Fortification of yogurts with different antioxidant preservatives: A comparative study between natural and synthetic additives.

    PubMed

    Caleja, Cristina; Barros, Lillian; Antonio, Amilcar L; Carocho, Márcio; Oliveira, M Beatriz P P; Ferreira, Isabel C F R

    2016-11-01

    Consumers demand more and more so-called "natural" products and, therefore, the aim of this work was to compare the effects of natural versus synthetic antioxidant preservatives in yogurts. Matricaria recutita L. (chamomile) and Foeniculum vulgare Mill. (fennel) decoctions were tested as natural additives, while potassium sorbate (E202) was used as a synthetic additive. The fortification of yogurts with natural and synthetic antioxidants did not cause significant changes in the yogurt pH and nutritional value, in comparison with control samples (yogurt without any additive). However, the fortified yogurts showed higher antioxidant activity, mainly the yogurts with natural additives (and among these, the ones with the chamomile decoction). Overall, it can be concluded that plant decoctions can be used to develop novel yogurts, by replacing synthetic preservatives and improving the antioxidant properties of the final product, without changing the nutritional profile. PMID:27211646

  18. Chemostat Studies of TCE-Dehalogenating Anaerobic Consortia under Excess and Limited Electron Donor Addition

    NASA Astrophysics Data System (ADS)

    Semprini, L.; Azizian, M.; Green, J.; Mayer-Blackwell, K.; Spormann, A. M.

    2015-12-01

    Two cultures - the Victoria Strain (VS) and the Evanite Strain (EV), enriched with the organohalide-respiring bacterium Dehalococcoides mccartyi - were grown in chemostats for more than 4 years at a mean cell residence time of 50 days. The slow doubling rate represents growth likely experienced in the subsurface. The chemostats were fed formate as an electron donor and trichloroethene (TCE) as the terminal electron acceptor. Under excess formate conditions, stable operation was observed with respect to TCE transformation, steady-state hydrogen (H2) concentrations (40 nM), and the structure of the dehalogenating community. Both cultures completely transformed TCE to ethene, with minor amounts of vinyl chloride (VC) observed, along with acetate formation. When formate was limited, TCE was transformed incompletely to ethene (40-60%) and VC (60-40%), and H2 concentrations ranged from 1 to 3 nM. The acetate concentration dropped below detection. Batch kinetic studies of TCE transformation with chemostat-harvested cells found that transformation rates of c-DCE and VC were greatly reduced when the cells were grown with limited formate. Upon increasing formate addition to the chemostats, from limited to excess, essentially complete transformation of TCE to ethene was achieved. The increase in formate was associated with an increase in H2 concentration and the production of acetate. Results of batch kinetic tests showed increases in transformation rates for TCE and c-DCE by factors of 3.5 and 2.5, respectively, while VC rates increased by factors of 33 to 500 over a six-month period. Molecular analysis of chemostat samples is being performed to quantify changes in copy numbers of reductase genes and to determine whether shifts in the strains of Dehalococcoides mccartyi were responsible for the observed rate increases. The results demonstrate the importance of electron donor supply for successful in-situ remediation.

  19. Study of metal whiskers growth and mitigation technique using additive manufacturing

    NASA Astrophysics Data System (ADS)

    Gullapalli, Vikranth

    For years, the alloy of choice for electroplating electronic components has been tin-lead (Sn-Pb) alloy. However, legislation established in Europe on July 1, 2006, required significant lead (Pb) content reductions in electronic hardware due to its toxic nature. A popular alternative for coating electronic components is pure tin (Sn). However, pure tin has the tendency to spontaneously grow electrically conductive Sn whiskers during storage. A Sn whisker is usually a pure single-crystal tin filament or hair-like structure grown directly from the electroplated surface. Sn whiskers are highly conductive and can cause short circuits in electronic components, which is a very significant reliability problem. Damage caused by Sn whisker growth has been reported in very critical applications such as aircraft, spacecraft, satellites, and military weapons systems. Whiskers are also naturally very strong and are believed to grow from compressive stresses developed in the Sn coating during deposition or over time. The new directive, even though environmentally friendly, has placed all lead-free electronic devices at risk because of whisker growth in pure tin. Additionally, there has been interest in studying the nature of other metal whiskers, such as zinc (Zn) whiskers, and comparing their behavior to that of Sn whiskers. Zn whiskers can be found in the flooring of data centers; they can get inside electronic systems during equipment reorganization and movement and can also cause system failures. Even though metal whiskers have been a known reliability problem for several decades, to date there is no method that can eliminate their growth. This thesis gives further insight into the nature and behavior of Sn and Zn whisker growth, and recommends a novel manufacturing technique that has the potential to mitigate metal whisker growth and extend the life of many electronic devices.

  20. A study of the electrochemistry of nickel hydroxide electrodes with various additives

    NASA Astrophysics Data System (ADS)

    Zhu, Wen-Hua; Ke, Jia-Jun; Yu, Hong-Mei; Zhang, Deng-Jun

    Nickel composite electrodes (NCE) with various additives are prepared by a chemical impregnation method from nitrate solutions on sintered porous plaques. The electrochemical properties, such as utilization of active material, swelling, and the discharge potential of the nickel oxide electrode (NOE), are determined mainly by the composition of the active material and the characteristics of the nickel plaques. Most additives (Mg, Ca, Sr, Ba, Zn, Cd, Co, Li and Al hydroxide) exert effects on the discharge potential and swelling of the NOE. Chemical co-precipitation with the addition of calcium, zinc, magnesium or barium hydroxide increases the discharge potential by more than 20 mV, but zinc hydroxide results in an obvious decrease of active-material utilization, and calcium and magnesium hydroxide produce a larger increase of electrode thickness. The effects of anion additives are also examined. Less than 1 mol% of NiS in the active material increases the discharge potential. Cadmium, cobalt and zinc hydroxide are excellent additives for preventing swelling of the NCE. Slow voltammetry (0.2 mV s-1) in 6 M KOH is applied to characterize the oxygen-evolving potential of the NCE. The difference between the oxygen-evolution potential and the potential of the oxidation peak for the NCE with additives of calcium, lithium, barium or aluminium hydroxide is at least +60 mV.

  1. Analyzing personalized policies for online biometric verification.

    PubMed

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
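
    The decision rule at the core of the policy is a likelihood-ratio test over the acquired scores, with a second acquisition stage when the first is inconclusive. The sketch below illustrates only the mechanics, under a deliberately crude assumption of independent Gaussian score distributions; the paper fits a much richer joint model of all 12 scores, and every number here is invented.

    ```python
    # Likelihood-ratio verification decision with a two-stage escalation.
    import math

    # Per-modality score distributions (mean, std) under each hypothesis - assumed.
    GENUINE = (0.8, 0.10)
    IMPOSTER = (0.3, 0.15)

    def log_gauss(x, mu, sd):
        return -0.5 * math.log(2 * math.pi * sd * sd) - (x - mu) ** 2 / (2 * sd * sd)

    def log_likelihood_ratio(scores):
        """Sum of per-score log LRs: log P(scores|genuine) - log P(scores|imposter)."""
        return sum(log_gauss(s, *GENUINE) - log_gauss(s, *IMPOSTER) for s in scores)

    def decide(scores, accept_thresh=2.0, reject_thresh=-2.0):
        """Accept/reject when conclusive; otherwise escalate to a second stage."""
        llr = log_likelihood_ratio(scores)
        if llr >= accept_thresh:
            return "accept"
        if llr <= reject_thresh:
            return "reject"
        return "acquire more biometrics"

    print(decide([0.75, 0.82]))   # clearly genuine-looking scores -> accept
    print(decide([0.6]))          # borderline single score -> second stage
    ```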

  2. Case Study: Test Results of a Tool and Method for In-Flight, Adaptive Control System Verification on a NASA F-15 Flight Research Aircraft

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.; Schumann, Johann; Guenther, Kurt; Bosworth, John

    2006-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable autonomous flight control and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments [1-2]. At the present time, however, it is unknown how adaptive algorithms can be routinely verified, validated, and certified for use in safety-critical applications. Rigorous methods for adaptive software verification and validation must be developed to ensure that the control software functions as required and is highly safe and reliable. A large gap appears to exist between the point at which control system designers feel the verification process is complete and the point at which FAA certification officials agree it is complete. Certification of adaptive flight control software is complicated by the use of learning algorithms (e.g., neural networks) and degrees of system non-determinism. Of course, analytical efforts must be made in the verification process to place guarantees on learning algorithm stability, rate of convergence, and convergence accuracy. However, to satisfy FAA certification requirements, it must be demonstrated that the adaptive flight control system is also able to fail and still allow the aircraft to be flown safely or to land, while at the same time providing a means of crew notification of the (impending) failure. It was for this purpose that the NASA Ames Confidence Tool was developed [3]. This paper presents the Confidence Tool as a means of providing in-flight software assurance monitoring of an adaptive flight control system. The paper presents the data obtained from flight testing the tool on a specially modified F-15 aircraft designed to simulate loss of flight control surfaces.

  3. Biometric verification with correlation filters.

    PubMed

    Vijaya Kumar, B V K; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-10

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification. PMID:14735958
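
    The matching step in correlation-filter verification is a frequency-domain cross-correlation followed by a peak-sharpness score. A toy single-template illustration follows; advanced SDF/MACE filters instead synthesize the filter from many training images, so treat the random "template" and the peak-to-sidelobe scoring details below as assumptions.

    ```python
    # Frequency-domain cross-correlation and a peak-to-sidelobe ratio (PSR) score.
    import numpy as np

    def correlate(probe, template):
        """Circular cross-correlation via FFT (same-shape images assumed)."""
        F = np.fft.fft2(probe)
        H = np.conj(np.fft.fft2(template))
        return np.real(np.fft.ifft2(F * H))

    def peak_to_sidelobe(corr, guard=2):
        """Peak sharpness: (peak - sidelobe mean) / sidelobe std."""
        r, c = np.unravel_index(np.argmax(corr), corr.shape)
        mask = np.ones_like(corr, dtype=bool)
        mask[max(0, r - guard):r + guard + 1, max(0, c - guard):c + guard + 1] = False
        side = corr[mask]
        return (corr[r, c] - side.mean()) / side.std()

    rng = np.random.default_rng(1)
    template = rng.standard_normal((64, 64))
    # A genuine probe: shifted template plus noise; an imposter: unrelated image.
    genuine = np.roll(template, (3, -2), axis=(0, 1)) + 0.3 * rng.standard_normal((64, 64))
    imposter = rng.standard_normal((64, 64))
    print("genuine PSR :", round(peak_to_sidelobe(correlate(genuine, template)), 1))
    print("imposter PSR:", round(peak_to_sidelobe(correlate(imposter, template)), 1))
    ```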

  4. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  5. TPS verification with UUT simulation

    NASA Astrophysics Data System (ADS)

    Wang, Guohua; Meng, Xiaofeng; Zhao, Ruixian

    2006-11-01

    TPS (Test Program Set) verification, or first article acceptance testing, commonly depends on fault insertion experiments on the UUT (Unit Under Test). However, the set of failure modes that can be injected on a UUT is limited, and injection is almost infeasible when the UUT is in development or in a distributed state. To resolve this problem, a TPS verification method based on UUT interface signal simulation is put forward. Interoperability between the ATS (automatic test system) and the UUT simulation platform is very important to realize automatic TPS verification. After analyzing the ATS software architecture, the approach to realize interoperability between the ATS software and the UUT simulation platform is proposed. The UUT simulation platform software architecture is then proposed based on the ATS software architecture. The hardware composition and software architecture of the UUT simulation are described in detail. The UUT simulation platform has been applied in avionics equipment TPS development, debugging and verification.

  6. Extension and validation of an analytical model for in vivo PET verification of proton therapy—a phantom and clinical study

    NASA Astrophysics Data System (ADS)

    Attanasi, F.; Knopf, A.; Parodi, K.; Paganetti, H.; Bortfeld, T.; Rosso, V.; Del Guerra, A.

    2011-08-01

    The interest in positron emission tomography (PET) as a tool for treatment verification in proton therapy has become widespread in recent years, and several research groups worldwide are currently investigating the clinical implementation. After the first off-line investigation with a PET/CT scanner at MGH (Boston, USA), attention is now focused on an in-room PET application immediately after treatment in order to also detect shorter-lived isotopes, such as O15 and N13, minimizing isotope washout and avoiding patient repositioning errors. Clinical trials are being conducted by means of commercially available PET systems, and other tests are planned using application-dedicated tomographs. Parallel to the experimental investigation and new hardware development, great interest has been shown in the development of fast procedures to provide feedback regarding the delivered dose from reconstructed PET images. Since the thresholds of inelastic nuclear reactions leading to tissue β+-activation fall within the energy range of 15-20 MeV, the distal activity fall-off is correlated, but not directly matched, to the distal fall-off of the dose distribution. Moreover, the physical interactions leading to β+-activation and energy deposition are of a different nature. All these facts make it essential to further develop accurate and fast methodologies capable of predicting, on the basis of the planned dose distribution, expected PET images to be compared with actual PET measurements, thus providing clinical feedback on the correctness of the dose delivery and of the irradiation field position. The aim of this study has been to validate an analytical model and to implement and evaluate it in a fast and flexible framework able to locally predict such activity distributions directly taking the reference planning CT and planned dose as inputs. The results achieved in this study for phantoms and clinical cases highlighted the potential of the implemented method to predict expected
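
    The forward problem can be caricatured in one dimension: derive an activity profile from the planned dose, with beta+ production cut off a few millimeters before the distal dose fall-off (reflecting the 15-20 MeV reaction thresholds), then blur by an assumed scanner resolution. All shapes and numbers in the sketch below are invented; the paper's analytical model works from reaction cross sections and the planning CT rather than this cartoon.

    ```python
    # 1D cartoon: predicted depth-activity profile from a planned depth-dose curve.
    import numpy as np

    z = np.linspace(0, 160, 801)                    # depth (mm)
    range_mm = 120.0                                # proton range (assumed)
    # Crude monotone rise toward the Bragg peak, zero beyond the range.
    dose = np.where(z < range_mm, 0.4 + 0.6 * (z / range_mm) ** 4, 0.0)

    # Beta+ production needs ~15-20 MeV protons, so it stops a few mm short
    # of the range; represent that with an earlier cutoff (assumed 6 mm).
    activity = np.where(z < range_mm - 6.0, 0.8 * dose, 0.0)

    # Blur by the scanner point-spread function (assumed 5 mm FWHM Gaussian).
    sigma = 5.0 / 2.355
    dz = z[1] - z[0]
    kz = np.arange(-25.0, 25.0 + dz, dz)
    kernel = np.exp(-kz**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    predicted_pet = np.convolve(activity, kernel, mode="same")

    def distal_edge(profile):
        """Depth of the last point above 50% of the profile maximum."""
        return z[np.nonzero(profile > 0.5 * profile.max())[0].max()]

    # The offset between these edges is exactly what prediction-vs-measurement
    # comparisons must model rather than assume away.
    print("dose distal 50% edge    :", round(float(distal_edge(dose)), 1), "mm")
    print("activity distal 50% edge:", round(float(distal_edge(predicted_pet)), 1), "mm")
    ```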

  8. Maltreated children's representations of mother and an additional caregiver: a longitudinal study.

    PubMed

    Manashko, Shany; Besser, Avi; Priel, Beatriz

    2009-04-01

    In the current longitudinal investigation, we explored the continuity of and changes in the mental representations of the mother and an additional caregiver among forty-five 9- to 11-year-old children who had been severely maltreated and subsequently placed in long-term residential care as well as the relationships between the content and structure of these representations and teacher's assessments of the child's externalizing and internalizing symptoms. At Time 1, a nonmaltreated comparison group was assessed concomitantly. Compared to nonmaltreated children, maltreated children scored higher for externalizing and internalizing symptoms, and their maternal representations were found to be significantly less benevolent and integrated and more punitive. In addition, among the maltreated children, the additional caregiver representations were found to be more benevolent and integrated, and less punitive, than the maternal representations. After 30 months, the maltreated children's levels of externalizing and internalizing symptoms diminished, their maternal representations become more benevolent and less punitive, and the additional caregiver representations became less benevolent. Moreover, the Benevolence of the additional caregiver representation was found to predict these children's changes in externalizing symptoms beyond the effects of their symptomatology and its associations with the Benevolence of these representations at Time 1. PMID:19220720

  9. Using Small-Step Refinement for Algorithm Verification in Computer Science Education

    ERIC Educational Resources Information Center

    Simic, Danijela

    2015-01-01

    Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyse similar…

  10. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... Service. SUMMARY: We provide limited fee-based Social Security number (SSN) verification service to...) and our regulation at 20 CFR 401.100, establish the legal authority for us to provide SSN... addition to the benefit of providing high volume, centralized SSN verification services to the...

  11. Development of advanced seal verification

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kosten, Susan E.; Abushagur, Mustafa A.

    1992-01-01

    The purpose of this research is to develop a technique to monitor and ensure seal integrity with a sensor that has no active elements to burn out during a long-duration activity, such as a leakage test or, especially, a mission in space. The original concept is that by implementing fiber optic sensors, changes in the integrity of a seal can be monitored in real time, and at no time should the optical fiber sensor itself fail. The electrical components which provide optical excitation and detection through the fiber are not part of the seal; hence, if these electrical components fail, they can be easily changed without breaking the seal. The optical connections required for the concept to work do present a functional problem to work out. The utility of the optical fiber sensor for seal monitoring should be general enough that the degradation of a seal can be determined before catastrophic failure occurs and appropriate action taken. Two parallel efforts were performed in determining the feasibility of using optical fiber sensors for seal verification. One study investigated interferometric measurement of the mechanical response of the optical fiber sensors to seal integrity. In a second study, the optical fiber was integrated into a typical vacuum chamber and feasibility studies on microbend experiments in the vacuum chamber were performed. An attempt was also made to quantify the amount of pressure actually being applied to the optical fiber, using finite element analysis software by Algor.

  12. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines.

    PubMed

    Gansäuer, Andreas; Seddiqzai, Meriam; Dahmen, Tobias; Sure, Rebecca; Grimme, Stefan

    2013-01-01

    The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results, with deviations for the free activation barrier compared to the experimental values of only about 0.5 kcal mol-1, and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction, where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial: methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG‡ and ΔG_R) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically.
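
    For readers connecting the computed barriers to rate constants: the translation is the Eyring equation, k = (k_B T / h) exp(-ΔG‡ / RT). A quick numeric check of how the quoted ~0.5 kcal/mol accuracy propagates into k; the 8 kcal/mol barrier used below is hypothetical, not a value from the paper.

    ```python
    # Eyring equation: first-order rate constant from a free activation barrier.
    import math

    KB = 1.380649e-23    # Boltzmann constant, J/K
    H = 6.62607015e-34   # Planck constant, J*s
    R = 1.987204e-3      # gas constant, kcal/(mol*K)

    def eyring_k(dg_kcal, T=298.15):
        """Rate constant (1/s) for barrier dg_kcal (kcal/mol) at temperature T."""
        return KB * T / H * math.exp(-dg_kcal / (R * T))

    for dg in (8.0, 8.5):   # hypothetical barrier, and barrier + 0.5 kcal/mol error
        print(f"dG = {dg:4.1f} kcal/mol -> k = {eyring_k(dg):.2e} 1/s")
    # A 0.5 kcal/mol shift changes k by exp(0.5/RT), about a factor of 2.3 at 298 K.
    ```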

  13. Effect of stabilizing additives on the structure and hydration of proteins: a study involving monoclinic lysozyme.

    PubMed

    Saraswathi, N T; Sankaranarayanan, R; Vijayan, M

    2002-07-01

    In pursuance of a long-range programme on the hydration, mobility and action of proteins, the structural basis of the stabilizing effect of sugars and polyols is being investigated. With two crystallographically independent molecules with slightly different packing environments in the crystal, monoclinic lysozyme constitutes an ideal system for exploring the problem. The differences in the structure and hydration of the two molecules provide a framework for examining the changes caused by stabilizing additives. Monoclinic crystals were grown under native conditions and also in the presence of 10% sucrose, 15% trehalose, 10% trehalose, 10% sorbitol and 5% glycerol. The crystal structures were refined at resolutions ranging from 1.8 to 2.1 Å. The average B values, and hence the mobility of the structure, are lower in the presence of additives than in the native crystals. However, a comparison of the structures indicates that the effect of the additives on the structure and the hydration shell around the protein molecule is considerably less than that caused by differences in packing. It is also less than that caused by the replacement of NaNO3 by NaCl as the precipitant in the crystallization experiments. This result is not in conformity with the commonly held belief that additives exert their stabilizing effect through reorganization of the hydration shell, at least as far as the ordered water molecules are concerned.

  14. Studies on the Food Additive Propyl Gallate: Synthesis, Structural Characterization, and Evaluation of the Antioxidant Activity

    ERIC Educational Resources Information Center

    Garrido, Jorge; Garrido, E. Manuela; Borges, Fernanda

    2012-01-01

    Antioxidants are additives largely used in industry for delaying, retarding, or preventing the development of oxidative deterioration. Propyl gallate (E310) is a phenolic antioxidant extensively used in the food, cosmetics, and pharmaceutical industries. A series of lab experiments have been developed to teach students about the importance and…

  15. Vector generalized additive models for extreme rainfall data analysis (study case rainfall data in Indramayu)

    NASA Astrophysics Data System (ADS)

    Utami, Eka Putri Nur; Wigena, Aji Hamim; Djuraidah, Anik

    2016-02-01

    Rainfall patterns are good indicators of potential disasters. Global Circulation Models (GCMs) contain global-scale information that can be used to predict rainfall. Statistical downscaling (SD) utilizes this global-scale information to make inferences at the local scale; essentially, SD can be used to predict local-scale variables from global-scale variables. SD requires a method that accommodates nonlinear effects and extreme values. Extreme Value Theory (EVT) can be used to analyze the extreme values. One method of identifying extreme events is peaks over threshold, whose exceedances follow the Generalized Pareto Distribution (GPD). The vector generalized additive model (VGAM) is an extension of the generalized additive model that accommodates linear or nonlinear effects through more than one additive predictor; its advantage is the ability to handle multi-response models. The key ideas of VGAM are iteratively reweighted least squares for maximum likelihood estimation, penalized smoothing, Fisher scoring, and additive models. This work aims to analyze extreme rainfall data in Indramayu using VGAM. The results show that VGAM with the GPD is able to predict extreme rainfall data accurately; the prediction for February is very close to the actual value at the 75th quantile.
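
    Since the analysis leans on the peaks-over-threshold/GPD machinery, a minimal sketch of that step follows (synthetic data; the threshold choice and the scipy-based fit are illustrative assumptions, and the VGAM estimation itself is not reproduced here):

      import numpy as np
      from scipy.stats import genpareto

      rng = np.random.default_rng(0)
      rainfall = rng.gamma(shape=0.8, scale=12.0, size=5000)  # synthetic daily rainfall (mm)

      threshold = np.quantile(rainfall, 0.95)          # peaks-over-threshold cutoff
      excesses = rainfall[rainfall > threshold] - threshold

      # Fit a Generalized Pareto Distribution to the exceedances (location fixed at 0).
      shape, _, scale = genpareto.fit(excesses, floc=0.0)

      # A conditional quantile of the exceedance distribution, e.g. the 75th:
      q75 = threshold + genpareto.ppf(0.75, shape, loc=0.0, scale=scale)
      print(shape, scale, q75)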

  16. Teaching Young Children Decomposition Strategies to Solve Addition Problems: An Experimental Study

    ERIC Educational Resources Information Center

    Cheng, Zi-Juan

    2012-01-01

    The ability to count has traditionally been considered an important milestone in children's development of number sense. However, using counting (e.g., counting on, counting all) strategies to solve addition problems is not the best way for children to achieve their full mathematical potential and to prepare them to develop more complex and…

  17. Study on automatic optical element addition or deletion in lens optimization

    NASA Astrophysics Data System (ADS)

    Cheng, Xuemin; Wang, Yongtian; Hao, Qun

    2002-09-01

    Two lens form parameters, quantifying the symmetry of the optical system and the optical power distribution among the individual lens elements, are used as the criteria for automatic element addition or deletion in lens optimization. The scheme based on the criteria is described in this paper. Design examples are provided, which demonstrate that the scheme is practicable.

  18. Stationary spiraling eddies in presence of polar amplification of global warming as a governing factor of ecology of Greenland seals White Sea population: results of verification study

    NASA Astrophysics Data System (ADS)

    Melentyev, K.; Chernook, V.; Melentyev, V.

    2003-04-01

    Ice-associated marine mammals represent a high level of the food chain in the ocean, and estimating population numbers for different groups, as well as assessing their ecology and welfare, are important tasks for marine biology, ecology, fisheries and other applications. Global warming and anthropogenic impacts on marine and coastal ecosystems create many problems. To investigate the ice-covered Arctic Ocean and chart the number of seals, annual inspections were performed onboard the PINRO research aircraft "Arktika". Multi-spectral airborne and satellite observations were carried out regularly from the Barents and White Seas to the Bering and Okhotsk Seas (1996-2002). The contemporary status of different groups of sea mammals was evaluated, with the numbers of adults and pups checked separately. In situ observations were supported by helicopter and icebreaker for the gathering of water samples and ice cores (with subsequent biochemical and toxicological analysis). A prevailing part of the life cycle of the Greenland (harp) seal depends strongly on winter hydrology (water masses, stable currents, meandering fronts, stationary eddies) and is closely connected with the type of ice (pack, fast ice) and other ice parameters (age, origin, salinity, ice edge). First-year ice floes, which have specific properties and distinctive features, are used by harp seals for pupping, lactation, molting, pairing and resting. Ringed seals, conversely, use only fast ice for the corresponding purposes. Different aspects of the ecology and migration of harp seals were analyzed in the framework of a verification study. The influence of winter severity and wind regime was revealed, but the stationary eddies in the White Sea are the most effective governing factor (a novelty). The following relationships between "eddies and the ecology of the Greenland seal White Sea population" will be discussed: A) regularities of eddy formation and their spatial arrangement, temporal (seasonal and annual

  1. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks to pose and to answer these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made require neither the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop efficient numerical tools that are necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this fact by analyzing the data of an experiment where photonic entangled two-photon states were generated and whose entanglement is verified with the use of an accessible nonlinear witness.

  2. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named ‘Industrial Methodology for Process Verification in Research’ or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by ‘crowd-sourcing’ to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  3. Formal verification of human-automation interaction

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Heymann, Michael

    2002-01-01

    This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.
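
    One way to make the idea concrete is a toy product-model check: walk the machine's transitions and flag events where the operator's abstracted interface model predicts a different abstract state than the machine actually reaches. The three-state machine below is invented for illustration and is not the paper's autopilot case study:

      from collections import deque

      # Toy machine with a hidden 'armed' mode, the operator's two-state
      # interface model, and the abstraction map between them (all invented).
      machine = {"off": {"press": "armed"}, "armed": {"press": "on"}, "on": {"press": "off"}}
      interface = {"OFF": {"press": "ON"}, "ON": {"press": "OFF"}}
      abstraction = {"off": "OFF", "armed": "OFF", "on": "ON"}

      def mode_confusions(start="off"):
          """BFS over machine states; report events whose outcome the interface model mispredicts."""
          seen, queue, confusions = {start}, deque([start]), []
          while queue:
              state = queue.popleft()
              for event, nxt in machine[state].items():
                  predicted = interface[abstraction[state]].get(event, abstraction[state])
                  if abstraction[nxt] != predicted:
                      confusions.append((state, event, nxt))  # operator would mispredict here
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(nxt)
          return confusions

      # [('off', 'press', 'armed')]: the operator expects ON, but the machine is merely armed.
      print(mode_confusions())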

  4. A laboratory study of the perceived benefit of additional noise attenuation by houses

    NASA Technical Reports Server (NTRS)

    Flindell, I. H.

    1983-01-01

    Two experiments were conducted to investigate the perceived benefit of additional house attenuation against aircraft flyover noise. First, subjects made annoyance judgments in a simulated living room while an operative window with real and dummy storm windows was manipulated in full view of those subjects. Second, subjects made annoyance judgments in an anechoic audiometric test chamber of frequency-shaped noise signals whose spectra closely matched those of the aircraft flyover noises reproduced in the first experiment. These stimuli represented the aircraft flyover noises in level and spectrum but without the situational and visual cues present in the simulated living room. Perceptual constancy theory implies that annoyance tends to remain constant despite reductions in noise level caused by additional attenuation of which the subjects are fully aware. This theory was supported when account was taken of a reported annoyance overestimation for certain spectra and of a simulated-condition cue overreaction.

  5. Thiopeptin, a New Feed-Additive Antibiotic: Biological Studies and Field Trials

    PubMed Central

    Mine, K.; Miyairi, N.; Takano, N.; Mori, S.; Watanabe, N.

    1972-01-01

    Thiopeptin is a new antibiotic, produced by Streptomyces tateyamensis and developed solely for animal use as a feed additive. The antibiotic content in animal tissue and feed was assayed in terms of the antimicrobial activity against Mycoplasma laidlawii A. This antibiotic was found to be relatively nontoxic in rats and mice. In chickens, this antibiotic is excreted into feces within 48 hr of administration and is not absorbed into tissue. It is well tolerated in both broilers and swine and is highly stable in animal feed. Thiopeptin-supplemented feed contributes to the improvement of weight gain and feed efficiency in chickens and swine, and of egg performance in layers. Thus, thiopeptin, when used as a feed additive, is quite suitable for supplementing animal nutrition. PMID:4680812

  6. Structural changes in gluten protein structure after addition of emulsifier. A Raman spectroscopy study

    NASA Astrophysics Data System (ADS)

    Ferrer, Evelina G.; Gómez, Analía V.; Añón, María C.; Puppo, María C.

    2011-06-01

    Gluten protein, a food protein product, was chemically modified by varying levels of sodium stearoyl lactylate (SSL), and the extent of modification of its secondary and tertiary structures was analyzed using Raman spectroscopy. Analysis of the amide I band showed an increase in its intensity, mainly after the addition of 0.25% SSL to wheat flour to produce modified gluten protein, pointing to the formation of a more ordered structure. Side-chain vibrations also confirmed the observed changes.

  7. Magnetic Force Microscopy Study of Zr2Co11-Based Nanocrystalline Materials: Effect of Mo Addition

    DOE PAGES

    Yue, Lanping; Jin, Yunlong; Zhang, Wenyong; Sellmyer, David J.

    2015-01-01

    The addition of molybdenum was used to modify the nanostructure and enhance the coercivity of rare-earth-free Zr2Co11-based nanocrystalline permanent magnets. The effect of Mo addition on the magnetic domain structures of melt-spun nanocrystalline Zr16Co84-xMox (x = 0, 0.5, 1, 1.5, and 2.0) ribbons has been investigated. It was found that the magnetic properties and local domain structures are strongly influenced by Mo doping. The coercivity of the samples increases with increasing Mo content (x ≤ 1.5). The maximum energy product (BH)max increases with increasing x, from 0.5 MGOe for x = 0 to a maximum value of 4.2 MGOe for x = 1.5. The smallest domain size, with a relatively short magnetic correlation length of 128 nm, and the largest root-mean-square phase shift Φrms value of 0.66° are observed for x = 1.5. The optimal Mo addition promotes magnetic domain structure refinement and thus leads to a significant increase in coercivity and energy product in this sample.

  8. Load bearing and stiffness tailored NiTi implants produced by additive manufacturing: a simulation study

    NASA Astrophysics Data System (ADS)

    Rahmanian, Rasool; Shayesteh Moghaddam, Narges; Haberland, Christoph; Dean, David; Miller, Michael; Elahinia, Mohammad

    2014-03-01

    Common metals for stable long-term implants (e.g. stainless steel, titanium and titanium alloys) are much stiffer than spongy cancellous bone and even stiffer than cortical bone. When bone and implant are loaded, this stiffness mismatch results in stress shielding and, as a consequence, degradation of the surrounding bony structure can lead to disassociation of the implant. Due to its lower stiffness and high reversible deformability, which is associated with its superelastic behavior, NiTi is an attractive biomaterial for load-bearing implants. However, while the stiffness of austenitic Nitinol is closer to that of bone, it is still too high. Additive manufacturing provides, in addition to the fabrication of patient-specific implants, the ability to solve the stiffness mismatch by adding engineered porosity to the implant. This in turn allows for the design of different stiffness profiles in one implant, tailored to the physiological load conditions. This work covers a fundamental approach to bring this vision to reality. First, modeling of the mechanical behavior of different scaffold designs is presented as a proof of concept of stiffness tailoring. Based on these results, different Nitinol scaffolds can be produced by additive manufacturing.
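
    The stiffness-versus-porosity trade the authors exploit can be roughed out with the Gibson-Ashby scaling law for open-cell lattices, E ≈ C·Es·(ρ/ρs)², a common first estimate for such scaffolds. The prefactor, solid modulus and densities below are assumptions for illustration, not the paper's values:

      def gibson_ashby_modulus(e_solid_gpa, relative_density, c=1.0):
          """Effective Young's modulus of an open-cell lattice (Gibson-Ashby):
          E = C * Es * (rho/rho_s)**2, with a geometry-dependent prefactor C ~ 1."""
          return c * e_solid_gpa * relative_density ** 2

      e_niti = 75.0  # GPa, a typical literature value for austenitic NiTi (assumed)
      for rel_density in (1.0, 0.7, 0.5, 0.35):
          print(rel_density, gibson_ashby_modulus(e_niti, rel_density))
      # A relative density near 0.35 brings the modulus down toward
      # cortical bone (roughly 10-20 GPa), which is the point of the scaffolds.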

  9. On code verification of RANS solvers

    NASA Astrophysics Data System (ADS)

    Eça, L.; Klaij, C. M.; Vaz, G.; Hoekstra, M.; Pereira, F. S.

    2016-04-01

    This article discusses Code Verification of Reynolds-Averaged Navier-Stokes (RANS) solvers that rely on face-based finite volume discretizations for volumes of arbitrary shape. The study includes test cases with known analytical solutions (generated with the method of manufactured solutions) corresponding to laminar and turbulent flow, the latter using eddy-viscosity turbulence models. The procedure to perform Code Verification based on grid refinement studies is discussed, and the requirements for its correct application are illustrated in a simple one-dimensional problem. It is shown that geometrically similar grids are recommended for proper Code Verification; the resulting data should then be free of scatter, making least-squares fits unnecessary. Results show that it may be advantageous to determine the extrapolated error at zero cell size/time step instead of assuming that it is zero, especially when it is hard to determine the asymptotic order of grid convergence. In the RANS examples, several features of the ReFRESCO solver are checked, including the effects of the available turbulence models on the convergence properties of the code. It is shown that non-orthogonality effects must be accounted for in the discretization of the diffusion terms, and that the turbulence-quantity transport equations can deteriorate the order of grid convergence of mean-flow quantities.
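
    The grid-refinement machinery referred to here reduces to two standard formulas: the observed order of convergence p = ln[(f3 − f2)/(f2 − f1)]/ln(r) from three geometrically similar grids, and a Richardson-style extrapolation to zero cell size. A minimal sketch with hypothetical solution values (not ReFRESCO output):

      import math

      def observed_order(f1, f2, f3, r):
          """Observed order of grid convergence from three geometrically similar
          grids (f1 finest, f3 coarsest, constant refinement ratio r).
          Assumes monotonic convergence, i.e. (f3 - f2)/(f2 - f1) > 0."""
          return math.log((f3 - f2) / (f2 - f1)) / math.log(r)

      def richardson_extrapolate(f1, f2, p, r):
          """Estimate of the solution at zero cell size."""
          return f1 + (f1 - f2) / (r ** p - 1.0)

      # Hypothetical drag-coefficient values on grids refined by r = 2:
      f1, f2, f3 = 0.02834, 0.02855, 0.02938
      p = observed_order(f1, f2, f3, 2.0)  # ~1.98, close to the formal order of 2
      print(p, richardson_extrapolate(f1, f2, p, 2.0))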

  10. Parameters and pitfalls to consider in the conduct of food additive research, Carrageenan as a case study.

    PubMed

    Weiner, Myra L

    2016-01-01

    This paper provides guidance on the conduct of new in vivo and in vitro studies on high molecular weight food additives, with carrageenan (CGN), the widely used food additive, as a case study. It is important to understand the physical/chemical properties and to verify the identity/purity, molecular weight and homogeneity/stability of the additive in the vehicle for oral delivery. The strong binding of CGN to protein in rodent chow or infant formula results in no gastrointestinal tract exposure to free CGN. It is recommended that doses of high-Mw, non-caloric, non-nutritive additives not exceed 5% by weight of total solid diet to avoid potential nutritional effects. Addition of some high-Mw additives at high concentrations to liquid nutritional supplements increases viscosity and may affect palatability, caloric intake and body weight gain. In in vitro studies, the use of well-characterized, relevant cell types and the appropriate composition of the culture media are necessary for proper conduct and interpretation. CGN is bound to media protein and not freely accessible to cells in vitro. Interpretation of new studies on food additives should consider the interaction of food additives with the vehicle components and the appropriateness of the animal or cell model and dose-response.

  11. Functions of social support and self-verification in association with loneliness, depression, and stress.

    PubMed

    Wright, Kevin B; King, Shawn; Rosenberg, Jenny

    2014-01-01

    This study investigated the influence of social support and self-verification on loneliness, depression, and stress among 477 college students. The authors propose and test a theoretical model using structural equation modeling. The results indicated empirical support for the model, with self-verification mediating the relation between social support and health outcomes. The results have implications for social support and self-verification research, which are discussed along with directions for future research and limitations of the study.

  12. Strategic Petroleum Reserve (SPR) additional geologic site characterization studies, Bryan Mound Salt Dome, Texas

    SciTech Connect

    Neal, J.T.; Magorian, T.R.; Ahmad, S.

    1994-11-01

    This report revises the original report that was published in 1980. Some of the topics covered in the earlier report were provisional, and it is now practicable to reexamine them using new or revised geotechnical data and data obtained from SPR cavern operations, which now involve 16 new caverns. Revised structure maps and sections show interpretative differences compared with the 1980 report and more definition in the dome shape and caprock structural contours, especially a major southeast-northwest trending anomalous zone. The original interpretation was of a westward tilt of the dome; this revision shows a tilt to the southeast, consistent with other gravity and seismic data. This interpretation refines the evaluation of additional cavern space by adding more salt buffer and allowing several more caverns. Additional storage space is constrained on this nearly full dome by low-lying peripheral wetlands, but 60 MMBBL or more of additional volume could be gained in six or more new caverns. Subsidence values at Bryan Mound are among the lowest in the SPR system, averaging about 11 mm/yr (0.4 in/yr), but measurement and interpretation issues persist, as observed values are about the same as survey measurement accuracy. Periodic flooding is a continuing threat because of the coastal proximity and because peripheral portions of the site are at elevations of less than 15 ft. This threat may increase slightly as future subsidence lowers the surface, but the amount is apt to be small. Caprock integrity may be affected by structural features, especially the faulting associated with anomalous zones. Injection wells have not been used extensively at Bryan Mound but could be a practicable solution to future brine disposal needs. Environmental issues center on the areas of low elevation that are below 15 feet above mean sea level: the coastal proximity and lowland environment, combined with the potential for flooding, create conditions that require continuing surveillance.

  13. Test verification and validation for molecular diagnostic assays.

    PubMed

    Halling, Kevin C; Schrijver, Iris; Persons, Diane L

    2012-01-01

    With our ever-increasing understanding of the molecular basis of disease, clinical laboratories are implementing a variety of molecular diagnostic tests to aid in the diagnosis of hereditary disorders, detection and monitoring of cancer, determination of prognosis and guidance for cancer therapy, and detection and monitoring of infectious diseases. Before introducing any new test into the clinical laboratory, the performance characteristics of the assay must be "verified," if it is a US Food and Drug Administration (FDA)-approved or FDA-cleared test, or "validated," if it is a laboratory-developed test. Although guidelines exist for how validation and verification studies may be addressed for molecular assays, the specific details of the approach used by individual laboratories is rarely published. Many laboratories, especially those introducing new types of molecular assays, would welcome additional guidance, especially in the form of specific examples, on the process of preparing a new molecular assay for clinical use. PMID:22208481

  14. A near-infrared spectroscopic study of young field ultracool dwarfs: additional analysis

    NASA Astrophysics Data System (ADS)

    Allers, K. N.; Liu, M. C.

    We present additional analysis of the classification system presented in Allers & Liu (2013). We refer the reader to Allers & Liu (2013) for a detailed discussion of our near-IR spectral type and gravity classification system. Here, we address questions and comments from participants of the Brown Dwarfs Come of Age meeting. In particular, we examine the effects of binarity and metallicity on our classification system. We also present our classification of Pleiades brown dwarfs using published spectra. Lastly, we determine spectral types (SpTs) and calculate gravity-sensitive indices for the BT-Settl atmospheric models and compare them to observations.

  15. Requirements Verification Report AN Farm to 200E Waste Transfer System for Project W-314 Tank Farm Restoration and Safe Operations

    SciTech Connect

    MCGREW, D.L.

    1999-09-28

    This Requirements Verification Report (RVR) for Project W-314 ''AN Farm to 200E Waste Transfer System'' package provides documented verification of design compliance to all the applicable Project Development Specification (PDS) requirements. Additional PDS requirements verification will be performed during the project's procurement, construction, and testing phases, and the RVR will be updated to reflect this information as appropriate.

  16. FINAL REPORT – INDEPENDENT VERIFICATION SURVEY SUMMARY AND RESULTS FOR THE ARGONNE NATIONAL LABORATORY BUILDING 330 PROJECT FOOTPRINT, ARGONNE, ILLINOIS

    SciTech Connect

    ERIKA N. BAILEY

    2012-02-29

    ORISE conducted onsite verification activities of the Building 330 project footprint during the period of June 6 through June 7, 2011. The verification activities included technical reviews of project documents, visual inspections, radiation surface scans, and sampling and analysis. The draft verification report was issued in July 2011 with findings and recommendations. The contractor performed additional evaluations and remediation.

  17. The influence of bioaugmentation and biosurfactant addition on bioremediation efficiency of diesel-oil contaminated soil: feasibility during field studies.

    PubMed

    Szulc, Alicja; Ambrożewicz, Damian; Sydow, Mateusz; Ławniczak, Łukasz; Piotrowska-Cyplik, Agnieszka; Marecik, Roman; Chrzanowski, Łukasz

    2014-01-01

    The study focused on assessing the influence of bioaugmentation and the addition of rhamnolipids on diesel oil biodegradation efficiency during field studies. Initial laboratory studies (measurement of emitted CO2 and dehydrogenase activity) were carried out in order to select the consortium for bioaugmentation as well as to evaluate the most appropriate concentration of rhamnolipids. The selected consortium consisted of the following bacterial taxa: Aeromonas hydrophila, Alcaligenes xylosoxidans, Gordonia sp., Pseudomonas fluorescens, Pseudomonas putida, Rhodococcus equi, Stenotrophomonas maltophilia, Xanthomonas sp. It was established that the application of rhamnolipids at 150 mg/kg of soil was most appropriate in terms of dehydrogenase activity. Based on the obtained results, four treatment methods were designed and tested during 365 days of field studies: I) natural attenuation; II) addition of rhamnolipids; III) bioaugmentation; IV) bioaugmentation and addition of rhamnolipids. It was observed that bioaugmentation contributed to the highest diesel oil biodegradation efficiency, whereas the addition of rhamnolipids did not notably influence the treatment process.

  18. Kaolinite flocculation induced by smectite addition - a transmission X-ray microscopic study.

    PubMed

    Zbik, Marek S; Song, Yen-Fang; Frost, Ray L

    2010-09-01

    The influence of smectite addition on kaolinite suspensions in water was investigated by transmission X-ray microscopy (TXM) and scanning electron microscopy (SEM). Sedimentation test screening was also conducted. Micrographs were processed by the STatistic IMage Analysing (STIMAN) program and structural parameters were calculated. The results of the sedimentation tests revealed an important influence of small smectite additions, up to about 3 wt.%, on kaolinite suspension flocculation. In order to determine the reason for this smectite impact on the macroscopic behaviour of kaolinite suspensions, micro-structural examination using TXM and SEM was undertaken. TXM and SEM micrographs of freeze-dried kaolinite-smectite suspensions with up to 20% smectite showed a high degree of fabric orientation, composed of highly oriented particles, with the greatest density when 3 wt.% of smectite was added to the 10 wt.% dense kaolinite suspension. In contrast, suspensions containing pure kaolinite do not show such mutual platelet orientation but a homogeneous network of randomly oriented kaolinite platelets. This suggests that in kaolinite-smectite suspensions, smectite forms a highly oriented basic framework into which kaolinite platelets may bond in preferential face-to-face contacts, strengthening the structure and allowing plastic behaviour, which is the cause of the platelet orientation. PMID:20621806

  1. Excitotoxic food additives--relevance of animal studies to human safety.

    PubMed

    Olney, J W

    1984-01-01

    Evidence is reviewed supporting the view that excitotoxic food additives pose a significant hazard to the developing nervous system of young children. The following points are stressed: (1) although blood-brain barriers protect most central neurons from excitotoxins, certain brain regions lack such protection (a characteristic common to all vertebrate species); (2) regardless of species, it requires only a transient increase in blood excitotoxin levels for neurons in unprotected brain regions to be "silently" destroyed; (3) humans may be at particularly high risk for this kind of brain damage, since ingestion of a given amount of excitotoxin causes much higher blood excitotoxin levels in humans than in other species; (4) in addition to the heightened risk on a species basis, risk may be further increased for certain consumer sub-populations due to youth, disease or genetic factors; (5) despite these reasons for maintaining a wide margin of safety in the use of excitotoxins in foods, no safety margin is currently being observed, i.e., a comparative evaluation of animal (extensive) and human (limited) data supports the conclusion that excitotoxins, as used in foods today, may produce blood elevations high enough to cause damage to the nervous system of young children, damage which is not detectable at the time of occurrence but which may give rise to subtle disturbances in neuroendocrine function in adolescence and/or adulthood.

  2. Verification in referral-based crowdsourcing.

    PubMed

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge.

  3. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers require a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to resist copying or simulation by non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with manifold-learnt tuning for capturing the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
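
    As a rough, self-contained stand-in for the Markov-chain-with-Gaussian-transitions idea (the paper's dissimilarity measure and manifold tuning are not reproduced), one can fit a Gaussian to a user's step vectors and score new trajectories by average log-likelihood. All trajectories below are invented:

      import numpy as np
      from scipy.stats import multivariate_normal

      def fit_step_model(trajectory):
          """Fit a single Gaussian to the step vectors of a 2-D trajectory,
          a one-state simplification of a Markov chain with Gaussian transitions."""
          steps = np.diff(np.asarray(trajectory, dtype=float), axis=0)
          return steps.mean(axis=0), np.cov(steps.T) + 1e-6 * np.eye(2)  # regularized

      def score(trajectory, mean, cov):
          """Average log-likelihood of a trajectory under the fitted step model."""
          steps = np.diff(np.asarray(trajectory, dtype=float), axis=0)
          return multivariate_normal.logpdf(steps, mean, cov).mean()

      # Enrollment trajectory vs. a probe: a much lower score suggests a different user.
      owner = [(0, 0), (1, 2), (2, 4), (3, 5), (4, 7)]
      probe = [(0, 0), (5, 0), (5, 9), (11, 9)]
      mean, cov = fit_step_model(owner)
      print(score(owner, mean, cov), score(probe, mean, cov))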

  4. Development of a Safeguards Verification Method and Instrument to Detect Pin Diversion from Pressurized Water Reactor (PWR) Spent Fuel Assemblies Phase I Study

    SciTech Connect

    Ham, Y S; Sitaraman, S

    2008-12-24

    A novel methodology to detect diversion of spent fuel from Pressurized Water Reactors (PWR) has been developed in order to address a long-unsolved safeguards verification problem for the international safeguards community, such as the International Atomic Energy Agency (IAEA) or the European Atomic Energy Community (EURATOM). The concept involves inserting tiny neutron and gamma detectors into the guide tubes of a spent fuel assembly and measuring the signals. The guide tubes form a quadrant-symmetric pattern in the various PWR fuel product lines, and the neutron and gamma signals from these various locations are processed to obtain a unique signature for an undisturbed fuel assembly. Signatures based on the neutron and gamma signals, individually or in combination, can be developed. Removal of fuel pins from the assembly will cause the signatures to be visibly perturbed, thus enabling the detection of diversion. All of the required signal processing to obtain signatures can be performed on standard laptop computers. Monte Carlo simulation studies and a set of controlled experiments with actual commercial PWR spent fuel assemblies were performed, validating this novel methodology. Based on the simulation studies and benchmarking measurements, the methodology developed promises to be a powerful and practical way to detect partial defects that constitute 10% or more of the total active fuel pins. This far exceeds the detection threshold of 50% missing pins from a spent fuel assembly, a threshold defined by the IAEA Safeguards Criteria. The methodology does not rely on any operator-provided data like burnup or cooling time and does not require movement of the fuel assembly from the storage rack in the spent fuel pool. A concept was developed to build a practical field device, the Partial Defect Detector (PDET), which will be completely portable and will use standard radiation measuring devices already in use at the IAEA. The use of the device will not require any information provided by the operator.
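
    The signature idea lends itself to a compact sketch: normalize the in-core detector readings to their maximum and flag any guide-tube position that deviates from the intact-assembly baseline by more than a tolerance. The counts and the tolerance below are hypothetical illustrations, not measured PWR data or the PDET algorithm itself:

      import numpy as np

      def signature(readings):
          """Normalize guide-tube detector readings into a shape-only signature."""
          r = np.asarray(readings, dtype=float)
          return r / r.max()

      def looks_diverted(baseline, measured, tol=0.05):
          """Flag the assembly if any guide-tube position deviates from the
          baseline signature by more than tol (tolerance is hypothetical)."""
          return bool(np.any(np.abs(signature(baseline) - signature(measured)) > tol))

      base = [1.00, 0.97, 0.96, 0.98, 0.97, 1.00]  # hypothetical intact-assembly counts
      meas = [1.00, 0.97, 0.81, 0.98, 0.97, 1.00]  # dip near a region with removed pins
      print(looks_diverted(base, meas))  # True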

  5. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between, and the effects of, hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum that demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something usually described as nice to have, but not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination of the casings and nozzle for erosion or wear. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  6. Face verification with balanced thresholds.

    PubMed

    Yan, Shuicheng; Xu, Dong; Tang, Xiaoou

    2007-01-01

    The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved by the following aspects. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction, and, hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification which are directly transplanted from face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.
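
    The motivation is easy to reproduce numerically: with synthetic genuine/impostor score distributions for two enrolled classes, the per-class optimal thresholds come out far apart, so a single global threshold cannot serve both. A sketch with invented score distributions (not the TBT algorithm itself):

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic verification scores for two enrolled subjects (higher = more similar).
      genuine = {"A": rng.normal(0.80, 0.05, 200), "B": rng.normal(0.60, 0.05, 200)}
      impostor = {"A": rng.normal(0.45, 0.05, 200), "B": rng.normal(0.30, 0.05, 200)}

      def error_rate(th, gen, imp):
          """Mean of false rejections (genuine below th) and false accepts (impostor at/above th)."""
          return ((gen < th).mean() + (imp >= th).mean()) / 2.0

      for subject in ("A", "B"):
          ths = np.linspace(0.0, 1.0, 1001)
          errs = [error_rate(t, genuine[subject], impostor[subject]) for t in ths]
          print(subject, ths[int(np.argmin(errs))])  # roughly 0.62 for A, 0.45 for B
      # The class-specific optima differ, so any global threshold is suboptimal
      # for at least one class; this is the imbalance the paper tries to remove.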

  7. Additive Manufacturing of a Microbial Fuel Cell--A detailed study.

    PubMed

    Calignano, Flaviana; Tommasi, Tonia; Manfredi, Diego; Chiolerio, Alessandro

    2015-01-01

    In contemporary society we observe an everlasting permeation of electron devices, smartphones, portable computing tools. The tiniest living organisms on Earth could become the key to address this challenge: energy generation by bacterial processes from renewable stocks/waste through devices such as microbial fuel cells (MFCs). However, the application of this solution was limited by a moderately low efficiency. We explored the limits, if any, of additive manufacturing (AM) technology to fabricate a fully AM-based powering device, exploiting low density, open porosities able to host the microbes, systems easy to fuel continuously and to run safely. We obtained an optimal energy recovery close to 3 kWh m−3 per day that can power sensors and low-power appliances, allowing data processing and transmission from remote/harsh environments.
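
    A quick unit conversion puts the quoted figure in context: 3 kWh per cubic metre per day corresponds to an average power density of 125 W/m³, i.e. 125 mW per litre of reactor volume:

      # Average power density implied by 3 kWh per cubic metre per day:
      energy_kwh_per_m3_day = 3.0
      power_w_per_m3 = energy_kwh_per_m3_day * 1000.0 / 24.0  # = 125.0 W/m^3
      power_mw_per_litre = power_w_per_m3                     # 1 W/m^3 equals 1 mW/L
      print(power_w_per_m3, power_mw_per_litre)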

  8. Preliminary study of neutron absorption by concrete with boron carbide addition

    NASA Astrophysics Data System (ADS)

    Abdullah, Yusof; Ariffin, Fatin Nabilah Tajul; Hamid, Roszilah; Yusof, Mohd Reusmaazran; Zali, Nurazila Mat; Ahmad, Megat Harun Al Rashid Megat; Yazid, Hafizal; Ahmad, Sahrim; Mohamed, Abdul Aziz

    2014-02-01

    Concrete has become a conventional material in the construction of nuclear reactors due to properties such as safety and low cost. Boron carbide was added as an additive in the concrete because it has good neutron absorption properties. Concrete samples were prepared with different weight percentages of boron carbide powder. The neutron absorption rate of these samples was determined using a fast neutron source of Americium-241/Beryllium (Am-Be 241) and detection with a portable backscattering neutron detector. Concrete with 20 wt% of boron carbide shows the lowest count of transmitted neutrons, indicating that the most neutrons were absorbed by this concrete. Higher boron carbide content may, however, affect the concrete strength and other properties.
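
    Transmission counts like these are normally interpreted through the narrow-beam attenuation law I/I0 = exp(−Σx), with Σ the macroscopic removal cross-section of the shield. A sketch with illustrative cross-sections (assumed values, not the study's measurements):

      import math

      def transmitted_fraction(sigma_per_cm, thickness_cm):
          """Narrow-beam attenuation: I/I0 = exp(-Sigma * x)."""
          return math.exp(-sigma_per_cm * thickness_cm)

      # Hypothetical macroscopic removal cross-sections (per cm), for illustration:
      for label, sigma in (("plain concrete", 0.09), ("concrete + B4C", 0.14)):
          print(label, transmitted_fraction(sigma, 20.0))  # 20 cm thick shield
      # The boron carbide mix transmits a visibly smaller fraction of the beam.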

  9. Sulphur diffusion in β-NiAl and effect of Pt additive: an ab initio study

    NASA Astrophysics Data System (ADS)

    Chen, Kuiying

    2016-02-01

    Diffusivities of detrimental impurity sulfur (S) in stoichiometric and Pt doped β-NiAl were evaluated using density functional theory calculations. The apparent activation energy and the pre-exponential factor of diffusivity via the next nearest neighbour (NNN) and interstitial jumps were evaluated to identify possible preferred diffusion mechanism(s). By calculating the electron localization function (ELF), the bonding characteristics of S with its surrounding atoms were assessed for the diffusion process. By comparison with the experimental results, the S diffusion through the NNN vacancy-mediated mechanism is found to be favoured. Addition of Pt in β-NiAl was found to significantly reduce the S diffusivity, and an associated electronic effect was explored. The elucidation of the above mechanisms may shed light on the development of new Pt-modified doped β-NiAl bond coats that can extend the life of oxidation resistant and thermal barrier coatings.
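
    The two quantities evaluated here, the pre-exponential factor and the apparent activation energy, combine in the Arrhenius form D = D0·exp(−Q/kBT). A sketch with hypothetical parameters (not the paper's DFT values):

      import math

      KB_EV = 8.617333262e-5  # Boltzmann constant, eV/K

      def diffusivity(d0_cm2_s, q_ev, temp_k):
          """Arrhenius diffusivity D = D0 * exp(-Q / (kB * T))."""
          return d0_cm2_s * math.exp(-q_ev / (KB_EV * temp_k))

      # Hypothetical parameters for an impurity in an intermetallic:
      d0, q = 1.0e-2, 2.0  # cm^2/s, eV
      for t in (1000.0, 1200.0, 1400.0):
          print(t, diffusivity(d0, q, t))
      # Raising the effective barrier (e.g. via Pt doping) lowers D exponentially.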

  10. THERMODYNAMIC STUDY OF THE NICKEL ADDITION IN ZINC HOT-DIP GALVANIZING BATHS

    SciTech Connect

    Pistofidis, N.; Vourlias, G.

    2010-01-21

    A usual practice during zinc hot-dip galvanizing is the addition of nickel to the liquid zinc, which is used to inhibit the Sandelin effect. Its action is due to the fact that the ζ (zeta) phase of the Fe-Zn system is replaced by the τ (tau) phase of the Fe-Zn-Ni system. In the present work an attempt is made to explain the formation of the τ phase with thermodynamics. For this reason, the Gibbs free energy changes for the τ and ζ phases were calculated. The excess free energy for the system was calculated with the Redlich-Kister polynomial. From this calculation it was deduced that the Gibbs energy change for the τ phase is negative. As a result, its formation is spontaneous.
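
    The Redlich-Kister polynomial mentioned here expands the excess Gibbs energy of a binary solution as Gex = x1·x2·Σk Lk·(x1 − x2)^k. A minimal sketch with invented interaction parameters (not the values used in the study):

      def redlich_kister_gex(x1, interaction_params_j_mol):
          """Excess Gibbs energy of a binary solution via the Redlich-Kister
          polynomial: Gex = x1*x2 * sum_k L_k * (x1 - x2)**k."""
          x2 = 1.0 - x1
          return x1 * x2 * sum(
              lk * (x1 - x2) ** k for k, lk in enumerate(interaction_params_j_mol)
          )

      # Hypothetical interaction parameters L0, L1 (J/mol), for illustration only:
      for x in (0.1, 0.3, 0.5, 0.7, 0.9):
          print(x, redlich_kister_gex(x, [-12000.0, 3000.0]))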

  11. Professional Competence Development of the Social Work Specialists in the Period of Study in the System of Additional Education

    ERIC Educational Resources Information Center

    Davletkaliev, Denis Kuanyshevich; Zueva, Natalia Konstantinovna; Lebedeva, Natalya Vasilevna; Mkrtumova, Irina Vladimirovna; Timofeeva, Olga

    2015-01-01

    The goal of this work is the study of psychological-pedagogical approaches to the understanding of the professional competence of social work specialists, as well as the role of study in the system of additional education in the professional-personal development of the listeners. In the process of studying this problem we define main…

  12. A systematic study of well-known electrolyte additives in LiCoO2/graphite pouch cells

    NASA Astrophysics Data System (ADS)

    Wang, David Yaohui; Sinha, N. N.; Petibon, R.; Burns, J. C.; Dahn, J. R.

    2014-04-01

    The effectiveness of well-known electrolyte additives, singly or in combination, in LiCoO2/graphite pouch cells has been systematically investigated and compared using the ultra high precision charger (UHPC) at Dalhousie University and electrochemical impedance spectroscopy (EIS). UHPC studies are believed to identify the best electrolyte additives, singly or in combination, within a short time period (several weeks). Three parameters of LiCoO2/graphite pouch cells with different electrolyte additives were measured: 1) the coulombic efficiency (CE); 2) the charge endpoint capacity slippage (slippage); and 3) the charge transfer resistance (Rct); the results for over 55 additive sets are compared. The experimental results suggest that a combination of electrolyte additives can be more effective than a single electrolyte additive. However, of all the additive sets tested, simply using 2 wt.% vinylene carbonate yielded cells very competitive in CE, slippage and Rct. It is hoped that this comprehensive report can be used as a guide and reference for the study of other electrolyte additives singly or in combination.
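
    The two UHPC metrics are simple to compute from cycler data: coulombic efficiency is the ratio of discharge to charge capacity within a cycle, and charge endpoint capacity slippage is the per-cycle drift of the charge endpoint. A sketch on invented capacities (not data from the study):

      # Hypothetical per-cycle charge/discharge capacities (mAh) from a cycler:
      cycles = [
          {"charge": 240.00, "discharge": 239.52},
          {"charge": 240.21, "discharge": 239.73},
          {"charge": 240.41, "discharge": 239.93},
      ]

      for i, c in enumerate(cycles):
          ce = c["discharge"] / c["charge"]  # coulombic efficiency, here ~0.998
          print(i, round(ce, 5))

      # Charge endpoint capacity slippage: average drift of the charge endpoint.
      slippage = (cycles[-1]["charge"] - cycles[0]["charge"]) / (len(cycles) - 1)
      print(slippage)  # ~0.2 mAh/cycle; smaller slippage implies fewer parasitic reactions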

  13. Spectroscopic Evidence for Covalent Binding of Sulfadiazine to Natural Soils via 1,4-nucleophilic addition (Michael Type Addition) studied by Spin Labeling ESR

    NASA Astrophysics Data System (ADS)

    Aleksandrova, Olga

    2015-04-01

    Among the different classes of veterinary pharmaceuticals, sulfadiazine (SDZ) is widely used in animal husbandry. Its residues have been detected in different environmental compartments. However, soil is a hot spot for SDZ, as it receives a large portion of the excreted compound through the application of manure during soil fertilization. Ample studies on the fate of SDZ in soils showed that a large portion forms non-extractable residues (NER) along with transformation products and low mineralization (Mueller et al., 2013). A common observation was an initially fast formation of NER, up to 10% of the applied amount, promptly after the application of SDZ to soil; this portion increased up to 50% within a few days (Mueller et al., 2013; Nowak et al., 2011). A common finding for SDZ, as for other sulfonamides, was biphasic kinetics of NER formation, attributed to the occurrence of two reaction processes: a rapid, often reversible process and a slower, irreversible process (Weber et al., 1996). A single-phase reaction process was also established under anaerobic treatment (Gulkowska et al., 2014). A major focus of this work is to elucidate the reaction mechanism of covalent binding of SDZ to soil, which is currently required to estimate the risk to human health posed by NER formed by SDZ in soils. Taking into account the key role of the amine functional groups of SDZ in its reactivity in soil, soil samples were labeled with nitroxide radicals carrying tethered aromatic or aliphatic amines and then investigated by means of ESR spectroscopy. 2,5,5-Trimethyl-2-(3-aminophenyl)pyrrolidin-1-yloxy and 4-amino-2,2,6,6-tetramethylpiperidin-1-oxyl modeled decomposition products of SDZ with aromatic and aliphatic amines, respectively. The application of the defined combination of both spin labels (SL) to different soils simulated well the change of the paramagnetic signal of soil organic radicals interacting with SDZ. After their application to soil, SL were found in soil sites characterized

  19. Sensitization to Food Additives in Patients with Allergy: A Study Based on Skin Test and Open Oral Challenge.

    PubMed

    Moghtaderi, Mozhgan; Hejrati, Zinatosadat; Dehghani, Zahra; Dehghani, Faranak; Kolahi, Niloofar

    2016-06-01

    There has been a great increase in the consumption of various food additives in recent years. The purpose of this study was to identify the incidence of sensitization to food additives by using skin prick tests in patients with allergy and to determine the concordance rate between positive skin tests and oral challenge in hypersensitivity to additives. This cross-sectional study included 125 patients (71 female, 54 male) aged 2-76 years with allergy and 100 healthy individuals. Skin tests were performed in both the patient and control groups with 25 fresh food additives. Among patients with allergy, 22.4% showed a positive skin test to at least one of the applied materials. Skin tests were negative to all tested food additives in the control group. Oral food challenge was done in 28 patients with positive skin tests, of whom 9 patients showed a reaction to the culprit additive (concordance rate = 32.1%). The present study suggested that about one-third of allergic patients with a positive reaction to food additives showed a positive oral challenge; this points to the potential utility of skin testing to identify the role of food additives in patients with allergy.

  1. Te Rita Papesch: Case Study of an Exemplary Learner of Maori as an Additional Language

    ERIC Educational Resources Information Center

    Ratima, Matiu Tai; Papesch, Te Rita

    2014-01-01

    This paper presents a case study of the life experiences of one exemplar adult second language Maori learner--Te Rita Papesch. Te Rita was one of 17 participants who were interviewed as a part of the first author's PhD study which sought to answer the question: what factors lead to the development of proficiency in te reo Maori amongst adult…

  2. Study on additional carrier sensing for IEEE 802.15.4 wireless sensor networks.

    PubMed

    Lee, Bih-Hwang; Lai, Ruei-Lung; Wu, Huai-Kuei; Wong, Chi-Ming

    2010-01-01

    Wireless sensor networks based on the IEEE 802.15.4 standard are able to achieve low-power transmissions in the guise of low-rate and short-distance wireless personal area networks (WPANs). Slotted carrier sense multiple access with collision avoidance (CSMA/CA) is used as the contention mechanism. Sensor nodes perform a backoff process as soon as the clear channel assessment (CCA) detects a busy channel. In doing so they may neglect the implicit information of the failed CCA detection and cause redundant sensing. This blind backoff process in the slotted CSMA/CA lowers channel utilization. This paper proposes an additional carrier sensing (ACS) algorithm based on IEEE 802.15.4 to enhance the carrier sensing mechanism of the original slotted CSMA/CA. An analytical Markov chain model is developed to evaluate the performance of the ACS algorithm. Both analytical and simulation results show that the proposed algorithm performs better than IEEE 802.15.4, significantly improving throughput, average medium access control (MAC) delay, and the power consumption of CCA detection.
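
    As a rough illustration of the contention mechanism discussed above, the following minimal Monte Carlo sketch in plain Python simulates saturated slotted CSMA/CA-style contention. It is not the paper's Markov-chain model, and the shorter "informed" retry in the ACS branch is only a loose stand-in for the proposed additional carrier sensing.

      import random

      def simulate(n_nodes=10, slots=100_000, acs=False):
          # each node starts with a random backoff counter (slots to wait)
          backoff = [random.randint(0, 7) for _ in range(n_nodes)]
          success = 0
          for _ in range(slots):
              ready = [i for i, b in enumerate(backoff) if b == 0]
              for i in range(n_nodes):
                  if backoff[i] > 0:
                      backoff[i] -= 1          # count down toward transmission
              if len(ready) == 1:
                  success += 1                 # lone transmitter: success
              for i in ready:
                  if len(ready) > 1:
                      # collision: blind backoff, or a shorter informed retry
                      backoff[i] = random.randint(0, 3 if acs else 31)
                  else:
                      backoff[i] = random.randint(0, 7)   # next packet
          return success / slots

      print("plain slotted CSMA/CA throughput:", simulate())
      print("ACS-style variant throughput:   ", simulate(acs=True))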

  3. Chromosome studies in the aquatic monocots of Myanmar: A brief review with additional records.

    PubMed

    Ito, Yu; Tanaka, Nobuyuki

    2014-01-01

    Myanmar (Burma) constitutes a significant component of the Indo-Myanmar biodiversity hotspot, with elements of the Indian, the Indochina, and the Sino-Japanese floristic regions, yet thus far only a few reliable sources on the country's flora have been available. As part of a contribution to the floristic inventory of Myanmar, and because it is important in a floristic survey to obtain as much information as possible, here we present, in addition to two previous reports, three more chromosome counts in the aquatic monocots of Myanmar: Limnocharis flava with 2n = 20, Sagittaria trifolia with 2n = 22 (Alismataceae), and Potamogeton distinctus × Potamogeton nodosus with 2n = 52 (Potamogetonaceae); the third count is new to science. A brief review of cytological research on the 45 non-hybrid aquatic monocots of these floristic regions, plus two well-investigated inter-specific hybrids recorded in Myanmar, is given; it indicates that further work focused on Myanmar species showing infra-specific chromosome variation across the floristic regions will help resolve the precise evolutionary history of the aquatic flora of Myanmar.

  4. Numerical study of the effect of water addition on gas explosion.

    PubMed

    Liang, Yuntao; Zeng, Wen

    2010-02-15

    Through amending the SENKIN code of the CHEMKIN III chemical kinetics package, a computational model of gas explosion in a constant volume bomb was built, and the detailed reaction mechanism (GRI-Mech 3.0) was adopted. The mole fraction profiles of reactants, selected free radicals, and catastrophic gases in the process of gas explosion were analyzed with this model. Furthermore, through sensitivity analysis of the reaction mechanism of gas explosion, the dominant reactions that affect gas explosion and the formation of catastrophic gases were identified. At the same time, the inhibition mechanisms of water on gas explosion and on the formation of catastrophic gases were analyzed. The results show that the induced explosion time is prolonged, and the mole fractions of reactant species such as CH4 and O2 and catastrophic gases such as CO, CO2 and NO are decreased, as water is added to the mixed gas. With increasing water fraction in the mixed gas, the sensitivities of the dominant reactions contributing to CH4 and CO2 decrease, and the sensitivity coefficients of the CH4, CO and NO mole fractions also decrease. The inhibition of gas explosion by water addition can be ascribed to the significant decrease of H, O and OH radicals in the process of gas explosion due to the presence of water. PMID:19811873
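
    This kind of constant-volume ignition calculation can be reproduced today with open-source tools. Below is a minimal sketch assuming the Cantera package and its bundled GRI-Mech 3.0 mechanism file (gri30.yaml); the original work used a modified CHEMKIN III SENKIN code, and replacing N2 with H2O in the mixture below is a simplification, not the paper's exact compositions.

      import cantera as ct

      for x_h2o in (0.00, 0.05, 0.10):          # mole fraction of added water
          gas = ct.Solution("gri30.yaml")       # GRI-Mech 3.0, as in the study
          gas.TPX = 1400.0, ct.one_atm, {"CH4": 0.095, "O2": 0.19,
                                         "N2": 0.715 - x_h2o, "H2O": x_h2o}
          reactor = ct.IdealGasReactor(gas)     # constant-volume "bomb"
          net = ct.ReactorNet([reactor])
          while net.time < 0.5 and reactor.T < 1800.0:
              net.step()                        # integrate until ignition or 0.5 s
          print(f"x(H2O) = {x_h2o:.2f}: ignition at ~{net.time * 1e3:.2f} ms")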

  5. Generalized linear and generalized additive models in studies of species distributions: Setting the scene

    USGS Publications Warehouse

    Guisan, A.; Edwards, T.C.; Hastie, T.

    2002-01-01

    An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled: Advances in GLMs/GAMs modeling: from species distribution to environmental management, held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, as well as provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression, an alternative to stepwise selection of predictors, and methods for the identification of interactions by a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance our understanding of their application to ecological modeling. © 2002 Elsevier Science B.V. All rights reserved.
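
    For readers unfamiliar with the GLM side of this framework, a toy species-distribution example is sketched below, assuming Python with numpy and statsmodels; the data and variable names are synthetic, not taken from the workshop papers.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      elev = rng.uniform(0.0, 2000.0, 500)         # predictor: elevation (m)
      logit = -4.0 + 0.01 * elev - 5e-6 * elev**2  # unimodal response, peak ~1000 m
      present = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

      # binomial GLM (logistic regression) with a quadratic term in elevation
      X = sm.add_constant(np.column_stack([elev, elev**2]))
      fit = sm.GLM(present, X, family=sm.families.Binomial()).fit()
      print(fit.summary())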

  6. Exploratory studies of extended storage of apheresis platelets in a platelet additive solution (PAS).

    PubMed

    Slichter, Sherrill J; Corson, Jill; Jones, Mary Kay; Christoffel, Todd; Pellham, Esther; Bailey, S Lawrence; Bolgiano, Doug

    2014-01-01

    To evaluate the poststorage viability of apheresis platelets stored for up to 18 days in 80% platelet additive solution (PAS)/20% plasma, 117 healthy subjects donated platelets using the Haemonetics MCS+, COBE Spectra (Spectra), or Trima Accel (Trima) systems. Control platelets from the same subjects were compared with their stored test PAS platelets by radiolabeling the stored and control platelets with either chromium-51 or indium-111. Trima platelets met Food and Drug Administration poststorage platelet viability criteria for only 7 days vs almost 13 days for Haemonetics platelets; i.e., platelet recoveries after these storage times averaged 44 ± 3% vs 49 ± 3%, and survivals were 5.4 ± 0.3 vs 4.6 ± 0.3 days, respectively. The differences in storage duration are likely related to both the collection system and the storage bag. The Spectra and Trima platelets were hyperconcentrated during collection and PAS was added, whereas the Haemonetics platelets were elutriated with PAS, which may have resulted in less collection injury. When Spectra and Trima platelets were stored in Haemonetics' bags, poststorage viability was significantly improved. Platelet viability is better maintained in vitro than in vivo, allowing substantial increases in platelet storage times. However, implementation will require resolution of potential bacterial overgrowth during storage.

  8. Multi-spectroscopic DNA interaction studies of sunset yellow food additive.

    PubMed

    Kashanian, Soheila; Heidary Zeidali, Sahar; Omidfar, Kobra; Shahabadi, Nahid

    2012-12-01

    Synthetic food colorants occupy an important place in the food industry, yet their use remains controversial, particularly because of health problems reported mainly in children, who are considered the most vulnerable group. The purpose of this work is to present spectrophotometric methods to analyze the interaction of native calf thymus DNA (CT-DNA) with sunset yellow (SY) at physiological pH. Considerable hyperchromism and no red shift, with an intrinsic binding constant of 7 × 10^4 M^-1, were observed in the UV absorption band of SY. Binding constants of DNA with the dye were calculated at different temperatures. A slow increase in the specific viscosity of DNA, induced circular dichroism spectral changes, and no significant changes in the fluorescence of neutral red-DNA solutions in the presence of SY suggest that this molecule interacts with CT-DNA via a groove binding mode. Furthermore, the enthalpy and entropy of the reaction between SY and CT-DNA showed that the reaction is exothermic and enthalpy favored (ΔH = -58.19 kJ mol^-1; ΔS = -274.36 J mol^-1 K^-1), which is further evidence that van der Waals interactions and hydrogen bonding are the main driving forces in the binding of the mentioned molecule and its mode of interaction with DNA.
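
    As a quick worked example of the magnitude involved, the standard Gibbs energy implied by the reported intrinsic binding constant follows from ΔG° = -RT ln K; a short Python sketch, using only the value quoted above:

      import math

      R, T = 8.314, 298.15          # gas constant (J mol^-1 K^-1), temperature (K)
      K = 7e4                       # intrinsic binding constant of SY-DNA (M^-1)
      dG = -R * T * math.log(K)     # standard molar Gibbs energy change
      print(f"dG ~ {dG / 1000:.1f} kJ/mol")   # about -27.7 kJ/mol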

  9. A study of alternative metal particle structures and mixtures for dental amalgams based on mercury additions.

    PubMed

    Marquez, J A; Murr, L E; Agüero, V

    2000-08-01

    The perception that mercury in dental amalgam is toxic to the human organism has prompted worldwide efforts by the scientific community to develop alternative amalgam-like materials that utilize little or no mercury. In this investigation, an attempt is made to develop a new dental alloy system by adding liquid mercury to silver-coated Ag4Sn intermetallic particles in lesser amounts than are used in conventional amalgam alloys. An effort to precipitate the important eta-prime (Cu6Sn5) phase was made by adding pure Cu and Sn powders to the alloy formulation during trituration. Tytin, a popular Ag-Sn-Cu single-composition, spray-atomized conventional dental alloy, was used as the control to obtain baseline data for comparisons of microstructures and mechanical properties. Amalgamation of the coated particles with mercury, with or without the addition of Cu and Sn powders, mostly produced specimens with chemically non-coherent microstructures that were relatively weak in compression. These results were due, in part, to mercury's inability to chemically wet the Ag-coated particles and the Cu and Sn powders because of naturally occurring surface oxide films. The strongest specimens tested had silver dendritic coatings, resulting in compression strength values up to 40% of the control's. Their higher strength is attributed to mechanical interlocking at the particle/matrix interfaces.

  10. Additional road markings as an indication of speed limits: results of a field experiment and a driving simulator study.

    PubMed

    Daniels, Stijn; Vanrie, Jan; Dreesen, An; Brijs, Tom

    2010-05-01

    Although speed limits are indicated by road signs, road users are not always aware, while driving, of the actual speed limit on a given road segment. The Roads and Traffic Agency developed additional road markings in order to support driver decisions on speed on 70 km/h roads in Flanders, Belgium. In this paper the results are presented of two evaluation studies, a field study and a simulator study, on the effects of the additional road markings on speed behaviour. The field study showed no substantial effect of the markings on speed behaviour; neither did the simulator study, which used slightly different stimuli. Nevertheless, an effect on lateral position was noticed in the simulator study, showing at least some effect of the markings. The role of conspicuity of design elements and of expectations towards traffic environments is discussed. Both studies illustrate well some strengths and weaknesses of observational field studies compared to experimental simulator studies.

  11. Optical security verification for blurred fingerprints

    NASA Astrophysics Data System (ADS)

    Soon, Boon Y.; Karim, Mohammad A.; Alam, Mohammad S.

    1998-12-01

    Optical fingerprint security verification is gaining popularity, as it has the potential to perform correlation at the speed of light. With advancements in optical security verification techniques, the authentication process can be made almost foolproof and reliable for financial transactions, banking, etc. In law enforcement, when a fingerprint is obtained from a crime scene, it may be blurred and thus an unhealthy candidate for correlation purposes. Therefore, the blurred fingerprint needs to be clarified before it is used for the correlation process. There are several different types of blur, such as linear motion blur and defocus blur, induced by aberrations of the imaging system. In addition, we may or may not know the blur function. In this paper, we propose non-singularity inverse filtering in the frequency/power domain for deblurring known motion-induced blur in fingerprints. This filtering process is incorporated with the power spectrum subtraction technique, a uniqueness comparison scheme, and the separated target and reference planes method in the joint transform correlator. The proposed hardware implementation is a hybrid electronic-optical correlator system. The performance of the proposed system is verified with computer simulation for both cases: with and without additive random noise corruption.
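
    The non-singularity idea can be illustrated numerically. Below is a minimal 1-D sketch in Python/numpy of frequency-domain inverse filtering for a known motion blur, with near-zero frequency bins clamped to avoid division blow-up; the paper's optical joint-transform-correlator implementation and power spectrum subtraction are not modelled here.

      import numpy as np

      rng = np.random.default_rng(0)
      x = rng.random(256)                       # stand-in for a scan line
      h = np.zeros(256)
      h[:9] = 1.0 / 9.0                         # known linear motion-blur kernel
      y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)))   # blurred line

      H = np.fft.fft(h)
      H_safe = np.where(np.abs(H) < 1e-3, 1e-3, H)   # clamp near-singular bins
      x_hat = np.real(np.fft.ifft(np.fft.fft(y) / H_safe))
      print("max reconstruction error:", float(np.max(np.abs(x - x_hat))))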

  12. Multichip reticle approach for OPC model verification

    NASA Astrophysics Data System (ADS)

    Taravade, Kunal N.; Belova, Nadya; Jost, Andrew M.; Callan, Neal P.

    2003-12-01

    The complexity of current semiconductor technology due to shrinking feature sizes demands ever more engineering effort and expense to deliver the final product to customers. One of the largest expenses in the entire budget is reticle manufacturing. With the need to perform mask correction in order to account for optical proximity effects on the wafer level, reticle expenses have become even more critical. For 0.13-um technology one cannot avoid an optical proximity correction (OPC) procedure for modifying original designs to comply with design rules as required by Front End (FE) and Back End (BE) processes. Once an OPC model is generated, one needs to confirm and verify the model with additional test reticles for every critical layer of the technology. Such a verification procedure would include the most critical layers (two FE layers and four BE layers for the 0.13 technology node). This allows us to evaluate model performance under real production conditions encountered on customer designs. At LSI we have developed and verified the low volume reticle (LVR) approach for verification of different OPC models. The proposed approach allows performing die-to-die reticle defect inspection in addition to checking the printed image on the wafer. It helps finalize litho and etch process parameters. Processing wafers with overlaying masks for two consecutive BE layers (via and metal2 masks) allowed us to evaluate the robustness of OPC models for a wafer stack against both reticle- and wafer-induced misalignments.

  13. Voice measures of workload in the advanced flight deck: Additional studies

    NASA Technical Reports Server (NTRS)

    Schneider, Sid J.; Alpert, Murray

    1989-01-01

    These studies investigated acoustical analysis of the voice as a measure of workload in individual operators. In the first study, voice samples were recorded from a single operator during high, medium, and low workload conditions. Mean amplitude, frequency, syllable duration, and emphasis all tended to increase as workload increased. In the second study, NASA test pilots performed a laboratory task, and used a flight simulator under differing work conditions. For two of the pilots, high workload in the simulator brought about greater amplitude, peak duration, and stress. In both the laboratory and simulator tasks, high workload tended to be associated with more statistically significant drop-offs in the acoustical measures than were lower workload levels. There was a great deal of intra-subject variability in the acoustical measures. The results suggested that in individual operators, increased workload might be revealed by high initial amplitude and frequency, followed by rapid drop-offs over time.
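
    The acoustic measures mentioned (mean amplitude and frequency) are straightforward to extract computationally. The sketch below, in Python/numpy on a synthetic tone, is only illustrative and does not reproduce the study's actual analysis pipeline.

      import numpy as np

      fs = 8000                                   # sample rate (Hz)
      t = np.arange(fs) / fs                      # one second of signal
      wave = 0.5 * np.sin(2 * np.pi * 180 * t)    # stand-in for a voiced segment

      frame = fs // 50                            # 20 ms frames (160 samples)
      frames = wave[: len(wave) // frame * frame].reshape(-1, frame)
      rms = np.sqrt((frames ** 2).mean(axis=1))   # mean amplitude per frame
      signs = np.signbit(frames).astype(np.int8)
      zc = (np.diff(signs, axis=1) != 0).sum(axis=1)   # zero crossings per frame
      f0 = zc * fs / (2 * frame)                  # crude frequency estimate (Hz)
      print(f"mean RMS = {rms.mean():.3f}, mean f0 ~ {f0.mean():.0f} Hz")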

  14. Study on Type C Coal Fly ash as an Additive to Molding Sand for Steel Casting

    NASA Astrophysics Data System (ADS)

    Palaniappan, Jayanthi

    2016-05-01

    Physico-chemical characterization, such as granulometric analysis, moisture content, and X-ray fluorescence, was performed on Type C coal-combustion fly ash to investigate its potential as a distinct option for molding sand in the foundry, thereby reducing the dependency on the latter. Technological properties such as compressive strength, tensile strength, permeability, and compaction of various compositions of fly ash molding sand (10, 20 and 30% fly ash substituted for chemically bonded sand) were studied and compared with silica molding sand. Steel castings were produced using this fly ash molding sand, and the casting surface finish and typical casting parameters were assessed. It was noted that a good quality steel casting could be produced using Type C fly ash molding sand, which effectively replaced 20% of traditional molding sand and binders, thereby providing greater financial profit to the foundry and an effective route for fly ash utilization (waste management).

  15. Additional Study of Water Droplet Median Volume Diameter (MVD) Effects on Ice Shapes

    NASA Technical Reports Server (NTRS)

    Tsao, Jen-Ching; Anderson, David N.

    2005-01-01

    This paper reports the results of an experimental study in the NASA Glenn Icing Research Tunnel (IRT) to evaluate how well the MVD-independent effect identified previously might apply to SLD conditions in rime icing situations. Models were NACA 0012 wing sections with chords of 53.3 and 91.4 cm. Tests were conducted at a nominal airspeed of 77 m/s (150 kt) and a number of MVDs ranging from 15 to 100 µm, with LWC of 0.5 to 1 g/cu m. In the present study, ice shapes recorded from past studies and recent results at SLD and Appendix-C conditions are reviewed to show that droplet diameter is not important to rime ice shape for MVD of 30 µm or larger, but for drop sizes below 30 µm a rime ice shape transition from convex to wedge to spearhead type ice shape is observed.

  16. A study of grain boundary sliding in copper with and without an addition of phosphorus

    NASA Astrophysics Data System (ADS)

    Pettersson, Kjell

    2010-10-01

    Copper will be used as a corrosion barrier in the storage of high-level nuclear waste. In order to improve the creep fracture properties of the material it will contain 30-50 ppm of phosphorus (OFP copper, as opposed to OF copper without P). It has been suggested that the phosphorus impedes grain boundary sliding in copper, and recently a quantitative theory based on this idea has shown that there is no risk of creep-brittle fracture of OFP copper under waste storage conditions. In order to verify the basis of this theory, grain boundary sliding has been investigated in copper with and without a P addition. The method has been to examine intentionally scratched surfaces of tensile specimens tension tested to plastic strains of 1%, 2% and 4% at 150 and 200 °C. After testing, specimen surfaces were examined in SEM and sliding distances were measured as in-surface displacements of scratches. The results have been plotted as distribution functions in which the fraction of slides smaller than a given value is plotted versus sliding distance. The result is that in most cases the distribution functions for OF and OFP copper overlap. In a small number of cases there is a tendency for less sliding to have occurred in OFP copper. The overall conclusion, however, is that although there may be a slight difference between the materials with regard to grain boundary sliding, it is not large enough to explain the observed difference in creep brittleness. Tension tests to fracture in the temperature range 100-200 °C show that the tensile properties of the two copper qualities are more or less identical until intergranular cracking starts in the OF copper, after which the flow stress decreases in comparison with OFP. It is suggested that at least part of the observed difference in creep strength between the two coppers may be due to the effect of intergranular cracking.

  17. SCAPEGOAT WILDERNESS AND ADDITIONS, BOB MARSHALL AND GREAT BEAR WILDERNESSES, AND ADJACENT STUDY AREAS, MONTANA.

    USGS Publications Warehouse

    Earhart, Robert L.; Marks, Lawrence Y.

    1984-01-01

    Hydrocarbon and non-fuels mineral surveys indicate that parts of the Bob Marshall and Great Bear Wildernesses and several of the adjacent study areas have probable and substantiated mineral-resource potential for hydrocarbon accumulations, especially natural gas; the Scapegoat and Great Bear Wildernesses have a substantiated resource potential for copper and silver. The Bob Marshall Wilderness has a substantiated potential for barite and a probable potential for copper and silver. Lead, zinc, coal, and limestone occur locally within the study areas but such occurrences are small and low grade and no resource potential is identified.

  18. The new geospatial tools: global transparency enhancing safeguards verification

    SciTech Connect

    Pabian, Frank Vincent

    2010-09-16

    This paper focuses on the importance and potential role of new, freely available geospatial tools for enhancing IAEA safeguards and on how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency', and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as to support ongoing monitoring and verification of various treaty-relevant (e.g., NPT, FMCT) activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how it is possible to derive value-added follow-up information from recent public media reporting on a former clandestine underground plutonium production complex (now being converted to a 'tourist attraction', the site having been abandoned by China in the early 1980s). That open-source media reporting, when combined with subsequent commentary found in various Internet-based blogs and wikis, led to independent verification of the reporting with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D modeling and visualization) was only made possible following the acquisition of higher-resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original analysis of the overhead imagery.

  19. Practical aspects of dynamic verification of extensometers; Part 1 -- The concepts

    SciTech Connect

    Albright, F.J.; Annala, J.

    1994-01-01

    Material property studies frequently require the measurement of load and strain. Accurate measurement of both parameters is essential. Methods for accurate static calibration and verification of load transducers and extensometers are well established. More recently, standard practices have been developed for the dynamic calibration of load transducers. Still in its infancy is a standard method for the dynamic verification of extensometers. Dynamic verification introduces a wide range of new issues. These encompass not only the transducer but also the conditioning electronics and the actual test machine. Static calibration permits the elimination of nearly all dynamics, whereas dynamic verification must be done in the presence of these dynamic effects. This paper outlines the various concepts that need to be understood when performing the dynamic verification of an extensometer. Problems related to computer-aided verification are emphasized, issues of aliasing and resolution in particular.
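
    Aliasing, one of the issues flagged above, can be shown in a few lines: a 60 Hz component sampled at 100 Hz lands on exactly the same sample values as a 40 Hz component (a Python/numpy sketch).

      import numpy as np

      fs = 100.0                      # sampling rate (Hz)
      n = np.arange(10)               # sample indices
      # 60 Hz is above the 50 Hz Nyquist limit and folds back to 40 Hz
      print(np.allclose(np.cos(2 * np.pi * 60 * n / fs),
                        np.cos(2 * np.pi * 40 * n / fs)))   # -> True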

  20. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  1. Nahuatl as a Classical, Foreign, and Additional Language: A Phenomenological Study

    ERIC Educational Resources Information Center

    De Felice, Dustin

    2012-01-01

    In this study, participants learning an endangered language variety shared their experiences, thoughts, and feelings about the often complex and diverse language-learning process. I used phenomenological interviews in order to learn more about these English or Spanish language speakers' journey with the Nahuatl language. From first encounter to…

  2. CNV-based genome wide association study reveals additional variants contributing to meat quality in swine

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Pork quality is important both to the meat processing industry and consumers’ purchasing attitudes. Copy number variation (CNV) is a burgeoning kind of variant that may influence meat quality. Herein, a genome-wide association study (GWAS) was performed between CNVs and meat quality traits in swine....

  3. Genome-Wide Association Study of Intelligence: Additive Effects of Novel Brain Expressed Genes

    ERIC Educational Resources Information Center

    Loo, Sandra K.; Shtir, Corina; Doyle, Alysa E.; Mick, Eric; McGough, James J.; McCracken, James; Biederman, Joseph; Smalley, Susan L.; Cantor, Rita M.; Faraone, Stephen V.; Nelson, Stanley F.

    2012-01-01

    Objective: The purpose of the present study was to identify common genetic variants that are associated with human intelligence or general cognitive ability. Method: We performed a genome-wide association analysis with a dense set of 1 million single-nucleotide polymorphisms (SNPs) and quantitative intelligence scores within an ancestrally…

  4. A Micro-Developmental Approach to Studying Young Children's Problem Solving Behavior in Addition

    ERIC Educational Resources Information Center

    Voutsina, Chronoula

    2012-01-01

    This paper presents a study that investigated the process of change in 5-6-year-old children's successful problem-solving approaches when tackling a multiple-step task in elementary arithmetic. Micro-developmental changes in children's successful problem-solving behavior were analyzed using Karmiloff-Smith's model of representational redescription…

  5. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  6. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  7. Comparative study of glycine single crystals with additive of potassium nitrate in different concentration ratios

    NASA Astrophysics Data System (ADS)

    Gujarati, Vivek P.; Deshpande, M. P.; Patel, Kamakshi R.; Chaki, S. H.

    2016-05-01

    Semi-organic crystals of glycine potassium nitrate (GPN) with potential applications in nonlinear optics (NLO) were grown using the slow evaporation technique. Glycine and potassium nitrate were taken in three different concentration ratios of 3:1, 2:1 and 1:1, respectively. We checked the solubility of the material in distilled water at different temperatures and could observe the growth of crystals within 7 weeks. The purity of the grown crystals was confirmed by energy dispersive X-ray analysis (EDAX) and CHN analysis. The GPN powder X-ray diffraction pattern was recorded to confirm the crystalline nature. To assess the suitability of the grown crystals for applications in the opto-electronics field, a UV-Vis-NIR study was carried out. Dielectric properties of the samples were studied in the frequency range 1 Hz to 100 kHz.

  8. Studying quantum dot blinking through the addition of an engineered inorganic hole trap.

    PubMed

    Tenne, Ron; Teitelboim, Ayelet; Rukenstein, Pazit; Dyshel, Maria; Mokari, Taleb; Oron, Dan

    2013-06-25

    An all-inorganic compound colloidal quantum dot incorporating a highly emissive CdSe core, which is linked by a CdS tunneling barrier to an engineered charge carrier trap composed of PbS, is designed, and its optical properties are studied in detail at the single-particle level. Study of this structure enables a deeper understanding of the link between photoinduced charging and surface trapping of charge carriers and the phenomenon of quantum dot blinking. In the presence of the hole trap, a "gray" emissive state appears, associated with charging of the core. Rapid switching is observed between the "on" and the "gray" state, although the switching dynamics in and out of the dark "off" state remain unaffected. This result completes the links in the causality chain connecting charge carrier trapping, charging of QDs, and the appearance of a "gray" emission state.

  9. Brief reconnaissance study for the addition of hydropower for Carr Fork Dam, Sassafras, Kentucky

    SciTech Connect

    Gebhard, T.G. Jr.

    1982-05-24

    The feasibility of retrofitting the Carr Fork Dam near Hazard, KY for power generation was examined. This dam has a developable head of 80 ft and was built in 1975 to provide flood protection. The study of environmental, institutional, safety, and economic factors showed that the total investment cost would be $909,600 and that hydroelectric power development at this site is not feasible unless a higher price could be obtained for the power sold. (LCL)
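
    For a rough sense of scale, the hydraulic power relation P = η·ρ·g·Q·H can be applied to the 80 ft developable head quoted above; note that the design flow Q and the efficiency in the Python sketch below are hypothetical placeholders, not figures from the study.

      rho, g = 1000.0, 9.81          # water density (kg/m^3), gravity (m/s^2)
      H = 80 * 0.3048                # developable head: 80 ft ~ 24.4 m
      Q = 10.0                       # HYPOTHETICAL design flow (m^3/s)
      eta = 0.85                     # assumed turbine-generator efficiency
      P = eta * rho * g * Q * H      # hydraulic power in watts
      print(f"P ~ {P / 1e6:.1f} MW") # ~2.0 MW under these assumptions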

  10. Expanding access to primary care without additional budgets? A case study from Burkina Faso.

    PubMed

    Marschall, Paul; Flessa, Steffen

    2008-11-01

    The aim of this study is to demonstrate the impact of increased access to primary care on provider costs in the rural health district of Nouna, Burkina Faso. This study question is crucial for health care planning in this district, as other research work shows that the population has a higher need for health care services. From a public health perspective, an increase of utilisation of first-line health facilities would be necessary. However, the governmental budget that is needed to finance improved access was not known. The study is based on data of 2004 of a comprehensive provider cost information system. This database provides us with the actual costs of each primary health care facility (Centre de Santé et de Promotion Sociale, CSPS) in the health district. We determine the fixed and variable costs of each institution and calculate the average cost per service unit rendered in 2004. Based on the cost structure of each CSPS, we calculate the total costs if the demand for health care services increased. We conclude that the total provider costs of primary care (and therefore the governmental budget) would hardly rise if the coverage of the population were increased. This is mainly due to the fact that the highest variable costs are drugs, which are fully paid for by the customers (Bamako Initiative). The majority of other costs are fixed. Consequently, health care reforms that improve access to health care institutions must not fear dramatically increasing the costs of health care services. PMID:18197447
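
    The cost logic of the argument, i.e. largely fixed provider costs plus a small publicly funded variable component, can be stated in a few lines; all numbers in this Python sketch are hypothetical, not the Nouna district data.

      fixed = 100_000.0     # HYPOTHETICAL annual fixed costs per facility
      var_public = 0.40     # HYPOTHETICAL publicly funded variable cost per visit
      # drugs, the largest variable cost, are paid by patients (Bamako Initiative)
      for visits in (10_000, 20_000):
          total = fixed + var_public * visits
          print(f"{visits:>6} visits -> provider cost {total:>9,.0f}")

    Doubling utilisation raises the government-funded total by under 4% in this toy setup, which mirrors the paper's conclusion that expanded coverage need not inflate the public budget.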

  11. A verification system of RMAP protocol controller

    NASA Astrophysics Data System (ADS)

    Khanov, V. Kh; Shakhmatov, A. V.; Chekmarev, S. A.

    2015-01-01

    The functional verification problem of IP blocks of an RMAP protocol controller is considered. The application of a verification method using fully functional models of the processor and the internal bus of a system-on-chip is justified. Principles of construction of a verification system based on the given approach are proposed. Practical results of creating a verification system for an IP block of an RMAP protocol controller are presented.

  12. Aerospace Nickel-cadmium Cell Verification

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.; Strawn, D. Michael; Hall, Stephen W.

    2001-01-01

    During the early years of satellites, NASA successfully flew "NASA-Standard" nickel-cadmium (Ni-Cd) cells manufactured by GE/Gates/SAFT on a variety of spacecraft. In 1992 a NASA Battery Review Board determined that the strategy of a NASA Standard Cell and Battery Specification and the accompanying NASA control of a standard manufacturing control document (MCD) for Ni-Cd cells and batteries was unwarranted. As a result of that determination, standards were abandoned and the use of cells other than the NASA Standard was required. In order to gain insight into the performance and characteristics of the various aerospace Ni-Cd products available, tasks were initiated within the NASA Aerospace Flight Battery Systems Program that involved the procurement and testing of representative aerospace Ni-Cd cell designs. A standard set of test conditions was established in order to provide similar information about the products from various vendors. The objective of this testing was to provide independent verification of representative commercial flight cells available in the marketplace today. This paper provides a summary of the verification tests run on cells from various manufacturers: Sanyo 35 ampere-hour (Ah) standard and 35 Ah advanced Ni-Cd cells, SAFT 50 Ah Ni-Cd cells, and Eagle-Picher 21 Ah Magnum and 21 Ah Super Ni-Cd(TM) cells were put through a full evaluation. A limited number of 18 and 55 Ah cells from Acme Electric were also tested to provide an initial evaluation of the Acme aerospace cell designs. Additionally, 35 Ah aerospace-design Ni-MH cells from Sanyo were evaluated under the standard conditions established for this program. The test program is essentially complete. The cell design parameters, the verification test plan, and the details of the test results are discussed.

  13. Spatial Evaluation and Verification of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2016-09-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against current observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m > 6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
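
    A minimal sketch of the power-law smoothing idea in Python/numpy, assuming a decay of the form (r + d)^(-q) with epicentral distance r; the exponent and offset below are illustrative choices, not the paper's calibrated ETAS parameters.

      import numpy as np

      def smoothed_rate_map(epicenters, grid, d=2.0, q=1.5):
          """Spread each event's unit rate over grid cells as (r + d)**(-q)."""
          rate = np.zeros(len(grid))
          for ex, ey in epicenters:
              r = np.hypot(grid[:, 0] - ex, grid[:, 1] - ey)
              w = (r + d) ** (-q)
              rate += w / w.sum()        # each event contributes total rate 1
          return rate

      grid = np.stack(np.meshgrid(np.arange(50.0), np.arange(50.0)),
                      axis=-1).reshape(-1, 2)          # 50 x 50 km cell centers
      events = np.array([[10.0, 10.0], [30.0, 35.0]])  # toy epicenters (km)
      print(smoothed_rate_map(events, grid).sum())     # ~2.0, total rate conserved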

  14. [Chewing gum as an additional agent in maintaining oral hygiene versus smoking status--preliminary study].

    PubMed

    Nakonieczna-Rudnicka, Marta; Strycharz-Dudziak, Małgorzata; Bachanek, Teresa

    2012-01-01

    Nowadays chewing gum is widely used in different age groups, so compliance with proper duration and frequency of chewing is an important factor influencing the state of the masticatory system. The study involved 112 dental students of the Medical University of Lublin. Everyday use of chewing gum was declared by 47.32% of respondents. A chewing time of up to 10 minutes was stated by 23.08% of respondents, and 11-20 minutes by 40.38%. Among the examined students, 17.3% smoked cigarettes. In the smokers group, 83.33% of those questioned chewed gum every day, compared with 43.37% among non-smokers. A chewing time shorter than 10 minutes was declared by 22.22% of smokers and 23.26% of non-smokers, while a chewing time of 11-20 minutes was declared by 27.78% and 44.35% of smokers and non-smokers, respectively. The results indicate the need for further studies on the influence of nicotine on saliva parameters with respect to the development of diseases of hard tooth tissues.

  15. Study on the interaction of the toxic food additive carmoisine with serum albumins: a microcalorimetric investigation.

    PubMed

    Basu, Anirban; Kumar, Gopinatha Suresh

    2014-05-30

    The interaction of the synthetic azo dye and food colorant carmoisine with human and bovine serum albumins was studied by microcalorimetric techniques. A complete thermodynamic profile of the interaction was obtained from isothermal titration calorimetry studies. The equilibrium constant of the complexation process was of the order of 10^6 M^-1 and the binding stoichiometry was found to be 1:1 with both serum albumins. The binding was driven by negative standard molar enthalpy and positive standard molar entropy contributions. The binding affinity was lower at higher salt concentrations in both cases, but was dominated by mostly non-electrostatic forces at all salt concentrations. The polyelectrolytic forces contributed only 5-8% of the total standard molar Gibbs energy change. The standard molar enthalpy change increased whereas the standard molar entropic contribution decreased with rising temperature, but they compensated each other to keep the standard molar Gibbs energy change almost invariant. The negative standard molar heat capacity values suggested the involvement of a significant hydrophobic contribution in the complexation process. Besides, an enthalpy-entropy compensation phenomenon was observed in both systems. The thermal stability of the serum proteins was found to be remarkably enhanced on binding to carmoisine. PMID:24742664

  17. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... Organization: MOD-025-2 (Verification and Data Reporting of Generator Real and Reactive Power Capability and Synchronous Condenser Reactive Power Capability), MOD- 026-1 (Verification of Models and Data for...

  18. Evaluation of LC-MS for the analysis of cleaning verification samples.

    PubMed

    Simmonds, Emma L; Lough, W John; Gray, Martin R

    2006-02-24

    The cleaning verification of pharmaceutical manufacturing equipment prior to further use is a cGMP requirement. Typically, relevant data are generated by HPLC with UV detection using methods individually developed and validated for each product. This work describes the use of HPLC with mass spectrometry to analyse cleaning verification samples, a novel means of utilising this analytical technology. The initial aim was to produce a single, generic method capable of quantifying a broad range of pharmaceuticals. Ultimately, however, a more effective strategy, in terms of efficiency and reliability, proved to be application of a well-defined approach to the rapid generation of compound specific methods. Results of studies to optimise the sample preparation for a basic compound in drug development (compound 1), together with experimental results for two further compounds are presented. These demonstrated that the combination of a well defined approach to chromatographic method development and mass spectrometric detection provided methodology with advantages in terms of sensitivity. Additionally, and by virtue of its potential for general applicability, the approach proposed has the potential to improve the overall efficiency with which methods for cleaning verification samples can be developed and applied.

  19. Shelf Life and Quality Study of Minced Tilapia with Nori and Hijiki Seaweeds as Natural Additives

    PubMed Central

    Ribeiro, Ingridy Simone; Shirahigue, Ligianne Din; Ferraz de Arruda Sucasas, Lia; Anbe, Lika; da Cruz, Pedro Gomes; Gallo, Cláudio Rosa; Carpes, Solange Teresinha; Marques, Marcos José; Oetterer, Marília

    2014-01-01

    The extraction of mechanically separated meat has emerged as an attractive process. However, it increases the incorporation of oxygen and, consequently, of off-flavors due to rancidity. Thus, preservatives must be added. The objective of this study was to evaluate the shelf life of minced tilapia in which synthetic preservatives were replaced with Hijiki and Nori seaweed extracts. The application of the extracts had no effect on the chemical composition of the minced tilapia. The seaweed extracts had an inhibitory effect on total volatile base nitrogen. The minced tilapia complied with the microbiological standard set by Brazilian law. The panelists detected no differences in the rancid aroma, and only minor differences were detected in the color of the products. It can be concluded that the minced tilapia with added seaweed extracts was within quality standards during frozen storage. PMID:25478593

  1. Mechanistic study of secondary organic aerosol components formed from nucleophilic addition reactions of methacrylic acid epoxide

    NASA Astrophysics Data System (ADS)

    Birdsall, A. W.; Miner, C. R.; Mael, L. E.; Elrod, M. J.

    2014-12-01

    Recently, methacrylic acid epoxide (MAE) has been proposed as a precursor to an important class of isoprene-derived compounds found in secondary organic aerosol (SOA): 2-methylglyceric acid (2-MG) and a set of oligomers, nitric acid esters, and sulfuric acid esters related to 2-MG. However, the specific chemical mechanisms by which MAE could form these compounds have not been previously studied with experimental methods. In order to determine the relevance of these processes to atmospheric aerosol, MAE and 2-MG have been synthesized and a series of bulk solution-phase experiments aimed at studying the reactivity of MAE using nuclear magnetic resonance (NMR) spectroscopy have been performed. The present results indicate that the acid-catalyzed MAE reaction is more than 600 times slower than a similar reaction of an important isoprene-derived epoxide, but is still expected to be kinetically feasible in the atmosphere on more acidic SOA. The specific mechanism by which MAE leads to oligomers was identified, and the reactions of MAE with a number of atmospherically relevant nucleophiles were also investigated. Because the nucleophilic strengths of water, sulfate, alcohols (including 2-MG), and acids (including MAE and 2-MG) in their reactions with MAE were found to be of similar magnitudes, it is expected that a diverse variety of MAE + nucleophile product species may be formed on ambient SOA. Thus, the results indicate that epoxide chain reaction oligomerization will be limited by the presence of high concentrations of non-epoxide nucleophiles (such as water); this finding is consistent with previous environmental chamber investigations of the relative humidity dependence of 2-MG-derived oligomerization processes and suggests that extensive oligomerization may not be likely on ambient SOA because of other competitive MAE reaction mechanisms.

  2. l-carnitine as a Potential Additive in Blood Storage Solutions: A Study on Erythrocytes.

    PubMed

    Soumya, R; Carl, H; Vani, R

    2016-09-01

    Erythrocytes undergo various changes during storage (the storage lesion) that in turn reduce their functioning and survival. Oxidative stress plays a major role in the storage lesion, and antioxidants can be used to combat this stress. This study elucidates the effects of l-carnitine (LC) on erythrocytes of stored blood. Blood was obtained from male Wistar rats and stored (4 °C) for 20 days in CPDA-1 (citrate phosphate dextrose adenine) solution. Samples were divided into (i) controls, (ii) LC 10 (l-carnitine at a concentration of 10 mM), (iii) LC 30 (l-carnitine at a concentration of 30 mM), and (iv) LC 60 (l-carnitine at a concentration of 60 mM). Every fifth day, the biomarkers (hemoglobin, hemolysis, antioxidant enzymes, lipid peroxidation and protein oxidation products) were analysed in erythrocytes. Changes in hemoglobin and protein sulfhydryls were insignificant during storage, indicating their maintenance in all groups. Superoxide dismutase and malondialdehyde levels increased initially and decreased towards the end of storage. The levels of catalase and glutathione peroxidase were lower in the experimental groups than in controls during storage, as l-carnitine assisted the enzymes by scavenging the reactive oxygen species produced. Hemolysis increased in all groups with storage, showing that l-carnitine could not completely protect lipids and proteins from oxidative stress. Hence, this study opens up new avenues for using l-carnitine as a component of storage solutions, in combination with other antioxidants, in order to maintain the efficacy of erythrocytes.

  3. Biological effect of food additive titanium dioxide nanoparticles on intestine: an in vitro study.

    PubMed

    Song, Zheng-Mei; Chen, Ni; Liu, Jia-Hui; Tang, Huan; Deng, Xiaoyong; Xi, Wen-Song; Han, Kai; Cao, Aoneng; Liu, Yuanfang; Wang, Haifang

    2015-10-01

    Titanium dioxide nanoparticles (TiO2 NPs) are widely found in food-related consumer products. Understanding the effect of TiO2 NPs on the intestinal barrier and absorption is essential for the safety assessment of orally administered TiO2 NPs. In this study, the cytotoxicity and translocation of two native TiO2 NPs, and of the same TiO2 NPs pretreated with digestion simulation fluid or bovine serum albumin, were investigated in undifferentiated Caco-2 cells, differentiated Caco-2 cells and Caco-2 monolayers. TiO2 NPs at concentrations below 200 µg ml^-1 did not induce any toxicity in differentiated cells or the Caco-2 monolayer after 24 h exposure. However, TiO2 NPs pretreated with digestion simulation fluids at 200 µg ml^-1 inhibited the growth of undifferentiated Caco-2 cells. Undifferentiated Caco-2 cells took up native TiO2 NPs easily, but not pretreated NPs, implying that the protein coating on the NPs impeded cellular uptake. Compared with undifferentiated cells, differentiated ones possessed much lower uptake of these TiO2 NPs. Similarly, traversal of TiO2 NPs through the Caco-2 monolayer was also negligible. Therefore, we infer that the possibility of TiO2 NPs traversing the intestine of animals or humans after oral intake is quite low. This study provides valuable information for the risk assessment of TiO2 NPs in food.

  4. Ozone Monitoring Instrument geolocation verification

    NASA Astrophysics Data System (ADS)

    Kroon, M.; Dobber, M. R.; Dirksen, R.; Veefkind, J. P.; van den Oord, G. H. J.; Levelt, P. F.

    2008-08-01

    Verification of the geolocation assigned to individual ground pixels as measured by the Ozone Monitoring Instrument (OMI) aboard the NASA EOS-Aura satellite was performed by comparing geophysical Earth surface details as observed in OMI false color images with the high-resolution continental outline vector map provided by the Interactive Data Language (IDL) software tool from ITT Visual Information Solutions. The OMI false color images are generated from the OMI visible channel by integration over 20-nm-wide spectral bands of the Earth radiance intensity around 484 nm, 420 nm, and 360 nm wavelength per ground pixel. Proportional to the integrated intensity, we assign color values composed of CRT standard red, green, and blue to the OMI ground pixels. The Earth surface details studied are mostly high-contrast coastlines where arid land or desert meets deep blue ocean. The IDL high-resolution vector map is based on the 1993 CIA World Database II Map with a 1-km accuracy. Our results indicate that the average OMI geolocation offset over the years 2005-2006 is 0.79 km in latitude and 0.29 km in longitude, with standard deviations of 1.64 km in latitude and 2.04 km in longitude, respectively. Relative to the OMI nadir pixel size, one obtains mean displacements of ~6.1% in latitude and ~1.2% in longitude, with standard deviations of 12.6% and 7.9%, respectively. We conclude that the geolocation assigned to individual OMI ground pixels is sufficiently accurate to support scientific studies of atmospheric features as observed in OMI level 2 satellite data products, such as air quality issues on urban scales or volcanic eruptions and their plumes, which occur on spatial scales comparable to or smaller than OMI nadir pixels.
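
    As a worked illustration of the statistics quoted above, here is a minimal sketch that converts per-image geolocation offsets (in km) into means, standard deviations, and fractions of the OMI nadir pixel footprint. The offset samples are synthetic, and the roughly 13 km x 24 km nadir pixel dimensions are the nominal values implied by the reported percentages, not taken from the paper directly.

```python
# Minimal sketch of the offset statistics reported above: mean and standard
# deviation of per-image geolocation offsets, in km and as a fraction of the
# OMI nadir pixel (assumed ~13 km x 24 km). Offset samples are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
lat_offset_km = rng.normal(0.79, 1.64, size=500)  # stand-in measurements
lon_offset_km = rng.normal(0.29, 2.04, size=500)

PIXEL_ALONG_TRACK_KM, PIXEL_CROSS_TRACK_KM = 13.0, 24.0

for name, off, size in [("latitude", lat_offset_km, PIXEL_ALONG_TRACK_KM),
                        ("longitude", lon_offset_km, PIXEL_CROSS_TRACK_KM)]:
    print(f"{name}: mean {off.mean():+.2f} km ({100 * off.mean() / size:.1f}% of pixel), "
          f"std {off.std(ddof=1):.2f} km ({100 * off.std(ddof=1) / size:.1f}% of pixel)")
```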

  5. A digital process for additive manufacturing of occlusal splints: a clinical pilot study

    PubMed Central

    Salmi, Mika; Paloheimo, Kaija-Stiina; Tuomi, Jukka; Ingman, Tuula; Mäkitie, Antti

    2013-01-01

    The aim of this study was to develop and evaluate a digital process for the manufacturing of occlusal splints. An alginate impression was taken from the upper and lower jaws of a patient with temporomandibular disorder owing to cross bite and wear of the teeth, and then digitized using a table laser scanner. The scanned model was repaired using the 3Data Expert software, and a splint was designed with the Viscam RP software. A splint was manufactured from a biocompatible liquid photopolymer by stereolithography. The system employed in the process was an SLA 350. The splint was worn nightly for six months. The patient adapted to the splint well and found it comfortable to use. The splint relieved tension in the patient's bite muscles. No sign of tooth wear or significant splint wear was detected after six months of testing. Modern digital technology enables us to manufacture clinically functional occlusal splints, which might reduce costs, dental technician working time and chair-side time. Maximum dimensional errors of approximately 1 mm were found at thin walls and sharp corners of the splint when compared with the digital model. PMID:23614943

  6. Mass analysis addition to the Differential Ion Flux Probe (DIFP) study

    NASA Technical Reports Server (NTRS)

    Wright, K. H., Jr.; Jolley, Richard

    1994-01-01

    The objective of this study is to develop a technique to measure the characteristics of space plasmas under highly disturbed conditions; e.g., non-Maxwellian plasmas with strong drifting populations and plasmas contaminated by spacecraft outgassing. The approach, conducted in conjunction with current MSFC activities, is to extend the capabilities of the Differential Ion Flux Probe (DIFP) to include a high throughput mass measurement that does not require either high voltage or contamination sensitive devices such as channeltron electron multipliers or microchannel plates. This will significantly reduce the complexity and expense of instrument fabrication, testing, and integration of flight hardware compared to classical mass analyzers. The feasibility of the enhanced DIFP has been verified by using breadboard test models in a controlled plasma environment. The ability to manipulate particles through the instrument regardless of incident angle, energy, or ionic component has been amply demonstrated. The energy analysis mode is differential and leads directly to a time-of-flight mass measurement. With the new design, the DIFP will separate multiple ion streams and analyze each stream independently for ion flux intensity, velocity (including direction of motion), mass, and temperature (or energy distribution). In particular, such an instrument will be invaluable on follow-on electrodynamic TSS missions and, possibly, for environmental monitoring on the space station.
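
    Since the abstract notes that the differential energy analysis "leads directly to a time-of-flight mass measurement", a minimal sketch of that relation may help: for a singly charged ion of known energy E drifting over a fixed length L, the arrival time t gives m = 2E(t/L)^2. The function and the numbers below are illustrative, not the DIFP flight implementation.

```python
# Minimal sketch of a time-of-flight mass determination: with the ion energy E
# known from the differential energy analysis and the drift length L fixed,
# the flight time t gives the ion mass via m = 2 * E * (t / L)**2.
# Values below are hypothetical.
AMU = 1.66053906660e-27  # kg per atomic mass unit
EV = 1.602176634e-19     # J per electron volt

def ion_mass_amu(energy_ev: float, drift_length_m: float, flight_time_s: float) -> float:
    """Mass (amu) of a singly charged ion from its time of flight."""
    energy_j = energy_ev * EV
    mass_kg = 2.0 * energy_j * (flight_time_s / drift_length_m) ** 2
    return mass_kg / AMU

# A 5 eV O+ ion over a 10 cm drift length arrives after ~12.9 microseconds:
print(ion_mass_amu(energy_ev=5.0, drift_length_m=0.10, flight_time_s=12.9e-6))  # ~16 amu
```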

  7. Value addition of Palmyra palm and studies on the storage life.

    PubMed

    Chaurasiya, A K; Chakraborty, I; Saha, J

    2014-04-01

    Palmyra palm (Borassus flabellifer L.), belonging to the family Palmae, is referred to as the tree of life, with several uses including food, beverage, fibre, medicine and timber. Unfortunately, the nutritionally enriched pulp of the ripened palm has limited commercial use. Extraction of pulp was accomplished using water and heat to ensure maximum pulp recovery. Different recipes were tried for the preparation of two uncommon value-added products, palm spread and palm toffee. On the basis of biochemical composition, organoleptic scores, microbial estimation and a storage study under both ambient and refrigerated conditions, the most acceptable recipe was selected. Gradual increases in total soluble solids (TSS), total sugar and reducing sugar, and decreases in ascorbic acid, pH, β-carotene and protein content of the processed products, were observed irrespective of storage condition. The results obtained from sensory evaluation and microbial status revealed that palm spread and toffee remained acceptable for up to 9 months and 8 months, respectively, at ambient temperature. The income per rupee invested in these two products was found to be remunerative. PMID:24741173

  8. The science verification of FLAMES

    NASA Astrophysics Data System (ADS)

    Primas, Francesca

    2003-06-01

    After a new VLT instrument has been commissioned and thoroughly tested, a series of scientific and technical checkups are scheduled in order to test the front-to-end operations chain before the official start of regular operations. Technically speaking, these are the so-called Dry Runs, part of which are usually devoted to the Science Verification (SV for short) of that specific instrument. A Science Verification programme includes a set of typical scientific observations with the aim of verifying and demonstrating to the community the capabilities of a new instrument in the operational framework of the VLT Paranal Observatory. Though manifold, its goals can be summarised in two main points. From the scientific point of view, by demonstrating the scientific potential of the new instrument, these observations provide ESO users with first science-grade data, thus fostering an early scientific return. From the technical point of view, by testing the whole operational system (from the preparation of the observations to their execution and analysis), they provide important feedback to the Instrument Operation Teams (both in Paranal and in Garching), to the Instrument Division, and to the Data Flow groups. More details about the concept(s) behind a Science Verification can be found in the "Science Verification Policy and Procedures" document (available at http://www.eso.org/science/vltsv/).

  9. Towards the formal verification of the requirements and design of a processor interface unit: HOL listings

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    This technical report contains the Higher-Order Logic (HOL) listings of the partial verification of the requirements and design for a commercially developed processor interface unit (PIU). The PIU is an interface chip performing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault-tolerant computer system. This system, the Fault Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. This report contains the actual HOL listings of the PIU verification as it currently exists. Section 2 of this report contains general-purpose HOL theories and definitions that support the PIU verification, including arithmetic theories dealing with inequalities and associativity, and a collection of tactics used in the PIU proofs. Section 3 contains the HOL listings for the completed PIU design verification. Section 4 contains the HOL listings for the partial requirements verification of the P-Port.

  10. Mapping 15O production rate for proton therapy verification

    PubMed Central

    Grogg, Kira; Alpert, Nathaniel M.; Zhu, Xuping; Min, Chul Hee; Testa, Mauro; Winey, Brian; Normandin, Marc D.; Shih, Helen A.; Paganetti, Harald; Bortfeld, Thomas; El Fakhri, Georges

    2015-01-01

    Purpose: This is a proof-of-principle study for the evaluation of 15O production as an imaging target, through the use of positron emission tomography (PET), to improve verification of proton treatment plans and study the effects of perfusion. Methods and Materials: Dynamic PET measurements of irradiation-produced isotopes were taken for a phantom and rabbit thigh muscles. The rabbit muscle was irradiated and imaged in both live and dead conditions. A differential equation was fitted to the phantom and the in vivo data, yielding estimates of the 15O production and clearance rates, which were compared between live and dead conditions for the rabbit, and to Monte Carlo (MC) predictions. Results: PET clearance rates agreed with the decay constants of the dominant radionuclide species in three different phantom materials. In two oxygen-rich materials, the ratio of 15O production rates agreed with the MC prediction. In the dead rabbit thighs, the dynamic PET concentration histories were accurately described using the 15O decay constant, while the live thigh activity decayed faster. Most importantly, the 15O production rates agreed within 2% (p > 0.5) between conditions. Conclusion: We developed a new method for quantitative measurement of 15O production and clearance rates in the period immediately following proton therapy. Measurements in the phantom and rabbits were well described in terms of 15O production and clearance rates, plus a correction for other isotopes. These proof-of-principle results support the feasibility of detailed verification of proton therapy treatment delivery. In addition, 15O clearance rates may be useful in monitoring permeability changes due to therapy. PMID:25817530

  11. Mapping 15O Production Rate for Proton Therapy Verification

    SciTech Connect

    Grogg, Kira; Alpert, Nathaniel M.; Zhu, Xuping; Min, Chul Hee; Testa, Mauro; Winey, Brian; Normandin, Marc D.; Shih, Helen A.; Paganetti, Harald; Bortfeld, Thomas; El Fakhri, Georges

    2015-06-01

    Purpose: This work was a proof-of-principle study for the evaluation of oxygen-15 (15O) production as an imaging target through the use of positron emission tomography (PET), to improve verification of proton treatment plans and to study the effects of perfusion. Methods and Materials: Dynamic PET measurements of irradiation-produced isotopes were made for a phantom and rabbit thigh muscles. The rabbit muscle was irradiated and imaged under both live and dead conditions. A differential equation was fitted to phantom and in vivo data, yielding estimates of 15O production and clearance rates, which were compared between live and dead conditions for the rabbit and to Monte Carlo predictions. Results: PET clearance rates agreed with decay constants of the dominant radionuclide species in 3 different phantom materials. In 2 oxygen-rich materials, the ratio of 15O production rates agreed with the expected ratio. In the dead rabbit thighs, the dynamic PET concentration histories were accurately described using the 15O decay constant, whereas the live thigh activity decayed faster. Most importantly, the 15O production rates agreed within 2% (P > .5) between conditions. Conclusions: We developed a new method for quantitative measurement of 15O production and clearance rates in the period immediately following proton therapy. Measurements in the phantom and rabbits were well described in terms of 15O production and clearance rates, plus a correction for other isotopes. These proof-of-principle results support the feasibility of detailed verification of proton therapy treatment delivery. In addition, 15O clearance rates may be useful in monitoring permeability changes due to therapy.

  12. Verification and validation of control system software

    SciTech Connect

    Munro, J.K., Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs.

  13. CTBT integrated verification system evaluation model supplement

    SciTech Connect

    Edenburn, Michael W.; Bunting, Marcus; Payne, Arthur C., Jr.; Trost, Lawrence C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
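
    One way to picture the "integrated system" estimate such a model produces is the independence approximation sketched below: if each sensor technology detects a given event with its own probability, the integrated probability of detection is one minus the product of the miss probabilities. This is a minimal sketch with hypothetical subsystem probabilities, not IVSEM's actual model.

```python
# Minimal sketch of integrating per-technology detection probabilities:
# P(detect) = 1 - prod(1 - p_i), assuming the subsystems detect independently.
# The subsystem probabilities below are hypothetical, not IVSEM outputs.
def integrated_detection(p_subsystems):
    p_miss = 1.0
    for p in p_subsystems:
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

p = {"seismic": 0.80, "infrasound": 0.35, "radionuclide": 0.50, "hydroacoustic": 0.10}
print(f"Integrated P(detect) = {integrated_detection(p.values()):.3f}")  # ~0.941
```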

  14. Genetic association studies in complex disease: disentangling additional predisposing loci from associated neutral loci using a constrained-permutation approach.

    PubMed

    Spijker, G T; Nolte, I M; Jansen, R C; Te Meerman, G J

    2005-01-01

    In the process of genetically mapping a complex disease, the question may arise whether a certain polymorphism is the only causal variant in a region. A number of methods can answer this question, but unfortunately these methods are optimal for bi-allelic loci only. We wanted to develop a method better suited for multi-allelic loci, such as microsatellite markers. We propose the Additional Disease Loci Test (ADLT): the alleles at an additional locus are permuted within the subsample of haplotypes that have identical alleles at the predisposing locus. The hypothesis being tested is whether the predisposing locus is the sole factor predisposing to the trait that is in LD with the additional locus under study. We applied ADLT to simulated datasets and a published dataset on Type 1 Diabetes, genotyped for microsatellite markers in the HLA region. The method showed the expected number of false-positive results in the absence of additional loci, but proved to be more powerful than existing methods in the presence of additional disease loci. ADLT was especially superior in datasets with less LD or with multiple predisposing alleles. We conclude that ADLT can be useful in identifying additional disease loci.
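
    The constrained-permutation idea is easy to sketch: alleles at the additional locus are shuffled only within strata of haplotypes sharing the same allele at the known predisposing locus, so any surviving case/control association must reflect an additional locus. The statistic and data layout below are simplified stand-ins, not the published ADLT implementation.

```python
# Minimal sketch of a constrained permutation test: permute additional-locus
# alleles only within strata defined by the predisposing-locus allele.
# The association statistic here is a simplified stand-in.
import numpy as np

def adlt_pvalue(predisposing, additional, case_status, n_perm=10_000, seed=0):
    rng = np.random.default_rng(seed)

    def statistic(alleles):
        # difference in mean allele coding between cases and controls
        return abs(alleles[case_status == 1].mean() - alleles[case_status == 0].mean())

    observed = statistic(additional)
    exceed = 0
    for _ in range(n_perm):
        permuted = additional.copy()
        for allele in np.unique(predisposing):          # permute within each stratum
            idx = np.where(predisposing == allele)[0]
            permuted[idx] = permuted[rng.permutation(idx)]
        exceed += statistic(permuted) >= observed
    return (exceed + 1) / (n_perm + 1)

# Toy haplotype data: predisposing-locus allele, additional-locus allele, case status
pred = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0])
addl = np.array([1, 1, 0, 0, 1, 0, 1, 0, 1, 0], dtype=float)
cases = np.array([1, 1, 0, 0, 1, 0, 1, 0, 1, 0])
print(adlt_pvalue(pred, addl, cases, n_perm=2000))
```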

  15. MACCS2 development and verification efforts

    SciTech Connect

    Young, M.; Chanin, D.

    1997-03-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. The MACCS2 version 1.10 beta test was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the σy and σz plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses.

  16. Effect of Additives on Green Sand Molding Properties using Design of Experiments and Taguchi's Quality Loss Function - An Experimental Study

    NASA Astrophysics Data System (ADS)

    Desai, Bhagyashree; Mokashi, Pavani; Anand, R. L.; Burli, S. B.; Khandal, S. V.

    2016-09-01

    This experimental study investigates the effect of various additives on green sand molding properties, since a particular combination of additives can yield the desired sand properties. The input parameters (factors) selected were water and powder (fly ash, coconut shell and tamarind) at three levels. Experiments were planned using design of experiments (DOE). On the basis of these plans, experiments were conducted to understand the behavior of sand mould properties such as compression strength, shear strength and permeability number with the various additives. From the experimental results it could be concluded that the factors have a significant effect on the sand properties, as the P-value was found to be less than 0.05 for all the cases studied. An optimization based on the quality loss function was also performed. The study revealed that the quality loss associated with tamarind powder was lower than for the other additives selected for the study. The optimization based on the quality loss function and the parametric analysis using ANOVA suggested that tamarind powder at 8 g per kg of molding sand and a moisture content of 7% yield properties better suited to obtaining sound castings.
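
    For readers unfamiliar with Taguchi's quality loss function, the sketch below shows the standard nominal-the-best form, L = k(s^2 + (ybar - m)^2), applied to hypothetical green-strength measurements for the three powders. The target, loss coefficient k, and data are invented for illustration and are not the paper's values.

```python
# Minimal sketch of Taguchi's nominal-the-best quality loss:
# L = k * (s^2 + (ybar - target)^2). All numbers are hypothetical.
import statistics

def expected_quality_loss(measurements, target, k=1.0):
    ybar = statistics.fmean(measurements)
    s2 = statistics.pvariance(measurements)   # spread about the mean
    return k * (s2 + (ybar - target) ** 2)    # spread plus off-target penalty

green_strength = {"fly_ash": [48, 52, 50],          # e.g., kPa, hypothetical
                  "coconut_shell": [45, 55, 47],
                  "tamarind": [49, 51, 50]}
for additive, y in green_strength.items():
    print(additive, round(expected_quality_loss(y, target=50.0), 2))
# Lowest expected loss flags the preferred additive (tamarind in this toy data).
```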

  17. A Pilot Study to Examine the Effect of Additional Structured Outdoor Playtime on Preschoolers' Physical Activity Levels

    ERIC Educational Resources Information Center

    Alhassan, Sofiya; Nwaokelemeh, Ogechi; Lyden, Kate; Goldsby, TaShauna; Mendoza, Albert

    2013-01-01

    The impact of additional structured outdoor playtime on preschoolers' physical activity (PA) levels is unclear. The purpose of this pilot study was to explore the effects of increasing structured outdoor playtime on preschoolers' PA levels. Eight full-day classrooms (n = 134 children) from two preschool programmes were randomised into a treatment…

  18. STUDY OF THE EFFECT OF CHLORINE ADDITION ON MERCURY OXIDATION BY SCR CATALYST UNDER SIMULATED SUBBITUMINOUS COAL FLUE GAS

    EPA Science Inventory

    An entrained flow reactor is used to study the effect of addition of chlorine-containing species on the oxidation of elemental mercury (Hg0) by a selective catalytic reduction (SCR) catalyst in simulated subbituminous coal combustion flue gas. The combustion flue gas was doped with…

  19. A comparison of software verification techniques

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A controlled experiment performed by the Software Engineering Laboratory (SEL) to compare the effectiveness of code reading, functional testing, and structural testing as software verification techniques is described. The experiment results indicate that code reading provides the greatest error detection capability at the lowest cost, whereas structural testing is the least effective technique. The experiment plan is explained, the experiment results are described, and related results from other studies are discussed. The application of these results to the development of software in the flight dynamics environment is considered. Appendices summarize the experiment data and list the test programs.

  1. A review of studies on the effects of ultraviolet irradiation on the resistance to infections: evidence from rodent infection models and verification by experimental and observational human studies.

    PubMed

    Termorshuizen, F; Garssen, J; Norval, M; Koulu, L; Laihia, J; Leino, L; Jansen, C T; De Gruijl, F; Gibbs, N K; De Simone, C; Van Loveren, H

    2002-02-01

    Recent studies on the immunosuppressive effects of ultraviolet radiation (UVR) and the related resistance to infections in rodents and humans are presented. The waveband dependency of trans-to-cis isomerisation of urocanic acid in the stratum corneum and the role of DNA damage in UVR-induced erythema and immunosuppression were investigated to further elucidate the underlying mechanisms. Furthermore, human experimental studies on UVR-induced immunomodulation were performed. It appeared that the doses needed to suppress various immune parameters in humans (e.g. NK activity, contact hypersensitivity) were higher than those needed in experiments in rodents. Still, extrapolation of experimental animal data to the human situation showed that UVR may impair the resistance to different systemic infections at relevant outdoor doses. In observational human studies we aimed to substantiate the relevance of UVR for infections in humans. It was shown that sunny season was associated with a slightly retarded but clinically non-relevant antibody response to hepatitis B vaccination. Furthermore, sunny season appeared to be associated with a small decline in the number of CD4+ T-helper cells in a cohort of HIV-infected persons and a higher recurrence of herpes simplex and herpes zoster in a cohort of renal transplant recipients. However, in a study among young children a higher exposure to solar UVR was associated with a lower occurrence of upper respiratory tract symptoms. As disentangling the effects of UVR from other relevant factors is often impossible in observational studies, concise quantitative risk estimations for the human situation cannot be given at present. PMID:11811930

  2. Mechanistic and computational studies of the atom transfer radical addition of CCl4 to styrene catalyzed by copper homoscorpionate complexes.

    PubMed

    Muñoz-Molina, José María; Sameera, W M C; Álvarez, Eleuterio; Maseras, Feliu; Belderrain, Tomás R; Pérez, Pedro J

    2011-03-21

    Experimental as well as theoretical studies have been carried out with the aim of elucidating the mechanism of the atom transfer radical addition (ATRA) of styrene and carbon tetrachloride with a Tp(x)Cu(NCMe) complex as the catalyst precursor (Tp(x) = hydrotris(pyrazolyl)borate ligand). The studies shown herein demonstrate the effect of different variables on the kinetic behavior. A mechanistic proposal consistent with the theoretical and experimental data is presented.

  3. Fusion strategies for boosting cancelable online signature verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    2010-04-01

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. As one solution, we proposed a scheme to enhance the performance of a cancelable approach for online signature verification by combining scores calculated from two transformed datasets generated using two keys. Generally, in cancelable approaches the same verification algorithm is used for transformed data as for raw (non-transformed) data, and in our previous work a verification system developed for a non-transformed dataset was used to calculate the scores from transformed data. In this paper, we modify the verification system by using transformed data for training. Several experiments were performed using public databases, and the experimental results show that the modification of the verification system improved performance. Our cancelable system combines two scores to make a decision. Several fusion strategies are also considered, and the experimental results are reported here.
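
    The score-level fusion described here can be sketched in a few lines: two verification scores, each computed from data transformed under a different key, are combined before thresholding. A weighted sum is only one of several possible fusion rules; the weights and threshold below are hypothetical.

```python
# Minimal sketch of score-level fusion for a cancelable verification scheme:
# two scores from two key-transformed datasets are fused by a weighted sum
# before thresholding. Weights and threshold are hypothetical.
def fused_decision(score_key1: float, score_key2: float,
                   w1: float = 0.5, w2: float = 0.5,
                   threshold: float = 0.6) -> bool:
    """Accept the signature if the fused score clears the threshold."""
    fused = w1 * score_key1 + w2 * score_key2
    return fused >= threshold

# A genuine attempt scoring well under both transforms is accepted:
print(fused_decision(0.72, 0.68))   # True  (fused 0.70)
# An attempt matching under only one transform is rejected:
print(fused_decision(0.71, 0.30))   # False (fused 0.505)
```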

  4. Imaging of prompt gamma rays emitted during delivery of clinical proton beams with a Compton camera: feasibility studies for range verification

    NASA Astrophysics Data System (ADS)

    Polf, Jerimy C.; Avery, Stephen; Mackin, Dennis S.; Beddar, Sam

    2015-09-01

    The purpose of this paper is to evaluate the ability of a prototype Compton camera (CC) to measure prompt gamma rays (PG) emitted during delivery of clinical proton pencil beams for prompt gamma imaging (PGI), as a means of providing in vivo verification of the delivered proton radiotherapy beams. A water phantom was irradiated with clinical 114 MeV and 150 MeV proton pencil beams. Up to 500 cGy of dose was delivered per irradiation using clinical beam currents. The prototype CC was placed 15 cm from the beam central axis, and PGs from 0.2 MeV up to 6.5 MeV were measured during irradiation. From the measured data, 2D images of the PG emission were reconstructed, and 1D profiles were extracted from the PG images and compared to measured depth dose curves of the delivered proton pencil beams. The CC was able to measure PG emission during delivery of both 114 MeV and 150 MeV proton beams at clinical beam currents. 2D images of the PG emission were reconstructed for single 150 MeV proton pencil beams as well as for a 5 × 5 cm mono-energetic layer of 114 MeV pencil beams. Shifts in the Bragg peak (BP) range were detectable on the 2D images. 1D profiles extracted from the PG images show that the distal falloff of the PG emission profile lined up well with the distal BP falloff. Shifts as small as 3 mm in the beam range could be detected from the 1D PG profiles with an accuracy of 1.5 mm or better. However, with the current CC prototype, a dose of 400 cGy was required to acquire an adequate PG signal for 2D PG image reconstruction. It was possible to measure PG interactions with our prototype CC during delivery of proton pencil beams at clinical dose rates. Images of the PG emission could be reconstructed and shifts in the BP range were detectable. Therefore, PGI with a CC for in vivo range verification during proton treatment delivery is feasible. However, improvements in the prototype CC detection efficiency and reconstruction algorithms are necessary.

  5. Imaging of prompt gamma rays emitted during delivery of clinical proton beams with a Compton camera: feasibility studies for range verification.

    PubMed

    Polf, Jerimy C; Avery, Stephen; Mackin, Dennis S; Beddar, Sam

    2015-09-21

    The purpose of this paper is to evaluate the ability of a prototype Compton camera (CC) to measure prompt gamma rays (PG) emitted during delivery of clinical proton pencil beams for prompt gamma imaging (PGI), as a means of providing in vivo verification of the delivered proton radiotherapy beams. A water phantom was irradiated with clinical 114 MeV and 150 MeV proton pencil beams. Up to 500 cGy of dose was delivered per irradiation using clinical beam currents. The prototype CC was placed 15 cm from the beam central axis, and PGs from 0.2 MeV up to 6.5 MeV were measured during irradiation. From the measured data, 2D images of the PG emission were reconstructed, and 1D profiles were extracted from the PG images and compared to measured depth dose curves of the delivered proton pencil beams. The CC was able to measure PG emission during delivery of both 114 MeV and 150 MeV proton beams at clinical beam currents. 2D images of the PG emission were reconstructed for single 150 MeV proton pencil beams as well as for a 5 × 5 cm mono-energetic layer of 114 MeV pencil beams. Shifts in the Bragg peak (BP) range were detectable on the 2D images. 1D profiles extracted from the PG images show that the distal falloff of the PG emission profile lined up well with the distal BP falloff. Shifts as small as 3 mm in the beam range could be detected from the 1D PG profiles with an accuracy of 1.5 mm or better. However, with the current CC prototype, a dose of 400 cGy was required to acquire an adequate PG signal for 2D PG image reconstruction. It was possible to measure PG interactions with our prototype CC during delivery of proton pencil beams at clinical dose rates. Images of the PG emission could be reconstructed and shifts in the BP range were detectable. Therefore, PGI with a CC for in vivo range verification during proton treatment delivery is feasible. However, improvements in the prototype CC detection efficiency and reconstruction algorithms are necessary.
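
    A minimal sketch of how a range shift can be read off a 1D prompt-gamma profile like those described in these two records: locate the depth at which the distal edge crosses 50% of the profile maximum, then difference that position between a reference and a measured delivery. The sigmoid profiles below are idealized stand-ins for reconstructed PG data, not the authors' reconstruction.

```python
# Minimal sketch: estimate a beam range shift from the distal 50% falloff
# position of a 1D prompt-gamma profile. Profiles are idealized stand-ins.
import numpy as np

def distal_50pct_depth(depth_mm, profile):
    """Depth (mm) where the distal falloff crosses 50% of the profile maximum."""
    peak = np.argmax(profile)
    half = 0.5 * profile[peak]
    distal = profile[peak:]
    i = np.argmax(distal <= half)                   # first distal bin at/below 50%
    d0, d1 = depth_mm[peak + i - 1], depth_mm[peak + i]
    p0, p1 = profile[peak + i - 1], profile[peak + i]
    return d0 + (half - p0) * (d1 - d0) / (p1 - p0)  # linear interpolation

depth = np.linspace(0.0, 200.0, 401)
reference = 1.0 / (1.0 + np.exp((depth - 157.0) / 2.0))  # planned falloff at 157 mm
measured = 1.0 / (1.0 + np.exp((depth - 154.0) / 2.0))   # delivery undershooting 3 mm
shift = distal_50pct_depth(depth, reference) - distal_50pct_depth(depth, measured)
print(f"Detected range shift: {shift:.1f} mm")           # ~3.0 mm
```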

  6. Unravelling the impact of hydrocarbon structure on the fumarate addition mechanism--a gas-phase ab initio study.

    PubMed

    Bharadwaj, Vivek S; Vyas, Shubham; Villano, Stephanie M; Maupin, C Mark; Dean, Anthony M

    2015-02-14

    The fumarate addition reaction mechanism is central to the anaerobic biodegradation pathway of various hydrocarbons, both aromatic (e.g., toluene, ethylbenzene) and aliphatic (e.g., n-hexane, dodecane). Succinate synthase enzymes, which belong to the glycyl radical enzyme family, are the main facilitators of these biochemical reactions. The overall catalytic mechanism that converts hydrocarbons to a succinate molecule involves three steps: (1) initial H-abstraction from the hydrocarbon by the radical enzyme, (2) addition of the resulting hydrocarbon radical to fumarate, and (3) hydrogen abstraction by the addition product to regenerate the radical enzyme. Since the biodegradation of hydrocarbon fuels via the fumarate addition mechanism is linked to bio-corrosion, an improved understanding of this reaction is imperative to our efforts of predicting the susceptibility of proposed alternative fuels to biodegradation. An improved understanding of the fuel biodegradation process also has the potential to benefit bioremediation. In this study, we consider model aromatic (toluene) and aliphatic (butane) compounds to evaluate the impact of hydrocarbon structure on the energetics and kinetics of the fumarate addition mechanism by means of high-level ab initio gas-phase calculations. We predict that the rate of toluene degradation is ~100 times faster than that of butane at 298 K, and that the first abstraction step is kinetically significant for both hydrocarbons, which is consistent with deuterium isotope effect studies on toluene degradation. The detailed computations also show that the predicted stereochemical preference of the succinate products for both toluene and butane is due to the differences in the radical addition rate constants for the various isomers. The computational and kinetic modeling work presented here demonstrates the importance of considering pre-reaction and product complexes in order to accurately treat gas-phase systems that involve intra- and intermolecular interactions.

  7. Enantioselective conjugate addition of nitro compounds to α,β-unsaturated ketones: an experimental and computational study.

    PubMed

    Manzano, Rubén; Andrés, José M; Álvarez, Rosana; Muruzábal, María D; de Lera, Ángel R; Pedrosa, Rafael

    2011-05-16

    A series of chiral thioureas derived from easily available diamines, prepared from α-amino acids, have been tested as catalysts in the enantioselective Michael additions of nitroalkanes to α,β-unsaturated ketones. The best results are obtained with the bifunctional catalyst prepared from L-valine. This thiourea promotes the reaction with high enantioselectivities and chemical yields for aryl/vinyl ketones, but the enantiomeric ratio for alkyl/vinyl derivatives is very modest. The addition of substituted nitromethanes led to the corresponding adducts with excellent enantioselectivity but very poor diastereoselectivity. Evidence for the isomerization of the addition products has been obtained from the reaction of chalcone with [D3]nitromethane, which shows that the final addition products epimerize under the reaction conditions. The epimerization explains the low diastereoselectivity observed in the formation of adducts with two adjacent tertiary stereocenters. Density functional studies of the transition structures corresponding to two alternative activation modes of the nitroalkanes and α,β-unsaturated ketones by the bifunctional organocatalyst have been carried out at the B3LYP/3-21G* level. The computations are consistent with a reaction model involving the Michael addition of the thiourea-activated nitronate to the ketone activated by the protonated amine of the organocatalyst. The enantioselectivities predicted by the computations are consistent with the experimental values obtained for aryl- and alkyl-substituted α,β-unsaturated ketones.
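
    The link between computed transition-state energies and the enantioselectivities such studies predict is transition-state theory: a free-energy difference between diastereomeric transition states maps to an enantiomeric ratio via er = exp(-ΔΔG‡/RT). Below is a minimal sketch with an illustrative, not published, ΔΔG‡ of 1.5 kcal/mol.

```python
# Minimal sketch: predicted enantiomeric ratio (er) and excess (ee) from a
# transition-state free-energy difference. The 1.5 kcal/mol is illustrative.
import math

R_KCAL = 1.987204e-3  # gas constant, kcal/(mol*K)

def enantiomeric_ratio(ddg_kcal_per_mol: float, temperature_k: float = 298.15) -> float:
    """Major:minor ratio for ddG = G(TS_minor) - G(TS_major) > 0."""
    return math.exp(ddg_kcal_per_mol / (R_KCAL * temperature_k))

er = enantiomeric_ratio(1.5)
ee = 100.0 * (er - 1.0) / (er + 1.0)
print(f"er = {er:.1f}:1  ->  {ee:.0f}% ee")  # ~12.6:1, ~85% ee
```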

  8. Appendix: Conjectures concerning proof, design, and verification.

    SciTech Connect

    Wos, L.

    2000-05-31

    This article focuses on an esoteric but practical use of automated reasoning that may indeed be new to many, especially those concerned primarily with verification of both hardware and software. Specifically featured are a discussion and some methodology for taking an existing design -- of a circuit, a chip, a program, or the like -- and refining and improving it in various ways. Although the methodology is general and does not require the use of a specific program, McCune's program OTTER does offer what is needed. OTTER has played and continues to play the key role in the author's research, and an interested person can gain access to this program in various ways, not the least of which is through the included CD-ROM in [3]. When success occurs, the result is a new design that may require fewer components, avoid the use of certain costly components, offer more reliability and ease of verification, and, perhaps most important, be more efficient in the contexts of speed and heat generation. Although the author has minimal experience in circuit design, circuit validation, program synthesis, program verification, and similar concerns, he presents (at the encouragement of colleagues, based on successes to be cited) materials that might indeed be of substantial interest to manufacturers and programmers. He writes this article in part prompted by the recent activities of chip designers, including Intel and AMD, that heavily emphasize the proving of theorems. As for research that appears relevant, the author has made an intense and most profitable study of finding proofs that are shorter [2,3], some that avoid the use of various types of terms, and some that are far less complex than previously known. Those results suggest a strong possible connection between more appealing proofs (in mathematics and in logic) and enhanced and improved design of both hardware and software. Here the author explores diverse conjectures that elucidate some of the

  9. A study of internal structure in components made by additive manufacturing process using 3D X-ray tomography

    SciTech Connect

    Raguvarun, K.; Balasubramaniam, Krishnan; Rajagopal, Prabhu; Palanisamy, Suresh; Nagarajah, Romesh; Kapoor, Ajay; Hoye, Nicholas; Curiri, Dominic

    2015-03-31

    Additive manufacturing methods are gaining increasing popularity for rapidly and efficiently manufacturing parts and components in the industrial context, as well as for domestic applications. However, except when parts are used for prototyping or rapid visualization, industries are concerned with the load-carrying capacity and strength achievable by additively manufactured parts. In this paper, the wire-arc additive manufacturing (AM) process based on gas tungsten arc welding (GTAW) has been examined for the internal structure and constitution of components generated by the process. High-resolution 3D X-ray tomography is used to obtain cut-views through wedge-shaped parts created using this GTAW additive manufacturing process with titanium alloy materials. In this work, two different control conditions for the GTAW process are considered. The studies reveal clusters of porosity located at periodic spatial intervals along the sample cross-section. Such internal defects can have a detrimental effect on the strength of the resulting AM components, as shown in destructive testing studies. Closer examination of this phenomenon shows that defect clusters are preferentially located at GTAW traversal path intervals. These results highlight the strong need for enhanced control of process parameters to ensure components with minimal defects and higher strength.

  10. Experimental verification of quantum computation

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie; Fitzsimons, Joseph F.; Kashefi, Elham; Walther, Philip

    2013-11-01

    Quantum computers are expected to offer substantial speed-ups over their classical counterparts and to solve problems intractable for classical computers. Beyond such practical significance, the concept of quantum computation opens up fundamental questions, among them the issue of whether quantum computations can be certified by entities that are inherently unable to compute the results themselves. Here we present the first experimental verification of quantum computation. We show, in theory and experiment, how a verifier with minimal quantum resources can test a significantly more powerful quantum computer. The new verification protocol introduced here uses the framework of blind quantum computing and is independent of the experimental quantum-computation platform used. In our scheme, the verifier is required only to generate single qubits and transmit them to the quantum computer. We experimentally demonstrate this protocol using four photonic qubits and show how the verifier can test the computer's ability to perform quantum computation.

  11. Ontology Matching with Semantic Verification

    PubMed Central

    Jean-Mary, Yves R.; Shironoshita, E. Patrick; Kabuka, Mansur R.

    2009-01-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies. PMID:20186256

  12. Ontology Matching with Semantic Verification.

    PubMed

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.

  13. Evaluation and Verification of Decadal Predictions using the MiKlip Central Evaluation System - a Case Study using the MiKlip Prototype Model Data

    NASA Astrophysics Data System (ADS)

    Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich

    2016-04-01

    MiKlip is a project for medium-term climate prediction funded by the Federal Ministry of Education and Research in Germany (BMBF), which aims to create a model system able to provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) during the MiKlip project. Throughout the first project phase, various sub-projects developed over 25 analysis tools - so-called plugins - for the CES. The main focus of these plugins is on the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope; they target a wide range of scientific questions, ranging from preprocessing tools like the "LeadtimeSelector", which creates lead-time-dependent time series from decadal hindcast sets, through tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET) package, which was developed at NCAR. We will show the theoretical background, technical implementation strategies, and some interesting results of the evaluation of the MiKlip Prototype decadal prediction system for a selected set of these tools.

  14. Effectiveness of a pressurized stormwater filtration system in Green Bay, Wisconsin: a study for the environmental technology verification program of the U.S. Environmental Protection Agency

    USGS Publications Warehouse

    Horwatich, J.A.; Corsi, Steven R.; Bannerman, Roger T.

    2004-01-01

    A pressurized stormwater filtration system was installed in 1998 as a stormwater-treatment practice to treat runoff from a hospital rooftop and parking lot in Green Bay, Wisconsin. This type of filtration system has been installed in Florida citrus groves and sewage treatment plants around the United States; however, this installation is the first of its kind to be used to treat urban runoff and the first to be tested in Wisconsin. The U.S. Geological Survey (USGS) monitored the system between November 2000 and September 2002 to evaluate it as part of the U.S. Environmental Protection Agency's Environmental Technology Verification Program. Fifteen runoff events were monitored for flow and water quality at the inlet and outlet of the system, and comparison of the event mean concentrations and constituent loads was used to evaluate its effectiveness. Loads decreased for all particulate-associated constituents monitored, including suspended solids (83 percent), suspended sediment (81 percent), total Kjeldahl nitrogen (26 percent), total phosphorus (54 percent), and total recoverable zinc (62 percent). Total dissolved solids, dissolved phosphorus, and nitrate plus nitrite loads remained similar or increased through the system. The increase in some constituents was most likely due to a ground-water contribution between runoff events. Sand/silt split analysis resulted in a median silt content of 78 percent at the inlet, 87 percent at the outlet, and 3 percent at the flow splitter.

  15. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step-by-step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  16. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  17. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it always has been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated, arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  18. MOV reliability evaluation and periodic verification scheduling

    SciTech Connect

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long-term reliability of gate or globe motor-operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which affects safety-related MOVs.
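
    A minimal sketch of the margin-versus-uncertainty comparison described above, assuming the design margin can be treated as normally distributed: reliability is then the probability that the margin remains positive. The margin and sigma values are hypothetical, not drawn from the GL 89-10 program.

```python
# Minimal sketch: MOV reliability as P(margin > 0) for a margin modeled as
# Normal(nominal, sigma). The numbers below are hypothetical.
from statistics import NormalDist

def mov_reliability(nominal_margin_pct: float, margin_sigma_pct: float) -> float:
    """Probability that the actual design margin stays positive."""
    return 1.0 - NormalDist(mu=nominal_margin_pct, sigma=margin_sigma_pct).cdf(0.0)

# A valve with a 15% best-estimate margin and an 8% (1-sigma) uncertainty:
print(f"Estimated reliability: {mov_reliability(15.0, 8.0):.4f}")  # ~0.9696
```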

  19. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program

  20. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions, and the issues of "Going to Zero". Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking past New START, the next step will likely include warhead limits in the neighborhood of 1,000. Further reductions will include stepping stones at 1,000 warheads, hundreds of warheads, and then tens of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1,000 warheads, hundreds of warheads, and tens of warheads. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  1. Measurements for liquid rocket engine performance code verification

    NASA Technical Reports Server (NTRS)

    Praharaj, Sarat C.; Palko, Richard L.

    1986-01-01

    The goal of the rocket engine performance code verification tests is to obtain the Isp with an accuracy of 0.25% or less. This needs to be done during the sequence of four related tests (two reactive and two hot gas simulation) to best utilize the loss separation technique recommended in this study. In addition to Isp, measurements of the input and output parameters for the codes are needed. This study has shown two things in regard to obtaining the Isp uncertainty within the 0.25% target. First, this target is generally not being realized at the present time, and second, the instrumentation and testing technology does exist to obtain this 0.25% uncertainty goal. However, achieving this goal will require carefully planned, designed, and conducted testing. In addition, the test-stand (or system) dynamics must be evaluated in the pre-test and post-test phases of the design of the experiment and data analysis, respectively, always keeping in mind that a 0.25% overall uncertainty in Isp is targeted. A table gives the maximum allowable uncertainty required for obtaining Isp with 0.25% uncertainty, the currently quoted instrument specification, and the present test uncertainty for each parameter. In general, it appears that measurement of the mass flow parameter within the required uncertainty may be the most difficult.
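
    A minimal sketch of why the component accuracies matter: with Isp = F / (mdot * g0), first-order (root-sum-square) propagation means the relative uncertainties of thrust and mass flow add in quadrature. The numbers below are illustrative placeholders, not the values from the report's table.

    ```python
    # Isp = F / (mdot * g0), so to first order:
    # (dIsp/Isp)^2 = (dF/F)^2 + (dmdot/mdot)^2
    import math

    rel_u_thrust = 0.0015    # 0.15 % thrust uncertainty (hypothetical)
    rel_u_massflow = 0.0020  # 0.20 % mass-flow uncertainty (hypothetical)

    rel_u_isp = math.sqrt(rel_u_thrust**2 + rel_u_massflow**2)
    print(f"relative Isp uncertainty: {rel_u_isp:.2%}")  # 0.25 %
    ```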

  2. Verification and validation for induction heating

    SciTech Connect

    Lam, Kin; Tippetts, Trevor B; Allen, David W

    2008-01-01

    Truchas is a software package being developed at LANL within the Telluride project for predicting the complex physical processes in metal alloy casting. The software was designed to be massively parallel, multi-material, multi-physics, and to run on 3D, fully unstructured meshes. This work describes a Verification and Validation assessment of Truchas for simulating the induction heating phase of a casting process. We used existing data from a simple experiment involving the induction heating of a graphite cylinder, as graphite is a common material used for mold assemblies. Because we do not have complete knowledge of all the conditions and properties in this experiment (as is the case in many other experiments), we performed a parameter sensitivity study, modeled the uncertainties of the most sensitive parameters, and quantified how these uncertainties propagate to the Truchas output response. A verification analysis produced estimates of the numerical error of the Truchas solution to our computational model. The outputs from Truchas runs with randomly sampled parameter values were used for the validation study.
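
    The propagation step described above can be sketched as simple Monte Carlo sampling: draw the sensitive parameters from assumed uncertainty distributions, evaluate the model for each draw, and summarize the spread of the response. The lumped-capacitance surrogate and parameter values below are hypothetical stand-ins, not Truchas physics.

    ```python
    # Forward uncertainty propagation by random sampling of uncertain inputs.
    import random
    import statistics

    def peak_temperature(conductivity, heat_capacity):
        # Toy surrogate for the induction-heating response of a graphite
        # cylinder; not the Truchas model.
        power, mass, duration = 5.0e3, 2.0, 60.0  # W, kg, s (hypothetical)
        return 300.0 + power * duration / (mass * heat_capacity) * (1.0 - 0.001 * conductivity)

    random.seed(0)
    responses = [
        peak_temperature(
            conductivity=random.gauss(90.0, 9.0),     # W/(m K), ~10 % uncertainty
            heat_capacity=random.gauss(710.0, 35.0),  # J/(kg K), ~5 % uncertainty
        )
        for _ in range(10_000)
    ]
    print(f"mean = {statistics.mean(responses):.1f} K, "
          f"stdev = {statistics.stdev(responses):.1f} K")
    ```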

  3. Verification of Medium Range Probabilistic Rainfall Forecasts Over India

    NASA Astrophysics Data System (ADS)

    Dube, Anumeha; Ashrit, Raghavendra; Singh, Harvir; Iyengar, Gopal; Rajagopal, E. N.

    2016-07-01

    Forecasting rainfall in the tropics is a challenging task, further hampered by the uncertainty in numerical weather prediction models. Ensemble prediction systems (EPSs) provide an efficient way of handling the inherent uncertainty of these models. Verification of forecasts obtained from an EPS is a necessity, to build confidence in using these forecasts. This study deals with the verification of the probabilistic rainfall forecasts obtained from the National Centre for Medium Range Weather Forecasting (NCMRWF) Global Ensemble Forecast System (NGEFS) for three monsoon seasons, i.e., JJAS 2012, 2013 and 2014. Verification is done based on the Brier score (BS) and its components (reliability, resolution and uncertainty), Brier skill score (BSS), reliability diagram, relative operating characteristic (ROC) curve and area under the ROC (AROC) curve. Three observation data sets are used (namely, NMSG, CPC-RFE2.0 and TRMM) for verification of forecasts and the statistics are compared. BS values for verification of NGEFS forecasts using NMSG data are the lowest, indicating that the forecasts have a better match with these observations as compared to both TRMM and CPC-RFE2.0. This is further strengthened by lower reliability, higher resolution and higher BSS values for verification against this data set. The ROC curve shows that lower rainfall amounts have a higher hit rate, which implies that the model has better skill in predicting these amounts. The reliability plots show that events with lower probabilities were under-forecast and those with higher probabilities were over-forecast. From the current study it can be concluded that even though NGEFS is a coarse-resolution EPS, the probabilistic forecast has good skill. This in turn leads to increased confidence in issuing operational probabilistic forecasts based on NGEFS.
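
    For readers unfamiliar with the headline metric, the hedged sketch below computes the Brier score and the Brier skill score against a climatological reference for an invented set of rain/no-rain probability forecasts; the study applies the same definitions against the NMSG, CPC-RFE2.0 and TRMM observations.

    ```python
    # Brier score (BS) and Brier skill score (BSS) for toy probabilistic
    # forecasts of a binary rain event. Data are made up for illustration.
    forecast_probs = [0.9, 0.2, 0.7, 0.1, 0.6, 0.3]  # forecast P(event)
    observed = [1, 0, 1, 0, 0, 1]                    # 1 = event occurred

    bs = sum((p - o) ** 2 for p, o in zip(forecast_probs, observed)) / len(observed)

    climatology = sum(observed) / len(observed)      # base rate of the event
    bs_ref = sum((climatology - o) ** 2 for o in observed) / len(observed)

    bss = 1.0 - bs / bs_ref                          # > 0 means skill over climatology
    print(f"BS = {bs:.3f}, BSS = {bss:.3f}")
    ```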

  4. Phosphazene additives

    DOEpatents

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  5. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $0.023 per pound of aluminum produced is projected for a 200 kA pot.

  6. Design verification of SIFT

    NASA Technical Reports Server (NTRS)

    Moser, Louise; Melliar-Smith, Michael; Schwartz, Richard

    1987-01-01

    A SIFT reliable aircraft control computer system, designed to meet the ultrahigh reliability required for safety critical flight control applications by use of processor replications and voting, was constructed for SRI, and delivered to NASA Langley for evaluation in the AIRLAB. To increase confidence in the reliability projections for SIFT, produced by a Markov reliability model, SRI constructed a formal specification, defining the meaning of reliability in the context of flight control. A further series of specifications defined, in increasing detail, the design of SIFT down to pre- and post-conditions on Pascal code procedures. Mechanically checked mathematical proofs were constructed to demonstrate that the more detailed design specifications for SIFT do indeed imply the formal reliability requirement. An additional specification defined some of the assumptions made about SIFT by the Markov model, and further proofs were constructed to show that these assumptions, as expressed by that specification, did indeed follow from the more detailed design specifications for SIFT. This report provides an outline of the methodology used for this hierarchical specification and proof, and describes the various specifications and proofs performed.

  7. Microstructural Development and Technical Challenges in Laser Additive Manufacturing: Case Study with a 316L Industrial Part

    NASA Astrophysics Data System (ADS)

    Marya, Manuel; Singh, Virendra; Marya, Surendar; Hascoet, Jean Yves

    2015-08-01

    Additive manufacturing (AM) brings disruptive changes to the ways parts and products are designed, fabricated, tested, qualified, inspected, marketed, and sold. These changes introduce novel technical challenges and concerns arising from the maturity and diversity of today's AM processes, feedstock materials, and process parameter interactions. AM bears a resemblance to laser and electron beam welding in the so-called conduction mode, which involves a multitude of dynamic physical events between the projected feedstock and a moving heat source that eventually influence AM part properties. For this paper, an air vent was selected for its thin walls, hollow and variable cross section, and limited size. The studied air vents, randomly selected from a qualification batch, were fabricated out of 316L stainless steel using a 4 kW fiber laser powder-fed AM system, referred to as construction laser additive direct (CLAD). These were systematically characterized by microhardness indentation, visual examination, optical and scanning electron microscopy, and electron backscatter diffraction in order to determine AM part suitability for service and also broadly discuss metallurgical phenomena. The paper then briefly expands the discussion to include additional engineering alloys and further analyze relationships between AM process parameters and AM part properties, consistently utilizing past experience with the same powder-fed CLAD 3D printer, the well-established science and technology of welding and joining, and recent publications on additive manufacturing.

  8. Gender verification in competitive sports.

    PubMed

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. All

  9. Effect of One Percent Chlorhexidine Addition on the Antibacterial Activity and Mechanical Properties of Sealants: An in vitro Study

    PubMed Central

    Asokan, Sharath; John, J Baby; Priya, PR Geetha; Devi, Jagadeesan Gnana

    2015-01-01

    Aim: The aim of the study was to evaluate the effect of the addition of 1% chlorhexidine digluconate solution on the antibacterial activity and mechanical properties of glass ionomer and resin-based sealants. Materials and methods: Conventional glass ionomer sealant (GIS) (Fuji VII, Japan) and resin sealant (Clinpro, 3M ESPE, USA) were used in this study. Chlorhexidine digluconate (CHX) (20%) liquid was added to both sealants, and the concentration of chlorhexidine in the sealants was adjusted to 1%. The sealants were divided into four groups: group A (GIS), group B (GIS + 1% CHX), group C (resin sealant), and group D (resin sealant + 1% CHX). Five cylindrical specimens were prepared in each group. Their antibacterial activity against Streptococcus mutans and Lactobacillus acidophilus, and their mechanical properties (compressive strength and diametral tensile strength), were assessed. Mann-Whitney and Wilcoxon signed rank tests were used as appropriate for statistical analysis (SPSS version 19). Results: Addition of one percent chlorhexidine significantly increased the antibacterial activity of both sealants. There was a significant difference between groups A and B (p < 0.009), and between groups C and D (p < 0.008). There was no significant difference in the mechanical properties of the sealants. Conclusion: Addition of one percent chlorhexidine to the glass ionomer and resin-based sealants provided sufficient antibacterial activity without significantly affecting the mechanical properties of the sealants. PMID:26628854

  10. A study on the effect of halloysite nanoparticle addition on the strength of glass fiber reinforced plastic

    NASA Astrophysics Data System (ADS)

    Kim, Yun-Hae; Park, Soo-Jeong; Lee, Jin-Woo; Moon, Kyung-Man

    2015-03-01

    Halloysite nanotubes (HNTs), which have been used in polymers, have attracted attention as useful functional materials for improving mechanical properties. In the current study, we established an optimal nanoparticle dispersion, synthesized composites reinforced by HNTs by dispersing them in unsaturated polyester resin (UPR), and analyzed their mechanical characteristics and behavior, especially the tensile strength and interlaminar shear strength. Additionally, the reinforcement effect and its variation according to the amount of HNTs were studied.

  11. A quantum chemical study of the mechanisms of olefin addition to group 9 transition metal dioxo compounds.

    PubMed

    Ahmed, Issahaku; Tia, Richard; Adei, Evans

    2016-01-01

    triplet PES than on the singlet PES for the formation of similar analogues. There are fewer competitive reaction pathways on the triplet surface than on the singlet PES. Also, cycloadditions that seem impossible on the singlet PES seem possible on the doublet and/or triplet PESs; this is typically the case for the Rh and Co complexes, illustrating the importance of multiple spin states in organometallic reactions. Synopsis: A study of the mechanism of ethylene addition to MO2(CH2)(CH3) (M = Co, Rh, Ir) shows that the reactions of the Co complex have lower activation barriers for the preferred [3+2] and [2+2] addition pathways and fewer side reactions than those of Rh and Ir. Reactions are more feasible and selective on the triplet PES than on the singlet PES. These results illustrate the importance of multiple spin states in organometallic reactions and show that catalyst activity and selectivity decrease down the group.

  12. A fundamental study of the oxidation behavior of SI primary reference fuels with propionaldehyde and DTBP as an additive

    NASA Astrophysics Data System (ADS)

    Johnson, Rodney

    In an effort to combine the benefits of SI and CI engines, Homogeneous Charge Compression Ignition (HCCI) engines are being developed. HCCI combustion is achieved by controlling the temperature, pressure, and composition of the fuel and air mixture so that autoignition occurs in proper phasing with the piston motion. This control system is fundamentally more challenging than using a spark plug or fuel injector to determine ignition timing, as in SI and CI engines, respectively. As a result, this is a technical barrier that must be overcome to make HCCI engines applicable to a wide range of vehicles and viable for high-volume production. One way to tailor the autoignition timing is to use small amounts of ignition-enhancing additives. In this study, the effect of the addition of DTBP and propionaldehyde on the autoignition behavior of SI primary reference fuels was investigated. The present work was conducted in a new research facility built around a single-cylinder Cooperative Fuels Research (CFR) octane rating engine modified to run in HCCI mode. It focused on the effect of select oxygenated hydrocarbons on hydrocarbon fuel oxidation, specifically of the primary reference fuels n-heptane and iso-octane, under HCCI operating conditions. Previously, the operating parameters for this engine were validated for stable combustion under a wide range of engine speeds, equivalence ratios, compression ratios, and inlet manifold temperatures. The stable operating range under these conditions was recorded and used for the present study. The major focus of this study was to examine the effect of the addition of DTBP or propionaldehyde on the oxidation behavior of SI primary reference fuels. Under every test condition, the addition of DTBP or propionaldehyde caused a change in fuel oxidation. DTBP always promoted fuel oxidation, while propionaldehyde promoted oxidation for lower octane number fuels and delayed it for higher octane number fuels.

  13. Applicability of the DPPH assay for evaluating the antioxidant capacity of food additives - inter-laboratory evaluation study -.

    PubMed

    Shimamura, Tomoko; Sumikura, Yoshihiro; Yamazaki, Takeshi; Tada, Atsuko; Kashiwagi, Takehiro; Ishikawa, Hiroya; Matsui, Toshiro; Sugimoto, Naoki; Akiyama, Hiroshi; Ukeda, Hiroyuki

    2014-01-01

    An inter-laboratory evaluation study was conducted in order to evaluate the antioxidant capacity of food additives by using a 1,1-diphenyl-2-picrylhydrazyl (DPPH) assay. Four antioxidants used as existing food additives (i.e., tea extract, grape seed extract, enju extract, and d-α-tocopherol) and 6-hydroxy-2,5,7,8-tetramethylchroman-2-carboxylic acid (Trolox) were used as analytical samples, and 14 laboratories participated in this study. The repeatability relative standard deviations (RSD(r)) of the IC50 of Trolox, the IC50 of the four antioxidants, and the Trolox equivalent antioxidant capacity (TEAC) were 1.8-2.2%, 2.2-2.9%, and 2.1-2.5%, respectively. Thus, the proposed DPPH assay showed good performance within the same laboratory. The reproducibility relative standard deviations (RSD(R)) of the IC50 of Trolox, the IC50 of the four antioxidants, and TEAC were 4.0-7.9%, 6.0-11%, and 3.7-9.3%, respectively. The RSD(R)/RSD(r) values of TEAC were lower than, or nearly equal to, those of the IC50 of the four antioxidants, suggesting that the use of TEAC was effective for reducing the variance among the laboratories. These results showed that the proposed DPPH assay could be used as a standard method to evaluate the antioxidant capacity of food additives.
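
    A minimal sketch of the two quantities compared across laboratories, assuming the common convention that TEAC is the ratio of the IC50 of Trolox to that of the sample; the IC50 values are invented for illustration.

    ```python
    # Trolox-equivalent antioxidant capacity and inter-laboratory RSD.
    import statistics

    ic50_trolox = 11.0  # ug/mL, hypothetical
    ic50_sample = 8.0   # ug/mL, hypothetical (e.g., a plant extract)
    teac = ic50_trolox / ic50_sample  # Trolox equivalents (assumed convention)

    ic50_by_lab = [10.8, 11.3, 10.9, 11.6, 11.1]  # hypothetical inter-lab IC50s
    rsd = statistics.stdev(ic50_by_lab) / statistics.mean(ic50_by_lab) * 100.0
    print(f"TEAC = {teac:.2f}, inter-laboratory RSD = {rsd:.1f} %")
    ```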

  14. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    40 CFR Protection of Environment, revised as of 2013-07-01. § 1065.920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the applicable calibrations and verifications in subpart D of this part, including the linearity verifications...

  15. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    40 CFR Protection of Environment, revised as of 2012-07-01. § 1065.920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the applicable calibrations and verifications in subpart D of this part, including the linearity verifications...

  16. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  17. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  18. Subsurface barrier integrity verification using perfluorocarbon tracers

    SciTech Connect

    Sullivan, T.M.; Heiser, J.; Milian, L.; Senum, G.

    1996-12-01

    Subsurface barriers are an extremely promising remediation option for many waste management problems. Gas phase tracers include perfluorocarbon tracers (PFTs) and chlorofluorocarbon tracers (CFCs). Both have been applied for leak detection in subsurface systems. The focus of this report is to describe the barrier verification tests conducted using PFTs and the analysis of the data from the tests. PFT verification tests have been performed on a simulated waste pit at the Hanford Geotechnical facility and on an actual waste pit at Brookhaven National Laboratory (BNL). The objective of these tests was to demonstrate proof-of-concept that PFT technology can be used to determine whether small breaches form in the barrier and to estimate the effectiveness of the barrier in preventing migration of the gas tracer to the monitoring wells. The subsurface barrier systems created at Hanford and BNL are described. The experimental results and the analysis of the data follow. Based on the findings of this study, conclusions are offered and suggestions for future work are presented.

  19. Bayesian ROC curve estimation under verification bias.

    PubMed

    Gu, Jiezhun; Ghosal, Subhashis; Kleiner, David E

    2014-12-20

    Receiver operating characteristic (ROC) curves have been widely used in medical science for their ability to measure the accuracy of diagnostic tests against the gold standard. However, in complicated medical practice, a gold standard test can be invasive and expensive, and its result may not always be available for all the subjects under study. Thus, a gold standard test is implemented only when it is necessary and possible. This leads to so-called 'verification bias', meaning that subjects with verified disease status (also called label) are not selected in a completely random fashion. In this paper, we propose a new Bayesian approach for estimating an ROC curve based on continuous data following the popular semiparametric binormal model in the presence of verification bias. By using a rank-based likelihood, and following Gibbs sampling techniques, we compute the posterior distribution of the binormal parameters (intercept and slope), as well as the area under the curve, by imputing the missing labels within Markov chain Monte Carlo iterations. Consistency of the resulting posterior under mild conditions is also established. We compare the new method with other comparable methods and conclude that our estimator performs well in terms of accuracy. PMID:25269427
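
    A hedged sketch of the binormal ROC model referenced above: with intercept a and slope b, ROC(t) = Phi(a + b * Phi^-1(t)), and the area under the curve has the closed form AUC = Phi(a / sqrt(1 + b^2)). The parameter values below are illustrative; the paper estimates them by Gibbs sampling under verification bias, which is not reproduced here.

    ```python
    # Binormal ROC curve and its closed-form AUC.
    from statistics import NormalDist
    import math

    std_normal = NormalDist()
    a, b = 1.2, 0.9  # hypothetical binormal intercept and slope

    def roc(t):
        """True-positive rate at false-positive rate t under the binormal model."""
        return std_normal.cdf(a + b * std_normal.inv_cdf(t))

    auc = std_normal.cdf(a / math.sqrt(1.0 + b * b))
    print(f"ROC(0.1) = {roc(0.1):.3f}, AUC = {auc:.3f}")
    ```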

  20. The influence of deposit control additives on nitrogen oxides emissions from spark ignition engines (case study: Tehran).

    PubMed

    Bidhendi, Gholamreza Nabi; Zand, Ali Daryabeigi; Tabrizi, Alireza Mikaeili; Pezeshk, Hamid; Baghvand, Akbar

    2007-04-15

    In the present research, the influence of a deposit control additive on NOx emissions from two types of gasoline engine vehicles, i.e., Peykan (based on the Hillman) and Pride (South Korean Kia Motors), was studied. Exhaust NOx emissions were measured in two stages, before the decarbonization process and after it. Statistical analysis was conducted on the measurement results. Results showed that NOx emissions from Peykans increased 0.28% and NOx emissions from Pride automobiles decreased 6.18% on average, due to the elimination of engine deposits. The observed variations were not statistically or practically significant. The results indicated that making use of detergent additives is not an effective way to reduce exhaust NOx emissions from gasoline engine vehicles. PMID:19069943

  1. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  2. A mechanistic study of the addition of alcohol to a five-membered ring silene via a photochemical reaction.

    PubMed

    Su, Ming-Der

    2016-03-21

    The mechanism for the photochemical rearrangement of a cyclic divinyldisilane (1-Si) in its first excited state (¹π → ¹π*) is determined using the CAS/6-311G(d) and MP2-CAS/6-311++G(3df,3pd) levels of theory. The photoproduct, a cyclic silene, reacts with various alcohols to yield a mixture of cis- and trans-adducts. The two reaction pathways are denoted as the cis-addition path (path A) and the trans-addition path (path B). These model studies demonstrate that conical intersections play a crucial role in the photo-rearrangements of cyclic divinyldisilanes. The theoretical evidence also demonstrates that the addition of alcohol to a cyclic divinyldisilane follows the reaction path: cyclic divinyldisilane → Franck-Condon region → conical intersection → photoproduct (cyclic silene) → local intermediate (with alcohol) → transition state → cis- or trans-adduct. The theoretical studies demonstrate that steric effects as well as the concentration of CH3OH must have a dominant role in determining the stereochemical yields of the final adducts. The same mechanism for the carbon derivative (1-C) is also considered in this work. However, the theoretical results indicate that 1-C does not undergo a methanol addition reaction via the photochemical reaction pathway, since the energy of its conical intersection (S1/S0-CI-C) is higher than that of its Franck-Condon point (FC-C). The reason for these phenomena could be that the atomic radius of carbon is much smaller than that of silicon (77 and 117 pm, respectively). As a result, the conformation for 1-C is more sterically congested than that for 1-Si along the 1,3-silyl-migration pathway. PMID:26928893

  3. Independent verification and validation of large software requirement specification databases

    SciTech Connect

    Twitchell, K.E.

    1992-04-01

    To enhance quality, an independent verification and validation (IV&V) review is conducted as software requirements are defined. Requirements are inspected for consistency and completeness. IV&V strives to detect defects early in the software development life cycle and to prevent problems before they occur. The IV&V review process of a massive software requirements specification, the Reserve Component Automation System (RCAS) Functional Description (FD), is explored. Analysis of the RCAS FD error history determined that there are no predictors of errors. The size of the FD mandates electronic analysis of the databases. Software which successfully performs automated consistency and completeness checks is discussed. The process of verifying the quality of the analysis software is described. The use of intuitive ad hoc techniques, in addition to the automatic analysis of the databases, is required because of the varying content of the requirements databases. The ad hoc investigation process is discussed. Case studies are provided to illustrate how the process works. This thesis demonstrates that it is possible to perform an IV&V review on a massive software requirements specification. Automatic analysis enables inspecting for completeness and consistency. The work with the RCAS FD clearly indicates that the IV&V review process is not static; it must continually grow, adapt, and change as conditions warrant. The ad hoc investigation process provides this required flexibility. This process also analyzes errors discovered by manual review and automatic processing. The analysis results in the development of new algorithms and the addition of new programs to the automatic inspection software.

  4. Implementation of Precision Verification Solvents on the External Tank

    NASA Technical Reports Server (NTRS)

    Campbell, M.

    1998-01-01

    This paper presents the implementation of precision verification solvents on the External Tank. The topics include: 1) Background; 2) Solvent Usages; 3) TCE (Trichloroethylene) Reduction; 4) Solvent Replacement Studies; 5) Implementation; 6) Problems Occurring During Implementation; and 7) Future Work. This paper is presented in viewgraph form.

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT FOR AMMONIA RECOVERY PROCESS

    EPA Science Inventory

    This Technology Verification report describes the nature and scope of an environmental evaluation of ThermoEnergy Corporation’s Ammonia Recovery Process (ARP) system. The information contained in this report represents data that were collected over a 3-month pilot study. The ti...

  6. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program using multiple special-purpose subroutines, and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
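
    The screen-file concept lends itself to a very small sketch: a table of per-parameter verification criteria applied to each incoming record before it reaches user-accessible files. The field names and limits below are hypothetical.

    ```python
    # A toy "screen file" of verification criteria and a routine that
    # applies it to one hydrologic record.
    screen_file = {
        "discharge_cfs": {"min": 0.0, "max": 50_000.0},
        "water_temp_c": {"min": -0.5, "max": 40.0},
    }

    def verify_record(record):
        """Return a list of screening failures for one record."""
        failures = []
        for field, limits in screen_file.items():
            value = record.get(field)
            if value is None:
                failures.append(f"{field}: missing")
            elif not limits["min"] <= value <= limits["max"]:
                failures.append(f"{field}: {value} outside [{limits['min']}, {limits['max']}]")
        return failures

    print(verify_record({"discharge_cfs": 120.0, "water_temp_c": 55.0}))
    ```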

  7. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  8. Guide to good practices for independent verification

    SciTech Connect

    1998-12-01

    This Guide to Good Practices is written to enhance understanding of, and provide direction for, Independent Verification, Chapter X of Department of Energy (DOE) Order 5480.19, Conduct of Operations Requirements for DOE Facilities. The practices in this guide should be considered when planning or reviewing independent verification activities. Contractors are advised to adopt procedures that meet the intent of DOE Order 5480.19. Independent Verification is an element of an effective Conduct of Operations program. The complexity and array of activities performed in DOE facilities dictate the necessity for coordinated independent verification activities to promote safe and efficient operations.

  9. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
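
    As a hedged illustration of the Horn-clause intermediate form mentioned above, the sketch below encodes a toy loop and its safety property as constrained Horn clauses and discharges them with Z3's Horn engine (Spacer). This is an independent toy encoding, not SeaHorn's own pipeline or output.

    ```python
    # Encode "x = 0; while x < 5: x += 1; assert x <= 5" as Horn clauses.
    from z3 import Fixedpoint, Function, IntSort, BoolSort, Int

    fp = Fixedpoint()
    fp.set(engine="spacer")

    inv = Function("inv", IntSort(), BoolSort())  # loop invariant predicate
    err = Function("err", BoolSort())             # reachable-error predicate
    x, x2 = Int("x"), Int("x2")
    fp.register_relation(inv, err)
    fp.declare_var(x, x2)

    fp.rule(inv(0))                                 # init: x = 0
    fp.rule(inv(x2), [inv(x), x < 5, x2 == x + 1])  # loop body: x += 1
    fp.rule(err(), [inv(x), x > 5])                 # exiting with x > 5 is an error

    print(fp.query(err()))  # 'unsat' means the error is unreachable (property holds)
    ```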

  10. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Cycles § 1065.550 Gas analyzer range verification and drift verification. (a) Range verification. If an... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations...-specific emissions over the entire duty cycle for drift. For each constituent to be verified, both sets...

  11. Assessing clinical outcomes of patients with acute calculous cholecystitis in addition to the Tokyo grading: a retrospective study.

    PubMed

    Cheng, Wei-Chun; Chiu, Yen-Cheng; Chuang, Chiao-Hsiung; Chen, Chiung-Yu

    2014-09-01

    The management of acute cholecystitis is still based on clinical expertise. This study aims to investigate whether the outcome of acute cholecystitis can be related to the severity criteria of the Tokyo guidelines and additional clinical comorbidities. A total of 103 patients with acute cholecystitis were retrospectively enrolled and their medical records were reviewed. They were all classified according to therapeutic modality, including early cholecystectomy and antibiotic treatment with or without percutaneous cholecystostomy. The impact of the Tokyo guidelines and the presence of comorbidities on clinical outcome were assessed by univariate and multivariate regression analyses. According to Tokyo severity grading, 48 patients were Grade I, 31 patients were Grade II, and 24 patients were Grade III. The Grade III patients had a longer hospital stay than Grade II and Grade I patients (15.2 days, 9.2 days, and 7.3 days, respectively, p < 0.05). According to multivariate analysis, patients with Grade III Tokyo severity, higher Charlson's Comorbidity Score, and encountering complications had a longer hospital stay. Based on treatment modality, surgeons selected the patients with less severity and fewer comorbidities for cholecystectomy, and these patients had a shorter hospital stay. In addition to the grading of the Tokyo guidelines, comorbidities had an additional impact on clinical outcomes and should be an important consideration when making therapeutic decisions.

  12. Spectrophotometric study of complexation equilibria with H-point standard addition and H-point curve isolation methods.

    PubMed

    Abdollahi, H; Zeinali, S

    2004-01-01

    The use of the H-point curve isolation method (HPCIM) and the H-point standard addition method (HPSAM) for spectrophotometric studies of complex formation equilibria is proposed. One-step complex formation systems, two successive stepwise and mononuclear complex formation systems, and competitive complexation systems are studied successfully by the proposed methods. HPCIM is used for extracting the spectrum of the complex or the sum of complex species, and HPSAM is used for calculating the equilibrium concentration of ligand for each sample. The outputs of these procedures are complete concentration profiles of the equilibrium system, spectral profiles of intermediate components, and good estimates of conditional formation constants. The reliability of the method is evaluated using model data. Spectrophotometric studies of murexide-calcium, dithizone-nickel, methyl thymol blue (MTB)-copper, and the competition of murexide and sulfate ions for complexation with zinc are used as experimental model systems with different complexation stoichiometries and spectral overlapping of the involved components.

  13. Probiotics in addition to antibiotics for the treatment of acute tonsillitis: a randomized, placebo-controlled study.

    PubMed

    Gilbey, P; Livshits, L; Sharabi-Nov, A; Avraham, Y; Miron, D

    2015-05-01

    Probiotics are live microorganisms which, when administered in adequate amounts, confer a health benefit on the host. The probiotic Streptococcus salivarius has been shown to be effective in reducing the frequency of recurrent pharyngeal infections in children and adult populations. However, probiotics have not yet been evaluated in the treatment of acute pharyngotonsillitis in adults. We aimed to examine whether the addition of S. salivarius probiotics to the routine therapy of acute pharyngotonsillitis in adult patients may shorten disease duration and reduce symptom severity. This study was a prospective, randomized, placebo-controlled, double-blinded study comparing treatment with probiotics to placebo in addition to antibiotics in patients who were hospitalized with severe pharyngotonsillitis. Laboratory results, pain levels, body temperature, and daily volume of fluids consumed were recorded for both groups. Sixty participants were recruited, 30 for each group. No statistically significant differences between the two groups were observed regarding any of the major clinical and laboratory parameters examined. Supplement probiotic treatment with S. salivarius in patients with acute pharyngotonsillitis treated with penicillin is ineffective in relation to the parameters examined in this study and we cannot, therefore, recommend the use of S. salivarius during active pharyngotonsillar infection treated with penicillin.

  14. A study on the effect of the polymeric additive HPMC on morphology and polymorphism of ortho-aminobenzoic acid crystals

    NASA Astrophysics Data System (ADS)

    Simone, E.; Cenzato, M. V.; Nagy, Z. K.

    2016-07-01

    In the present study, the effect of Hydroxy Propyl Methyl Cellulose (HPMC) on the crystallization of ortho-aminobenzoic acid (OABA) was investigated by seeded and unseeded cooling crystallization experiments. The influence of HPMC on the induction time, crystal shape of Forms I and II of OABA and the polymorphic transformation time was studied. Furthermore, the capability of HPMC to inhibit growth of Form I was evaluated quantitatively and modeled using population balance equations (PBE) solved with the method of moments. The additive was found to strongly inhibit nucleation and growth of Form I as well as to increase the time for the polymorphic transformation from Form II to I. Solvent was also found to influence the shape of Form I crystals at equal concentrations of HPMC. In situ process analytical technology (PAT) tools, including Raman spectroscopy, focused beam reflectance measurement (FBRM) and attenuated total reflectance (ATR) UV-vis spectroscopy were used in combination with off-line techniques, such as optical microscopy, scanning electron microscopy (SEM), Raman spectroscopy, Malvern Mastersizer and differential scanning calorimetry (DSC) to study the crystals produced. The results illustrate how shape, size and stability of the two polymorphs of OABA can be controlled and tailored using a polymeric additive.

  15. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important.

  16. The addition of GTN to capsaicin cream reduces the discomfort associated with application of capsaicin alone. A volunteer study.

    PubMed

    McCleane, G J; McLaughlin, M

    1998-11-01

    In a double-blind, placebo-controlled trial of 40 volunteers, the burning discomfort associated with application of capsaicin cream (0.025%) was compared to placebo, GTN cream (1.33%), and the combination of capsaicin cream (0.025%) plus GTN cream (1.33%). Median VAS scores for burning pain were 0 for the placebo, GTN, and GTN + capsaicin groups, and 3 for the capsaicin group, after a single application of each cream at daily intervals. This study demonstrates that, after a single application, the addition of GTN to capsaicin significantly reduces the burning discomfort associated with application of capsaicin alone.

  17. A SEARCH FOR ADDITIONAL PLANETS IN FIVE OF THE EXOPLANETARY SYSTEMS STUDIED BY THE NASA EPOXI MISSION

    SciTech Connect

    Ballard, Sarah; Charbonneau, David; Holman, Matthew J.; Christiansen, Jessie L.; Deming, Drake; Barry, Richard K.; Kuchner, Marc J.; Livengood, Timothy A.; Hewagama, Tilak; Hampton, Don L.; Lisse, Carey M.; Seager, Sara; Veverka, Joseph F.

    2011-05-01

    We present time series photometry and constraints on additional planets in five of the exoplanetary systems studied by the EPOCh (Extrasolar Planet Observation and Characterization) component of the NASA EPOXI mission: HAT-P-4, TrES-3, TrES-2, WASP-3, and HAT-P-7. We conduct a search of the high-precision time series for photometric transits of additional planets. We find no candidate transits with significance higher than our detection limit. From Monte Carlo tests of the time series using putative periods from 0.5 days to 7 days, we demonstrate the sensitivity to detect Neptune-sized companions around TrES-2, sub-Saturn-sized companions in the HAT-P-4, TrES-3, and WASP-3 systems, and Saturn-sized companions around HAT-P-7. We investigate in particular our sensitivity to additional transits in the dynamically favorable 3:2 and 2:1 exterior resonances with the known exoplanets: if we assume coplanar orbits with the known planets, then companions in these resonances with HAT-P-4b, WASP-3b, and HAT-P-7b would be expected to transit, and we can set lower limits on the radii of companions in these systems. In the nearly grazing exoplanetary systems TrES-3 and TrES-2, additional coplanar planets in these resonances are not expected to transit. However, we place lower limits on the radii of companions that would transit if the orbits were misaligned by 2.0° and 1.4° for TrES-3 and TrES-2, respectively.

  18. Theoretical study of the oxidation mechanisms of naphthalene initiated by hydroxyl radicals: the OH-addition pathway.

    PubMed

    Shiroudi, Abolfazl; Deleuze, Michael S; Canneaux, Sébastien

    2014-07-01

    The oxidation mechanisms of naphthalene by OH radicals under inert (He) conditions have been studied using density functional theory along with various exchange-correlation functionals. Comparison has been made with benchmark CBS-QB3 theoretical results. Kinetic rate constants were correspondingly estimated by means of transition state theory and statistical Rice-Ramsperger-Kassel-Marcus (RRKM) theory. Comparison with experiment confirms that, on the OH-addition reaction pathway leading to 1-naphthol, the first bimolecular reaction step has an effective negative activation energy around -1.5 kcal mol-1, whereas this step is characterized by an activation energy around 1 kcal mol-1 on the OH-addition reaction pathway leading to 2-naphthol. Effective rate constants have been calculated according to a steady-state analysis upon a two-step model reaction mechanism. In line with experiment, the correspondingly obtained branching ratios indicate that, at temperatures lower than 410 K, the most abundant product resulting from the oxidation of naphthalene by OH radicals must be 1-naphthol. The regioselectivity of the OH-addition onto naphthalene decreases with increasing temperatures and decreasing pressures. Because of slightly positive or even negative activation energies, the RRKM calculations demonstrate that the transition state approximation breaks down at ambient pressure (1 bar) for the first bimolecular reaction steps. Overwhelmingly high pressures, larger than 10^5 bar, would be required for restoring to some extent (within ∼5% accuracy) the validity of this approximation for all the reaction channels involved in the OH-addition pathway. Analysis of the computed structures, bond orders, and free energy profiles demonstrates that all reaction steps involved in the oxidation of naphthalene by OH radicals satisfy the Leffler-Hammond principle. Nucleus-independent chemical shift indices and natural bond orbital analysis also show that the computed
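
    The rate constants above come from transition state theory; in Eyring form, k(T) = (kB T / h) exp(-dG / (R T)). The hedged sketch below evaluates the branching between the two channels for illustrative effective barriers of -1.5 and +1 kcal/mol, showing the regioselectivity weakening as temperature rises, as the abstract reports; the free-energy values are placeholders, not the paper's fitted quantities.

    ```python
    # Eyring/TST rate constants and the 1-naphthol vs 2-naphthol branching.
    import math

    KB = 1.380649e-23   # Boltzmann constant, J/K
    H = 6.62607015e-34  # Planck constant, J s
    R = 8.314462618     # gas constant, J/(mol K)

    def eyring_rate(delta_g_kcal_per_mol, temperature_k):
        """TST rate constant (s^-1) for a given activation free energy."""
        delta_g = delta_g_kcal_per_mol * 4184.0  # kcal/mol -> J/mol
        return (KB * temperature_k / H) * math.exp(-delta_g / (R * temperature_k))

    for temp in (298.0, 410.0):
        k1 = eyring_rate(-1.5, temp)  # toward 1-naphthol (effective barrier)
        k2 = eyring_rate(+1.0, temp)  # toward 2-naphthol
        print(f"T = {temp:.0f} K: k1/k2 = {k1 / k2:.1f}")
    ```

    Consistent with the abstract, the computed ratio drops from roughly 68 at 298 K to roughly 22 at 410 K under these placeholder barriers.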

  19. A new gravitational wave verification source

    NASA Astrophysics Data System (ADS)

    Kilic, Mukremin; Brown, Warren R.; Gianninas, A.; Hermes, J. J.; Allende Prieto, Carlos; Kenyon, S. J.

    2014-10-01

    We report the discovery of a detached 20-min orbital period binary white dwarf (WD). WD 0931+444 (SDSS J093506.93+441106.9) was previously classified as a WD + M dwarf system based on its optical spectrum. Our time-resolved optical spectroscopy observations obtained at the 8 m Gemini and 6.5 m MMT reveal peak-to-peak radial velocity variations of ≈400 km s-1 every 20 min for the WD, but no velocity variations for the M dwarf. In addition, high-speed photometry from the McDonald 2.1 m telescope shows no evidence of variability nor evidence of a reflection effect. An M dwarf companion is physically too large to fit into a 20 min orbit. Thus, the orbital motion of the WD is almost certainly due to an invisible WD companion. The M dwarf must be either an unrelated background object or the tertiary component of a hierarchical triple system. WD 0931+444 contains a pair of WDs, a 0.32 M⊙ primary and a ≥0.14 M⊙ secondary, at a separation of ≥0.19 R⊙. After J0651+2844, WD 0931+444 becomes the second shortest period detached binary WD currently known. The two WDs will lose angular momentum through gravitational wave radiation and merge in ≤9 Myr. The log h ≃ -22 gravitational wave strain from WD 0931+444 is strong enough to make it a verification source for gravitational wave missions in the milli-Hertz frequency range, e.g. the evolved Laser Interferometer Space Antenna (eLISA), bringing the total number of known eLISA verification sources to nine.
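
    The quoted merger time follows from the standard Peters (1964) gravitational-wave decay timescale for a circular binary, tau = (5/256) c^5 a^4 / (G^3 m1 m2 (m1 + m2)). A hedged sketch with rounded constants and the quoted lower limits reproduces a merger time of order 10 Myr.

    ```python
    # Peters (1964) merger timescale for a circular binary.
    G = 6.674e-11   # m^3 kg^-1 s^-2
    C = 2.998e8     # m/s
    M_SUN = 1.989e30  # kg
    R_SUN = 6.957e8   # m
    YEAR = 3.156e7    # s

    m1, m2 = 0.32 * M_SUN, 0.14 * M_SUN  # quoted WD masses
    a = 0.19 * R_SUN                     # quoted separation (lower limit)

    tau = (5.0 / 256.0) * C**5 * a**4 / (G**3 * m1 * m2 * (m1 + m2))
    print(f"merger time ~ {tau / YEAR / 1e6:.1f} Myr")  # roughly 9-10 Myr
    ```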

  20. Conformance Verification of Privacy Policies

    NASA Astrophysics Data System (ADS)

    Fu, Xiang

    Web applications are both the consumers and providers of information. To increase customer confidence, many websites choose to publish their privacy protection policies. However, policy conformance is often neglected. We propose a logic based framework for formally specifying and reasoning about the implementation of privacy protection by a web application. A first order extension of computation tree logic is used to specify a policy. A verification paradigm, built upon a static control/data flow analysis, is presented to verify if a policy is satisfied.
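
    As a hedged illustration of the kind of requirement a first-order extension of computation tree logic can express (the predicate names here are invented, not taken from the paper), a privacy-conformance property might read:

    ```latex
    % Hypothetical first-order CTL-style privacy property: on every path, any
    % collected datum x is eventually used only for its declared purpose or deleted.
    \[
    \mathbf{AG}\;\forall x\,\bigl(\mathit{collected}(x)\;\rightarrow\;
      \mathbf{AF}\,(\mathit{usedForDeclaredPurpose}(x) \lor \mathit{deleted}(x))\bigr)
    \]
    ```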