Alternative Nonvolatile Residue Analysis with Contaminant Identification Project
NASA Technical Reports Server (NTRS)
Loftin, Kathleen (Compiler); Summerfield, Burton (Compiler); Thompson, Karen (Compiler); Mullenix, Pamela (Compiler); Zeitlin, Nancy (Compiler)
2015-01-01
Cleanliness verification is required in numerous industries, including spaceflight ground support, electronics, medical, and aerospace. Current cleanliness verification requirements at KSC rely on solvents that are environmentally unfriendly. The goal of this project is to produce an alternative cleanliness verification technique that is both environmentally friendly and more cost effective.
Cleaning and Cleanliness Verification Techniques for Mars Returned Sample Handling
NASA Technical Reports Server (NTRS)
Mickelson, E. T.; Lindstrom, D. J.; Allton, J. H.; Hittle, J. D.
2002-01-01
Precision cleaning and cleanliness verification techniques are examined as a subset of a comprehensive contamination control strategy for a Mars sample return mission. Additional information is contained in the original extended abstract.
Improved Detection Technique for Solvent Rinse Cleanliness Verification
NASA Technical Reports Server (NTRS)
Hornung, S. D.; Beeson, H. D.
2001-01-01
The NASA White Sands Test Facility (WSTF) has an ongoing effort to reduce or eliminate usage of cleaning solvents such as CFC-113 and its replacements. These solvents are used in the final clean and cleanliness verification processes for flight and ground support hardware, especially for oxygen systems where organic contaminants can pose an ignition hazard. For the final cleanliness verification in the standard process, the equivalent of one square foot of surface area of parts is rinsed with the solvent, and the final 100 mL of the rinse is captured. The amount of nonvolatile residue (NVR) in the solvent is determined by weight after the evaporation of the solvent. An improved process of sampling this rinse, developed at WSTF, requires evaporation of less than 2 mL of the solvent to make the cleanliness verification. Small amounts of the solvent are evaporated in a clean stainless steel cup, and the cleanliness of the stainless steel cup is measured using a commercially available surface quality monitor. The effectiveness of this new cleanliness verification technique was compared to the accepted NVR sampling procedures. Testing with known contaminants in solution, such as hydraulic fluid, fluorinated lubricants, and cutting and lubricating oils, was performed to establish a correlation between amount in solution and the process response. This report presents the approach and results and discusses the issues in establishing the surface quality monitor-based cleanliness verification.
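The gravimetric arithmetic behind this standard process is simple enough to sketch. The following Python fragment is a hypothetical illustration: the one-square-foot and 100 mL figures come from the abstract, but the function name and the acceptance limit are invented for the example.

    # Gravimetric NVR sketch: rinse ~1 ft^2, capture the final 100 mL,
    # evaporate, and weigh the residue. Names and limit are illustrative.
    def nvr_mg_per_ft2(cup_mass_before_g, cup_mass_after_g, rinsed_area_ft2=1.0):
        """Nonvolatile residue normalized to the rinsed area (mg/ft^2).
        Assumes the captured 100 mL rinse carries the residue of 1 ft^2."""
        residue_mg = (cup_mass_after_g - cup_mass_before_g) * 1000.0
        return residue_mg / rinsed_area_ft2

    nvr = nvr_mg_per_ft2(10.0000, 10.0008)   # cup gains 0.8 mg on evaporation
    LIMIT_MG_PER_FT2 = 1.0                   # hypothetical acceptance limit
    print(f"NVR = {nvr:.2f} mg/ft^2 ->", "PASS" if nvr <= LIMIT_MG_PER_FT2 else "FAIL")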
Investigation of Cleanliness Verification Techniques for Rocket Engine Hardware
NASA Technical Reports Server (NTRS)
Fritzemeier, Marilyn L.; Skowronski, Raymund P.
1994-01-01
Oxidizer propellant systems for liquid-fueled rocket engines must meet stringent cleanliness requirements for particulate and nonvolatile residue. These requirements were established to limit residual contaminants which could block small orifices or ignite in the oxidizer system during engine operation. Limiting organic residues in high pressure oxygen systems, such as in the Space Shuttle Main Engine (SSME), is particularly important. The current method of cleanliness verification for the SSME uses an organic solvent flush of the critical hardware surfaces. The solvent is filtered and analyzed for particulate matter followed by gravimetric determination of the nonvolatile residue (NVR) content of the filtered solvent. The organic solvents currently specified for use (1,1,1-trichloroethane and CFC-113) are ozone-depleting chemicals slated for elimination by December 1995. A test program is in progress to evaluate alternative methods for cleanliness verification that do not require the use of ozone-depleting chemicals and that minimize or eliminate the use of solvents regulated as hazardous air pollutants or smog precursors. Initial results from the laboratory test program to evaluate aqueous-based methods and organic solvent flush methods for NVR verification are provided and compared with results obtained using the current method. Evaluation of the alternative methods was conducted using a range of contaminants encountered in the manufacture of rocket engine hardware.
Magnetic cleanliness verification approach on tethered satellite
NASA Technical Reports Server (NTRS)
Messidoro, Piero; Braghin, Massimo; Grande, Maurizio
1990-01-01
Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.
NASA Technical Reports Server (NTRS)
Hornung, Steven D.; Biesinger, Paul; Kirsch, Mike; Beeson, Harold; Leuders, Kathy
1999-01-01
The NASA White Sands Test Facility (WSTF) has developed an entirely aqueous final cleaning and verification process to replace the current chlorofluorocarbon (CFC) 113 based process. This process has been accepted for final cleaning and cleanliness verification of WSTF ground support equipment. The aqueous process relies on ultrapure water at 50 C (323 K) and ultrasonic agitation for removal of organic compounds and particulate. The cleanliness is verified by determining the total organic carbon (TOC) content and by filtration with particulate counting. The effectiveness of the aqueous methods for detecting hydrocarbon contamination and particulate was compared to the accepted CFC 113 sampling procedures. Testing with known contaminants, such as hydraulic fluid and cutting and lubricating oils, was performed to establish a correlation between aqueous TOC and CFC 113 nonvolatile residue (NVR). Particulate sampling was performed on cleaned batches of hardware that were randomly separated and sampled by the two methods. This paper presents the approach and results, and discusses the issues in establishing the equivalence of aqueous sampling to CFC 113 sampling, while describing the approach for implementing aqueous techniques on Space Shuttle Propulsion hardware.
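The TOC-to-NVR correlation described here is, at bottom, a least-squares calibration against known contaminant spikes. A minimal sketch with invented data (the numbers are not from the paper):

    import numpy as np

    # Hypothetical paired measurements of the same contaminant spikes.
    nvr_mg  = np.array([0.5, 1.0, 2.0, 4.0, 8.0])    # CFC 113 NVR, mg
    toc_ppm = np.array([0.9, 1.6, 3.4, 6.5, 13.8])   # aqueous TOC, ppm

    slope, intercept = np.polyfit(nvr_mg, toc_ppm, 1)
    pred = slope * nvr_mg + intercept
    r2 = 1 - np.sum((toc_ppm - pred) ** 2) / np.sum((toc_ppm - toc_ppm.mean()) ** 2)
    print(f"TOC ~= {slope:.2f} * NVR + {intercept:.2f}  (R^2 = {r2:.3f})")

    # Invert the fit so an aqueous TOC reading can be reported as an
    # equivalent CFC 113 NVR value.
    def toc_to_equivalent_nvr(toc_reading):
        return (toc_reading - intercept) / slope

    print(f"5.0 ppm TOC ~= {toc_to_equivalent_nvr(5.0):.2f} mg NVR-equivalent")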
New Developments in Magnetostatic Cleanliness Modeling
NASA Astrophysics Data System (ADS)
Mehlem, K.; Wiegand, A.; Weickert, S.
2012-05-01
The paper describes improvements and extensions of the multiple magnetic dipole modeling method (MDM) for cleanliness verification, which was introduced by the author in 1977 and has since been applied over three decades to numerous international projects. Solutions to specific modeling problems that had so far been left unsolved are described in the present paper. Special attention is given to the ambiguities of MDM solutions caused by the limited data coverage available. Constraint handling by the constraint-free NLP solver, optimal MDM sizing, and multiple-point far-field compensation techniques are presented. The recent extension of the MDM method to field gradient data is formulated and demonstrated by an example. Finally, a complex MDM application (Ulysses) is presented, along with a short description of the MDM software GAMAG, recently introduced by the author.
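This is not the GANEW/GAMAG implementation, but the core inverse problem of multiple-dipole modeling can be sketched briefly: for fixed dipole positions the measured field is linear in the dipole moments, so moments follow from linear least squares. All values below are synthetic.

    import numpy as np

    MU0_4PI = 1e-7  # mu0 / (4*pi), T*m/A

    def dipole_field_matrix(r):
        """3x3 matrix G with B = G @ m for a point dipole at the origin:
        B = MU0_4PI * (3 r_hat (m . r_hat) - m) / |r|^3."""
        d = np.linalg.norm(r)
        rh = r / d
        return MU0_4PI * (3.0 * np.outer(rh, rh) - np.eye(3)) / d**3

    # Synthetic "measurements": field of a known moment plus sensor noise.
    rng = np.random.default_rng(0)
    m_true = np.array([0.02, -0.05, 0.08])           # A*m^2
    points = rng.uniform(0.5, 1.5, size=(12, 3))     # sensor positions, m
    G = np.vstack([dipole_field_matrix(p) for p in points])   # (36, 3)
    b = G @ m_true + rng.normal(0.0, 1e-11, size=G.shape[0])  # stacked B, T

    # Least squares recovers the moment; with several dipoles the design
    # matrix simply gains three columns per dipole. The ambiguity the paper
    # discusses shows up here as a poorly conditioned G for sparse coverage.
    m_fit, *_ = np.linalg.lstsq(G, b, rcond=None)
    print("fitted moment:", m_fit)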
Guidelines for qualifying cleaning and verification materials
NASA Technical Reports Server (NTRS)
Webb, D.
1995-01-01
This document is intended to provide guidance in identifying technical issues that must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down-selecting a cleaning/verification medium.
Category V Compliant Container for Mars Sample Return Missions
NASA Technical Reports Server (NTRS)
Dolgin, Benjamin; Sanok, Joseph; Sevilla, Donald; Bement, Laurence J.
2000-01-01
A novel containerization technique that satisfies Planetary Protection (PP) Category V requirements has been developed and demonstrated on a mock-up of the Mars Sample Return Container. The proposed approach uses explosive welding with a sacrificial layer and cut-through-the-seam techniques. The technology produces a container that is free from Martian contaminants on an atomic level. The containerization technique can be used on any celestial body that may support life. A major advantage of the proposed technology is the possibility of very fast (less than an hour) verification of both containment and cleanliness with typical metallurgical laboratory equipment. No separate biological verification is required. In addition to meeting Category V requirements, the proposed container presents a surface that is free of any organisms, even nonviable ones, and of any molecular fragments of biological origin that are unique to Mars or any other celestial body other than Earth.
NASA Technical Reports Server (NTRS)
Bankaitis, H.; Schueller, C. F.
1972-01-01
The oxygen system cleaning specifications drawn from 23 industrial and government sources are presented, along with the cleaning processes employed for meeting these specifications and recommended postcleaning inspection procedures for establishing the cleanliness achieved. Areas of agreement and difference in the specifications, procedures, and inspections are examined. The lack of clarity or specificity in some documents is also discussed; such ambiguity represents a potential safety hazard due to misinterpretation and can result in exorbitant expenditures of time and money in satisfying unnecessary requirements.
The Mars Science Laboratory Organic Check Material
NASA Astrophysics Data System (ADS)
Conrad, Pamela G.; Eigenbrode, Jennifer L.; Von der Heydt, Max O.; Mogensen, Claus T.; Canham, John; Harpold, Dan N.; Johnson, Joel; Errigo, Therese; Glavin, Daniel P.; Mahaffy, Paul R.
2012-09-01
Mars Science Laboratory's Curiosity rover carries a set of five external verification standards in hermetically sealed containers that can be sampled as would be a Martian rock, by drilling and then portioning into the solid sample inlet of the Sample Analysis at Mars (SAM) suite. Each organic check material (OCM) canister contains a porous ceramic solid, which has been doped with a fluorinated hydrocarbon marker that can be detected by SAM. The purpose of the OCM is to serve as a verification tool for the organic cleanliness of those parts of the sample chain that cannot be cleaned other than by dilution, i.e., repeated sampling of Martian rock. SAM possesses internal calibrants for verification of both its performance and its internal cleanliness, and the OCM is not used for that purpose. Each OCM unit is designed for one use only, and the choice to do so will be made by the project science group (PSG).
Advanced Curation Protocols for Mars Returned Sample Handling
NASA Astrophysics Data System (ADS)
Bell, M.; Mickelson, E.; Lindstrom, D.; Allton, J.
Introduction: Johnson Space Center has over 30 years of experience handling precious samples, including lunar rocks and Antarctic meteorites. However, we recognize that future curation of samples from such missions as Genesis, Stardust, and Mars Sample Return will require a high degree of biosafety combined with extremely low levels of inorganic, organic, and biological contamination. To satisfy these requirements, research in the JSC Advanced Curation Lab is currently focused on two major areas: preliminary examination techniques and cleaning and verification techniques. Preliminary Examination Techniques: In order to minimize the number of paths for contamination, we are exploring the synergy between human and robotic sample handling in a controlled environment to help determine the limits of clean curation. Within the Advanced Curation Laboratory is a prototype, next-generation glovebox, which contains a robotic micromanipulator. The remotely operated manipulator has six degrees of freedom and can be programmed to perform repetitive sample handling tasks. Protocols are being tested and developed to perform curation tasks such as rock splitting, weighing, imaging, and storing. Techniques for sample transfer enabling more detailed remote examination without compromising the integrity of sample science are also being developed. The glovebox is equipped with a rapid transfer port through which samples can be passed without exposure. The transfer is accomplished by using a unique seal and engagement system which allows passage between containers while maintaining a first seal to the outside environment and a second seal to prevent the outside of the container cover and port door from becoming contaminated by the material being transferred. Cleaning and Verification Techniques: As part of the contamination control effort, innovative cleaning techniques are being identified and evaluated in conjunction with sensitive cleanliness verification methods. Toward this end, cleaning techniques such as ultrasonication in ultra-pure water (UPW), oxygen (O2) plasma, and carbon dioxide (CO2) "snow" are being used to clean a variety of different contaminants on a variety of different surfaces. Additionally, once surfaces are cleaned, techniques to directly verify their cleanliness are being developed. These include X-ray photoelectron spectroscopy (XPS) quantification and screening with contact angle measurements, which can be correlated with XPS standards. Methods developed in the Advanced Curation Laboratory will determine the extent to which inorganic and biological contamination can be controlled and minimized.
Quantitative Hydrocarbon Surface Analysis
NASA Technical Reports Server (NTRS)
Douglas, Vonnie M.
2000-01-01
The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.
Hardware cleanliness methodology and certification
NASA Technical Reports Server (NTRS)
Harvey, Gale A.; Lash, Thomas J.; Rawls, J. Richard
1995-01-01
The inadequacy of mass-loss cleanliness criteria for selecting materials for contamination-sensitive uses, and for processing flight hardware for contamination-sensitive instruments, is discussed. Materials selection for flight hardware is usually based on mass loss (ASTM E-595). However, flight hardware cleanliness (MIL 1246A) is a surface cleanliness assessment. It is possible for materials (e.g., Sil-Pad 2000) to pass ASTM E-595 and fail MIL 1246A class A by orders of magnitude. Conversely, it is possible for small amounts of nonconforming material (Huma-Seal conformal coating) to present no significant cleanliness problems to an optical flight instrument. Effective cleaning (precleaning, precision cleaning, and ultra cleaning) and cleanliness verification are essential for contamination-sensitive flight instruments. Polish cleaning of hardware (e.g., vacuum baking for vacuum applications) and storage of clean hardware (e.g., laser optics) are discussed. Silicone materials present special concerns for use in space because of the rapid conversion of the outgassed residues to glass by solar ultraviolet radiation and/or atomic oxygen. Non-ozone-depleting solvent cleaning and institutional support for cleaning and certification are also discussed.
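For reference, the cleanliness levels of the MIL 1246 family follow a log-squared particle-count curve. The sketch below uses the slope constant and 0.1 m^2 normalization from the later 1246C revision as an assumption; verify against the revision actually imposed (here, 1246A) before relying on it.

    import math

    def particles_allowed(level, size_um, area_m2=0.1):
        """Approximate count of particles larger than size_um permitted on
        area_m2 for a given cleanliness level:
        log10(N) = 0.926178 * (log10(level)^2 - log10(size_um)^2),
        normalized to 0.1 m^2 (MIL-STD-1246C convention)."""
        exponent = 0.926178 * (math.log10(level) ** 2 - math.log10(size_um) ** 2)
        return 10.0 ** exponent * (area_m2 / 0.1)

    print(round(particles_allowed(100, 50)))   # level 100: ~11 particles > 50 um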
NASA Technical Reports Server (NTRS)
Burns, H. D.; Mitchell, M. A.; McMillian, J. H.; Farner, B. R.; Harper, S. A.; Peralta, S. F.; Lowrey, N. M.; Ross, H. R.; Juarez, A.
2015-01-01
Since the 1990's, NASA's rocket propulsion test facilities at Marshall Space Flight Center (MSFC) and Stennis Space Center (SSC) have used hydrochlorofluorocarbon-225 (HCFC-225), a Class II ozone-depleting substance, to safely clean and verify the cleanliness of large scale propulsion oxygen systems and associated test facilities. In 2012 through 2014, test laboratories at MSFC, SSC, and Johnson Space Center-White Sands Test Facility collaborated to seek out, test, and qualify an environmentally preferred replacement for HCFC-225. Candidate solvents were selected, a test plan was developed, and the products were tested for materials compatibility, oxygen compatibility, cleaning effectiveness, and suitability for use in cleanliness verification and field cleaning operations. Honeywell Solstice (TradeMark) Performance Fluid (trans-1-chloro-3,3,3-trifluoropropene) was selected to replace HCFC-225 at NASA's MSFC and SSC rocket propulsion test facilities.
Development of CFC-Free Cleaning Processes at the NASA White Sands Test Facility
NASA Technical Reports Server (NTRS)
Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul
1995-01-01
The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-113- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. The presentation will include the findings of investigations of aqueous cleaning and verification processes that are based on a draft of a proposed NASA Kennedy Space Center (KSC) cleaning procedure. Verification testing with known contaminants, such as hydraulic fluid and commonly used oils, established correlations between nonvolatile residue and CFC-113. Recoveries ranged from 35 to 60 percent of theoretical. WSTF is also investigating enhancements to aqueous sampling for organics and particulates. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon-225 (HCFC-225), tert-butylmethylether, and n-Hexane was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC-113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autoignition and liquid oxygen mechanical impact testing.
Developing the Cleanliness Requirements for an Organic-detection Instrument MOMA-MS
NASA Technical Reports Server (NTRS)
Perry, Radford; Canham, John; Lalime, Erin
2015-01-01
The cleanliness requirements for an organic-detection instrument, like the Mars Organic Molecule Analyzer Mass Spectrometer (MOMA-MS), on a Planetary Protection Class IVb mission can be extremely stringent. These include surface molecular and particulate cleanliness, outgassing, and bioburden. The prime contractor for the European Space Agency's ExoMars 2018 project, Thales Alenia Space Italy, provided requirements based on a standard, conservative approach to defining limits, which yielded levels that are unverifiable by standard cleanliness verification methods. Additionally, the conservative method for determining contamination surface area uses underestimation, while the conservative bioburden surface area relies on overestimation, which results in inconsistencies in the normalized reporting. This presentation will survey the challenge of defining requirements that can be reasonably verified and still remain appropriate to the core science of the ExoMars mission.
Supersonic gas-liquid cleaning system
NASA Technical Reports Server (NTRS)
Caimi, Raoul E. B.; Thaxton, Eric A.
1994-01-01
A system to perform cleaning and cleanliness verification is being developed to replace solvent flush methods using CFC 113 for fluid system components. The system is designed for two purposes: internal and external cleaning and verification. External cleaning is performed with the nozzle mounted at the end of a wand similar to a conventional pressure washer. Internal cleaning is performed with a variety of fixtures designed for specific applications. Internal cleaning includes tubes, pipes, flex hoses, and active fluid components such as valves and regulators. The system uses gas-liquid supersonic nozzles to generate high impingement velocities at the surface of the object to be cleaned. Compressed air or any inert gas may be used to provide the conveying medium for the liquid. The converging-diverging nozzles accelerate the gas-liquid mixture to supersonic velocities. The liquid being accelerated may be any solvent including water. This system may be used commercially to replace CFC and other solvent cleaning methods widely used to remove dust, dirt, flux, and lubricants. In addition, cleanliness verification can be performed without the solvents which are typically involved. This paper will present the technical details of the system, the results achieved during testing at KSC, and future applications for this system.
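For context on the converging-diverging nozzle claim, the single-phase ideal-gas limit can be sketched with the isentropic area-Mach relation; the actual gas-liquid mixture is two-phase and will reach lower velocities, so this is an upper-bound illustration only.

    from math import sqrt
    from scipy.optimize import brentq

    GAMMA, R = 1.4, 287.0   # ideal air; the real gas-liquid mixture differs

    def area_ratio(M, g=GAMMA):
        """A/A* for isentropic ideal-gas flow at Mach M."""
        return (1.0 / M) * ((2.0 / (g + 1)) * (1 + (g - 1) / 2 * M**2)) ** (
            (g + 1) / (2 * (g - 1)))

    def supersonic_mach(a_ratio):
        """Supersonic root of the area-Mach relation for A_exit/A_throat."""
        return brentq(lambda M: area_ratio(M) - a_ratio, 1.0001, 20.0)

    # Example: exit area twice the throat area, 300 K stagnation temperature.
    M = supersonic_mach(2.0)
    T = 300.0 / (1 + (GAMMA - 1) / 2 * M**2)     # static temperature at exit
    v = M * sqrt(GAMMA * R * T)                  # exit velocity
    print(f"M_exit ~ {M:.2f}, v_exit ~ {v:.0f} m/s")   # ~2.20, ~540 m/s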
Environmental monitoring of the orbiter payload bay and Orbiter Processing Facilities
NASA Technical Reports Server (NTRS)
Bartelson, D. W.; Johnson, A. M.
1985-01-01
Contamination control in the Orbiter Processing Facility (OPF) is studied. The cleanliness level required in the OPF is "generally clean," meaning no residue, dirt, debris, or other extraneous contamination; various methods of maintaining this level of cleanliness are described. The monitoring and control of temperature, relative humidity, and air quality in the OPF are examined. Additional modifications to the OPF to improve contamination control are discussed. The methods used to maintain the payload changeout room at a level of "visually clean" (no particulates detectable by the unaided eye) are described. The payload bay (PLB) must sustain the cleanliness level required for the specific Orbiter mission; the three levels of clean are defined as (1) standard, (2) sensitive, and (3) highly sensitive. The cleaning and inspection verification required to achieve the desired cleanliness level on a variety of PLB surface types are examined.
Cleaning and Cleanliness Measurement of Additive Manufactured Parts
NASA Technical Reports Server (NTRS)
Welker, Roger W.; Mitchell, Mark A.
2015-01-01
The successful acquisition and utilization of piece parts and assemblies for contamination sensitive applications requires application of cleanliness acceptance criteria. Contamination can be classified using many different schemes. One common scheme is classification as organic, ionic and particulate contaminants. These may be present in and on the surface of solid components and assemblies or may be dispersed in various gaseous or liquid media. This discussion will focus on insoluble particle contamination on the surface of piece parts and assemblies. Cleanliness of parts can be controlled using two strategies, referred to as gross cleanliness and precision cleanliness. Under a gross cleanliness strategy acceptance is based on visual cleanliness. This approach introduces a number of concerns that render it unsuitable for controlling cleanliness of high technology products. Under the precision cleanliness strategy, subjective, visual assessment of cleanliness is replaced by objective measurement of cleanliness. When a precision cleanliness strategy is adopted there naturally arises the question: How clean is clean enough? The six commonly used methods for establishing objective cleanliness acceptance limits will be discussed. Special emphasis shall focus on the use of multiple extraction, a technique that has been demonstrated for additively manufactured parts.
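The multiple-extraction technique mentioned above can be sketched as a geometric-series extrapolation: if each extraction removes a roughly constant fraction of the contaminant that remains, a few extractions bound the total extractable burden. The numbers below are illustrative only.

    # Per-extraction recoveries from successive flushes of the same part.
    masses_ug = [120.0, 48.0, 19.0]

    # Estimate the common ratio from consecutive extractions (~0.4 here).
    ratios = [b / a for a, b in zip(masses_ug, masses_ug[1:])]
    r = sum(ratios) / len(ratios)

    extracted = sum(masses_ug)
    remaining = masses_ug[-1] * r / (1 - r)   # tail of the geometric series
    total = extracted + remaining
    print(f"extracted {extracted:.0f} ug of an estimated {total:.0f} ug "
          f"({100 * extracted / total:.0f}% recovered)")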
Replacement Technologies for Precision Cleaning of Aerospace Hardware for Propellant Service
NASA Technical Reports Server (NTRS)
Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul
1997-01-01
The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-113- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. Replacement technologies are being investigated for aerospace hardware and for gauges and instrumentation. This paper includes the findings of investigations of aqueous cleaning and verification of aerospace hardware using known contaminants, such as hydraulic fluid and commonly used oils. The results correlate nonvolatile residue with CFC 113. The studies also include enhancements to aqueous sampling for organic and particulate contamination. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon 225 (HCFC 225), HCFC 141b, HFE 7100(R), and Vertrel MCA(R) was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC 113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autogenous ignition and liquid oxygen mechanical impact testing.
Aqueous cleaning and verification processes for precision cleaning of small parts
NASA Technical Reports Server (NTRS)
Allen, Gale J.; Fishell, Kenneth A.
1995-01-01
The NASA Kennedy Space Center (KSC) Materials Science Laboratory (MSL) has developed a totally aqueous process for precision cleaning and verification of small components. In 1990 the Precision Cleaning Facility at KSC used approximately 228,000 kg (500,000 lbs) of chlorofluorocarbon (CFC) 113 in the cleaning operations. It is estimated that current CFC 113 usage has been reduced by 75 percent and it is projected that a 90 percent reduction will be achieved by the end of calendar year 1994. The cleaning process developed utilizes aqueous degreasers, aqueous surfactants, and ultrasonics in the cleaning operation and an aqueous surfactant, ultrasonics, and Total Organic Carbon Analyzer (TOCA) in the nonvolatile residue (NVR) and particulate analysis for verification of cleanliness. The cleaning and verification process is presented in its entirety, with comparison to the CFC 113 cleaning and verification process, including economic and labor costs/savings.
The Mars Science Laboratory Organic Check Material
NASA Technical Reports Server (NTRS)
Conrad, Pamela G.; Eigenbrode, J. E.; Mogensen, C. T.; VonderHeydt, M. O.; Glavin, D. P.; Mahaffy, P. M.; Johnson, J. A.
2011-01-01
The Organic Check Material (OCM) has been developed for use on the Mars Science Laboratory mission to serve as a sample standard for verification of organic cleanliness and characterization of potential sample alteration as a function of the sample acquisition and portioning process on the Curiosity rover. OCM samples will be acquired using the same procedures for drilling, portioning and delivery as are used to study martian samples with The Sample Analysis at Mars (SAM) instrument suite during MSL surface operations. Because the SAM suite is highly sensitive to organic molecules, the mission can better verify the cleanliness of Curiosity's sample acquisition hardware if a known material can be processed through SAM and compared with the results obtained from martian samples.
Tebbutt, G; Bell, V; Aislabie, J
2007-04-01
The aim of this study was to determine whether or not the assessment of surface cleanliness could make a contribution to visual inspections of food premises. Forty-five premises were studied with both rapid (ATP) and traditional microbiological swabbing being used to test surfaces that either come into direct contact with prepared foods or were likely to be touched by hands during food preparation. A significant link was found between aerobic colony counts and ATP measurements. In most cases, the visual appearance of surfaces could not be used to accurately predict either microbial or ATP results. This study suggests that ATP testing is a useful indicator of surface cleanliness and could be helpful to local authority officers as part of risk assessment inspections. This study provides further evidence that visual inspection alone may not always be adequate to assess surface cleanliness. In high-risk premises, ATP could, if appropriately targeted, help identify potential problem areas. The results are available at the time of the inspection and can be used as an on-the-spot teaching aid.
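Because both colony counts and ATP readings are strongly right-skewed, the link reported here is typically assessed with a rank correlation. A sketch with invented readings (not the study's data):

    import numpy as np
    from scipy.stats import spearmanr

    acc = np.array([12, 150, 3, 890, 45, 2300, 60, 8])        # cfu per swab
    atp = np.array([95, 720, 40, 2500, 310, 6100, 280, 120])  # relative light units

    rho, p = spearmanr(acc, atp)
    print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")

    # On-the-spot screening: flag surfaces above a hypothetical ATP action level.
    ACTION_RLU = 500
    print("reclean:", [i for i, v in enumerate(atp) if v > ACTION_RLU])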
Cleaning and Cleanliness Measurement of Additive Manufactured Parts
NASA Technical Reports Server (NTRS)
Mitchell, Mark A.; Edwards, Kevin; Fox, Eric; Boothe, Richard
2017-01-01
Additive Manufacturing processes allow for the manufacture of complex three dimensional components that otherwise could not be manufactured. Post treatment processes require the removal of any remnant bulk powder that may become entrapped within small cavities and channels within a component. This project focuses on several gross cleaning methods and the verification metrics associated with additive manufactured parts for oxygen propulsion usage.
Supersonic Gas-Liquid Cleaning System
NASA Technical Reports Server (NTRS)
Kinney, Frank
1996-01-01
The Supersonic Gas-Liquid Cleaning System Research Project consisted mainly of a feasibility study, including theoretical and engineering analysis, of a proof-of-concept prototype of this particular cleaning system developed by NASA-KSC. The cleaning system utilizes gas-liquid supersonic nozzles to generate high impingement velocities at the surface of the device to be cleaned. The cleaning fluid being accelerated to these high velocities may consist of any solvent or liquid, including water. Compressed air or any inert gas is used to provide the conveying medium for the liquid, as well as substantially reduce the total amount of liquid needed to perform adequate surface cleaning and cleanliness verification. This type of aqueous cleaning system is considered to be an excellent way of conducting cleaning and cleanliness verification operations as replacements for the use of CFC 113 which must be discontinued by 1995. To utilize this particular cleaning system in various cleaning applications for both the Space Program and the commercial market, it is essential that the cleaning system, especially the supersonic nozzle, be characterized for such applications. This characterization consisted of performing theoretical and engineering analysis, identifying desirable modifications/extensions to the basic concept, evaluating effects of variations in operating parameters, and optimizing hardware design for specific applications.
Surface contamination analysis technology team overview
NASA Astrophysics Data System (ADS)
Burns, H. Dewitt, Jr.
1996-11-01
The surface contamination analysis technology (SCAT) team was originated as a working group of NASA civil service, Space Shuttle contractor, and university groups. Participating members of the SCAT team have included personnel from NASA Marshall Space Flight Center's Materials and Processes Laboratory and Langley Research Center's Instrument Development Group; contractors, including Thiokol Corporation's Inspection Technology Group, the AC Engineering support contractor, Aerojet, SAIC, the Lockheed Martin/Oak Ridge Y-12 support contractor, and the Shuttle External Tank prime contractor; and the University of Alabama in Huntsville's Center for Robotics and Automation. The goal of the SCAT team as originally defined was to develop and integrate a multi-purpose inspection head for robotic application to in-process inspection of contamination-sensitive surfaces. One area of interest was replacement of ozone-depleting solvents currently used for surface cleanliness verification. The team approach brought together the appropriate personnel to determine which surface inspection techniques were applicable to multi-program surface cleanliness inspection. Major substrates of interest were chosen to simulate Space Shuttle critical bonding surfaces or surfaces sensitive to contamination, such as fuel system component surfaces. Inspection techniques evaluated include optically stimulated electron emission (photoelectron emission); Fourier transform infrared spectroscopy; near-infrared fiber optic spectroscopy; and ultraviolet fluorescence. Current plans are to demonstrate an integrated system in MSFC's Productivity Enhancement Complex within five years from initiation of this effort in 1992. Instrumentation specifications and designs developed under this effort include a portable diffuse reflectance FTIR system built by Surface Optics Corporation and a third-generation optically stimulated electron emission system built by LaRC. This paper will discuss the evaluation of the various techniques on a number of substrate materials contaminated with hydrocarbons, silicones, and fluorocarbons. Discussion will also include standards development for instrument calibration and testing.
Cleanliness verification process at Martin Marietta Astronautics
NASA Astrophysics Data System (ADS)
King, Elizabeth A.; Giordano, Thomas J.
1994-06-01
The Montreal Protocol and the 1990 Clean Air Act Amendments mandate that CFC-113, other chlorinated fluorocarbons (CFCs), and 1,1,1-trichloroethane (TCA) be banned from production after December 31, 1995. In response to increasing pressures, the Air Force has formulated policy that prohibits purchase of these solvents for Air Force use after April 1, 1994. In response to the Air Force policy, Martin Marietta Astronautics is in the process of eliminating all CFCs and TCA from use at the Engineering Propulsion Laboratory (EPL), located on Air Force property PJKS. Gross and precision cleaning operations are currently performed on spacecraft components at EPL. The final step of the operation is a rinse with a solvent, typically CFC-113. This solvent is then analyzed for nonvolatile residue (NVR), particle count, and total filterable solids (TFS) to determine the cleanliness of the parts. The CFC-113 used in this process must be replaced in response to the above policies. Martin Marietta Astronautics, under contract to the Air Force, is currently evaluating and testing alternatives for a cleanliness verification solvent. Completion of testing is scheduled for May 1994. Evaluation of the alternative solvents follows a three-step approach. The first is initial testing of solvents identified through literature searches and analysis. The second step is detailed testing of the top candidates from the initial test phase. The final step is implementation and validation of the chosen alternative(s). Testing will include contaminant removal, nonvolatile residue, material compatibility, and propellant compatibility. Typical materials and contaminants will be tested with a wide range of solvents. Final results of the three steps will be presented, as well as the implementation plan for solvent replacement.
NASA Technical Reports Server (NTRS)
Bartelson, D.
1984-01-01
The PLB, its cargo, and the payload canister must satisfy the cleanliness requirements of visual clean (VC) level 1, 2, 3, or special, as stated in NASA document SN-C-0005A. The specific level of cleanliness is chosen by the payload bay customer for their mission. During orbiter turnaround processing at KSC, the payload bay is exposed to the environments of the Orbiter Processing Facility (OPF) and the Payload Changeout Room (PCR). Because the payload bay interfaces with these facilities, the facility environment must be controlled and monitored to protect the cleanliness and environmental integrity of the payload bay and its cargo. Techniques used to meet environmental requirements during orbiter processing are introduced.
Grazing-Angle Fourier Transform Infrared Spectroscopy for Surface Cleanliness Verification
2003-03-01
...coating. North Island personnel were also interested in using the portable FTIR instrument to detect a trivalent chromium conversion coating on... trivalent chromium coating on aluminum panels. Following the successful field test at NADEP North Island in December 2000, a second demonstration of... contaminated, the panels were allowed to dry under a fume hood to evaporate the solvent. They were then placed in a desiccator for final drying. This
Surface contamination analysis technology team overview
NASA Technical Reports Server (NTRS)
Burns, H. Dewitt
1995-01-01
A team was established which consisted of representatives from NASA (Marshall Space Flight Center and Langley Research Center), Thiokol Corporation, the University of Alabama in Huntsville, AC Engineering, SAIC, Martin Marietta, and Aerojet. The team's purpose was to bring together the appropriate personnel to determine what surface inspection techniques were applicable to multiprogram bonding surface cleanliness inspection. In order to identify appropriate techniques and their sensitivity to various contaminant families, calibration standards were developed. Producing standards included development of consistent low level contamination application techniques. Oxidation was also considered for effect on inspection equipment response. Ellipsometry was used for oxidation characterization. Verification testing was then accomplished to show that selected inspection techniques could detect subject contaminants at levels found to be detrimental to critical bond systems of interest. Once feasibility of identified techniques was shown, selected techniques and instrumentation could then be incorporated into a multipurpose inspection head and integrated with a robot for critical surface inspection. Inspection techniques currently being evaluated include optically stimulated electron emission (OSEE); near infrared (NIR) spectroscopy utilizing fiber optics; Fourier transform infrared (FTIR) spectroscopy; and ultraviolet (UV) fluorescence. Current plans are to demonstrate an integrated system in MSFC's Productivity Enhancement Complex within five years from initiation of this effort in 1992 assuming appropriate funding levels are maintained. This paper gives an overview of work accomplished by the team and future plans.
The Search for Nonflammable Solvent Alternatives for Cleaning Aerospace Oxygen Systems
NASA Technical Reports Server (NTRS)
Mitchell, Mark A.; Lowrey, Nikki
2012-01-01
To obtain a high degree of cleanliness without risk of corrosion or hazardous reactivity, hydrochlorofluorocarbon (HCFC)-225 is used for cleaning and cleanliness verification of oxygen system components used on NASA's bipropellant launch vehicles, associated test stands, and support equipment. HCFC-225 is a Class II Ozone Depleting Substance (ODS-II) that was introduced to replace chlorofluorocarbon (CFC)-113, a Class I ODS solvent that is now banned. To meet environmental regulations to eliminate the use of ozone depleting substances, a replacement solvent is required for HCFC-225 that is effective at removing oils, greases, and particulate from large oxygen system components, is compatible with materials used in the construction of these systems, and is nonflammable and non-reactive in enriched oxygen environments. A solvent replacement is also required for aviator's breathing oxygen systems and other related equipment currently cleaned and verified with HCFC-225 and stockpiled CFC-113. Requirements and challenges in the search for nonflammable replacement solvents are discussed.
NASA Technical Reports Server (NTRS)
Mitchell, Mark A.; Lowrey, Nikki M.
2015-01-01
Since the 1990's, when the Class I Ozone Depleting Substance (ODS) chlorofluorocarbon-113 (CFC-113) was banned, NASA's rocket propulsion test facilities at Marshall Space Flight Center (MSFC) and Stennis Space Center (SSC) have relied upon hydrochlorofluorocarbon-225 (HCFC-225) to safely clean and verify the cleanliness of large scale propulsion oxygen systems. Effective January 1, 2015, the production, import, export, and new use of HCFC-225, a Class II ODS, was prohibited by the Clean Air Act. In 2012 through 2014, leveraging resources from both the NASA Rocket Propulsion Test Program and the Defense Logistics Agency - Aviation Hazardous Minimization and Green Products Branch, test labs at MSFC, SSC, and Johnson Space Center's White Sands Test Facility (WSTF) collaborated to seek out, test, and qualify a replacement for HCFC-225 that is both an effective cleaner and safe for use with oxygen systems. Candidate solvents were selected and a test plan was developed following the guidelines of ASTM G127, Standard Guide for the Selection of Cleaning Agents for Oxygen Systems. Solvents were evaluated for materials compatibility, oxygen compatibility, cleaning effectiveness, and suitability for use in cleanliness verification and field cleaning operations. Two solvents were determined to be acceptable for cleaning oxygen systems and one was chosen for implementation at NASA's rocket propulsion test facilities. The test program and results are summarized. This project also demonstrated the benefits of cross-agency collaboration in a time of limited resources.
The effect of environmental initiatives on NASA specifications and standards activities
NASA Technical Reports Server (NTRS)
Griffin, Dennis; Webb, David; Cook, Beth
1995-01-01
The NASA Operational Environment Team (NOET) has conducted a survey of NASA centers' specifications and standards that require the use of ozone depleting substances (ODSs), including chlorofluorocarbons (CFCs), halons, and chlorinated solvents. The results of this survey are presented here, along with a pathfinder approach utilized at Marshall Space Flight Center (MSFC) to eliminate the use of ODSs in targeted specifications and standards. Also presented are the lessons learned from a pathfinder effort to replace CFC-113 in a significant MSFC specification covering cleaning and cleanliness verification methods for oxygen, fuel, and pneumatic service, including Shuttle propulsion elements.
Cleaning and Cleanliness Measurement of Additive Manufactured Parts
NASA Technical Reports Server (NTRS)
Mitchell, Mark A.; Raley, Randy
2016-01-01
The successful acquisition and utilization of piece parts and assemblies for contamination sensitive applications requires application of cleanliness acceptance criteria. Contamination can be classified using many different schemes. One common scheme is classification as organic, ionic and particulate contaminants. These may be present in and on the surface of solid components and assemblies or may be dispersed in various gaseous or liquid media. This discussion will focus on insoluble particle contamination on the surfaces of piece parts and assemblies. Cleanliness of parts can be controlled using two strategies, referred to as gross cleanliness and precision cleanliness. Under a gross cleanliness strategy acceptance is based on visual cleanliness. This approach introduces a number of concerns that render it unsuitable for controlling cleanliness of high technology products. Under the precision cleanliness strategy, subjective, visual assessment of cleanliness is replaced by objective measurement of cleanliness. When a precision cleanliness strategy is adopted there naturally arises the question: How clean is clean enough? The methods for establishing objective cleanliness acceptance limits will be discussed.
Establishing and Monitoring an Aseptic Workspace for Building the MOMA Mass Spectrometer
NASA Technical Reports Server (NTRS)
Lalime, Erin
2016-01-01
Mars Organic Molecule Analyzer (MOMA) is an instrument suite on the ESA ExoMars 2018 Rover, and the Mass Spectrometer (MOMA-MS) is being built at Goddard Space Flight Center (GSFC). As MOMA-MS is a life-detection instrument, it falls in the most stringent category of Planetary Protection (PP) biological cleanliness requirements: less than 0.03 spore/m2 is allowed in the instrument sample path. In order to meet these PP requirements, MOMA-MS must be built and maintained in a low bioburden environment. The MOMA-MS project at GSFC maintains three cleanrooms with varying levels of bioburden control. The Aseptic Assembly Cleanroom has the highest level of control, applying three different bioburden-reducing methods: 70% isopropyl alcohol (IPA), 7.5% hydrogen peroxide, and Ultra-Violet C light. The three methods are used in rotation, and each kills microbes by a different mechanism, reducing the likelihood of microorganisms developing resistance to all three. The Integration and Mars Chamber Cleanrooms use less biocidal cleaning, with the option to deploy extra techniques as necessary. To support the monitoring of cleanrooms and verification that MOMA-MS hardware meets PP requirements, a new Planetary Protection lab was established that currently has the capabilities of standard growth assays for spore or vegetative bacteria, rapid bioburden analysis that detects Adenosine Triphosphate (ATP), plus autoclave and DHMR verification. The cleanrooms are monitored both for vegetative microorganisms and by rapid ATP assay, and a clear difference in bioburden is observed between the aseptic and the other cleanrooms.
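For scale, the arithmetic that connects swab growth assays to a requirement like 0.03 spore/m2 can be sketched as follows. The recovery efficiency and plated fraction are illustrative assumptions, not MOMA values; the point is that a single swab cannot demonstrate so stringent a limit, which is why compliance rests on pooled zero-count samples and credit for DHMR cycles.

    def spores_per_m2(colonies, area_cm2, recovery=0.3, fraction_plated=0.8):
        """Estimated surface spore density from a swab/wipe growth assay.
        recovery and fraction_plated are assumed correction factors."""
        corrected = colonies / (recovery * fraction_plated)
        return corrected / (area_cm2 * 1e-4)     # cm^2 -> m^2

    # Even one colony on a single 25 cm^2 swab implies ~1700 spore/m^2,
    # vastly above a 0.03 spore/m^2 allocation.
    print(f"{spores_per_m2(1, 25):.0f} spore/m^2 from 1 colony on 25 cm^2")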
Clean and Cold Sample Curation
NASA Technical Reports Server (NTRS)
Allen, C. C.; Agee, C. B.; Beer, R.; Cooper, B. L.
2000-01-01
Curation of Mars samples includes both samples that are returned to Earth, and samples that are collected, examined, and archived on Mars. Both kinds of curation operations will require careful planning to ensure that the samples are not contaminated by the instruments that are used to collect and contain them. In both cases, sample examination and subdivision must take place in an environment that is organically, inorganically, and biologically clean. Some samples will need to be prepared for analysis under ultra-clean or cryogenic conditions. Inorganic and biological cleanliness are achievable separately by cleanroom and biosafety lab techniques. Organic cleanliness to the <50 ng/sq cm level requires material control and sorbent removal - techniques being applied in our Class 10 cleanrooms and sample processing gloveboxes.
Litman, Leib; Robinson, Jonathan; Weinberger-Litman, Sarah L; Finkelstein, Ron
2017-08-24
In the present study, we explore how intrinsic and extrinsic religious orientations are associated with cleanliness attitudes. We find that reported importance of religion is associated with increased cleanliness concerns and interest in cleanliness. Attitudes toward cleanliness were also associated with both intrinsic religious orientation and extrinsic religious orientation. Together, religiosity and religious orientation account for 14.7% of the variance in cleanliness attitudes and remained significant in the presence of personality, socioeconomic status, age, education, obsessive-compulsive attitudes toward cleanliness, and other covariates. These results show that religiosity is associated with cleanliness via multiple routes. We suggest that intrinsic religious orientation leads to increased interest in cleanliness due to the link between physical and spiritual purity. Extrinsic religious orientation may be linked with cleanliness because of the secondary benefits, including health and the facilitation of communal cohesiveness, that cleanliness rituals offer. The implications of these findings for the relationship between religion and health are discussed.
NASA Technical Reports Server (NTRS)
Caruso, Salvadore V.; Cox, Jack A.; McGee, Kathleen A.
1998-01-01
Marshall Space Flight Center (MSFC) of the National Aeronautics and Space Administration performs many research and development programs that require hardware and assemblies to be cleaned to levels that are compatible with fuels and oxidizers (liquid oxygen, solid propellants, etc.). Also, MSFC is responsible for developing large telescope satellites which require a variety of optical systems to be cleaned. A precision cleaning shop is operated within MSFC by the Fabrication Services Division of the Materials & Processes Laboratory. Verification of cleanliness is performed for all precision cleaned articles in the Environmental and Analytical Chemistry Branch. Since the Montreal Protocol was instituted, MSFC had to find substitutes for many materials that have been in use for many years, including cleaning agents and organic solvents. As MSFC is a research center, there is a great variety of hardware that is processed in the Precision Cleaning Shop. This entails the use of many different chemicals and solvents, depending on the nature and configuration of the hardware and softgoods being cleaned. A review of the manufacturing cleaning and verification processes, cleaning materials and solvents used at MSFC and changes that resulted from the Montreal Protocol will be presented.
NASA Technical Reports Server (NTRS)
Caruso, Salvadore V.
1999-01-01
Marshall Space Flight Center (MSFC) of the National Aeronautics and Space Administration (NASA) performs many research and development programs that require hardware and assemblies to be cleaned to levels that are compatible with fuels and oxidizers (liquid oxygen, solid propellants, etc.). Also, the Center is responsible for developing large telescope satellites which require a variety of optical systems to be cleaned. A precision cleaning shop is operated within MSFC by the Fabrication Services Division of the Materials & Processes Division. Verification of cleanliness is performed for all precision cleaned articles in the Analytical Chemistry Branch. Since the Montreal Protocol was instituted, MSFC had to find substitutes for many materials that have been in use for many years, including cleaning agents and organic solvents. As MSFC is a research center, there is a great variety of hardware that is processed in the Precision Cleaning Shop. This entails the use of many different chemicals and solvents, depending on the nature and configuration of the hardware and softgoods being cleaned. A review of the manufacturing cleaning and verification processes, cleaning materials and solvents used at MSFC, and changes that resulted from the Montreal Protocol will be presented.
Techniques of biological contamination avoidance by atmospheric probes
NASA Technical Reports Server (NTRS)
Defrees, R. E.
1974-01-01
The likelihood of biologically contaminating a planet with an atmospheric probe is low if the probe is kept biologically clean during terrestrial operations and if the structure remains intact until the planet's life zone is completely penetrated. High standards of cleanliness, monitoring, and estimating for remedial actions must be maintained in a probe program. It is not a foregone conclusion, however, that heat sterilization needs to be employed. Several techniques with good potential for lower probe costs are available and appear adequate to render a probe sterile within acceptable bounds. The techniques considered satisfactory for minimizing microbial load include: (1) combined heat (at 95-105 C) and gamma radiation; (2) short-term heating at 105 ± 5 C to inactivate all vegetative microbes; (3) routine irradiation by ultraviolet light; (4) wiping with a bactericidal agent, with or without a penetrant; and (5) cleanliness alone.
Balanced Rotating Spray Tank and Pipe Cleaning and Cleanliness Verification System
NASA Technical Reports Server (NTRS)
Caimi, Raoul E. B. (Inventor); Thaxton, Eric A. (Inventor)
1998-01-01
A system for cleaning and verifying the cleanliness of the interior surfaces of hollow items, such as small bottles, tanks, pipes and tubes, employs a rotating spray head for supplying a gas-liquid cleaning mixture to the item's surface at a supersonic velocity. The spray head incorporates a plurality of nozzles having diverging cross sections so that the incoming gas-liquid mixture is first converged within the spray head and then diverged through the nozzles, thereby accelerating the mixture to a supersonic velocity. In the preferred embodiment, three nozzles are employed; one forwardly facing nozzle at the end of the spray head and two oppositely facing angled nozzles exiting on opposite sides of the spray head which balance each other, and therefore impart no net side load on the spray head. A drive mechanism is provided to rotate the spray head and at the same time move the head back and forth within the item to be cleaned. The drive mechanism acts on a long metal tube to which the spray head is fixed, and thus no moving parts are exposed to the interior surfaces of the items to be cleaned, thereby reducing the risk of contamination.
A noncontacting scanning photoelectron emission technique for bonding surface cleanliness inspection
NASA Technical Reports Server (NTRS)
Gause, Raymond L.
1989-01-01
Molecular contamination of bonding surfaces can drastically affect the bond strength that can be achieved and therefore the structural integrity and reliability of the bonded part. The presence of thin contaminant films on bonding surfaces can result from inadequate or incomplete cleaning methods, from oxide growth during the time between cleaning (such as grit blasting) and bonding, or from failure to properly protect cleaned surfaces from oils, greases, fingerprints, release agents, or deposition of facility airborne molecules generated by adjacent manufacturing or processing operations. Required cleanliness levels for desired bond performance can be determined by testing to correlate bond strength with contaminant type and quantity, thereby establishing the degree of contamination that can be tolerated based on the strength that is needed. Once the maximum acceptable contaminant level is defined, a method is needed to quantitatively measure the contaminant level on the bonding surface prior to bonding to verify that the surface meets the established cleanliness requirement. A photoelectron emission technique for the nondestructive inspection of various bonding surfaces, both metallic and nonmetallic, to provide quantitative data on residual contaminant levels is described. The technique can be used to scan surfaces at speeds of at least 30 ft/min using a servo system to maintain required sensor to surface spacing. The fundamental operation of the photoelectron emission sensor system is explained and the automated scanning system and computer data acquisition hardware and software are described.
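The calibration step described (correlating signal with contaminant type and quantity) might look like the sketch below, which assumes an exponential attenuation of photoemission with contaminant areal density; that functional form and all data are invented for illustration, not taken from the paper.

    import numpy as np
    from scipy.optimize import curve_fit

    # Calibration standards: known contaminant loadings and measured signal.
    w = np.array([0.0, 0.5, 1.0, 2.0, 4.0])             # ug/cm^2
    signal = np.array([100.0, 78.0, 62.0, 38.0, 15.0])  # arbitrary units

    def model(w, i0, k):
        return i0 * np.exp(-k * w)   # assumed attenuation law

    (i0, k), _ = curve_fit(model, w, signal, p0=(100.0, 0.5))

    def contamination(reading):
        """Invert the calibration: signal reading -> estimated ug/cm^2."""
        return np.log(i0 / reading) / k

    print(f"I0 = {i0:.1f}, k = {k:.2f} cm^2/ug; "
          f"reading 50 -> {contamination(50.0):.2f} ug/cm^2")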
Establishing and monitoring an aseptic workspace for building the MOMA mass spectrometer
NASA Astrophysics Data System (ADS)
Lalime, Erin N.; Berlin, David
2016-09-01
Mars Organic Molecule Analyzer (MOMA) is an instrument suite on the European Space Agency (ESA) ExoMars 2020 Rover, and the Mass Spectrometer (MOMA-MS) is being built at Goddard Space Flight Center (GSFC). MOMA-MS is a life-detection instrument and thus falls in the most stringent category of Planetary Protection (PP) biological cleanliness requirements. Less than 0.03 spore/m2 are allowed in the instrument sample path. In order to meet these PP requirements, MOMA-MS must be built and maintained in a low bioburden environment. The MOMA-MS project at GSFC maintains three clean rooms with varying levels of bioburden control. The Aseptic Assembly Clean room has the highest level of control, applying three different bioburden reducing methods: 70% Isopropyl Alcohol (IPA), 7.5% Hydrogen Peroxide, and Ultra-Violet C (UVC) light. The three methods are used in rotation and each kills microorganisms by a different mechanism, reducing the likelihood of microorganisms developing resistance to all three. The Integration and Mars Chamber Clean rooms use less biocidal cleaning, with the option to deploy extra techniques as necessary. To support the monitoring of clean rooms and verification that MOMA-MS hardware meets PP requirements, a new Planetary Protection lab was established that currently has the capabilities of standard growth assays for spore or vegetative bacteria, rapid bioburden analysis that detects Adenosine Triphosphate (ATP), plus autoclave and Dry Heat microbial Reduction (DHMR) verification. The clean rooms are monitored for vegetative microorganisms and by rapid ATP assay, and a clear difference in bioburden is observed between the aseptic and other clean room.
Establishing and Monitoring an Aseptic Workspace for Building the MOMA Mass Spectrometer
NASA Technical Reports Server (NTRS)
Lalime, Erin N.; Berlin, David
2016-01-01
Mars Organic Molecule Analyzer (MOMA) is an instrument suite on the European Space Agency (ESA) ExoMars 2020 Rover, and the Mass Spectrometer (MOMA-MS) is being built at Goddard Space Flight Center (GSFC). MOMA-MS is a life-detection instrument and thus falls in the most stringent category of Planetary Protection (PP) biological cleanliness requirements: less than 0.03 spores/m2 is allowed in the instrument sample path. In order to meet these PP requirements, MOMA-MS must be built and maintained in a low-bioburden environment. The MOMA-MS project at GSFC maintains three clean rooms with varying levels of bioburden control. The Aseptic Assembly Clean room has the highest level of control, applying three different bioburden-reducing methods: 70% Isopropyl Alcohol (IPA), 7.5% Hydrogen Peroxide, and Ultraviolet C (UVC) light. The three methods are used in rotation, and each kills microorganisms by a different mechanism, reducing the likelihood of microorganisms developing resistance to all three. The Integration and Mars Chamber Clean rooms use less biocidal cleaning, with the option to deploy extra techniques as necessary. To support the monitoring of the clean rooms and verification that MOMA-MS hardware meets PP requirements, a new Planetary Protection lab was established that currently has the capabilities of standard growth assays for spore or vegetative bacteria, rapid bioburden analysis that detects Adenosine Triphosphate (ATP), plus autoclave and Dry Heat Microbial Reduction (DHMR) verification. The clean rooms are monitored for vegetative microorganisms and by rapid ATP assay, and a clear difference in bioburden is observed between the aseptic and the other clean rooms.
Asthma and the hygiene hypothesis. Does cleanliness matter?
Weber, Juliane; Illi, Sabina; Nowak, Dennis; Schierl, Rudolf; Holst, Otto; von Mutius, Erika; Ege, Markus J
2015-03-01
The early hygiene hypothesis explained the development of allergies by a lack of infections; nowadays, the aspect of excessive cleanliness in affluent populations seems to have replaced this concept. Yet, no investigation has shown that home or personal cleanliness relates to allergic diseases. Our objective was to relate personal and home cleanliness to the risk of asthma and allergies. Comprehensive questionnaire information on home and personal cleanliness and allergic health conditions at school age was collected from 399 participants of the urban Perinatale Asthma Umwelt Langzeit Allergie Studie (PAULA) birth cohort. Bacterial markers were assessed in floor and mattress dust and were related to cleanliness and allergic diseases. Personal cleanliness was inversely related to bacterial compounds on floors and mattresses, whereas home cleanliness effectively reduced the amount of dust but not microbial markers. Exposure to muramic acid was related to a lower prevalence of school-age asthma (adjusted odds ratio, 0.59 [95% confidence interval, 0.39-0.90]). Mattress endotoxin in the first year of life was inversely associated with atopic sensitization (0.73 [0.56-0.96]) and asthma at school age (0.72 [0.55-0.95]). Despite the associations of dust parameters both with cleanliness and with allergic health conditions, the development of allergies was not related to home and personal cleanliness. Bacterial exposure in house dust determined childhood asthma and allergies. Personal cleanliness, such as washing hands, and home cleanliness were objectively reflected by dust parameters in homes. However, neither personal nor home cleanliness was associated with a risk for asthma and allergies. Other microbial components in house dust not affected by personal hygiene are likely to play a role.
Using Image Pro Plus Software to Develop Particle Mapping on Genesis Solar Wind Collector Surfaces
NASA Technical Reports Server (NTRS)
Rodriquez, Melissa C.; Allton, J. H.; Burkett, P. J.
2012-01-01
The continued success of the Genesis mission science team in analyzing solar wind collector array samples is partially based on close collaboration of the JSC curation team with science team members who develop cleaning techniques and those who assess elemental cleanliness at the levels of detection. The goal of this collaboration is to develop a reservoir of solar wind collectors of known cleanliness to be available to investigators. The heart and driving force behind this effort is Genesis mission PI Don Burnett. While JSC contributes characterization, safe clean storage, and benign collector cleaning with ultrapure water (UPW) and UV ozone, Burnett has coordinated more exotic and rigorous cleaning, which is contributed by science team members. He also coordinates cleanliness assessment requiring expertise and instruments not available in curation, such as XPS, TRXRF [1,2] and synchrotron TRXRF. JSC participates by optically documenting the particle distributions as cleaning steps progress. Thus, optical documentation supplements SEM imaging and analysis, and elemental assessment by TRXRF.
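A rough stand-in for this kind of particle mapping follows; it is not the Image Pro Plus workflow itself, and the image and threshold are synthetic.

```python
import numpy as np
from scipy import ndimage

# Threshold a (synthetic) optical image of a collector surface, label
# connected bright regions as particles, and tabulate their pixel sizes.
image = np.random.rand(512, 512)      # placeholder for a real micrograph
particles = image > 0.995             # assumed intensity threshold
labels, count = ndimage.label(particles)
sizes_px = np.asarray(ndimage.sum(particles, labels, index=np.arange(1, count + 1)))

print(f"{count} particles detected; largest = {sizes_px.max():.0f} px")
```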
Haidar Ahmad, Imad A; Tam, James; Li, Xue; Duffield, William; Tarara, Thomas; Blasko, Andrei
2017-02-05
The parameters affecting the recovery of pharmaceutical residues from the surface of stainless steel coupons for quantitative cleaning verification method development have been studied, including active pharmaceutical ingredient (API) level, spiking procedure, API/excipient ratio, analyst-to-analyst variability, inter-day variability, and cleaning procedure of the coupons. The lack of a well-defined procedure that consistently cleaned the coupon surface was identified as the major contributor to low and variable recoveries. Assessment of acid, base, and oxidant washes, as well as the order of treatment, showed that a base-water-acid-water-oxidizer-water wash procedure resulted in consistent, accurate spiked recovery (>90%) and reproducible results (S_rel ≤ 4%). By applying this cleaning procedure to previously used coupons that had failed the cleaning acceptance criteria, multiple analysts were able to obtain consistent recoveries from day to day for different APIs and API/excipient ratios at various spike levels. We successfully applied our approach for cleaning verification of small molecules (MW < 1000 Da) as well as large biomolecules (MW up to 50,000 Da). Method robustness was greatly influenced by the sample preparation procedure, especially for analyses using total organic carbon (TOC) determination. Copyright © 2016 Elsevier B.V. All rights reserved.
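The acceptance math implied by the abstract's criteria (>90% recovery, S_rel ≤ 4%) can be sketched as follows; the replicate values are invented for illustration.

```python
import statistics

spiked_ug = 100.0                                 # amount spiked on the coupon
recovered_ug = [93.5, 95.1, 91.8, 94.2, 92.7]     # assumed replicate results

recoveries = [100.0 * r / spiked_ug for r in recovered_ug]
mean_recovery = statistics.mean(recoveries)
s_rel = 100.0 * statistics.stdev(recoveries) / mean_recovery   # relative std dev

print(f"mean recovery {mean_recovery:.1f}% (criterion >90%), "
      f"S_rel {s_rel:.1f}% (criterion <=4%)")
```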
NASA Astrophysics Data System (ADS)
Tan, Samantha H.; Chen, Ning; Liu, Shi; Wang, Kefei
2003-09-01
As part of the semiconductor industry "contamination-free manufacturing" effort, significant emphasis has been placed on reducing potential sources of contamination from process equipment and process equipment components. Process tools contain process chambers and components that are exposed to the process environment or process chemistry and in some cases are in direct contact with production wafers. Any contamination from these sources must be controlled or eliminated in order to maintain high process yields, device performance, and device reliability. This paper discusses new nondestructive analytical methods for quantitative measurement of the cleanliness of metal, quartz, polysilicon and ceramic components that are used in process equipment tools. The goal of these new procedures is to measure the effectiveness of cleaning procedures and to verify whether a tool component part is sufficiently clean for installation and subsequent routine use in the manufacturing line. These procedures provide a reliable "qualification method" for tool component certification and also provide a routine quality control method for reliable operation of cleaning facilities. Cost advantages to wafer manufacturing include higher yields due to improved process cleanliness and elimination of yield loss and downtime resulting from the installation of "bad" components in process tools. We also discuss a representative example of wafer contamination having been linked to a specific process tool component.
Mass Spectrometer for Airborne Micro-Organisms
NASA Technical Reports Server (NTRS)
Sinha, M. P.; Friedlander, S. K.
1986-01-01
Bacteria and other micro-organisms identified continuously with aid of new technique for producing samples for mass spectrometer. Technique generates aerosol of organisms and feeds it to spectrometer. Given species of organism produces characteristic set of peaks in mass spectrum and is thereby identified. Technique useful for monitoring bacterial makeup in environmental studies and in places where cleanliness is essential, such as hospital operating rooms, breweries, and pharmaceutical plants.
Precision Clean Hardware: Maintenance of Fluid Systems Cleanliness
NASA Technical Reports Server (NTRS)
Sharp, Sheila; Pedley, Mike; Bond, Tim; Quaglino, Joseph; Lorenz, Mary Jo; Bentz, Michael; Banta, Richard; Tolliver, Nancy; Golden, John; Levesque, Ray
2003-01-01
The ISS fluid systems are so complex that fluid system cleanliness cannot be verified at the assembly level. A "build clean / maintain clean" approach was therefore used for all major fluid systems: verify cleanliness at the detail and subassembly level, then maintain cleanliness during assembly.
Seixas, Fábio Heredia; Estrela, Carlos; Bueno, Mike Reis; Sousa-Neto, Manoel Damião; Pécora, Jesus Djalma
2015-06-01
The aim of this study was to determine the root canal area before and after instrumentation 1 mm short of the apical foramen by clinical and cone beam computed tomography (CBCT) methods, and to evaluate the cleanliness of the apical region in mesiodistally flattened teeth by using optical microscopy. Forty-two human single-canal mandibular incisors were instrumented using the Free Tip Preparation technique, up to three, four, or five instruments beyond the initial one. Cone beam computed tomography scans were acquired of the samples before and after root canal preparation (RCP). Irrigation was performed by conventional or hydrodynamic means, using 2.5% sodium hypochlorite. The samples were prepared for observation under an optical microscope. Images were digitally obtained and analyzed, and the results were submitted to statistical analysis (two-way ANOVA complemented by Bonferroni's post-test). There was no significant difference between the studied anatomical areas with either the CBCT or the clinical method. There were no differences between irrigation methods. Differences between instrumentation techniques were verified. Instrumentation with four instruments beyond the initial instrument produced a significant increase in the contact area compared to preparation with three instruments, but RCP with five instruments did not result in better cleanliness. The CBCT analysis was not capable of determining the precise shape of the surgical apical area compared to the clinical method. Neither the conventional nor the hydrodynamic irrigation technique was able to render the root canals debris-free. The instruments' action on the root canal walls was proportional to the number of instruments used beyond the initial apical instrument.
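For readers unfamiliar with the analysis named above, a two-way ANOVA of this design could be run as in the sketch below; the data frame is synthetic, and "technique" and "irrigation" are stand-ins for the study's factors.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Illustrative two-way ANOVA (instrumentation technique x irrigation method
# on canal contact area); all values are invented.
df = pd.DataFrame({
    "area":       [41.2, 44.8, 52.1, 55.3, 50.9, 54.0, 42.5, 45.1],
    "technique":  ["3", "3", "4", "4", "5", "5", "3", "4"],
    "irrigation": ["conv", "hydro"] * 4,
})
model = ols("area ~ C(technique) * C(irrigation)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```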
Cooper, Moogega; La Duc, Myron T; Probst, Alexander; Vaishampayan, Parag; Stam, Christina; Benardini, James N; Piceno, Yvette M; Andersen, Gary L; Venkateswaran, Kasthuri
2011-08-01
A bacterial spore assay and a molecular DNA microarray method were compared for their ability to assess relative cleanliness in the context of bacterial abundance and diversity on spacecraft surfaces. Colony counts derived from the NASA standard spore assay were extremely low for spacecraft surfaces. However, the PhyloChip generation 3 (G3) DNA microarray resolved the genetic signatures of a highly diverse suite of microorganisms in the very same sample set. Samples completely devoid of cultivable spores were shown to harbor the DNA of more than 100 distinct microbial phylotypes. Furthermore, samples with higher numbers of cultivable spores did not necessarily give rise to a greater microbial diversity upon analysis with the DNA microarray. The findings of this study clearly demonstrated that there is not a statistically significant correlation between the cultivable spore counts obtained from a sample and the degree of bacterial diversity present. Based on these results, it can be stated that validated state-of-the-art molecular techniques, such as DNA microarrays, can be utilized in parallel with classical culture-based methods to further describe the cleanliness of spacecraft surfaces.
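The statistical claim, no significant correlation between cultivable spore counts and microarray-resolved diversity, could be tested with a rank correlation along these lines; the values are invented.

```python
from scipy.stats import spearmanr

# Cultivable spore counts per sample versus the number of phylotypes the
# microarray resolved in the same samples (synthetic data).
spore_counts = [0, 0, 2, 5, 1, 0, 8, 3]
phylotypes   = [120, 95, 60, 110, 140, 30, 75, 100]

rho, p = spearmanr(spore_counts, phylotypes)
print(f"rho = {rho:.2f}, p = {p:.2f}")  # large p suggests no significant correlation
```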
Litman, Leib; Williams, Monnica T; Rosen, Zohn; Weinberger-Litman, Sarah L; Robinson, Jonathan
2017-09-22
The present study has three objectives (1) to examine whether there are differences in cleanliness concerns between African Americans and European Americans toward kitchen items that are known to be vectors of disease, (2) to examine whether disparities in cleanliness attitudes have an impact on purchasing attitudes toward kitchen cleaning products, and (3) to explore the mechanisms that may account for these differences utilizing a serial mediation model. Five hundred participants, 50% African American and 50% European American were shown a picture of a sponge cleaning product and filled out multiple survey instruments relating to cleanliness attitudes. We found greater concern with cleanliness of kitchen items (d = .46) and a greater willingness to purchase cleaning products among African Americans compared to European Americans (17 vs 10%). A serial mediation analysis revealed that general cleanliness concerns account for the increased willingness to spend money on cleaning products among African Americans. These results suggest that African Americans are more sensitive to issues of cleanliness compared to European Americans and, in particular, are more sensitive to cleanliness of kitchen items such as sponges, which can be vectors of food-borne pathogens. Potential reasons for the observed racial disparities in cleanliness attitudes and the implications of these results for public health are discussed.
NASA Astrophysics Data System (ADS)
Niu, Longfei; Liu, Hao; Miao, Xinxiang; Lv, Haibing; Yuan, Xiaodong; Zhou, Hai; Yao, Caizhen; Zhou, Guorui; Li, Qin
2017-05-01
The cleaning mechanism of optical-surface particle contaminants in the light pneumatic tube was simulated based on the static equations and the JKR model. A cleaning verification experiment based on an air knife sweeping system and an on-line monitoring system in a high power laser facility was set up in order to verify the simulated results. Results showed that the removal ratio is significantly influenced by sweeping velocity and angle. The removal ratio can reach 94.3% by using a higher input pressure of the air knife, demonstrating that the air knife sweeping technology is useful for maintaining the surface cleanliness of optical elements, and thus guaranteeing the long-term stable running of the high power laser facility.
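A much-simplified force balance behind such air-knife removal is sketched below; this is not the paper's full static-equation model, and every parameter value is an assumption.

```python
import math

# Compare the JKR pull-off force for a spherical particle against the
# aerodynamic drag from an air-knife jet. All values below are assumed.
R = 5e-6            # particle radius, m
w = 0.05            # work of adhesion, J/m^2 (assumed)
rho_air = 1.2       # air density, kg/m^3
U = 200.0           # sweep jet velocity near the surface, m/s (assumed)
Cd = 1.0            # drag coefficient, order-of-magnitude assumption

f_adhesion = 1.5 * math.pi * w * R                      # JKR pull-off force
f_drag = 0.5 * rho_air * U**2 * Cd * math.pi * R**2     # dynamic-pressure drag

print(f"adhesion {f_adhesion:.2e} N vs drag {f_drag:.2e} N")
# In practice removal is governed by the rolling moment rather than direct
# lift-off, which is one reason sweep velocity and angle matter so strongly.
```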
Simulation verification techniques study
NASA Technical Reports Server (NTRS)
Schoonmaker, P. B.; Wenglinski, T. H.
1975-01-01
Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.
Kreeft, Davey; Arkenbout, Ewout Aart; Henselmans, Paulus Wilhelmus Johannes; van Furth, Wouter R.; Breedveld, Paul
2017-01-01
A clear visualization of the operative field is of critical importance in endoscopic surgery. During surgery the endoscope lens can get fouled by body fluids (eg, blood), ground substance, rinsing fluid, bone dust, or smoke plumes, resulting in visual impairment. As a result, surgeons spend part of the procedure on intermittent cleaning of the endoscope lens. Current cleaning methods that rely on manual wiping or a lens irrigation system are still far from ideal, leading to longer procedure times, dirtying of the surgical site, and reduced visual acuity, potentially reducing patient safety. With the goal of finding a solution to these issues, a literature review was conducted to identify and categorize existing techniques capable of achieving optically clean surfaces, and to show which techniques can potentially be implemented in surgical practice. The review found that the most promising method for achieving surface cleanliness consists of a hybrid solution, namely, that of a hydrophilic or hydrophobic coating on the endoscope lens and the use of the existing lens irrigation system. PMID:28511635
Property-driven functional verification technique for high-speed vision system-on-chip processor
NASA Astrophysics Data System (ADS)
Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian
2017-04-01
The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. Vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area, while design functional verification is not explicitly considered at the earlier stages at which the most consequential decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can reduce verification effort by up to 20% for a complex vision chip design while also reducing simulation and debugging overheads.
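The paper's flow targets vision system-on-chip hardware, typically checked with hardware assertion languages; as a language-agnostic analogy of property-driven checking, the sketch below expresses a small property space and evaluates an implementation against it. Everything here is illustrative, not the authors' tooling.

```python
def spec_threshold(pixels, t):
    """Reference model: binarize pixels against threshold t."""
    return [1 if p >= t else 0 for p in pixels]

def impl_threshold(pixels, t):
    """Implementation under verification (here, a refactored equivalent)."""
    return [int(p >= t) for p in pixels]

# A low-dimension property space sitting between specification and implementation.
properties = [
    lambda out, px, t: all(b in (0, 1) for b in out),        # outputs are binary
    lambda out, px, t: sum(out) == sum(p >= t for p in px),  # count agrees with spec
    lambda out, px, t: out == spec_threshold(px, t),         # full equivalence
]

stimulus = [0, 17, 128, 255, 64]
out = impl_threshold(stimulus, 128)
assert all(prop(out, stimulus, 128) for prop in properties)
print("all properties hold on this stimulus")
```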
Does Flooring Substrate Impact Kennel and Dog Cleanliness in Commercial Breeding Facilities?
Stella, Judith; Hurt, Moriah; Bauer, Amy; Gomes, Paulo; Ruple, Audrey; Beck, Alan; Croney, Candace
2018-04-21
Evaluation of kennel flooring surfaces is needed to understand their impacts on dog health and well-being. This pilot study aimed to characterize aspects of physical health, kennel cleanliness, and dog body cleanliness on flooring types common in US breeding kennels. Subjects were 118 adult dogs housed on diamond-coated expanded metal (DCEM), polypropylene (POLY), or concrete (CON) flooring at five commercial breeding facilities in Indiana, U.S. Body condition, paw, elbow, and hock health scores were recorded. Each indoor kennel and dog was visually assessed for cleanliness. Kennels were swabbed immediately after cleaning with electrostatic dry cloths and cultured for Escherichia coli. Descriptive statistics were used for analysis. Mean body condition score (BCS), kennel and dog cleanliness scores were all near ideal (3, 1.15, and 1.04, respectively). Thirty-one percent or fewer kennels at each facility were culture-positive for E. coli after cleaning. No serious paw, elbow, or hock problems were identified. Overall, the findings indicate that with appropriate management and regular access to additional surfaces, dog foot health, cleanliness, and kennel cleanliness can be maintained on the flooring types investigated.
Environmental control and waste management system design concept
NASA Technical Reports Server (NTRS)
Gandy, A. R.
1974-01-01
Passive device contains both solid and liquid animal waste matter for extended period without being cleaned and without contaminating animal. Constant airflow dries solid waste and evaporates liquid matter. Technique will maintain controlled atmospheric conditions and cage cleanliness during periods of 6 months to 1 year.
Assessing the cleanliness of surfaces: Innovative molecular approaches vs. standard spore assays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooper, M.; Duc, M.T. La; Probst, A.
2011-04-01
A bacterial spore assay and a molecular DNA microarray method were compared for their ability to assess relative cleanliness in the context of bacterial abundance and diversity on spacecraft surfaces. Colony counts derived from the NASA standard spore assay were extremely low for spacecraft surfaces. However, the PhyloChip generation 3 (G3) DNA microarray resolved the genetic signatures of a highly diverse suite of microorganisms in the very same sample set. Samples completely devoid of cultivable spores were shown to harbor the DNA of more than 100 distinct microbial phylotypes. Furthermore, samples with higher numbers of cultivable spores did not necessarily give rise to a greater microbial diversity upon analysis with the DNA microarray. The findings of this study clearly demonstrated that there is not a statistically significant correlation between the cultivable spore counts obtained from a sample and the degree of bacterial diversity present. Based on these results, it can be stated that validated state-of-the-art molecular techniques, such as DNA microarrays, can be utilized in parallel with classical culture-based methods to further describe the cleanliness of spacecraft surfaces.
Surface cleanliness of fluid systems, specification for
NASA Technical Reports Server (NTRS)
1995-01-01
This specification establishes surface cleanliness levels, test methods, cleaning and packaging requirements, and protection and inspection procedures for determining surface cleanliness. These surfaces pertain to aerospace parts, components, assemblies, subsystems, and systems in contact with any fluid medium.
Air pollution: Household soiling and consumer welfare losses
Watson, W.D.; Jaksch, J.A.
1982-01-01
This paper uses demand and supply functions for cleanliness to estimate household benefits from reduced particulate matter soiling. A demand curve for household cleanliness is estimated, based upon the assumption that households prefer more cleanliness to less. Empirical coefficients for shifting the cleanliness supply curve, related to particulate pollution levels, are taken from available studies. Consumer welfare gains from achieving the Federal primary particulate standard, aggregated across 123 SMSAs, are estimated to range from $0.9 to $3.2 million per year (1971 dollars). © 1982.
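A toy version of this welfare calculation follows, with linear demand and a horizontal supply curve whose intercept falls when soiling declines; all coefficients are invented.

```python
# Inverse demand for cleanliness: p = a - b*q; supply is flat at marginal
# cost c, which drops when particulate soiling is reduced (assumed values).
a, b = 100.0, 1.0
c_before, c_after = 40.0, 25.0

def equilibrium(c):
    """Quantity and price where demand meets a horizontal supply at cost c."""
    return (a - c) / b, c

q0, p0 = equilibrium(c_before)
q1, p1 = equilibrium(c_after)
# Welfare gain = rectangle over the old quantity + triangle over the new units.
gain = (p0 - p1) * q0 + 0.5 * (p0 - p1) * (q1 - q0)
print(f"welfare gain per household: {gain:.1f} (model units)")
```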
NASA Astrophysics Data System (ADS)
Shazwan, M. A.; Quintin, J. V.; Osman, N. A.; Suhaida, S. K.; Ma'arof, M. I. N.
2017-11-01
Construction site cleanliness and tidiness is one of the top concerns of construction site management. Good site management, with adequate planning for housekeeping, will ensure safety for both the site's working personnel and the neighbouring environment. This is especially a priority today due to the rapid growth of construction projects in Malaysia. Nevertheless, to date, statistics have shown that housekeeping-related accidents happen repeatedly despite awareness of site cleanliness and tidiness. The objective of this study was to explore contractors' perspectives on site cleanliness and tidiness. A questionnaire was distributed to thirty-four (34) Grade 7 CIDB contractors' firms in Petaling Jaya. Petaling Jaya was chosen because it is a developed, largely residential area where cleaning management is an important issue. The goals of the survey were to identify, from the perception of the contractors: (i) the hierarchy of importance of several purposes in ensuring construction site cleanliness and tidiness, and (ii) the risk factors that influence a construction site's level of cleanliness and tidiness. It was found that, from the contractors' perspective, ensuring site cleanliness and tidiness is important mainly because of the need to protect the environment, and least for cost saving. In addition, poor working attitudes among personnel were found to be the main risk factor influencing a construction site's level of cleanliness and tidiness. Conclusively, construction site cleanliness and tidiness is highly vital; even so, nationwide awareness of the issue will require full cooperation from all related parties.
NASA Technical Reports Server (NTRS)
Allton, J. H.; Zeigler, R. A.; Calaway, M. J.
2016-01-01
The Lunar Receiving Laboratory (LRL) was planned and constructed in the 1960s to support the Apollo program in the context of landing on the Moon and safely returning humans. The enduring science return from that effort is a result of careful curation of planetary materials. Technical decisions for the first facility included the sample handling environment (vacuum vs. inert gas) and instruments for making basic sample assessments, but the most difficult, and most visible, decision was stringent biosafety vs. ultra-clean sample handling. Biosafety required handling of samples in negative pressure gloveboxes and rooms for containment and use of sterilizing protocols and animal/plant models for hazard assessment. Ultra-clean sample handling worked best in positive pressure nitrogen environment gloveboxes in positive pressure rooms, using cleanable tools of tightly controlled composition. The requirements for these two objectives were so different that the solution was to design and build a new facility for the specific purpose of preserving the scientific integrity of the samples. The resulting Lunar Curatorial Facility was designed and constructed, from 1972-1979, with advice and oversight by a very active committee composed of lunar sample scientists. The high precision analyses required for planetary science are enabled by stringent contamination control of trace elements in the materials and protocols of construction (e.g., trace element screening for paint and flooring materials) and the equipment used in sample handling and storage. As other astromaterials, especially small particles and atoms, were added to the collections curated, the technical tension between particulate cleanliness and organic cleanliness was addressed in more detail. Techniques for minimizing particulate contamination in sample handling environments use high-efficiency air filtering, which typically requires organic sealants that off-gas. Protocols for reducing adventitious carbon on sample handling surfaces often generate particles. Further work is needed to achieve both minimal particulate and adventitious carbon contamination. This paper will discuss these facility topics and others in the historical context of nearly 50 years' curation experience for lunar rocks and regolith, meteorites, cosmic dust, comet particles, solar wind atoms, and asteroid particles at Johnson Space Center.
Replacement of HCFC-225 Solvent for Cleaning NASA Propulsion Oxygen Systems
NASA Technical Reports Server (NTRS)
Mitchell, Mark A.; Lowrey, Nikki M.
2015-01-01
Since the 1990s, when the Class I Ozone Depleting Substance (ODS) chlorofluorocarbon-113 (CFC-113) was banned, NASA's rocket propulsion test facilities at Marshall Space Flight Center (MSFC) and Stennis Space Center (SSC) have relied upon hydrochlorofluorocarbon-225 (HCFC-225) to safely clean and verify the cleanliness of large scale propulsion oxygen systems. Effective January 1, 2015, the production, import, export, and new use of HCFC-225, a Class II ODS, was prohibited by the Clean Air Act. From 2012 through 2014, leveraging resources from both the NASA Rocket Propulsion Test Program and the Defense Logistics Agency - Aviation Hazardous Minimization and Green Products Branch, test labs at MSFC, SSC, and Johnson Space Center's White Sands Test Facility (WSTF) collaborated to seek out, test, and qualify a replacement for HCFC-225 that is both an effective cleaner and safe for use with oxygen systems. Candidate solvents were selected and a test plan was developed following the guidelines of ASTM G127, Standard Guide for the Selection of Cleaning Agents for Oxygen Systems. Solvents were evaluated for materials compatibility, oxygen compatibility, cleaning effectiveness, and suitability for use in cleanliness verification and field cleaning operations. Two solvents were determined to be acceptable for cleaning oxygen systems and one was chosen for implementation at NASA's rocket propulsion test facilities. The test program and results are summarized. This project also demonstrated the benefits of cross-agency collaboration in a time of limited resources.
The human dimensions of urban greenways: planning for recreation and related experiences
Paul H. Gobster; Lynne M. Westpahl
2004-01-01
In this paper, we summarize findings from a series of interrelated studies that examine an urban greenway, the 150 mile Chicago River corridor in Chicago, USA, from multiple perspectives, stakeholder viewpoints, and methodological techniques. Six interdependent "human dimensions" of greenways are identified in the studies: cleanliness, naturalness, aesthetics...
Kids and Manners - A Ticket to Success. Kindergarten-6th.
ERIC Educational Resources Information Center
Cunningham, Patricia; And Others
Arranged into six parts, the booklet offers practical and motivating techniques for teaching elementary school students the basic rules of etiquette. The areas of general etiquette, cleanliness, introductions, table manners, telephoning, and thank you notes are included. Each section contains simple guidelines on how to act and react in social…
49 CFR 174.715 - Cleanliness of transport vehicles after use.
Code of Federal Regulations, 2010 CFR
2010-10-01
Detailed Requirements for Class 7 (Radioactive) Materials, § 174.715 Cleanliness of transport vehicles after use: (a) Each transport vehicle used for transporting Class 7 (radioactive) materials as...
49 CFR 174.715 - Cleanliness of transport vehicles after use.
Code of Federal Regulations, 2011 CFR
2011-10-01
Detailed Requirements for Class 7 (Radioactive) Materials, § 174.715 Cleanliness of transport vehicles after use: (a) Each transport vehicle used for transporting Class 7 (radioactive) materials as...
NASA Technical Reports Server (NTRS)
Allton, J. H.; Burkett, P. J.
2011-01-01
NASA Johnson Space Center operates clean curation facilities for Apollo lunar, Antarctic meteorite, stratospheric cosmic dust, Stardust comet and Genesis solar wind samples. Each of these collections is curated separately due to unique requirements. The purpose of this abstract is to highlight the technical tensions between providing particulate cleanliness and molecular cleanliness, illustrated using data from curation laboratories. Strict control of three components is required for curating samples cleanly: a clean environment; clean containers and tools that touch samples; and use of non-shedding materials of cleanable chemistry and smooth surface finish. This abstract focuses on environmental cleanliness and the technical tension between achieving particulate and molecular cleanliness. An environment in which a sample is manipulated or stored can be a room, an enclosed glovebox (or robotic isolation chamber), or an individual sample container.
Does Flooring Substrate Impact Kennel and Dog Cleanliness in Commercial Breeding Facilities?
Stella, Judith; Hurt, Moriah; Bauer, Amy; Croney, Candace
2018-01-01
Simple Summary It is important to understand how the flooring substrate used in dog housing impacts dog health and well-being. Aspects to consider include paw, elbow, and hock health, the cleanliness of the dog, and the ability of the floors to be cleaned easily and thoroughly. This pilot study assessed the health and cleanliness of 118 dogs housed on three different types of flooring commonly found in commercial breeding kennels. No serious paw, elbow, or hock problems were identified. Thirty-one percent or fewer kennels at each facility were found to have fecal contamination after routine cleaning and the majority of dogs were clean. These findings indicate that a well-managed kennel can maintain clean, healthy dogs on different types of flooring substrates. Abstract Evaluation of kennel flooring surfaces is needed to understand their impacts on dog health and well-being. This pilot study aimed to characterize aspects of physical health, kennel cleanliness, and dog body cleanliness on flooring types common in US breeding kennels. Subjects were 118 adult dogs housed on diamond-coated expanded metal (DCEM), polypropylene (POLY), or concrete (CON) flooring at five commercial breeding facilities in Indiana, U.S. Body condition, paw, elbow, and hock health scores were recorded. Each indoor kennel and dog was visually assessed for cleanliness. Kennels were swabbed immediately after cleaning with electrostatic dry cloths and cultured for Escherichia coli. Descriptive statistics were used for analysis. Mean body condition score (BCS), kennel and dog cleanliness scores were all near ideal (3, 1.15, and 1.04, respectively). Thirty-one percent or fewer kennels at each facility were culture-positive for E. coli after cleaning. No serious paw, elbow, or hock problems were identified. Overall, the findings indicate that with appropriate management and regular access to additional surfaces, dog foot health, cleanliness, and kennel cleanliness can be maintained on the flooring types investigated. PMID:29690514
Surface cleanliness measurement procedure
Schroder, Mark Stewart; Woodmansee, Donald Ernest; Beadie, Douglas Frank
2002-01-01
A procedure and tools for quantifying surface cleanliness are described. Cleanliness of a target surface is quantified by wiping a prescribed area of the surface with a flexible, bright white cloth swatch, preferably mounted on a special tool. The cloth picks up a substantial amount of any particulate surface contamination. The amount of contamination is determined by measuring the reflectivity loss of the cloth before and after wiping on the contaminated system and comparing that loss to a previous calibration with similar contamination. In the alternative, a visual comparison of the contaminated cloth to a contamination key provides an indication of the surface cleanliness.
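The reflectivity-loss calibration this describes can be sketched as a simple interpolation; the calibration pairs and readings below are invented.

```python
import numpy as np

# Assumed calibration: reflectivity loss of the white swatch (%) after wiping
# versus the particulate loading picked up by the cloth (mg per wipe area).
cal_loss_pct = np.array([0.0, 2.0, 5.0, 10.0, 20.0])
cal_loading_mg = np.array([0.0, 0.1, 0.3, 0.8, 2.0])

def loading_from_loss(loss_pct: float) -> float:
    """Interpolate contamination picked up by the cloth from reflectivity loss."""
    return float(np.interp(loss_pct, cal_loss_pct, cal_loading_mg))

before, after = 92.0, 85.5                 # measured swatch reflectivity, %
loss = 100.0 * (before - after) / before
print(f"loss {loss:.1f}% -> approx {loading_from_loss(loss):.2f} mg picked up")
```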
Tools for measuring surface cleanliness
Schroder, Mark Stewart; Woodmansee, Donald Ernest; Beadie, Douglas Frank
2002-01-01
A procedure and tools for quantifying surface cleanliness are described. Cleanliness of a target surface is quantified by wiping a prescribed area of the surface with a flexible, bright white cloth swatch, preferably mounted on a special tool. The cloth picks up a substantial amount of any particulate surface contamination. The amount of contamination is determined by measuring the reflectivity loss of the cloth before and after wiping on the contaminated system and comparing that loss to a previous calibration with similar contamination. In the alternative, a visual comparison of the contaminated cloth to a contamination key provides an indication of the surface cleanliness.
How Clean Are Hotel Rooms? Part II: Examining the Concept of Cleanliness Standards.
Almanza, Barbara A; Kirsch, Katie; Kline, Sheryl Fried; Sirsat, Sujata; Stroia, Olivia; Choi, Jin Kyung; Neal, Jay
2015-01-01
Hotel room cleanliness is based on observation and not on microbial assessment even though recent reports suggest that infections may be acquired while staying in hotel rooms. Exploratory research in the first part of the authors' study was conducted to determine if contamination of hotel rooms occurs and whether visual assessments are accurate indicators of hotel room cleanliness. Data suggested the presence of microbial contamination that was not reflective of visual assessments. Unfortunately, no standards exist for interpreting microbiological data and other indicators of cleanliness in hotel rooms. The purpose of the second half of the authors' study was to examine cleanliness standards in other industries to see if they might suggest standards in hotels. Results of the authors' study indicate that standards from other related industries do not provide analogous criteria, but do provide suggestions for further research.
NASA Technical Reports Server (NTRS)
Lowrey, Nikki M.
2016-01-01
It has been well documented in the literature that contamination within oxygen systems can create significant fire hazards. Cleanliness limits for nonvolatile residues, ranging from 10 to 500 milligrams per square meter, have been established for various industries and types of oxygen systems to reduce the risk of ignition of flammable organic films. Particulate cleanliness limits used for oxygen systems, however, vary considerably, notably within the aerospace industry. Maximum allowed particle size, quantity limits, and allocations for fibers or metallic particles are all variables seen in aerospace cleanliness limits. Particles are known to have the potential to ignite within oxygen systems and must be limited to prevent fires. Particulate contamination may also pose risks to the performance of oxygen systems that are unrelated to ignition hazards. An extensive literature search was performed to better understand the relative importance of particle ignition mechanisms versus other deleterious effects of particles on oxygen systems and to identify rationale for derivation of particulate cleanliness limits for specific systems. The identified risks of different types and sizes of particles and fibers were analyzed. This paper summarizes the risks identified and rationale that may be used to derive particulate cleanliness limits for specific oxygen systems.
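Aerospace particulate limits of the kind discussed here are often expressed as product cleanliness levels; a hedged sketch of the level-to-count relation used in standards such as IEST-STD-CC1246 follows (consult the standard for the authoritative form and constants).

```python
import math

def allowed_particles(level: float, x_um: float) -> float:
    """Approximate count of particles >= x_um micrometers permitted per 0.1 m^2
    for a given cleanliness level, per the commonly cited CC1246-style relation
    log10(N) = 0.926 * (log10(L)^2 - log10(x)^2)."""
    if x_um > level:
        return 0.0   # no particles larger than the level size are allowed
    exponent = 0.926 * (math.log10(level) ** 2 - math.log10(x_um) ** 2)
    return 10.0 ** exponent

# Example: under this relation a "Level 100" surface permits roughly 600
# particles of 10 um or larger per 0.1 m^2, and none above 100 um.
print(f"{allowed_particles(100, 10):.0f}")
```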
NASA Technical Reports Server (NTRS)
Lowrey, Nikki M.
2016-01-01
It has been well documented in the literature that contamination within oxygen systems can create significant fire hazards. Cleanliness limits for nonvolatile residues, ranging from 10 to 500 mg/sq m, have been established for various industries and types of oxygen systems to reduce the risk of ignition of flammable organic films. Particulate cleanliness limits used for oxygen systems vary considerably. Maximum allowed particle size, quantity limits, and allocations for fibers or metallic particles are all variables seen in aerospace cleanliness limits. Particles are known to have the potential to ignite within oxygen systems and must be limited to prevent fires. Particulate contamination may also pose risks to the performance of oxygen systems that are unrelated to ignition hazards. An extensive literature search was performed to better understand the relative importance of particle ignition mechanisms versus other deleterious effects of particles on oxygen systems and to identify rationale for derivation of particulate cleanliness limits for specific systems. The identified risks of different types and sizes of particles and fibers were analyzed. This paper summarizes the risks identified and rationale that may be used to derive particulate cleanliness limits for specific oxygen systems.
[Psychology of Hygiene: Result of a Comparative Study 1968/1976 (author's transl)].
Bergler, R
1976-01-01
The object of the investigation is: (1) the analysis of the behaviour observed in adults between 18 and 23 years of age regarding cleanliness, body hygiene and changing of underwear, and a comparison of the results with those for 1968; (2) differential "diagnosis" of the relationship between the degree of cleanliness and the various styles of upbringing received at the hands of the parents, the different assessment of the personal body image and of cleanliness-related values ("cleanliness ideology"). Main results of the investigation: (1) In 1976, too, women remain cleaner than men; the latter, however, have clearly improved their daily washing habits (lower part of the body, feet) and have taken to changing their underwear, night clothes and coloured shirts more frequently. With respect to the frequency with which women change their underwear there has been no change. (2) With respect to body hygiene and cosmetic care, women are found to be making increasing use of mouthwash, foam bath agents, face lotions and face milk, and nail polish, while men are making more use of mouthwash, bubble bath agents as well as deodorants. In contrast, women are making less use of lipsticks, eyebrow pencils and Eau de Cologne. (3) Women's behaviour with respect to the observed and evaluated parental upbringing and to their own body image is more differentiated: the 4-factor solution for women contrasts with the 3-factor solution for men. (4) Only a form of cleanliness training involving corporal punishment and associated with a tense or hostile relationship towards the mother or father cannot be correlated to a desirable observance of cleanliness. Other possible forms of cleanliness training, including supervisory forms, do not prevent the learning and adoption of desirable practices for achieving cleanliness. (5) Less clean women show a generally less pronounced body-feeling as manifested in active physical exercise in some sports disciplines and in body care. There is also a tendency towards a less motivating criticism of the figure and a lower degree of positive bodily sensibility. As far as men are concerned, the intensity of body hygiene is positively linked with the degree of superficial good grooming and simultaneous dissatisfaction with the figure. (6) Cleanliness-related concepts (cleanliness ideology), e.g. health, work, generosity, rectitude, authority, attractiveness, integrity, etc., are, in their evaluation, influenced both by sex and by the hygienic behaviour practiced.
Evaluation of a bonded particle cartridge filtration system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schneider, W.; Krug, H.P.; Dopp, V.
1996-10-01
Metal cleanliness is a major issue in today's aluminum casthouse, especially in the production of critical products such as canstock, litho sheet and foil. Bonded particle cartridge filters are widely regarded as the most effective means available for inclusion removal from critical production items. V.A.W. and Foseco have carried out a joint program of evaluation of a cartridge filter system in conjunction with ceramic foam filters and an in-line degassing unit, in various configurations. The ceramic foam filters ranged from standard, coarse pore types to new generation all-ceramic bonded, fine pore types. Metal cleanliness was assessed using LiMCA, PoDFA, and LAIS sampling techniques, as well as metallographic and scanning electron microscope examinations. This paper outlines the findings of this work, which was carried out at V.A.W.'s full-scale experimental D.C. slab casting unit at Neuss in Germany.
Cleanliness inspection tool for RSRM bond surfaces
NASA Technical Reports Server (NTRS)
Mattes, Robert A.
1995-01-01
Using optically stimulated electron emission (OSEE), Thiokol has monitored bond surfaces in process for contamination on the Redesigned Solid Rocket Motor (RSRM). This technique provides process control information to help assure bond surface quality and repeatability prior to bonding. This paper will describe OSEE theory of operation and the instrumentation implemented at Thiokol Corporation since 1987. Data from process hardware will be presented.
A clean self reduces bribery intent.
Li, Chao; Liu, Li; Zheng, Wenwen; Dang, Jianning; Liang, Yuan
2017-08-14
The present research aimed at investigating the effect of physical cleanliness on bribery intent and the moderating role of personal need for structure (PNS) on this relationship. In Study 1, we used questionnaires to establish the correlation between bodily cleanliness and bribery intent. In Study 2, we examined the effect by priming a sense of self-cleanliness. Study 3 was conducted outside a public bath to again test our finding that physical purity decreases bribery intent; we further found that individuals with high PNS showed no reduction in bribery intent even after cleaning themselves. We thus connected physical cleanliness with the corruption field and improved our understanding of its underlying moderating mechanism. © 2017 International Union of Psychological Science.
Contamination control program for the Cosmic Background Explorer
NASA Technical Reports Server (NTRS)
Barney, Richard D.
1991-01-01
Each of the three state-of-the-art instruments flown aboard NASA's Cosmic Background Explorer (COBE) was designed, fabricated, and integrated using unique contamination control procedures to ensure accurate characterization of the diffuse radiation in the universe. The most stringent surface-level cleanliness specifications ever attempted by NASA were required by the Diffuse Infrared Background Experiment (DIRBE), which is located inside a liquid-helium-cooled dewar along with the Far Infrared Absolute Spectrophotometer (FIRAS). The DIRBE instrument required complex stray radiation suppression that defined a cold primary optical baffle system surface cleanliness level of 100A. The cleanliness levels of the cryogenic FIRAS instrument and the Differential Microwave Radiometer (DMR), which were positioned symmetrically around the dewar, were less stringent, ranging from 300 to 500A. To achieve these instrument cleanliness levels, the entire flight spacecraft was maintained at level 500A throughout each phase of development. The COBE contamination control program is described along with the difficulties experienced in maintaining the cleanliness quality of personnel and flight hardware throughout instrument assembly.
Study of techniques for redundancy verification without disrupting systems, phases 1-3
NASA Technical Reports Server (NTRS)
1970-01-01
The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.
The SeaHorn Verification Framework
NASA Technical Reports Server (NTRS)
Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.
2015-01-01
In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
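SeaHorn's intermediate language is Horn clauses; a toy Horn-clause safety query, written directly against Z3's fixedpoint engine rather than through SeaHorn itself, looks like this (the loop and property are invented for illustration):

```python
from z3 import Fixedpoint, Function, Ints, IntSort, BoolSort

# Encode "x = 0; while (x < 10) x++" with the safety property that x > 11
# is unreachable, as constrained Horn clauses over an invariant relation.
x, xp = Ints("x xp")
inv = Function("inv", IntSort(), BoolSort())
err = Function("err", BoolSort())

fp = Fixedpoint()
fp.set(engine="spacer")
fp.register_relation(inv)
fp.register_relation(err)
fp.declare_var(x, xp)

fp.rule(inv(0))                                  # init: x = 0
fp.rule(inv(xp), [inv(x), x < 10, xp == x + 1])  # loop transition
fp.rule(err(), [inv(x), x > 11])                 # error condition

print(fp.query(err()))  # 'unsat' means the error is unreachable (safe)
```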
Built-in-Test Verification Techniques
1987-02-01
This report documents the results of the effort for the Rome Air Development Center Contract F30602-84-C-0021, BIT Verification Techniques. The work was... Richard Spillman of Spillman Research Associates. The principal investigators were Mike Partridge and subsequently Jeffrey Albert. The contract was... two-year effort to develop techniques for Built-In Test (BIT) verification. The objective of the contract was to develop specifications and technical...
Advanced Curation: Solving Current and Future Sample Return Problems
NASA Technical Reports Server (NTRS)
Fries, M.; Calaway, M.; Evans, C.; McCubbin, F.
2015-01-01
Advanced Curation is a wide-ranging and comprehensive research and development effort at NASA Johnson Space Center that identifies and remediates sample-related issues. For current collections, Advanced Curation investigates new cleaning, verification, and analytical techniques to assess their suitability for improving curation processes. Specific needs are also assessed for future sample return missions. For each need, a written plan is drawn up to achieve the requirement. The plan draws upon current curation practices, input from curators, the analytical expertise of the Astromaterials Research and Exploration Science (ARES) team, and suitable standards maintained by ISO, IEST, NIST and other institutions. Additionally, new technologies are adopted on the bases of need and availability. Implementation plans are tested using customized trial programs with statistically robust courses of measurement, and are iterated if necessary until an implementable protocol is established. Upcoming and potential NASA missions such as OSIRIS-REx, the Asteroid Retrieval Mission (ARM), sample return missions in the New Frontiers program, and Mars sample return (MSR) all feature new difficulties and specialized sample handling requirements. The Mars 2020 mission in particular poses a suite of challenges, since the mission will cache martian samples for possible return to Earth. In anticipation of future MSR, the following problems are among those under investigation: What is the most efficient means to achieve the less than 1.0 ng/sq cm total organic carbon (TOC) cleanliness required for all sample handling hardware? How do we maintain and verify cleanliness at this level? The Mars 2020 Organic Contamination Panel (OCP) predicts that organic carbon, if present, will be present at the "one to tens" of ppb level in martian near-surface samples. The same samples will likely contain wt% perchlorate salts, or approximately 1,000,000x as much perchlorate oxidizer as organic carbon. The chemical kinetics of this reaction are poorly understood at present under the conditions of cached or curated martian samples. Among other parameters, what is the maximum temperature allowed during storage in order to preserve native martian organic compounds for analysis? What is the best means to collect headspace gases from cached martian (and other) samples? This gas will contain not only martian atmosphere but also off-gassed volatiles from the cached solids.
NASA Technical Reports Server (NTRS)
1976-01-01
The recommendations, procedures, and techniques provided by the Kodak Apparatus Division to the Ames Research Center in support of the Earth Resources Aircraft Program at that facility are summarized. Recommendations, procedures, and calibration data are included for sensitometry, densitometry, laboratory cleanliness, and determination of camera exposure. Additional comments are made regarding process control procedures and general laboratory operations.
Holopainen, R; Tuomainen, M; Asikainen, V; Pasanen, P; Säteri, J; Seppänen, O
2002-09-01
The aim of this study was to evaluate the amount of dust in supply air ducts in recently installed ventilation systems. The samples for the determination of dust accumulation were collected from supply air ducts in 18 new buildings that had been constructed according to two different cleanliness control levels, classified as category P1 (low oil residues and protected against contamination) and category P2, as defined in the Classification of Indoor Climate, Construction and Building Materials. In the ducts installed according to the requirements of cleanliness category P1 the mean amount of accumulated dust was 0.9 g/m2 (0.4-2.9 g/m2), and in the ducts installed according to cleanliness category P2 it was 2.3 g/m2 (1.2-4.9 g/m2). A significant difference was found in the mean amounts of dust between ducts of categories P1 and P2 (P < 0.008). The cleanliness control procedure in category P1 proved to be a useful and effective tool for preventing dust accumulation in new air ducts during the construction process. Additionally, the ducts without residual oil had lower amounts of accumulated dust, indicating that the demand for oil-free components in the cleanliness classification is reasonable.
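The reported P1-versus-P2 comparison can be illustrated with a two-sample test; the per-building values below are invented to match the reported means, and the original paper's exact test may differ.

```python
from scipy import stats

# Synthetic dust accumulation per building (g/m^2), means near 0.9 and 2.3.
p1 = [0.4, 0.6, 0.8, 0.9, 1.1, 1.3, 0.5, 1.6]
p2 = [1.2, 1.8, 2.1, 2.4, 2.9, 3.3, 2.0, 2.7]

t, p = stats.ttest_ind(p1, p2, equal_var=False)   # Welch's t-test
print(f"t = {t:.2f}, p = {p:.4f}")
```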
Kanter, Valerie; Weldon, Emily; Nair, Uma; Varella, Claudio; Kanter, Keith; Anusavice, Kenneth; Pileggi, Roberta
2011-12-01
The purpose of this study was to compare two irrigation techniques by evaluating canal cleanliness and obturation of lateral/accessory canals. Seventy-five extracted canines were instrumented to a size #40/0.06 taper. The EndoActivator (EA) was compared with an ultrasonic unit for final irrigation. Each unit was used for 1 minute with 6.15% NaOCl and 17% EDTA. A control group received syringe irrigation. Thirty teeth were sectioned and evaluated for debris removal and open dentinal tubules at 3 and 5 mm from the apical foramen with a scanning electron microscope. Forty-five teeth were examined for obturation of lateral canals. The EA was significantly better at removing debris at all levels when compared with the other treatment groups (P < .05) and resulted in obturation of significantly more lateral canals (P < .01). The EA provided better obturation of lateral and accessory canals and resulted in less remaining debris. Copyright © 2011 Mosby, Inc. All rights reserved.
Effect of Ladle Usage on Cleanliness of Bearing Steel
NASA Astrophysics Data System (ADS)
Chi, Yunguang; Deng, Zhiyin; Zhu, Miaoyong
2018-02-01
To investigate the effects of ladle usage on the inclusions and total oxygen content of bearing steel, MgO refractory rods with different glazes were used to simulate different ladle usages. The results show that the effects of different ladle usages on the cleanliness of the steel differ from each other. The total oxygen content of the steel increases with decreasing glaze basicity. Ladle glaze of lower basicity has a more negative impact on the cleanliness of steel in subsequent production. Inclusions can be generated by the flush-off of ladle glaze, and the initial glaze is important in the evolution of inclusions in the subsequent heats. To avoid the negative effects of ladle usage and to improve steel cleanliness as much as possible, specialized ladles were suggested for producing high-quality steel grades.
Regression Verification Using Impact Summaries
NASA Technical Reports Server (NTRS)
Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana
2013-01-01
Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve the scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete: they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition the program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound.
In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries. The dependence analyses that facilitate the generation of the impact summaries could, we believe, be used in conjunction with other abstraction- and decomposition-based approaches [10, 12] as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: - A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure. - A proof that our approach is sound and complete with respect to the depth bound of symbolic execution. - An implementation of our technique using the LLVM compiler infrastructure, the KLEE symbolic virtual machine [4], and a variety of Satisfiability Modulo Theories (SMT) solvers, e.g., STP [7] and Z3 [6]. - An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
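To make the impact-summary idea concrete, here is a minimal sketch assuming the z3-py bindings. The toy path summaries, variable names, and explicit impact flags are illustrative, not the paper's implementation; in particular, the flags stand in for the static control-/data-dependence analysis. Each version's impacted paths are encoded as a disjunctive summary and an off-the-shelf solver is asked for a counterexample to equivalence.

```python
from z3 import Ints, And, Or, Solver, sat

x, y = Ints("x y")
r1, r2 = Ints("r1 r2")

# Path summaries from symbolic execution of two versions:
# (path condition, symbolic return value, impacted-by-change?).
v1_paths = [(y > 0, x + y, True), (y <= 0, x, False)]
v2_paths = [(y > 0, y + x, True), (y <= 0, x, False)]  # refactored branch

def impact_summary(paths, result):
    # Disjunction over impacted paths only; unimpacted paths are skipped.
    return Or([And(cond, result == val) for cond, val, impacted in paths if impacted])

s = Solver()
# Search for a counterexample: same inputs, impacted behaviors disagree.
s.add(impact_summary(v1_paths, r1), impact_summary(v2_paths, r2), r1 != r2)
print("not equivalent" if s.check() == sat else "equivalent on impacted behaviors")
```

Because the y <= 0 paths are identical in both versions, they are excluded from the query, which is the source of the reduction the abstract reports.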
[Personal hygiene and cleanliness in an international comparison].
Bergler, R
1989-04-01
The investigation was intended to analyse attitudes towards hygiene and cleanliness in the Federal Republic of Germany, France, and Spain. On the basis of a theoretical explanatory model and empirically gained qualitative raw data, a standardized questionnaire was prepared; in a number of questions, comparability with a study carried out in the Federal Republic of Germany in 1968 was ensured. In all countries the population was subjected to representative random tests (Federal Republic of Germany n = 1016; France n = 517; Spain n = 514). The paper examines the hypothesis that the quality and intensity of cleanliness increase with (1) the extent of personal sensitivity to hygiene in the private, professional and public sphere, (2) increasing physical sensibility, (3) the increase in knowledge of hygiene and health, (4) the increase in personal behavioral standards for measures concerned with prevention and hygiene of the body, household, underwear and environment, (5) the increasing weight given to hygiene and toilet during the process of development, and (6) the extent of regular control of education in cleanliness based on established rules of behavior. The spheres of behavior investigated and mentioned below confirmed the validity of the hypothesis for (1) household hygiene (spring-cleaning, window cleaning, cleaning of the home: dusting, vacuum-cleaning, cleaning of the floor), (2) hygiene of the body (frequency of taking a shower, bathing, toothbrushing, intimate hygiene), and (3) hygiene of the laundry (frequency of changing underclothes such as panties/underpants, brassieres, nightgowns/pyjamas, stockings/socks, linen, pillows, dish and kitchen towels). The following general findings were established: (1) In the Federal Republic of Germany the attitude towards hygiene and cleanliness has improved over the last 20 years. (2) The level of hygiene and cleanliness in France and Spain is significantly higher than in the FRG. (3) Regular and strict parental control of the child's attitude towards cleanliness, as well as the continuity and systematic pursuance of this hygiene-orientated education, is far more widespread in France and Spain than in the Federal Republic of Germany. (4) Within the population of a country, differences are attributable to the quality of parental educational efforts, to the importance of sex-specific cleanliness and toilet standards, to knowledge of or prejudices against the interrelationship between hygiene, cleanliness and health, to the degree of private, public and professional hygiene sensibility, and to physical sensibility. (ABSTRACT TRUNCATED AT 400 WORDS)
Cleanliness Policy Implementation: Evaluating Retribution Model to Rise Public Satisfaction
NASA Astrophysics Data System (ADS)
Dailiati, Surya; Hernimawati; Prihati; Chintia Utami, Bunga
2018-05-01
This research addresses the evaluation of a cleanliness retribution policy that has not optimally improved the Local Revenue (PAD) of Pekanbaru City, nor the cleanliness of the city. This was estimated to be caused by the performance of the Garden and Sanitation Department not being in accordance with the requirements of the society of Pekanbaru City. The research method used in this study is a mixed method with a sequential exploratory strategy. The data collection methods used are observation, interviews and documentation for the qualitative research, as well as questionnaires for the quantitative research. The collected data were analyzed with the interactive model of Miles and Huberman for the qualitative research and multiple regression analysis for the quantitative research. The results indicated that the model of cleanliness policy implementation that can increase the PAD of Pekanbaru City and improve people's satisfaction divides into two models: an evaluation model and a society satisfaction model. The evaluation model is influenced by the criteria/variables of effectiveness, efficiency, adequacy, equity, responsiveness, and appropriateness, while the society satisfaction model is influenced by the variables of society satisfaction, intentions, goals, plans, programs, and the appropriateness of the cleanliness retribution collection policy.
Indium adhesion provides quantitative measure of surface cleanliness
NASA Technical Reports Server (NTRS)
Krieger, G. L.; Wilson, G. J.
1968-01-01
Indium tipped probe measures hydrophobic and hydrophilic contaminants on rough and smooth surfaces. The force needed to pull the indium tip, which adheres to a clean surface, away from the surface provides a quantitative measure of cleanliness.
Bourne, L. B.; Milner, F. J. M.
1963-01-01
Polyester resins are being increasingly used in industry. These resins require the addition of catalysts and accelerators. The handling of polyester resin system materials may give rise to skin irritations, allergic reactions, and burns. The burns are probably due to styrene and organic peroxides. Atmospheric pollution from styrene, and explosion and fire risks from organic peroxides, must be prevented. Where dimethylaniline is used, scrupulous cleanliness and a no-touch technique must be enforced. Handling precautions are suggested.
ERIC Educational Resources Information Center
Rosenfiled-Schlichter, M. D.; And Others
1983-01-01
An ecobehavioral intervention was developed in which agencies and services cooperated, circumventing the mother, to improve the personal cleanliness of two severely neglected children. Intervention included contingent allowance, visits, and laundry assistance. (CL)
Selecting a software development methodology. [of digital flight control systems]
NASA Technical Reports Server (NTRS)
Jones, R. E.
1981-01-01
The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques; specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques, and the reasons why quantitative assessments are not possible are documented.
Zhao, Xin-Ru; Nasier, Telajin; Cheng, Yong-Yi; Zhan, Jiang-Yu; Yang, Jian-Hong
2014-06-01
Environmental geochemical baseline models of Cu, Zn, Pb, As, and Hg were established by a standardized method in the chernozem, chestnut soil, sierozem and saline soil of the Ili river valley region, and the theoretical baseline values were calculated. A baseline factor pollution index evaluation method, an environmental background value evaluation method and a heavy metal cleanliness evaluation method were used to compare soil pollution degrees. The baseline factor pollution index evaluation showed that As pollution was the most prominent among the four typical types of soils within the river basin, with 7.14%, 9.76%, and 7.50% of sampling points in chernozem, chestnut soil and sierozem, respectively, reaching heavy pollution. In the chestnut soil, 7.32% of sampling points exceeded the permitted Pb pollution index. The variation extent of As and Pb was the largest, indicating large human disturbance. The environmental background value evaluation showed that As was the main pollution element, followed by Cu, Zn and Pb. The heavy metal cleanliness evaluation showed that Cu, Zn and Pb were better than cleanliness level 2 and Hg was of cleanliness level 1 in all four types of soils. As showed moderate pollution in sierozem, and was of cleanliness level 2 or better in chernozem, chestnut soil and saline-alkali soil. Comparing the three evaluation systems, the baseline factor pollution index evaluation more comprehensively reflected the geochemical migration characteristics of the elements and the soil formation processes, and its pollution assessment could be specific to the sampling points. The environmental background value evaluation neglected the natural migration of heavy metals and the deposition process in the soil, since it was established on regional background values. The main purpose of the heavy metal cleanliness evaluation was to evaluate the safety degree of the soil environment.
Investigation of high-strength bolt-tightening verification techniques.
DOT National Transportation Integrated Search
2016-03-01
The current means and methods of verifying that high-strength bolts have been properly tightened are very laborious and time-consuming. In some cases, the techniques require special equipment and, in other cases, the verification itself may be some...
Clean Up Your School Custodial Program.
ERIC Educational Resources Information Center
Steller, Arthur W.; Pell, Carroll
1986-01-01
Administrators can improve their school's custodial program by following steps that increase productivity, reduce costs, and provide long-term benefits of higher cleanliness standards. Administrators should work toward improved building cleanliness by insisting on a school board policy that establishes objectives for the custodial department.…
Mosler, Hans-Joachim; Sonego, Ina Lucia
2017-10-01
Latrine cleanliness increased in the intervention group compared to the control group (an increase from 21 to 31% of latrines classified as clean in the intervention group [N = 198] and a decrease from 37 to 27% in the control group [N = 91]). Improved habitual latrine cleaning led to latrines being 3.5 times more likely to improve in observed latrine cleanliness (χ² = 16.36, p < .001), as did improvements in the quality of latrine construction; e.g., households that had installed a lid were 7.39 times more likely to have a cleaner latrine (χ² = 4.46, p < .05). Changes in psychosocial factors, namely forgetting, personal norm, and satisfaction with cleanliness, explained much of the change in habitual latrine cleaning (adj. r² = .46). Behaviour change interventions targeting psychosocial factors and the quality of latrine construction seem promising for ensuring clean and hygienic latrines.
NASA Technical Reports Server (NTRS)
Skinner, S. Ballou
1991-01-01
Chlorofluorocarbons (CFCs) in the atmosphere are believed to present a major environmental problem because they are able to interact with and deplete the ozone layer. NASA has been mandated to replace chlorinated solvents in precision cleaning, cleanliness verification, and degreasing of aerospace fluid systems hardware and ground support equipment. KSC has a CFC phase-out plan which provides for the elimination of over 90 percent of CFC and halon use by 1995. The Materials Science Laboratory at KSC is evaluating four analytical methods for determining nonvolatile residue removal by water: (1) infrared analysis using attenuated total reflectance; (2) surface tension analysis; (3) total organic content analysis; and (4) turbidity analysis. This research project examined the ultrasonic-turbidity responses for 22 hydrocarbons in an effort to determine: (1) whether ultrasonics in heated water (70 C) will clean hydrocarbons (oils, greases, gels, and fluids) from aerospace hardware; (2) whether the cleaning process by ultrasonics will simultaneously emulsify the removed hydrocarbons in the water; and (3) whether a turbidimeter can be used successfully as an analytical instrument for quantifying the removal of hydrocarbons. Sixteen of the 22 hydrocarbons tested showed that ultrasonics would remove at least 90 percent of the contaminating hydrocarbon from the hardware in 10 minutes or less, giving a good ultrasonic-turbidity response. Six hydrocarbons had a lower percentage removal, a slower removal rate, and a marginal ultrasonic-turbidity response.
Reachability analysis of real-time systems using time Petri nets.
Wang, J; Deng, Y; Xu, G
2000-01-01
Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
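As a point of reference for the reachability tree the abstract builds on, here is a minimal untimed sketch: breadth-first enumeration of the reachable markings of an ordinary Petri net. The three-state worker net is a hypothetical example; the CS-class technique additionally attaches clock-stamp timing constraints to each node, which this sketch deliberately omits.

```python
from collections import deque

# Each transition: (tokens consumed per place, tokens produced per place).
transitions = {
    "start":  ({"idle": 1},    {"busy": 1}),
    "pause":  ({"busy": 1},    {"waiting": 1}),
    "resume": ({"waiting": 1}, {"busy": 1}),
    "done":   ({"busy": 1},    {"idle": 1}),
}

def enabled(marking, consume):
    return all(marking.get(p, 0) >= n for p, n in consume.items())

def fire(marking, consume, produce):
    m = dict(marking)
    for p, n in consume.items():
        m[p] -= n
    for p, n in produce.items():
        m[p] = m.get(p, 0) + n
    return {p: n for p, n in m.items() if n > 0}  # drop empty places

def reachable(initial):
    seen = {frozenset(initial.items())}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for consume, produce in transitions.values():
            if enabled(m, consume):
                m2 = fire(m, consume, produce)
                key = frozenset(m2.items())
                if key not in seen:
                    seen.add(key)
                    queue.append(m2)
    return seen

print(len(reachable({"idle": 1})))  # 3 markings: idle, busy, waiting
```

In the TPN setting, each node of this tree would carry an inequality system over transition clocks; the CS-class contribution is stamping those clocks so that end-to-end delays can be read directly off a path.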
Sensor for the working surface cleanliness definition in vacuum
NASA Astrophysics Data System (ADS)
Deulin, E. A.; Mashurov, S. S.; Gatsenko, A. A.
2016-07-01
Modern development of nanotechnology, one of today's priority directions in science, is impossible to imagine without the use of vacuum systems and technologies. The better the vacuum (the lower the pressure), the cleaner the surface obtained, which is very important for nanotechnology. The main goal of this work is the determination of the cleanliness of a surface, i.e., the number of molecular layers of adsorbed gases on the working surface of a product, particularly in industries where the cleanliness of the working surface is a key parameter of the technological process and has a significant influence on the output parameters of the final product.
Investigation of high-strength bolt-tightening verification techniques : tech transfer summary.
DOT National Transportation Integrated Search
2016-03-01
The primary objective of this project was to explore the current state-of-practice and the state-of-the-art techniques for high-strength bolt tightening and verification in structural steel connections. This project was completed so that insight coul...
Verification of component mode techniques for flexible multibody systems
NASA Technical Reports Server (NTRS)
Wiens, Gloria J.
1990-01-01
Investigations were conducted into the modeling aspects of flexible multibodies undergoing large angular displacements. Models were to be generated and analyzed through application of computer simulation packages employing 'component mode synthesis' techniques. The Multibody Modeling, Verification and Control (MMVC) Laboratory plan was implemented, which includes running experimental tests on flexible multibody test articles. From these tests, data were to be collected for later correlation and verification of the theoretical results predicted by the modeling and simulation process.
Quantified Event Automata: Towards Expressive and Efficient Runtime Monitors
NASA Technical Reports Server (NTRS)
Barringer, Howard; Falcone, Ylies; Havelund, Klaus; Reger, Giles; Rydeheard, David
2012-01-01
Runtime verification is the process of checking a property on a trace of events produced by the execution of a computational system. Runtime verification techniques have recently focused on parametric specifications, where events take data values as parameters. These techniques exist on a spectrum inhabited by both efficient and expressive techniques. These characteristics are usually shown to be conflicting: in state-of-the-art solutions, efficiency is obtained at the cost of expressiveness and vice versa. To seek a solution to this conflict, we explore a new point on the spectrum by defining an alternative runtime verification approach. We introduce a new formalism for concisely capturing expressive specifications with parameters. Our technique is more expressive than the currently most efficient techniques while at the same time allowing for optimizations.
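A common baseline for such parametric monitoring is trace slicing: project the trace per parameter value and run one small automaton per slice. The sketch below is a generic illustration of that idea, with a hypothetical file-open/close property and event names; it is not the QEA formalism itself, which adds quantifiers over event parameters.

```python
from collections import defaultdict

# One tiny automaton per parameter value (here, per file name).
state = defaultdict(lambda: "init")

def monitor(event, f):
    if event == "open":
        if state[f] == "open":
            return f"violation: {f} opened twice"
        state[f] = "open"
    elif event == "close":
        if state[f] != "open":
            return f"violation: {f} closed while not open"
        state[f] = "init"
    return None

def end_of_trace():
    # Any file still open at trace end violates the property.
    return [f"violation: {f} never closed" for f, s in state.items() if s == "open"]

trace = [("open", "a.txt"), ("open", "b.txt"), ("close", "a.txt")]
for ev, f in trace:
    err = monitor(ev, f)
    if err:
        print(err)
print(end_of_trace())  # ['violation: b.txt never closed']
```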
Treatment of Childhood Encopresis: Full Cleanliness Training
ERIC Educational Resources Information Center
Arnold, Susan; Doleys, Daniel M.
1975-01-01
Full Cleanliness Training (a procedure in which the trainee is required to correct the results of inappropriate toileting behavior by cleaning himself and his clothing) was used in combination with positive reinforcement to deal with a trainable retarded 8 year old boy with encopresis and a toilet phobia. (Author/CL)
49 CFR 174.715 - Cleanliness of transport vehicles after use.
Code of Federal Regulations, 2012 CFR
2012-10-01
49 CFR 174.715 (2012-10-01): Cleanliness of transport vehicles after use. Title 49, Transportation; Other Regulations Relating to Transportation; Pipeline and Hazardous Materials Safety Administration, Department of Transportation; Hazardous Materials Regulations; Carriage by Rail; Detailed Requirements for Class 7 (Radioactive) Materials.
Applications of Laser-Induced Breakdown Spectroscopy (LIBS) in Molten Metal Processing
NASA Astrophysics Data System (ADS)
Hudson, Shaymus W.; Craparo, Joseph; De Saro, Robert; Apelian, Diran
2017-10-01
In order for metals to meet the demand for critical applications in the automotive, aerospace, and defense industries, tight control over the composition and cleanliness of the metal must be achieved. The use of laser-induced breakdown spectroscopy (LIBS) for applications in metal processing has generated significant interest for its ability to perform quick analyses in situ. The fundamentals of LIBS, current techniques for deployment on molten metal, demonstrated capabilities, and possible avenues for development are reviewed and discussed.
NASA Technical Reports Server (NTRS)
Roman, Juan A.; Stitt, George F.; Roman, Felix R.
1997-01-01
This paper provides a general overview of the molecular contamination philosophy of the Space Simulation Test Engineering Section and of how the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) space simulation laboratory controls and maintains the cleanliness of all its facilities, thereby minimizing downtime between tests. It also briefly covers the proper selection of, and safety precautions needed when using, some chemical solvents for wiping, washing, or spraying thermal shrouds when molecular contaminants increase to unacceptable background levels.
Deductive Verification of Cryptographic Software
NASA Technical Reports Server (NTRS)
Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara
2009-01-01
We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the OpenSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
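Self-composition reduces a two-run property to a property of one program composed with a renamed copy of itself. The sketch below, assuming the z3-py bindings, checks the classic instance of this pattern: that an output is independent of a secret input. The toy XOR computation and variable names are illustrative stand-ins, not the paper's RC4 proof.

```python
from z3 import BitVecs, Solver, sat

pub1, sec1, pub2, sec2 = BitVecs("pub1 sec1 pub2 sec2", 8)

def program(pub, sec):
    # Toy computation: the XOR with sec cancels, so output depends only on pub.
    return (pub ^ sec) ^ sec

s = Solver()
s.add(pub1 == pub2)                                 # same public inputs
s.add(program(pub1, sec1) != program(pub2, sec2))   # can the outputs differ?
print("leak found" if s.check() == sat else "no secret-dependent output")
```

An unsatisfiable query means no pair of runs with equal public inputs can disagree, which is exactly the two-run property self-composition encodes.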
Security Verification Techniques Applied to PatchLink COTS Software
NASA Technical Reports Server (NTRS)
Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer
2006-01-01
Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF), a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.
Verification of Java Programs using Symbolic Execution and Invariant Generation
NASA Technical Reports Server (NTRS)
Pasareanu, Corina; Visser, Willem
2004-01-01
Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g., boolean and numeric constraints, dynamically allocated structures and arrays) and allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and was used for the verification of several non-trivial Java programs.
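The core obligations behind such loop-invariant verification are initiation, consecution, and the postcondition check. A minimal sketch of those three queries, assuming the z3-py bindings; the summation loop and candidate invariant are hypothetical examples, and the paper's strengthening loop that discovers the invariant is elided:

```python
from z3 import Ints, And, Implies, Not, Solver, unsat

i, s_, n = Ints("i s n")
ip, sp = Ints("ip sp")  # primed (post-step) variables

inv = And(0 <= i, i <= n, 2 * s_ == i * (i + 1))   # candidate invariant
init = And(i == 0, s_ == 0, n >= 0)                 # loop entry state
step = And(i < n, ip == i + 1, sp == s_ + i + 1)    # one loop iteration
inv_p = And(0 <= ip, ip <= n, 2 * sp == ip * (ip + 1))
post = Implies(And(inv, i >= n), 2 * s_ == n * (n + 1))

def valid(formula):
    # A formula is valid iff its negation is unsatisfiable.
    s = Solver()
    s.add(Not(formula))
    return s.check() == unsat

print(valid(Implies(init, inv)))              # initiation
print(valid(Implies(And(inv, step), inv_p)))  # consecution
print(valid(post))                            # postcondition
```

If consecution fails, an invariant-strengthening loop would add conjuncts (guided by counterexamples) and retry, which is the iterative process the abstract describes.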
An index to quantify street cleanliness: the case of Granada (Spain).
Sevilla, Aitana; Rodríguez, Miguel Luis; García-Maraver, Angela; Zamorano, Montserrat
2013-05-01
Urban surfaces receive waste deposits from natural and human sources, which create a negative visual impact and are identified as potentially significant contributors to water and air pollution. Local councils are usually responsible for sweeping roads and footpaths to keep the environment clean and free of litter. Quality controls are useful in order to check whether the services are being executed according to the quantity, quality and performance standards that are provided. In this sense, several factors might affect the efficiency of the management of cleaning and waste collection services; however, only a few contributions are available in the literature on the various aspects associated with the level of street cleanliness. In this paper, the suitability of a Cleanliness Index has been checked for the case of Granada (south of Spain), in order to contribute to the proper management of public expenditure, improving the quality and cost of an essential service for any municipality. Results concluded that the city exhibits a good level of cleanliness, although the standard of cleaning varied from one area of the city to another. The Cleanliness Index fits the general situation of the different districts of Granada well and thus could be considered a useful tool for measuring the level of cleanliness of the streets of the city and for evaluating the organization of the cleaning service, such that an outsourced company would not be responsible for controlling all the cleaning services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makhlouf M. Makhlouf; Diran Apelian
The objective of this project is to develop a technology for clean metal processing that is capable of consistently providing a metal cleanliness level that is fit for a given application. The program has five tasks: development of melt cleanliness assessment technology, development of melt contamination avoidance technology, development of high temperature phase separation technology, establishment of a correlation between the level of melt cleanliness and as-cast mechanical properties, and transfer of technology to the industrial sector. Within the context of the first task, WPI has developed a standardized Reduced Pressure Test that has been endorsed by AFS as a recommended practice. In addition, within the context of task 1, WPI has developed a melt cleanliness sensor based on the principles of electromagnetic separation; an industrial partner is commercializing the sensor. Within the context of the second task, WPI has developed environmentally friendly fluxes that do not contain fluorine. Within the context of the third task, WPI modeled the process of rotary degassing and verified the model predictions with experimental data; this model may be used to optimize the performance of industrial rotary degassers. Within the context of the fourth task, WPI has correlated the level of melt cleanliness at various foundries, including a sand casting foundry, a permanent mold casting foundry, and a die casting foundry, to the casting process and the resultant mechanical properties. This is useful in tailoring the melt cleansing operations at foundries to the particular casting process and the desired properties of cast components.
Development of a radiation-hard CMOS process
NASA Technical Reports Server (NTRS)
Power, W. L.
1983-01-01
It is recommended that various techniques be investigated which appear to have the potential for improving the radiation hardness of CMOS devices for prolonged space flight missions. The three key recommended processing techniques are: (1) making the gate oxide thin, since it has been shown that radiation degradation is proportional to the cube of oxide thickness, so that a relatively small reduction in thickness can greatly improve radiation resistance; (2) cleanliness and contamination control; and (3) investigating different oxide growth conditions (low-temperature dry, TCE, and HCl). All three produce high-quality clean oxides, which are more radiation tolerant. Technique 2 addresses the reduction of metallic contamination. Technique 3 will produce a higher-quality oxide by using slow growth-rate conditions, and will minimize the effects of any residual sodium contamination through the introduction of hydrogen and chlorine into the oxide during growth.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, H; Liang, X; Kalbasi, A
2014-06-01
Purpose: Advanced radiotherapy (RT) techniques such as proton pencil beam scanning (PBS) and photon-based volumetric modulated arc therapy (VMAT) have dosimetric advantages in the treatment of head and neck malignancies. However, anatomic or alignment changes during treatment may limit the robustness of PBS and VMAT plans. We assess the feasibility of automated deformable registration tools for robustness evaluation in adaptive PBS and VMAT RT of oropharyngeal cancer (OPC). Methods: We treated 10 patients with bilateral OPC with advanced RT techniques and obtained verification CT scans with physician-reviewed target and OAR contours. We generated 3 advanced RT plans for each patient: a proton PBS plan using 2 posterior oblique fields (2F), a proton PBS plan using an additional third low-anterior field (3F), and a photon VMAT plan using 2 arcs (Arc). For each of the planning techniques, we forward-calculated initial (Ini) plans on the verification scans to create verification (V) plans. We extracted DVH indicators based on physician-generated contours for 2 target and 14 OAR structures to investigate the feasibility of two automated tools (contour propagation (CP) and dose deformation (DD)) as surrogates for routine clinical plan robustness evaluation. For each verification scan, we compared DVH indicators of V, CP and DD plans in a head-to-head fashion using Student's t-test. Results: We performed 39 verification scans; each patient underwent 3 to 6 verification scans. We found no differences in doses to target or OAR structures between V and CP, V and DD, and CP and DD plans across all patients (p > 0.05). Conclusions: Automated robustness evaluation tools, CP and DD, accurately predicted dose distributions of verification (V) plans using physician-generated contours. These tools may be further developed as a potential robustness screening tool in the workflow for adaptive treatment of OPC using advanced RT techniques, reducing the need for physician-generated contours.
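The head-to-head comparison described in the Methods amounts to paired t-tests on each DVH indicator across the verification scans. A minimal sketch with synthetic numbers, assuming NumPy/SciPy; the indicator, its values, and the noise levels are stand-ins, not the study's data:

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(7)
# e.g., one DVH indicator (a mean OAR dose in Gy) per verification scan.
v = rng.normal(26.0, 2.0, size=39)
cp = v + rng.normal(0.0, 0.3, size=39)   # contour propagation tracks V closely
dd = v + rng.normal(0.0, 0.3, size=39)   # dose deformation tracks V closely

for name, a, b in [("V vs CP", v, cp), ("V vs DD", v, dd), ("CP vs DD", cp, dd)]:
    t, p = ttest_rel(a, b)  # paired test: same scans under two plan types
    print(f"{name}: t = {t:.2f}, p = {p:.3f}")
```

A p-value above 0.05 for every indicator is what the abstract reports as "no differences" between the automated surrogates and the physician-contoured verification plans.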
Assessment of the adequacy of the monitoring method in the activity of a verification laboratory
NASA Astrophysics Data System (ADS)
Ivanov, R. N.; Grinevich, V. A.; Popov, A. A.; Shalay, V. V.; Malaja, L. D.
2018-04-01
The paper considers the assessment of the adequacy of a risk-monitoring technique for verification laboratory operations with respect to conformity to the accreditation criteria, aimed at decision-making on the advisability of a verification laboratory's activities in its declared area of accreditation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Latty, Drew, E-mail: drew.latty@health.nsw.gov.au; Stuart, Kirsty E; Westmead Breast Cancer Institute, Sydney, New South Wales
Radiation treatment to the left breast is associated with increased cardiac morbidity and mortality. The deep inspiration breath-hold (DIBH) technique can decrease the radiation dose delivered to the heart, and this may facilitate the treatment of the internal mammary chain nodes. The aim of this review is to critically analyse the literature available in relation to breath-hold methods, implementation, utilisation, patient compliance, planning methods and treatment verification of the DIBH technique. Despite variation in the literature regarding the DIBH delivery method, patient coaching, visual feedback mechanisms and treatment verification, all methods of DIBH delivery reduce the radiation dose to the heart. Further research is required to determine optimum protocols for patient training and treatment verification to ensure the technique is delivered successfully.
Systematic study of source mask optimization and verification flows
NASA Astrophysics Data System (ADS)
Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi
2012-06-01
Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique, and a systematic study of the different flows and their possible unification is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonality and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF and MEEF is examined.
Glove-based approach to online signature verification.
Kamel, Nidal S; Sayeed, Shohel; Ellis, Grant A
2008-06-01
Utilizing the multiple degrees of freedom offered by the data glove for each finger and the hand, a novel on-line signature verification system using the Singular Value Decomposition (SVD) numerical tool for signature classification and verification is presented. The proposed technique uses the SVD to find the r singular vectors sensing the maximal energy of the glove data matrix A, called the principal subspace, so that the effective dimensionality of A can be reduced. Having modeled the data glove signature through its r-principal subspace, signature authentication is performed by finding the angles between the different subspaces. A demonstration of the data glove as an effective high-bandwidth data entry device for signature verification is presented. This SVD-based signature verification technique is tested and its performance is shown to be able to recognize forged signatures with a false acceptance rate of less than 1.2%.
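A minimal sketch of that pipeline, assuming NumPy/SciPy; the matrix sizes, rank r, and synthetic data are illustrative stand-ins for real glove recordings. Each signature's data matrix is reduced to its r-dimensional principal subspace via the SVD, and signatures are compared by the principal angles between subspaces.

```python
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(0)

def principal_subspace(A, r=3):
    # Columns of U[:, :r] span the r-dimensional subspace capturing
    # the maximal energy of the glove-data matrix A.
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    return U[:, :r]

reference = rng.standard_normal((20, 60))        # 20 channels x 60 samples
genuine = reference + 0.05 * rng.standard_normal((20, 60))
forgery = rng.standard_normal((20, 60))

ref_sub = principal_subspace(reference)
for name, attempt in [("genuine", genuine), ("forgery", forgery)]:
    angles = subspace_angles(ref_sub, principal_subspace(attempt))
    print(name, "max principal angle (rad):", round(float(angles.max()), 3))
```

Small principal angles indicate nearly coincident subspaces (an accept); a verification threshold on the angles would be tuned to hit the target false acceptance rate.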
The Learner Verification of Series r: The New Macmillan Reading Program; Highlights.
ERIC Educational Resources Information Center
National Evaluation Systems, Inc., Amherst, MA.
National Evaluation Systems, Inc., has developed curriculum evaluation techniques, in terms of learner verification, which may be used to help the curriculum-development efforts of publishing companies, state education departments, and universities. This document includes a summary of the learner-verification approach, with data collected about a…
Using Small-Step Refinement for Algorithm Verification in Computer Science Education
ERIC Educational Resources Information Center
Simic, Danijela
2015-01-01
Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyse similar…
Evaluation of Mesoscale Model Phenomenological Verification Techniques
NASA Technical Reports Server (NTRS)
Lambert, Winifred
2006-01-01
Forecasters at the Spaceflight Meteorology Group, 45th Weather Squadron, and National Weather Service in Melbourne, FL, use mesoscale numerical weather prediction model output in creating their operational forecasts. These models aid in forecasting weather phenomena that could compromise the safety of launch, landing, and daily ground operations, and must produce reasonable weather forecasts in order for their output to be useful in operations. Considering the importance of model forecasts to operations, their accuracy in forecasting critical weather phenomena must be verified to determine their usefulness. The currently used traditional verification techniques involve an objective point-by-point comparison of model output and observations valid at the same time and location. The resulting statistics can unfairly penalize high-resolution models that make realistic forecasts of a certain phenomenon but are offset from the observations by small time and/or space increments. Manual subjective verification can provide a more valid representation of model performance, but is time-consuming and prone to personal biases. An objective technique that verifies specific meteorological phenomena, much in the way a human would in a subjective evaluation, would likely produce a more realistic assessment of model performance. Such techniques are being developed in the research community. The Applied Meteorology Unit (AMU) was tasked to conduct a literature search to identify phenomenological verification techniques being developed, determine if any are ready to use operationally, and outline the steps needed to implement any operationally ready techniques into the Advanced Weather Information Processing System (AWIPS). The AMU conducted a search of all literature on the topic of phenomenological-based mesoscale model verification techniques and found 10 different techniques in various stages of development. Six of the techniques were developed to verify precipitation forecasts, one to verify sea breeze forecasts, and three were capable of verifying several phenomena. The AMU also determined the feasibility of transitioning each technique into operations and rated the operational capability of each technique on a subjective 1-10 scale: a rating of 1 indicates that the technique is only in the initial stages of development; 2-5 indicates that the technique is still undergoing modifications and is not ready for operations; 6-8 indicates a higher probability of integrating the technique into AWIPS with code modifications; and 9-10 indicates that the technique was created for AWIPS and is ready for implementation. Eight of the techniques were assigned a rating of 5 or below, the other two received ratings of 6 and 7, and none received a rating of 9-10. At the current time, there are no phenomenological model verification techniques ready for operational use. However, several of the techniques described in this report may become viable in the future and should be monitored for updates in the literature. The desire to use a phenomenological verification technique is widespread in the modeling community, and it is likely that other techniques besides those described herein are being developed but not yet published. Therefore, the AMU recommends that the literature continue to be monitored for updates to the techniques described in this report and for new techniques whose results have not yet been published.
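As one concrete illustration of the phenomenon-aware scoring this report surveys, the sketch below computes the fractions skill score (FSS), a neighborhood verification metric for precipitation that does not penalize small spatial offsets the way point-by-point scores do. The report does not single out FSS; the metric choice and the synthetic rain grids are this example's assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fss(forecast, observed, threshold=1.0, window=5):
    f = (forecast >= threshold).astype(float)
    o = (observed >= threshold).astype(float)
    # Fraction of rainy points within each window-sized neighborhood.
    pf = uniform_filter(f, size=window)
    po = uniform_filter(o, size=window)
    mse = np.mean((pf - po) ** 2)
    mse_ref = np.mean(pf ** 2) + np.mean(po ** 2)
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

obs = np.zeros((50, 50)); obs[20:30, 20:30] = 5.0    # observed rain area
fcst = np.zeros((50, 50)); fcst[22:32, 23:33] = 5.0  # same feature, offset

print("point-by-point hit rate:", np.mean((fcst >= 1) & (obs >= 1)))
print("FSS (5x5 neighborhood):", round(fss(fcst, obs), 3))
```

The forecast here places a realistic rain feature a few grid points away from the observed one; the point-by-point score is poor while the neighborhood score stays high, which is exactly the penalty asymmetry the abstract describes.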
[Development of a microenvironment test chamber for airborne microbe research].
Zhan, Ningbo; Chen, Feng; Du, Yaohua; Cheng, Zhi; Li, Chenyu; Wu, Jinlong; Wu, Taihu
2017-10-01
One of the most important environmental cleanliness indicators is the airborne microbe count. However, the particularities of clean operating environments and controlled experimental environments often limit airborne microbe research. This paper describes the design and implementation of a microenvironment test chamber for airborne microbe research under normal test conditions. Numerical simulation with Fluent showed that airborne microbes were evenly dispersed in the upper part of the test chamber and had a bottom-up concentration growth distribution. According to the simulation results, a verification experiment was carried out by selecting 5 sampling points at different spatial positions in the test chamber. Experimental results showed that the average particle concentrations at all sampling points reached 10⁷ counts/m³ after 5 minutes of dispersal of Staphylococcus aureus, and all sampling points showed a consistent concentration distribution. The concentration of airborne microbes in the upper chamber was slightly higher than that in the middle chamber, which in turn was slightly higher than that in the bottom chamber. This is consistent with the results of the numerical simulation, and it proves that the system can be used effectively for airborne microbe research.
Ultra Pure Water Cleaning Baseline Study on NASA JSC Astromaterial Curation Gloveboxes
NASA Technical Reports Server (NTRS)
Calaway, Michael J.; Burkett, P. J.; Allton, J. H.; Allen, C. C.
2013-01-01
Future sample return missions will require strict protocols and procedures for reducing inorganic and organic contamination in isolation containment systems. In 2012, a baseline study was orchestrated to establish the current state of organic cleanliness in gloveboxes used by NASA JSC astromaterials curation labs [1, 2]. As part of this in-depth organic study, the current curatorial technical support procedure (TSP) 23 was used for cleaning the gloveboxes with ultra pure water (UPW) [3-5]. Particle counts and identification were obtained that could be used as a benchmark for future mission designs that require glovebox decontamination. The UPW baseline study demonstrates that TSP 23 works well for gloveboxes that have been thoroughly degreased. However, TSP 23 could be augmented to provide even better glovebox decontamination. JSC 03243 could be used as a starting point for further investigating optimal cleaning techniques and procedures. DuPont Vertrel XF or other chemical substitutes to replace Freon- 113, mechanical scrubbing, and newer technology could be used to enhance glovebox cleanliness in addition to high purity UPW final rinsing. Future sample return missions will significantly benefit from further cleaning studies to reduce inorganic and organic contamination.
NASA Astrophysics Data System (ADS)
Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB
2017-11-01
Increased demand for internet of things (IoT) applications has forced the move towards higher-complexity integrated circuits supporting SoC designs. This increase in complexity poses complicated validation challenges, and has led researchers to develop various methodologies to overcome the problem, bringing about dynamic verification, formal verification, and hybrid techniques. Moreover, it is very important to discover bugs early in the SoC verification process in order to reduce time consumption and achieve a fast time to market. In this paper, we therefore focus on verification methodology at the Register Transfer Level (RTL) of an SoC based on the AMBA bus design. The Open Verification Methodology (OVM) offers an easier approach to RTL validation, not as a replacement for the traditional method but as an effort towards a fast time to market; OVM is thus proposed in this paper as the verification method for larger designs, to avert bottlenecks in the validation platform.
Food safety hazards lurk in the kitchens of young adults.
Byrd-Bredbenner, Carol; Maurer, Jaclyn; Wheatley, Virginia; Cottone, Ellen; Clancy, Michele
2007-04-01
Food mishandling in home kitchens likely causes a significant amount of foodborne disease; however, little is known about the food safety hazards lurking in home kitchens. The purposes of this study were to audit the kitchens of young adults with education beyond high school to identify food safety problems and develop recommendations for education efforts. Researchers developed a criterion-referenced home kitchen observation instrument to assess compliance of home food storage and rotation practices (e.g., temperature), sanitation and chemical storage, and general kitchen condition (e.g., infestation) with recommended practices. The instrument contained seven scales: Kitchen Cleanliness (eight items), Appliance Cleanliness (three items), Cleaning Supplies Availability (eight items), Temperatures (Food Thermometer Access & Refrigerator/Freezer Temperatures) (five items), Cold Food Storage (seven items), Dry Food Storage (eight items), and Poisons Storage (two items). Descriptive statistics were conducted to describe the study population as a whole and by gender. A total of 154 young adults (mean age 20.7 ± 1.3 SD) enrolled in a northeastern university participated. Participants scored 70% or higher on the Poisons Storage, Dry Food Storage, Kitchen Cleanliness, and Cleaning Supplies Availability scales, but less than 60% on the Appliance Cleanliness and Cold Food Storage scales. Performance was lowest on the Temperatures scale. Females scored significantly higher than males on the Kitchen Cleanliness and Cleaning Supplies Availability scales. Average refrigerator and freezer temperatures were higher than recommended. Food safety education targeted at this young adult population needs to evolve into focused messages pertaining to the key food safety violations in this population.
Effect of Cleanliness on Hydrogen Tolerance in High-Strength Steel
2014-04-01
Grendahl, Scott M.; Kellogg, Franklyn; Nguyen, Hoang (U.S. Army Research Laboratory and Bowhead Technical Services). ARL technical report; approved for public release.
2016-01-14
hyperproperty and a liveness hyperproperty. A verification technique for safety hyperproperties is given and is shown to generalize prior techniques for ... liveness properties are affiliated with specific verification methods. An analogous theory for security policies would be appealing. The fact that security ... verified by using invariance arguments. Our verification methodology generalizes prior work on using invariance arguments to verify information-flow
Dosimetric Verification of IMRT Treatment Plans Using an Electronic Portal Imaging Device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kruszyna, Marta
This paper presents the procedures and results of dosimetric verification using an Electronic Portal Imaging Device as a tool for pre-treatment dosimetry in the IMRT technique at the Greater Poland Cancer Centre in Poznan, Poland. An evaluation of the dosimetric verification for various organs over a 2-year period is given.
1984-07-01
results are caused only by the individual aerodynamic cleanliness. A table in the lower part of this figure gives a first impression of the surface quality ... for the Tornado, which will be the first aircraft in operational service in ..., Germany and Italy to be so equipped. ... considering the ... and must be tested to the limits of their capability before being released to Service. The final lesson is that when a high risk trial is undertaken
On verifying a high-level design. [cost and error analysis]
NASA Technical Reports Server (NTRS)
Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.
1993-01-01
An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages that are capable of adequately expressing the design specifications have been developed, but some time will be required before they can have the expressive power needed to be used in real applications. Simulation-based approaches are more useful in finding errors in designs than they are in proving the correctness of a certain design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.
An elementary tutorial on formal specification and verification using PVS
NASA Technical Reports Server (NTRS)
Butler, Ricky W.
1993-01-01
A tutorial on the development of a formal specification and its verification using the Prototype Verification System (PVS) is presented. The tutorial presents the formal specification and verification techniques by way of a specific example: an airline reservation system. The airline reservation system is modeled as a simple state machine with two basic operations. These operations are shown to preserve a state invariant using the theorem proving capabilities of PVS. The technique of validating a specification via 'putative theorem proving' is also discussed and illustrated in detail. This paper is intended for the novice and assumes only some of the basic concepts of logic. A complete description of user inputs and the PVS output is provided, and thus the tutorial can be used effectively while one is sitting at a computer terminal.
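To give a feel for the proof obligations, here is an executable analogue in Python rather than PVS: a toy reservation state machine with two operations and a brute-force check that each operation preserves a state invariant. The operations and invariant are hypothetical stand-ins for the tutorial's specification, and exhaustive search over a tiny state space replaces the theorem prover.

```python
from itertools import product

SEATS = ["1A", "1B"]
PASSENGERS = ["p1", "p2"]

def invariant(state):
    # No seat is assigned to more than one passenger.
    seats = list(state.values())
    return len(seats) == len(set(seats))

def reserve(state, passenger, seat):
    if passenger in state or seat in state.values():
        return state  # operation rejected; state unchanged
    return {**state, passenger: seat}

def cancel(state, passenger):
    return {k: v for k, v in state.items() if k != passenger}

def all_states():
    # Enumerate every small assignment of seats to passengers.
    for assignment in product([None] + SEATS, repeat=len(PASSENGERS)):
        yield {p: s for p, s in zip(PASSENGERS, assignment) if s}

# Check invariant preservation for both operations from every valid state.
ok = all(
    invariant(reserve(st, p, s)) and invariant(cancel(st, p))
    for st in all_states() if invariant(st)
    for p in PASSENGERS for s in SEATS
)
print("invariant preserved:", ok)
```

In PVS the analogous statement would be proved once for all states, rather than checked by enumeration; the value of the exercise is seeing what the per-operation obligations look like.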
NASA Astrophysics Data System (ADS)
Hildebrandt, Mario; Kiltz, Stefan; Krapyvskyy, Dmytro; Dittmann, Jana; Vielhauer, Claus; Leich, Marcus
2011-11-01
A machine-assisted analysis of traces from crime scenes might be possible with the advent of new high-resolution, non-destructive, contact-less acquisition techniques for latent fingerprints. This requires reliable techniques for the automatic extraction of fingerprint features from latent and exemplar fingerprints for matching purposes using pattern recognition approaches. Therefore, we evaluate the NIST Biometric Image Software for the feature extraction and verification of contact-lessly acquired latent fingerprints to determine potential error rates. Our exemplary test setup includes 30 latent fingerprints from 5 people in two test sets that are acquired from different surfaces using a chromatic white light sensor. The first test set includes 20 fingerprints on two different surfaces and is used to determine the feature extraction performance. The second test set includes one latent fingerprint on 10 different surfaces and an exemplar fingerprint to determine the verification performance. The sensing technique utilized does not require physical or chemical visibility enhancement of the fingerprint residue, so the original trace remains unaltered for further investigations. No particular feature extraction and verification techniques have yet been applied to such data; hence, we see the need for appropriate algorithms that are suitable to support forensic investigations.
A new technique for measuring listening and reading literacy in developing countries
NASA Astrophysics Data System (ADS)
Greene, Barbara A.; Royer, James M.; Anzalone, Stephen
1990-03-01
One problem in evaluating educational interventions in developing countries is the absence of tests that adequately reflect the culture and curriculum. The Sentence Verification Technique is a new procedure for measuring reading and listening comprehension that allows for the development of tests based on materials indigenous to a given culture. The validity of using the Sentence Verification Technique to measure reading comprehension in Grenada was evaluated in the present study. The study involved 786 students at standards 3, 4 and 5. The tests for each standard consisted of passages that varied in difficulty. The students identified as high-ability students in all three standards performed better than those identified as low-ability. All students performed better with easier passages. Additionally, students in higher standards performed better than students in lower standards on a given passage. These results supported the claim that the Sentence Verification Technique is a valid measure of reading comprehension in Grenada.
Interpreter composition issues in the formal verification of a processor-memory module
NASA Technical Reports Server (NTRS)
Fura, David A.; Cohen, Gerald C.
1994-01-01
This report describes interpreter composition techniques suitable for the formal specification and verification of a processor-memory module using the HOL theorem proving system. The processor-memory module is a multichip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. Modeling and verification methods were developed that permit provably secure composition at the transaction-level of specification, significantly reducing the complexity of the hierarchical verification of the system.
Signature Verification Using N-tuple Learning Machine.
Maneechot, Thanin; Kitjaidure, Yuttana
2005-01-01
This research presents a new algorithm for signature verification using an N-tuple learning machine. The features are taken from handwritten signatures on a digital tablet (on-line). The recognition algorithm uses four extracted features, namely horizontal and vertical pen-tip position (x-y position), pen-tip pressure, and pen altitude angles. Verification uses the N-tuple technique with Gaussian thresholding.
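For readers unfamiliar with N-tuple (RAM-based) classifiers, the sketch below shows the core mechanism on a binarized feature vector: fixed random tuples of bit positions address small memories trained on genuine samples, and verification scores a probe by how many memories recognize its addresses. The bit length, tuple sizes, and synthetic data are illustrative assumptions, and the paper's Gaussian thresholding step is elided.

```python
import random

N_BITS, TUPLE_SIZE, N_TUPLES = 64, 4, 16
random.seed(1)
# Each "tuple" is a fixed random set of bit positions feeding one RAM node.
tuples = [random.sample(range(N_BITS), TUPLE_SIZE) for _ in range(N_TUPLES)]

def addresses(bits):
    return [tuple(bits[i] for i in t) for t in tuples]

def train(samples):
    memory = [set() for _ in tuples]  # one address set ("RAM") per tuple
    for bits in samples:
        for ram, addr in zip(memory, addresses(bits)):
            ram.add(addr)
    return memory

def score(memory, bits):
    # Fraction of RAM nodes that recognize this sample's address.
    hits = sum(addr in ram for ram, addr in zip(memory, addresses(bits)))
    return hits / len(tuples)

# Synthetic binarized feature vectors (stand-ins for digitized pen signals).
genuine = [[random.random() < 0.8 for _ in range(N_BITS)] for _ in range(10)]
memory = train(genuine)
print("genuine-like:", score(memory, [random.random() < 0.8 for _ in range(N_BITS)]))
print("forgery-like:", score(memory, [random.random() < 0.2 for _ in range(N_BITS)]))
```

A probe drawn from the genuine distribution lands on addresses already stored during training and scores high; a differently distributed probe mostly misses, which is the basis for the accept/reject decision.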
Verification of Autonomous Systems for Space Applications
NASA Technical Reports Server (NTRS)
Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.
2006-01-01
Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of the behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.
Knape, L; Hambraeus, A; Lytsy, B
2015-10-01
The adenosine triphosphate (ATP) method is widely accepted as a quality control method to complement visual assessment, in the specifications of requirements, when purchasing cleaning contractors in Swedish hospitals. The aims were to examine whether the amount of biological load, as measured by ATP on frequently touched near-patient surfaces, had been reduced after an intervention; to evaluate the correlation between visual assessment and ATP levels on the same surfaces; and to identify aspects of the performance of the ATP method as a tool in evaluating hospital cleanliness. A prospective intervention study in three phases was carried out in a medical ward and an intensive care unit (ICU) at a regional hospital in mid-Sweden between 2012 and 2013. Existing cleaning procedures were defined and baseline tests were sampled by visual inspection and ATP measurements of ten frequently touched surfaces in patients' rooms before and after the intervention. The intervention consisted of educating nursing staff about the importance of hospital cleaning and direct feedback of ATP levels before and after cleaning. The mixed model showed a significant decrease in ATP levels after the intervention (P < 0.001). Relative light unit values were lower in the ICU. Cleanliness as judged by visual assessments improved. In the logistic regression analysis, there was a significant association between visual assessments and ATP levels. Direct feedback of ATP levels, together with education and the introduction of written cleaning protocols, was an effective tool to improve cleanliness. Visual assessment correlated with the level of ATP, but the correlation was not absolute. The ATP method could serve as an educational tool for staff, but is not enough to assess hospital cleanliness in general, as only a limited part of a large area is covered.
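The two analyses named in the abstract, a mixed model for the ATP decrease with repeated measures per surface and a logistic regression linking visual judgements to ATP, can be sketched as follows, assuming pandas and statsmodels. All data, column names, and effect sizes below are synthetic illustrations, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 200
df = pd.DataFrame({
    "surface": rng.integers(0, 10, n),   # 10 frequently touched surfaces
    "phase": rng.integers(0, 2, n),      # 0 = baseline, 1 = after intervention
})
df["log_rlu"] = 6.0 - 0.8 * df["phase"] + rng.normal(0, 1, n)  # ATP (log RLU)
df["visually_clean"] = (df["log_rlu"] + rng.normal(0, 1, n) < 5.8).astype(int)

# Mixed model: intervention effect on ATP, surfaces as the grouping factor.
mixed = smf.mixedlm("log_rlu ~ phase", df, groups=df["surface"]).fit()
print(mixed.params)  # expect a negative phase coefficient

# Logistic regression: association between visual rating and ATP level.
logit = smf.logit("visually_clean ~ log_rlu", df).fit(disp=False)
print(logit.params)
```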
Cehreli, Zafer C; Uyanik, M Ozgur; Nagas, Emre; Tuncel, Behram; Er, Nuray; Comert, Fugen Dagli
2013-09-01
To compare the smear layer removal efficacy and erosive effects of different irrigation protocols under clinical and laboratory conditions, mandibular third molars (n = 32) of 30- to 45-year-old patients were instrumented with rotary files and randomly assigned to one of the following groups for final irrigation: (1) 5.25% NaOCl; (2) 17% EDTA; and (3) BioPure MTAD. Thereafter, the teeth were immediately extracted and processed for micromorphological investigation. In vitro specimen pairs were prepared by repeating the clinical experiments on freshly extracted mandibular third molars. To compare open and closed systems, the laboratory experiments were repeated on 32 additional teeth with enlarged apical foramina. The cleanliness of the root canals and the extent of erosion were assessed by environmental scanning electron microscopy. Specimens prepared under clinical and laboratory conditions had similar cleanliness and erosion scores (p > 0.05). Under both conditions, the tested solutions were more effective in removing the smear layer in the coronal and middle regions than in the apical one. Comparison of closed and open systems showed similar levels of cleanliness and erosion in all regions (p > 0.05), with the exception of 17% EDTA, which showed significantly higher levels of cleanliness and erosion in the apical third of open-end specimens. Based on clinical correlates of in vitro root canal cleanliness and erosion, laboratory testing of root canal irrigants on extracted teeth with closed apices can serve as a reliable method to simulate the clinical condition. EDTA was the most effective final irrigation solution in removing the smear layer, at the expense of yielding the greatest erosive effect.
Smoking, tooth brushing and oral cleanliness among 15-year-olds in Tehran, Iran.
Yazdani, Reza; Vehkalahti, Miira M; Nouri, Mahtab; Murtomaa, Heikki
2008-01-01
To assess smoking, tooth brushing and oral cleanliness, and their relationships, among 15-year-olds in Tehran, Iran. A cross-sectional study based on World Health Organization criteria and the methods of the Second International Collaborative Study was carried out in autumn 2004 among 15-year-olds (n = 502) in Tehran. Data were based on a self-administered questionnaire and a clinical dental examination. Smokers comprised 5% of the boys and 2% of the girls (p = 0.02). Smoking was more common among students of less-educated parents (50% vs. 30%, p < 0.05). Of all students, 26% reported twice-daily tooth brushing; those of higher socio-economic backgrounds and girls did so more frequently. Of the smokers, 11% reported no tooth brushing, compared with 6% of the non-smokers. Oral cleanliness was good for 13%, moderate for 32%, and poor for 55%; the rates were positively associated with female gender (p = 0.002), having higher-educated parents (p = 0.03), and reporting a higher frequency of tooth brushing (p < 0.001). Students reporting twice-daily tooth brushing had less dental plaque and gingival bleeding (p ≤ 0.01) on both anterior and posterior teeth. In multivariable analyses, the best predictors for a good level of oral cleanliness were female gender (OR = 2.0) and twice-daily tooth brushing (OR = 1.7). Oral cleanliness and tooth brushing among 15-year-olds were at poor levels, particularly among boys. Such poor levels call for intensive efforts to raise the rate of twice-daily tooth brushing and to improve its quality. For this age group, anti-smoking content should also be integrated into school-based oral health promotion programmes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ohashi, Haruhiko, E-mail: hohashi@spring8.or.jp; Senba, Yasunori; Yumoto, Hirokatsu
We studied typical forms of contamination on X-ray mirrors that cause degradation of beam quality, investigated techniques to remove the contaminants, and propose methods to eliminate the sources of the contamination. The total amount of carbon-containing substances on various materials in the vicinity of a mirror was measured by thermal desorption-gas chromatography/mass spectrometry and thermal desorption spectroscopy. It was found that cleanliness and ultra-high vacuum techniques are required to produce the contamination-free surfaces that are essential for the propagation of high-quality X-ray beams. The reduction of carbonaceous residue adsorbed on the surfaces, and absorbed into the bulk, of the materials in the vicinity of the mirrors is a key step toward achieving contamination-free X-ray optics.
Inclusion Detection in Aluminum Alloys Via Laser-Induced Breakdown Spectroscopy
NASA Astrophysics Data System (ADS)
Hudson, Shaymus W.; Craparo, Joseph; De Saro, Robert; Apelian, Diran
2018-04-01
Laser-induced breakdown spectroscopy (LIBS) has shown promise as a technique to quickly determine molten metal chemistry in real time. Because of its characteristics, LIBS could also be used to sense unwanted inclusions and impurities. Simulated Al2O3 inclusions were added to molten aluminum via a metal-matrix composite, and LIBS was performed in situ to determine whether the particles could be detected. Outlier analysis of the oxygen signal was performed on the LIBS data and compared with the oxide volume fraction measured through metallography. It was determined that LIBS could differentiate between melts with different amounts of inclusions by monitoring the fluctuations in signal for elements of interest. LIBS shows promise as an enabling tool for monitoring metal cleanliness.
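By way of illustration only (not the authors' code), an outlier screen of the kind this abstract describes might look like the following sketch; the simulated shot intensities, the inclusion rate, and the 3-sigma threshold are assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    o_signal = rng.normal(100.0, 5.0, size=1000)            # baseline oxygen-line intensity per laser shot
    hits = rng.choice(1000, size=20, replace=False)
    o_signal[hits] += rng.uniform(30, 80, size=20)          # shots that sampled an oxide inclusion

    z = (o_signal - np.median(o_signal)) / o_signal.std()  # median-centred z-scores
    outlier_fraction = np.mean(z > 3.0)                    # proxy for inclusion content of the melt
    print(f"shots flagged as inclusions: {outlier_fraction:.2%}")

Comparing this flagged fraction across melts mirrors the paper's strategy of relating signal fluctuations to metallographic oxide volume fraction.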
NASA Technical Reports Server (NTRS)
Law, R. D.
1989-01-01
A contaminant is any material or substance which is potentially undesirable or which may adversely affect any part, component, or assembly. Contamination control of SRM hardware surfaces is a serious concern, for both Thiokol and NASA, with particular concern for contaminants which may adversely affect bonding surfaces. The purpose of this study is to develop laboratory analytical techniques which will make it possible to certify the cleanliness of any designated surface, with special focus on particulates (dust, dirt, lint, etc.), oils (hydrocarbons, silicones, plasticizers, etc.), and greases (HD-2, fluorocarbon grease, etc.). The hardware surfaces of concern will include D6AC steel, aluminum alloys, anodized aluminum alloys, glass/phenolic, carbon/phenolic, NBR/asbestos-silica, and EPDM rubber.
NASA Technical Reports Server (NTRS)
Underwood, Lauren
2013-01-01
TiO2-coated surfaces demonstrated, both visually through photographic representation and quantitatively through reflectance measurements, that they improved upon the existing state of cleanliness of the surfaces to which they were applied. TiO2 has the potential both to maintain and to increase a building's sustainability and overall appearance of cleanliness. TiO2-coated slides degraded soot under UV light compared with soot samples on plain uncoated slides under the same conditions. Degradation of soot by photocatalysis was far more apparent than degradation of soot by UV light alone. This demonstration provides the foundation for a laboratory model that could be used to simulate real-world applications for photocatalytic materials. Additional research is required to better understand the full potential of TiO2.
Tang, Honghong; Lu, Xiaping; Su, Rui; Liang, Zilu; Mai, Xiaoqin; Liu, Chao
2017-07-01
The association between moral purity and physical cleanliness has been widely discussed in recent years. Studies have found that moral threat initiates a need for physical cleanliness, but actual physical cleaning and priming of cleaning have inconsistent effects on subsequent attitudes and behaviors. Here, we used resting-state functional magnetic resonance imaging to explore the neural mechanisms underlying actual physical cleaning and priming of cleaning. After recalling a moral transgression with strong feelings of guilt and shame, participants either actually cleaned their faces with a wipe or were primed with cleanliness by viewing pictures of it. Results showed that actual physical cleaning reduced spontaneous brain activity in the right insula and MPFC, regions involved in embodied moral emotion processing, while priming of cleaning decreased activity in the right superior frontal gyrus and middle frontal gyrus, regions involved in executive control processing. Additionally, actual physical cleaning also changed functional connectivity between the insula/MPFC and emotion-related regions, whereas priming of cleaning modified connectivity within both moral and sensorimotor areas. These findings reveal that actual physical cleaning and priming of cleaning led to changes in different brain regions and networks, providing neural evidence for the inconsistent effects of cleanliness on subsequent attitudes and behaviors. © The Author (2017). Published by Oxford University Press.
NASA Technical Reports Server (NTRS)
1995-01-01
The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.
Assume-Guarantee Verification of Source Code with Design-Level Assumptions
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.
2004-01-01
Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.
Evidence Points To 'Gaming' At Hospitals Subject To National Health Service Cleanliness Inspections.
Toffolutti, Veronica; McKee, Martin; Stuckler, David
2017-02-01
Inspections are a key way to monitor and ensure quality of care and maintain high standards in the National Health Service (NHS) in England. Yet there is a perception that inspections can be gamed. This can happen, for example, when staff members know that an inspection will soon take place. Using data for 205 NHS hospitals for the period 2011-14, we tested whether patients' perceptions of cleanliness increased during periods when inspections occurred. Our results show that during the period within two months of an inspection, there was a significant elevation (2.5-11.0 percentage points) in the share of patients who reported "excellent" cleanliness. This association was consistent even after adjustment for secular time trends. The association was concentrated in hospitals that outsourced cleaning services and was not detected in those that used NHS cleaning services. Project HOPE—The People-to-People Health Foundation, Inc.
NASA Astrophysics Data System (ADS)
Géraud-Grenier, I.; Desdions, W.; Faubert, F.; Mikikian, M.; Massereau-Guilbaud, V.
2018-01-01
The methane decomposition in a planar RF discharge (13.56 MHz) leads both to dust-particle generation in the plasma bulk and to coating growth on the electrodes. Growing dust-particles fall onto the grounded electrode when they become too heavy. Thus, at the end of the experiment, the grounded electrode is covered by a coating and by fallen dust-particles. During dust-particle growth, the negative DC self-bias voltage (VDC) increases because fewer electrons reach the RF electrode, leading to a more resistive plasma and to changes in the plasma chemical composition. In this paper, the influence of RF-electrode cleanliness on dust-particle growth and on the plasma characteristics and composition is investigated. A clean electrode is one with no coating or dust-particles on its surface at the beginning of the experiment.
NASA Technical Reports Server (NTRS)
Fay, M.
1998-01-01
This Contamination Control Plan is submitted in response to the Contract Documents Requirements List (CDRL) 007 under contract NAS5-32314 for the Earth Observing System (EOS) Advanced Microwave Sounding Unit A (AMSU-A). In response to the CDRL instructions, this document defines the level of cleanliness and the methods/procedures to be followed to achieve adequate cleanliness/contamination control, and defines the required approach to maintaining cleanliness/contamination control through shipping, observatory integration, test, and flight. This plan is also applicable to the Meteorological Satellite (METSAT) except where requirements are identified as EOS-specific. This plan is based on two key factors: a. The EOS/METSAT AMSU-A instruments are not highly contamination-sensitive. b. Potential contamination of other EOS instruments is a key concern, as addressed in Section 9.0 of the Performance Assurance Requirements for EOS/METSAT Integrated Programs AMSU-A Instrument (MR) (NASA Specification S-480-79).
NASA Technical Reports Server (NTRS)
1975-01-01
The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.
Verification, Validation and Sensitivity Studies in Computational Biomechanics
Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.
2012-01-01
Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646
Expert system verification and validation study. Delivery 3A and 3B: Trip summaries
NASA Technical Reports Server (NTRS)
French, Scott
1991-01-01
Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation.' The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.
Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn E.; Munoz, Cesar A.
2009-01-01
This paper presents the specification and verification in the Prototype Verification System (PVS) of a protocol intended to facilitate communication in an experimental remotely operated vehicle used by NASA researchers. The protocol is defined as a stack-layered composition of simpler protocols. It can be seen as the vertical composition of protocol layers, where each layer performs input and output message processing, and the horizontal composition of different processes concurrently inhabiting the same layer, where each process satisfies a distinct requirement. It is formally proven that the protocol components satisfy certain delivery guarantees. Compositional techniques are used to prove that these guarantees also hold in the composed system. Although the protocol itself is not novel, the methodology employed in its verification extends existing techniques by automating the tedious and usually cumbersome part of the proof, thereby making the iterative design process of protocols feasible.
Minimization of nanosatellite low frequency magnetic fields.
Belyayev, S M; Dudkin, F L
2016-03-01
Small weight and dimensions of micro- and nanosatellites constrain researchers to place electromagnetic sensors on short booms or on the satellite body. The electromagnetic cleanliness of such satellites therefore becomes a central question. This paper describes the theoretical basis and practical techniques for determining the parameters of DC and very low frequency magnetic interference sources. One such source is satellite magnetization, the reduction of which improves the accuracy and stability of the attitude control system. We present design solutions for magnetically clean spacecraft, testing equipment, and technology for magnetic moment measurements that are more convenient, efficient, and accurate than the conventional ones.
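To make the magnetic-moment arithmetic concrete (an editorial sketch, not the authors' procedure): on the dipole axis B = mu0*2m/(4*pi*r^3), so a field reading at a known distance yields the satellite's dipole moment and lets the field be extrapolated to a boom-mounted sensor. All numbers below are illustrative.

    import math

    MU0 = 4e-7 * math.pi   # vacuum permeability, T*m/A

    def moment_from_axial_field(B_tesla, r_m):
        # invert the on-axis dipole formula to recover the moment
        return B_tesla * 4 * math.pi * r_m**3 / (2 * MU0)

    def axial_field(m_Am2, r_m):
        return MU0 * 2 * m_Am2 / (4 * math.pi * r_m**3)

    m = moment_from_axial_field(100e-9, 1.0)          # 100 nT measured 1 m away
    print(f"moment = {m:.3f} A*m^2")                  # -> 0.500 A*m^2
    print(f"field at 0.5 m boom: {axial_field(m, 0.5) * 1e9:.0f} nT")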
Note: Work function change measurement via improved Anderson method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sabik, A., E-mail: sabik@ifd.uni.wroc.pl; Gołek, F.; Antczak, G.
We propose a modification to the Anderson method of work function change (Δϕ) measurement. In this technique, the kinetic energy of the probing electrons is low enough for non-destructive investigation of delicate molecular systems. In our implementation, however, all electrodes, including the filament of the electron gun, are polarized positively. As a consequence, electron bombardment of any element of the experimental system is eliminated. Our modification improves the cleanliness of the ultra-high vacuum system. As an illustration of the solution's capabilities, we present Δϕ of the Ag(100) surface induced by cobalt phthalocyanine layers.
Sterilization, high-level disinfection, and environmental cleaning.
Rutala, William A; Weber, David J
2011-03-01
Failure to perform proper disinfection and sterilization of medical devices may lead to introduction of pathogens, resulting in infection. New techniques have been developed for achieving high-level disinfection and adequate environmental cleanliness. This article examines new technologies for sterilization and high-level disinfection of critical and semicritical items, respectively, and because semicritical items carry the greatest risk of infection, the authors discuss reprocessing semicritical items such as endoscopes and automated endoscope reprocessors, endocavitary probes, prostate biopsy probes, tonometers, laryngoscopes, and infrared coagulation devices. In addition, current issues and practices associated with environmental cleaning are reviewed. Copyright © 2011. Published by Elsevier Inc.
Contamination detection NDE for cleaning process inspection
NASA Technical Reports Server (NTRS)
Marinelli, W. J.; Dicristina, V.; Sonnenfroh, D.; Blair, D.
1995-01-01
In the joining of multilayer materials, and in welding, the cleanliness of the joining surface may play a large role in the quality of the resulting bond. No non-intrusive techniques are currently available for the rapid measurement of contamination on large or irregularly shaped structures prior to the joining process. An innovative technique for the measurement of contaminant levels in these structures using laser based imaging is presented. The approach uses an ultraviolet excimer laser to illuminate large and/or irregular surface areas. The UV light induces fluorescence and is scattered from the contaminants. The illuminated area is viewed by an image-intensified CCD (charge coupled device) camera interfaced to a PC-based computer. The camera measures the fluorescence and/or scattering from the contaminants for comparison with established standards. Single shot measurements of contamination levels are possible. Hence, the technique may be used for on-line NDE testing during manufacturing processes.
Formal Verification for a Next-Generation Space Shuttle
NASA Technical Reports Server (NTRS)
Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)
2002-01-01
This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.
Hydrologic data-verification management program plan
Alexander, C.W.
1982-01-01
Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program using multiple special-purpose subroutines, and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
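A minimal sketch of the screening the report envisions (not USGS code): a "screen file" of per-station criteria drives range and rate-of-change checks before data reach user-accessible files. The field names and limits below are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class ScreenCriteria:
        min_value: float   # lowest plausible reading for the station
        max_value: float   # highest plausible reading
        max_step: float    # largest credible change between successive readings

    def verify(series, crit):
        """Return (index, reason) for each reading that fails the screen."""
        flagged = []
        for i, v in enumerate(series):
            if not crit.min_value <= v <= crit.max_value:
                flagged.append((i, "range"))
            elif i > 0 and abs(v - series[i - 1]) > crit.max_step:
                flagged.append((i, "spike"))
        return flagged

    readings = [1.2, 1.3, 1.25, 9.8, 1.28]   # hypothetical gauge heights, m
    print(verify(readings, ScreenCriteria(0.0, 5.0, 0.5)))
    # -> [(3, 'range'), (4, 'spike')]

In a production system the criteria would live in the screen file and the flagged values would be routed to manual review rather than dropped.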
NASA Astrophysics Data System (ADS)
Kuseler, Torben; Lami, Ihsan; Jassim, Sabah; Sellahewa, Harin
2010-04-01
The use of mobile communication devices with advanced sensors is growing rapidly. These sensors enable functions such as image capture, location applications, and biometric authentication such as fingerprint verification and face and handwritten signature recognition. Such ubiquitous devices are essential tools in today's global economic activities, enabling anywhere-anytime financial and business transactions. Cryptographic functions and biometric-based authentication can enhance the security and confidentiality of mobile transactions. Biometric template security techniques in real-time biometric-based authentication are key factors for successful identity verification solutions, but they are vulnerable to determined attacks by both fraudulent software and hardware. The EU-funded SecurePhone project has designed and implemented a multimodal biometric user authentication system on a prototype mobile communication device. However, various implementations of this project have resulted in long verification times or reduced accuracy and/or security. This paper proposes the use of built-in self-test techniques to ensure that no tampering has taken place in the verification process prior to performing the actual biometric authentication. These techniques utilise the user's personal identification number as a seed to generate a unique signature, which is then used to test the integrity of the verification process. This study also proposes the use of a combination of biometric modalities to provide application-specific authentication in a secure environment, thus achieving an optimum security level with effective processing time, i.e., ensuring that the necessary authentication steps and algorithms running on the mobile device's application processor cannot be undermined or modified by an impostor to gain unauthorized access to the secure system.
NASA Technical Reports Server (NTRS)
Hansen, Patricia A.; Hughes, David W.; Hedgeland, Randy J.; Chivatero, Craig J.; Studer, Robert J.; Kostos, Peter J.
1994-01-01
The Scientific Instrument Protective Enclosures were designed for the Hubble Space Telescope Servicing Missions to provide a benign environment for a Scientific Instrument during ground and on-orbit activities. The Scientific Instruments required very stringent surface cleanliness and molecular outgassing levels to maintain ultraviolet performance. Data from the First Servicing Mission verified that both the Scientific Instruments and the Scientific Instrument Protective Enclosures met surface cleanliness level requirements during ground and on-orbit activities.
Salsgiver, Elizabeth; Bernstein, Daniel; Simon, Matthew S; Greendyke, William; Jia, Haomiao; Robertson, Amy; Salter, Selma; Schuetz, Audrey N; Saiman, Lisa; Furuya, E Yoko; Calfee, David P
2018-05-01
The correlation between ATP concentration and bacterial burden in the patient care environment was assessed. These findings suggest that a correlation exists between ATP concentration and bacterial burden, and they generally support ATP technology manufacturer-recommended cutoff values. Despite relatively modest discriminative ability, this technology may serve as a useful proxy for cleanliness. Infect Control Hosp Epidemiol 2018;39:622-624.
Chen, Lu; Xu, YingJun; Zhang, Fengxia; Yang, Qingfeng; Yuan, Juxiang
2016-11-01
Dirty medical lead clothes, contaminated with blood or other infectious material, may carry an ongoing bioburden, which increases the risk of hospital-acquired infection. In this study, we investigated the contamination levels of medical lead clothes and assessed the effectiveness of an intervention constructed to improve their cleanliness. Copyright © 2016 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Davidson, C A; Griffith, C J; Peters, A C; Fielding, L M
1999-01-01
The minimum bacterial detection limits and operator reproducibility of the Biotrace Clean-Trace™ Rapid Cleanliness Test and traditional hygiene swabbing were determined. Areas (100 cm2) of food grade stainless steel were separately inoculated with known levels of Staphylococcus aureus (NCTC 6571) and Escherichia coli (ATCC 25922). Surfaces were sampled either immediately after inoculation while still wet, or after 60 min when completely dry. For both organisms the minimum detection limit of the ATP Clean-Trace™ Rapid Cleanliness Test was 10^4 cfu/100 cm2 (p < 0.05) and was the same for wet and dry surfaces. Both organism type and surface status (i.e. wet or dry) influenced the minimum detection limits of hygiene swabbing, which ranged from 10^2 cfu/100 cm2 to >10^7 cfu/100 cm2. Hygiene swabbing percentage recovery rates for both organisms were less than 0.1% for dried surfaces but ranged from 0.33% to 8.8% for wet surfaces. When assessed by six technically qualified operators, the Biotrace Clean-Trace™ Rapid Cleanliness Test gave superior reproducibility for both clean and inoculated surfaces, giving mean coefficients of variation of 24% and 32%, respectively. Hygiene swabbing of inoculated surfaces gave a mean CV of 130%. The results are discussed in the context of hygiene monitoring within the food industry. Copyright 1999 John Wiley & Sons, Ltd.
Study on contaminants on flight and other critical surfaces
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Hughes, Charles; Arendale, William F.
1994-01-01
The control of surface contamination in the manufacture of space hardware can become a critical step in the production process. Bonded surfaces have been shown to be affected markedly by contamination, so it is important to ensure surface cleanliness by preventing contamination prior to bonding. In this vein, techniques are needed by which contaminants that may affect bonding can easily be found and removed. Likewise, if materials that are detrimental to bonding are not easily removed, they should not be used in the manufacturing process. This study addresses the development of techniques to locate and quantify contamination levels of particular contaminants. With other data becoming available from MSFC and its contractors, this study will also quantify how certain contaminants affect bondlines and how easily they are removed in manufacturing.
NASA Astrophysics Data System (ADS)
1989-01-01
A "NASA Tech Briefs" article describing an inspection tool and technique known as Optically Stimulated Electron Emission (OSEE) led to the formation of Photo Acoustic Technology, Inc. (PAT). PAT produces sensors and scanning systems which assure surface cleanliness prior to bonding, coating, painting, etc. The company's OP1000 series realtime pre-processing detection capability assures 100 percent surface quality testing. The technique involves brief exposure of the inspection surface to ultraviolet radiation. The energy interacts with the surface layer, causing free electrons to be emitted from the surface to be picked up by the detector. When contamination is present, it interferes with the electron flow in proportion to the thickness of the contaminant layer enabling measurement by system signal output. OP1000 systems operate in conventional atmospheres on all types of material and detect both organic and inorganic contamination.
Evaluation of HCFC AK 225 Alternatives for Precision Cleaning and Verification
NASA Technical Reports Server (NTRS)
Melton, D. M.
1998-01-01
Maintaining qualified cleaning and verification processes is essential in a production environment. Environmental regulations have impacted, and continue to impact, cleaning and verification processing for components and large structures, both at the Michoud Assembly Facility and at component suppliers. The goal of the effort was to assure that cleaning and verification proceed unimpeded and that qualified, environmentally compliant material and process replacements are implemented and perform to specifications. The approach consisted of (1) selection of a Supersonic Gas-Liquid Cleaning System; (2) selection and evaluation of three cleaning and verification solvents as candidate alternatives to HCFC 225 (Vertrel 423 (HCFC), Vertrel MCA (HFC/1,2-dichloroethylene), and HFE 7100DE (HFE/1,2-dichloroethylene)); and (3) evaluation of an analytical instrumental post-cleaning verification technique. This document is presented in viewgraph format.
2016-10-01
...comes when considering numerous scores and statistics during a preliminary evaluation of the applicability of the fuzzy-verification minimum coverage... The selection of thresholds with which to generate categorical-verification scores and statistics from the application of both traditional and... of statistically significant numbers of cases; the latter presents a challenge of limited application for assessment of the forecast models' ability...
QPF verification using different radar-based analyses: a case study
NASA Astrophysics Data System (ADS)
Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.
2009-09-01
Verification of QPF in NWP models has always been challenging, not only in knowing which scores best quantify a particular skill of a model, but also in choosing the most appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can provide conclusions that disagree with those obtained by the "eyeball" method; consequently, QPF can provide valuable information to forecasters in spite of poor scores. On the other hand, there are difficulties in knowing the "truth", so different results can be obtained depending on the procedures used to produce the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain a better knowledge of the skills of a forecasting system. In particular, a short-range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied in a local convective precipitation event that took place in the NE Iberian Peninsula on 3 October 2008. For this purpose, a variety of verification methods (dichotomous, recalibration, and object-oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.
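As a concrete illustration of the dichotomous (contingency-table) verification this abstract mentions (an editorial sketch, not the study's code), the snippet below computes three standard categorical scores from forecast and observed exceedances of a rain threshold; the arrays and the 10-mm threshold are invented.

    import numpy as np

    def dichotomous_scores(forecast_mm, observed_mm, threshold_mm=10.0):
        f = forecast_mm >= threshold_mm
        o = observed_mm >= threshold_mm
        hits = np.sum(f & o)
        misses = np.sum(~f & o)
        false_alarms = np.sum(f & ~o)
        pod = hits / (hits + misses)                 # probability of detection
        far = false_alarms / (hits + false_alarms)   # false alarm ratio
        csi = hits / (hits + misses + false_alarms)  # critical success index
        return pod, far, csi

    fcst = np.array([12.0, 3.0, 25.0, 8.0, 15.0])   # hypothetical grid-point rainfall, mm
    obs  = np.array([10.0, 6.0, 2.0, 14.0, 20.0])
    print(dichotomous_scores(fcst, obs))            # -> (0.667, 0.333, 0.5)

Repeating the computation against several radar-based analyses of the same event is exactly where the scores can diverge, which is the study's point.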
Re-Examining the Agentic Shift: The Sense of Agency Influences the Effectiveness of (Self)Persuasion
Damen, Tom G. E.; Müller, Barbara C. N.; van Baaren, Rick B.; Dijksterhuis, Ap
2015-01-01
In the present study we investigated whether differences in the sense of agency influenced the effectiveness of both direct persuasion and self-persuasion techniques. By manipulating both the delay and contingency of the outcomes of actions, participants were led to experience either a low or high sense of agency. Participants were subsequently presented with arguments as to why a clean local environment is important (direct persuasion), or were asked to generate those arguments themselves (self-persuasion). Subsequently, participants’ cleanliness attitudes and willingness to participate in a campus cleanup were measured. The results show that techniques of direct persuasion influenced attitudes and volunteering behavior under conditions of low rather than high agency, whereas techniques of self-persuasion were most effective under conditions of high rather than low agency. The present findings therefore show how recent experiences of agency, a state based experience of control, can influence the effectiveness of both external and internal persuasion techniques. PMID:26053303
High-Resolution Fast-Neutron Spectrometry for Arms Control and Treaty Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
David L. Chichester; James T. Johnson; Edward H. Seabury
2012-07-01
Many nondestructive nuclear analysis techniques have been developed to support the measurement needs of arms control and treaty verification, including gross photon and neutron counting, low- and high-resolution gamma spectrometry, time-correlated neutron measurements, and photon and neutron imaging. One notable measurement technique that has not been extensively studied to date for these applications is high-resolution fast-neutron spectrometry (HRFNS). Applied to arms control and treaty verification, HRFNS has the potential to serve as a complementary measurement approach to these other techniques by providing a means to either qualitatively or quantitatively determine the composition and thickness of non-nuclear materials surrounding neutron-emitting materials. The technique uses the normally occurring neutrons present in arms control and treaty verification objects of interest as an internal source of neutrons for performing active-interrogation transmission measurements. Most low-Z nuclei of interest for arms control and treaty verification, including 9Be, 12C, 14N, and 16O, possess fast-neutron resonance features in their absorption cross sections in the 0.5- to 5-MeV energy range. Measuring the selective removal of source neutrons over this energy range, assuming for example a fission-spectrum starting distribution, may be used to estimate the stoichiometric composition of intervening materials between the neutron source and detector. At a simpler level, determination of the emitted fast-neutron spectrum may be used for fingerprinting 'known' assemblies for later use in template-matching tests. As with photon spectrometry, automated analysis of fast-neutron spectra may be performed to support decision-making and reporting systems protected behind information barriers. This paper reports recent work at Idaho National Laboratory to explore the feasibility of using HRFNS for arms control and treaty verification applications, including simulations and experiments using fission-spectrum neutron sources to assess neutron transmission through composite low-Z attenuators.
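The transmission physics underlying HRFNS can be sketched in a few lines (an editorial illustration, not the laboratory's model): the surviving fraction of source neutrons follows T(E) = exp(-n*sigma(E)*t), so energy-dependent dips in the measured spectrum betray the attenuator's composition. The cross-section values below are round illustrative numbers, not evaluated nuclear data.

    import math

    N_A = 6.022e23   # Avogadro's number, atoms/mol

    def transmission(sigma_barns, density_g_cm3, molar_mass, thickness_cm):
        n = density_g_cm3 * N_A / molar_mass            # atom number density, cm^-3
        return math.exp(-n * sigma_barns * 1e-24 * thickness_cm)

    # e.g. a carbon slab probed at two neutron energies with ~2 b vs ~1 b cross sections
    for sigma in (2.0, 1.0):
        print(f"sigma = {sigma} b -> T = {transmission(sigma, 1.7, 12.0, 5.0):.3f}")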
Formal specification and verification of Ada software
NASA Technical Reports Server (NTRS)
Hird, Geoffrey R.
1991-01-01
The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is also described. The Penelope user inputs mathematical definitions, Larch-style specifications, and Ada code, and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.
Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth
2016-01-01
If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.
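As a toy illustration of the verification goal (not the authors' method or tool), the sketch below exhaustively checks that a triplicated copy of a small combinational design both preserves functionality and masks any single-module fault; the design function itself is an arbitrary example.

    from itertools import product

    def design(a, b, c):                 # original combinational logic (arbitrary example)
        return (a & b) ^ c

    def voter(x, y, z):                  # 2-of-3 majority voter
        return (x & y) | (y & z) | (x & z)

    def tmr(a, b, c, fault=None):        # fault = index of the module whose output is flipped
        outs = [design(a, b, c)] * 3
        if fault is not None:
            outs[fault] ^= 1
        return voter(*outs)

    for bits in product([0, 1], repeat=3):
        assert tmr(*bits) == design(*bits)               # functionality preserved, no faults
        for f in range(3):
            assert tmr(*bits, fault=f) == design(*bits)  # any single-module fault is masked
    print("TMR insertion verified on the toy design")

Real netlists are far too large for exhaustive enumeration, which is why the paper pairs formal equivalence tools with a structural search for the expected TMR topology.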
NASA Astrophysics Data System (ADS)
Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.
Today's software for aerospace systems is typically very complex, owing to the increasing number of features as well as the high demands for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. Handling the software complexity requires a structured development process, and compliance with relevant quality assurance standards is a mandatory concern. To assure high software quality, verification techniques are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and consequently achieve full test coverage. Nevertheless, despite the obvious advantages, this technique is as yet rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy, and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus; in particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.
NASA Astrophysics Data System (ADS)
Maulidya, Hasana P.
2018-02-01
This research is motivated by differences in market hygiene conditions, where the level of market hygiene is influenced by the environment around the market. In general, markets located near densely populated housing tend to be neglected, while markets near elite housing tend to be clean. This condition is also influenced by sellers' awareness of market hygiene: if the market is near an elite neighbourhood, sellers' awareness of cleanliness tends to be high, whereas in densely populated areas sellers generally pay little attention to cleanliness. The purpose of this research is to assess sellers' awareness of the environmental cleanliness of Bulak Market, Klender Market and Rawamangun Market. Respondents in this study are sellers and buyers who transact in these three markets. The study is a descriptive analysis based on observation and interviews with 10 sellers in each market. Sellers' hygiene awareness was found to be poor.
Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth A.
2016-01-01
We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
State of personal hygiene among primary school children: A community based cohort study.
Ahmadu, Baba Usman; Rimamchika, Musa; Ibrahim, Ahmad; Nnanubumom, Andy Angela; Godiya, Andrew; Emmanuel, Pembi
2013-01-01
Good personal hygiene in primary school children can be effective in preventing infectious diseases. This work examined the personal cleanliness of primary school children in Banki based on the following variables: bathing, state of uniforms, hair, nails and oral hygiene. One hundred and fifty primary school children in the Banki community were selected using the cluster random sampling method. Analysis of variance was used to compare means and to test for significance, and the coefficient of correlation was used to investigate the relationship between cleanliness and age of subjects. There were 87 (58%) boys and 63 (42%) girls, a ratio of 1.4:1. Ninety-six (64%) pupils belonged to a low socioeconomic class. While 53 (35.3%) were in the 11-13-year age group, the overall mean age was 9 years (standard deviation [SD] 2.2), 95% CI (7.0-11.0) years. Comparing means for the different categories of personal hygiene, there was a significant difference (F = 61.47, p < 0.0001). General personal cleanliness in our participants improved with age, and a positive significant correlation was observed between age and personal cleanliness (r = 0.971, p = 0.026). In conclusion, a significant number of primary school pupils in the Banki community had good personal hygiene, which was observed to be directly proportional to age. Therefore, all efforts towards quality health education on personal hygiene as a means of primary prevention of illness in primary school pupils should be sustained.
2010-03-01
...is to develop a novel clinically useful delivered-dose verification protocol for modern prostate VMAT using an Electronic Portal Imaging Device (EPID) and onboard cone-beam computed tomography (CBCT)... A number of important milestones have been accomplished, which include (i) a calibrated CBCT HU vs. electron density curve; (ii)... The specific aims of this project...
77 FR 64596 - Proposed Information Collection (Income Verification) Activity: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-22
... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0518] Proposed Information Collection (Income Verification)... to income-dependent benefits. DATES: Written comments and recommendations on the proposed collection... techniques or the use of other forms of information technology. Title: Income Verification, VA Form 21-0161a...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peter Kneisel
2005-03-19
This contribution summarizes the surface preparation procedures for niobium cavities presently used both in laboratory experiments and for modules, such as buffered chemical polishing (BCP), electropolishing (EP), high-pressure ultrapure water rinsing (HPR), CO2 snow cleaning, and high-temperature heat treatments for hydrogen degassing or post-purification. The impact of surface treatments and of the degree of cleanliness during assembly procedures on cavity performance (Q-value and accelerating gradient Eacc) is discussed. In addition, an attempt is made to summarize the experience gained in module assemblies in different labs/projects such as DESY (TTF), JLab (Upgrade) and SNS.
Characterization of welded HP 9-4-30 steel for the advanced solid rocket motor
NASA Technical Reports Server (NTRS)
Watt, George William
1990-01-01
Solid rocket motor case materials must be high-strength, high-toughness, weldable alloys. The Advanced Solid Rocket Motor (ASRM) cases currently being developed will be made from a 9Ni-4Co quench-and-temper steel called HP 9-4-30. These ultra-high-strength steels must be carefully processed to give a very clean material and a fine-grained microstructure, which ensures excellent ductility and toughness. The HP 9-4-30 steels are vacuum arc remelted and carbon deoxidized to give the required cleanliness. The ASRM case material will be formed into rings and then welded together to form the case segments. Welding is the desired joining technique because it results in lower weight than other joining techniques. The mechanical and corrosion properties of the weld-region material were fully studied.
Corrigan, Damion K; Cauchi, Michael; Piletsky, Sergey; Mccrossen, Sean
2009-01-01
Cleaning verification is the process by which pharmaceutical manufacturing equipment is determined as sufficiently clean to allow manufacture to continue. Surface-enhanced Raman spectroscopy (SERS) is a very sensitive spectroscopic technique capable of detection at levels appropriate for cleaning verification. In this paper, commercially available Klarite SERS substrates were employed in order to obtain the necessary enhancement of signal for the identification of chemical species at concentrations of 1 to 10 ng/cm2, which are relevant to cleaning verification. The SERS approach was combined with principal component analysis in the identification of drug compounds recovered from a contaminated steel surface.
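As an editorial sketch of the chemometric step named in this abstract (not the authors' code), the snippet below projects synthetic single-band spectra, standing in for SERS spectra of recovered drug compounds, onto their first principal components; the band positions, noise level, and replicate counts are invented.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(2)
    shift = np.linspace(400, 1800, 700)     # Raman shift axis, cm^-1

    def spectrum(peak):                     # one noisy single-band spectrum
        return np.exp(-0.5 * ((shift - peak) / 15) ** 2) + rng.normal(0, 0.02, shift.size)

    # two hypothetical compounds with bands at 1000 and 1350 cm^-1, 10 replicates each
    X = np.array([spectrum(1000) for _ in range(10)] + [spectrum(1350) for _ in range(10)])
    scores = PCA(n_components=2).fit_transform(X)
    print(scores[:10, 0].mean(), scores[10:, 0].mean())   # the two compounds separate along PC1

In a cleaning-verification setting, the score plot lets a swab extract be assigned to a known residue class at the ng/cm2 level the enhancement makes accessible.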
Test load verification through strain data analysis
NASA Technical Reports Server (NTRS)
Verderaime, V.; Harrington, F.
1995-01-01
A traditional binding acceptance criterion on polycrystalline structures is the experimental verification of the ultimate factor of safety. At fracture, the induced strain is inelastic and about an order of magnitude greater than the design strain at the maximum expected operational limit. In this extreme strained condition, the structure may rotate and displace under the applied verification load so as to unknowingly distort the load transfer into the static test article. The test may then erroneously accept a submarginal design or reject a reliable one. A technique was developed to identify, monitor, and assess the load transmission error through back-to-back surface-measured strain data. The technique is programmed for expediency and convenience. Though the method was developed to support affordable aerostructures, it is also applicable to most high-performance air and surface transportation structural systems.
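A minimal sketch of the underlying idea (assumed gauge readings, not data from the report): back-to-back gauges on opposite surfaces let the membrane and bending parts of the response be separated, so an unplanned rotation of the load path shows up as growing bending content during the test.

    def decompose(eps_front, eps_back):
        membrane = 0.5 * (eps_front + eps_back)   # strain carried by the intended load path
        bending = 0.5 * (eps_front - eps_back)    # parasitic bending content
        return membrane, bending

    # hypothetical readings at two load steps of a nominally axial test
    for front, back in [(1000e-6, 980e-6), (1500e-6, 1100e-6)]:
        m, b = decompose(front, back)
        print(f"membrane = {m:.6f}  bending/membrane = {b / m:.1%}")
        # a rising bending/membrane ratio flags distorted load transfer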
Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk
NASA Technical Reports Server (NTRS)
Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.
2014-01-01
The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies project on aviation safety risk was assessed. Software and compositional verification are described. Traditional verification techniques have two major problems: testing occurs at the prototype stage, where error discovery can be quite costly, and the inability to test all potential interactions leaves some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly large and complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.
The use of filter media to determine filter cleanliness
NASA Astrophysics Data System (ADS)
Van Staden, S. J.; Haarhoff, J.
It is generally believed that a sand filter starts its life with new, perfectly clean media, which becomes gradually clogged with each filtration cycle, eventually reaching a point where either head loss or filtrate quality starts to deteriorate. At this point the backwash cycle is initiated and, through the combined action of air and water, returns the media to its original, perfectly clean state. Reality, however, dictates otherwise. Many treatment plants visited a decade or more after commissioning are found to have unacceptably dirty filter sand and backwash systems incapable of returning the filter media to a desired state of cleanliness. Some of these problems are common ones encountered in filtration plants, but many reasons for media deterioration remain elusive and fall outside these common problems. The South African conditions of highly eutrophic surface waters at high temperatures, however, exacerbate the problems with dirty filter media. Such conditions often lead to the formation of biofilm in the filter media, which is shown to inhibit the effective backwashing of sand and carbon filters. A systematic investigation into filter media cleanliness was therefore carried out from 2002 to 2005 at the University of Johannesburg (then the Rand Afrikaans University). This involved media from eight South African water treatment plants, varying between sand and sand-anthracite combinations and raw water types from eutrophic through turbid to low-turbidity waters. Five states of cleanliness and four fractions of specific deposit were identified, relating to in situ washing, column washing, cylinder inversion and acid-immersion techniques. These were measured and the results compared to acceptable limits for specific deposit, as determined in previous studies, expressed in kg/m3. These values were used to determine the state of the filters. In order to gain greater insight into the composition of the specific deposits stripped from the media, a four-point characterisation step was introduced for the resultant suspensions based on acid-solubility and volatility. Results showed that a reasonably effective backwash removed a median specific deposit of 0.89 kg/m3. Further washing in a laboratory column removed a median specific deposit of 1.34 kg/m3. Media subjected to a standardised cylinder inversion procedure yielded a median specific deposit of 2.41 kg/m3. Immersion in a strong acid removed a median specific deposit of 35.2 kg/m3. The four-point characterisation step showed that the soluble-volatile fraction was consistently small in relation to the other fractions. The organic fraction was quite high at the RG treatment plant, and the soluble-non-volatile fraction was particularly high at the BK treatment plant.
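The arithmetic behind the reported figures is simple (an editorial sketch with invented sample numbers, not the study's data): specific deposit is the dry mass of material stripped from a media sample divided by the bulk volume of that sample, expressed in kg/m3.

    def specific_deposit(dry_mass_removed_g, sample_volume_ml):
        # (g -> kg) / (mL -> m^3)
        return (dry_mass_removed_g / 1000.0) / (sample_volume_ml / 1.0e6)

    stages = {
        "after plant backwash": specific_deposit(0.45, 500),     # -> 0.90 kg/m3
        "after column wash": specific_deposit(0.67, 500),        # -> 1.34 kg/m3
        "after cylinder inversion": specific_deposit(1.20, 500), # -> 2.40 kg/m3
    }
    for stage, sd in stages.items():
        print(f"{stage}: {sd:.2f} kg/m3")

Comparing each stage's value against the acceptable-limit thresholds from earlier studies is what assigns a filter to one of the five cleanliness states.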
NASA Technical Reports Server (NTRS)
Johnson, Kenneth L.; White, K, Preston, Jr.
2012-01-01
The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques. This recommended procedure would be used as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. This document contains the outcome of the assessment.
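As an illustration of the acceptance sampling by variables idea, the following is a minimal sketch of a one-sided k-method test in Python, with hypothetical measurements and an assumed acceptability constant k (the NESC procedure specifies how the sampling plan and k are actually chosen for given producer's and consumer's risks):

import statistics

def accept_by_variables(measurements, upper_spec, k):
    # k-method, one-sided upper limit, sigma unknown:
    # accept the lot when xbar + k * s falls at or below the spec limit.
    xbar = statistics.mean(measurements)
    s = statistics.stdev(measurements)
    return xbar + k * s <= upper_spec

# Hypothetical lot of 10 measurements against an upper spec limit of 5.0.
sample = [3.1, 3.4, 2.9, 3.8, 3.3, 3.0, 3.6, 3.2, 3.5, 3.1]
print(accept_by_variables(sample, upper_spec=5.0, k=1.72))   # True -> accept

Unlike sampling by attributes, which merely counts conforming units, the variables test exploits the measured values themselves, which is what allows smaller sample sizes for the same risk levels.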
Formal Methods for Life-Critical Software
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Johnson, Sally C.
1993-01-01
The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.
Cluster man/system design requirements and verification. [for Skylab program
NASA Technical Reports Server (NTRS)
Watters, H. H.
1974-01-01
Discussion of the procedures employed for determining the man/system requirements that guided Skylab design, and review of the techniques used for implementing the man/system design verification. The foremost lesson learned from this experience of anticipating design needs and verifying the design is the necessity of allowing for the human capability for in-flight maintenance and repair. It is now known that the entire program was salvaged by a series of unplanned maintenance and repair events, which were carried out in spite of poor design provisions for maintenance.
NASA Astrophysics Data System (ADS)
Lindstrom, D.; Allen, C.
One of the strong scientific reasons for returning samples from Mars is to search for evidence of current or past life in the samples. Because of the remote possibility that the samples may contain life forms that are hazardous to the terrestrial biosphere, the National Research Council has recommended that all samples returned from Mars be kept under strict biological containment until tests show that they can safely be released to other laboratories. It is possible that Mars samples may contain only scarce or subtle traces of life or prebiotic chemistry that could readily be overwhelmed by terrestrial contamination. Thus, the facilities used to contain, process, and analyze samples from Mars must have a combination of high-level biocontainment and organic/inorganic chemical cleanliness that is unprecedented. We have been conducting feasibility studies and developing designs for a facility that would be at least as capable as current maximum containment BSL-4 (BioSafety Level 4) laboratories, while simultaneously maintaining cleanliness levels exceeding those of the cleanest electronics manufacturing labs. Unique requirements for the processing of Mars samples have inspired a program to develop handling techniques that are much more precise and reliable than the approach (currently used for lunar samples) of employing gloved human hands in nitrogen-filled gloveboxes. Individual samples from Mars are expected to be much smaller than lunar samples, the total mass of samples returned by each mission being 0.5-1 kg, compared with many tens of kg of lunar samples returned by each of the six Apollo missions. Smaller samples require much more of the processing to be done under microscopic observation. In addition, the requirements for cleanliness and high-level containment would be difficult to satisfy while using traditional gloveboxes. JSC has constructed a laboratory to test concepts and technologies important to future sample curation. The Advanced Curation Laboratory includes a new-generation glovebox equipped with a robotic arm to evaluate the usability of robotic and teleoperated systems to perform curatorial tasks. The laboratory also contains equipment for precision cleaning and the measurement of trace organic contamination.
Hotoda, S; Aoyama, T; Sato, A; Yamamura, Y; Nakajima, K; Nakamura, K; Sato, H; Iga, T
1999-12-01
We quantitatively studied factors influencing the environmental cleanliness for intravenous hyperalimentation (IVH) admixing. The environmental cleanliness was evaluated by measuring the counts of particles (> 0.5 micron) and bacteria floating in 1 ft3 of the air inside the clean room (23.6 m3) and in the clean bench built in the department of pharmacy, The University of Tokyo Hospital, in 1998. The number of particles at the center of the clean room during IVH admixing by 4 pharmacists was lower than that at the medicine passing area (150 +/- 50/ft3 vs. 260 +/- 60/ft3; mean +/- S.D., n = 12). The cleanliness inside the clean room improved as the measurement point became higher above the floor (600 +/- 180/ft3, 150 +/- 50/ft3, and 35 +/- 15/ft3 at 50, 100, and 150 cm height, respectively) and as the number of persons working inside the room decreased. The changes in the counts of floating bacteria were similar to those of floating particles under the same conditions. In addition, the effect of disinfection on the counts of bacteria was clearly observed. When the cleanliness of the room was lowered by turning off the air conditioning, the particle counts inside the clean bench decreased with increasing distance behind the front glass (i.e., 1400 +/- 550/ft3, 140 +/- 70/ft3, and 40 +/- 30/ft3 at 0, 5, and 15 cm, respectively). From these lines of evidence, the following measures were suggested in order to maintain the environmental cleanliness for IVH admixing. First, the number of persons residing in the clean room should be kept to a minimum. Second, the clean bench should be set up in the center of the clean room. Finally, IVH admixing should be performed at more than 15 cm depth behind the front glass surface of the clean bench. Moreover, the effect of mopping the clean room with 0.1% benzethonium chloride clearly demonstrated the importance of disinfection on a routine basis.
Numerical Modeling of Inclusion Behavior in Liquid Metal Processing
NASA Astrophysics Data System (ADS)
Bellot, Jean-Pierre; Descotes, Vincent; Jardy, Alain
2013-09-01
The thermomechanical performance of metallic alloys is directly related to metal cleanliness, which has always been a challenge for metallurgists. During liquid metal processing, particles can grow or shrink either by mass transfer with the liquid phase or by agglomeration/fragmentation mechanisms. Depending on the number density of inclusions and on the hydrodynamics of the reactor, different numerical modeling approaches are proposed: in the case of an isolated particle, the Lagrangian technique coupled with a dissolution model is applied, whereas in the opposite case of large inclusion phase concentration, the population balance equation must be solved. Three examples of numerical modeling studies carried out at Institut Jean Lamour are discussed. They illustrate the application of the Lagrangian technique (for an isolated exogenous inclusion in a titanium bath) and the Eulerian technique without and with the aggregation process: for precipitation and growth of inclusions at the solidification front of a Maraging steel, and for endogenous inclusions in the molten steel bath of a gas-stirred ladle, respectively.
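To make the Lagrangian-plus-dissolution case concrete, here is a minimal sketch in Python with assumed property values and a toy velocity field standing in for the bath hydrodynamics; it is illustrative only and is not the Institut Jean Lamour model:

import numpy as np

# One isolated inclusion is advected by a prescribed liquid velocity while
# its diameter shrinks through a first-order mass-transfer (dissolution) law:
#   dm/dt = -k_m * A * (c_sat - c_bulk), with m = rho_p * pi * d^3 / 6,
# which reduces to dd/dt = -2 * k_m * (c_sat - c_bulk) / rho_p.
rho_p = 4000.0   # inclusion density, kg/m^3      (assumed)
k_m   = 1e-4     # mass-transfer coefficient, m/s (assumed)
dc    = 50.0     # c_sat - c_bulk, kg/m^3         (assumed)

def liquid_velocity(x):
    # Toy recirculating flow standing in for the reactor hydrodynamics.
    return np.array([0.05 * np.cos(x[1]), -0.05 * np.sin(x[0])])

x = np.array([0.1, 0.2])   # initial position, m
d = 200e-6                 # initial diameter, m
t, dt = 0.0, 0.01
while d > 1e-6:
    x = x + dt * liquid_velocity(x)        # advect with the melt
    d = d - dt * 2.0 * k_m * dc / rho_p    # shrink by dissolution
    t += dt
print(f"inclusion dissolved after ~{t:.0f} s near position {x}")

In the opposite regime of dense inclusion populations, each such trajectory would be replaced by number-density fields evolved through the population balance equation.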
A study of trends and techniques for space base electronics
NASA Technical Reports Server (NTRS)
Trotter, J. D.; Wade, T. E.; Gassaway, J. D.; Mahmood, Q.
1978-01-01
A sputtering system was developed to deposit aluminum and aluminum alloys by the dc sputtering technique. This system is designed for a high level of cleanliness and for monitoring the deposition parameters during film preparation, and is now ready for studying the effects of deposition and annealing parameters upon double-level metal preparation. A technique recently applied to semiconductor analysis, the finite element method, was studied for use in the computer modeling of two-dimensional MOS transistor structures. It was concluded that the method has not been sufficiently well developed for confident use at this time. An algorithm was therefore developed for implementing a computer study based upon the finite difference method. The program which was developed was modified and used to calculate redistribution data for boron and phosphorus which had been predeposited by ion implantation with given range and straggle conditions. Data were generated for (111)-oriented SOS films with redistribution in N2, dry O2 and steam ambients.
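A minimal sketch of such a finite-difference redistribution calculation is given below; the implant and anneal parameters are hypothetical placeholders, not the study's actual conditions:

import numpy as np

# An ion-implanted Gaussian profile (projected range Rp, straggle dRp) is
# diffused with an explicit scheme for dC/dt = D * d2C/dx2, as in a
# redistribution (drive-in) anneal.
D   = 1e-14 * 1e8          # diffusivity: 1e-14 cm^2/s converted to um^2/s (assumed)
Rp, dRp, dose = 0.05, 0.02, 1e14   # um, um, atoms/cm^2 (assumed)

nx, dx = 400, 0.005        # grid: 2 um deep in 5 nm steps
x = np.arange(nx) * dx
C = dose / (np.sqrt(2 * np.pi) * dRp * 1e-4) * np.exp(-(x - Rp)**2 / (2 * dRp**2))

dt = 0.4 * dx**2 / D       # respect the explicit stability limit dt <= dx^2/(2D)
for _ in range(3000):      # roughly an 8-hour anneal at these assumed values
    lap = np.empty_like(C)
    lap[1:-1] = C[2:] - 2 * C[1:-1] + C[:-2]
    lap[0], lap[-1] = lap[1], lap[-2]      # crude zero-flux boundaries
    C += D * dt / dx**2 * lap
print(f"peak concentration after anneal: {C.max():.3e} atoms/cm^3")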
Aiemjoy, Kristen; Stoller, Nicole E; Gebresillasie, Sintayehu; Shiferaw, Ayalew; Tadesse, Zerihun; Sewnet, Tegene; Ayele, Bezuayehu; Chanyalew, Melsew; Callahan, Kelly; Stewart, Aisha; Emerson, Paul M; Lietman, Thomas M; Keenan, Jeremy D; Oldenburg, Catherine E
2016-10-01
Face cleanliness is a core component of the SAFE (Surgery, Antibiotics, Facial cleanliness, and Environmental improvements) strategy for trachoma control. Understanding knowledge, attitudes, and behaviors related to face washing may be helpful for designing effective interventions for improving facial cleanliness. In April 2014, a mixed-methods study including focus groups and a quantitative cross-sectional survey was conducted in the East Gojjam zone of the Amhara region of Ethiopia. Participants were asked about face washing practices, motivations for face washing, use of soap (which may reduce bacterial load), and fly control strategies. Overall, both knowledge and reported practice of face washing were high. Participants reported they knew that washing their own face and their children's faces daily was important for hygiene and infection control. Although participants reported high knowledge of the importance of soap for face washing, quantitative data revealed strong variation by community in the use of soap for face washing, ranging from 4.4% to 82.2% of households. Cost and forgetfulness were cited as barriers to the use of soap for face washing. Keeping flies from landing on children was a commonly cited motivator for regular face washing, as was trachoma prevention. Interventions aiming to improve facial cleanliness for trachoma prevention should focus on habit formation (to address forgetfulness) and address barriers to the use of soap, such as reducing cost. Interventions that focus solely on improving knowledge may not be effective in changing face-washing behaviors.
Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey
2010-09-01
Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swart, Peter K.; Dixon, Tim
2014-09-30
A series of surface geophysical and geochemical techniques are tested in order to demonstrate and validate low-cost approaches for Monitoring, Verification and Accounting (MVA) of the integrity of deep reservoirs for CO2 storage. These techniques are (i) surface deformation by GPS; (ii) surface deformation by InSAR; (iii) passive source seismology via broadband seismometers; and (iv) soil gas monitoring with a cavity ring-down spectrometer for measurement of CO2 concentration and carbon isotope ratio. The techniques were tested at an active EOR (Enhanced Oil Recovery) site in Texas. Each approach has demonstrated utility. Assuming Carbon Capture, Utilization and Storage (CCUS) activities become operational in the future, these techniques can be used to augment more expensive down-hole techniques.
The use of positron emission tomography in pion radiotherapy.
Goodman, G B; Lam, G K; Harrison, R W; Bergstrom, M; Martin, W R; Pate, B D
1986-10-01
The radioactive debris produced by pion radiotherapy can be imaged by the technique of Positron Emission Tomography (PET) as a method of non-invasive, in situ verification of the pion treatment. This paper presents the first visualization of the pion stopping distribution within a tumor in a human brain using PET. Together with the tissue functional information provided by standard PET scans using radiopharmaceuticals, the combination of pion therapy with the PET technique can provide a much better form of radiotherapy than conventional radiation, in both treatment planning and verification.
A High-Level Language for Modeling Algorithms and Their Properties
NASA Astrophysics Data System (ADS)
Akhtar, Sabina; Merz, Stephan; Quinson, Martin
Designers of concurrent and distributed algorithms usually express them using pseudo-code. In contrast, most verification techniques are based on more mathematically oriented formalisms such as state transition systems. This conceptual gap contributes to hindering the use of formal verification techniques. Leslie Lamport introduced PlusCal, a high-level algorithmic language that has the "look and feel" of pseudo-code but is equipped with a precise semantics and includes a high-level expression language based on set theory. PlusCal models can be compiled to TLA+ and verified using the model checker tlc.
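The check that tlc performs can be conveyed in miniature by an explicit-state search. The sketch below is a toy invariant check over a hand-written transition system in Python; it is not PlusCal or tlc, only the underlying idea:

from collections import deque

# Breadth-first enumeration of all reachable states of a transition system,
# asserting an invariant in each one. The toy system is two processes
# competing for a single lock; the invariant is mutual exclusion.
def initial_states():
    return [("idle", "idle", None)]        # (pc0, pc1, lock holder)

def next_states(s):
    pcs, lock = list(s[:2]), s[2]
    for i in range(2):
        if pcs[i] == "idle" and lock is None:      # acquire the lock
            t = pcs.copy(); t[i] = "critical"
            yield (t[0], t[1], i)
        elif pcs[i] == "critical":                 # release the lock
            t = pcs.copy(); t[i] = "idle"
            yield (t[0], t[1], None)

def invariant(s):
    return not (s[0] == "critical" and s[1] == "critical")

seen, frontier = set(), deque(initial_states())
while frontier:
    s = frontier.popleft()
    if s in seen:
        continue
    seen.add(s)
    assert invariant(s), f"invariant violated in {s}"
    frontier.extend(next_states(s))
print(f"invariant holds in all {len(seen)} reachable states")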
Peer Review of a Formal Verification/Design Proof Methodology
NASA Technical Reports Server (NTRS)
1983-01-01
The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance proving tools were reviewed, and the technical issues related to proof methodologies were examined and summarized.
Fuzzy Logic Controller Stability Analysis Using a Satisfiability Modulo Theories Approach
NASA Technical Reports Server (NTRS)
Arnett, Timothy; Cook, Brandon; Clark, Matthew A.; Rattan, Kuldip
2017-01-01
While many widely accepted methods and techniques exist for validation and verification of traditional controllers, at this time no solutions have been accepted for Fuzzy Logic Controllers (FLCs). Due to the highly nonlinear nature of such systems, and the fact that developing a valid FLC does not require a mathematical model of the system, it is quite difficult to use conventional techniques to prove controller stability. Since safety-critical systems must be tested and verified to work as expected under all possible circumstances, the fact that FLCs cannot be tested to such requirements limits the applications of this technology. Therefore, alternative methods for verification and validation of FLCs need to be explored. In this study, a novel approach using formal verification methods to ensure the stability of an FLC is proposed. The main research challenges include specification of requirements for a complex system, conversion of a traditional FLC to a piecewise polynomial representation, and use of a formal verification tool in a nonlinear solution space. Using the proposed architecture, the Fuzzy Logic Controller was found to always generate negative feedback, but the Lyapunov stability analysis was inconclusive.
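A minimal sketch of this style of formal check, using the z3 SMT solver's Python bindings, is shown below. The piecewise-linear controller is an invented stand-in for the paper's piecewise polynomial FLC representation, and the property checked is the negative-feedback condition e * u(e) <= 0:

from z3 import Real, Solver, And, Or, sat

e, u = Real("e"), Real("u")

# Hypothetical piecewise-linear rule surface on e in [-1, 1].
controller = Or(
    And(e >= -1,   e < -0.5,  u == 0.25 - 2 * (e + 0.5)),   # steep outer piece
    And(e >= -0.5, e <= 0.5,  u == -0.5 * e),               # gentle inner piece
    And(e > 0.5,   e <= 1,    u == -0.25 - 2 * (e - 0.5)),  # steep outer piece
)

s = Solver()
s.add(controller, e * u > 0)   # ask for any input where feedback is positive
if s.check() == sat:
    print("counterexample:", s.model())
else:
    print("verified: e * u(e) <= 0 for every e in [-1, 1]")

An "unsat" answer is a proof over the continuous input range, which is exactly what exhaustive testing of an FLC cannot provide.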
Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop
NASA Technical Reports Server (NTRS)
Rozier, Kristin Yvonne (Editor)
2008-01-01
Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.
NASA Astrophysics Data System (ADS)
Wentworth, Mami Tonoe
Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models and measurements, and to propagate these uncertainties through the model, so that one can make predictive estimates with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined from the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model. We employ this simple heat model to illustrate verification techniques for model calibration. For Bayesian model calibration, we employ adaptive Metropolis algorithms to construct densities for input parameters in the heat model and the HIV model. To quantify the uncertainty in the parameters, we employ two MCMC algorithms: Delayed Rejection Adaptive Metropolis (DRAM) [33] and Differential Evolution Adaptive Metropolis (DREAM) [66, 68]. The densities obtained using these methods are compared to those obtained through direct numerical evaluation of Bayes' formula. We also combine uncertainties in input parameters and measurement errors to construct predictive estimates for a model response. A significant emphasis is on the development and illustration of techniques to verify the accuracy of sampling-based Metropolis algorithms. We verify the accuracy of DRAM and DREAM by comparing chains, densities and correlations obtained using DRAM, DREAM and the direct evaluation of Bayes' formula. We also perform similar analyses for credible and prediction intervals for responses. Once the parameters are estimated, we employ the energy statistics test [63, 64] to compare the densities obtained by different methods for the HIV model. The energy statistics are used to test the equality of distributions. We also consider parameter selection and verification techniques for models having one or more parameters that are noninfluential in the sense that they minimally impact model outputs.
We illustrate these techniques for a dynamic HIV model but note that the parameter selection and verification framework is applicable to a wide range of biological and physical models. To accommodate the nonlinear input-to-output relations, which are typical for such models, we focus on global sensitivity analysis techniques, including those based on partial correlations, Sobol indices based on second-order model representations, and Morris indices, as well as a parameter selection technique based on standard errors. A significant objective is to provide verification strategies to assess the accuracy of those techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection techniques. The objective of active subspace methods is to determine the subspace of inputs that most strongly affects the model response, and to reduce the dimension of the input space. The major difference between active subspace methods and parameter selection techniques is that parameter selection identifies influential parameters, whereas subspace selection identifies a linear combination of parameters that significantly impacts the model responses. We employ the active subspace methods discussed in [22] for the HIV model and present a verification that the active subspace successfully reduces the input dimensions.
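The flavor of the verification strategy can be conveyed with a toy example: a plain random-walk Metropolis chain (without DRAM/DREAM's adaptation) for a one-parameter model, compared against a direct numerical evaluation of Bayes' formula on a grid. All data below are synthetic:

import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(2.0, 1.0, size=50)           # synthetic data, true mean 2.0

def log_post(theta):
    return -0.5 * np.sum((y - theta) ** 2)  # Gaussian likelihood, flat prior

# Random-walk Metropolis sampler.
chain, theta = [], 0.0
for _ in range(20000):
    prop = theta + rng.normal(0, 0.3)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
chain = np.array(chain[2000:])              # discard burn-in

# Direct evaluation of Bayes' formula on a grid, for verification.
grid = np.linspace(1.0, 3.0, 400)
lp = np.array([log_post(t) for t in grid])
dens = np.exp(lp - lp.max())
dens /= dens.sum() * (grid[1] - grid[0])
direct_mean = (grid * dens * (grid[1] - grid[0])).sum()

print(f"posterior mean: MCMC {chain.mean():.3f} vs direct {direct_mean:.3f}")

Agreement between the sampled and directly evaluated posteriors, in means, densities and interval estimates, is the kind of check the dissertation formalizes for DRAM and DREAM.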
Neutron spectrometry for UF6 enrichment verification in storage cylinders
Mengesha, Wondwosen; Kiff, Scott D.
2015-01-29
Verification of declared UF6 enrichment and mass in storage cylinders is of great interest in nuclear material nonproliferation. Nondestructive assay (NDA) techniques are commonly used for safeguards inspections to ensure accountancy of declared nuclear materials. Common NDA techniques include gamma-ray spectrometry and both passive and active neutron measurements. In the present study, neutron spectrometry was investigated for verification of UF6 enrichment in 30B storage cylinders based on an unattended and passive measurement approach. MCNP5 and Geant4 simulated neutron spectra, for selected UF6 enrichments and filling profiles, were used in the investigation. The simulated neutron spectra were analyzed using principal component analysis (PCA). PCA is a well-established technique with a wide area of application, including feature analysis, outlier detection, and gamma-ray spectral analysis. The results demonstrate that neutron spectrometry supported by spectral feature analysis has potential for assaying UF6 enrichment in storage cylinders. The results also show that difficulties associated with the UF6 filling profile, observed in other unattended passive neutron measurements, can possibly be overcome using the approach presented.
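A minimal sketch of the spectral feature analysis follows, with synthetic spectra standing in for the MCNP5/Geant4 simulations; the spectral shapes and enrichment values are illustrative placeholders only:

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
energy = np.linspace(0.1, 10.0, 128)       # arbitrary energy binning

def toy_spectrum(enrichment):
    # Hypothetical shape whose fast/thermal balance shifts with enrichment.
    fast = np.exp(-(energy - 2.0) ** 2)
    thermal = np.exp(-energy)
    return (enrichment * fast + (1 - enrichment) * thermal
            + rng.normal(0, 0.01, energy.size))   # counting noise

enrichments = np.repeat([0.007, 0.03, 0.05], 20)
X = np.array([toy_spectrum(e) for e in enrichments])

pca = PCA(n_components=2)
scores = pca.fit_transform(X)
print("explained variance ratios:", pca.explained_variance_ratio_)
for e in (0.007, 0.03, 0.05):
    print(f"enrichment {e:.3f}: mean PC1 score {scores[enrichments == e, 0].mean():+.3f}")

Separation of the score clusters along the leading components is what would let an unattended system flag a cylinder whose spectrum departs from its declared enrichment or fill profile.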
Authoring and verification of clinical guidelines: a model driven approach.
Pérez, Beatriz; Porres, Ivan
2010-08-01
The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked against them, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined on the basis of a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. Copyright 2010 Elsevier Inc. All rights reserved.
Simulation verification techniques study: Simulation self test hardware design and techniques report
NASA Technical Reports Server (NTRS)
1974-01-01
The final results of the hardware verification task are presented. The basic objectives of the various subtasks are reviewed, along with the ground rules under which the overall task was conducted and which shaped the approach taken in deriving techniques for hardware self test. The results of the first subtask, the definition of simulation hardware, are presented. The hardware definition is based primarily on a brief review of the simulator configurations anticipated for the shuttle training program. The results of the survey of current self-test techniques are presented. The data sources that were considered in the search for current techniques are reviewed, and the results of the survey are presented in terms of the specific types of tests that are of interest for training simulator applications: readiness tests, fault isolation tests and incipient fault detection techniques. The most applicable techniques were structured into software flows that are then referenced in discussions of techniques for specific subsystems.
NASA Astrophysics Data System (ADS)
Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard
2006-05-01
A GMM-based audio-visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing are accomplished on DCT-based features extracted from the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM-based classifier. Fusion of both audio and video modalities for audio-visual speaker verification is compared with face verification and speaker verification systems alone. To improve the robustness of the multimodal biometric identity verification system, an audio-visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM-based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with a prospect of experimenting on the PDAtabase newly developed within the scope of the SecurePhone project.
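A minimal sketch of the GMM verification step is shown below, with scikit-learn's GaussianMixture standing in for BECARS and synthetic vectors standing in for the DCT/speech features; the decision statistic is the mean log-likelihood ratio between a client model and a background ("world") model:

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
client_train = rng.normal(0.0, 1.0, size=(500, 12))    # synthetic features
world_train  = rng.normal(0.5, 1.5, size=(2000, 12))

client_gmm = GaussianMixture(n_components=8, random_state=0).fit(client_train)
world_gmm  = GaussianMixture(n_components=8, random_state=0).fit(world_train)

def verify(features, threshold=0.0):
    # score() returns the mean per-sample log-likelihood under each model.
    llr = client_gmm.score(features) - world_gmm.score(features)
    return llr, llr > threshold

genuine  = rng.normal(0.0, 1.0, size=(200, 12))
impostor = rng.normal(0.5, 1.5, size=(200, 12))
print("genuine :", verify(genuine))
print("impostor:", verify(impostor))

A voice-transformation attack of the kind envisioned above succeeds exactly when it drags the impostor's log-likelihood ratio over the threshold.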
Formulating face verification with semidefinite programming.
Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S
2007-11-01
This paper presents a unified solution to three unsolved problems existing in face verification with subspace learning techniques: selection of verification threshold, automatic determination of subspace dimension, and deducing feature fusing weights. In contrast to previous algorithms which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). With a certain verification threshold, this matrix is learned by a semidefinite programming approach, along with the constraints of the kindred pairs with similarity larger than the threshold, and inhomogeneous pairs with similarity smaller than the threshold. Then, the subspace dimension and the feature fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, the weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.
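A much-simplified sketch of learning a similarity metric matrix by semidefinite programming is given below, using cvxpy with synthetic pairs; the paper's actual formulation, weighting, and tensor extensions are not reproduced:

import cvxpy as cp
import numpy as np

rng = np.random.default_rng(3)
d, b = 8, 1.0   # feature dimension and verification threshold (assumed)

# Kindred pairs: near-duplicates; inhomogeneous pairs: independent vectors.
kin   = [(v, v + 0.1 * rng.normal(size=d)) for v in rng.normal(size=(30, d))]
inhom = [(rng.normal(size=d), rng.normal(size=d)) for _ in range(30)]

M = cp.Variable((d, d), PSD=True)      # the similarity metric matrix
margin = cp.Variable()
cons  = [x @ M @ y >= b + margin for x, y in kin]     # s(x, y) = x^T M y
cons += [x @ M @ y <= b - margin for x, y in inhom]
cons += [cp.norm(M, "fro") <= 10]      # keep the problem bounded
cp.Problem(cp.Maximize(margin), cons).solve()
print("margin around the threshold:", margin.value)

Because the similarity is linear in M, both families of pair constraints are linear and the PSD requirement makes the whole problem a semidefinite program; the subspace dimension and fusing weights would then be read off the learned matrix's singular value decomposition, as the abstract describes.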
Megasonic cleaning strategy for sub-10nm photomasks
NASA Astrophysics Data System (ADS)
Hsu, Jyh-Wei; Samayoa, Martin; Dress, Peter; Dietze, Uwe; Ma, Ai-Jay; Lin, Chia-Shih; Lai, Rick; Chang, Peter; Tuo, Laurent
2016-10-01
One of the main challenges in photomask cleaning is balancing particle removal efficiency (PRE) with pattern damage control. To overcome this challenge, a high frequency megasonic cleaning strategy is implemented. Apart from megasonic frequency and power, photomask surface conditioning also influences cleaning performance. With improved wettability, cleanliness is enhanced while pattern damage risk is simultaneously reduced. Therefore, a particle removal process based on higher megasonic frequencies, combined with proper surface pre-treatment, provides improved cleanliness without the unintended side effects of pattern damage, thus supporting the extension of megasonic cleaning technology into 10nm half pitch (hp) device node and beyond.
NASA Technical Reports Server (NTRS)
Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.
1994-01-01
NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 sq m. Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging/diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 °C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/sq ft of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 sq m.
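The TOC-to-NVR correlation step can be illustrated with a minimal sketch; the sensitivity factors below are assumed placeholders, not the measured values reported in the paper:

# Estimate surface NVR from the impingement sample's TOC reading using a
# contaminant-specific sensitivity factor (ppm carbon per mg/sq ft).
sensitivity = {
    "hydraulic fluid": 0.80,   # assumed value
    "cutting oil":     0.65,   # assumed value
}

def nvr_level(toc_ppm_carbon, contaminant):
    # NVR in mg/sq ft = TOC reading divided by the sensitivity factor.
    return toc_ppm_carbon / sensitivity[contaminant]

print(f"{nvr_level(0.52, 'hydraulic fluid'):.2f} mg/sq ft")

The practical consequence is that the verification reduces to a single TOC measurement of a small water sample in place of a gravimetric CFC-113 residue determination.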
[Cleanliness Norms 1964-1975].
Noelle-Neumann, E
1976-01-01
In 1964 the Institut für Demoskopie Allensbach made a first survey taking stock of norms concerning cleanliness in the Federal Republic of Germany. At that time, 78% of respondents thought that the vogue among young people of cultivating an unkempt look was past or on the wane (Table 1). Today we know that this fashion was an indicator of more serious desires for change in many different areas like politics, sexual morality, and education, and that its high point was still to come. In the fall of 1975 a second survey, modelled on the one of 1964, was conducted. Again, it concentrated on norms, not on behavior. As expected, norms have changed over this period, but not in a one-directional or simple manner. In general, people are much more lenient about children's looks: neat, clean school dress, properly combed hair, clean shoes, all this and also keeping their things in order has become less important in 1975 (Table 2). To carry a clean handkerchief is becoming old-fashioned (Table 3). On the other hand, principles of bringing up children have not loosened concerning personal hygiene - brushing one's teeth, washing hands, feet, and neck, clean fingernails (Table 4). On one item related to protection of the environment, namely throwing around waste paper, standards have even become more strict (Table 5). With regard to school-leavers, norms of personal hygiene have generally become more strict (Table 6). As living standards have gone up and the number of full bathrooms has risen from 42% to 75% of households, norms of personal hygiene have also risen: one warm bath a week seemed enough to 56% of adults in 1964, but to only 32% in 1975 (Table 7). Standards for changing underwear have also changed a lot: in 1964 only 12% of respondents said "every day", while in 1975 48% said so (Table 8). Even more stringent norms are applied to young women (Tables 9/10). For comparison: in 1964 there were automatic washing machines in 16% of households, in 1975 in 79%. Answers to questions about which qualities men value especially in women and which qualities women value especially in men show a decrease in the valuation of "cleanliness". These results can be interpreted in different ways (Tables 11/12). It seems, however, that "cleanliness" is not going out of fashion as a cultural value. We have found that young people today do not consider clean dress important, but that they are probably better washed under their purposely neglected clothing than young people were ten years ago. As a nation, Germans still consider cleanliness to be a particularly German virtue, in 1975 even more so than in 1964 (Table 13). An association test, first made in March 1976, confirms this: when they hear "Germany", 68% of Germans think of "cleanliness" (Table 14).
SPICE SDM: Innovative Approaches for Linear Motion and Heat Management
NASA Astrophysics Data System (ADS)
Relecom, Ken; Larcheveque, Cyril; Constant, Joël; Autissier, Nordahl; Pornin, Arnaud; Martini, Nicolas
2015-09-01
The SPICE Door Mechanism (SDM) is foreseen to be flown on Solar Orbiter, to close the SPICE instrument aperture and shield it from the solar flux and from contamination. The environment it is exposed to is particularly extreme, as the Solar Orbiter mission will approach to within 0.28 AU (41,887,403.8 km) of the Sun, and the SPICE instrument will be looking directly at it. Because of its position at the far end of a cantilevered structure, the SDM is also exposed to amplified launch loads and must remain very light and compact. The cleanliness constraints are also very tight, as the mechanism is positioned directly at the aperture of the SPICE spectrometer. To tackle these issues, two novelties were introduced on the SPICE Door Mechanism: a specifically engineered reflective coating to protect the aluminium door from the heat generated by the solar flux, and the use of miniature profile-rail type linear bearings to support the door during launch and allow its motion during the mission. This paper details the design and verification approach applied to these two innovations and to the mechanism as a whole, as well as the results and findings from the testing carried out on the Bread Board, Qualification and Flight models.
Sabet, Mahsheed; O'Connor, Daryl J.; Greer, Peter B.
2011-01-01
Several manual, semi-automatic and fully-automatic methods have been proposed for verification of the position of the mechanical isocenter as part of the comprehensive quality assurance programs required for linear accelerator-based stereotactic radiosurgery/radiotherapy (SRS/SRT) treatments. In this paper, a systematic review is carried out to discuss the present methods for isocenter verification and compare their characteristics, to help physicists decide on the selection of their quality assurance routine. PMID:22089022
Experimental preparation and verification of quantum money
NASA Astrophysics Data System (ADS)
Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei
2018-03-01
A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10^6 states in one verification round, limiting the forging probability to 10^-7 based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.
Formal verification of automated teller machine systems using SPIN
NASA Astrophysics Data System (ADS)
Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam
2017-08-01
Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of the Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formula. This model checker accepts models written in Process Meta Language (PROMELA), and its specifications are specified in LTL formulas.
Static and Dynamic Verification of Critical Software for Space Applications
NASA Astrophysics Data System (ADS)
Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.
Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring its correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. But, traditionally, these have been applied in isolation. One of the main reasons is the immaturity of this field as concerns its application to the growing software content of space systems. This paper presents an innovative way of combining static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA, and the Xception tool for fault injection. Keywords: Verification & Validation, RAMS, Onboard software, SFMEA, SFTA, Fault-injection. (This work is being performed under the project STADY Applied Static And Dynamic Verification Of Critical Software, ESA/ESTEC Contract Nr. 15751/02/NL/LvH.)
Verification of Space Weather Forecasts using Terrestrial Weather Approaches
NASA Astrophysics Data System (ADS)
Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.
2015-12-01
The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times and assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modeling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the ranked probability skill score, and comparison of forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete-time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help MOSWOC forecasters view verification results in near real time; plans to objectively assess flare forecasts under the EU Horizon 2020 FLARECAST project; and summarise ISES efforts to achieve consensus on verification.
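For the contingency-table part of this analysis, a minimal sketch with illustrative counts (not MOSWOC results) shows how the standard skill measures fall out of a 2x2 table:

# 2x2 contingency-table verification for event forecasts (counts invented).
hits, misses, false_alarms, correct_neg = 18, 7, 9, 41

pod = hits / (hits + misses)                  # probability of detection
far = false_alarms / (hits + false_alarms)    # false alarm ratio
csi = hits / (hits + misses + false_alarms)   # critical success index

n = hits + misses + false_alarms + correct_neg
expected = ((hits + misses) * (hits + false_alarms)
            + (correct_neg + misses) * (correct_neg + false_alarms)) / n
hss = (hits + correct_neg - expected) / (n - expected)   # Heidke skill score

print(f"POD {pod:.2f}  FAR {far:.2f}  CSI {csi:.2f}  HSS {hss:.2f}")

The probabilistic products need different scores (e.g., the ranked probability skill score against climatology), which is why the categorical 2x2 treatment is reserved for the CME arrival-time forecasts.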
Apparel for Cleaner Clean Rooms
NASA Technical Reports Server (NTRS)
1983-01-01
In the 1960s NASA pioneered contamination control technology, providing a base from which aerospace contractors could develop control measures. NASA conducted special courses for clean room technicians and supervisors, and published a series of handbooks with input from various NASA field centers. These handbooks extended aerospace experience to the medical, pharmaceutical, electronics, and other industries where extreme cleanliness is important. American Hospital Supply Company (AHSC) felt that high technology products with increasingly stringent operating requirements in aerospace, electronics, pharmaceuticals and medical equipment manufacturing demanded improvement in contamination control techniques. After studying the NASA handbooks and visiting NASA facilities, the wealth of information gathered resulted in Micro-clean non-woven garments and testing equipment and procedures for evaluating effectiveness.
Bubble colloidal AFM probes formed from ultrasonically generated bubbles.
Vakarelski, Ivan U; Lee, Judy; Dagastine, Raymond R; Chan, Derek Y C; Stevens, Geoffrey W; Grieser, Franz
2008-02-05
Here we introduce a simple and effective experimental approach to measuring the interaction forces between two small bubbles (approximately 80-140 μm) in aqueous solution during controlled collisions on the scale of micrometers to nanometers. The colloidal probe technique using atomic force microscopy (AFM) was extended to measure interaction forces between a cantilever-attached bubble and surface-attached bubbles of various sizes. By using an ultrasonic source, we generated numerous small bubbles on a mildly hydrophobic surface of a glass slide. A single bubble picked up with a strongly hydrophobized V-shaped cantilever was used as the colloidal probe. Sample force measurements were used to evaluate the cleanliness of the pure-water bubble system and the general consistency of the measurements.
Using ICT techniques for improving mechatronic systems' dependability
NASA Astrophysics Data System (ADS)
Miron, Emanuel; Silva, João P. M. A.; Machado, José; Olaru, Dumitru; Prisacaru, Gheorghe
2013-10-01
The use of analysis techniques such as simulation and formal verification for industrial controllers is complex in an industrial context. This complexity is due to the fact that such techniques sometimes require a high investment in skilled human resources with sufficient theoretical knowledge of those domains. This paper aims, mainly, to show that it is possible to obtain a timed automata model, for formal verification purposes, from the CAD model of a mechanical component. This systematic approach can be used by companies for the analysis of industrial controller programs. For this purpose, the paper discusses the best way to systematize these procedures; it describes only the first step of a complex process and promotes a discussion of the main difficulties that may be encountered and a possible way to handle them. A library for formal verification purposes is obtained from original 3D CAD models using a Software as a Service (SaaS) platform, which nowadays has become a common delivery model for many applications because SaaS is typically accessed by users via the internet.
Automated Analysis of Stateflow Models
NASA Technical Reports Server (NTRS)
Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsai, Temesghen; Thirioux, Xavier
2017-01-01
Stateflow is a widely used modeling framework for embedded and cyber-physical systems where control software interacts with physical processes. In this work, we present a fully automated safety verification technique for Stateflow models. Our approach is twofold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open-source toolbox that can be integrated into the existing MathWorks Simulink/Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.
Foo Kune, Denis [Saint Paul, MN; Mahadevan, Karthikeyan [Mountain View, CA
2011-01-25
A recursive verification protocol that reduces the time variance due to delays in the network, by putting the subject node at most one hop from the verifier node, provides an efficient means of testing wireless sensor nodes. Since the software signatures are time based, recursive testing gives a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, continuing this process until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, having a node tested twice, or not at all, can be avoided.
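A minimal sketch of the recursive one-hop idea follows, with a hypothetical network topology and a stand-in attestation check in place of the time-based software signatures:

# The main verifier attests its neighbor, which attests its own neighbor,
# and so on; downstream verification halts at a failed node.
network = {                    # adjacency list of a small sensor network
    "verifier": ["A"],
    "A": ["B"],
    "B": ["C"],
    "C": [],
}
compromised = {"B"}            # nodes whose signature check would fail

def attest(node):
    # Stand-in for the time-based software signature check (always one hop).
    return node not in compromised

def verify_chain(start):
    verified, frontier = [], list(network[start])
    while frontier:
        node = frontier.pop(0)
        if not attest(node):
            print(f"{node} failed; downstream verification halted")
            break              # an alternative path would be sought here
        verified.append(node)
        frontier.extend(network[node])
    return verified

print("verified nodes:", verify_chain("verifier"))

Keeping every attestation to a single hop is what bounds the network-delay variance and keeps the time-based signatures clean.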
Quinn, James; Lovasi, Gina; Bader, Michael; Yousefzadeh, Paulette; Weiss, Christopher; Neckerman, Kathryn
2013-01-01
Purpose: To determine whether body mass index (BMI) is associated with proximity to neighborhood parks, the size of the parks, their cleanliness, and the availability of recreational facilities in the parks. Design: Cross-sectional. Setting: New York City. Subjects: 13,102 adults (median age 45 years, 36% male) recruited from 2000-2002. Measures: Anthropometric and socio-demographic data from study subjects were linked to Department of Parks & Recreation data on park space, cleanliness, and facilities. Neighborhood-level socio-demographic and park proximity metrics were created for half-mile-radius circular buffers around each subject's residence. Proximity to park space was measured as the proportion of the subject's neighborhood buffer area that was total park space, large park space (a park > 6 acres) and small park space (a park <= 6 acres). Analysis: Hierarchical linear models were used to determine whether neighborhood park metrics were associated with BMI. Results: Higher proximity to large park space was significantly associated with lower BMI (beta = -1.69, 95% CI = -2.76, -0.63). Across the population distribution of proximity to large park space, compared to subjects living in neighborhoods at the 10th percentile of the distribution, the covariate-adjusted average BMI was estimated to be 0.35 kg/m2 lower for those living in neighborhoods at the 90th percentile. The proportion of neighborhood area that was small park space was not associated with BMI, nor was park cleanliness or the availability of recreational facilities. Conclusions: Neighborhood proximity to large park spaces is modestly associated with lower BMI in a diverse urban population. PMID:23448416
Alam, Mahbub-Ul; Winch, Peter J; Saxton, Ronald E; Nizame, Fosiul A; Yeasmin, Farzana; Norman, Guy; Masud, Abdullah-Al; Begum, Farzana; Rahman, Mahbubur; Hossain, Kamal; Layden, Anita; Unicomb, Leanne; Luby, Stephen P
2017-08-01
Shared toilets in urban slums are often unclean and poorly maintained, discouraging consistent use and thereby limiting impacts on health and quality of life. We developed behaviour change interventions to support shared toilet maintenance and improve user satisfaction. We report the intervention's effectiveness in improving shared toilet cleanliness. We conducted a cluster-randomised controlled trial among users of 1226 shared toilets in 23 Dhaka slums. We assessed baseline toilet cleanliness in January 2015. The six-month intervention included provision of hardware (bin for solid waste, 4 L flushing bucket, 70 L water reservoir) and behaviour change communication (compound meetings, interpersonal household sessions, signs depicting rules for toilet use). We estimated the adjusted difference-in-difference (DID) to assess outcomes and accounted for clustering effects using generalised estimating equations. Compared to controls, intervention toilets were more likely to have water available inside toilet cubicles (DID: +4.7%, 95% CI: 0.2, 9.2), access to a brush/broom for cleaning (DID: +8.4%, 95% CI: 2, 15) and waste bins (DID: +63%, 95% CI: 59, 66), and less likely to have visible faeces inside the pan (DID: -13%, 95% CI: -19, -5), the smell of faeces (DID: -7.6%, 95% CI: -14, -1.3) and household waste inside the cubicle (DID: -4%, 95% CI: -7, -1). In one of few efforts to promote shared toilet cleanliness, intervention compounds were significantly more likely to have cleaner toilets after six months. Future research might explore how residents can self-finance toilet maintenance, or employ mass media to reduce the per-capita costs of behaviour change. © 2017 John Wiley & Sons Ltd.
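For readers unfamiliar with the estimator, a minimal sketch of the difference-in-difference calculation with made-up proportions (not the trial's data) follows:

# DID = (post - pre) change for intervention toilets minus the same change
# for control toilets; proportions below are illustrative only.
pre_intervention, post_intervention = 0.21, 0.86   # e.g. share with waste bins
pre_control,      post_control      = 0.20, 0.22

did = (post_intervention - pre_intervention) - (post_control - pre_control)
print(f"DID: {did:+.0%}")   # +63%, matching the scale of the waste-bin effect

The trial's adjusted estimates additionally account for covariates and for the clustering of toilets within compounds via generalised estimating equations.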
Zhou, Meng-Qi; Wang, Hao-Ming; Xiao, Jia-Qi; Hong, Jin
2016-10-01
To histologically evaluate the efficacy of sodium hypochlorite (NaClO) in combination with an Er:YAG (erbium-doped yttrium aluminium garnet) laser in dissolving necrotic tissue and cleaning root canals and canal isthmuses. After being scanned by cone-beam CT (CBCT), 50 well-prepared premolars with root canal isthmuses were selected and randomly assigned into 5 groups, which were subjected to the following regimens: group A, irrigated with 1% NaClO for 1 minute; group B, irradiated by Er:YAG laser at 0.5 W combined with 1% NaClO irrigation for 1 minute; group C, irradiated by Er:YAG laser at 1.0 W combined with 1% NaClO irrigation for 1 minute; group D, irradiated by Er:YAG laser at 2.0 W combined with 1% NaClO irrigation for 1 minute; group E, negative control. After histological preparation and staining, the cross-sections were evaluated for percentage of tissue removal from root canals and isthmuses. The cleanliness values were calculated using the SPSS 13.0 software package. The mean root canal cleanliness in groups A, B, C and D was 95.24%, 96.53%, 97.63% and 98.22%, respectively, and the mean isthmus cleanliness was 16.50%, 51.48%, 52.56% and 53.83%, respectively. The mean root canal and isthmus cleanliness values were significantly higher in groups B, C and D (P<0.05) than in group A. There were no significant differences in root canal and isthmus cleanliness among groups B, C and D. Er:YAG laser irradiation combined with 1% NaClO irrigation may be used effectively as a new method for cleaning root canals and root canal isthmuses.
Four Fallacies and an Oversight: Searching for Martian Life
NASA Astrophysics Data System (ADS)
Rummel, J. D.; Conley, C. A.
2017-10-01
While it is anticipated that future human missions to Mars will increase the amount of biological and organic contamination that might be distributed on that planet, robotic missions continue to grow in capability and complexity, requiring precautions to be taken now to protect Mars, and particularly areas of Mars that might be Special Regions. Such precautionary cleanliness requirements for spacecraft have evolved over the course of the space age, as we have learned more about planetary environments, and are the subject of regular deliberations and decisions sponsored by the Committee on Space Research (COSPAR). COSPAR's planetary protection policy is maintained as an international consensus standard for spacecraft cleanliness that is recognized by the United Nations Committee on the Peaceful Uses of Outer Space. In response to the paper presented in this issue by Fairén et al. (2017), we examine both their concept of evidence for possible life on Mars and their logic in recommending that spacecraft cleanliness requirements be relaxed to access Special Regions "before it is too late." We find that there are shortcomings in their plans to look for evidence of life on Mars, that they do not support their contention that appropriate levels of spacecraft cleanliness are unaffordable, that there are major risks in assuming martian life could be identified by nucleic acid sequence comparison (especially if those sequences are obtained from a Special Region contaminated with Earth life), and that the authors do not justify their contention that exploration with dirty robots, now, is preferable to the possibility that later contamination will be spread by human exploration. We also note that the potential effects of contaminating resources and environments essential to future human occupants of Mars are both significant and not addressed by Fairén et al. (2017).
Development of automated optical verification technologies for control systems
NASA Astrophysics Data System (ADS)
Volegov, Peter L.; Podgornov, Vladimir A.
1999-08-01
The report considers optical techniques for automated verification of an object's identity, designed for control systems at nuclear facilities. It presents results of experimental research and of the development of pattern recognition techniques, carried out under ISTC project number 772, aimed at identifying unique features of the surface structure of a controlled object and the effects of its random treatment. Possibilities for industrial introduction of the developed technologies within the framework of US and Russian lab-to-lab cooperation, including the development of up-to-date systems for nuclear material control and accounting, are examined.
Automatic Methods and Tools for the Verification of Real Time Systems
1997-11-30
We developed formal methods and tools for the verification of real-time systems. This was accomplished by extending techniques, based on automata theory and temporal logic, that have been successful for the verification of time-independent reactive systems. As a specification language for embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous environment variables. We also identified the exact boundary between decidability and undecidability of real-time reasoning.
Applying Formal Verification Techniques to Ambient Assisted Living Systems
NASA Astrophysics Data System (ADS)
Benghazi, Kawtar; Visitación Hurtado, María; Rodríguez, María Luisa; Noguera, Manuel
This paper presents a verification approach based on timed traces semantics and MEDISTAM-RT [1] to check the fulfillment of non-functional requirements, such as timeliness and safety, and to assure the correct functioning of Ambient Assisted Living (AAL) systems. We validate this approach by applying it to an Emergency Assistance System for monitoring people suffering from cardiac alterations with syncope.
Formal verification of AI software
NASA Technical Reports Server (NTRS)
Rushby, John; Whitehurst, R. Alan
1989-01-01
The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.
Action-based verification of RTCP-nets with CADP
NASA Astrophysics Data System (ADS)
Biernacki, Jerzy; Biernacka, Agnieszka; Szpyrka, Marcin
2015-12-01
The paper presents an algorithm for translating coverability graphs of RTCP-nets (real-time coloured Petri nets) into the Aldebaran format. The approach provides the possibility of automatic RTCP-net verification using the model checking techniques provided by the CADP toolbox. An actual fire alarm control panel system has been modelled and several of its crucial properties have been verified to demonstrate the usability of the approach.
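For readers unfamiliar with the target representation: CADP's tools consume labelled transition systems in the plain-text Aldebaran (.aut) format, a `des` header followed by one transition triple per line. The sketch below writes a toy graph in that format; the three-state example and the function name are hypothetical, and the paper's actual coverability-graph translation is of course far more involved.

```python
# Writing a (coverability) graph as a labelled transition system in the
# Aldebaran ".aut" format accepted by the CADP toolbox. The toy graph is
# hypothetical; it only illustrates the file layout.

def write_aut(path, initial, transitions, n_states):
    """transitions: iterable of (source, action, target) triples."""
    transitions = list(transitions)
    with open(path, "w") as f:
        # header: des (initial-state, number-of-transitions, number-of-states)
        f.write(f"des ({initial}, {len(transitions)}, {n_states})\n")
        for src, action, dst in transitions:
            f.write(f"({src}, \"{action}\", {dst})\n")

# toy example: a three-state cycle loosely evoking a fire alarm panel
write_aut("model.aut", 0,
          [(0, "alarm_raised", 1), (1, "alarm_ack", 2), (2, "reset", 0)], 3)
```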
Avila, Agustín Brau; Mazo, Jorge Santolaria; Martín, Juan José Aguilar
2014-01-01
During the last years, the use of Portable Coordinate Measuring Machines (PCMMs) in industry has increased considerably, mostly due to their flexibility for accomplishing in-line measuring tasks as well as their reduced costs and operational advantages as compared to traditional coordinate measuring machines (CMMs). However, their operation has a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification procedures. In this work the mechanical design of an indexed metrology platform (IMP) is presented. The aim of the IMP is to increase the final accuracy and to radically simplify the calibration, identification and verification of geometrical parameter procedures of PCMMs. The IMP allows us to fix the calibrated gauge object and move the measuring instrument in such a way that it is possible to cover most of the instrument working volume, reducing the time and operator fatigue to carry out these types of procedures. PMID:24451458
How Clean Are Hotel Rooms? Part I: Visual Observations vs. Microbiological Contamination.
Almanza, Barbara A; Kirsch, Katie; Kline, Sheryl Fried; Sirsat, Sujata; Stroia, Olivia; Choi, Jin Kyung; Neal, Jay
2015-01-01
Current evidence of hotel room cleanliness is based on observation rather than empirically based microbial assessment. The purpose of the study described here was to determine if observation provides an accurate indicator of cleanliness. Results demonstrated that visual assessment did not accurately predict microbial contamination. Although testing standards have not yet been established for hotel rooms and will be evaluated in Part II of the authors' study, potential microbial hazards included the sponge and mop (housekeeping cart), toilet, bathroom floor, bathroom sink, and light switch. Hotel managers should increase cleaning in key areas to reduce guest exposure to harmful bacteria.
Hopman, Joost; Donskey, Curtis J; Boszczowski, Icaro; Alfa, Michelle J
2018-05-23
The efficacy of discharge cleaning and disinfection of high-touch surfaces of intensive care unit patient rooms in Brazil, Canada, the Netherlands, and the United States was evaluated and the effect of an educational intervention was determined. Significant site-to-site differences in cleaning regimens and baseline cleanliness levels were observed using ATP levels, colony-forming units, and reflective surface marker removal percent pass rates. An educational intervention that includes rapid feedback of the ATP measurements could significantly improve the quality of the cleaning and disinfection regimens. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. All rights reserved.
Hygiene behaviour and hospitalized severe childhood diarrhoea: a case-control study.
Baltazar, J. C.; Tiglao, T. V.; Tempongko, S. B.
1993-01-01
The relationship between personal and domestic hygiene behaviour and hospitalized childhood diarrhoea was examined in a case-control study of 356 cases and 357 controls from low-income families in metropolitan Manila. Indices of hygiene behaviour were defined for overall cleanliness, kitchen hygiene, and living conditions. Only the indices for overall cleanliness and kitchen hygiene were significantly associated with diarrhoea. An increasing excess risk of hospitalization with severe diarrhoea was noted as the ratings for standards of hygiene became lower, and this excess risk persisted even after controlling for confounding variables. The implications of our findings for the control of diarrhoeal disease are discussed. PMID:8324851
Mg-Ca Alloys Produced by Reduction of CaO: Understanding of ECO-Mg Alloy Production
NASA Astrophysics Data System (ADS)
Jung, In-Ho; Lee, Jin Kyu; Kim, Shae K.
2017-04-01
There have been long debates about the environment conscious (ECO) Mg technology which utilizes CaO to produce Ca-containing Mg alloys. Two key process technologies of the ECO-Mg process are the chemical reduction of CaO by liquid Mg and the maintenance of melt cleanliness during the alloying of Ca. Thermodynamic calculations using FactSage software were performed to explain these two key issues. In addition, an experimental study was performed to compare the melt cleanliness of the Ca-containing Mg alloys produced by the conventional route with metallic Ca and the ECO-Mg route with CaO.
Automated Counting of Particles To Quantify Cleanliness
NASA Technical Reports Server (NTRS)
Rhode, James
2005-01-01
A machine vision system, similar to systems used in microbiological laboratories to count cultured microbes, has been proposed for quantifying the cleanliness of nominally precisely cleaned hardware by counting residual contaminant particles. The system would include a microscope equipped with an electronic camera and circuitry to digitize the camera output, a personal computer programmed with machine-vision and interface software, and digital storage media. A filter pad, through which had been aspirated solvent from rinsing the hardware in question, would be placed on the microscope stage. A high-resolution image of the filter pad would be recorded. The computer would analyze the image and present a histogram of sizes of particles on the filter. On the basis of the histogram and a measure of the desired level of cleanliness, the hardware would be accepted or rejected. If the hardware were accepted, the image would be saved, along with other information, as a quality record. If the hardware were rejected, the histogram and ancillary information would be recorded for analysis of trends. The software would perceive particles that are too large or too numerous to meet a specified particle-distribution profile. Anomalous particles or fibrous material would be flagged for inspection.
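As a rough illustration of the proposed counting step, the sketch below thresholds a digitized filter-pad image, labels connected particles, and checks a size histogram against a limit profile. The file name, threshold rule, size bins, and acceptance limits are all hypothetical stand-ins for the specified particle-distribution profile.

```python
# A minimal sketch of the proposed particle-counting step: threshold a
# grayscale image of the filter pad, label connected components, and compare
# a histogram of particle sizes to an accept/reject profile. All numbers and
# the file name are hypothetical.
import numpy as np
from scipy import ndimage
from imageio.v3 import imread

image = imread("filter_pad.png").astype(float)
if image.ndim == 3:                       # collapse RGB to grayscale
    image = image.mean(axis=2)

# particles appear darker than the clean filter; threshold and label them
mask = image < image.mean() - 2 * image.std()
labels, n_particles = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=range(1, n_particles + 1))  # pixel areas

hist, edges = np.histogram(sizes, bins=[1, 4, 16, 64, 256, 1024])
print(f"{n_particles} particles; size histogram: {hist}")

# hypothetical accept/reject rule: maximum count allowed per size bin
LIMITS = np.array([500, 100, 20, 5, 0])
print("ACCEPT" if np.all(hist <= LIMITS) else "REJECT")
```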
Greenberg, D; Witztum, E
1994-01-01
Judaism is one of many religions that demand cleanliness and exactness, inculcate the performance of rituals from childhood and view their non-performance as wrong or sinful. Rituals concerning cleanliness and exactness are the commonest presentations of OCD. In a sample of 34 psychiatric out-patients with OCD in north Jerusalem, religious symptoms were found in 13 of the 19 ultra-orthodox patients, and in one of the 15 non-ultra-orthodox patients. Nine of the 15 OCD patients with religious symptoms also had non-religious symptoms. Four main topics of religious symptomatology were found: prayer, dietary practices, menstrual practices and cleanliness before prayer. The dictates of religious codes regarding these topics are presented and the law is rigorous in its demands, in many cases encouraging repeating rituals. Nevertheless, repetitive performance of religious rituals is recognized by OCD sufferers and their rabbis as expressing psychopathology rather than heightened spirituality. The forms of the religious obsessions and the associated rituals in this sample were similar to the presentation of OCD in non-religious patients. Religion appears not to be a distinctive topic of OCD, rather it is the setting for the condition in very religious patients.
NASA Technical Reports Server (NTRS)
Berkebile, Stephen; Gaier, James R.
2012-01-01
During the Apollo missions, the adhesion of dust to critical spacecraft systems was a greater problem than anticipated and resulted in functional degradation of thermal control surfaces, spacesuit seals, and other spacecraft components. Notably, Earth-based simulation efforts did not predict the magnitude and effects of dust adhesion in the lunar environment. Forty years later, we understand that the ultrahigh vacuum (UHV) environment, coupled with micrometeorite impacts and constant ion and photon bombardment from the sun, results in atomically clean, high-surface-energy dust particles and spacecraft surfaces. However, both the dominant mechanism of adhesion in airless environments and the conditions for high-fidelity simulation tests have still to be determined. The experiments presented here aim to aid in the development of dust mitigation techniques for airless bodies (e.g., the lunar surface, asteroids, moons of outer planets). The approach taken consists of (a) quantifying the adhesion between common polymer and metallic spacecraft materials and a synthetic noritic volcanic glass, as a function of surface cleanliness and of triboelectric charge transfer in a UHV environment, and (b) determining parameters for high-fidelity tests through investigation of adhesion dependence on vacuum environment and sample treatment. Adhesion force has been measured between pins of spacecraft materials and a plate of synthetic volcanic glass by determining the pull-off force with a torsion balance. Although no significant adhesion is generally observed directly as a result of high surface energies, the adhesion due to induced electrostatic charge is observed to increase with spacecraft material cleanliness, in some cases by over a factor of 10. Furthermore, electrostatically-induced adhesion is found to decrease rapidly above pressures of 10⁻⁶ torr. It is concluded that high-fidelity tests should be conducted in high to ultrahigh vacuum and include an ionized surface cleaning process.
NASA Technical Reports Server (NTRS)
Kashangaki, Thomas A. L.
1992-01-01
This paper describes a series of modal tests that were performed on a cantilevered truss structure. The goal of the tests was to assemble a large database of high quality modal test data for use in verification of proposed methods for on-orbit model verification and damage detection in flexible truss structures. A description of the hardware is provided along with details of the experimental setup and procedures for 16 damage cases. Results from selected cases are presented and discussed. Differences between ground vibration testing and on-orbit modal testing are also described.
Signature Verification Based on Handwritten Text Recognition
NASA Astrophysics Data System (ADS)
Viriri, Serestina; Tapamo, Jules-R.
Signatures continue to be an important biometric trait because they remain widely used for authenticating the identity of human beings. This paper presents an efficient text-based directional signature recognition algorithm which verifies signatures even when they are composed of special unconstrained cursive characters that are superimposed and embellished. This algorithm extends the character-based signature verification technique. The experiments carried out on the GPDS signature database and an additional database created from signatures captured using the ePadInk tablet show that the approach is effective and efficient, with a positive verification rate of 94.95%.
Face verification with balanced thresholds.
Yan, Shuicheng; Xu, Dong; Tang, Xiaoou
2007-01-01
The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved by the following aspects. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction, and, hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification which are directly transplanted from face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.
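The parameterization is the heart of the method: rather than a free projection, the reduction matrix is a product of an orthogonal factor and a diagonal scaling, and the optimization seeks to shrink the spread of class-specific thresholds. The fragment below illustrates only that parameterization and one plausible imbalance measure on synthetic data; it is an assumption-laden sketch, not the authors' TBT optimizer.

```python
# Schematic fragment of the threshold-balanced transformation idea: the
# reduction matrix is parameterized as an orthogonal matrix times a diagonal
# one, and the spread of class-specific verification thresholds measures the
# imbalance an iterative optimizer would shrink. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(0)

def make_transform(d_in, d_out):
    # orthogonal factor via QR decomposition, diagonal factor as free scales
    q, _ = np.linalg.qr(rng.standard_normal((d_in, d_out)))
    scales = np.ones(d_out)
    return q, scales

def project(x, q, scales):
    return (x @ q) * scales              # reduction by W = Q . diag(s)

def class_thresholds(features, labels):
    # per-class threshold: largest intra-class distance to the class mean
    thresholds = {}
    for c in np.unique(labels):
        xc = features[labels == c]
        thresholds[c] = np.linalg.norm(xc - xc.mean(axis=0), axis=1).max()
    return thresholds

x = rng.standard_normal((200, 64))       # stand-in face features
y = rng.integers(0, 10, size=200)        # stand-in identities
q, s = make_transform(64, 16)
th = class_thresholds(project(x, q, s), y)
imbalance = np.std(list(th.values()))    # the quantity to be minimized
print(f"threshold imbalance: {imbalance:.3f}")
```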
The Verification-based Analysis of Reliable Multicast Protocol
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1996-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
Active alignment/contact verification system
Greenbaum, William M.
2000-01-01
A system involving an active (i.e., electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board, two mating parts that are extremely small, high-density parts and require alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.
NASA Astrophysics Data System (ADS)
Vielhauer, Claus; Croce Ferri, Lucilla
2003-06-01
Our paper addresses two issues of a previously presented biometric authentication algorithm for ID cardholders, namely the security of the embedded reference data and the aging process of the biometric data. We describe a protocol that allows two levels of verification, combining a biometric hash technique based on handwritten signature and hologram watermarks with cryptographic signatures in a verification infrastructure. This infrastructure consists of a Trusted Central Public Authority (TCPA), which serves numerous Enrollment Stations (ES) in a secure environment. Each individual performs an enrollment at an ES, which provides the TCPA with the full biometric reference data and a document hash. The TCPA then calculates the authentication record (AR) with the biometric hash, a validity timestamp, and a document hash provided by the ES. The AR is then signed with a cryptographic signature function, initialized with the TCPA's private key, and embedded in the ID card as a watermark. Authentication is performed at Verification Stations (VS), where the ID card is scanned and the signed AR is retrieved from the watermark. Due to the timestamp mechanism and a two-level biometric verification technique based on offline and online features, the AR can deal with the aging process of the biometric feature by forcing a re-enrollment of the user after expiry, making use of the ES infrastructure. We describe some attack scenarios and illustrate the watermark embedding, retrieval, and dispute protocols, analyzing their requisites, advantages, and disadvantages in relation to security requirements.
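The core of the protocol is easy to make concrete: the AR is a serialized record of biometric hash, validity timestamp, and document hash, signed with the authority's private key and checked at the verification station. The sketch below uses Ed25519 as a stand-in for the unspecified signature function; all field names and values are hypothetical.

```python
# A minimal sketch of the authentication-record flow described above: the
# TCPA binds a biometric hash, a validity timestamp, and a document hash into
# one record and signs it; a verification station checks signature and expiry.
# Ed25519 is an assumed stand-in; the paper does not fix a signature scheme.
import hashlib, json, time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

tcpa_key = Ed25519PrivateKey.generate()          # TCPA private key

def make_record(biometric_hash: bytes, document: bytes, valid_days: int):
    record = json.dumps({
        "bio_hash": biometric_hash.hex(),
        "doc_hash": hashlib.sha256(document).hexdigest(),
        "expires": int(time.time()) + valid_days * 86400,
    }).encode()
    return record, tcpa_key.sign(record)         # AR plus signature

def verify_record(record: bytes, signature: bytes, public_key) -> bool:
    try:
        public_key.verify(signature, record)     # raises if tampered
    except InvalidSignature:
        return False
    return json.loads(record)["expires"] > time.time()   # timestamp check

ar, sig = make_record(b"\x12" * 32, b"id-card image bytes", valid_days=365)
print(verify_record(ar, sig, tcpa_key.public_key()))     # True
```

In a deployment following the paper, the signed AR would then be embedded in the card as a watermark rather than stored as plain bytes.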
A methodology for producing reliable software, volume 1
NASA Technical Reports Server (NTRS)
Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.
1976-01-01
An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.
Tessema, Shewaye Belay; Adane, Mesafint Molla
2015-09-10
Client satisfaction is a vital component and main concern intertwined with strategic decisions in service provision. To improve the efficiency of services, eliciting the opinions of users about the available services and identifying factors associated with dissatisfaction is critical. Thus, the main objective of this study was to assess the perceived levels of clients' satisfaction with health services at the ART clinic level in health centres of the Tigray Region in Ethiopia. A cross-sectional study was conducted from May to June 2013 in Tigray Region ART clinics. A total of 714 ART care users were included in the study using both purposive and probability sampling techniques. Data were collected using a structured questionnaire and analysed using the Statistical Package for the Social Sciences (SPSS) version 16.0. Crude and adjusted logistic regression analyses were carried out to identify the factors underlying perceived levels of clients' overall satisfaction. Finally, the results were presented in tables, with odds ratios (OR) and 95% confidence intervals (CI). A total of 714 study participants were enrolled in this study. An overall satisfaction level of 89.6% was reported by ART care service users. Higher satisfaction scores were reported for courtesy and respect (95.80%), followed by privacy (93.28%). On the other hand, respondents' dissatisfaction was rated 35.32% for toilet cleanliness, followed by 26.19% for availability of additional drugs. As for overall satisfaction and associated factors, adjusted logistic regression analyses showed that marital status [AOR = 2.01 (95% CI: 1.11, 3.60)], educational status [AOR = 3.13 (95% CI: 1.15, 8.53)], travel distance to reach the health centre [AOR = 3.59 (95% CI: 1.23, 10.50)], toilet cleanliness [AOR = 2.22 (95% CI: 1.62, 6.32)], and ART drug availability [AOR = 2.60 (95% CI: 1.18, 6.52)] influenced overall ART service satisfaction. This study revealed a high client satisfaction rate, associated with preventable and modifiable factors such as marital status, educational status, travel distance to reach the health centre, toilet cleanliness, and ART drug availability. Therefore, countermeasures such as increasing access to ART services and providing clean toilets and ART drugs may further increase client satisfaction in the region.
High-speed autoverifying technology for printed wiring boards
NASA Astrophysics Data System (ADS)
Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi
1996-10-01
We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms. Verification is needed to distinguish between minor irregularities and serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to detect pattern features on surface mount technology boards. In our system, we employ a new illumination method, which uses multiple colors and multiple directions of illumination. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination. This helps to search for patterns around a defect and to examine defect definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory. It verified 1,500 defective samples and detected all significant defects with only a 0.1 percent error rate (false alarms).
Behavioral biometrics for verification and recognition of malicious software agents
NASA Astrophysics Data System (ADS)
Yampolskiy, Roman V.; Govindaraju, Venu
2008-04-01
Homeland security requires technologies capable of positive and reliable identification of humans for law enforcement, government, and commercial applications. As artificially intelligent agents improve in their abilities and become a part of our everyday life, the possibility of using such programs for undermining homeland security increases. Virtual assistants, shopping bots, and game playing programs are used daily by millions of people. We propose applying statistical behavior modeling techniques developed by us for recognition of humans to the identification and verification of intelligent and potentially malicious software agents. Our experimental results demonstrate feasibility of such methods for both artificial agent verification and even for recognition purposes.
Patel, Ravi G.; Desjardins, Olivier; Kong, Bo; ...
2017-09-01
Here, we present a verification study of three simulation techniques for fluid–particle flows, including an Euler–Lagrange approach (EL) inspired by Jackson's seminal work on fluidized particles, a quadrature-based moment method based on the anisotropic Gaussian closure (AG), and the traditional two-fluid model. We perform simulations of two problems: particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT). For verification, we evaluate various techniques for extracting statistics from EL and study the convergence properties of the three methods under grid refinement. The convergence is found to depend on the simulation method and on the problem, with CIT simulations posing fewer difficulties than HIT. Specifically, EL converges under refinement for both HIT and CIT, but statistics exhibit dependence on the postprocessing parameters. For CIT, AG produces similar results to EL. For HIT, converging both TFM and AG poses challenges. Overall, extracting converged, parameter-independent Eulerian statistics remains a challenge for all methods.
Safeguardability of the vitrification option for disposal of plutonium
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pillay, K.K.S.
1996-05-01
Safeguardability of the vitrification option for plutonium disposition is rather complex, and there is no experience base in either domestic or international safeguards for this approach. In the present treaty regime between the US and the states of the former Soviet Union, bilateral verifications are considered more likely, with potential for third-party verification of safeguards. There are serious technological limitations to applying conventional bulk handling facility safeguards techniques to achieve independent verification of plutonium in borosilicate glass. If vitrification is the final disposition option chosen, maintaining continuity of knowledge of plutonium in glass matrices, especially those containing boron and those spiked with high-level wastes or ¹³⁷Cs, is beyond the capability of present-day safeguards technologies and nondestructive assay techniques. The alternative to quantitative measurement of fissile content is to maintain continuity of knowledge through a combination of containment and surveillance, which is not the international norm for bulk handling facilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marleau, Peter; Brubaker, Erik; Deland, Sharon M.
This report summarizes the discussion and conclusions reached during a table top exercise held at Sandia National Laboratories, Albuquerque on September 3, 2014 regarding a recently described approach for nuclear warhead verification based on the cryptographic concept of a zero-knowledge protocol (ZKP) presented in a recent paper authored by Glaser, Barak, and Goldston. A panel of Sandia National Laboratories researchers, whose expertise includes radiation instrumentation design and development, cryptography, and arms control verification implementation, jointly reviewed the paper and identified specific challenges to implementing the approach as well as some opportunities. It was noted that ZKP as used in cryptography is a useful model for the arms control verification problem, but the direct analogy to arms control breaks down quickly. The ZKP methodology for warhead verification fits within the general class of template-based verification techniques, where a reference measurement is used to confirm that a given object is like another object that has already been accepted as a warhead by some other means. This can be a powerful verification approach, but requires independent means to trust the authenticity of the reference warhead - a standard that may be difficult to achieve, which the ZKP authors do not directly address. Despite some technical challenges, the concept of last-minute selection of the pre-loads and equipment could be a valuable component of a verification regime.
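The cryptographic concept at issue can be illustrated with a textbook example unrelated to radiation measurements: in one round of Schnorr identification, a prover demonstrates knowledge of a secret exponent without revealing anything about it. The toy sketch below shows that commit/challenge/response pattern; the parameters are tiny and the generator is assumed, so it is strictly illustrative.

```python
# Toy illustration of a zero-knowledge proof of knowledge (one Schnorr
# identification round): the prover convinces the verifier it knows x with
# y = g^x mod p without revealing x. Parameters are illustrative only; the
# warhead-verification analogy concerns measurements, not discrete logs.
import secrets

p = 2 ** 127 - 1          # prime modulus (toy size; real use needs vetted groups)
q = p - 1                 # order of the exponent group
g = 3                     # generator (assumed)

x = secrets.randbelow(q)  # prover's secret
y = pow(g, x, p)          # public value

# one round of commit / challenge / response
r = secrets.randbelow(q)
t = pow(g, r, p)               # prover commits
c = secrets.randbelow(q)       # verifier issues a random challenge
s = (r + c * x) % q            # prover responds

# verifier checks g^s == t * y^c (mod p); the transcript reveals nothing about x
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted")
```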
Survey of Verification and Validation Techniques for Small Satellite Software Development
NASA Technical Reports Server (NTRS)
Jacklin, Stephen A.
2015-01-01
The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
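The last approach surveyed, run-time monitoring paired with fault-tolerant design, is simple to sketch: every output of the primary flight software is checked against an invariant, and a violation triggers a safe fallback. The fragment below is a minimal illustration of that pattern; the controller, torque limit, and safe-mode routine are hypothetical.

```python
# A minimal sketch of run-time monitoring with a fault-tolerant fallback:
# an invariant is checked on every command from the primary controller, and
# violations divert to a simple safe-mode routine. All names and limits are
# hypothetical, not from any flight system.
from dataclasses import dataclass

@dataclass
class Command:
    wheel_torque_mNm: float

TORQUE_LIMIT = 3.2     # hypothetical actuator bound, mN*m

def primary_controller(attitude_error: float) -> Command:
    return Command(wheel_torque_mNm=50.0 * attitude_error)   # may overshoot

def safe_mode(attitude_error: float) -> Command:
    return Command(wheel_torque_mNm=0.0)                     # benign no-op

def monitored_step(attitude_error: float) -> Command:
    cmd = primary_controller(attitude_error)
    # run-time monitor: reject any command violating the actuator invariant
    if abs(cmd.wheel_torque_mNm) > TORQUE_LIMIT:
        return safe_mode(attitude_error)
    return cmd

print(monitored_step(0.01))   # within limits: primary command passes
print(monitored_step(0.5))    # violation caught: safe-mode command issued
```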
Recent literature on structural modeling, identification, and analysis
NASA Technical Reports Server (NTRS)
Craig, Roy R., Jr.
1990-01-01
The literature on the mathematical modeling of large space structures is first reviewed, with attention given to continuum models, model order reduction, substructuring, and computational techniques. System identification and mode verification are then discussed with reference to the verification of mathematical models of large space structures. In connection with analysis, the paper surveys recent research on eigensolvers and dynamic response solvers for large-order finite-element-based models.
Grelewska-Nowotko, Katarzyna; Żurawska-Zajfert, Magdalena; Żmijewska, Ewelina; Sowa, Sławomir
2018-05-01
In recent years, digital polymerase chain reaction (dPCR), a new molecular biology technique, has been gaining in popularity. Among many other applications, this technique can also be used for the detection and quantification of genetically modified organisms (GMOs) in food and feed. It might replace the currently widely used real-time PCR method (qPCR), by overcoming problems related to the PCR inhibition and the requirement of certified reference materials to be used as a calibrant. In theory, validated qPCR methods can be easily transferred to the dPCR platform. However, optimization of the PCR conditions might be necessary. In this study, we report the transfer of two validated qPCR methods for quantification of maize DAS1507 and NK603 events to the droplet dPCR (ddPCR) platform. After some optimization, both methods have been verified according to the guidance of the European Network of GMO Laboratories (ENGL) on analytical method verification (ENGL working group on "Method Verification." (2011) Verification of Analytical Methods for GMO Testing When Implementing Interlaboratory Validated Methods). Digital PCR methods performed equally or better than the qPCR methods. Optimized ddPCR methods confirm their suitability for GMO determination in food and feed.
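The quantification arithmetic behind ddPCR is standard and worth stating: the fraction of positive droplets is converted into a mean copies-per-droplet via a Poisson correction, then into a concentration. The sketch below shows that calculation; the droplet volume is the nominal ~0.85 nL of common ddPCR systems and the counts are made up, so nothing here is specific to the paper's methods.

```python
# Standard droplet-digital-PCR quantification arithmetic (not specific to the
# study above): the positive-droplet fraction p gives a mean copies-per-droplet
# lambda = -ln(1 - p), which divided by droplet volume yields a concentration.
import math

def ddpcr_copies_per_ul(positive: int, total: int, droplet_nl: float = 0.85):
    p = positive / total                     # fraction of positive droplets
    lam = -math.log(1.0 - p)                 # mean copies per droplet (Poisson)
    return lam / (droplet_nl * 1e-3)         # copies per microlitre

# e.g., 4,800 positive droplets out of 15,000 accepted droplets
print(f"{ddpcr_copies_per_ul(4800, 15000):.0f} copies/uL")
```

For GMO testing, the GM content is then typically the ratio of event-specific copies to reference-gene copies measured the same way.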
Volumetric Verification of Multiaxis Machine Tool Using Laser Tracker
Aguilar, Juan José
2014-01-01
This paper aims to present a method of volumetric verification for machine tools with linear and rotary axes using a laser tracker. Beyond a method for a particular machine, it presents a methodology that can be used on any machine type. The paper presents the schema and kinematic model of a machine with three axes of movement, two linear and one rotary, including the measurement system and the nominal rotation matrix of the rotary axis. Using this, the machine tool volumetric error is obtained, and nonlinear optimization techniques are employed to improve the accuracy of the machine tool. The verification provides a mathematical, not physical, compensation, in less time than other verification methods, by means of the indirect measurement of the geometric errors of the machine from the linear and rotary axes. This paper presents an extensive study of the appropriateness and drawbacks of the regression function employed depending on the types of movement of the axes of any machine. In the same way, strengths and weaknesses of measurement methods and optimization techniques depending on the space available to place the measurement system are presented. These studies provide the most appropriate strategies to verify each machine tool, taking into consideration its configuration and its available work space. PMID:25202744
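The optimization step can be made concrete with a toy fit: kinematic error parameters are chosen by nonlinear least squares so that predicted tool positions match laser-tracker measurements. The one-axis, three-parameter error model below is a placeholder far simpler than a real machine kinematic model; all values are synthetic.

```python
# A minimal sketch of the identification step: kinematic error parameters are
# fitted by nonlinear least squares so the machine's predicted positions match
# laser-tracker measurements. The one-axis error model is a placeholder.
import numpy as np
from scipy.optimize import least_squares

def predicted(params, axis_cmd):
    scale_err, offset, quad = params
    # commanded position plus a simple parametric error model (placeholder)
    return axis_cmd * (1 + scale_err) + offset + quad * axis_cmd ** 2

def residuals(params, axis_cmd, tracker_meas):
    return predicted(params, axis_cmd) - tracker_meas

rng = np.random.default_rng(1)
cmd = np.linspace(0, 500, 40)                     # commanded positions, mm
truth = predicted([2e-4, 0.01, 3e-8], cmd)        # synthetic "real machine"
meas = truth + rng.normal(0, 0.002, cmd.size)     # laser-tracker noise, mm

fit = least_squares(residuals, x0=[0.0, 0.0, 0.0], args=(cmd, meas))
print("identified error parameters:", fit.x)
# the mathematical compensation then subtracts predicted(fit.x, cmd) - cmd
```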
Delea, Maryann G; Solomon, Hiwote; Solomon, Anthony W; Freeman, Matthew C
2018-01-01
Efforts are underway to scale up the facial cleanliness and environmental improvement (F&E) components of the World Health Organization's SAFE strategy for elimination of trachoma as a public health problem. Improving understanding of the F&E intervention landscape could inform advancements prior to scale-up, and lead to more effective and sustained behavior change. We systematically searched for relevant grey literature published from January 1965 through August 2016. Publications were eligible for review if they described interventions addressing F&E in the context of trachoma elimination programs. Subsequent to screening, we mapped attributes of F&E interventions. We then employed three behavior change frameworks to synthesize mapped data and identify potential intervention gaps. We identified 27 documents meeting inclusion criteria. With the exception of some recent programming, F&E interventions have largely focused on intermediate and distal antecedents of behavior change. Evidence from our analyses suggests many interventions are not designed to address documented determinants of improved F&E practices. No reviewed documents endorsed inclusion of intervention components related to behavioral maintenance or resilience, factors critical for sustaining improved behaviors. If left unaddressed, identified gaps in intervention content may continue to challenge uptake and sustainability of improved F&E behaviors. Stakeholders designing and implementing trachoma elimination programs should review their F&E intervention content and delivery approaches with an eye toward improvement, including better alignment with established behavior change theories and empirical evidence. Implementation should move beyond information dissemination, and appropriately employ a variety of behavior change techniques to address more proximal influencers of change.
High cleanliness globe valve with sine mechanism drive
NASA Astrophysics Data System (ADS)
Luo, Hu
2018-06-01
This paper presents a new type of quick-opening globe valve for the life-support pneumatic control system of a safety cabin in an underground coal mine. The valve adopts a sine mechanism to convert the rotation of the handle over a 90° range into reciprocating motion of the spool. The mechanism implements the quick-opening function of the valve by controlling the contact and separation between the O-ring and the end face of the valve. Since there is no relative sliding between the sealing interfaces, the valve avoids the uncontrollable wear particles produced by packed ball valves, ensuring high cleanliness in the flow path. The transmission mechanism also has a force-amplifying effect, which reduces the handle opening torque. Using the finite element method, the relationship between the contact force and the compression of the O-ring is analyzed to provide the boundary condition for the calculation of the rotational torque. Meanwhile, the velocity field and pressure field along the flow path are simulated, and the caliber size of the valve and the flow resistance coefficient are obtained. Compared with existing packed ball valves, the design offers higher cleanliness, more reliable sealing, and a smaller handle opening torque. This work presents a new technical approach for the design of pneumatic control valves for safety cabins.
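The sine-mechanism kinematics are compact enough to state directly: a crank of radius r turned through angle θ lifts the spool by r·sin θ, so a 90° handle throw delivers the full stroke r, and the velocity ratio r·cos θ vanishes near full travel, where little motion per unit rotation means large force leverage at the seal. The small sketch below tabulates this; the radius value is arbitrary, for illustration only.

```python
# Kinematics of a sine mechanism: spool lift x = r*sin(theta) for handle angle
# theta, with velocity ratio dx/dtheta = r*cos(theta). The crank radius is a
# hypothetical value chosen only to illustrate the 90-degree quick-opening throw.
import math

r = 6.0   # crank radius, mm (hypothetical)

for deg in (0, 15, 30, 45, 60, 75, 90):
    theta = math.radians(deg)
    lift = r * math.sin(theta)          # spool displacement
    ratio = r * math.cos(theta)         # d(lift)/d(theta): motion per radian
    print(f"handle {deg:2d} deg -> lift {lift:4.2f} mm, "
          f"velocity ratio {ratio:4.2f} mm/rad")
```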
NASA Astrophysics Data System (ADS)
Varanasi, Rao; Mesawich, Michael; Connor, Patrick; Johnson, Lawrence
2017-03-01
Two versions of a specific 2 nm rated filter, containing filtration medium and all other components produced from high-density polyethylene (HDPE), one subjected to standard cleaning and the other to specialized ultra-cleaning, were evaluated in terms of their cleanliness characteristics and also the defectivity of wafers processed with photoresist filtered through each. With respect to inherent cleanliness, the ultraclean version exhibited a 70% reduction in total metal extractables and a 90% reduction in organic extractables compared to the standard-clean version. In terms of particulate cleanliness, the ultraclean version achieved stability of effluent particles 30 nm and larger in about half the time required by the standard-clean version, also exhibiting effluent levels at stability almost 90% lower. In evaluating defectivity of blanket wafers processed with photoresist filtered through either version, initial defect density while using the ultraclean version was about half that observed when the standard-clean version was in service, with defectivity also falling more rapidly during subsequent usage of the ultraclean version. Similar behavior was observed for patterned wafers, where the enhanced defect reduction was primarily of bridging defects. The filter evaluation and process-oriented results demonstrate the value of using filtration designed to possess optimal intrinsic characteristics, with further improvements possible through enhanced cleaning processes.
An Optimized Online Verification Imaging Procedure for External Beam Partial Breast Irradiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willis, David J., E-mail: David.Willis@petermac.or; Royal Melbourne Institute of Technology University, Melbourne, Victoria; Kron, Tomas
2011-07-01
The purpose of this study was to evaluate the capabilities of a kilovoltage (kV) on-board imager (OBI)-equipped linear accelerator in the setting of on-line verification imaging for external-beam partial breast irradiation. Available imaging techniques were optimized and assessed for image quality using a modified anthropomorphic phantom. Imaging dose was also assessed. Imaging techniques were assessed for physical clearance between patient and treatment machine using a volunteer. Nonorthogonal kV image pairs were identified as optimal in terms of image quality, clearance, and dose. After institutional review board approval, this approach was used for 17 patients receiving accelerated partial breast irradiation. Imaging was performed before every fraction, with online correction of setup deviations >5 mm (total image sessions = 170). Treatment staff rated risk of collision and visibility of tumor bed surgical clips where present. Image session duration and detected setup deviations were recorded. For all cases, both image projections (n = 34) had low collision risk. Surgical clips were rated as well visualized in all cases where they were present (n = 5). The average imaging session time was 6 min, 16 sec, and a reduction in duration was observed as staff became familiar with the technique. Setup deviations of up to 1.3 cm were detected before treatment and subsequently confirmed offline. Nonorthogonal kV image pairs allowed effective and efficient online verification for partial breast irradiation. It has yet to be tested in a multicenter study to determine whether it is dependent on skilled treatment staff.
NASA Technical Reports Server (NTRS)
Lingbloom, Mike S.
2008-01-01
During redesign of the Space Shuttle reusable solid rocket motor (RSRM), NASA amended the contract with ATK Launch Systems (then Morton Thiokol Inc.) with Change Order 966 to implement a contamination control and cleanliness verification method. The change order required: (1) a quantitative inspection method; (2) a written record of actual contamination levels versus a known reject level; and (3) a method more sensitive than the existing methods of visual and black light inspection. Black light inspection is only useful for inspection of contaminants that fluoresce near the 365 nm spectral line and is not useful for inspection of most silicones, which will not produce strong fluorescence. Black light inspection conducted by a qualified inspector under controlled light is capable of detecting Conoco HD-2 grease in gross amounts and is very subjective due to operator sensitivity. Optically stimulated electron emission (OSEE), developed at the Materials and Process Laboratory at Marshall Space Flight Center (MSFC), was selected to satisfy Change Order 966. OSEE offers several important advantages over existing laboratory methods of similar sensitivity (e.g., spectroscopy and nonvolatile residue sampling): shorter turnaround time, real-time capability, and full-coverage inspection. Laboratory methods require sample gathering and in-lab analysis, which sometimes takes several days to produce results; this is not practical in a production environment. In addition, these methods do not offer full-coverage inspection of large components.
Lawson Tait: the forgotten gynecologist.
Golditch, Ira M
2002-01-01
The development of gynecology as a specialty, although primarily American in origin, was influenced in large degree by Robert Lawson Tait, a brilliant Scottish/English surgeon who practiced in the late 19th century. Tait, a self-proclaimed gynecologist, is perhaps most widely known as the first to perform salpingectomy to treat ruptured tubal pregnancy. He was also the first to record removal of an ovary for relief of pelvic pain and to induce menopause, perform salpingectomy for the treatment of tubal disease, and develop the technique of transverse transperineal repair of low rectovaginal fistulas. His scrupulous cleanliness was undoubtedly the forerunner of our modern aseptic methods. Tait's bold, innovative surgical techniques led to a significant decrease in surgical mortality, and his prescient, aggressive approach was at the forefront of changes in the practice of obstetrics, which resulted in a marked decrease in maternal morbidity and mortality. This master teacher, whose contributions inspired the next great generation of abdominal and pelvic surgeons, deserves greater recognition within our specialty.
An OSEE Based Portable Surface Contamination Monitor
NASA Technical Reports Server (NTRS)
Perey, Daniel F.
1997-01-01
Many industrial and aerospace processes involving the joining of materials require sufficient surface cleanliness to ensure proper bonding. Processes as diverse as painting, welding, or the soldering of electronic circuits will be compromised if prior inspection and removal of surface contaminants is inadequate. As process requirements become more stringent and the number of different materials and identified contaminants increases, various instruments and techniques have been developed for improved inspection. One such technique, based on the principle of Optically Stimulated Electron Emission (OSEE), has been explored for a number of years as a tool for surface contamination monitoring. Some of the benefits of OSEE are that it is non-contacting, requires little operator training, and has very high contamination sensitivity. This paper describes the development of a portable OSEE-based surface contamination monitor. The instrument is suitable for both hand-held and robotic inspections with either manual or automated control of instrument operation. In addition, instrument output data are visually displayed to the operator and may be output to an external computer for archiving or analysis.
NASA Technical Reports Server (NTRS)
Gassaway, J. D.; Mahmood, Q.; Trotter, J. D.
1978-01-01
A system was developed for depositing aluminum and aluminum alloys by the D.C. sputtering technique. This system, which was designed for a high level of cleanliness and for monitoring the deposition parameters during film preparation, is ready for studying the effects of deposition and annealing parameters upon double-level metal preparation. The finite element method was studied for use in the computer modeling of two-dimensional MOS transistor structures. An algorithm was developed for implementing a computer study based upon the finite difference method. The program was modified and used to calculate redistribution data for boron and phosphorus, which had been predeposited by ion implantation with range and straggle conditions typical of those used at MSFC. Data were generated for (111)-oriented SOS films with redistribution in N2, dry O2, and steam ambients. Data are given showing both two-dimensional effects and the evolution of the junction depth, sheet resistance, and integrated dose with redistribution time.
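The redistribution calculation mentioned above is, at its core, a diffusion-equation solve on a finite-difference mesh. The sketch below runs a one-dimensional explicit (FTCS) diffusion of an implanted Gaussian dopant profile; the diffusivity, implant range and straggle, mesh, and anneal time are illustrative placeholders, not the MSFC process values.

```python
# A minimal sketch of a finite-difference dopant redistribution calculation:
# 1-D diffusion (explicit FTCS scheme) of an as-implanted Gaussian profile.
# All physical parameters are illustrative only.
import numpy as np

nx, dx = 400, 2e-7            # mesh: 400 cells, 2 nm spacing (cm)
d_coef = 1e-14                # diffusivity at anneal temperature, cm^2/s
dt = 0.4 * dx * dx / d_coef   # explicit stability requires D*dt/dx^2 <= 0.5

x = np.arange(nx) * dx
rp, drp = 5e-6, 2e-6          # implant range and straggle (cm), illustrative
c = np.exp(-0.5 * ((x - rp) / drp) ** 2)   # normalized as-implanted profile

t, t_end = 0.0, 600.0         # anneal for 10 minutes of model time
while t < t_end:
    # second-difference Laplacian; numpy evaluates the RHS before updating
    c[1:-1] += d_coef * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[0], c[-1] = c[1], c[-2] # zero-flux boundaries (no out-diffusion)
    t += dt

print(f"peak concentration falls to {c.max():.3f} of its initial value")
```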
Verification and Validation of Autonomy Software at NASA
NASA Technical Reports Server (NTRS)
Pecheur, Charles
2000-01-01
Autonomous software holds the promise of new operation possibilities, easier design and development and lower operating costs. However, as those system close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing a better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques and concrete experiments at NASA.
Forecast Verification: Identification of small changes in weather forecasting skill
NASA Astrophysics Data System (ADS)
Weatherhead, E. C.; Jensen, T. L.
2017-12-01
Global and regional weather forecasts have improved over the past seven decades, most often because of small, incremental improvements. The identification and verification of forecast improvement due to proposed small changes in forecasting can be expensive and, if not carried out efficiently, can slow progress in forecasting development. This presentation will look at the skill of commonly used verification techniques and show how the ability to detect improvements can depend on the magnitude of the improvement, the number of runs used to test the improvement, the location on the Earth, and the statistical techniques used. For continuous variables, such as temperature, wind and humidity, the skill of a forecast can be directly compared using a pair-wise statistical test that accommodates the natural autocorrelation and magnitude of variability. For discrete variables, such as tornado outbreaks or icing events, the challenge is to reduce the false alarm rate while improving the rate of correctly identifying the discrete event. For both continuous and discrete verification results, proper statistical approaches can reduce the number of runs needed to identify a small improvement in forecasting skill. Verification within the Next Generation Global Prediction System is an important component of the many small decisions needed to make state-of-the-art improvements to weather forecasting capabilities. The comparison of multiple skill scores with often conflicting results requires not only appropriate testing, but also scientific judgment to assure that the choices are appropriate not only for improvements in today's forecasting capabilities, but also allow improvements that will come in the future.
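The pair-wise testing described above can be illustrated with a short sketch. The following is a minimal example, not from the presentation, of comparing two forecasts' daily absolute errors with a paired t-test whose sample size is deflated for lag-1 autocorrelation; the synthetic data and the simple effective-sample-size formula are illustrative assumptions.

```python
# Minimal sketch, not from the presentation: paired comparison of two
# forecasts' daily absolute errors, with the sample size deflated for
# lag-1 autocorrelation of the difference series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 365
err_old = np.abs(rng.normal(1.00, 0.30, n))  # daily |error| of baseline forecast
err_new = np.abs(rng.normal(0.95, 0.30, n))  # daily |error| of candidate forecast

d = err_old - err_new                        # paired daily differences
r1 = np.corrcoef(d[:-1], d[1:])[0, 1]        # lag-1 autocorrelation of d
n_eff = n * (1.0 - r1) / (1.0 + r1)          # effective sample size

t = d.mean() / (d.std(ddof=1) / np.sqrt(n_eff))
p = 2.0 * stats.t.sf(abs(t), df=n_eff - 1.0)
print(f"mean skill gain {d.mean():.3f}, t = {t:.2f}, p = {p:.3g}")
```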
Design and verification of distributed logic controllers with application of Petri nets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał
2015-12-31
The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. After that it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can be finally implemented.
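As a concrete illustration of the Petri-net modelling step (not the paper's controller model), here is a minimal place/transition net with token-based firing; the example net and all names are assumptions.

```python
# Minimal place/transition Petri net sketch: a marking assigns tokens to
# places, and a transition fires by consuming and producing tokens.
marking = {"idle": 1, "busy": 0, "done": 0}

# each transition: (places consumed, places produced)
transitions = {
    "start":  ({"idle": 1}, {"busy": 1}),
    "finish": ({"busy": 1}, {"done": 1}),
}

def enabled(t):
    pre, _ = transitions[t]
    return all(marking[p] >= k for p, k in pre.items())

def fire(t):
    assert enabled(t), f"transition {t} not enabled"
    pre, post = transitions[t]
    for p, k in pre.items():
        marking[p] -= k
    for p, k in post.items():
        marking[p] += k

fire("start")
fire("finish")
print(marking)   # {'idle': 0, 'busy': 0, 'done': 1}
```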
Formal Techniques for Synchronized Fault-Tolerant Systems
NASA Technical Reports Server (NTRS)
DiVito, Ben L.; Butler, Ricky W.
1992-01-01
We present the formal verification of synchronizing aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization is based on an extended state machine model incorporating snapshots of the local processors' clocks.
Automatic Methods and Tools for the Verification of Real Time Systems
1997-07-31
real-time systems. This was accomplished by extending techniques, based on automata theory and temporal logic, that have been successful for the verification of time-independent reactive systems. As a system specification language for embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous environment variables. As requirements specification languages, we introduced temporal logics with clock variables for expressing timing constraints.
NASA Technical Reports Server (NTRS)
1974-01-01
Shuttle simulation software modules in the environment, crew station, vehicle configuration and vehicle dynamics categories are discussed. For each software module covered, a description of the module functions and operational modes, its interfaces with other modules, its stored data, inputs, performance parameters and critical performance parameters is given. Reference data sources which provide standards of performance are identified for each module. Performance verification methods are also discussed briefly.
NASA Technical Reports Server (NTRS)
1991-01-01
The second phase of a task is described which has the ultimate purpose of ensuring that adequate Expert Systems (ESs) Verification and Validation (V and V) tools and techniques are available for Space Station Freedom Program Knowledge Based Systems development. The purpose of this phase is to recommend modifications to current software V and V requirements which will extend the applicability of the requirements to NASA ESs.
Dynamic analysis for shuttle design verification
NASA Technical Reports Server (NTRS)
Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.
1972-01-01
Two approaches used for determining the modes and frequencies of space shuttle structures are discussed. The first method, direct numerical analysis, involves finite element mathematical modeling of the space shuttle structure in order to use computer programs for dynamic structural analysis. The second method utilizes modal-coupling techniques: experimental verification is made by vibrating only spacecraft components and deducing the modes and frequencies of the complete vehicle from the results obtained in the component tests.
Secure Image Hash Comparison for Warhead Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruillard, Paul J.; Jarman, Kenneth D.; Robinson, Sean M.
2014-06-06
The effort to inspect and verify warheads in the context of possible future arms control treaties is rife with security and implementation issues. In this paper we review prior work on perceptual image hashing for template-based warhead verification. Furthermore, we formalize the notion of perceptual hashes and demonstrate that large classes of such functions are likely not cryptographically secure. We close with a brief discussion of fully homomorphic encryption as an alternative technique.
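To make the notion of a perceptual hash concrete, here is a minimal "average hash" sketch, a generic technique rather than the authors' scheme; it also illustrates why such functions are not cryptographically secure, since similar inputs deliberately map to nearby hashes. The 8x8 hash size and the synthetic grayscale arrays are assumptions.

```python
# Generic perceptual "average hash" sketch (not the authors' scheme):
# downsample, threshold against the mean, compare by Hamming distance.
import numpy as np

def average_hash(img, size=8):
    h, w = img.shape
    img = img[: size * (h // size), : size * (w // size)]   # trim to a multiple
    small = img.reshape(size, h // size, size, w // size).mean(axis=(1, 3))
    return (small > small.mean()).ravel()                   # size*size bits

def hamming(a, b):
    return int(np.count_nonzero(a != b))

rng = np.random.default_rng(1)
img = rng.random((128, 128))
similar = np.clip(img + rng.normal(0.0, 0.02, img.shape), 0, 1)
print(hamming(average_hash(img), average_hash(similar)))    # small distance
```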
Design and verification of distributed logic controllers with application of Petri nets
NASA Astrophysics Data System (ADS)
Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał; Wiśniewska, Monika
2015-12-01
The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. After that it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can be finally implemented.
A Methodology for Evaluating Artifacts Produced by a Formal Verification Process
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette
2011-01-01
The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault-tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.
Protecting Your Home from Bed Bugs
Take precautions such as checking secondhand furniture for signs of infestation before bringing it home, using mattress encasements, sealing cracks, installing door sweeps, and maintaining cleanliness.
Assessing fluorescent color: a review of common practices and their limitations
NASA Astrophysics Data System (ADS)
Streitel, Steve
2003-07-01
Fluorescent colorants are widely used around the world to enhance visibility. The outstanding brightness and cleanliness of the colors lend themselves to applications in safety materials, advertising, toys, magazines, packaging, and other areas. The brightness and cleanliness are a result of the colorants' ability to reradiate absorbed energy as visible light, usually re-emitting shorter, more energetic photons as longer, less energetic photons. This can give reflectance values of well over 100%, sometimes as high as 300%, in the perceived color. A good working definition of fluorescent color is: a colorant that absorbs light energy and reradiates the energy at visible wavelengths. Light that is not absorbed is reflected, as in conventional color. Emission ceases when the excitation energy is removed.
Muris, Peter; Mayer, Birgit; Huijding, Jorg; Konings, Tjeerd
2008-01-01
The present study investigated whether disgust-valenced information has an impact on children's fear beliefs about animals. Non-clinical children aged between 9 and 13 years (n=159) were presented with disgust-related and cleanliness-related information about unknown animals (Australian marsupials). Before and after the information, beliefs of disgust and fear regarding the animals were assessed. Results showed that disgust-related information not only induced higher levels of disgust but also increased children's fear beliefs in relation to these animals. Conversely, cleanliness-related information decreased levels of disgust and resulted in lower levels of fear. The implications for the role of disgust in the development of animal fear are briefly discussed.
A Mode-Shape-Based Fault Detection Methodology for Cantilever Beams
NASA Technical Reports Server (NTRS)
Tejada, Arturo
2009-01-01
An important goal of NASA's Integrated Vehicle Health Management program (IVHM) is to develop and verify methods and technologies for fault detection in critical airframe structures. A particularly promising new technology under development at NASA Langley Research Center is distributed Bragg fiber optic strain sensors. These sensors can be embedded in, for instance, aircraft wings to continuously monitor surface strain during flight. Strain information can then be used in conjunction with well-known vibrational techniques to detect faults due to changes in the wing's physical parameters or to the presence of incipient cracks. To verify the benefits of this technology, the Formal Methods Group at NASA LaRC has proposed the use of formal verification tools such as PVS. The verification process, however, requires knowledge of the physics and mathematics of the vibrational techniques and a clear understanding of the particular fault detection methodology. This report presents a succinct review of the physical principles behind the modeling of vibrating structures such as cantilever beams (the natural model of a wing). It also reviews two different classes of fault detection techniques and proposes a particular detection method for cracks in wings, which is amenable to formal verification. A prototype implementation of these methods using Matlab scripts is also described and is related to the fundamental theoretical concepts.
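For the flavor of the underlying physics, the sketch below evaluates the standard Euler-Bernoulli natural-frequency formula for a clamped-free (cantilever) beam, the class of models the report reviews; the aluminum-strip dimensions are illustrative assumptions, not values from the report.

```python
# Natural frequencies of a clamped-free Euler-Bernoulli beam:
# f_n = (beta_n L)^2 / (2 pi L^2) * sqrt(E I / (rho A))
import numpy as np

beta_L = np.array([1.8751, 4.6941, 7.8548])  # roots of cos(x)cosh(x) = -1

E    = 70e9           # Young's modulus, Pa (aluminum)
rho  = 2700.0         # density, kg/m^3
L    = 0.30           # beam length, m
b, t = 0.025, 0.003   # width and thickness, m
A    = b * t          # cross-section area, m^2
I    = b * t**3 / 12  # second moment of area, m^4

f = (beta_L**2 / (2 * np.pi * L**2)) * np.sqrt(E * I / (rho * A))
print([f"{fn:.1f} Hz" for fn in f])   # first three bending-mode frequencies
```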
Verification of intravenous catheter placement by auscultation--a simple, noninvasive technique.
Lehavi, Amit; Rudich, Utay; Schechtman, Moshe; Katz, Yeshayahu Shai
2014-01-01
Verification of proper placement of an intravenous catheter may not always be simple. We evaluated the auscultation technique for this purpose. Twenty healthy volunteers were randomized to have an 18G catheter inserted intravenously in either the right (12) or the left (8) arm, and subcutaneously in the opposite arm. A standard stethoscope was placed over an area approximately 3 cm proximal to the tip of the catheter in the presumed direction of the vein to grade, on a 0-6 scale, the murmur heard by rapidly injecting 2 mL of NaCl 0.9% solution. The auscultation was evaluated by a blinded staff anesthesiologist. All 20 intravenous injections were evaluated as flow murmurs and were graded an average 5.65 (±0.98), whereas all 20 subcutaneous injections were evaluated as either crackles or no sound and were graded an average 2.00 (±1.38), without negative results. Sensitivity was calculated as 95%. Specificity and Kappa could not be calculated due to an empty false-positive group. As it is simple, handy, and noninvasive, we recommend using the auscultation technique for verification of the proper placement of an intravenous catheter when uncertain of its position. Data obtained in our limited sample of healthy subjects need to be confirmed in the clinical setting.
Technical review of SRT-CMA-930058 revalidation studies of Mark 16 experiments: J70
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reed, R.L.
1993-10-25
This study is a reperformance of a set of MGBS-TGAL criticality safety code validation calculations previously reported by Clark. The reperformance was needed because the records of the previous calculations could not be located in current APG files and records. As noted by the author, preliminary attempts to reproduce the Clark results by direct modeling in MGBS and TGAL were unsuccessful. Consultation with Clark indicated that the MGBS-TGAL (EXPT) option within the KOKO system should be used to set up the MGBS and TGAL input data records. The results of the study indicate that the technique used by Clark has been established and that the technique is now documented for future use. File records of the calculations have also been established in APG files. The review was performed per QAP 11--14 of 1Q34. Since the reviewer was involved in developing the procedural technique used for this study, this review cannot be considered a fully independent review, but it should be considered a verification that the document contains adequate information to allow a new user to perform similar calculations, a verification of the procedure by performing several calculations independently with identical results to the reported results, and a verification of the readability of the report.
Ercoli, Carlo; Geminiani, Alessandro; Feng, Changyong; Lee, Heeje
2012-05-01
The purpose of this retrospective study was to assess if there was a difference in the likelihood of achieving passive fit when an implant-supported full-arch prosthesis framework is fabricated with or without the aid of a verification jig. This investigation was approved by the University of Rochester Research Subject Review Board (protocol #RSRB00038482). Thirty edentulous patients, 49 to 73 years old (mean 61 years old), rehabilitated with a nonsegmented fixed implant-supported complete denture were included in the study. During the restorative process, final impressions were made using the pickup impression technique and elastomeric impression materials. For 16 patients, a verification jig was made (group J), while for the remaining 14 patients, a verification jig was not used (group NJ) and the framework was fabricated directly on the master cast. During the framework try-in appointment, the fit was assessed by clinical (Sheffield test) and radiographic inspection and recorded as passive or nonpassive. When a verification jig was used (group J, n = 16), all frameworks exhibited clinically passive fit, while when a verification jig was not used (group NJ, n = 14), only two frameworks fit. This difference was statistically significant (p < .001). Within the limitations of this retrospective study, the fabrication of a verification jig ensured clinically passive fit of metal frameworks in nonsegmented fixed implant-supported complete denture. © 2011 Wiley Periodicals, Inc.
Survey of statistical techniques used in validation studies of air pollution prediction models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bornstein, R D; Anderson, S F
1979-03-01
Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.
Optical/digital identification/verification system based on digital watermarking technology
NASA Astrophysics Data System (ADS)
Herrigel, Alexander; Voloshynovskiy, Sviatoslav V.; Hrytskiv, Zenon D.
2000-06-01
This paper presents a new approach for the secure integrity verification of driver licenses, passports and other analogue identification documents. The system embeds (detects) the reference number of the identification document with the DCT watermark technology in (from) the photo of the identification document holder. During verification the reference number is extracted and compared with the reference number printed in the identification document. The approach combines optical and digital image processing techniques. The detection system must be able to scan an analogue driver license or passport, convert the image of this document into a digital representation and then apply the watermark verification algorithm to check the payload of the embedded watermark. If the payload of the watermark is identical to the printed visual reference number of the issuer, the verification is successful and the passport or driver license has not been modified. This approach constitutes a new class of application for the watermark technology, which was originally targeted at the copyright protection of digital multimedia data. The presented approach substantially increases the security of the analogue identification documents applied in many European countries.
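A minimal sketch of the general DCT-domain embedding idea follows; it is not the authors' algorithm. Payload bits are written into the signs of a few fixed mid-frequency DCT coefficients of the photo and read back at verification time. The coefficient positions, embedding strength, and synthetic photo are assumptions.

```python
# DCT-domain watermark sketch: encode each payload bit in the sign of a
# fixed mid-band coefficient, then recover the bits from the marked image.
import numpy as np
from scipy.fft import dctn, idctn

POS = [(3, 4), (4, 3), (5, 2), (2, 5)]   # fixed mid-band coefficient positions
ALPHA = 25.0                             # embedding strength

def embed(photo, bits):
    c = dctn(photo, norm="ortho")
    for (i, j), b in zip(POS, bits):
        c[i, j] = ALPHA if b else -ALPHA  # bit -> coefficient sign
    return idctn(c, norm="ortho")

def extract(photo):
    c = dctn(photo, norm="ortho")
    return [int(c[i, j] > 0) for (i, j) in POS]

rng = np.random.default_rng(2)
photo = rng.random((64, 64)) * 255
marked = embed(photo, [1, 0, 1, 1])
print(extract(marked))                   # [1, 0, 1, 1]
```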
Robotic Spent Fuel Monitoring – It is time to improve old approaches and old techniques!
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tobin, Stephen Joseph; Dasari, Venkateswara Rao; Trellue, Holly Renee
This report describes various approaches and techniques associated with robotic spent fuel monitoring. The purpose of this description is to improve the quality of measured signatures, reduce the inspection burden on the IAEA, and to provide frequent verification.
TH-B-204-03: TG-199: Implanted Markers for Radiation Treatment Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Z.
Implanted markers as target surrogates have been widely used for treatment verification, as they provide safe and reliable monitoring of inter- and intra-fractional target motion. The rapid advancement of technology requires a critical review and recommendations for the usage of implanted surrogates in the current field. The symposium, also reporting an update of AAPM TG 199 - Implanted Target Surrogates for Radiation Treatment Verification, will focus on all clinical aspects of using implanted target surrogates for treatment verification and related issues. A wide variety of markers available on the market will first be reviewed, including radiopaque markers, MRI-compatible markers, non-migrating coils, surgical clips, and electromagnetic transponders. The pros and cons of each kind will be discussed. The clinical applications of implanted surrogates will be presented based on different anatomical sites. For the lung, we will discuss gated treatments and 2D or 3D real-time fiducial tracking techniques. For the prostate, we will focus on 2D-3D and 3D-3D matching and electromagnetic transponder based localization techniques. For the liver, we will review techniques for patients under gating, shallow, or free breathing conditions. We will also review techniques for treating challenging breast cancer cases where deformation may occur. Finally, we will summarize potential issues related to the usage of implanted target surrogates with TG 199 recommendations. A review of fiducial migration and fiducial-derived target rotation in different disease sites will be provided. The issue of target deformation, especially near the diaphragm, and related suggestions will also be presented and discussed. Learning Objectives: Knowledge of a wide variety of markers; knowledge of their application for different disease sites; understanding of issues related to these applications. Z. Wang: Research funding support from Brainlab AG. Q. Xu: Consultant for Accuray planning service.
Advanced Software V&V for Civil Aviation and Autonomy
NASA Technical Reports Server (NTRS)
Brat, Guillaume P.
2017-01-01
With the advances in high-performance computing platforms (e.g., advanced graphics processing units or multi-core processors), computationally intensive software techniques such as the ones used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building safety at design time, like in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus help also reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.
Dosimetric changes with computed tomography automatic tube-current modulation techniques.
Spampinato, Sofia; Gueli, Anna Maria; Milone, Pietro; Raffaele, Luigi Angelo
2018-04-06
The study is aimed at verifying dose changes for a computed tomography automatic tube-current modulation (ATCM) technique. For this purpose, an anthropomorphic phantom and Gafchromic® XR-QA2 films were used. Radiochromic films were cut according to the shape of two thorax regions. The ATCM algorithm is based on a noise index (NI), and three exam protocols with different NI were chosen, of which one was a reference. Results were compared with dose values displayed by the console and with Poisson statistics. The information obtained with radiochromic films was normalized with respect to the NI reference value to compare dose percentage variations. Results showed that, on average, the information reported by the CT console and the calculated values coincide with measurements. The study allowed verification of the dose information reported by the CT console for an ATCM technique. Although this evaluation represents an estimate, the method can be a starting point for further studies.
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Y; Yin, F; Ren, L
Purpose: To develop an adaptive prior-knowledge-based image estimation method to reduce the scan angle needed in the LIVE system to reconstruct 4D-CBCT for intrafraction verification. Methods: The LIVE system has been previously proposed to reconstruct 4D volumetric images on-the-fly during arc treatment for intrafraction target verification and dose calculation. This system uses limited-angle beam's eye view (BEV) MV cine images acquired from the treatment beam together with orthogonally acquired limited-angle kV projections to reconstruct 4D-CBCT images for target verification during treatment. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scanning angle needed to reconstruct the CBCT images. This technique uses free-form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on projections acquired in a limited angle (orthogonal 6°) during the treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment to utilize the continuity of patient motion. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the efficacy of this technique with the LIVE system. A lung patient was simulated with different scenarios, including baseline drift, amplitude change, and phase shift. Limited-angle orthogonal kV and beam's eye view (BEV) MV projections were generated for each scenario. The CBCT images reconstructed from these projections were compared with the ground truth generated in XCAT. The volume percentage difference (VPD) and center-of-mass shift (COMS) were calculated between the reconstructed and ground-truth tumors to evaluate the reconstruction accuracy. Results: Using orthogonal-view 6° kV and BEV MV projections, the VPD/COMS values were 12.7±4.0%/0.7±0.5 mm, 13.0±5.1%/0.8±0.5 mm, and 11.4±5.4%/0.5±0.3 mm for the three scenarios, respectively. Conclusion: The technique enables LIVE to accurately reconstruct 4D-CBCT images using only an orthogonal 6° angle, which greatly improves the efficiency and reduces the dose of LIVE for intrafraction verification.
Optical detection of random features for high security applications
NASA Astrophysics Data System (ADS)
Haist, T.; Tiziani, H. J.
1998-02-01
Optical detection of random features, in combination with digital signatures based on public key codes, in order to recognize counterfeit objects will be discussed. Without applying expensive production techniques, objects are protected against counterfeiting. Verification is done off-line by optical means without a central authority. The method is applied to protecting banknotes. Experimental results for this application are presented. The method is also applicable to identity verification of a credit- or chip-card holder.
Abstract Model of the SATS Concept of Operations: Initial Results and Recommendations
NASA Technical Reports Server (NTRS)
Dowek, Gilles; Munoz, Cesar; Carreno, Victor A.
2004-01-01
An abstract mathematical model of the concept of operations for the Small Aircraft Transportation System (SATS) is presented. The Concept of Operations consists of several procedures that describe nominal operations for SATS. Several safety properties of the system are proven using formal techniques. The final goal of the verification effort is to show that under nominal operations, aircraft are safely separated. The abstract model was written and formally verified in the Prototype Verification System (PVS).
Restricted access processor - An application of computer security technology
NASA Technical Reports Server (NTRS)
Mcmahon, E. M.
1985-01-01
This paper describes a security guard device that is currently being developed by Computer Sciences Corporation (CSC). The methods used to provide assurance that the system meets its security requirements include the system architecture, a system security evaluation, and the application of formal and informal verification techniques. The combination of state-of-the-art technology and the incorporation of new verification procedures results in a demonstration of the feasibility of computer security technology for operational applications.
Formal Verification at System Level
NASA Astrophysics Data System (ADS)
Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.
2009-05-01
System Level Analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results obtained on SysML-based system level functional formal verification by an ESA/ESTEC study, carried out in collaboration between INTECS and La Sapienza University of Roma. The study focuses on SysML-based system level functional requirements techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bunch, Kyle J.; Williams, Laura S.; Jones, Anthony M.
The 2010 ratification of the New START Treaty has been widely regarded as a noteworthy national security achievement for both the Obama administration and the Medvedev-Putin regime, but deeper cuts are envisioned under future arms control regimes. Future verification needs will include monitoring the storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondaries, and other materials. From both the diplomatic and technical perspectives, verification under future arms control regimes will pose new challenges. Since acceptable verification technology must protect sensitive design information and attributes, non-nuclear, non-sensitive signatures may provide a significant verification tool without the use of additional information barriers. The use of electromagnetic signatures to monitor nuclear material storage containers is a promising technology with the potential to fulfill these challenging requirements. Research performed at Pacific Northwest National Laboratory (PNNL) has demonstrated that low-frequency electromagnetic signatures of sealed metallic containers can be used to confirm the presence of specific components on a "yes/no" basis without revealing classified information. Arms control inspectors might use this technique to verify the presence or absence of monitored items, including both nuclear and non-nuclear materials. Although additional research is needed to study signature aspects such as uniqueness and investigate container-specific scenarios, the technique potentially offers a rapid and cost-effective tool to verify reduction and dismantlement of U.S. and Russian nuclear weapons.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawrence, Chris C.; Flaska, Marek; Pozzi, Sara A.
2016-08-14
Verification of future warhead-dismantlement treaties will require detection of certain warhead attributes without the disclosure of sensitive design information, and this presents an unusual measurement challenge. Neutron spectroscopy, commonly eschewed as an ill-posed inverse problem, may hold special advantages for warhead verification by virtue of its insensitivity to certain neutron-source parameters like plutonium isotopics. In this article, we investigate the usefulness of unfolded neutron spectra obtained from organic-scintillator data for verifying a particular treaty-relevant warhead attribute: the presence of high-explosive and neutron-reflecting materials. Toward this end, several improvements on current unfolding capabilities are demonstrated: deuterated detectors are shown to have superior response-matrix condition to that of standard hydrogen-based scintillators; a novel data-discretization scheme is proposed which removes important detector nonlinearities; and a technique is described for re-parameterizing the unfolding problem in order to constrain the parameter space of solutions sought, sidestepping the inverse problem altogether. These improvements are demonstrated with trial measurements and verified using accelerator-based time-of-flight calculation of reference spectra. Then, a demonstration is presented in which the elemental compositions of low-Z neutron-attenuating materials are estimated to within 10%. These techniques could have direct application in verifying the presence of high-explosive materials in a neutron-emitting test item, as well as in other treaty verification challenges.
NASA Astrophysics Data System (ADS)
Lawrence, Chris C.; Febbraro, Michael; Flaska, Marek; Pozzi, Sara A.; Becchetti, F. D.
2016-08-01
Verification of future warhead-dismantlement treaties will require detection of certain warhead attributes without the disclosure of sensitive design information, and this presents an unusual measurement challenge. Neutron spectroscopy, commonly eschewed as an ill-posed inverse problem, may hold special advantages for warhead verification by virtue of its insensitivity to certain neutron-source parameters like plutonium isotopics. In this article, we investigate the usefulness of unfolded neutron spectra obtained from organic-scintillator data for verifying a particular treaty-relevant warhead attribute: the presence of high-explosive and neutron-reflecting materials. Toward this end, several improvements on current unfolding capabilities are demonstrated: deuterated detectors are shown to have superior response-matrix condition to that of standard hydrogen-based scintillators; a novel data-discretization scheme is proposed which removes important detector nonlinearities; and a technique is described for re-parameterizing the unfolding problem in order to constrain the parameter space of solutions sought, sidestepping the inverse problem altogether. These improvements are demonstrated with trial measurements and verified using accelerator-based time-of-flight calculation of reference spectra. Then, a demonstration is presented in which the elemental compositions of low-Z neutron-attenuating materials are estimated to within 10%. These techniques could have direct application in verifying the presence of high-explosive materials in a neutron-emitting test item, as well as in other treaty verification challenges.
Yassin, Ali A.
2014-01-01
Now, the security of digital images is considered more and more essential, and the fingerprint plays a main role in the world of imaging. Furthermore, fingerprint recognition is a scheme of biometric verification that applies pattern recognition techniques to the image of an individual's fingerprint. In the cloud environment, an adversary has the ability to intercept information, which must therefore be secured from eavesdroppers. Unluckily, encryption and decryption functions are slow and often hard. Fingerprint techniques require extra hardware and software and can be defeated by artificial gummy fingers (spoof attacks). Additionally, when a large number of users are being verified at the same time, the mechanism becomes slow. In this paper, we employed partial encryption of the user's fingerprint and the discrete wavelet transform to obtain a new scheme of fingerprint verification. Moreover, our proposed scheme can overcome those problems; it incurs low cost, reduces the computational requirements for huge volumes of fingerprint images, and resists well-known attacks. In addition, experimental results illustrate that our proposed scheme has good performance for user fingerprint verification. PMID:27355051
NASA Technical Reports Server (NTRS)
Pierzga, M. J.
1981-01-01
The experimental verification of an inviscid, incompressible through-flow analysis method is presented. The primary component of this method is an axisymmetric streamline curvature technique which is used to compute the hub-to-tip flow field of a given turbomachine. To analyze the flow field in the blade-to-blade plane of the machine, the potential flow solution of an infinite cascade of airfoils is also computed using a source model technique. To verify the accuracy of such an analysis method an extensive experimental verification investigation was conducted using an axial flow research fan. Detailed surveys of the blade-free regions of the machine along with intra-blade surveys using rotating pressure sensing probes and blade surface static pressure taps provide a one-to-one relationship between measured and predicted data. The results of this investigation indicate the ability of this inviscid analysis method to predict the design flow field of the axial flow fan test rotor to within a few percent of the measured values.
d'Errico, F; Chierici, A; Gattas-Sethi, M; Philippe, S; Goldston, R; Glaser, A
2018-04-25
In recent years, neutron detection with superheated emulsions has received renewed attention thanks to improved detector manufacturing and read-out techniques, and thanks to successful applications in warhead verification and special nuclear material (SNM) interdiction. Detectors are currently manufactured with methods allowing high uniformity of the drop sizes, which in turn allows the use of optical read-out techniques based on dynamic light scattering. Small detector cartridges arranged in 2D matrices are developed for the verification of a declared warhead without revealing its design. For this application, the enabling features of the emulsions are that bubbles formed at different times cannot be distinguished from each other, while the passive nature of the detectors avoids the susceptibility to electronic snooping and tampering. Large modules of emulsions are developed to detect the presence of shielded special nuclear materials hidden in cargo containers 'interrogated' with high energy X-rays. In this case, the enabling features of the emulsions are photon discrimination, a neutron detection threshold close to 3 MeV and a rate-insensitive read-out.
Authentication Based on Pole-zero Models of Signature Velocity
Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad
2013-01-01
With the increase of communication and financial transactions through the internet, on-line signature verification is an accepted biometric technology for access control and plays a significant role in authenticity and authorization in modernized society. Therefore, fast and precise algorithms for signature verification are very attractive. The goal of this paper is the modeling of the velocity signal, whose pattern and properties are stable for each person. Using pole-zero models based on the discrete cosine transform, a precise method is proposed for modeling, and features are then extracted from strokes. Using linear, Parzen-window, and support vector machine classifiers, the signature verification technique was tested with a large number of authentic and forgery signatures and has demonstrated the good potential of this technique. The signatures were collected from three different databases, including a proprietary database and the SVC2004 and Sabanci University signature (SUSIG) benchmark databases. Experimental results based on the Persian, SVC2004, and SUSIG databases show that our method achieves an equal error rate of 5.91%, 5.62%, and 3.91% on the skilled forgeries, respectively. PMID:24696797
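The velocity-modeling step can be illustrated with a short sketch, not the authors' exact pipeline: derive a pen-speed signal from sampled (x, y) coordinates and keep its low-order DCT coefficients as a fixed-length stroke descriptor. The sampling rate, feature count, and synthetic stroke are assumptions.

```python
# Sketch: fixed-length DCT descriptor of a signature's velocity signal.
import numpy as np
from scipy.fft import dct

def velocity_features(x, y, fs=100.0, n_coeff=16):
    vx = np.gradient(x) * fs              # per-axis velocity from samples
    vy = np.gradient(y) * fs
    v = np.hypot(vx, vy)                  # speed along the pen trajectory
    return dct(v, norm="ortho")[:n_coeff] # low-order DCT coefficients

t = np.linspace(0.0, 1.0, 200)
x = np.cos(2 * np.pi * 3 * t)             # synthetic stroke for illustration
y = np.sin(2 * np.pi * 2 * t)
print(velocity_features(x, y).round(2))
```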
SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baba, H; Tachibana, H; Kamima, T
2015-06-15
Purpose: AAPM TG114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT in seven institutes to show its feasibility. Methods: 384 IMRT plans for prostate and head and neck (HN) sites were collected from the institutes, where the planning was performed using Eclipse and Pinnacle3 with the two techniques of step and shoot (S&S) and sliding window (SW). All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based; CT images were used to compute the radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement in dose computed in patient CT images between the TPS and the SMU was assessed. The dose of the composite beams in the plan was evaluated. Results: The agreement between the measurement and the SMU was -2.3±1.9% and -5.6±3.6% for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was -2.1±1.9% and -3.0±3.7% for the prostate and HN sites, respectively. There was a negative systematic difference with similar standard deviation, and the difference was larger for the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot consider the dose under the MLC and therefore underestimates it. Conclusion: The accuracy would be improved if the Clarkson-based algorithm were modified for IMRT; the tolerance level would be within 5%.
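The acceptance step implied by the conclusion can be expressed as a trivial check, a sketch assuming the 5% action level suggested above; function and parameter names are illustrative, not from the SMU program.

```python
# Sketch: flag plans whose independent dose differs from the TPS dose by
# more than an assumed tolerance (5% per the conclusion above).
def check_plan(dose_tps, dose_indep, tolerance_pct=5.0):
    diff_pct = 100.0 * (dose_indep - dose_tps) / dose_tps
    return diff_pct, abs(diff_pct) <= tolerance_pct

diff, ok = check_plan(dose_tps=2.00, dose_indep=1.93)   # Gy per fraction
print(f"{diff:+.1f}% -> {'pass' if ok else 'investigate'}")
```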
NASA Astrophysics Data System (ADS)
Lin, Hsin-Hon; Chang, Hao-Ting; Chao, Tsi-Chian; Chuang, Keh-Shih
2017-08-01
In vivo range verification plays an important role in proton therapy to fully utilize the benefits of the Bragg peak (BP) for delivering a high radiation dose to the tumor while sparing normal tissue. For accurately locating the position of the BP, cameras equipped with collimators (multi-slit and knife-edge collimators) to image prompt gamma (PG) emitted along the proton tracks in the patient have been proposed for range verification. The aim of this work is to compare the performance of the multi-slit collimator and the knife-edge collimator for non-invasive proton beam range verification. PG imaging was simulated with a validated GATE/GEANT4 Monte Carlo code modeling the spot-scanning proton therapy and a cylindrical PMMA phantom in detail. For each spot, 10^8 protons were simulated. To investigate the correlation between the acquired PG profile and the proton range, the falloff regions of the PG profiles were fitted with a 3-line-segment curve function as the range estimate. Factors that may influence the accuracy of range detection, including the energy window setting, proton energy, phantom size, and phantom shift, were studied. Results indicated that both collimator systems achieve reasonable accuracy and good response to phantom shifts. The accuracy of the range predicted by the multi-slit collimator system is less affected by the proton energy, while the knife-edge collimator system can achieve higher detection efficiency, leading to a smaller deviation in predicting the range. We conclude that both collimator systems have potential for accurate range monitoring in proton therapy. It is noted that neutron contamination has a marked impact on the range prediction of the two systems, especially the multi-slit system. Therefore, a neutron reduction technique is needed to improve the accuracy of range verification in proton therapy.
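The falloff-fitting idea can be sketched as follows: fit a 3-line-segment model (plateau, linear falloff, background) to a noisy PG depth profile and take the falloff midpoint as the range estimate. The synthetic profile, initial guesses, and midpoint convention are illustrative assumptions, not the paper's implementation.

```python
# Sketch: 3-line-segment fit to a prompt-gamma depth profile.
import numpy as np
from scipy.optimize import curve_fit

def three_segment(z, z1, z2, top, bg):
    # plateau for z < z1, linear falloff between z1 and z2, background after z2
    slope = (bg - top) / (z2 - z1)
    return np.where(z < z1, top, np.where(z > z2, bg, top + slope * (z - z1)))

z = np.linspace(0, 200, 201)                     # depth, mm
rng = np.random.default_rng(3)
profile = three_segment(z, 118.0, 132.0, 1000.0, 50.0) + rng.normal(0, 20, z.size)

p0 = (110.0, 140.0, 900.0, 100.0)                # rough initial guess
(z1, z2, top, bg), _ = curve_fit(three_segment, z, profile, p0=p0)
print(f"estimated range ~ {(z1 + z2) / 2:.1f} mm")  # falloff midpoint
```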
Code of Federal Regulations, 2010 CFR
2010-01-01
... growing medium must consist of a substrate (a habitat and nutrient base) sterilized by heat treatment. (d... at a minimum: (1) Adequate hygiene; (2) Overall cleanliness; (3) Isolation or minimum contact...
Code of Federal Regulations, 2011 CFR
2011-01-01
... growing medium must consist of a substrate (a habitat and nutrient base) sterilized by heat treatment. (d... at a minimum: (1) Adequate hygiene; (2) Overall cleanliness; (3) Isolation or minimum contact...
Apprendre a apprendre: L'autocorrection (Learning to Learn: Self-Correction).
ERIC Educational Resources Information Center
Noir, Pascal
1996-01-01
A technique used in an advanced French writing class to encourage student self-correction is described. The technique focused on correction of verbs and their tenses; reduction of repetition; appropriate use of "on" and "nous;" and verification of possessive adjectives, negatives, personal pronouns, spelling, and punctuation.…
Inexpensive Eddy-Current Standard
NASA Technical Reports Server (NTRS)
Berry, Robert F., Jr.
1985-01-01
Radial crack replicas serve as evaluation standards. The technique entails intimately joining two pieces of appropriate aluminum alloy stock and centering a drilled hole through and along the interface. The bore surface of the hole presents two vertical stock interface lines 180 degrees apart. These lines serve as radial crack defect replicas during eddy-current technique setup and verification.
Relation of attitude toward body elimination to parenting style and attitude toward the body.
Corgiat, Claudia A; Templer, Donald I
2003-04-01
The purpose was to estimate the relation of attitude toward body elimination in 93 college students (27 men and 66 women) to authoritarian personality features, participants' perception of their mothers' parenting style, and attitudes toward cleanliness, sex, and family nudity. Subjects were administered the Body Elimination Attitude Scale, the Four-item F Scale, the Parental Authority Questionnaire Pertaining to Mothers, and the items "Sex is dirty," "Cleanliness is next to godliness," and "Children should never see other family members nude." Higher scores for disgust toward body elimination were associated with authoritarian personality characteristics, a lower likelihood of describing the mother's parenting style as authoritative (open communication), a higher likelihood of describing it as authoritarian, and lower tolerance for family nudity. Implications for further research were suggested.
Solomon, Hiwote; Solomon, Anthony W.; Freeman, Matthew C.
2018-01-01
Background Efforts are underway to scale-up the facial cleanliness and environmental improvement (F&E) components of the World Health Organization’s SAFE strategy for elimination of trachoma as a public health problem. Improving understanding of the F&E intervention landscape could inform advancements prior to scale-up, and lead to more effective and sustained behavior change. Methods/findings We systematically searched for relevant grey literature published from January 1965 through August 2016. Publications were eligible for review if they described interventions addressing F&E in the context of trachoma elimination programs. Subsequent to screening, we mapped attributes of F&E interventions. We then employed three behavior change frameworks to synthesize mapped data and identify potential intervention gaps. We identified 27 documents meeting inclusion criteria. With the exception of some recent programming, F&E interventions have largely focused on intermediate and distal antecedents of behavior change. Evidence from our analyses suggests many interventions are not designed to address documented determinants of improved F&E practices. No reviewed documents endorsed inclusion of intervention components related to behavioral maintenance or resilience–factors critical for sustaining improved behaviors. Conclusions If left unaddressed, identified gaps in intervention content may continue to challenge uptake and sustainability of improved F&E behaviors. Stakeholders designing and implementing trachoma elimination programs should review their F&E intervention content and delivery approaches with an eye toward improvement, including better alignment with established behavior change theories and empirical evidence. Implementation should move beyond information dissemination, and appropriately employ a variety of behavior change techniques to address more proximal influencers of change. PMID:29370169
Meshram, GK
2010-01-01
ABSTRACT Aim: To assess the cleaning efficacy of manual and automated instrumentation using 4% sodium hypochlorite singly and in combination with Glyde File Prep as root canal irrigant. Methodology: The study utilized 40 extracted human permanent premolars with single, straight and fully formed roots. The teeth were divided into four groups of ten each: Groups I and II were prepared by manual instruments with 4% sodium hypochlorite used as irrigant singly [Group I] or in combination with Glyde File Prep [Group II]; Groups III and IV were prepared by automated instruments at 250 rpm with 4% sodium hypochlorite as irrigant singly [Group III] or in combination with Glyde File Prep [Group IV]. After completion of root canal preparation, the teeth were prepared for SEM examination. The photomicrographs were qualitatively evaluated using the following criteria: overall cleanliness, presence or absence of the smear layer, presence or absence of debris, and patency of the openings of the dentinal tubules. Results: When comparing the cleansing efficacy of manual and automated instrumentation using 4% sodium hypochlorite alone, cleansing was better with manual instrumentation. When comparing the cleansing efficacy of manual and automated instrumentation using the combination regime, cleansing was better with automated instrumentation. When comparing the cleansing efficacy of manual instrumentation using 4% sodium hypochlorite singly and in combination with EDTA, the combination regime led to better cleansing. When comparing the cleansing efficacy of automated instrumentation using 4% sodium hypochlorite singly and in combination, the combination regime likewise led to better cleansing. Conclusion: Neither instrumentation technique nor irrigating regime was capable of providing a completely clean canal. Automated instrumentation with a combination of sodium hypochlorite and EDTA resulted in the best cleansing efficacy. PMID:27616839
NASA Astrophysics Data System (ADS)
Schwartz, R.
1994-01-01
Adsorption layers on stainless steel mass standards (OIML classes E1 and E2) have been determined directly and precisely by the optical method of ellipsometry as a function of relative humidity in the range 0.03 ≤ h ≤ 0.77, the relevant influencing factors being surface cleanliness, roughness, steel composition and ambient temperature. Under the same environmental conditions, two pairs of 1 kg artefacts, having geometrical surfaces differing in area by about ΔA = 390 cm², but the same material properties and surface finish as the mass standards, have been compared on a 1 kg mass comparator. The two independent measuring techniques yield strongly correlated results, the standard uncertainties of the measured surface coverings being
Structural Margins Assessment Approach
NASA Technical Reports Server (NTRS)
Ryan, Robert S.
1988-01-01
A general approach to the structural design and verification used to determine the structural margins of the space vehicle elements under Marshall Space Flight Center (MSFC) management is described. The Space Shuttle results and organization will be used as illustrations for the techniques discussed. Also given are (1) the system analyses performed or to be performed and (2) the element analyses performed by MSFC and its contractors. Analysis approaches and their verification will be addressed. The Shuttle procedures are general in nature and apply to space vehicles other than the Shuttle.
Knowledge-based system verification and validation
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1990-01-01
The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBS using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worchester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS for the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBS's cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBS's, as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBS's will be developed, and modification of the SSF to support these tools and methods will be addressed.
Assessing Requirements Quality through Requirements Coverage
NASA Technical Reports Server (NTRS)
Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt
2008-01-01
In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
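To make the MC/DC notion concrete, a small sketch follows that enumerates, for an assumed decision (a and b) or c chosen purely for illustration, the pairs of test cases in which exactly one condition flips and the decision outcome changes, i.e. the condition is shown to independently affect the outcome.

```python
# Sketch: enumerate MC/DC independence pairs for the assumed decision
# (a and b) or c. A pair of test cases covers a condition when only that
# condition differs between them and the decision outcome flips.
from itertools import product

def decision(a, b, c):
    return (a and b) or c

cases = list(product([False, True], repeat=3))
for i, u in enumerate(cases):
    for v in cases[i + 1:]:
        diff = [x != y for x, y in zip(u, v)]
        if sum(diff) == 1 and decision(*u) != decision(*v):
            cond = "abc"[diff.index(True)]
            print(f"{u} / {v}: condition '{cond}' independently affects the outcome")
```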
... duct stones Poor cleanliness in the mouth ( oral hygiene ) Low amounts of water in the body, most ... to help with recovery include: Practice good oral hygiene. Brush your teeth and floss well at least ...
La Belle, Jeffrey T; Fairchild, Aaron; Demirok, Ugur K; Verma, Aman
2013-05-15
There is a critical need for more accurate, highly sensitive and specific assays for disease diagnosis and management. A novel, multiplexed, single sensor using a rapid and label-free electrochemical impedance spectroscopy tuning method has been developed. The key challenge when monitoring multiple targets is frequency overlap. Here we describe methods to circumvent the overlap, tuning by use of nanoparticles (NPs), and discuss the various fabrication and characterization methods used to develop this technique. First, sensors were fabricated using printed circuit board (PCB) technology, and nickel and gold layers were electrodeposited onto the PCB sensors. An off-chip conjugation of gold NPs to molecular recognition elements (with a verification technique) is described as well. A standard covalent immobilization of the molecular recognition elements is also discussed, with quality control techniques. Finally, use and verification of sensitivity and specificity is also presented. By use of gold NPs of various sizes, we have demonstrated the possibility and shown little loss of sensitivity and specificity in the molecular recognition of inflammatory markers as "model" targets for our tuning system. By selection of other sized NPs or NPs of various materials, the tuning effect can be further exploited. The novel platform technology developed could be utilized in critical care, clinical management, and at-home health and disease management. Copyright © 2013 Elsevier Inc. All rights reserved.
Formal verification of an avionics microprocessor
NASA Technical Reports Server (NTRS)
Srivas, Mandayam K.; Miller, Steven P.
1995-01-01
Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.
ERIC Educational Resources Information Center
Watt, W. Bradford
2000-01-01
Discusses the use of seamless flooring in areas where cleanliness, waterproofing, and slip resistance are emphasized. Areas such as locker rooms, restrooms, kitchens and cafeterias, lobbies and hallways, multipurpose rooms, and walkways are considered. (GR)
Code of Federal Regulations, 2010 CFR
2010-01-01
... outer garments and caps (paper caps, hard hats, or hair nets acceptable) shall be worn to adequately protect the hair and beards when grown by all persons engaged in receiving, testing, processing milk...
Code of Federal Regulations, 2014 CFR
2014-01-01
... outer garments and caps (paper caps, hard hats, or hair nets acceptable) shall be worn to adequately protect the hair and beards when grown by all persons engaged in receiving, testing, processing milk...
Code of Federal Regulations, 2012 CFR
2012-01-01
... outer garments and caps (paper caps, hard hats, or hair nets acceptable) shall be worn to adequately protect the hair and beards when grown by all persons engaged in receiving, testing, processing milk...
Code of Federal Regulations, 2013 CFR
2013-01-01
... outer garments and caps (paper caps, hard hats, or hair nets acceptable) shall be worn to adequately protect the hair and beards when grown by all persons engaged in receiving, testing, processing milk...
Code of Federal Regulations, 2011 CFR
2011-01-01
... outer garments and caps (paper caps, hard hats, or hair nets acceptable) shall be worn to adequately protect the hair and beards when grown by all persons engaged in receiving, testing, processing milk...
Govier, J
2006-01-01
After investment in a clean room, cleanliness, sanitisation or sterility is essential to ensuring it operates at the highest standard. This article advises on the products and maintenance procedures to achieve this.
Xin, Yong; Wang, Jia-Yang; Li, Liang; Tang, Tian-You; Liu, Gui-Hong; Wang, Jian-She; Xu, Yu-Mei; Chen, Yong; Zhang, Long-Zhen
2012-01-01
To assess the feasibility of (18F)FDG PET/CT-guided dynamic intensity-modulated radiation therapy (IMRT) for nasopharyngeal carcinoma patients by dosimetric verification before treatment. Eleven patients with stage III-IVA nasopharyngeal carcinoma were treated with functional image-guided IMRT, and absolute and relative dosimetric verification was performed with a Varian 23EX linear accelerator, an ionization chamber, the 2DICA of the I'mRT Matrixx, and an IBA detachable phantom. Outlining and treatment planning were performed with different imaging techniques (CT and (18F)FDG PET/CT). The dose distributions of the various regions were realized by SMART. The absolute mean error in the region of interest was 2.39%±0.66 using a 0.6 cc ionization chamber. Using the DTA method, the average relative dose agreement within our protocol (3%, 3 mm) was 87.64% at 300 MU/min across all fields. Dosimetric verification before IMRT is obligatory and necessary. The ionization chamber and the 2DICA of the I'mRT Matrixx were effective dosimetric verification tools for primary focal hypermetabolism in functional image-guided dynamic IMRT for nasopharyngeal carcinoma. Our preliminary evidence indicates that functional image-guided dynamic IMRT is feasible.
A new verification film system for routine quality control of radiation fields: Kodak EC-L.
Hermann, A; Bratengeier, K; Priske, A; Flentje, M
2000-06-01
The use of modern irradiation techniques requires better verification films for determining set-up deviations and patient movements during the course of radiation treatment. This is an investigation of the image quality and time requirement of a new verification film system compared to a conventional portal film system. For conventional verifications we used Agfa Curix HT 1000 films, which were compared to the new Kodak EC-L film system. 344 Agfa Curix HT 1000 and 381 Kodak EC-L portal films of different tumor sites (prostate, rectum, head and neck) were visually judged on a light box by 2 experienced physicians. Subjective judgement of image quality, masking of films, and time requirement were checked. In this investigation, 68% of 175 Kodak EC-L ap/pa films were judged "good", 18% were classified "moderate", and 14% "poor"; but only 22% of 173 conventional ap/pa verification films (Agfa Curix HT 1000) were judged to be "good". The image quality, detail perception, and time required for film inspection of the new Kodak EC-L film system were significantly improved when compared with standard portal films. They could be read more accurately, and the detection of set-up deviations was facilitated.
Effect of information about animal welfare on consumer willingness to pay for yogurt.
Napolitano, F; Pacelli, C; Girolami, A; Braghieri, A
2008-03-01
This study aimed to verify whether consumers confirm their willingness to pay extra costs for higher animal welfare standards in a situation where a potential purchase is performed by consumers, such as a Vickrey auction. A 104-member consumer panel was asked to rate its willingness to pay (WTP) for plain and low-fat yogurts in 3 information conditions: tasting without information (blind WTP), information about animal welfare without tasting (expected WTP), and tasting with information about animal welfare (actual WTP). Information was provided to the consumers in the form of labels indicating the level of animal cleanliness and freedom of movement (5-point scale, from poor to very good). Consumers were influenced by information about low standards of animal welfare (low cleanliness and low freedom of movement) and moved their willingness to pay in the direction of their expectations. However, the discrepancy between expected and actual WTP was not totally assimilated, indicating that WTP was also expressed in relation to other aspects (e.g., the sensory properties of the products). Conversely, the information concerning high standards of animal welfare (high cleanliness and high freedom of movement) was able to affect expectations but had an effect on actual WTP only when the most acceptable yogurt was offered to the consumers. In the case of discordant information on animal welfare, partly indicating high levels of welfare (freedom of movement) and low levels of welfare (cleanliness), expected WTP was always lower than blind WTP. However, when the least acceptable product was presented, consumers completely assimilated their actual WTP to their expectations. Conversely, with the most acceptable yogurt, no assimilation occurred and sensory properties prevailed in orienting consumer WTP. Within each product, consumers expressed a higher WTP for products with labels indicating high welfare standards as compared with yogurts with labels reporting intermediate and low welfare standards. These results show that information about animal welfare, if given to the consumers, can be a major determinant of consumer WTP for animal-based food products. However, information about high standards of animal welfare should be paired with products presenting a good eating quality.
Watanabe, Reina; Shimoda, Tomoko; Yano, Rika; Hayashi, Yasuhiro; Nakamura, Shinji; Matsuo, Junji; Yamaguchi, Hiroyuki
2014-03-04
Hospital cleanliness in hospitals with a tendency toward long-term care in Japan remains unevaluated. We therefore visualized hospital cleanliness in Japan over a 2-month period by two distinct popular methods: ATP bioluminescence (ATP method) and the standard stamp agar method (stamp method). The surfaces of 752 sites within nurse and patient areas in three hospitals located in a central area of Sapporo, Japan were evaluated by the ATP and stamp methods, and each surface was sampled 8 times in 2 months. These areas were located in different ward units (Internal Medicine, Surgery, and Obstetrics and Gynecology). Detection limits for the ATP and stamp methods were determined by spike experiments with a diluted bacterial solution and a wipe test on student tables not in use during winter vacation, respectively. Values were expressed as the fold change over the detection limit, and a sample with a value higher than the detection limit by either method was defined as positive. The detection limits were determined to be 127 relative light units (RLU) per 100 cm2 for the ATP method and 5.3 colony-forming units (CFU) per 10 cm2 for the stamp method. The positive frequency of the ATP and stamp methods was 59.8% (450/752) and 47.7% (359/752), respectively, although no significant difference in the positive frequency among the hospitals was seen. Both methods revealed the presence of a wide range of organic contamination spread via hand touching, including microbial contamination, with a preponderance on the entrance floor and in patient rooms. Interestingly, the data of both methods indicated considerable variability regardless of daily visual assessment with usual wiping, and positive surfaces were irregularly seen. Nurse areas were relatively cleaner than patient areas. Finally, there was no significant correlation between the number of patients or medical personnel in the hospital and organic or microbiological contamination. Ongoing daily hospital cleanliness is not sufficient in Japanese hospitals with a tendency toward long-term care.
Richardson, Michael L; Petscavage, Jonelle M
2011-11-01
The sensitivity and specificity of magnetic resonance imaging (MRI) for diagnosis of meniscal tears has been studied extensively, with tears usually verified by surgery. However, surgically unverified cases are often not considered in these studies, leading to verification bias, which can falsely increase the sensitivity and decrease the specificity estimates. Our study suggests that such bias may be very common in the meniscal MRI literature, and illustrates techniques to detect and correct for such bias. PubMed was searched for articles estimating sensitivity and specificity of MRI for meniscal tears. These were assessed for verification bias, deemed potentially present if a study included any patients whose MRI findings were not surgically verified. Retrospective global sensitivity analysis (GSA) was performed when possible. Thirty-nine of the 314 studies retrieved from PubMed specifically dealt with meniscal tears. All 39 included unverified patients, and hence, potential verification bias. Only seven articles included sufficient information to perform GSA. Of these, one showed definite verification bias, two showed no bias, and four others showed bias within certain ranges of disease prevalence. Only 9 of 39 acknowledged the possibility of verification bias. Verification bias is underrecognized and potentially common in published estimates of the sensitivity and specificity of MRI for the diagnosis of meniscal tears. When possible, it should be avoided by proper study design. If unavoidable, it should be acknowledged. Investigators should tabulate unverified as well as verified data. Finally, verification bias should be estimated; if present, corrected estimates of sensitivity and specificity should be used. Our online web-based calculator makes this process relatively easy. Copyright © 2011 AUR. Published by Elsevier Inc. All rights reserved.
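The correction the authors call for can be illustrated with the standard Begg-Greenes adjustment, which assumes verification depends only on the MRI result. Below is a minimal Python sketch under that assumption; the cohort counts are hypothetical, not taken from the study.

```python
def begg_greenes(n_pos, n_neg, s1, r1, s0, r0):
    """Correct sensitivity/specificity for verification bias
    (Begg & Greenes 1983), assuming verification depends only on
    the MRI result.
    n_pos, n_neg : all MRI-positive / MRI-negative patients
    s1, r1       : verified MRI-positives with / without a tear
    s0, r0       : verified MRI-negatives with / without a tear
    """
    p_pos = s1 / (s1 + r1)          # P(tear | MRI+) among verified
    p_neg = s0 / (s0 + r0)          # P(tear | MRI-) among verified
    tp, fp = n_pos * p_pos, n_pos * (1 - p_pos)   # scaled to full cohort
    fn, tn = n_neg * p_neg, n_neg * (1 - p_neg)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical cohort: 200 MRI+ (150 verified), 300 MRI- (30 verified).
# Naive estimates from verified cases alone would be sens 130/135 = 0.96
# and spec 25/45 = 0.56; the corrected values differ noticeably.
print(begg_greenes(n_pos=200, n_neg=300, s1=130, r1=20, s0=5, r0=25))
```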
Compressive sensing using optimized sensing matrix for face verification
NASA Astrophysics Data System (ADS)
Oey, Endra; Jeffry; Wongso, Kelvin; Tommy
2017-12-01
Biometrics offers a solution to problems that occur with password-based data access, such as forgotten passwords and the difficulty of recalling many different passwords. With biometrics, the physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether the user has the authority to access the data or not. Facial biometrics was chosen for its low-cost implementation and reasonably accurate results for user identification. The face verification system adopted in this research uses the Compressive Sensing (CS) technique, which aims to reduce dimensionality as well as encrypt the facial test image, where the image is represented as a sparse signal. The encrypted data can be reconstructed using a sparse coding algorithm. Two sparse coding algorithms, Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are compared in this face verification research. The reconstructed sparse signal is then used to compute the Euclidean norm against the user's sparse signal previously saved in the system to determine the validity of the facial test image. The accuracies obtained in this research are 99% for IRLS with a verification response time of 4.917 seconds and 96.33% for OMP with a response time of 0.4046 seconds using a non-optimized sensing matrix, versus 99% for IRLS with a response time of 13.4791 seconds and 98.33% for OMP with a response time of 3.1571 seconds using an optimized sensing matrix.
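As a rough illustration of the verification step described above, here is a minimal Python sketch of OMP-based sparse reconstruction followed by a Euclidean-norm check. The sensing matrix, the sparse "face" signals, and the acceptance threshold are all synthetic stand-ins, not the paper's data or parameters.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: recover a k-sparse x with y ~ A @ x."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))  # most correlated column
        support.append(j)
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x

rng = np.random.default_rng(0)
n, m, k = 64, 256, 5                         # measurements, signal length, sparsity
A = rng.normal(size=(n, m)) / np.sqrt(n)     # random sensing matrix
x_enrolled = np.zeros(m)
x_enrolled[rng.choice(m, k, replace=False)] = 1.0
y = A @ x_enrolled                           # compressed "face" measurement
x_test = omp(A, y, k)                        # reconstruction at verification time
accept = np.linalg.norm(x_test - x_enrolled) < 0.1  # Euclidean-norm check
print(accept)
```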
NASA Technical Reports Server (NTRS)
Sung, Q. C.; Miller, L. D.
1977-01-01
Because of the difficulties of retrospective collection of representative ground control data, three methods were tested for collecting the training sets needed to establish the spectral signatures of the land uses/land covers sought. Computer preprocessing techniques applied to the digital images to improve the final classification results were geometric corrections, spectral band or image ratioing, and statistical cleaning of the representative training sets. A minimal level of statistical verification was made based upon comparisons between the airphoto estimates and the classification results. The verification provided further support for the selection of MSS bands 5 and 7. It also indicated that the maximum likelihood ratioing technique achieves classification results more consistent with the airphoto estimates than stepwise discriminant analysis.
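For readers unfamiliar with the classifier named above, the following Python sketch shows per-pixel Gaussian maximum-likelihood classification on two spectral bands. The class statistics and pixel values are illustrative only; the study's actual training sets and ratioed bands are not reproduced here.

```python
import numpy as np

def gaussian_ml_classify(pixels, class_means, class_covs):
    """Assign each pixel to the class with the highest Gaussian
    log-likelihood, using per-class mean vectors and covariances
    estimated from (cleaned) training sets."""
    scores = []
    for mu, cov in zip(class_means, class_covs):
        inv, logdet = np.linalg.inv(cov), np.linalg.slogdet(cov)[1]
        d = pixels - mu
        # quadratic form d^T inv d for every pixel row
        scores.append(-0.5 * (np.einsum('ij,jk,ik->i', d, inv, d) + logdet))
    return np.argmax(scores, axis=0)

# Two spectral bands (e.g., MSS bands 5 and 7), two illustrative classes
rng = np.random.default_rng(5)
means = [np.array([30.0, 60.0]), np.array([80.0, 20.0])]
covs = [np.eye(2) * 25.0, np.eye(2) * 25.0]
pixels = np.vstack([rng.multivariate_normal(m, c, 100) for m, c in zip(means, covs)])
print(np.bincount(gaussian_ml_classify(pixels, means, covs)))
```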
Expert system verification and validation study: ES V/V Workshop
NASA Technical Reports Server (NTRS)
French, Scott; Hamilton, David
1992-01-01
The primary purpose of this document is to build a foundation for applying principles of verification and validation (V&V) of expert systems. To achieve this, some background on V&V as applied to conventionally implemented software is required. Part one will discuss the background of V&V from the perspective of (1) what V&V of software is and (2) V&V's role in developing software. Part one will also overview some common analysis techniques that are applied when performing V&V of software. All of these materials will be presented based on the assumption that the reader has little or no background in V&V or in developing procedural software. The primary purpose of part two is to explain the major techniques that have been developed for V&V of expert systems.
24 CFR 982.401 - Housing quality standards (HQS).
Code of Federal Regulations, 2011 CFR
2011-04-01
... capable of maintaining a thermal environment healthy for the human body. (2) Acceptability criteria. (i... must be in proper operating condition, and adequate for personal cleanliness and the disposal of human...
Environmental Control and Life Support Systems
NASA Technical Reports Server (NTRS)
Engel, Joshua Allen
2017-01-01
The Environmental Control System provides a controlled air purge to Orion and SLS. The ECS performs this function by processing 100% ambient air while simultaneously controlling temperature, pressure, humidity, cleanliness and purge distribution.
Cleanliness evaluation of rough surfaces with diffuse IR reflectance
NASA Technical Reports Server (NTRS)
Pearson, L. H.
1995-01-01
Contamination on bonding surfaces has been determined to be a primary cause for degraded bond strength in certain solid rocket motor bondlines. Hydrocarbon and silicone based organic contaminants that are airborne or directly introduced to a surface are a significant source of contamination. Diffuse infrared (IR) reflectance has historically been used as an effective technique for detection of organic contaminants, however, common laboratory methods involving the use of a Fourier transform IR spectrometer (FTIR) are impractical for inspecting the large bonding surface areas found on solid rocket motors. Optical methods involving the use of acousto-optic tunable filters and fixed bandpass optical filters are recommended for increased data acquisition speed. Testing and signal analysis methods are presented which provide for simultaneous measurement of contamination concentration and roughness level on rough metal surfaces contaminated with hydrocarbons.
NASA Astrophysics Data System (ADS)
Alizadeh, A.; Parsafar, S.; Khodaei, M. M.
2017-03-01
A biocompatible method for synthesizing highly dispersed gold nanoparticles using Ferulago angulata leaf extract has been developed. It has been shown that the leaf extract acts as both reducing and coating agent. Various spectroscopic and electron microscopic techniques were employed for the structural characterization of the prepared nanoparticles. The biosynthesized particles were identified as elemental gold with spherical morphology, narrow size distribution (range 9.2-17.5 nm), and high stability. The effect of the initial ratio of precursors, temperature, and reaction time on the size and morphology of the nanoparticles was also studied in detail. It was observed that varying these parameters provides accessible control over the size and morphology of the nanoparticles. The uniqueness of this procedure lies in its cleanliness: no extra surfactant, reducing agent, or capping agent is used.
Surface inspection using FTIR spectroscopy
NASA Technical Reports Server (NTRS)
Powell, G. L.; Smyrl, N. R.; Williams, D. M.; Meyers, H. M., III; Barber, T. E.; Marrero-Rivera, M.
1995-01-01
The use of reflectance Fourier transform infrared (FTIR) spectroscopy as a tool for surface inspection is described. Laboratory instruments and portable instruments can support remote sensing probes that can map chemical contaminants on surfaces, with detection limits under the best of conditions in the sub-nanometer range (i.e., near absolute cleanliness), excellent performance in the sub-micrometer range, and useful performance for films tens of microns thick. Examples are given of discovering and quantifying contamination such as mineral oils and greases, vegetable oils, and silicone oils on aluminum foil, galvanized sheet steel, smooth aluminum tubing, and sandblasted 7075 aluminum alloy and D6AC steel. The ability to map in time and space the distribution of oil stains on metals is demonstrated. Techniques for quantitatively applying oils to metals and subsequently verifying the application are described, along with the nonlinear relationship between reflectance and the quantity of oil.
NASA Astrophysics Data System (ADS)
Acero, R.; Santolaria, J.; Pueo, M.; Aguilar, J. J.; Brau, A.
2015-11-01
High-range measuring equipment like laser trackers needs large-dimension calibrated reference artifacts in its calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge to be used as a reference. The generation of the virtual points and the reference lengths derived from them is linked to the concept of the indexed metrology platform and knowledge of the relative position and orientation of its upper and lower platforms with high accuracy. The measuring instrument together with the indexed metrology platform remains still, while the virtual mesh rotates around them. As a first step, the virtual distances technique is applied to a laser tracker in this work. The experimental verification procedure of the laser tracker with virtual distances is simulated and further compared with the conventional verification procedure of the laser tracker with the indexed metrology platform. The results obtained in terms of the volumetric performance of the laser tracker prove the suitability of the virtual distances methodology in calibration and verification procedures for portable coordinate measuring instruments, broadening and expanding the possibilities for the definition of reference distances in these procedures.
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
Optimized Temporal Monitors for SystemC
NASA Technical Reports Server (NTRS)
Tabakov, Deian; Rozier, Kristin Y.; Vardi, Moshe Y.
2012-01-01
SystemC is a modeling language built as an extension of C++. Its growing popularity and the increasing complexity of designs have motivated research efforts aimed at the verification of SystemC models using assertion-based verification (ABV), where the designer asserts properties that capture the design intent in a formal language such as PSL or SVA. The model then can be verified against the properties using runtime or formal verification techniques. In this paper we focus on automated generation of runtime monitors from temporal properties. Our focus is on minimizing runtime overhead, rather than monitor size or monitor-generation time. We identify four issues in monitor generation: state minimization, alphabet representation, alphabet minimization, and monitor encoding. We conduct extensive experimentation and identify a combination of settings that offers the best performance in terms of runtime overhead.
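To make the idea of a runtime monitor concrete, here is a small Python sketch (rather than the paper's C++/SystemC setting) of a monitor for a bounded-response property, "every req is followed by an ack within k steps". The property and trace are invented for illustration; generated monitors would be derived from PSL or SVA and optimized for encoding and alphabet size as the paper describes.

```python
class BoundedResponseMonitor:
    """Runtime monitor for G(req -> F[0,k] ack): every 'req' must be
    followed by an 'ack' within k steps. Automaton-style sketch."""

    def __init__(self, k):
        self.k = k
        self.pending = []          # time budget left for each open request

    def step(self, req, ack):
        """Consume one observation; return False once the property is violated."""
        if ack:
            self.pending.clear()   # all outstanding requests satisfied
        else:
            self.pending = [t - 1 for t in self.pending]
        if req and not ack:
            self.pending.append(self.k)
        return all(t >= 0 for t in self.pending)

m = BoundedResponseMonitor(k=2)
trace = [(1, 0), (0, 0), (0, 1), (1, 0), (0, 0), (0, 0), (0, 0)]
print([m.step(req, ack) for req, ack in trace])  # last step reports a violation
```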
Precision Cleaning - Path to Premier
NASA Technical Reports Server (NTRS)
Mackler, Scott E.
2008-01-01
ITT Space Systems Division's new Precision Cleaning facility provides critical cleaning and packaging of aerospace flight hardware and optical payloads to meet customer performance requirements. The Precision Cleaning Path to Premier Project was a 2007 capital project and is a key element in the approved Premier Resource Management - Integrated Supply Chain Footprint Optimization Project. Formerly, precision cleaning was located offsite in a leased building. A new facility equipped with modern precision cleaning equipment, including advanced process analytical technology and improved capabilities, was designed and built after outsourcing solutions were investigated and found lacking in the ability to meet quality specifications and schedule needs. SSD cleans parts that can range in size from a single threaded fastener all the way up to large composite structures. Materials that can be processed include optics, composites, metals, and various high-performance coatings. We are required to provide verification to our customers that we have met their particulate and molecular cleanliness requirements, and we have that analytical capability in this new facility. The new facility footprint is approximately half the size of the former leased operation and provides double the throughput. Process improvements and new cleaning equipment are projected to increase first-pass yield from 78% to 98%, avoiding $300K+/yr in rework costs. Cost avoidance of $350K/yr will result from elimination of rent, IT services, and transportation, and from decreased utility costs. Savings due to reduced staff are expected to net $400-500K/yr.
NASA Astrophysics Data System (ADS)
Zamani, K.; Bombardelli, F. A.
2013-12-01
The ADR (advection-diffusion-reaction) equation describes many physical phenomena of interest in the field of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation may become nonlinear. For that reason, numerical tools are the only practical choice for solving these PDEs. All numerical solvers developed for the transport equation need to undergo a code verification procedure before they are put into practice. Code verification is a mathematical activity to uncover failures and check for rigorous discretization of PDEs and implementation of initial/boundary conditions. In the context of computational PDEs, verification is not a well-defined procedure with a clear path. Thus, verification tests should be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution. Even test results need to be mathematically analyzed to distinguish between an inherent limitation of the algorithm and a coding error. Therefore, it is well known that code verification remains something of an art, in which innovative methods and case-based tricks are very common. This study presents full verification of a general transport code. To that end, a complete test suite is designed to probe the ADR solver comprehensively and discover all possible imperfections. In this study we convey our experiences in finding several errors which were not detectable with routine verification techniques. We developed a test suite including hundreds of unit tests and system tests. The test package increases gradually in complexity, such that tests start simple and build up to the most sophisticated level. Appropriate verification metrics are defined for the required capabilities of the solver as follows: mass conservation, convergence order, capability in handling stiff problems, nonnegative concentration, shape preservation, and absence of spurious wiggles. Thereby, we provide objective, quantitative values as opposed to subjective qualitative descriptions such as 'weak' or 'satisfactory' agreement with those metrics. We start testing from a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity, and reactions. For all of the mentioned cases we conduct mesh convergence tests. These tests compare the results' observed order of accuracy against the formal order of accuracy of the discretization. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh convergence study and appropriate remedies are also discussed. For the cases in which appropriate benchmarks for a mesh convergence study are not available, we utilize symmetry, complete Richardson extrapolation, and the method of false injection to uncover bugs. Detailed discussions of the capabilities of the mentioned code verification techniques are given. Auxiliary subroutines for automation of the test suite and report generation were designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the ADR solver's capabilities. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport.
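The mesh-convergence metric described above, comparing observed against formal order of accuracy, can be sketched in a few lines of Python. The demo applies it to a toy second-order central-difference approximation rather than a full ADR solver.

```python
import numpy as np

def observed_order(errors, r=2.0):
    """Observed convergence order from error norms on successively
    refined grids (refinement ratio r): p = log(e_coarse/e_fine)/log(r)."""
    e = np.asarray(errors)
    return np.log(e[:-1] / e[1:]) / np.log(r)

# Demo: verify the formal 2nd-order central difference for f'(x)
f, dfdx, x0 = np.sin, np.cos, 1.0
hs = [0.1 / 2**i for i in range(5)]
errors = [abs((f(x0 + h) - f(x0 - h)) / (2 * h) - dfdx(x0)) for h in hs]
print(observed_order(errors))   # should approach 2.0 as h -> 0
```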
NASA Astrophysics Data System (ADS)
Petric, Martin Peter
This thesis describes the development and implementation of a novel method for the dosimetric verification of intensity modulated radiation therapy (IMRT) fields with several advantages over current techniques. Through the use of a tissue equivalent plastic scintillator sheet viewed by a charge-coupled device (CCD) camera, this method provides a truly tissue equivalent dosimetry system capable of efficiently and accurately performing field-by-field verification of IMRT plans. This work was motivated by an initial study comparing two IMRT treatment planning systems. The clinical functionality of BrainLAB's BrainSCAN and Varian's Helios IMRT treatment planning systems were compared in terms of implementation and commissioning, dose optimization, and plan assessment. Implementation and commissioning revealed differences in the beam data required to characterize the beam prior to use with the BrainSCAN system requiring higher resolution data compared to Helios. This difference was found to impact on the ability of the systems to accurately calculate dose for highly modulated fields, with BrainSCAN being more successful than Helios. The dose optimization and plan assessment comparisons revealed that while both systems use considerably different optimization algorithms and user-control interfaces, they are both capable of producing substantially equivalent dose plans. The extensive use of dosimetric verification techniques in the IMRT treatment planning comparison study motivated the development and implementation of a novel IMRT dosimetric verification system. The system consists of a water-filled phantom with a tissue equivalent plastic scintillator sheet built into the top surface. Scintillation light is reflected by a plastic mirror within the phantom towards a viewing window where it is captured using a CCD camera. Optical photon spread is removed using a micro-louvre optical collimator and by deconvolving a glare kernel from the raw images. Characterization of this new dosimetric verification system indicates excellent dose response and spatial linearity, high spatial resolution, and good signal uniformity and reproducibility. Dosimetric results from square fields, dynamic wedged fields, and a 7-field head and neck IMRT treatment plan indicate good agreement with film dosimetry distributions. Efficiency analysis of the system reveals a 50% reduction in time requirements for field-by-field verification of a 7-field IMRT treatment plan compared to film dosimetry.
[Cleanliness in the operating room].
Suzuki, Toshiyasu
2010-05-01
With regard to recent findings on cleanliness in the operating room, concerning handwashing before performing operations, the traditional method of excessive scrubbing using a brush is not effective, and handwashing using only an alcohol-containing antiseptic hand rub (rubbing method) has become common practice. Use of a brush has already been abolished in some medical institutions. In addition, the use of sterilized water for handwashing before operations has no scientific basis, and use of tap water is considered sufficient. Furthermore, the concept of operating room zoning has also undergone a dramatic change. It was discovered that a layout focusing on work efficiency is more desirable than one that follows an excessively rigid zoning pattern. A one-footwear system not requiring a change of shoes also has various advantages in improving the efficiency of the operating room, and this is thought to become commonplace in the future.
Evaluation of alternative cleaners for solder flux and mold release removal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lopez, E.P.; Peebles, D.E.; Reich, J.E.
1991-01-01
As part of a solvent substitution program, an evaluation of selected alternative cleaners for solder flux and mold release removal has been performed. Six cleaners were evaluated for their efficiency in removing a rosin mildly activated flux and a silicone mold release from copper, 17-4PH stainless steel, polyimide quartz glass, and tin-lead surfaces. A parallel effort also studied deionized water removal of organic acid fluxes. Auger electron spectroscopy and X-ray photoelectron spectroscopy were used to determine the relative elemental cleanliness of the outermost atomic layers. An Omega Meter Test was used to measure residual ionic contamination. Water drop contact angles were used to measure the effectiveness of silicone removal from Cu substrates. In most cases, the cleanliness levels were good to excellent. 10 refs., 8 figs., 19 tabs.
NASA Technical Reports Server (NTRS)
Perey, D. F.
1996-01-01
Many industrial and aerospace processes involving the joining of materials require sufficient surface cleanliness to ensure proper bonding. Processes as diverse as painting, welding, or the soldering of electronic circuits will be compromised if prior inspection and removal of surface contaminants are inadequate. As process requirements become more stringent and the number of different materials and identified contaminants increases, various instruments and techniques have been developed for improved inspection. One such technique, based on the principle of Optically Stimulated Electron Emission (OSEE), has been explored for a number of years as a tool for surface contamination monitoring. Some of the benefits of OSEE are that it is non-contacting, requires little operator training, and has very high contamination sensitivity. This paper describes the development of a portable OSEE-based surface contamination monitor. The instrument is suitable for both hand-held and robotic inspections with either manual or automated control of instrument operation. In addition, instrument output data is visually displayed to the operator and may be sent to an external computer for archiving or analysis.
Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications
NASA Technical Reports Server (NTRS)
Boghosian, Mary; Narvaez, Pablo; Herman, Ray
2012-01-01
The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with the Jet Propulsion Laboratory (JPL) in the implementation of the magnetic cleanliness program of the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development up through system-level environmental testing. The JUNO magnetic cleanliness program required setting up a specialized magnetic test facility at Lockheed Martin Space Systems for testing the flight system, and a testing program with a facility for testing system parts and subsystems at JPL. The magnetic modeling, simulation, and analysis capability was set up and performed by Aerospace to provide qualitative and quantitative magnetic assessments of the magnetic parts, components, and subsystems prior to or in lieu of magnetic tests. Because of the sensitive nature of the fields-and-particles scientific measurements being conducted by the JUNO space mission to Jupiter, the imposition of stringent magnetic control specifications required a magnetic control program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not magnetically contaminated by flight system magnetic interferences. With Aerospace's magnetic modeling, simulation, and analysis, JPL's system modeling and testing approach, and LMSS's test support, the project took a cost-effective approach to achieving a magnetically clean spacecraft. This paper presents lessons learned from the JUNO magnetic testing approach and Aerospace's modeling, simulation, and analysis activities, used to solve problems such as remnant magnetization and the performance of hard and soft magnetic materials within the targeted space system in applied external magnetic fields.
Cleanliness of disposable vs nondisposable electrocardiography lead wires in children.
Addison, Nancy; Quatrara, Beth; Letzkus, Lisa; Strider, David; Rovnyak, Virginia; Syptak, Virginia; Fuzy, Lisa
2014-09-01
Mediastinitis costs hospitals thousands of dollars a year and increases the incidence of patient morbidity and mortality. No studies have been done to evaluate adenosine triphosphate (ATP) counts on disposable and nondisposable electrocardiography (ECG) lead wires in pediatric patients. To compare the cleanliness of disposable and nondisposable ECG lead wires in postoperative pediatric cardiac surgery patients by measuring the quantity of ATP (in relative luminescence units [RLUs]). ATP levels correlate with microbial cell counts and are used by institutions to assess hospital equipment and cleanliness. A prospective, randomized trial was initiated with approval from the institutional review board. Verbal consent was obtained from the parents/guardians for each patient. Trained nurses performed ATP swabs on the right and left upper ECG cables on postoperative days 1, 2, and 3. This study enrolled 51 patients. The disposable ECG lead wire ATP count on postoperative day 1 (median, 157 RLUs) was significantly lower (P < .001) than the count for nondisposable ATP lead wires (median, 610 RLUs). On postoperative day 2, the ATP count for the disposable ECG lead wires (median, 200 RLUs) was also lower (P = .06) than the count for the nondisposable ECG lead wires (median, 453 RLUs). Results of this study support the use of disposable ECG lead wires in postoperative pediatric cardiac surgery patients for at least the first 48 hours as a direct strategy to reduce the ATP counts on ECG lead wires. ©2014 American Association of Critical-Care Nurses.
The enerMENA meteorological network - Solar radiation measurements in the MENA region
NASA Astrophysics Data System (ADS)
Schüler, D.; Wilbert, S.; Geuder, N.; Affolter, R.; Wolfertstetter, F.; Prahl, C.; Röger, M.; Schroedter-Homscheidt, M.; Abdellatif, G.; Guizani, A. Allah; Balghouthi, M.; Khalil, A.; Mezrhab, A.; Al-Salaymeh, A.; Yassaa, N.; Chellali, F.; Draou, D.; Blanc, P.; Dubranna, J.; Sabry, O. M. K.
2016-05-01
For solar resource assessment of solar power plants and adjustment of satellite data, high-accuracy ground measurements of irradiance and ancillary meteorological data are needed. For the MENA region (Middle East and North Africa), which is of high importance for concentrating solar power applications, only two publicly available ground measurement stations existed so far (BSRN network). This gap has been filled by ten stations in Morocco, Algeria, Tunisia, Egypt, and Jordan. In this publication the data quality is analyzed by evaluating data completeness and the cleanliness of irradiance sensors in comparison across all of the stations. The pyrheliometers have an average cleanliness of 99.2% for week-daily cleaning. This is five times the effort required for Rotating Shadowband Irradiometer (RSI) stations, which nevertheless have a slightly higher average cleanliness of 99.3% for weekly cleaning. Furthermore, RSI stations show a data completeness of 99.4% compared to 93.6% at the stations equipped with thermal sensors. The results of this analysis are used to derive conclusions concerning instrument choice and are hence also applicable to other solar radiation measurements outside the enerMENA network. It turns out that RSIs are the more reliable and robust choice in cases of high soiling and rare station visits for cleaning and maintenance, as is usual at desert sites. Furthermore, annual direct normal and global horizontal irradiation as well as average meteorological parameters are calculated for all of the stations.
Fog dispersion. [charged particle technique
NASA Technical Reports Server (NTRS)
Christensen, L. S.; Frost, W.
1980-01-01
The concept of using the charged particle technique to disperse warm fog at airports is investigated and compared with other techniques. The charged particle technique shows potential for warm fog dispersal, but experimental verification of several significant parameters, such as particle mobility and charge density, is needed. Seeding and helicopter downwash techniques are also effective for warm fog dispersal, but presently are not believed to be viable techniques for routine airport operations. Thermal systems are currently used at a few overseas airports; however, they are expensive and pose potential environmental problems.
Optical security verification for blurred fingerprints
NASA Astrophysics Data System (ADS)
Soon, Boon Y.; Karim, Mohammad A.; Alam, Mohammad S.
1998-12-01
Optical fingerprint security verification is gaining popularity, as it has the potential to perform correlation at the speed of light. With advancement in optical security verification techniques, the authentication process can be made almost foolproof and reliable for financial transactions, banking, etc. In law enforcement, when a fingerprint is obtained from a crime scene, it may be blurred and can be an unhealthy candidate for correlation purposes. Therefore, the blurred fingerprint needs to be clarified before it is used for the correlation process. There are several different types of blur, such as linear motion blur and defocus blur, induced by aberration of the imaging system. In addition, we may or may not know the blur function. In this paper, we propose non-singularity inverse filtering in the frequency/power domain for deblurring known motion-induced blur in fingerprints. This filtering process is incorporated with the power spectrum subtraction technique, a uniqueness comparison scheme, and the separated target and reference planes method in the joint transform correlator. The proposed hardware implementation is a hybrid electronic-optical correlator system. The performance of the proposed system is verified with computer simulation for both cases: with and without additive random noise corruption.
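The paper's exact non-singularity filter is not spelled out in the abstract; the following Python sketch uses a common regularized (pseudo-inverse) frequency-domain filter on a synthetic, known motion blur to illustrate the deblurring step. Image sizes, the blur length, and the regularization constant are all illustrative.

```python
import numpy as np

def inverse_filter(blurred, kernel, eps=1e-3):
    """Regularized inverse filtering in the frequency domain:
    F_hat = G * conj(H) / (|H|^2 + eps), which avoids division by
    near-zero values of the blur transfer function H."""
    H = np.fft.fft2(kernel, s=blurred.shape)
    G = np.fft.fft2(blurred)
    F_hat = G * np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.real(np.fft.ifft2(F_hat))

# Synthetic example: horizontal motion blur of length 9 on a random "print"
rng = np.random.default_rng(1)
img = rng.random((64, 64))
kernel = np.zeros((64, 64))
kernel[0, :9] = 1 / 9                       # known linear motion blur
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kernel)))
restored = inverse_filter(blurred, kernel)
print(np.abs(restored - img).max())         # residual error stays small
```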
Hybrid Decompositional Verification for Discovering Failures in Adaptive Flight Control Systems
NASA Technical Reports Server (NTRS)
Thompson, Sarah; Davies, Misty D.; Gundy-Burlet, Karen
2010-01-01
Adaptive flight control systems hold tremendous promise for maintaining the safety of a damaged aircraft and its passengers. However, most currently proposed adaptive control methodologies rely on online learning neural networks (OLNNs), which necessarily have the property that the controller is changing during the flight. These changes tend to be highly nonlinear, and difficult or impossible to analyze using standard techniques. In this paper, we approach the problem with a variant of compositional verification. The overall system is broken into components. Undesirable behavior is fed backwards through the system. Components which can be solved explicitly using formal methods techniques for the ranges of safe and unsafe input bounds are treated as white box components. The remaining black box components are analyzed with heuristic techniques that try to predict a range of component inputs that may lead to unsafe behavior. The composition of these component inputs throughout the system leads to overall system test vectors that may elucidate the undesirable behavior.
Generating Models of Infinite-State Communication Protocols Using Regular Inference with Abstraction
NASA Astrophysics Data System (ADS)
Aarts, Fides; Jonsson, Bengt; Uijen, Johan
In order to facilitate model-based verification and validation, effort is underway to develop techniques for generating models of communication system components from observations of their external behavior. Most previous such work has employed regular inference techniques which generate modest-size finite-state models. They typically suppress parameters of messages, although these have a significant impact on control flow in many communication protocols. We present a framework, which adapts regular inference to include data parameters in messages and states for generating components with large or infinite message alphabets. A main idea is to adapt the framework of predicate abstraction, successfully used in formal verification. Since we are in a black-box setting, the abstraction must be supplied externally, using information about how the component manages data parameters. We have implemented our techniques by connecting the LearnLib tool for regular inference with the protocol simulator ns-2, and generated a model of the SIP component as implemented in ns-2.
Image Hashes as Templates for Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janik, Tadeusz; Jarman, Kenneth D.; Robinson, Sean M.
2012-07-17
Imaging systems can provide measurements that confidently assess characteristics of nuclear weapons and dismantled weapon components, and such assessment will be needed in future verification for arms control. Yet imaging is often viewed as too intrusive, raising concern about the ability to protect sensitive information. In particular, the prospect of using image-based templates for verifying the presence or absence of a warhead, or of the declared configuration of fissile material in storage, may be rejected out-of-hand as being too vulnerable to violation of information barrier (IB) principles. Development of a rigorous approach for generating and comparing reduced-information templates from images, and assessing the security, sensitivity, and robustness of verification using such templates, are needed to address these concerns. We discuss our efforts to develop such a rigorous approach based on a combination of image-feature extraction and encryption, utilizing hash functions to confirm proffered declarations, providing strong classified data security while maintaining high confidence for verification. The proposed work is focused on developing secure, robust, tamper-sensitive, and automatic techniques that may enable the comparison of non-sensitive hashed image data outside an IB. It is rooted in research on so-called perceptual hash functions for image comparison, at the interface of signal/image processing, pattern recognition, cryptography, and information theory. Such perceptual or robust image hashing (which, strictly speaking, is not truly cryptographic hashing) has extensive application in content authentication and information retrieval, database search, and security assurance. Applying and extending the principles of perceptual hashing to imaging for arms control, we propose techniques that are sensitive to altering, forging, and tampering of the imaged object yet robust and tolerant to content-preserving image distortions and noise. Ensuring that the information contained in the hashed image data (available outside the IB) cannot be used to extract sensitive information about the imaged object is of primary concern. Thus the techniques are characterized by high unpredictability to guarantee security. We will present an assessment of the performance of our techniques with respect to security, sensitivity, and robustness on the basis of a methodical and mathematically precise framework.
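As a toy illustration of perceptual hashing for template comparison, the Python sketch below computes a block-average hash and compares images by Hamming distance. It captures the robustness/sensitivity trade-off only in miniature and none of the security properties the authors require; the images here are synthetic.

```python
import numpy as np

def average_hash(img, hash_size=8):
    """Toy perceptual hash: downsample by block-averaging, then
    threshold each block against the mean to get a bit vector."""
    h, w = img.shape
    bh, bw = h // hash_size, w // hash_size
    small = (img[:bh * hash_size, :bw * hash_size]
             .reshape(hash_size, bh, hash_size, bw)
             .mean(axis=(1, 3)))
    return (small > small.mean()).astype(np.uint8).ravel()

def hamming(a, b):
    return int(np.count_nonzero(a != b))

rng = np.random.default_rng(2)
declared = rng.random((128, 128))                           # "template" image
noisy = declared + 0.02 * rng.normal(size=declared.shape)   # benign distortion
tampered = declared.copy()
tampered[32:96, 32:96] = 0                                  # altered object
print(hamming(average_hash(declared), average_hash(noisy)))     # typically small
print(hamming(average_hash(declared), average_hash(tampered)))  # large
```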
Bleeker, H J; Lewin, P A
2000-01-01
A new calibration technique for PVDF ultrasonic hydrophone probes is described. Current implementation of the technique allows determination of hydrophone frequency response between 2 and 100 MHz and is based on the comparison of theoretically predicted and experimentally determined pressure-time waveforms produced by a focused, circular source. The simulation model was derived from the time domain algorithm that solves the non linear KZK (Khokhlov-Zabolotskaya-Kuznetsov) equation describing acoustic wave propagation. The calibration technique data were experimentally verified using independent calibration procedures in the frequency range from 2 to 40 MHz using a combined time delay spectrometry and reciprocity approach or calibration data provided by the National Physical Laboratory (NPL), UK. The results of verification indicated good agreement between the results obtained using KZK and the above-mentioned independent calibration techniques from 2 to 40 MHz, with the maximum discrepancy of 18% at 30 MHz. The frequency responses obtained using different hydrophone designs, including several membrane and needle probes, are presented, and it is shown that the technique developed provides a desirable tool for independent verification of primary calibration techniques such as those based on optical interferometry. Fundamental limitations of the presented calibration method are also examined.
NASA Technical Reports Server (NTRS)
Nicks, Oran W.; Korkan, Kenneth D.
1991-01-01
Two reports on student activities to determine the properties of a new laminar airfoil, delivered at a conference on soaring technology, are presented. The papers discuss a wind tunnel investigation and analysis of the SM701 airfoil and verification of the SM701 airfoil aerodynamic characteristics utilizing theoretical techniques. The papers are based on a combination of analytical design, hands-on model fabrication, wind tunnel calibration and testing, data acquisition and analysis, and comparison of test results and theory.
Model-based engineering for medical-device software.
Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi
2010-01-01
This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. Using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.
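To give a hypothetical flavor of what such a model might look like, the Python sketch below encodes a tiny state machine for a PCA pump's bolus lockout logic and replays event traces against it, flagging transitions that violate the safety requirement. The states, events, and requirement are invented for illustration, not taken from the paper.

```python
# Hypothetical model of a PCA pump's bolus logic: after a patient-requested
# bolus, the pump must enter LOCKOUT and reject further bolus requests
# until the lockout interval elapses.
ALLOWED = {
    ("IDLE", "request_bolus"): "DELIVERING",
    ("DELIVERING", "done"): "LOCKOUT",
    ("LOCKOUT", "timer_expired"): "IDLE",
}

def run(trace, state="IDLE"):
    """Replay an event trace against the model; a disallowed event
    signals a violation of the safety requirement."""
    for event in trace:
        nxt = ALLOWED.get((state, event))
        if nxt is None:
            return f"violation: '{event}' not allowed in state {state}"
        state = nxt
    return f"ok, final state {state}"

print(run(["request_bolus", "done", "timer_expired"]))   # ok
print(run(["request_bolus", "done", "request_bolus"]))   # violation: lockout
```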
Defining the IEEE-854 floating-point standard in PVS
NASA Technical Reports Server (NTRS)
Miner, Paul S.
1995-01-01
A significant portion of the ANSI/IEEE-854 Standard for Radix-Independent Floating-Point Arithmetic is defined in PVS (Prototype Verification System). Since IEEE-854 is a generalization of the ANSI/IEEE-754 Standard for Binary Floating-Point Arithmetic, the definition of IEEE-854 in PVS also formally defines much of IEEE-754. This collection of PVS theories provides a basis for machine-checked verification of floating-point systems. This formal definition illustrates that formal specification techniques are sufficiently advanced that it is reasonable to consider their use in the development of future standards.
A formal approach to validation and verification for knowledge-based control systems
NASA Technical Reports Server (NTRS)
Castore, Glen
1987-01-01
As control systems become more complex in response to desires for greater system flexibility, performance and reliability, the promise is held out that artificial intelligence might provide the means for building such systems. An obstacle to the use of symbolic processing constructs in this domain is the need for verification and validation (V and V) of the systems. Techniques currently in use do not seem appropriate for knowledge-based software. An outline of a formal approach to V and V for knowledge-based control systems is presented.
Automated Verification of Specifications with Typestates and Access Permissions
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Catano, Nestor
2011-01-01
We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).
NASA Technical Reports Server (NTRS)
Johnson, Kenneth L.; White, K. Preston, Jr.
2012-01-01
The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
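As a hedged illustration of the variables-sampling idea (not the NESC calculators themselves; the limit, sample, and acceptability constant k below are hypothetical), the single-sided k-method accepts a lot when the sample mean lies at least k sample standard deviations inside the specification limit:

```python
import statistics

def accept_lot_variables(measurements, usl, k):
    """Single-sided acceptance sampling by variables (k-method sketch):
    accept when the sample mean lies at least k sample standard
    deviations below the upper specification limit USL."""
    xbar = statistics.mean(measurements)
    s = statistics.stdev(measurements)
    return (usl - xbar) / s >= k

# Hypothetical plan: n = 10, USL = 5.0, acceptability constant k = 1.72
readings = [4.1, 4.3, 4.0, 4.2, 4.4, 4.1, 4.3, 4.2, 4.0, 4.2]
print(accept_lot_variables(readings, usl=5.0, k=1.72))  # True here
```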
NASA Technical Reports Server (NTRS)
Cantrell, J. H., Jr.; Winfree, W. P.
1980-01-01
The solution of the nonlinear differential equation which describes an initially sinusoidal finite-amplitude elastic wave propagating in a solid contains a static-displacement term in addition to the harmonic terms. The static-displacement amplitude is theoretically predicted to be proportional to the product of the squares of the driving-wave amplitude and the driving-wave frequency. The first experimental verification of the elastic-wave static displacement in a solid (the 111 direction of single-crystal germanium) is reported, and agreement is found with the theoretical predictions.
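Restated as a formula (symbols chosen here for illustration; the proportionality constant depends on the material's nonlinearity parameters), the predicted scaling is

```latex
u_{\mathrm{static}} \;\propto\; A^{2} f^{2},
```

where A is the driving-wave amplitude and f the driving-wave frequency.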
Active Interrogation for Spent Fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swinhoe, Martyn Thomas; Dougan, Arden
2015-11-05
The DDA instrument for nuclear safeguards is a fast, nondestructive active neutron interrogation technique that uses an external 14 MeV DT neutron generator for the characterization and verification of spent nuclear fuel assemblies.
50 CFR 260.101 - Lavatory accommodations.
Code of Federal Regulations, 2013 CFR
2013-10-01
... toilet and locker rooms and also at such other places as may be essential to the cleanliness of all... conspicuously in each toilet room and locker room directing employees to wash hands before returning to work. (e...
50 CFR 260.101 - Lavatory accommodations.
Code of Federal Regulations, 2014 CFR
2014-10-01
... toilet and locker rooms and also at such other places as may be essential to the cleanliness of all... conspicuously in each toilet room and locker room directing employees to wash hands before returning to work. (e...
50 CFR 260.101 - Lavatory accommodations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... toilet and locker rooms and also at such other places as may be essential to the cleanliness of all... conspicuously in each toilet room and locker room directing employees to wash hands before returning to work. (e...
... and don't share needles used to inject drugs. Use a condom during sex. If you choose to have tattoos or body piercings, be picky about cleanliness and safety when selecting a shop. Get vaccinated. If you're at increased risk ...
NASA Technical Reports Server (NTRS)
Choi, S. D.
1974-01-01
A switch that uses only two p-i-n diodes on a microstrip substrate has been developed for application in spacecraft radio systems. The switch features improved power drain, weight, volume, magnetic cleanliness, and reliability over currently used circulator and electromechanical switches.
EVALUATION OF CURRENT SUSTAINABILITY ASSESSMENT OF BIOBASED TECHNOLOGY
Sustainable technology is driven by economic competitiveness, government policies and public pressure. The claim of inherent cleanliness for biotechnology is too simplistic. Each application of biotechnology must be evaluated for suitable characteristics of sustainability. The ...
Abstraction Techniques for Parameterized Verification
2006-11-01
An approach for applying model checking to unbounded systems is to extract finite-state models from them using conservative abstraction techniques. Applying model checking to complex pieces of code such as device drivers depends on the use of abstraction methods, which extract a small finite-state model from the code.
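As a toy illustration of conservative abstraction (the predicate, transitions, and property below are invented for this sketch, not drawn from the report), an unbounded integer counter abstracted over the single predicate n == 0 yields a two-state model that exhaustive reachability can check:

```python
# Toy predicate abstraction: abstract an unbounded counter n over the
# single predicate (n == 0). Abstract states: "zero" and "nonzero".
# Concrete moves n -> n+1 and n -> n-1 (for n > 0) induce these
# conservative abstract transitions:
ABSTRACT_TRANSITIONS = {
    "zero": {"nonzero"},                 # 0 -> 1
    "nonzero": {"nonzero", "zero"},      # n+1, or n-1 possibly reaching 0
}

def reachable(initial):
    """Exhaustive reachability over the finite abstract model."""
    seen, frontier = set(), [initial]
    while frontier:
        state = frontier.pop()
        if state not in seen:
            seen.add(state)
            frontier.extend(ABSTRACT_TRANSITIONS[state])
    return seen

# A property that holds on the abstraction holds on the concrete system;
# spurious abstract counterexamples call for predicate refinement.
print(reachable("zero"))  # {'zero', 'nonzero'}
```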
Ultrasound functional imaging in an ex vivo beating porcine heart platform
NASA Astrophysics Data System (ADS)
Petterson, Niels J.; Fixsen, Louis S.; Rutten, Marcel C. M.; Pijls, Nico H. J.; van de Vosse, Frans N.; Lopata, Richard G. P.
2017-12-01
In recent years, novel ultrasound functional imaging (UFI) techniques have been introduced to assess cardiac function by measuring, e.g. cardiac output (CO) and/or myocardial strain. Verification and reproducibility assessment in a realistic setting remain major issues. Simulations and phantoms are often unrealistic, whereas in vivo measurements often lack crucial hemodynamic parameters or ground truth data, or suffer from the large physiological and clinical variation between patients when attempting clinical validation. Controlled validation in certain pathologies is cumbersome and often requires the use of lab animals. In this study, an isolated beating pig heart setup was adapted and used for performance assessment of UFI techniques such as volume assessment and ultrasound strain imaging. The potential of performing verification and reproducibility studies was demonstrated. As proof-of-principle, validation of UFI in pathological hearts was examined. Ex vivo porcine hearts (n = 6, slaughterhouse waste) were resuscitated and attached to a mock circulatory system. Radio frequency ultrasound data of the left ventricle were acquired in five short axis views and one long axis view. Based on these slices, the CO was measured, and verification was performed using flow sensor measurements in the aorta. Strain imaging provided radial, circumferential and longitudinal strain to assess reproducibility and inter-subject variability under steady conditions. Finally, strains in healthy hearts were compared to a heart with an implanted left ventricular assist device, simulating a failing, supported heart. Good agreement between ultrasound and flow-sensor-based CO measurements was found. Strains were highly reproducible (intraclass correlation coefficients >0.8). Differences were found due to biological variation and the condition of the hearts. Strain magnitudes and patterns in the assisted heart were available for different levels of pump support, revealing large changes compared to the normal condition. The setup provides a valuable benchmarking platform for UFI techniques. Future studies will include work on different pathologies and other means of measurement verification.
A physical zero-knowledge object-comparison system for nuclear warhead verification
Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; d'Errico, Francesco
2016-01-01
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.
A physical zero-knowledge object-comparison system for nuclear warhead verification.
Philippe, Sébastien; Goldston, Robert J; Glaser, Alexander; d'Errico, Francesco
2016-09-20
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.
A physical zero-knowledge object-comparison system for nuclear warhead verification
NASA Astrophysics Data System (ADS)
Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; D'Errico, Francesco
2016-09-01
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.
A physical zero-knowledge object-comparison system for nuclear warhead verification
Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; ...
2016-09-20
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.
A physical zero-knowledge object-comparison system for nuclear warhead verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.
Analysis of historical delta values for IAEA/LANL NDA training courses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geist, William; Santi, Peter; Swinhoe, Martyn
2009-01-01
The Los Alamos National Laboratory (LANL) supports the International Atomic Energy Agency (IAEA) by providing training for IAEA inspectors in neutron and gamma-ray Nondestructive Assay (NDA) of nuclear material. Since 1980, all new IAEA inspectors attend this two-week course at LANL, gaining hands-on experience in the application of NDA techniques, procedures and analysis to measure plutonium and uranium nuclear material standards with well-known pedigrees. As part of the course the inspectors conduct an inventory verification exercise. This exercise provides inspectors the opportunity to test their abilities in performing verification measurements using the various NDA techniques. For an inspector, the verification of an item is nominally based on whether the measured assay value agrees with the declared value to within three times the historical delta value. The historical delta value represents the average difference between measured and declared values from previous measurements taken on similar material with the same measurement technology. If the measurement falls outside a limit of three times the historical delta value, the declaration is not verified. This paper uses measurement data from five years of IAEA courses to calculate a historical delta for five non-destructive assay methods: Gamma-ray Enrichment, Gamma-ray Plutonium Isotopics, Passive Neutron Coincidence Counting, Active Neutron Coincidence Counting and the Neutron Coincidence Collar. These historical deltas provide information as to the precision and accuracy of these measurement techniques under realistic conditions.
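A minimal sketch of the verification rule stated above (treating the historical delta as an absolute difference; for some methods it may instead be defined relative to the declared value):

```python
def verify_item(measured, declared, historical_delta):
    """Declaration is verified when the measured assay value agrees
    with the declared value to within three historical deltas."""
    return abs(measured - declared) <= 3.0 * historical_delta

# Hypothetical example: 52.1 g Pu measured against 50.0 g declared,
# historical delta 1.2 g -> |2.1| <= 3.6, so the item verifies.
print(verify_item(52.1, 50.0, 1.2))  # True
```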
Ensure preparation and capsule endoscopy: A two-center prospective study
Niv, Eva; Ovadia, Baruch; Ron, Yulia; Santo, Ervin; Mahajna, Elisabeth; Halpern, Zamir; Fireman, Zvi
2013-01-01
AIM: To compare small bowel (SB) cleanliness and capsule endoscopy (CE) image quality following Ensure®, polyethylene glycol (PEG) and standard preparations. METHODS: A preparation protocol for CE that is both efficacious and acceptable to patients remains elusive. Considering the physiological function of the SB as a site for the digestion and absorption of food and not as a stool reservoir, preparation consisting of a liquid, fiber-free formula ingested one day before a CE study might have an advantage over other kinds of preparations. We conducted a prospective, blind-to-preparation, two-center study that compared four types of preparations. The participants’ demographic and clinical data were collected. Gastric and SB transit times were calculated. The presence of bile in the duodenum was scored by a single, blinded-to-preparation gastroenterologist expert in CE, as was cleanliness within the proximal, middle and distal part of the SB. A four-point scale was used (grade 1 = no bile or residue, grade 4 ≥ 90% of lumen full of bile or residual material). RESULTS: The 198 consecutive patients who were referred to CE studies due to routine medical reasons were divided into four groups. They all observed a 12-h overnight fast before undergoing CE. Throughout the 24 h preceding the fast, control group 1 (n = 45 patients) ate light unrestricted meals, control group 2 (n = 81) also ate light meals but free of fruits and vegetables, the PEG group (n = 50) ate unrestricted light meals and ingested the PEG preparation, and the Ensure group (n = 22) ingested only the Ensure formula. Preparation with Ensure improved the visualization of duodenal mucosa (a score of 1.76) by decreasing the bile content compared to preparation with PEG (a score of 2.9) (P = 0.053). Overall, as expected, there was less residue and stool in the proximal part of the SB than in the middle and distal parts in all groups. The total score of cleanliness throughout the length of the SB showed some benefit for Ensure (a score of 1.8) over control group 2 (a score of 2) (P = 0.06). The cleanliness grading of the proximal and distal parts of the SB was similar in all four groups (P = 0.6 for both). The cleanliness in the middle part of the SB in the PEG (a score of 1.8) and Ensure groups (a score of 1.7) was equally better than that of control group 2 (a score of 2.1) (P = 0.057 and P = 0.07, respectively). All 50 PEG patients had diarrhea as an anticipated side effect, compared with only one patient in the Ensure group. CONCLUSION: Preparation with Ensure, a liquid, fiber-free formula has advantages over standard and PEG preparations, with significantly fewer side effects than PEG.
NASA Astrophysics Data System (ADS)
Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.
2004-11-01
The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determines the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, verifications are periodically implemented by a PC running a National Instruments LabVIEW virtual instrument (VI) to verify continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system will be the topic of this paper.
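A sketch of the log-ratio technique named above (the sensitivity and calibration terms are hypothetical placeholders, not LANSCE values): the displacement is proportional to the logarithm of the ratio of the signals induced on two opposing pickup electrodes, and the calibration terms absorb gain and offset drifts:

```python
import math

def beam_position_log_ratio(v_a, v_b, sensitivity_mm=5.0,
                            gain_correction=1.0, offset_mm=0.0):
    """Log-ratio beam position estimate (sketch).

    v_a, v_b: amplitudes induced on opposing BPM pickup electrodes.
    sensitivity_mm: mm of displacement per decade of signal ratio
                    (hypothetical; fixed by calibration).
    gain_correction, offset_mm: calibration terms absorbing channel
                    gain mismatch and electrical offset drifts.
    """
    return sensitivity_mm * math.log10(gain_correction * v_a / v_b) + offset_mm

# Hypothetical example: electrode A sees a 10% larger signal
print(beam_position_log_ratio(1.10, 1.00))  # ~0.21 mm toward electrode A
```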
Abstraction and Assume-Guarantee Reasoning for Automated Software Verification
NASA Technical Reports Server (NTRS)
Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.
2004-01-01
Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software, and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework, and we show how COMFORT outperforms several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
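The simplest rule with which such a framework can be instantiated is the standard non-circular assume-guarantee rule (conventional notation, not copied from the article):

```latex
\frac{\langle A \rangle\, M_1\, \langle P \rangle \qquad
      \langle \mathit{true} \rangle\, M_2\, \langle A \rangle}
     {\langle \mathit{true} \rangle\, M_1 \parallel M_2\, \langle P \rangle}
```

Read: if component M1 satisfies property P under assumption A, and component M2 unconditionally satisfies A, then the parallel composition M1 ∥ M2 satisfies P; the learning algorithm's task is to construct a sufficiently weak A automatically.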
Towards the Verification of Human-Robot Teams
NASA Technical Reports Server (NTRS)
Fisher, Michael; Pearce, Edward; Wooldridge, Mike; Sierhuis, Maarten; Visser, Willem; Bordini, Rafael H.
2005-01-01
Human-agent collaboration is increasingly important. Not only are high-profile activities such as NASA missions to Mars intended to employ such teams, but our everyday activities involving interaction with computational devices fall into this category as well. In many of these scenarios, we are expected to trust that the agents will do what we expect and that the agents and humans will work together as expected. But how can we be sure? In this paper, we bring together previous work on the verification of multi-agent systems with work on the modelling of human-agent teamwork. Specifically, we target human-robot teamwork. This paper provides an outline of the way we are using formal verification techniques in order to analyse such collaborative activities. A particular application is the analysis of human-robot teams intended for use in future space exploration.
Top down, bottom up structured programming and program structuring
NASA Technical Reports Server (NTRS)
Hamilton, M.; Zeldin, S.
1972-01-01
New design and programming techniques for shuttle software. Based on previous Apollo experience, recommendations are made to apply top-down structured programming techniques to shuttle software. New software verification techniques for large software systems are recommended. HAL, the higher order language selected for the shuttle flight code, is discussed and found to be adequate for implementing these techniques. Recommendations are made to apply the workable combination of top-down, bottom-up methods in the management of shuttle software. Program structuring is discussed relevant to both programming and management techniques.
Development of a technique for inflight jet noise simulation. I, II
NASA Technical Reports Server (NTRS)
Clapper, W. S.; Stringas, E. J.; Mani, R.; Banerian, G.
1976-01-01
Several possible noise simulation techniques were evaluated, including closed circuit wind tunnels, free jets, rocket sleds and high speed trains. The free jet technique was selected for demonstration and verification. The first paper describes the selection and development of the technique and presents results for simulation and in-flight tests of the Learjet, F106, and Bertin Aerotrain. The second presents a theoretical study relating the two sets of noise signatures. It is concluded that the free jet simulation technique provides a satisfactory assessment of in-flight noise.
SU-E-T-439: Fundamental Verification of Respiratory-Gated Spot Scanning Proton Beam Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamano, H; Yamakawa, T; Hayashi, N
Purpose: Spot-scanning proton beam irradiation with a respiratory gating technique provides very good dose distributions and requires both dosimetric and geometric verification prior to clinical implementation. The purpose of this study is to evaluate the impact of gated irradiation as a fundamental verification. Methods: We evaluated field width, flatness, symmetry, and penumbra in gated and non-gated proton beams. The respiratory motion amplitude was set to three values: 10, 20, and 30 mm. We compared these quantities between the gated and non-gated beams. A 200 MeV proton beam from a PROBEAT-III unit (Hitachi Co. Ltd) was used in this study. Respiratory-gated irradiation was performed using a Quasar phantom (MODUS Medical Devices) in combination with a dedicated respiratory gating system (ANZAI Medical Corporation). For radiochromic film dosimetry, the calibration curve was created with Gafchromic EBT3 film (Ashland) using FilmQA Pro 2014 (Ashland) as the film analysis software. Results: The film was calibrated at the middle of the spread-out Bragg peak in a passive proton beam. The field width, flatness and penumbra in non-gated proton irradiation with respiratory motion were larger than those of the reference beam without respiratory motion: the maximum errors of the field width, flatness and penumbra for a respiratory motion of 30 mm were 1.75%, 40.3% and 39.7%, respectively. The errors of flatness and penumbra in the gated beam (motion: 30 mm, gating rate: 25%) were 0.0% and 2.91%, respectively. The symmetry results for all gated proton beams were within 0.6%. Conclusion: The field width, flatness, symmetry and penumbra were improved with the gating technique in proton beams. Spot-scanning proton beam delivery with the gating technique is feasible for a moving target.
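For context on the quantities compared above, the following sketch computes field width (FWHM), flatness, symmetry, and penumbra from a sampled 1D dose profile; the 80-20% penumbra and central-80% flattened region are common conventions assumed here, not taken from the abstract:

```python
import numpy as np

def profile_metrics(x_mm, dose):
    """FWHM field width, flatness, symmetry (both in %), and 80-20%
    penumbra from a 1D dose profile; common textbook definitions."""
    x = np.asarray(x_mm, dtype=float)
    d = np.asarray(dose, dtype=float) / np.max(dose)

    def crossing(level, rising=True):
        above = np.nonzero(d >= level)[0]
        i = above[0] if rising else above[-1]
        j = i - 1 if rising else i + 1
        frac = (level - d[j]) / (d[i] - d[j])   # linear interpolation
        return x[j] + frac * (x[i] - x[j])

    left, right = crossing(0.5, True), crossing(0.5, False)
    fwhm = right - left
    # flattened region: central 80% of the field
    core = d[(x > left + 0.1 * fwhm) & (x < right - 0.1 * fwhm)]
    flatness = 100.0 * (core.max() - core.min()) / (core.max() + core.min())
    half = len(core) // 2
    symmetry = 100.0 * abs(core[:half].sum() - core[-half:].sum()) / core.sum()
    penumbra = crossing(0.8, True) - crossing(0.2, True)   # left edge
    return fwhm, flatness, symmetry, penumbra

# Idealized ~60 mm wide flat field with ~2.8 mm penumbrae
x = np.linspace(-50.0, 50.0, 1001)
d = 1 / (1 + np.exp(-(x + 30))) - 1 / (1 + np.exp(-(x - 30)))
print(profile_metrics(x, d))
```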
Zhang, Yawei; Yin, Fang-Fang; Zhang, You; Ren, Lei
2017-05-07
The purpose of this study is to develop an adaptive prior knowledge guided image estimation technique to reduce the scan angle needed in the limited-angle intrafraction verification (LIVE) system for 4D-CBCT reconstruction. The LIVE system was previously developed to reconstruct 4D volumetric images on the fly during arc treatment for intrafraction target verification and dose calculation. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scan angle needed to reconstruct the 4D-CBCT images for faster intrafraction verification. This technique uses free-form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on kV-MV projections acquired over an extremely limited angle (orthogonal 3°) during treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment, to exploit the continuity of the respiratory motion. The 4D digital extended-cardiac-torso (XCAT) phantom and a CIRS 008A dynamic thoracic phantom were used to evaluate the effectiveness of this technique. The reconstruction accuracy was evaluated by calculating both the center-of-mass shift (COMS) and the 3D volume percentage difference (VPD) of the tumor between the reconstructed images and the true on-board images. The performance of the technique was also assessed, with varied breathing signals, against scan angle, lesion size, lesion location, projection sampling interval, and scan direction. In the XCAT study, using orthogonal-view 3° kV and portal MV projections, this technique achieved an average tumor COMS/VPD of 0.4 ± 0.1 mm/5.5 ± 2.2%, 0.6 ± 0.3 mm/7.2 ± 2.8%, 0.5 ± 0.2 mm/7.1 ± 2.6%, and 0.6 ± 0.2 mm/8.3 ± 2.4% for baseline drift, amplitude variation, phase shift, and patient breathing signal variation, respectively. In the CIRS phantom study, this technique achieved an average tumor COMS/VPD of 0.7 ± 0.1 mm/7.5 ± 1.3% for a 3 cm lesion and 0.6 ± 0.2 mm/11.4 ± 1.5% for a 2 cm lesion in the baseline drift case. The average tumor COMS/VPD were 0.5 ± 0.2 mm/10.8 ± 1.4%, 0.4 ± 0.3 mm/7.3 ± 2.9%, 0.4 ± 0.2 mm/7.4 ± 2.5%, and 0.4 ± 0.2 mm/7.3 ± 2.8% for the four real patient breathing signals, respectively. The results demonstrated that the adaptive prior knowledge guided image estimation technique with the LIVE system is robust against scan angle, lesion size, lesion location, and scan direction. It can estimate on-board images accurately with as few as 6 projections over an orthogonal-view 3° angle. In conclusion, the adaptive prior knowledge guided image reconstruction technique accurately estimates 4D-CBCT images using extremely limited angles and projections. This technique greatly improves the efficiency and accuracy of the LIVE system for ultrafast 4D intrafraction verification of lung SBRT treatments.
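The two accuracy metrics quoted above can be computed from binary tumor masks roughly as follows (the voxel-spacing argument and the XOR-based VPD reading are assumptions for illustration):

```python
import numpy as np

def coms_and_vpd(mask_est, mask_true, spacing_mm=(1.0, 1.0, 1.0)):
    """Center-of-mass shift (COMS, mm) and volume percentage difference
    (VPD, %) between an estimated and a ground-truth binary tumor mask."""
    est = np.asarray(mask_est, dtype=bool)
    true = np.asarray(mask_true, dtype=bool)
    spacing = np.asarray(spacing_mm, dtype=float)

    com_est = np.argwhere(est).mean(axis=0) * spacing
    com_true = np.argwhere(true).mean(axis=0) * spacing
    coms = np.linalg.norm(com_est - com_true)

    # VPD read here as the non-overlapping volume relative to truth
    vpd = 100.0 * np.logical_xor(est, true).sum() / true.sum()
    return coms, vpd
```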
Employment of adaptive learning techniques for the discrimination of acoustic emissions
NASA Astrophysics Data System (ADS)
Erkes, J. W.; McDonald, J. F.; Scarton, H. A.; Tam, K. C.; Kraft, R. P.
1983-11-01
The following aspects of this study on the discrimination of acoustic emissions (AE) were examined: (1) The analytical development and assessment of digital signal processing techniques for AE signal dereverberation, noise reduction, and source characterization; (2) The modeling and verification of some aspects of key selected techniques through a computer-based simulation; and (3) The study of signal propagation physics and their effect on received signal characteristics for relevant physical situations.
42 CFR 441.365 - Periodic evaluation, assessment, and review.
Code of Federal Regulations, 2011 CFR
2011-10-01
... State government (such as the Department of Health or the Agency on Aging). (2) Each review team must... new § 441.365(g)(3)): (i) Cleanliness. (ii) Absence of bedsores. (iii) Absence of signs of...
42 CFR 441.365 - Periodic evaluation, assessment, and review.
Code of Federal Regulations, 2010 CFR
2010-10-01
... State government (such as the Department of Health or the Agency on Aging). (2) Each review team must... new § 441.365(g)(3)): (i) Cleanliness. (ii) Absence of bedsores. (iii) Absence of signs of...
Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos
2016-01-01
This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and type of reference distances can be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for the verification of AACMMs using the indexed metrology platform.
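A minimal sketch of how a virtual reference distance can be formed (the transforms and points are invented; the actual IMP mathematical model is considerably more detailed): each platform position contributes a homogeneous transform to a common global frame, so sphere centers measured from two different positions define a gauge-free reference length:

```python
import numpy as np

def to_global(T, p_local):
    """Map a point measured in one platform position's frame into the
    global frame via that position's 4x4 homogeneous transform T."""
    p = np.append(np.asarray(p_local, dtype=float), 1.0)
    return (T @ p)[:3]

def virtual_distance(T_i, p_i, T_j, p_j):
    """Virtual reference distance between two sphere centers measured
    from platform positions i and j (illustrative, not the IMP model)."""
    return float(np.linalg.norm(to_global(T_i, p_i) - to_global(T_j, p_j)))
```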
NASA Technical Reports Server (NTRS)
Hughes, David W.; Hedgeland, Randy J.
1994-01-01
A mechanical simulator of the Hubble Space Telescope (HST) Aft Shroud was built to perform verification testing of the Servicing Mission Scientific Instruments (SI's) and to provide a facility for astronaut training. All assembly, integration, and test activities occurred under the guidance of a contamination control plan, and all work was reviewed by a contamination engineer prior to implementation. An integrated approach was followed in which materials selection, manufacturing, assembly, subsystem integration, and end product use were considered and controlled to ensure that the use of the High Fidelity Mechanical Simulator (HFMS) as a verification tool would not contaminate mission critical hardware. Surfaces were cleaned throughout manufacturing, assembly, and integration, and reverification was performed following major activities. Direct surface sampling was the preferred method of verification, but access and material constraints led to the use of indirect methods as well. Although surface geometries and coatings often made contamination verification difficult, final contamination sampling and monitoring demonstrated the ability to maintain a class M5.5 environment with surface levels less than 400B inside the HFMS.
NASA Astrophysics Data System (ADS)
Heyer, H.-V.; Föckersperger, S.; Lattner, K.; Moldenhauer, W.; Schmolke, J.; Turk, M.; Willemsen, P.; Schlicker, M.; Westerdorff, K.
2008-08-01
The technology verification satellite TET (Technologie ErprobungsTräger) is the core element of the German On-Orbit Verification (OOV) program for new technologies and techniques. The goal of this program is to support the German space industry and research facilities in the on-orbit verification of satellite technologies. TET is a small satellite developed and built in Germany under the leadership of Kayser-Threde. The satellite bus is based on the successfully operated BIRD satellite and on a newly developed payload platform with a new payload handling system called NVS (Nutzlastversorgungssystem). The NVS comprises three major parts: the power supply, the processor boards, and the I/O interfaces. It is realized as several Europe-format PCBs connected to each other via an integrated backplane; the payloads are connected to the NVS through front connectors. This paper describes the concept, architecture, and hardware/software of the NVS. Phase B of the project was successfully finished last year.
Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos
2016-11-18
This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and type of reference distances can be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for the verification of AACMMs using the indexed metrology platform.
Garbage monitoring system using IoT
NASA Astrophysics Data System (ADS)
Anitha, A.
2017-11-01
Various actions are being taken to improve the level of cleanliness in the country, and people are becoming more active in cleaning their surroundings; the government has also launched several cleanliness campaigns. This work proposes a system that notifies municipal corporations to empty garbage bins on time. A sensor placed on top of each bin measures the fill level of garbage relative to the bin's total capacity. When the garbage reaches the maximum level, a notification is sent to the corporation's office so that employees can take further action and empty the bin. This system helps clean the city more effectively: instead of checking all bins manually, staff receive a notification whenever a bin is full.
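A minimal sketch of the notification logic described above, assuming a hypothetical ultrasonic fill sensor behind read_distance_cm() and a placeholder notify() transport; neither function, nor the 90% threshold, comes from the paper:

```python
import time

BIN_DEPTH_CM = 100.0      # sensor-to-bin-bottom distance (assumed)
FULL_THRESHOLD = 0.9      # notify when the bin is 90% full (assumed)

def read_distance_cm():
    """Placeholder for an ultrasonic reading of the sensor-to-garbage
    distance; on real hardware this would talk to, e.g., an HC-SR04."""
    raise NotImplementedError

def notify(fill_fraction):
    """Placeholder for the alert sent to the corporation's office."""
    print(f"Bin {fill_fraction:.0%} full - please schedule collection")

def monitor(poll_seconds=600):
    notified = False
    while True:
        fill = 1.0 - read_distance_cm() / BIN_DEPTH_CM
        if fill >= FULL_THRESHOLD and not notified:
            notify(fill)
            notified = True       # avoid repeated alerts
        elif fill < FULL_THRESHOLD:
            notified = False      # bin was emptied; re-arm the alert
        time.sleep(poll_seconds)
```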
Towards a better control of optics cleanliness
NASA Astrophysics Data System (ADS)
Berlioz, P.
2017-11-01
Contamination can considerably degrade the transmission and scattering performance of spacecraft optics. Preventing contamination efficiently requires introducing requirements on materials and protections (covers, etc.) from the design phase onward. The integration and test phase then demands heavy and stringent means (clean rooms, specific garments, covers, etc.) and permanent monitoring through fine contamination measurements of the instrument environment and surfaces. Contamination budgets are drawn up throughout the project: first, predicted budgets based on analysis and possibly modeling during the design phase, then actual budgets based on contamination measurements during the integration and test phase. Finally, the risk remains of having to clean the optics, or even dismount them, because of accidental contamination. The cleanliness engineering approach established at ASTRIUM Toulouse is presented here, including contamination monitoring via witness samples measured by IR spectrometry and via particle counters. ASTRIUM is presently focusing attention on non-contact cleaning, such as the promising UV-ozone process.
Spot-checks to measure general hygiene practice.
Sonego, Ina L; Mosler, Hans-Joachim
2016-01-01
A variety of hygiene behaviors are fundamental to the prevention of diarrhea. We used spot-checks in a survey of 761 households in Burundi to examine whether something we could call general hygiene practice is responsible for more specific hygiene behaviors, ranging from handwashing to sweeping the floor. Using structural equation modeling, we showed that clusters of hygiene behavior, such as primary caregivers' cleanliness and household cleanliness, explained the spot-check findings well. Within our model, general hygiene practice as an overall concept explained the more specific clusters of hygiene behavior well. Furthermore, the higher the general hygiene practice, the more likely children were to be categorized as healthy (r = 0.46). General hygiene practice was correlated with commitment to hygiene (r = 0.52), indicating a strong association with psychosocial determinants. The results show that different hygiene behaviors co-occur regularly. Using spot-checks, the general hygiene practice of a household can be rated quickly and easily.
NASA Astrophysics Data System (ADS)
Matsushima, Masaki; Tsunakawa, Hideo; Iijima, Yu-Ichi; Nakazawa, Satoru; Matsuoka, Ayako; Ikegami, Shingo; Ishikawa, Tomoaki; Shibuya, Hidetoshi; Shimizu, Hisayoshi; Takahashi, Futoshi
2010-07-01
To achieve the scientific objectives related to the lunar magnetic field measurements in a polar orbit at an altitude of 100 km, strict electromagnetic compatibility (EMC) requirements were applied to all components and subsystems of the SELENE (Kaguya) spacecraft. The magnetic cleanliness program was defined as one of the EMC control procedures, and magnetic tests were carried out for most of the engineering and flight models. The EMC performance of all components was systematically controlled and examined through a series of EMC tests. As a result, the Kaguya spacecraft was made to be very clean, magnetically. Hence reliable scientific data related to the magnetic field around the Moon were obtained by the LMAG (Lunar MAGnetometer) and the PACE (Plasma energy Angle and Composition Experiment) onboard the Kaguya spacecraft. These data have been available for lunar science use since November 2009.
Cognitive Bias in the Verification and Validation of Space Flight Systems
NASA Technical Reports Server (NTRS)
Larson, Steve
2012-01-01
Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of future systems.
Comparative Mirror Cleaning Study: 'A Study on Removing Particulate Contamination'
NASA Technical Reports Server (NTRS)
Houston, Karrie
2007-01-01
The cleanliness of optical surfaces is recognized as an industry-wide concern for the performance of optical devices such as mirrors and telescopes, microscopes and lenses, lasers and interferometers, and prisms and optical filters. However, no standard has been established for optical cleaning and there is no standard definition of a 'clean' optical element. This study evaluates the effectiveness of commonly used optical cleaning techniques based on wafer configuration, contamination levels, and the number and size of removed particles. It is concluded that cleaning method and exposure time play a significant factor in obtaining a high removal percentage. The detergent bath and solvent rinse method displayed an increase in effective removal percentage as the contamination exposure increased. Likewise, CO2 snow cleaning showed a relatively consistent cleaning effectiveness. The results can help ensure mission success to flight projects developed for the NASA Origins Program. Advantages and disadvantages of each of the optical cleaning methods are described.
Farhat, Asma; Fabiano-Tixier, Anne-Sylvie; Visinoni, Franco; Romdhane, Mehrez; Chemat, Farid
2010-11-19
Without adding any solvent or water, we propose a novel and green approach for the extraction of secondary metabolites from dried plant materials. This "solvent, water and vapor free" approach, based on a simple principle, involves the application of microwave irradiation and earth gravity to extract the essential oil from dried caraway seeds. Microwave dry-diffusion and gravity (MDG) has been compared with a conventional technique, hydrodistillation (HD), for the extraction of essential oil from dried caraway seeds. Essential oils isolated by MDG were quantitatively (yield) and qualitatively (aromatic profile) similar to those obtained by HD, but MDG was better than HD in terms of rapidity (45 min versus 300 min), energy saving, and cleanliness. The present apparatus permits fast and efficient extraction, reduces waste, avoids water and solvent consumption, and allows substantial energy savings.
NASA Technical Reports Server (NTRS)
Bowley, C. J.; Barnes, J. C.; Rango, A.
1981-01-01
The purpose of the handbook is to update the various snowcover interpretation techniques, document the snow mapping techniques used in the various ASVT study areas, and describe the ways snowcover data have been applied to runoff prediction. Through documentation in handbook form, the methodology developed in the Snow Mapping ASVT can be applied to other areas.
Banach, Marzena; Wasilewska, Agnieszka; Dlugosz, Rafal; Pauk, Jolanta
2018-05-18
Due to the problem of aging societies, there is a need for smart buildings that monitor and support people with various disabilities, including rheumatoid arthritis. The aim of this paper is to elaborate novel techniques for wireless motion capture systems for the monitoring and rehabilitation of disabled people, for application in smart buildings. The proposed techniques are based on cross-verification of distance measurements between markers and transponders in an environment with highly variable parameters. For their verification, algorithms were developed that enable comprehensive investigation of a system with different numbers of transponders and varying ambient parameters (temperature and noise). Various linear and nonlinear filters were used in the estimation of the real positions of the markers. Several thousand tests were carried out for various system parameters and different marker locations. The results show that localization error may be reduced by as much as 90%. It was observed that repetition of the measurement reduces localization error by as much as one order of magnitude. The proposed system, based on wireless techniques, offers high commercial potential. However, it requires extensive cooperation between teams, covering hardware and software design, system modelling, and architectural design.
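A sketch of the position-from-distances step underlying such a system (the linearized least-squares formulation and the transponder layout are illustrative assumptions): with four or more transponders at known positions, noisy marker-transponder distances yield a position estimate, and averaging repeated estimates reduces the error, consistent with the repetition effect reported above:

```python
import numpy as np

def locate_marker(transponders, distances):
    """Estimate a 3D marker position from distances to >= 4 transponders
    at known positions, by linearizing the range equations against the
    first transponder (classic multilateration least squares)."""
    t = np.asarray(transponders, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (t[1:] - t[0])
    b = (d[0] ** 2 - d[1:] ** 2) + np.sum(t[1:] ** 2 - t[0] ** 2, axis=1)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical layout; averaging estimates from repeated (noisy)
# measurements shrinks the error, as the repetition result suggests.
trans = [(0, 0, 0), (4, 0, 0), (0, 4, 0), (0, 0, 4)]
marker = np.array([1.0, 2.0, 0.5])
dists = [np.linalg.norm(marker - np.asarray(p)) for p in trans]
print(locate_marker(trans, dists))  # ~[1.0, 2.0, 0.5]
```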
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samuel, D; Testa, M; Park, Y
Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which, for example, could be the use of anterior fields for prostate treatment instead of the opposed lateral fields in current practice. We have developed and commissioned an integrated system with hardware, software and workflow protocols to provide a complete solution simultaneously for both in-vivo dosimetry and range verification in proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, but separable into three groups for flexibility in application. A special amplifier was developed to capture extremely small signals from very low proton beam currents. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system, for in-vivo dosimetry and for range verification, was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLDs and EBT films and were within 3% of treatment planning calculations. For range verification, a precision of 0.5 mm was achieved in homogeneous phantoms, and a precision of 2 mm for the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients.
Microbiological cleanliness of the Mars Exploration Rover spacecraft
NASA Technical Reports Server (NTRS)
Newlin, L.; Barengoltz, J.; Chung, S.; Kirschner, L.; Koukol, R.; Morales, F.
2002-01-01
Planetary protection for Mars missions is described, and the approach being taken by the Mars Exploration Rover Project is discussed. Specific topics include alcohol wiping, dry heat microbial reduction, microbiological assays, and the Kennedy Space Center's PHSF clean room.
Family Resemblance in Attitudes to Foods.
ERIC Educational Resources Information Center
Rozin, Paul; And Others
1984-01-01
Compares young adults with their parents and explores food preferences and attitudes to food, especially sensitivity to cleanliness and contamination of foods. Results indicate small positive parent-child correlations for food preferences but larger correlations for contamination sensitivity. (Author/AS)
Thermoplastic Single-Ply Roof Relieves Water Damage and Inconvenience.
ERIC Educational Resources Information Center
Williams, Jennifer Lynn
2002-01-01
Assesses use of thermoplastic single-ply roofs by North Carolina's Mars Hill College to prevent leaks, reduce maintenance costs, and enhance the value of their older historic buildings. Administrators comment on the roof's installation efficiency and cleanliness. (GR)
Norring, M; Manninen, E; de Passillé, A M; Rushen, J; Munksgaard, L; Saloniemi, H
2008-02-01
This experiment compared the effects of sand and straw bedding in free stalls on resting time, cleanliness, hock injuries, and hoof health of dairy cows and tested whether cow preferences for a bedding material depended on the familiarity with the material. A total of 52 dairy cows were kept either on straw bedded concrete stalls or sand stalls for at least 21 wk. The lying behavior was observed, and hock lesions, hoof health, and cleanliness of the cows and stalls were measured. A 5-d preference test between sand and straw stalls was conducted at the end of the experiment. The total daily duration of lying was longer for cows on straw bedding than on sand bedding (straw 749 +/- 16 vs. sand 678 +/- 19 min). During the preference test, cows that had been kept on straw bedding preferred lying in straw stalls [straw 218.7 (133.4 to 239.7) vs. sand 9.0 min (2.8 to 44.8)]; however, cows that had been kept on sand showed no preference [straw 101.3 (51.7 to 205.9) vs. sand 94.3 min (54.1 to 156.1, median and interquartile range)]. Although there were no differences in the dirtiness of stalls, the cows using straw stalls were dirtier than cows using sand stalls [straw 6.04 (5.39 to 6.28) vs. sand 4.19 (3.62 to 5.16)]. At the end of experiment the severity of hock lesions was lower for cows on sand than for cows on straw [sand 0.5 (0.0 to 1.0) vs. straw 1.0 (1.0 to 2.0)]. The improvement in overall hoof health over the observation period was greater for cows kept on sand compared with cows kept on straw [sand -2.00 (-3.75 to -0.25) vs. straw 0.00 (-2.00 to 2.00)]. Straw bedding increased the time that cows spend lying, and cows preferred straw stalls to sand stalls. However, previous experience with sand reduces avoidance of sand stalls. Sand stalls were advantageous for cow cleanliness and health; hock lesions and claw diseases healed more quickly for cows using sand stalls compared with straw.
Ngondi, Jeremiah; Matthews, Fiona; Reacher, Mark; Baba, Samson; Brayne, Carol; Emerson, Paul
2008-04-30
Surgery, Antibiotics, Facial cleanliness and Environmental improvement (SAFE) are advocated by the World Health Organization (WHO) for trachoma control. However, few studies have evaluated the complete SAFE strategy, and of these, none have investigated the associations of Antibiotics, Facial cleanliness, and Environmental improvement (A,F,E) interventions and active trachoma. We aimed to investigate associations between active trachoma and A,F,E interventions in communities in Southern Sudan. Surveys were undertaken in four districts after 3 years of implementation of the SAFE strategy. Children aged 1-9 years were examined for trachoma and uptake of SAFE assessed through interviews and observations. Using ordinal logistic regression, associations between signs of active trachoma and A,F,E interventions were explored. Trachomatous inflammation-intense (TI) was considered more severe than trachomatous inflammation-follicular (TF). A total of 1,712 children from 25 clusters (villages) were included in the analysis. Overall uptake of A,F,E interventions was: 53.0% of the eligible children had received at least one treatment with azithromycin; 62.4% children had a clean face on examination; 72.5% households reported washing faces of children two or more times a day; 73.1% households had received health education; 44.4% of households had water accessible within 30 minutes; and 6.3% households had pit latrines. Adjusting for age, sex, and district baseline prevalence of active trachoma, factors independently associated with reduced odds of a more severe active trachoma sign were: receiving three treatments with azithromycin (odds ratio [OR] = 0.1; 95% confidence interval [CI] 0.0-0.4); clean face (OR = 0.3; 95% CI 0.2-0.4); washing faces of children three or more times daily (OR = 0.4; 95% CI 0.3-0.7); and presence and use of a pit latrine in the household (OR = 0.4; 95% CI 0.2-0.9). Analysis of associations between the A,F,E components of the SAFE strategy and active trachoma showed independent protective effects against active trachoma of mass systemic azithromycin treatment, facial cleanliness, face washing, and use of pit latrines in the household. This strongly argues for continued use of all the components of the SAFE strategy together.
Murphy, V S; Lowe, D E; Lively, F O; Gordon, A W
2018-05-01
The aim of this study was to evaluate the effect of using different floor types to accommodate growing and finishing beef cattle on their performance, cleanliness, carcass characteristics and meat quality. In total, 80 dairy origin young bulls (mean initial live weight 224 kg (SD=28.4 kg)) were divided into 20 blocks with four animals each according to live weight. The total duration of the experimental period was 204 days. The first 101 days was defined as the growing period, with the remainder of the study defined as the finishing period. Cattle were randomly assigned within blocks to one of four floor type treatments, which included fully slatted flooring throughout the entire experimental period (CS); fully slatted flooring covered with rubber strips throughout the entire experimental period (RS); fully slatted flooring during the growing period and moved to a solid floor covered with straw bedding during the finishing period (CS-S) and fully slatted flooring during the growing period and moved to fully slatted flooring covered with rubber strips during the finishing period (CS-RS). Bulls were offered ad libitum grass silage supplemented with concentrates during the growing period. During the finishing period, bulls were offered concentrates supplemented with chopped barley straw. There was no significant effect of floor type on total dry matter intake (DMI), feed conversion ratio, daily live weight gain or back fat depth during the growing and finishing periods. Compared with bulls accommodated on CS, RS and CS-RS, bulls accommodated on CS-S had a significantly lower straw DMI (P<0.01). Although bulls accommodated on CS and CS-S were significantly dirtier compared with those accommodated on RS and CS-RS on days 50 (P<0.05) and 151 (P<0.01), there was no effect of floor type on the cleanliness of bulls at the end of the growing and finishing periods. There was also no significant effect of floor type on carcass characteristics or meat quality. However, bulls accommodated on CS-S had a tendency for less channel, cod and kidney fat (P=0.084) compared with those accommodated on CS, RS and CS-RS. Overall, floor type had no effect on the performance, cleanliness, carcass characteristics or meat quality of growing or finishing beef cattle.
Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems
NASA Technical Reports Server (NTRS)
Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris
2010-01-01
Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.
Integrating Formal Methods and Testing 2002
NASA Technical Reports Server (NTRS)
Cukic, Bojan
2002-01-01
Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting for not only formal verification and program testing, but also the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given the assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (failure probabilities of 10^-4 or lower). The coming years will address methodologies to realistically estimate the impact of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The objectives are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a certain confidence level; and D) quantify and justify the reliability estimate for systems developed using various methods.
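The testing-reduction claim can be made concrete with the classical zero-failure reliability demonstration bound. The sketch below, with a Beta prior standing in for credit from qualitative V&V evidence, is an illustration of the general idea, not the paper's actual statistical framework.

```python
import math
from scipy import stats

def tests_needed_classical(R, C):
    """Failure-free runs n such that observing n successes would be
    too unlikely (probability < 1-C) if true reliability were below R:
    R**n <= 1 - C."""
    return math.ceil(math.log(1 - C) / math.log(R))

def tests_needed_with_prior(R, C, a, b):
    """Same target, but starting from a Beta(a, b) prior on reliability.
    The prior 'successes-equivalent' a is an illustrative stand-in for
    V&V/QA evidence, not a calibrated quantity."""
    n = 0
    # Posterior after n failure-free tests is Beta(a + n, b).
    while stats.beta.sf(R, a + n, b) < C:
        n += 1
    return n

R, C = 1 - 1e-4, 0.99                    # ultra-high reliability target
print(tests_needed_classical(R, C))      # 46050 runs with no prior credit
print(tests_needed_with_prior(R, C, a=20000, b=1))  # far fewer with prior credit
```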
Baeten; Bruggeman; Paepen; Carchon
2000-03-01
The non-destructive quantification of transuranic elements in nuclear waste management or in safeguards verifications is commonly performed by passive neutron assay techniques. To minimise the number of unknown sample-dependent parameters, Neutron Multiplicity Counting (NMC) is applied. We developed a new NMC-technique, called Time Interval Correlation Spectroscopy (TICS), which is based on the measurement of Rossi-alpha time interval distributions. Compared to other NMC-techniques, TICS offers several advantages.
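A minimal sketch of the Rossi-alpha time-interval histogram that TICS is built on, using a synthetic pulse train; the event rates, burst model, and window length are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic pulse train (microseconds): a flat accidental background
# plus short correlated bursts standing in for fission chains.
accidentals = rng.uniform(0, 1e6, 20000)
bursts = np.concatenate([t + rng.exponential(50.0, 5)
                         for t in rng.uniform(0, 1e6, 500)])
times = np.sort(np.concatenate([accidentals, bursts]))

def rossi_alpha(times, window=500.0, bins=100):
    """Histogram of delays from every event to all later events within
    `window`; correlated events appear as an exponential excess
    ~exp(-alpha*t) above the flat accidental floor."""
    delays = []
    for i, t in enumerate(times):
        k = np.searchsorted(times, t + window, side="right")
        delays.append(times[i + 1:k] - t)
    return np.histogram(np.concatenate(delays), bins=bins, range=(0.0, window))

hist, edges = rossi_alpha(times)
print(hist[:10], hist[-10:])   # elevated counts at short delays, flat tail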
Evidence flow graph methods for validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant
1989-01-01
The results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems are given. A translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis were developed. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.
Expert system verification and validation survey, delivery 4
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.
Expert system verification and validation survey. Delivery 2: Survey results
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and industry applications. This is the first task of the series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.
Man-rated flight software for the F-8 DFBW program
NASA Technical Reports Server (NTRS)
Bairnsfather, R. R.
1976-01-01
The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program assembly control, simulator configuration control, erasable-memory load generation, change procedures and anomaly reporting are discussed. The primary verification tools are described, as well as the program test plans and their implementation on the various simulators. Failure effects analysis and the creation of special failure generating software for testing purposes are described.
Expert system verification and validation survey. Delivery 5: Revised
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.
Expert system verification and validation survey. Delivery 3: Recommendations
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.
Klein, Gerwin; Andronick, June; Keller, Gabriele; Matichuk, Daniel; Murray, Toby; O'Connor, Liam
2017-10-13
We present recent work on building and scaling trustworthy systems with formal, machine-checkable proof from the ground up, including the operating system kernel, at the level of binary machine code. We first give a brief overview of the seL4 microkernel verification and how it can be used to build verified systems. We then show two complementary techniques for scaling these methods to larger systems: proof engineering, to estimate verification effort; and code/proof co-generation, for scalable development of provably trustworthy applications. This article is part of the themed issue 'Verified trustworthy software systems'. © 2017 The Author(s).
Bridge Health Monitoring Using a Machine Learning Strategy
DOT National Transportation Integrated Search
2017-01-01
The goal of this project was to cast the SHM problem within a statistical pattern recognition framework. Techniques borrowed from speaker recognition, particularly speaker verification, were used as this discipline deals with problems very similar to...
Catarinucci, L; Tarricone, L
2009-12-01
With the forthcoming transposition of the 2004/40/EC Directive, employers will become responsible for electromagnetic field levels at the workplace. To make this task easier, the scientific community is compiling practical guidelines to be followed. This work aims at enriching such guidelines, especially on dosimetric issues. More specifically, some critical aspects of applying numerical dosimetric techniques to verify compliance with safety limits are highlighted. In particular, three different aspects are considered: the dependence of dosimetric parameters on the shape and internal characterisation of the exposed subject, their dependence on the numerical algorithm used, and the correlation between reference limits and basic restrictions. Results and discussions demonstrate how, even when using sophisticated numerical techniques, in some cases a careful interpretation of the results is mandatory.
EVA Design, Verification, and On-Orbit Operations Support Using Worksite Analysis
NASA Technical Reports Server (NTRS)
Hagale, Thomas J.; Price, Larry R.
2000-01-01
The International Space Station (ISS) design is a very large and complex orbiting structure with thousands of Extravehicular Activity (EVA) worksites. These worksites are used to assemble and maintain the ISS. The challenge facing EVA designers was how to design, verify, and operationally support such a large number of worksites within cost and schedule. This has been solved through the practical use of computer aided design (CAD) graphical techniques that have been developed and used with a high degree of success over the past decade. The EVA design process allows analysts to work concurrently with hardware designers so that EVA equipment can be incorporated and structures configured to allow for EVA access and manipulation. Compliance with EVA requirements is strictly enforced during the design process. These techniques and procedures, coupled with neutral buoyancy underwater testing, have proven most valuable in the development, verification, and on-orbit support of planned or contingency EVA worksites.
Infrasound from the 2009 and 2017 DPRK rocket launches
NASA Astrophysics Data System (ADS)
Evers, L. G.; Assink, J. D.; Smets, P. S. M.
2018-06-01
Supersonic rockets generate low-frequency acoustic waves, that is, infrasound, during launch and re-entry. Infrasound is routinely observed at infrasound arrays of the International Monitoring System, in place for the verification of the Comprehensive Nuclear-Test-Ban Treaty. Association and source identification are key elements of the verification system. The moving nature of a rocket is a defining criterion that distinguishes it from an isolated explosion. Here, it is shown how infrasound recordings can be associated, which leads to identification of the rocket. Propagation modelling is included to further constrain the source identification. Four rocket launches by the Democratic People's Republic of Korea in 2009 and 2017 are analysed, in which multiple arrays detected the infrasound. Source identification in this region is important for verification purposes. It is concluded that with a passive monitoring technique such as infrasound, characteristics of sources of interest can be obtained remotely, that is, infrasonic intelligence, at ranges of more than 4500 km.
Closed Loop Requirements and Analysis Management
NASA Technical Reports Server (NTRS)
Lamoreaux, Michael; Verhoef, Brett
2015-01-01
Effective systems engineering involves the use of analysis in the derivation of requirements and the verification of designs against those requirements. The initial development of requirements often depends on analysis for the technical definition of specific aspects of a product. Following the allocation of system-level requirements to a product's components, the closure of those requirements often involves analytical approaches to verify that the requirement criteria have been satisfied. Meanwhile, changes that occur between these two processes need to be managed in order to achieve a closed-loop requirement derivation/verification process. Herein are presented concepts for employing emerging Teamcenter capabilities to jointly manage requirements and analysis data such that analytical techniques are utilized to effectively derive and allocate requirements, analyses are consulted and updated during change evaluation, and analyses are leveraged during design verification. Recommendations on concept validation case studies are also discussed.
Static test induced loads verification beyond elastic limit
NASA Technical Reports Server (NTRS)
Verderaime, V.; Harrington, F.
1996-01-01
Increasing demands for reliable and least-cost high-performance aerostructures are pressing design analyses, materials, and manufacturing processes to new and narrowly experienced performance and verification technologies. This study assessed the adequacy of current experimental verification of the traditional binding ultimate safety factor which covers rare events in which no statistical design data exist. Because large high-performance structures are inherently very flexible, boundary rotations and deflections under externally applied loads approaching fracture may distort their transmission and unknowingly accept submarginal structures or prematurely fracturing reliable ones. A technique was developed, using measured strains from back-to-back surface mounted gauges, to analyze, define, and monitor induced moments and plane forces through progressive material changes from total-elastic to total-inelastic zones within the structural element cross section. Deviations from specified test loads are identified by the consecutively changing ratios of moment-to-axial load.
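A sketch of the elastic core of the back-to-back gauge reduction: the membrane and bending strain components are separated, converted to an axial force and an induced moment, and their ratio is tracked across consecutive load steps. The material and section properties here are assumed for illustration; the paper's technique carries this bookkeeping through the inelastic range.

```python
# Minimal elastic sketch of a back-to-back strain gauge reduction.
# All section properties below are assumed, illustrative values.
E = 71.0e9          # Pa, elastic modulus (assumed aluminum)
A = 2.0e-3          # m^2, cross-section area (assumed)
I = 1.5e-6          # m^4, second moment of area (assumed)
c = 0.02            # m, neutral axis to gauged surface (assumed)

def reduce_gauge_pair(eps_front, eps_back):
    eps_membrane = 0.5 * (eps_front + eps_back)   # same sign on both faces
    eps_bending  = 0.5 * (eps_front - eps_back)   # opposite sign on faces
    N = E * A * eps_membrane                      # axial (plane) force
    M = E * I * eps_bending / c                   # induced bending moment
    return N, M, (M / N if N else float("inf"))

# Consecutive load steps: a drifting moment-to-axial-load ratio flags
# boundary rotation distorting the transmitted test load.
for eps_f, eps_b in [(1.00e-3, 0.80e-3), (2.00e-3, 1.55e-3), (3.00e-3, 2.20e-3)]:
    N, M, ratio = reduce_gauge_pair(eps_f, eps_b)
    print(f"N = {N/1e3:8.1f} kN  M = {M/1e3:7.2f} kN.m  M/N = {ratio*1e3:6.2f} mm")
```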
Practical Formal Verification of Diagnosability of Large Models via Symbolic Model Checking
NASA Technical Reports Server (NTRS)
Cavada, Roberto; Pecheur, Charles
2003-01-01
This document reports on the activities carried out during a four-week visit of Roberto Cavada at the NASA Ames Research Center. The main goal was to test the practical applicability of the proposed framework, in which a diagnosability problem is reduced to a Symbolic Model Checking problem. Section 2 contains a brief explanation of the major techniques currently used in Symbolic Model Checking, and how these techniques can be tuned to obtain good performance from Model Checking tools. Diagnosability is performed on large and structured models of real plants. Section 3 describes how these plants are modeled, and how models can be simplified to improve the performance of Symbolic Model Checkers. Section 4 reports scalability results. Three test cases are briefly presented, and several parameters and techniques are applied to those test cases to produce comparison tables. Furthermore, a comparison between several Model Checkers is reported. Section 5 summarizes the application of diagnosability verification to a real application. Several properties have been tested, and results have been highlighted. Finally, section 6 draws some conclusions and outlines future lines of research.
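A toy version of the "twin plant" construction that underlies the reduction of diagnosability to model checking: two copies of the plant run synchronized on observable events, and the check searches for an infinite run along which exactly one copy has faulted. The three-state plant and the explicit-state search are illustrative assumptions; the report performs this symbolically over large models.

```python
from collections import deque

TRANS = {   # state -> [(observable_event, next_state, fault_on_this_step)]
    "s0": [("open", "s1", False), ("open", "sF", True)],   # fault is silent
    "s1": [("tick", "s1", False)],
    "sF": [("tick", "sF", False)],
}

def twin_plant(trans, init="s0"):
    """Product of two plant copies synchronized on observable events."""
    start = (init, False, init, False)
    graph, queue = {}, deque([start])
    while queue:
        state = queue.popleft()
        if state in graph:
            continue
        p, fp, q, fq = state
        graph[state] = [(n1, fp or f1, n2, fq or f2)
                        for e1, n1, f1 in trans[p]
                        for e2, n2, f2 in trans[q] if e1 == e2]
        queue.extend(graph[state])
    return graph

def diagnosable(trans):
    graph = twin_plant(trans)
    amb = {s for s in graph if s[1] != s[3]}   # one copy faulted, one not
    # Not diagnosable iff the ambiguous subgraph contains a cycle,
    # i.e. an infinite observationally indistinguishable run exists.
    color = dict.fromkeys(amb, 0)              # 0 white, 1 gray, 2 black
    def dfs(node):
        color[node] = 1
        for nxt in graph[node]:
            if nxt in amb and (color[nxt] == 1 or (color[nxt] == 0 and dfs(nxt))):
                return True
        color[node] = 2
        return False
    return not any(color[s] == 0 and dfs(s) for s in amb)

print(diagnosable(TRANS))   # False: the fault can stay forever unobservable
```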
Faster Double-Size Bipartite Multiplication out of Montgomery Multipliers
NASA Astrophysics Data System (ADS)
Yoshino, Masayuki; Okeya, Katsuyuki; Vuillaume, Camille
This paper proposes novel algorithms for computing double-size modular multiplications with few modulus-dependent precomputations. Low-end devices such as smartcards are usually equipped with hardware Montgomery multipliers. However, due to progress in mathematical attacks, security institutions such as NIST have steadily demanded longer bit-lengths for public-key cryptography, making the multipliers quickly obsolete. In an attempt to extend the lifespan of such multipliers, double-size techniques compute modular multiplications with twice the bit-length of the multipliers. Techniques are known for extending the bit-length of classical Euclidean multipliers, of Montgomery multipliers, and of the combination thereof, namely bipartite multipliers. However, unlike classical and bipartite multiplications, Montgomery multiplications involve modulus-dependent precomputations, which amount to a large part of an RSA encryption or signature verification. The proposed double-size technique simulates double-size multiplications based on single-size Montgomery multipliers, and yet precomputations are essentially free: in a 2048-bit RSA encryption or signature verification with public exponent e = 2^16 + 1, the proposal with a 1024-bit Montgomery multiplier is at least 1.5 times faster than previous double-size Montgomery multiplications.
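For reference, a compact sketch of the single-size Montgomery multiplication (REDC) primitive that such double-size techniques build on; the modulus and operands are arbitrary examples, and real smartcard multipliers implement this step in hardware.

```python
# Single-size Montgomery multiplication (REDC). Illustrative parameters.
def montgomery_setup(n, bits):
    R = 1 << bits
    assert n % 2 == 1 and n < R
    n_inv = pow(-n, -1, R)        # -n^{-1} mod R (Python 3.8+)
    return R, n_inv

def montgomery_mul(a, b, n, R, n_inv):
    """Return a*b*R^{-1} mod n without any division by n."""
    t = a * b
    m = (t * n_inv) % R           # in hardware: low half of the product only
    u = (t + m * n) // R          # exact division: the low bits cancel
    return u - n if u >= n else u

n = 0xE3B0C44298FC1C149AFBF4C8996FB925    # odd 128-bit modulus (example)
R, n_inv = montgomery_setup(n, 128)
a, b = 0x1234567890ABCDEF, 0xFEDCBA0987654321
# Convert into the Montgomery domain, multiply, convert back.
aR, bR = a * R % n, b * R % n
abR = montgomery_mul(aR, bR, n, R, n_inv)
ab = montgomery_mul(abR, 1, n, R, n_inv)
assert ab == a * b % n
print(hex(ab))
```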
Application of additive laser technologies in the gas turbine blades design process
NASA Astrophysics Data System (ADS)
Shevchenko, I. V.; Rogalev, A. N.; Osipov, S. K.; Bychkov, N. M.; Komarov, I. I.
2017-11-01
The emergence of modern innovative technologies requires developing new, and modernizing existing, design and production processes. This is especially relevant for designing the high-temperature turbines of gas turbine engines, whose development is characterized by a transition to higher working-medium parameters in order to improve efficiency. This article presents a design technique for gas turbine blades based on predictive verification of the thermal and hydraulic models of their cooling systems, by testing a blade prototype fabricated using selective laser melting. The technique was proven during development of the first-stage blade cooling system for the high-pressure turbine. An experimental procedure was developed for verifying the thermal model of blades with convective cooling systems, based on comparing the heat-flux density obtained from numerical simulation with the results of tests in a liquid-metal thermostat. The technique makes it possible to obtain an experimentally verified blade design and to exclude its experimental adjustment after the start of mass production.
Low radioactivity material for use in mounting radiation detectors
NASA Technical Reports Server (NTRS)
Fong, Marshall; Metzger, Albert E.; Fox, Richard L.
1988-01-01
Two materials, sapphire and synthetic quartz, have been found for use in Ge detector mounting assemblies. These materials combine desirable mechanical, thermal, and electrical properties with the radioactive cleanliness required to detect minimal amounts of K, Th, and U.
DOT National Transportation Integrated Search
2016-09-02
Public transportation agencies can obtain large amounts of information regarding timeliness, efficiency, cleanliness, ridership, and other : performance measures. However, these metrics are based on the interests of these agencies and do not necessar...
Multibody modeling and verification
NASA Technical Reports Server (NTRS)
Wiens, Gloria J.
1989-01-01
A summary of a ten week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted for gathering information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects were also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focusses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.
Development of a software safety process and a case study of its use
NASA Technical Reports Server (NTRS)
Knight, John C.
1993-01-01
The goal of this research is to continue the development of a comprehensive approach to software safety and to evaluate the approach with a case study. The case study is a major part of the project, and it involves the analysis of a specific safety-critical system from the medical equipment domain. The particular application being used was selected because of the availability of a suitable candidate system. We consider the results to be generally applicable and in no way particularly limited by the domain. The research is concentrating on issues raised by the specification and verification phases of the software lifecycle since they are central to our previously-developed rigorous definitions of software safety. The theoretical research is based on our framework of definitions for software safety. In the area of specification, the main topics being investigated are the development of techniques for building system fault trees that correctly incorporate software issues and the development of rigorous techniques for the preparation of software safety specifications. The research results are documented. Another area of theoretical investigation is the development of verification methods tailored to the characteristics of safety requirements. Verification of the correct implementation of the safety specification is central to the goal of establishing safe software. The empirical component of this research is focusing on a case study in order to provide detailed characterizations of the issues as they appear in practice, and to provide a testbed for the evaluation of various existing and new theoretical results, tools, and techniques. The Magnetic Stereotaxis System is summarized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.
In the present work, a Verification and Validation procedure is presented and applied showing, through a practical example, how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology assesses that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition, due to the competition between the resistive and ideal interchange instabilities.
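A minimal method-of-manufactured-solutions check in the spirit of the code verification procedure described above, applied to a toy 1-D Poisson solver rather than a plasma turbulence code: a solution is manufactured, the source term it implies is derived analytically, and the observed order of accuracy is compared with the scheme's formal order.

```python
import numpy as np

def solve_poisson(f, N):
    """Second-order finite differences for -u'' = f on [0,1], u(0)=u(1)=0."""
    h = 1.0 / N
    x = np.linspace(0, 1, N + 1)
    main = 2.0 * np.ones(N - 1)
    off = -1.0 * np.ones(N - 2)
    A = (np.diag(main) + np.diag(off, 1) + np.diag(off, -1)) / h**2
    u = np.zeros(N + 1)
    u[1:-1] = np.linalg.solve(A, f(x[1:-1]))
    return x, u

u_exact = lambda x: np.sin(np.pi * x)          # manufactured solution
f = lambda x: np.pi**2 * np.sin(np.pi * x)     # source term it implies

errors = []
for N in (16, 32, 64, 128):
    x, u = solve_poisson(f, N)
    errors.append(np.max(np.abs(u - u_exact(x))))
orders = np.log2(np.array(errors[:-1]) / np.array(errors[1:]))
print(orders)   # should approach 2.0, the scheme's formal order of accuracy
```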
Surface inspection: Research and development
NASA Technical Reports Server (NTRS)
Batchelder, J. S.
1987-01-01
Surface inspection techniques are used for process learning, quality verification, and postmortem analysis in manufacturing for a spectrum of disciplines. First, trends in surface analysis are summarized for integrated circuits, high density interconnection boards, and magnetic disks, emphasizing on-line applications as opposed to off-line or development techniques. Then, a closer look is taken at microcontamination detection from both a patterned defect and a particulate inspection point of view.
TomoTherapy MLC verification using exit detector data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen Quan; Westerly, David; Fang Zhenyu
2012-01-15
Purpose: Treatment delivery verification (DV) is important in the field of intensity modulated radiation therapy (IMRT). While IMRT and image guided radiation therapy (IGRT) allow us to create more conformal plans and enable the use of tighter margins, an erroneously executed plan can have detrimental effects on the treatment outcome. The purpose of this study is to develop a DV technique to verify TomoTherapy's multileaf collimator (MLC) using the onboard mega-voltage CT detectors. Methods: The proposed DV method uses temporal changes in the MVCT detector signal to predict actual leaf open times delivered on the treatment machine. Penumbra and scattered radiation effects may produce confounding results when determining leaf open times from the raw detector data. To reduce the impact of these effects, an iterative Richardson-Lucy (R-L) deconvolution algorithm is applied. Optical sensors installed on each MLC leaf are used to verify the accuracy of the DV technique. The robustness of the DV technique is examined by introducing different attenuation materials in the beam. Additionally, the DV technique has been used to investigate several clinical plans which failed to pass delivery quality assurance (DQA) and was successful in identifying MLC timing discrepancies as the root cause. Results: The leaf open time extracted from the exit detector showed good agreement with the optical sensors under a variety of conditions. Detector-measured leaf open times agreed with optical sensor data to within 0.2 ms, and 99% of the results agreed within 8.5 ms. These results changed little when attenuation was added in the beam. For the clinical plans failing DQA, the dose calculated from reconstructed leaf open times played an instrumental role in discovering the root cause of the problem. Throughout the retrospective study, it was found that the reconstructed dose always agreed with measured doses to within 1%. Conclusions: The exit detectors in the TomoTherapy treatment systems can provide valuable information about MLC behavior during delivery. A technique to estimate the TomoTherapy binary MLC leaf open time from exit detector signals is described. This technique is shown to be both robust and accurate for delivery verification.
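An illustrative 1-D Richardson-Lucy deconvolution of the kind applied to the detector signal; the square "leaf-open" pulse, Gaussian response kernel, and iteration count are synthetic stand-ins for the actual MVCT data.

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=50):
    """Standard R-L iteration: estimate <- estimate * (obs/blur (x) psf*)."""
    psf_flipped = psf[::-1]
    estimate = np.full_like(observed, observed.mean())
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Binary-MLC-like square pulse blurred by a smooth detector response.
truth = np.zeros(200)
truth[80:120] = 1.0
psf = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)
psf /= psf.sum()
observed = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(observed, psf)
print(np.round(restored[75:125], 2))  # edges sharpened back toward the pulse
```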
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - 4100 VAPOR DETECTOR - ELECTRONIC SENSOR TECHNOLOGY
In July 1997, the U.S. Environmental Protection Agency conducted a demonstration of polychlorinated biphenyl (PCB) FIELD ANALYTICAL TECHNIQUES. The demonstration design was subjected to extensive review and comment by EPA's National Exposure Research Laboratory (NERL) Environmen...
Projects in an expert system class
NASA Technical Reports Server (NTRS)
Whitson, George M.
1991-01-01
Many universities now teach courses in expert systems. In these courses students study the architecture of an expert system, knowledge acquisition techniques, methods of implementing systems and verification and validation techniques. A major component of any such course is a class project consisting of the design and implementation of an expert system. Discussed here are a number of techniques that we have used at the University of Texas at Tyler to develop meaningful projects that could be completed in a semester course.
NASA Technical Reports Server (NTRS)
Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.
1975-01-01
Techniques and support software for the efficient performance of simulation validation are discussed. Overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real time acquisition and formatting of data from an all up operational simulator, and methods and criteria for comparison and evaluation of simulation data are included. Vehicle subsystems modules, module integration, special test requirements, and reference data formats are also described.
Development and production of a multilayer-coated x-ray reflecting stack for the Athena mission
NASA Astrophysics Data System (ADS)
Massahi, S.; Ferreira, D. D. M.; Christensen, F. E.; Shortt, B.; Girou, D. A.; Collon, M.; Landgraf, B.; Barriere, N.; Krumrey, M.; Cibik, L.; Schreiber, S.
2016-07-01
The Advanced Telescope for High-Energy Astrophysics, Athena, selected as the European Space Agency's second large-class mission, is based on the novel Silicon Pore Optics X-ray mirror technology. DTU Space has been working for several years on the development of multilayer coatings on the Silicon Pore Optics in an effort to optimize the throughput of the Athena optics. A linearly graded Ir/B4C multilayer has been deposited on the mirrors, via the direct current magnetron sputtering technique, at DTU Space. This specific multilayer has, through simulations, been demonstrated to produce the highest reflectivity at 6 keV, which is a goal for the scientific objectives of the mission. A critical aspect of the coating process concerns the use of photolithography techniques, upon which we present the most recent developments, in particular related to the cleanliness of the plates. Experiments regarding the lift-off and stacking of the mirrors have been performed, and the results obtained are presented. Furthermore, characterization of the deposited thin films was performed with X-ray reflectometry at DTU Space and in the laboratory of the Physikalisch-Technische Bundesanstalt at the synchrotron radiation facility BESSY II.
Electrical contacts to individual SWCNTs: A review
Hierold, Christofer; Haluska, Miroslav
2014-01-01
Owing to their superior electrical characteristics, nanometer dimensions and definable lengths, single-walled carbon nanotubes (SWCNTs) are considered as one of the most promising materials for various types of nanodevices. Additionally, they can be used as either passive or active elements. To be integrated into circuitry or devices, they are typically connected with metal leads to provide electrical contacts. The properties and quality of these electrical contacts are important for the function and performance of SWCNT-based devices. Since carbon nanotubes are quasi-one-dimensional structures, contacts to them are different from those for bulk semiconductors. Additionally, some techniques used in Si-based technology are not compatible with SWCNT-based device fabrication, such as the contact area cleaning technique. In this review, an overview of the investigations of metal–SWCNT contacts is presented, including the principle of charge carrier injection through metal–SWCNT contacts and experimental achievements. The methods for characterizing the electrical contacts are discussed as well. The parameters which influence the contact properties are summarized, mainly focusing on the contact geometry, metal type and the cleanliness of the SWCNT surface affected by the fabrication processes. Moreover, the challenges for widespread application of CNFETs are additionally discussed. PMID:25551048
40 CFR 80.141 - Interim detergent gasoline program.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) “Carburetor Cleanliness Test Procedure, State-of-the-Art Summary, Report: 1973-1981”, Coordinating Research... ultimate consumer; (ii) All additized post-refinery component (PRC); and (iii) All detergent additives sold... who manufacture, supply, or transfer detergent additives or detergent-additized post-refinery...
Code of Federal Regulations, 2010 CFR
2010-04-01
... CONSUMPTION CURRENT GOOD MANUFACTURING PRACTICE IN MANUFACTURING, PACKING, OR HOLDING HUMAN FOOD General... contamination of food. The methods for maintaining cleanliness include, but are not limited to: (1) Wearing outer garments suitable to the operation in a manner that protects against the contamination of food...
ADDRESSING EMERGING ISSUES IN WATER QUALITY THROUGH ENVIRONMENTAL CHEMISTRY
Public concern over cleanliness and safety of source and recreational waters has prompted researchers to look for indicators of water quality. Giving public water authorities multiple tools to measure and monitor levels of chemical contaminants, as well as chemical markers of c...
Using the Case Study Method to Treat Several Problems in a Family Indicated for Child Neglect.
ERIC Educational Resources Information Center
Lutzker, John R.; And Others
1984-01-01
An example of the ecobehavioral perspective is offered of a family referred for child neglect which received a variety of services from one project, including interventions for health maintenance, cleanliness, and personal hygiene. (CL)
Microbial biodiversity assessment of the European Space Agency's ExoMars 2016 mission.
Koskinen, Kaisa; Rettberg, Petra; Pukall, Rüdiger; Auerbach, Anna; Wink, Lisa; Barczyk, Simon; Perras, Alexandra; Mahnert, Alexander; Margheritis, Diana; Kminek, Gerhard; Moissl-Eichinger, Christine
2017-10-25
The ExoMars 2016 mission, consisting of the Trace Gas Orbiter and the Schiaparelli lander, was launched on March 14, 2016 from Baikonur, Kazakhstan and reached its destination in October 2016. The Schiaparelli lander was subject to strict requirements for microbial cleanliness according to the obligatory planetary protection policy. To reach the required cleanliness, the ExoMars 2016 flight hardware was assembled in a newly built, biocontrolled cleanroom complex at Thales Alenia Space in Turin, Italy. In this study, we performed microbiological surveys of the cleanroom facilities and the spacecraft hardware before and during the assembly, integration and testing (AIT) activities. Besides the European Space Agency (ESA) standard bioburden assay, which served as a proxy for microbiological contamination in general, we performed various alternative cultivation assays and utilised molecular techniques, including quantitative PCR and next generation sequencing, to assess the absolute and relative abundance and broadest diversity of microorganisms and their signatures in the cleanroom and on the spacecraft hardware. Our results show that the bioburden, detected microbial contamination and microbial diversity decreased continuously after the cleanroom was decontaminated with more effective cleaning agents and during the ongoing AIT. The studied cleanrooms and change room were occupied by very distinct microbial communities: overall, the change room harboured a higher number and diversity of microorganisms, including Propionibacterium, which was found to be significantly increased in the change room. In particular, the so-called alternative cultivation assays proved important in detecting a broader cultivable diversity than covered by the standard bioburden assay and thus completed the picture of the cleanroom microbiota. During the whole project, the bioburden stayed at an acceptable level and did not raise any concern for the ExoMars 2016 mission. The cleanroom complex at Thales Alenia Space in Turin is an excellent example of how efficient microbiological control is performed.
NASA Technical Reports Server (NTRS)
Dolgin, B.; Yarbrough, C.; Carson, J.; Troy, R.
2000-01-01
The proposed Mars Sample Transfer Chain Architecture provides Planetary Protection Officers with the clean samples that are required for the eventual release from confinement of the returned Martian samples. At the same time, no absolute cleanliness or sterility requirement is placed on any part of the Lander (including the deep drill), the Mars Ascent Vehicle (MAV), any part of the Orbiting Sample container (OS), the Rover mobility platform, any part of the Minicorer, the robotic arm (including instrument sensors), or most of the caching equipment on the Rover. The removal of the strict requirements in excess of Category IVa cleanliness (Pathfinder clean) is expected to lead to significant cost savings. The proposed architecture assumes that cross-contamination renders all surfaces in the vicinity of the rover(s) and the lander(s) contaminated. Thus, no accessible surface of Martian rock or soil is free of Earth contamination. As a result, only subsurface samples (either rock or soil) can and will be collected for eventual return to Earth. Uncontaminated samples can be collected from a Category IVa clean platform. Both subsurface soil and rock samples can be kept clean if they are collected by devices that are self-contained and clean and sterile inside only. The top layer of the sample is removed in a manner that does not contaminate the collection tools. A biobarrier (e.g., aluminum foil) covering the moving parts of these devices may be used as the only self-removing bio-blanket required. The samples never leave the collection tools. The lids are placed on these tools inside the collection device. These single-use tools, with the lid and the sample inside, are brought to Earth in the OS. The lids have to be designed to be impenetrable to Earth organisms. The latter is a well-established art.
Hygiene behaviour in rural Nicaragua in relation to diarrhoea.
Gorter, A C; Sandiford, P; Pauw, J; Morales, P; Pérez, R M; Alberts, H
1998-12-01
Childhood diarrhoea is a leading cause of morbidity and mortality in Nicaragua. Amongst the risk factors for its transmission are 'poor' hygiene practices. We investigated the effect of a large number of hygiene practices on diarrhoeal disease in children aged <2 years and validated the technique of direct observation of hygiene behaviour. A prospective follow-up study was carried out in a rural zone of Nicaragua. From the database of a previously conducted case-control study on water and sanitation 172 families were recruited, half of which had experienced a higher than expected rate of diarrhoea in their children and the other half a lower rate. Hygiene behaviour was observed over two mornings and diarrhoea incidence was recorded with a calendar, filled out by the mother, and collected every week for 5 months. Of 46 'good' practices studied, 39 were associated with a lower risk of diarrhoea, five were unrelated and only for two a higher risk was observed. Washing of hands, domestic cleanliness (kitchen, living room, yard) and the use of a diaper/underclothes by the child had the strongest protective effect. Schooling (>3 years of primary school) and better economic position (possession of a radio) had a positive influence on general hygiene behaviour, education having a slightly stronger effect when a radio was present. Individual hygiene behaviour appeared to be highly variable in contrast with the consistent behaviour of the community as a whole. Feasible and appropriate indicators of hygiene behaviour were found to be domestic cleanliness and the use of a diaper or underclothes by the child. A consistent relationship between almost all hygiene practices and diarrhoea was detected, more schooling producing better hygiene behaviour. The high variability of hygiene behaviour at the individual level requires repeated observations (at least two) before and after the hygiene education in the event one wants to measure the impact of the campaign on the individual.
Jeon, In-Soo; Spångberg, Larz S W; Yoon, Tai-Cheol; Kazemi, Reza B; Kum, Kee-Yeon
2003-11-01
The design of the cutting blade of rotary instruments may affect the outcome of root canal instrumentation in terms of cleanliness. The aim of this scanning electron microscopic study was to compare the quality and amount of smear layer generated in the apical third of straight root canals by 2 rotary nickel-titanium reamers and 1 rotary steel reamer with different cutting blade designs. Seventy intact, single-rooted human mandibular premolars with straight, fully developed roots were selected for this study. Before instrumentation, the cervical portion of all teeth was removed by using a microtome (Isomet), leaving 13-mm-long roots. Automated preparation was performed with ProFile (n = 20) and Hero 642 (n = 20) reamers by using the crown-down technique and with a stainless steel engine reamer (Mani; n = 20) by using a reaming motion. All root canals were instrumented to No. 40. A control group (pulp extirpation with barbed broaches; n = 10) was also included. Irrigation with 3 mL of a 1% sodium hypochlorite (NaOCl) solution was performed after each instrumentation. After the instrumentation, each root was split longitudinally, and a scanning electron microscope was used to examine the selected areas of the canal walls at the apical third from 2 different perspectives. A 4-category scoring system for smear layer was used, and the resulting scores were statistically analyzed. The least smear layer remained in the Hero 642 group at the selected apical third of straight root canals (P < .05). However, all instruments left a smear layer. The surface texture of the smear layer, in addition to the depth and the frequency of packed materials into the dentinal tubules, varied with instrument type. These data revealed that the design of the cutting blade of rotary instruments can affect root canal cleanliness in straight root canals. This information may be useful in the selection of nickel-titanium rotary reamers.
Power Performance Verification of a Wind Farm Using the Friedman's Test.
Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L
2016-06-03
In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman's test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.
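A hedged sketch of the paper's core step using SciPy's Friedman test: each block is a wind-speed bin, each treatment is a turbine, and the guaranteed power curve enters as one more turbine. The SCADA-like power values below are synthetic.

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(1)
wind_bins = 12
base = np.linspace(200, 1800, wind_bins)              # kW per wind-speed bin
guaranteed = base                                     # guaranteed power curve
turbine_a = base * 1.00 + rng.normal(0, 15, wind_bins)
turbine_b = base * 0.98 + rng.normal(0, 15, wind_bins)
turbine_c = base * 0.90 + rng.normal(0, 15, wind_bins)  # underperformer

stat, p = friedmanchisquare(guaranteed, turbine_a, turbine_b, turbine_c)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")
if p < 0.05:
    print("power performance differs; follow up with pairwise comparisons")
```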
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, Phillip A.; O'Hagan, Ryan; Shumaker, Brent
The Advanced Test Reactor (ATR) has always had a comprehensive procedure to verify the performance of its critical transmitters and sensors, including RTDs and pressure, level, and flow transmitters. These transmitters and sensors have been periodically tested for response time and calibration verification to ensure accuracy. With the implementation of online monitoring (OLM) techniques at ATR, the calibration verification and response time testing of these transmitters and sensors are performed remotely, automatically, and hands-off, include more portions of the system, and can be carried out at almost any time during process operations. The work was done under a DOE-funded SBIR project carried out by AMS. As a result, ATR is now able to save the manpower that has been spent over the years on manual calibration verification and response time testing of its temperature and pressure sensors and refocus those resources toward equipment reliability needs. More importantly, implementation of OLM will help enhance overall availability, safety, and efficiency. Together with ATR's equipment reliability programs, the integration of OLM will also help with the I&C aging management goals of the Department of Energy and the long-term operation of ATR.
NASA Astrophysics Data System (ADS)
Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang
2016-07-01
This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
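A minimal ordinary (non-Bayesian) Procrustes alignment of one storm-object outline onto another, the geometric operation at the core of the shape analysis; the landmark sets are synthetic, and the full method places distributions over these shape parameters.

```python
import numpy as np

def procrustes_align(X, Y):
    """Return Y transformed (translation, scale, rotation) to best match X,
    plus the residual Procrustes distance. Least-squares via SVD."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Xc = Xc / np.linalg.norm(Xc)
    Yc = Yc / np.linalg.norm(Yc)
    U, S, Vt = np.linalg.svd(Xc.T @ Yc)
    R = Vt.T @ U.T                        # optimal rotation (may reflect)
    scale = S.sum()
    return scale * Yc @ R, 1 - scale**2   # aligned shape, Procrustes distance

# Two storm-cell outlines as landmark sets (synthetic): same shape,
# rotated, scaled, and translated.
X = np.array([[0, 0], [2, 0], [2, 1], [0, 1]], float)
theta = 0.4
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
Y = 1.7 * X @ rot.T + np.array([5.0, 3.0])
aligned, d2 = procrustes_align(X, Y)
print(round(d2, 6))   # ~0: identical shape up to a similarity transform
```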
Branck, Tobyn A; Hurley, Matthew J; Prata, Gianna N; Crivello, Christina A; Marek, Patrick J
2017-06-01
Listeria monocytogenes is of great concern in food processing facilities because it persists in biofilms, facilitating biotransfer. Stainless steel is commonly used for food contact surfaces and transport containers. L. monocytogenes biofilms on stainless steel served as a model system for surface sampling, to test the performance of a sonicating swab in comparison with a standard cotton swab. Swab performance and consistency were determined using total viable counts. Stainless steel coupons sampled with both types of swabs were examined using scanning electron microscopy, to visualize biofilms and surface structures (i.e., polishing grooves and scratches). Laser scanning confocal microscopy was used to image and to quantitate the biofilms remaining after sampling with each swab type. The total viable counts were significantly higher (P ≤ 0.05) with the sonicating swab than with the standard swab in each trial. The sonicating swab was more consistent in cell recovery than was the standard swab, with coefficients of variation ranging from 8.9% to 12.3% and from 7.1% to 37.6%, respectively. Scanning electron microscopic imaging showed that biofilms remained in the polished grooves of the coupons sampled with the standard swab but were noticeably absent with the sonicating swab. Percent area measurements of biofilms remaining on the stainless steel coupons showed significantly (P ≤ 0.05) less biofilm remaining when the sonicating swab was used (median, 1.1%), compared with the standard swab (median, 70.4%). The sonicating swab provided greater recovery of cells, with more consistency, than did the standard swab, as it employs sonication, suction, and scrubbing. IMPORTANCE Inadequate surface sampling can result in foodborne illness outbreaks from biotransfer, since verification of sanitization protocols relies on surface sampling and recovery of microorganisms for detection and enumeration. Swabbing is a standard method for microbiological sampling of surfaces. Although swabbing offers portability and ease of use, there are limitations, such as high user variability and low recovery rates, which can be attributed to many different causes. This study demonstrates some benefits that a sonicating swab has over a standard swab for removal and collection of microbiological samples from a surface, to provide better verification of surface cleanliness and to help decrease the potential for biotransfer of pathogens into foods. Copyright © 2017 American Society for Microbiology.
Huang, Yu-Shan; Chen, Yee-Chun; Chen, Mei-Ling; Cheng, Aristine; Hung, I-Chen; Wang, Jann-Tay; Sheng, Wang-Huei; Chang, Shan-Chwen
2015-08-01
Environmental cleaning is essential in reducing microbial colonization and health care-associated infections in hospitals. However, there is no consensus on a standard method to assess hospital cleanliness, and comparisons of newer methodologies, such as the adenosine triphosphate (ATP) bioluminescence assay, with traditional methods are limited. A prospective study was conducted at a medical center between January 2013 and August 2013. In each selected room, 10-12 high-touch surfaces were sampled before and after terminal cleaning. The adequacy of cleaning was evaluated by visual inspection, aerobic colony counts (ACCs), and the ATP bioluminescence assay. Eighty-five environmental surfaces from 8 rooms were evaluated by all 3 methods. The overall inadequacy defined by visual inspection, ACC, and ATP level was 11.8%, 20.0%, and 50.6% before cleaning and 4.7%, 5.9%, and 21.2% after cleaning, respectively. A correlation between ACC and ATP was found (r = 0.285, P < .001) using log10 values. Using ACCs <2.5 colony forming units/cm² as the cutoff for cleanliness, the ATP assay had better sensitivity than visual inspection (63.6% vs 27.3%). Receiver operating characteristic analysis of the ATP assay indicated that the optimal ATP cutoff value was approximately 5.57 relative light units/cm². The ATP bioluminescence assay is a sensitive and rapid tool for evaluating the quality of terminal cleaning. We emphasize the value of using a quantitative method to monitor environmental cleaning at hospitals. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
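A sketch of how such an optimal cutoff can be estimated: sweep candidate thresholds against the ACC-derived labels and take the Youden-index maximum on the ROC curve. The RLU values below are synthetic, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic ATP readings (RLU/cm^2) labeled by the ACC reference method.
dirty = rng.lognormal(mean=2.5, sigma=0.8, size=60)    # ACC >= 2.5 CFU/cm^2
clean = rng.lognormal(mean=1.0, sigma=0.8, size=120)   # ACC <  2.5 CFU/cm^2
atp = np.concatenate([dirty, clean])
label = np.concatenate([np.ones(60), np.zeros(120)])   # 1 = dirty surface

best = None
for cut in np.unique(atp):
    pred = atp >= cut
    sens = (pred & (label == 1)).sum() / (label == 1).sum()
    spec = (~pred & (label == 0)).sum() / (label == 0).sum()
    youden = sens + spec - 1                 # Youden index J = sens + spec - 1
    if best is None or youden > best[0]:
        best = (youden, cut, sens, spec)

youden, cut, sens, spec = best
print(f"cutoff = {cut:.2f} RLU/cm^2, sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```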
High-radiance LDP source for mask inspection and beam line applications (Conference Presentation)
NASA Astrophysics Data System (ADS)
Teramoto, Yusuke; Santos, Bárbara; Mertens, Guido; Kops, Ralf; Kops, Margarete; von Wezyk, Alexander; Bergmann, Klaus; Yabuta, Hironobu; Nagano, Akihisa; Ashizawa, Noritaka; Taniguchi, Yuta; Yamatani, Daiki; Shirai, Takahiro; Kasama, Kunihiko
2017-04-01
High-throughput actinic mask inspection tools are needed as EUVL begins to enter the volume production phase. One of the key technologies needed to realize such inspection tools is a high-radiance EUV source, with a radiance on the order of 100 W/mm²/sr. Ushio is developing laser-assisted discharge-produced plasma (LDP) sources. Ushio's LDP source is able to provide sufficient radiance as well as cleanliness, stability and reliability. Radiance behind the debris mitigation system was confirmed to be 120 W/mm²/sr at 9 kHz, and peak radiance at the plasma was increased to over 200 W/mm²/sr in recent development, which supports high-throughput, high-precision mask inspection in current and future technology nodes. One of the unique features of Ushio's LDP source is cleanliness. Cleanliness evaluation using both grazing-incidence Ru mirrors and normal-incidence Mo/Si mirrors showed no considerable damage to the mirrors other than smooth sputtering of the surface at the pace of a few nm per Gpulse. In order to prove system reliability, several long-term tests were performed. Data recorded during the tests were analyzed to assess two-dimensional radiance stability. In addition, several operating parameters were monitored to determine which contribute to the radiance stability. The latest model, which features a large opening angle, was recently developed so that the tool can utilize a large number of debris-free photons behind the debris shield. The model was designed both for beam line applications and for high-throughput mask inspection. At the time of publication, the first product is expected to be in use at the customer site.
Criteria evaluation for cleanliness testing phase 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meltzer, Michael; Koester, Carolyn; Stefanni, Chris
1999-02-04
The Boeing Company (Boeing) contracted with Lawrence Livermore National Laboratory (LLNL) to develop criteria for evaluating the efficacy of its parts-cleaning processes. In particular, LLNL and Boeing are attempting to identify levels of contamination that lead to part failures; contamination sufficient to impair anodizing, alodining, painting, or welding operations is considered a "part failure." In the recently completed "Phase 0" part of the project, preliminary analyses of aluminum substrates were performed as a first step in determining suitable cleanliness criteria for actual Boeing parts made from this material. A wide spread of contamination levels was specified for the Phase 0 test coupons in the hope of finding a range in which an appropriate cleanliness specification might lie. It was planned that, based on the results of the Phase 0 testing, further, more detailed analyses ("Phase 1 testing") would be performed to identify the most appropriate criteria more accurately. For the Phase 0 testing, Boeing supplied LLNL with 3" x 6" and 3" x 10" aluminum test panels, which LLNL contaminated with measured amounts of typical hydrocarbon substances encountered in Boeing's fabrication operations. The panels were then subjected by Boeing to normal cleaning procedures, after which they went through one of the following sets of operations: anodizing and primer painting; alodining (chromating) and primer painting; or welding. The coatings or welds were then examined by both Boeing and LLNL to determine whether any of the operations were impaired and whether there was a correlation between contamination level and damage to the parts. The experimental approach and results are described in detail.
Fornwalt, Lori; Riddell, Brad
2014-01-01
It is widely acknowledged that the hospital environment is an important reservoir for many of the pathogenic microbes associated with health care-associated infections (HAIs). Environmental cleaning plays an important role in the prevention and containment of HAIs, in patient safety, and in the overall experience of health care facilities. New technologies, such as pulsed xenon ultraviolet (PX-UV) light systems, are an innovative development for enhanced cleaning and decontamination of hospital environments. A portable PX-UV disinfection device delivers pulsed UV light to destroy microbial pathogens and spores and can be used in conjunction with manual environmental cleaning. In addition, this technology facilitates thorough disinfection of hospital rooms in 10-15 minutes. The current study was conducted to evaluate whether the introduction of the PX-UV device had a positive impact on patient satisfaction. Satisfaction was measured using the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey. In 2011, prior to the introduction of the PX-UV system, patient HCAHPS scores for cleanliness averaged 75.75%. In the first full quarter after enhanced cleaning of the facility was introduced, this improved to 83%. Overall scores for the hospital rose from 76% (first quarter, 2011) to 87.6% (fourth quarter, 2012). As a result of this improvement, the hospital received 1% of at-risk reimbursement from the inpatient prospective payment system as well as additional funding. Cleanliness of the hospital environment is one of the questions included in the HCAHPS survey and one measure of patient satisfaction. After the introduction of the PX-UV system, the score for cleanliness and the overall rating of the hospital rose from below the fiftieth to the ninety-ninth percentile. This improvement in the patient experience was associated with financial benefits to the hospital.
Lund-Nielsen, Betina; Adamsen, Lis; Kolmos, Hans Jørn; Rørth, Mikael; Tolver, Anders; Gottrup, Finn
2011-11-01
Malignant wounds (MWs) occur in 5-10% of all cancer patients; malodor and exudation are the most common side effects. The aim was to determine the influence of honey-coated compared with silver-coated bandages on the treatment of MWs. Patients were randomly assigned to either group A (honey-coated bandages) or group B (silver-coated bandages). Parameters were wound size, cleanliness, malodor, exudation, and wound pain. Digital photographs, visual analog scales (VAS), and wound morphology registration were used for measurement at baseline and following the 4-week intervention. Sixty-nine patients with MWs and advanced cancer, aged 47-90 (median 65.6), were included. No statistically significant difference was noted between the groups with respect to wound size, degree of cleanliness, exudation, malodor, or wound pain. There was a median decrease in wound size of 15 cm² and 8 cm² in groups A and B, respectively (p = 0.63). Based on post-intervention pooled data from the two groups, improvement was seen in 62% of the participants with respect to wound size and in 58% (n = 69) with respect to cleanliness. The VAS scores for malodor (p = 0.007) and exudation (p < 0.0001) improved significantly post-intervention. Patients with reduced wound size had a median survival time of 387 days, compared with 134 days in patients with no wound reduction (p = 0.003). The use of honey-coated and silver-coated bandages improved the outcome of MWs, and no differences were found between the two regimens. Both types of bandages are recommended for use by patients with MWs containing tumor debris and necrosis.
Evidence flow graph methods for validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant
1988-01-01
This final report describes the results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems. The approach was to develop a translator that converts Horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis. These tools were then applied to a simple rule base that contained errors. The method was found capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.
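A minimal sketch of the general idea, under stated assumptions: Horn-clause rules are translated into a directed evidence flow graph, and evidence is propagated by forward chaining. The rule base, representation, and function names are hypothetical; the report's actual translator and simulator are not reproduced here.

```python
# Hypothetical translation of a Horn-clause rule base into an evidence
# flow graph, plus forward-chaining propagation; illustration only.

rules = {
    # consequent: list of antecedents (Horn clause: body -> head)
    "alarm": ["sensor_a", "sensor_b"],
    "shutdown": ["alarm", "operator_confirms"],
}

def build_graph(rules):
    """Edges run from each antecedent to its consequent."""
    return [(ante, head) for head, body in rules.items() for ante in body]

def propagate(rules, facts):
    """Forward-chain until a fixed point; returns all derivable atoms."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for head, body in rules.items():
            if head not in derived and all(a in derived for a in body):
                derived.add(head)
                changed = True
    return derived

print(build_graph(rules))
print(propagate(rules, {"sensor_a", "sensor_b", "operator_confirms"}))
```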
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to report the state of the practice in Verification and Validation (V and V) of Expert Systems (ESs) in current NASA and industry applications. This is the first task of a series whose ultimate purpose is to ensure that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state of the practice is to check how well each of the known ES V and V issues is being addressed and to what extent each has impacted the development of expert systems.
NASA Technical Reports Server (NTRS)
1973-01-01
The development, construction, and test of a 100-word vocabulary near real time word recognition system are reported. Included are reasonable replacement of any one or all 100 words in the vocabulary, rapid learning of a new speaker, storage and retrieval of training sets, verbal or manual single word deletion, continuous adaptation with verbal or manual error correction, on-line verification of vocabulary as spoken, system modes selectable via verification display keyboard, relationship of classified word to neighboring word, and a versatile input/output interface to accommodate a variety of applications.
Palmprint verification using Lagrangian decomposition and invariant interest points
NASA Astrophysics Data System (ADS)
Gupta, P.; Rattani, A.; Kisku, D. R.; Hwang, C. J.; Sing, J. K.
2011-06-01
This paper presents a palmprint-based verification system using SIFT features and a Lagrangian network graph technique. We employ SIFT for feature extraction from palmprint images; the region of interest (ROI), extracted from the wide palm texture at the preprocessing stage, is used for invariant point extraction. Finally, identity is established by finding a permutation matrix for a pair of reference and probe palm graphs drawn on the extracted SIFT features; the permutation matrix is used to minimize the distance between the two graphs. The proposed system has been tested on the CASIA and IITK palmprint databases, and experimental results reveal the effectiveness and robustness of the system.
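The matching step can be pictured as finding the permutation of probe keypoints that minimizes the total distance to the reference keypoints. The sketch below substitutes a standard Hungarian assignment (via SciPy) for the paper's Lagrangian decomposition, and random vectors stand in for real SIFT descriptors, so it illustrates only the shape of the computation.

```python
# Hedged sketch: find the permutation of probe descriptors minimizing
# total distance to reference descriptors. The Hungarian algorithm is a
# stand-in for the paper's Lagrangian technique; descriptors are random
# stand-ins for real 128-dimensional SIFT features.

import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
ref = rng.normal(size=(20, 128))    # reference palm: 20 descriptors
probe = rng.normal(size=(20, 128))  # probe palm: 20 descriptors

# Pairwise Euclidean distances form the assignment cost matrix.
cost = np.linalg.norm(ref[:, None, :] - probe[None, :, :], axis=2)

rows, cols = linear_sum_assignment(cost)   # optimal permutation
score = cost[rows, cols].sum()             # lower = more similar palms
print(f"total matching cost: {score:.2f}")
```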
Arithmetic Circuit Verification Based on Symbolic Computer Algebra
NASA Astrophysics Data System (ADS)
Watanabe, Yuki; Homma, Naofumi; Aoki, Takafumi; Higuchi, Tatsuo
This paper presents a formal approach to verifying arithmetic circuits using symbolic computer algebra. Our method describes arithmetic circuits directly with high-level mathematical objects based on weighted number systems and arithmetic formulae. Such circuit descriptions can be effectively verified by polynomial reduction techniques using Gröbner bases. In this paper, we describe how symbolic computer algebra can be used to describe and verify arithmetic circuits. The advantages of the proposed approach are demonstrated through experimental verification of arithmetic circuits such as a multiply-accumulator and an FIR filter. The results show that the proposed approach is a promising route to verifying practical arithmetic circuits.
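A worked miniature of the polynomial-reduction idea, using SymPy rather than the authors' toolchain: a half adder's gates are modeled as polynomials, and the word-level specification is checked for membership in the ideal they generate. The circuit and variable names are illustrative; the paper targets far larger circuits.

```python
# Minimal Groebner-basis verification of a half adder, in the spirit of
# the paper's approach; SymPy is used here for illustration only.

from sympy import symbols, groebner

a, b, s, c = symbols("a b s c")

# Gate polynomials: each gate's output minus its polynomial model.
gates = [
    s - (a + b - 2 * a * b),  # XOR gate: sum bit
    c - a * b,                # AND gate: carry bit
    a**2 - a, b**2 - b,       # inputs are Boolean (0/1)
]

G = groebner(gates, a, b, s, c, order="lex")

# Specification: the word-level identity a + b = s + 2c.
spec = (a + b) - (s + 2 * c)

# Reduce the spec modulo the Groebner basis; a zero remainder means the
# spec lies in the circuit ideal, i.e. the circuit meets the spec.
_, remainder = G.reduce(spec)
print("circuit correct:", remainder == 0)  # True
```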
Reasoning about Function Objects
NASA Astrophysics Data System (ADS)
Nordio, Martin; Calcagno, Cristiano; Meyer, Bertrand; Müller, Peter; Tschannen, Julian
Modern object-oriented languages support higher-order implementations through function objects such as delegates in C#, agents in Eiffel, or closures in Scala. Function objects bring a new level of abstraction to the object-oriented programming model, and require a comparable extension to specification and verification techniques. We introduce a verification methodology that extends function objects with auxiliary side-effect free (pure) methods to model logical artifacts: preconditions, postconditions and modifies clauses. These pure methods can be used to specify client code abstractly, that is, independently from specific instantiations of the function objects. To demonstrate the feasibility of our approach, we have implemented an automatic prover, which verifies several non-trivial examples.
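A hedged Python sketch of the methodology's shape: a function object carries side-effect-free precondition and postcondition methods, and client code is specified against those contracts rather than against a concrete function. The class and names are invented for illustration; the paper itself targets delegates, agents, and closures in C#, Eiffel, and Scala.

```python
# Illustrative only: a function object bundled with pure contract
# methods, so clients can be specified independently of the concrete
# function passed in.

class ContractedFunction:
    def __init__(self, fn, pre, post):
        self.fn = fn      # the underlying function object
        self.pre = pre    # pure method: is the argument admissible?
        self.post = post  # pure method: does the result satisfy the spec?

    def __call__(self, x):
        assert self.pre(x), "precondition violated"
        result = self.fn(x)
        assert self.post(x, result), "postcondition violated"
        return result

# Client code written against the contract, not the concrete function:
def apply_twice(cf: ContractedFunction, x):
    return cf(cf(x))

sqrt = ContractedFunction(
    fn=lambda x: x ** 0.5,
    pre=lambda x: x >= 0,                     # domain restriction
    post=lambda x, r: abs(r * r - x) < 1e-9,  # result squares back to x
)
print(apply_twice(sqrt, 16.0))  # 2.0
```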
NASA Astrophysics Data System (ADS)
Shen, Feng; Flynn, Patrick J.
2013-05-01
Iris recognition is one of the most reliable biometric technologies for identity recognition and verification, but it has not been used in a forensic context because the representation and matching of iris features are not straightforward for traditional iris recognition techniques. In this paper, we concentrate on the iris crypt as a visible feature used to represent the characteristics of irises in a manner similar to fingerprint minutiae. The matching of crypts is based on their appearances and locations. The number of matching crypt pairs found between two irises can be used for identity verification, and the convenience of manual inspection makes iris crypts a potential candidate for forensic applications.
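A hypothetical sketch of crypt-based matching as described: crypts are paired by location and appearance, and the count of matched pairs drives the verification decision. All thresholds, the scalar appearance descriptor, and the sample data are invented stand-ins, not values from the paper.

```python
# Illustrative crypt matching: pair crypts that are close in position
# and similar in appearance, then decide identity from the pair count.

import math

def match_crypts(crypts_a, crypts_b, loc_tol=10.0, app_tol=0.2, min_pairs=3):
    """Each crypt: ((x, y), appearance) with appearance in [0, 1]."""
    used = set()
    pairs = 0
    for (xa, ya), app_a in crypts_a:
        for j, ((xb, yb), app_b) in enumerate(crypts_b):
            if j in used:
                continue
            close = math.hypot(xa - xb, ya - yb) <= loc_tol
            similar = abs(app_a - app_b) <= app_tol
            if close and similar:
                used.add(j)
                pairs += 1
                break
    return pairs, pairs >= min_pairs  # (count, same-identity decision)

iris1 = [((12, 40), 0.7), ((55, 23), 0.4), ((80, 61), 0.9)]
iris2 = [((14, 42), 0.65), ((54, 25), 0.45), ((79, 60), 0.85)]
print(match_crypts(iris1, iris2))  # (3, True)
```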
Contamination control program plan for the ultraviolet spectrometer experiment, revision E
NASA Technical Reports Server (NTRS)
Gilmore, D. B.
1972-01-01
The contamination control program plan delineates the cleanliness requirements to be attained and maintained, and the methods to be utilized, in the fabrication, handling, test, calibration, shipment, pre-installation checkout and installation for the ultraviolet spectrometer experiment prototype, qualification and flight equipment.
DOT National Transportation Integrated Search
2011-12-31
Twelve field projects were studied, in which forty-four locations were evaluated to assess the cause or causes of asphalt concrete that exhibits tender-zone characteristics (i.e., instability during compaction) and to investigate the tendency of...
ERIC Educational Resources Information Center
Karatay, Halit
2011-01-01
In this paper, the occurrence frequency of the values "justice, family union, independence, peace, scientificality, diligence, cooperation, sensitivity, honesty, aesthetics, tolerance, hospitality, freedom, wellness, respect, affection, responsibility, cleanliness, patriotism, benevolence" which the Ministry of Education requires that…
Sabbatical Report: Results of a Survey of Library Microforms Facilities.
ERIC Educational Resources Information Center
McIntosh, Melinda C.
1987-01-01
Highlights findings on the status of academic library microforms facilities in the United States and Canada based on visits to 11 libraries. Topics covered include administration, personnel, collection access and storage, classification, acquisition, circulation, indexes, hours, facilities, signage, equipment, photocopying, cleanliness, vandalism,…
Model-Driven Test Generation of Distributed Systems
NASA Technical Reports Server (NTRS)
Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin
2012-01-01
This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.
Investigation of optical/infrared sensor techniques for application satellites
NASA Technical Reports Server (NTRS)
Kaufman, I.
1972-01-01
A method of scanning an optical sensor array by acoustic surface waves is discussed. Data cover a detailed computer-based analysis of the operation of a multielement acoustic surface-wave-scanned optical sensor; the development of design and operation techniques that were used to show the feasibility of an integrated array and to design several such arrays; and experimental verification of a number of the calculations with discrete sensor devices.
Genesis Spacecraft Science Canister Preliminary Inspection and Cleaning
NASA Technical Reports Server (NTRS)
Hittle, J. D.; Calaway, M. J.; Allton, J. H.; Warren, J. L.; Schwartz, C. M.; Stansbery, E. K.
2006-01-01
The Genesis science canister is an aluminum cylinder (75 cm diameter and 35 cm tall) hinged at the mid-line for opening. This canister was cleaned and assembled in an ISO level 4 (Class 10) clean room at Johnson Space Center (JSC) prior to launch. The clean solar collectors were installed and the canister closed in the cleanroom to preserve collector cleanliness. The canister remained closed until opened on station at Earth-Sun L1 for solar wind collection. At the conclusion of collection, the canister was again closed to preserve collector cleanliness during Earth return and re-entry. Upon impact with the dry Utah lakebed at 300 kph, the science canister's integrity was breached. The canister was returned to JSC. The canister shell was briefly examined, imaged, gently cleaned of dust, and packaged for storage in anticipation of future detailed examination. The condition of the science canister shell noted during this brief examination is presented here. The canister interior components were packaged and stored without imaging due to time constraints.
Surface Plasmon Resonance Based Sensitive Immunosensor for Benzaldehyde Detection
NASA Astrophysics Data System (ADS)
Onodera, Takeshi; Shimizu, Takuzo; Miura, Norio; Matsumoto, Kiyoshi; Toko, Kiyoshi
Fragrant compounds used to add flavor to beverages remain in the manufacturing line after the beverage manufacturing process. Line cleanliness before the next manufacturing cycle is difficult to estimate by sensory analysis, making excessive washing necessary. A new measurement system to determine line cleanliness is desired. In this study, we attempted to detect benzaldehyde (Bz) using an anti-Bz monoclonal antibody (Bz-Ab) and a surface plasmon resonance (SPR) sensor. We fabricated two types of sensor chips using self-assembled monolayers (SAMs) and investigated which sensor surface exhibited higher sensitivity. In addition, anti-Bz antibody conjugated with horseradish peroxidase (HRP-Bz-Ab) was used to enhance the SPR signal. A detection limit of ca. 9 ng/mL (ppb) was achieved with an immobilized 4-carboxybenzaldehyde sensor surface using SAMs containing ethylene glycol. When the HRP-Bz-Ab concentration was reduced to 30 ng/mL, a detection limit of ca. 4 ng/mL (ppb) was achieved for Bz.
Mir Environmental Effects Payload and Returned Mir Solar Panel Cleanliness
NASA Technical Reports Server (NTRS)
Harvey, Gale A.; Humes, Donald H.; Kinard, William H.
2000-01-01
The Mir Environmental Effects Payload (MEEP) was attached to the Docking Module of the Mir space station for 18 months during calendar years 1996 and 1997 (March 1996, STS 76, to October 1997, STS 86). A solar panel array with more than 10 years of space exposure was removed from the Mir core module in November 1997 and returned to Earth in January 1998 on STS 89. MEEP and the returned solar array are part of the International Space Station (ISS) Risk Mitigation Program. This spaceflight hardware has been inspected and studied by teams of space environmental effects (SEE) investigators for micrometeoroid and space debris effects, space exposure effects on materials, and electrical performance. This paper reports changes in cleanliness of parts of MEEP and the solar array due to the space exposures. Special attention is given to the extensive water-soluble residues deposited on some of the flight hardware surfaces; the directionality of deposition and the chemistry of these residues are discussed.
Evaluation of control parameters for Spray-In-Air (SIA) aqueous cleaning for shuttle RSRM hardware
NASA Technical Reports Server (NTRS)
Davis, S. J.; Deweese, C. D.
1995-01-01
HD-2 grease is deliberately applied to Shuttle Redesigned Solid Rocket Motor (RSRM) D6AC steel hardware parts as a temporary protective coating for storage and shipping. This HD-2 grease is the most common form of surface contamination on RSRM hardware and must be removed prior to subsequent surface treatment. Failure to achieve an acceptable level of cleanliness (HD-2 calcium grease removal) is a common cause of defects; typical failures from ineffective cleaning include poor adhesion of surface coatings, reduced bond performance of structural adhesives, and failure to pass cleanliness inspection standards. The RSRM hardware is currently cleaned and refurbished using methyl chloroform (1,1,1-trichloroethane), a chlorinated solvent mandated for elimination due to its ozone-depleting characteristics. This report describes an experimental study of an aqueous cleaning system (using Brulin 815 GD) as a replacement for methyl chloroform. Evaluation of process control parameters for this cleaner is discussed, as are the cleaning mechanisms for a spray-in-air process.
Garcia, Faustino; Murray, Peter E; Garcia-Godoy, Franklin; Namerow, Kenneth N
2010-01-01
The purpose of this study was to measure and compare the root canal cleanliness and smear layer removal effectiveness of Aquatine Endodontic Cleanser (Aquatine EC) when used as an endodontic irrigating solution in comparison with 6% sodium hypochlorite (NaOCl). Forty-five human teeth were randomly allocated to five treatment groups; the pulp chamber was accessed, cleaned, and shaped by using ProTaper and ProFile rotary instrumentation to an ISO size #40. The teeth were then processed for scanning electron microscopy, and the root canal cleanliness and removal of smear layer were examined. The most effective removal of smear layer occurred with Aquatine EC and NaOCl, both with a rinse of EDTA. Aquatine EC appears to be the first hypochlorous acid approved by the FDA to be a possible alternative to the use of NaOCl as an intracanal irrigant. Further research is needed to identify safer and more effective alternatives to the use of NaOCl irrigation in endodontics.
Effects of Ultra-Clean and centrifugal filtration on rolling-element bearing life
NASA Technical Reports Server (NTRS)
Loewenthal, S. H.; Moyer, D. W.; Needelman, W. M.
1981-01-01
Fatigue tests were conducted on groups of 65-millimeter bore diameter deep-groove ball bearings in a MIL-L-23699 lubricant under two levels of filtration. In one test series, the oil cleanliness was maintained at an exceptionally high level (better than a class "000" per NAS 1638) with a 3 micron absolute barrier filter. These tests were intended to determine the "upper limit" in bearing life under the strictest possible lubricant cleanliness conditions. In the tests using a centrifugal oil filter, contaminants of the type found in aircraft engine filters were injected into the filters' supply line at 125 milligrams per bearing-hour. "Ultra-clean" lubrication produced bearing fatigue lives that were approximately twice that obtained in previous tests with contaminated oil using 3 micron absolute filtration and approximately three times that obtained with 49 micron filtration. It was also observed that the centrifugal oil filter had approximately the same effectiveness as a 30 micron absolute filter in preventing bearing surface damage.
2005-01-01
The study objectives were to provide a province-wide description of stall dimensions and the aspects of cattle welfare linked to stall design in the tie-stall industry. Data on stall design; stall dimensions; and the prevalence of lameness, injury, and hind limb and udder cleanliness in lactating dairy cattle were collected from a sample of 317 tie-stall farms across Ontario. The majority of the study farms (90%) had stalls with dimensions (length, width, tie-chain length, and tie rail height) that were less than the current recommendations. This may explain, in part, the prevalence of lameness measured as the prevalence of back arch (3.2%) and severe hind claw rotation (23%), hock lesions (44%), neck lesions (3.8%), broken tails (3%), dirty hind limbs (23%), and dirty udders (4.6%). Veterinarians and producers may use this information to compare farms with the industry averages and target areas in need of improvement. PMID:16454382
Plasma surface cleaning using microwave plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsai, C.C.; Haselton, H.H.; Nelson, W.D.
1993-11-01
In a microwave electron cyclotron resonance (ECR) plasma source, reactive plasmas of oxygen and oxygen/argon mixtures are used for plasma-cleaning experiments. Aluminum test samples (0.95 × 1.9 cm) were coated with thin films (≤ 20 μm thick) of Shell Vitrea oil and cleaned using such reactive plasmas. The plasma cleaning was done under various discharge conditions with fixed microwave power, rf power, bias potential, gas pressures (0.5 and 5 mtorr), and operating times up to 35 min. The progress of plasma cleaning was monitored using mass spectroscopy. Mass loss of the samples after plasma cleaning was measured to estimate cleaning rates; measured cleaning rates for low-pressure (0.5 mtorr) argon/oxygen plasmas were as high as 2.7 μm/min. X-ray photoelectron spectroscopy was used to determine the cleanliness of the sample surfaces and confirmed the effectiveness of plasma cleaning in achieving atomic levels of surface cleanliness. In this paper, significant results are reported and discussed.
A New Objective Technique for Verifying Mesoscale Numerical Weather Prediction Models
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Manobianco, John; Lane, John E.; Immer, Christopher D.
2003-01-01
This report presents a new objective technique to verify predictions of the sea-breeze phenomenon over east-central Florida by the Regional Atmospheric Modeling System (RAMS) mesoscale numerical weather prediction (NWP) model. The Contour Error Map (CEM) technique identifies sea-breeze transition times in objectively analyzed grids of observed and forecast wind, verifies the forecast sea-breeze transition times against the observed times, and computes the mean post-sea-breeze wind direction and speed to compare the observed and forecast winds behind the sea-breeze front. The CEM technique is superior to traditional objective verification techniques and previously used subjective verification methodologies because it is automated, requiring little manual intervention; it accounts for both spatial and temporal scales and variations; it accurately identifies and verifies the sea-breeze transition times; and it provides verification contour maps and simple statistical parameters for easy interpretation. The CEM uses a parallel lowpass boxcar filter and a high-order bandpass filter to identify the sea-breeze transition times at the observed and model grid points. Once the transition times are identified, the CEM fits a Gaussian function to the histogram of transition-time differences between the model and observations; the fitted parameters of the Gaussian subsequently describe the timing bias and the variance of the timing differences across the valid comparison domain. With the transition times identified at each grid point, the CEM then computes the mean wind direction and speed during the remainder of the day for all times and grid points after the sea-breeze transition time. The CEM technique performed quite well when compared to independent meteorological assessments of the sea-breeze transition times and to results from a previously published subjective evaluation. The algorithm correctly identified a forecast or observed sea-breeze occurrence or absence 93% of the time during the two-month evaluation period of July and August 2000. Nearly all failures of the CEM were the result of complex precipitation features (observed or forecast) that contaminated the wind field, resulting in false identification of a sea-breeze transition. A qualitative comparison between the CEM timing errors and the subjectively determined observed and forecast transition times indicates that the algorithm performed very well overall. Most discrepancies between the CEM results and the subjective analysis were again caused by observed or forecast areas of precipitation that led to complex wind patterns. The CEM also failed on a day when the observed sea-breeze transition affected only a very small portion of the verification domain. Based on the CEM results, RAMS tended to predict the onset and movement of the sea-breeze transition too early and/or too quickly. The domain-wide timing biases provided by the CEM indicated an early bias on 30 of the 37 days when both an observed and a forecast sea breeze occurred over the same portions of the analysis domain. These results are consistent with previous subjective verifications of the RAMS sea-breeze predictions. A comparison of the mean post-sea-breeze winds indicates that RAMS has a positive wind-speed bias for all days, which is also consistent with the early bias in the sea-breeze transition time, since the higher wind speeds resulted in a faster inland penetration of the sea breeze compared with reality.
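One step of the CEM lends itself to a small worked example: fitting a Gaussian to the histogram of forecast-minus-observed transition-time differences, whose fitted mean and width summarize the timing bias and spread. The sketch below uses synthetic differences (with an early bias, as reported for RAMS); the filtering and grid-analysis stages are not reproduced.

```python
# Hedged sketch of the CEM's Gaussian-fit step. The timing differences
# below are synthetic stand-ins for real grid-point values.

import numpy as np
from scipy.optimize import curve_fit

def gaussian(t, amp, mu, sigma):
    return amp * np.exp(-((t - mu) ** 2) / (2 * sigma**2))

rng = np.random.default_rng(1)
# Synthetic timing differences in minutes (negative = forecast too
# early, consistent with the early bias reported for RAMS).
diffs = rng.normal(loc=-25.0, scale=15.0, size=500)

counts, edges = np.histogram(diffs, bins=30)
centers = 0.5 * (edges[:-1] + edges[1:])

(amp, mu, sigma), _ = curve_fit(gaussian, centers, counts,
                                p0=[counts.max(), 0.0, 10.0])
print(f"timing bias: {mu:.1f} min, spread: {abs(sigma):.1f} min")
```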
NASA Technical Reports Server (NTRS)
Dabney, James B.; Arthur, James Douglas
2017-01-01
Agile methods have gained wide acceptance over the past several years, to the point that they are now a standard management and execution approach for small-scale software development projects. While conventional Agile methods are not generally applicable to large multi-year and mission-critical systems, Agile hybrids are now being developed (such as SAFe) to exploit the productivity improvements of Agile while retaining the process rigor and coordination these projects need. From the perspective of Independent Verification and Validation (IVV), however, the adoption of these hybrid Agile frameworks is becoming somewhat problematic. Hence, we find it prudent to question the compatibility of conventional IVV techniques with (hybrid) Agile practices. This paper documents our investigation of (a) relevant literature, (b) the modification and adoption of Agile frameworks to accommodate the development of large-scale, mission-critical systems, and (c) the compatibility of standard IVV techniques within hybrid Agile development frameworks. Specific to the latter, we found that the IVV methods employed within a hybrid Agile process can be divided into three groups: (1) early-lifecycle IVV techniques that are fully compatible with the hybrid lifecycles; (2) IVV techniques that focus on tracing requirements, test objectives, and the like, which are somewhat incompatible but can be tailored with modest effort; and (3) IVV techniques involving assessments that require artifact completeness, which are simply not compatible with hybrid Agile processes, e.g., those that assume complete requirement specification early in the development lifecycle.
A calibration method for patient specific IMRT QA using a single therapy verification film
Shukla, Arvind Kumar; Oinam, Arun S.; Kumar, Sanjeev; Sandhu, I.S.; Sharma, S.C.
2013-01-01
Aim: To develop and verify a single-film calibration procedure for use in intensity-modulated radiation therapy (IMRT) quality assurance. Background: Radiographic films have regularly been used in routine commissioning of treatment modalities and verification of the treatment planning system (TPS). Film-based radiation dosimetry can give an absolute two-dimensional dose distribution and is preferred for IMRT quality assurance; a single therapy verification film provides a quick and reliable method for IMRT verification. Materials and methods: A single extended dose range (EDR2) film was used to generate the sensitometric curve of film optical density versus radiation dose. The EDR2 film was exposed with nine 6 cm × 6 cm fields of a 6 MV photon beam from a medical linear accelerator at 5-cm depth in a solid water phantom. The nine regions of the single film were exposed with radiation doses ranging from 10 to 362 cGy. The actual dose measurements inside the field regions were performed using a 0.6 cm³ ionization chamber. The exposed film was processed after irradiation and scanned using a VIDAR film scanner, and the optical density was noted for each region. Ten IMRT plans of head and neck carcinoma were delivered using a dynamic IMRT technique and evaluated against the TPS-calculated dose distribution using the gamma index method. Results: A sensitometric curve was generated from the single film exposed at nine field regions to enable quantitative dose verification of IMRT treatments. The radiation scatter factor was observed to decrease exponentially with increasing distance from the centre of each field region. The IMRT plans verified against the calibration curve using the gamma index method were found to be within the acceptance criteria. Conclusion: The single-film method proved superior to the traditional calibration method and enables fast daily film calibration for highly accurate IMRT verification. PMID:24416558
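As a rough sketch of the single-film calibration idea, the snippet below fits a sensitometric curve from nine (dose, optical density) pairs and uses it to convert measured ODs to dose. The dose levels span the 10-362 cGy range from the abstract, but the OD values and the cubic fit are hypothetical choices for illustration.

```python
# Illustrative sensitometric calibration: nine known doses and their
# (hypothetical) net optical densities define a curve that converts any
# measured OD to dose. A cubic polynomial stands in for whatever fit a
# clinic would actually validate.

import numpy as np

# Nine calibration regions on one EDR2 film: dose (cGy) vs. net OD.
dose = np.array([10, 40, 80, 120, 170, 220, 270, 320, 362], float)
od = np.array([0.12, 0.35, 0.62, 0.85, 1.10, 1.31, 1.49, 1.64, 1.72])

# Fit dose as a function of OD.
coeffs = np.polyfit(od, dose, deg=3)
od_to_dose = np.poly1d(coeffs)

measured_od = np.array([0.50, 0.95, 1.40])   # ODs read from a QA film
print(od_to_dose(measured_od))               # estimated doses in cGy
```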
Post-OPC verification using a full-chip pattern-based simulation verification method
NASA Astrophysics Data System (ADS)
Hung, Chi-Yuan; Wang, Ching-Heng; Ma, Cliff; Zhang, Gary
2005-11-01
In this paper, we evaluated and investigated techniques for performing fast full-chip post-OPC verification using a commercial product platform. A number of databases from several technology nodes (0.13um, 0.11um, and 90nm) were used in the investigation. Although our OPC technology has proven robust in most cases, the variety of tape-outs with complicated design styles and technologies makes it difficult to develop a "complete or bullet-proof" OPC algorithm that covers every possible layout pattern. In the evaluation, among dozens of databases, model-based post-OPC checking found errors in some OPC databases, which could cost significantly in manufacturing (reticle and wafer processing) and, more importantly, in production delay. From such full-chip OPC database verification, we have learned that optimizing OPC models and recipes on a limited set of test-chip designs may not provide sufficient coverage across the range of designs to be produced in the process, and fatal errors (such as pinching or bridging), poor CD distribution, and process-sensitive patterns may still occur. As a result, more than one reticle tape-out cycle is not uncommon to prove models and recipes that approach the center of the process window for a range of designs. We therefore describe a full-chip pattern-based simulation verification flow that serves both OPC model and recipe development and post-OPC verification after production release of the OPC. Lastly, we discuss the differences between the new pattern-based and conventional edge-based verification tools and summarize the advantages of our new tool and methodology: (1) accuracy: superior inspection algorithms, down to 1 nm accuracy, with the new pattern-based approach; (2) high-speed performance: pattern-centric algorithms giving the best full-chip inspection efficiency; and (3) powerful analysis capability: flexible error distribution, grouping, interactive viewing, and hierarchical pattern extraction to narrow down to unique patterns/cells.