DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarkar, Avik; Sun, Xin; Sundaresan, Sankaran
2014-04-23
The accuracy of coarse-grid multiphase CFD simulations of fluidized beds may be improved via the inclusion of filtered constitutive models. In our previous study (Sarkar et al., Chem. Eng. Sci., 104, 399-412), we developed such a set of filtered drag relationships for beds with immersed arrays of cooling tubes. Verification of these filtered drag models is addressed in this work. Predictions from coarse-grid simulations with the sub-grid filtered corrections are compared against accurate, highly-resolved simulations of full-scale turbulent and bubbling fluidized beds. The filtered drag models offer a computationally efficient yet accurate alternative for obtaining macroscopic predictions, but the spatial resolution of meso-scale clustering heterogeneities is sacrificed.
Verification testing of the Hydro International Up-Flo™ Filter with one filter module and CPZ Mix™ filter media was conducted at the Penn State Harrisburg Environmental Engineering Laboratory in Middletown, Pennsylvania. The Up-Flo™ Filter is designed as a passive, modular filtr...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2011 CFR
2011-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2013 CFR
2013-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2012 CFR
2012-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... reference PM sample media (e.g., filters) before and after a weighing session. A weighing session may be as...
40 CFR 1065.390 - PM balance verifications and weighing process verification.
Code of Federal Regulations, 2010 CFR
2010-07-01
... days before weighing any filter. (2) Zero and span the balance within 12 h before weighing any filter. (3) Verify that the mass determination of reference filters before and after a filter weighing... weighing session by weighing reference PM sample media (e.g., filters) before and after a weighing session...
ENVIRONMENTAL TECHNOLOGY VERIFICATION TEST PROTOCOL, GENERAL VENTILATION FILTERS
The Environmental Technology Verification Test Protocol, General Ventilation Filters provides guidance for verification tests.
Reference is made in the protocol to the ASHRAE 52.2P "Method of Testing General Ventilation Air-cleaning Devices for Removal Efficiency by P...
The Environmental Technology Verification report discusses the technology and performance of the AeroStar "C-Series" Polyester Panel Filter air filter for dust and bioaerosol filtration manufactured by Filtration Group. The pressure drop across the filter was 126 Pa clean and 267...
The U.S. EPA has created the Environmental Technology Verification program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program tested the performance of baghouse filtrati...
The Environmental Technology Verification report discusses the technology and performance of the AeroStar FP-98 Minipleat V-Bank Filter air filter for dust and bioaerosol filtration manufactured by Filtration Group. The pressure drop across the filter was 137 Pa clean and 348 Pa ...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
The Environmental Technology Verification report discusses the technology and performance of the Predator II, Model 8VADTP123C23CC000 air filter for dust and bioaerosol filtration manufactured by Tri-Dim Filter Corporation. The pressure drop across the filter was 138 Pa clean and...
Verification testing of the Stormwater Management, Inc. StormFilter Using ZPG Filter Media was conducted on a 0.19 acre portion of the eastbound highway surface of Interstate 794, at an area commonly referred to as the "Riverwalk" site near downtown Milwaukee, Wisconsin...
NASA Technical Reports Server (NTRS)
Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.; O'Malley, Owen; Brew, William A.
2003-01-01
Attempts to achieve widespread use of software verification tools have been notably unsuccessful. Even 'straightforward', classic, and potentially effective verification tools such as lint-like tools face limits on their acceptance. These limits are imposed by the expertise required to apply the tools and interpret the results, the high false positive rate of many verification tools, and the need to integrate the tools into development environments. The barriers are even greater for more complex advanced technologies such as model checking. Web-hosted services for advanced verification technologies may mitigate these problems by centralizing tool expertise. The possible benefits of this approach include eliminating the need for software developer expertise in tool application and results filtering, and improving integration with other development tools.
Space shuttle propulsion estimation development verification, volume 1
NASA Technical Reports Server (NTRS)
Rogers, Robert M.
1989-01-01
The results of the Propulsion Estimation Development Verification are summarized. A computer program developed under a previous contract (NAS8-35324) was modified to include improved models for the Solid Rocket Booster (SRB) internal ballistics, the Space Shuttle Main Engine (SSME) power coefficient model, the vehicle dynamics using quaternions, and an improved Kalman filter algorithm based on the U-D factorized algorithm. As additional output, the estimated propulsion performance for each device is computed with the associated 1-sigma bounds. The outputs of the estimation program are provided in graphical plots. An additional effort was expended to examine the use of the estimation approach to evaluate single engine test data. In addition to the propulsion estimation program PFILTER, a program was developed to produce a best estimate of trajectory (BET). This program, LFILTER, also uses the U-D factorized form of the Kalman filter, as does PFILTER. The necessary definitions and equations explaining the Kalman filtering approach for the PFILTER program, the models used for this application for dynamics and measurements, the program description, and program operation are presented.
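For illustration, a minimal sketch of the predict/update cycle that filters like PFILTER and LFILTER are built on; the actual programs use Bierman's U-D factorized form, which propagates P = U D Uᵀ for numerical stability, while this sketch shows the mathematically equivalent covariance form:

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a linear Kalman filter.

    The U-D factorized form used in PFILTER/LFILTER replaces the covariance
    update below with updates of unit-triangular and diagonal factors.
    """
    # Predict the state and covariance through the dynamics model
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the measurement z
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```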
Aycock, Kenneth I; Campbell, Robert L; Manning, Keefe B; Craven, Brent A
2017-06-01
Inferior vena cava (IVC) filters are medical devices designed to provide a mechanical barrier to the passage of emboli from the deep veins of the legs to the heart and lungs. Despite decades of development and clinical use, IVC filters still fail to prevent the passage of all hazardous emboli. The objective of this study is to (1) develop a resolved two-way computational model of embolus transport, (2) provide verification and validation evidence for the model, and (3) demonstrate the ability of the model to predict the embolus-trapping efficiency of an IVC filter. Our model couples computational fluid dynamics simulations of blood flow to six-degree-of-freedom simulations of embolus transport and resolves the interactions between rigid, spherical emboli and the blood flow using an immersed boundary method. Following model development and numerical verification and validation of the computational approach against benchmark data from the literature, embolus transport simulations are performed in an idealized IVC geometry. Centered and tilted filter orientations are considered using a nonlinear finite element-based virtual filter placement procedure. A total of 2048 coupled CFD/6-DOF simulations are performed to predict the embolus-trapping statistics of the filter. The simulations predict that the embolus-trapping efficiency of the IVC filter increases with increasing embolus diameter and increasing embolus-to-blood density ratio. Tilted filter placement is found to decrease the embolus-trapping efficiency compared with centered filter placement. Multiple embolus-trapping locations are predicted for the IVC filter, and the trapping locations are predicted to shift upstream and toward the vessel wall with increasing embolus diameter. Simulations of the injection of successive emboli into the IVC are also performed and reveal that the embolus-trapping efficiency decreases with increasing thrombus load in the IVC filter. In future work, the computational tool could be used to investigate IVC filter design improvements, the effect of patient anatomy on embolus transport and IVC filter embolus-trapping efficiency, and, with further development and validation, optimal filter selection and placement on a patient-specific basis.
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Verification testing of the Stormwater Management, Inc. StormFilter® Using Perlite Filter Media was conducted on a 0.7 acre drainage basin near downtown Griffin, Georgia. The system consists of an inlet bay, flow spreader, cartridge bay, overflow baffle, and outlet bay, housed in...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size of those particles equal to and smalle...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size of those particles equal to and smalle...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size of those particles equal to and smalle...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size for particles equal to or smaller than...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size of those particles equal to and smalle...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size for particles equal to or smaller than...
The Environmental Technology Verification report discusses the technology and performance of the AFP30 air filter for dust and bioaerosol filtration manufactured by Airflow Products. The pressure drop across the filter was 62 Pa clean and 247 Pa dust loaded. The filtration effici...
Schumann, A; Priegnitz, M; Schoene, S; Enghardt, W; Rohling, H; Fiedler, F
2016-10-07
Range verification and dose monitoring in proton therapy is considered highly desirable. Different methods have been developed worldwide, like particle therapy positron emission tomography (PT-PET) and prompt gamma imaging (PGI). In general, these methods allow for a verification of the proton range. However, quantification of the dose from these measurements remains challenging. For the first time, we present an approach for estimating the dose from prompt γ-ray emission profiles. It combines a filtering procedure based on Gaussian-powerlaw convolution with an evolutionary algorithm. By means of convolving depth dose profiles with an appropriate filter kernel, prompt γ-ray depth profiles are obtained. In order to reverse this step, the evolutionary algorithm is applied. The feasibility of this approach is demonstrated for a spread-out Bragg peak in a water target.
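A minimal sketch of the forward (filtering) step described above, assuming an illustrative Gaussian-powerlaw kernel; the paper's fitted kernel parameters are not reproduced here, and the evolutionary algorithm would invert this convolution:

```python
import numpy as np

# Depth axis (mm) and a schematic spread-out Bragg peak depth-dose profile D(z).
z = np.linspace(0, 200, 401)
dose = np.where((z > 80) & (z < 120), 1.0, 0.1 * np.exp(-z / 150.0))  # toy SOBP

# Gaussian-powerlaw kernel (shape and parameters are assumptions for this
# sketch). Convolving D(z) with k models the prompt gamma-ray depth profile.
u = np.linspace(-50, 50, 201)
k = np.exp(-u**2 / (2 * 5.0**2)) * (1 + np.abs(u)) ** -1.5
k /= k.sum()

gamma_profile = np.convolve(dose, k, mode="same")
```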
The report gives results of March 19-23, 1999, tests of Koch Filter Corporation's Multi-Sak 6FZ159-S paint overspray arrestor (POA) as part of an evaluation of POAs by EPA's Air Pollution Control Technology (APCT) Environmental Technology Verification (ETV) Program. The basic per...
The Environmental Technology Verification report discusses the technology and performance of the Fuel-Borne Catalyst with Mitsui/PUREarth Catalyzed Wire Mesh Filter manufactured by Clean Diesel Technologies, Inc. The technology is a platinum/cerium fuel-borne catalyst in commerci...
Verification testing of the Stormwater Management CatchBasin StormFilter® (CBSF) was conducted on a 0.16 acre drainage basin at the City of St. Clair Shores, Michigan Department of Public Works facility. The four-cartridge CBSF consists of a storm grate and filter chamber inlet b...
The Environmental Technology Verification report discusses the technology and performance of the PerfectPleat Ultra 175-102-863 air filter for dust and bioaerosol filtration manufactured by AAF International. The pressure drop across the filter was 112 Pa clean and 229 Pa dust lo...
The Environmental Technology Verification report discusses the technology and performance of the High Efficiency Mini Pleat air filter for dust and bioaerosol filtration manufactured by Columbus Industries. The pressure drop across the filter was 142 Pa clean and 283 Pa dust load...
The Environmental Technology Verification report discusses the technology and performance of the Synthetic Minipleat V-Cell, SMV-M13-2424 air filter for dust and bioaerosol filtration manufactured by Aeolus Corporation. The pressure drop across the filter was 77 Pa clean and 348 ...
The Environmental Technology Verification report discusses the technology and performance of the Synthetic Minipleat V-Cell, SMV-M14-2424 air filter for dust and bioaerosol filtration manufactured by Aeolus Corporation. The pressure drop across the filter was 104 Pa clean and 348...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draeger, E; Chen, H; Polf, J
Purpose: To test two new techniques, the distance-of-closest-approach (DCA) and Compton line (CL) filters, developed as a means of improving the spatial resolution of Compton camera (CC) imaging. Methods: Gammas emitted from ²²Na, ¹³⁷Cs, and ⁶⁰Co point sources were measured with a prototype 3-stage CC. The energy deposited and position of each interaction in each stage were recorded and used to calculate a “cone-of-origin” for each gamma that scattered twice in the CC. A DCA filter was developed which finds the shortest distance from the gamma’s cone-of-origin surface to the location of the gamma source. The DCA filter was applied to the data to determine the initial energy of the gamma and to remove “bad” interactions that only contribute noise to the image. Additionally, a CL filter, which removes gamma events that do not follow the theoretical predictions of the Compton scatter equation, was used to further remove “bad” interactions from the measured data. Then images were reconstructed with raw, unfiltered data, DCA filtered data, and DCA+CL filtered data, and the achievable image resolution of each dataset was compared. Results: Spatial resolutions of ∼2 mm, and better than 2 mm, were achievable with the DCA and DCA+CL filtered data, respectively, compared to >5 mm for the raw, unfiltered data. Conclusion: In many special cases in medical imaging where information about the source position may be known, such as proton radiotherapy range verification, the application of the DCA and CL filters can result in considerable improvements in the achievable spatial resolutions of Compton imaging.
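The geometry behind the DCA filter is compact enough to sketch. Assuming hypothetical event records carrying the apex (first scatter position), unit cone axis, and Compton scatter angle, the distance from the cone-of-origin surface to a known source position can be approximated and thresholded:

```python
import numpy as np

def distance_of_closest_approach(apex, axis, half_angle, source):
    """Shortest distance from a Compton cone-of-origin surface to a known
    source position (approximation valid away from the apex)."""
    v = source - apex
    r = np.linalg.norm(v)
    alpha = np.arccos(np.clip(np.dot(v, axis) / r, -1.0, 1.0))
    return r * abs(np.sin(alpha - half_angle))

def dca_filter(events, source, tol_mm=5.0):
    """Keep only events whose cone passes within tol_mm of the source
    (event attributes here are hypothetical, for illustration)."""
    return [e for e in events
            if distance_of_closest_approach(e.apex, e.axis, e.angle, source) < tol_mm]
```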
Application of optimal control theory to the design of the NASA/JPL 70-meter antenna servos
NASA Technical Reports Server (NTRS)
Alvarez, L. S.; Nickerson, J.
1989-01-01
The application of Linear Quadratic Gaussian (LQG) techniques to the design of the 70-m axis servos is described. Linear quadratic optimal control and Kalman filter theory are reviewed, and model development and verification are discussed. Families of optimal controller and Kalman filter gain vectors were generated by varying weight parameters. Performance specifications were used to select final gain vectors.
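As a sketch of how such gain families are generated, the continuous-time LQR gain can be computed from the algebraic Riccati equation and swept over a weight parameter; the servo model below is a placeholder, not the 70-m antenna dynamics:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    """Optimal state-feedback gain K for u = -K x (continuous-time LQR)."""
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)

A = np.array([[0.0, 1.0], [0.0, -0.5]])   # placeholder servo dynamics
B = np.array([[0.0], [1.0]])
for rho in (0.1, 1.0, 10.0):              # sweep the control-effort weight
    print(rho, lqr_gain(A, B, np.eye(2), rho * np.eye(1)))
```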
NASA Astrophysics Data System (ADS)
Wijaya, Surya Li; Savvides, Marios; Vijaya Kumar, B. V. K.
2005-02-01
Face recognition on mobile devices, such as personal digital assistants and cell phones, is a big challenge owing to the limited computational resources available to run verifications on the devices themselves. One approach is to transmit the captured face images by use of the cell-phone connection and to run the verification on a remote station. However, owing to limitations in communication bandwidth, it may be necessary to transmit a compressed version of the image. We propose using the image compression standard JPEG2000, which is a wavelet-based compression engine used to compress the face images to low bit rates suitable for transmission over low-bandwidth communication channels. At the receiver end, the face images are reconstructed with a JPEG2000 decoder and are fed into the verification engine. We explore how advanced correlation filters, such as the minimum average correlation energy filter [Appl. Opt. 26, 3633 (1987)] and its variants, perform by using face images captured under different illumination conditions and encoded with different bit rates under the JPEG2000 wavelet-encoding standard. We evaluate the performance of these filters by using illumination variations from the Carnegie Mellon University's Pose, Illumination, and Expression (PIE) face database. We also demonstrate the tolerance of these filters to noisy versions of images with illumination variations.
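For reference, the minimum average correlation energy (MACE) filter cited above has a closed-form frequency-domain solution; with X the matrix whose columns are vectorized 2-D DFTs of the training images, D the diagonal matrix of their average power spectrum, and u the prescribed correlation-peak values:

```latex
h = D^{-1} X \left( X^{+} D^{-1} X \right)^{-1} u
```

where X⁺ denotes the conjugate transpose. Minimizing the average correlation energy while constraining the peak values is what produces the sharp, easily detected correlation peaks these verification experiments rely on.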
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...
The verification test of the Separmatic™ DE Pressure Type Filter System Model 12P-2 was conducted at the UNH Water Treatment Technology Assistance Center (WTTAC) in Durham, New Hampshire. The source water was finished water from the Arthur Rollins Treatment Plant that was pretr...
Paint overspray arrestors (POAs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the particle filtration efficiency as a function of size for particles smaller than 10 micrometers. The APCT Verification Center...
Verification testing of the US Filter 3M10C membrane system was conducted over a 44-day test period at the Aqua 2000 Research Center in Chula Vista, California. The test period extended from July 24, 2002 to September 5, 2002. The source water was a blend of Colorado River and ...
The Environmental Technology Verification report discusses the technology and performance of the Excel Filter, Model SBG24242898 air filter for dust and bioaerosol filtration manufactured by Glasfloss Industries, Inc. The pressure drop across the filter was 82 Pa clean and 348 Pa...
NASA Astrophysics Data System (ADS)
Puhan, Pratap Sekhar; Ray, Pravat Kumar; Panda, Gayadhar
2016-12-01
This paper presents the effectiveness of a 5/5 fuzzy-rule implementation in a Fuzzy Logic Controller, used in conjunction with an indirect control technique, for enhancing power quality in a single-phase system. An indirect current controller in conjunction with the Fuzzy Logic Controller is applied to the proposed shunt active power filter to estimate the peak reference current and capacitor voltage. Current-controller-based pulse width modulation (CCPWM) is used to generate the switching signals of the voltage source inverter. Various simulation results are presented to verify the good behaviour of the Shunt Active Power Filter (SAPF) with the proposed two-level Hysteresis Current Controller (HCC). For verification of the Shunt Active Power Filter in real time, the proposed control algorithm has been implemented in a laboratory setup on the dSPACE platform.
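A two-level hysteresis current controller of the kind used here is simple to sketch; this is a generic illustration, not the authors' dSPACE implementation:

```python
def hysteresis_switch(i_ref, i_meas, state, band=0.5):
    """Two-level hysteresis current control of one inverter leg.

    Returns +1 (apply +Vdc) or -1 (apply -Vdc). The state toggles only
    when the current error leaves the band, which bounds the ripple at
    the cost of a variable switching frequency.
    """
    err = i_ref - i_meas
    if err > band:
        return +1    # current too low: push it up
    if err < -band:
        return -1    # current too high: pull it down
    return state     # inside the band: hold the previous state
```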
Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...
The Wet-Weather Flow Technologies Pilot of the EPA's Technology Verification (ETV) Program under a partnership with NSF International has verified the performawnce of the USFilter/Stranco Products chemical induction mixer used for disinfection of wet-weather flows. The USFilter t...
Paint overspray arrestors (POAs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the particle filtration efficiency as a function of size for particles smaller than...
Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...
The Baumot BA-B Diesel Particulate Filter with Pre-Catalyst is a diesel engine retrofit device for light, medium, and heavy heavy-duty diesel on-highway engines for use with commercial ultra-low-sulfur diesel (ULSD) fuel. The BA-B particulate filter is composed of a pre-catalyst ...
40 CFR 1065.307 - Linearity verification.
Code of Federal Regulations, 2010 CFR
2010-07-01
... different flow rates. Use a gravimetric reference measurement (such as a scale, balance, or mass comparator... the gas-division system to divide the span gas with purified air or nitrogen. Select gas divisions... PM balance, m max refers to the typical mass of a PM filter. (ii) For linearity verification of...
The U.S. EPA implemented the Environmental Technology Verification (ETV) program in 1995 to generate independent and credible data on the performance of innovative technologies that have the potential to improve protection of public health and the environment. Results are publicl...
EPA's Environmental Technology Verification program is designed to further environmental protection by accelerating the acceptance and use of improved and cost-effective technologies. This is done by providing high-quality, peer-reviewed data on technology performance to those in...
Efficient and Scalable Graph Similarity Joins in MapReduce
Chen, Yifan; Zhang, Weiming; Tang, Jiuyang
2014-01-01
Along with the emergence of massive graph-modeled data, it is of great importance to investigate graph similarity joins due to their wide applications for multiple purposes, including data cleaning, and near duplicate detection. This paper considers graph similarity joins with edit distance constraints, which return pairs of graphs such that their edit distances are no larger than a given threshold. Leveraging the MapReduce programming model, we propose MGSJoin, a scalable algorithm following the filtering-verification framework for efficient graph similarity joins. It relies on counting overlapping graph signatures for filtering out nonpromising candidates. With the potential issue of too many key-value pairs in the filtering phase, spectral Bloom filters are introduced to reduce the number of key-value pairs. Furthermore, we integrate the multiway join strategy to boost the verification, where a MapReduce-based method is proposed for GED calculation. The superior efficiency and scalability of the proposed algorithms are demonstrated by extensive experimental results. PMID:25121135
Efficient and scalable graph similarity joins in MapReduce.
Chen, Yifan; Zhao, Xiang; Xiao, Chuan; Zhang, Weiming; Tang, Jiuyang
2014-01-01
Along with the emergence of massive graph-modeled data, it is of great importance to investigate graph similarity joins due to their wide applications for multiple purposes, including data cleaning, and near duplicate detection. This paper considers graph similarity joins with edit distance constraints, which return pairs of graphs such that their edit distances are no larger than a given threshold. Leveraging the MapReduce programming model, we propose MGSJoin, a scalable algorithm following the filtering-verification framework for efficient graph similarity joins. It relies on counting overlapping graph signatures for filtering out nonpromising candidates. With the potential issue of too many key-value pairs in the filtering phase, spectral Bloom filters are introduced to reduce the number of key-value pairs. Furthermore, we integrate the multiway join strategy to boost the verification, where a MapReduce-based method is proposed for GED calculation. The superior efficiency and scalability of the proposed algorithms are demonstrated by extensive experimental results.
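The filtering-verification framework both records describe is easy to sketch outside MapReduce. In this illustration, signature() and ged() are hypothetical helpers standing in for MGSJoin's graph signatures and exact GED computation, and min_overlap stands for the overlap bound derived from the threshold:

```python
from itertools import combinations

def similarity_join(graphs, signature, ged, tau, min_overlap):
    """Filtering-verification skeleton of a graph similarity join."""
    sigs = {g: signature(g) for g in graphs}   # set of structural features
    results = []
    for g1, g2 in combinations(graphs, 2):
        # Filtering: too little signature overlap rules out GED <= tau
        # (MGSJoin compresses this phase with spectral Bloom filters).
        if len(sigs[g1] & sigs[g2]) < min_overlap:
            continue
        # Verification: exact graph edit distance only for the survivors.
        if ged(g1, g2) <= tau:
            results.append((g1, g2))
    return results
```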
Filtering methods for broadcast authentication against PKC-based denial of service in WSN: a survey
NASA Astrophysics Data System (ADS)
Afianti, Farah; Wirawan, Iwan; Suryani, Titiek
2017-11-01
Broadcast authentication is used to determine whether a packet comes from an authorized user; a legitimate packet can then be forwarded or used for further processing. Digital signatures are one promising method, but they carry high complexity, especially in the verification process. An adversary can exploit this by forcing nodes to verify large numbers of false packets. This kind of Denial of Service (DoS) attack on the main signature can be mitigated by using pre-authentication methods as a first layer to filter false packet data. The objective of the filter is not to replace the main signature but to complement the actual verification in the sensor node. This paper contributes a comparison of the computation, storage, and communication costs of several filters. The results show that the Pre-Authenticator and DoS Attack-Resistant schemes have lower overhead than the others, though they require a powerful sender. Moreover, key chains are a promising method because of their efficiency and effectiveness.
The U.S. EPA has created the Environmental Technology Verification (ETV) program to provide high quality, peer reviewed data on technology performance to those involved in the design, distribution, financing, permitting, purchase, and use of environmental technologies. The Air Po...
Face identification with frequency domain matched filtering in mobile environments
NASA Astrophysics Data System (ADS)
Lee, Dong-Su; Woo, Yong-Hyun; Yeom, Seokwon; Kim, Shin-Hwan
2012-06-01
Face identification at a distance is very challenging since captured images are often degraded by blur and noise. Furthermore, the computational resources and memory are often limited in the mobile environments. Thus, it is very challenging to develop a real-time face identification system on the mobile device. This paper discusses face identification based on frequency domain matched filtering in the mobile environments. Face identification is performed by the linear or phase-only matched filter and sequential verification stages. The candidate window regions are decided by the major peaks of the linear or phase-only matched filtering outputs. The sequential stages comprise a skin-color test and an edge mask filtering test, which verify color and shape information of the candidate regions in order to remove false alarms. All algorithms are built on the mobile device using Android platform. The preliminary results show that face identification of East Asian people can be performed successfully in the mobile environments.
Spent Fuel Assay with an Ultra-High Rate HPGe Spectrometer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fast, James; Fulsom, Bryan; Pitts, Karl
2015-07-01
Traditional verification of spent nuclear fuel (SNF) includes determination of initial enrichment, burnup and cool down time (IE, BU, CT). Along with neutron measurements, passive gamma assay provides important information for determining BU and CT. Other gamma-ray-based assay methods such as passive tomography and active delayed gamma offer the potential to measure the spatial distribution of fission products and the fissile isotopic concentration of the fuel, respectively. All fuel verification methods involving gamma-ray spectroscopy require that the spectrometers manage very high count rates while extracting the signatures of interest. PNNL has developed new digital filtering and analysis techniques to produce an ultra-high rate gamma-ray spectrometer from a standard coaxial high-purity germanium (HPGe) crystal. This 37% relative efficiency detector has been operated for SNF measurements at input count rates of 500-1300 kcps and throughput in excess of 150 kcps. Optimized filtering algorithms preserve the spectroscopic capability of the system even at these high rates. This paper will present the results of both passive and active SNF measurements performed with this system at PNNL.
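The paper's optimized algorithms are not spelled out here, but a standard high-rate shaping technique of the same family, trapezoidal filtering of the (pole-zero-corrected) preamplifier trace, can be sketched:

```python
import numpy as np

def trapezoidal_filter(wave, rise, flat):
    """Trapezoidal pulse shaping for a step-like input signal (illustrative).

    Short rise/flat times keep throughput high at several-hundred-kcps
    input rates; the trapezoid's flat-top height estimates the energy.
    """
    d = np.zeros_like(wave, dtype=float)
    k, m = rise, rise + flat
    for n in range(len(wave)):
        d[n] = (wave[n]
                - (wave[n - k] if n >= k else 0.0)
                - (wave[n - m] if n >= m else 0.0)
                + (wave[n - k - m] if n >= k + m else 0.0))
    return np.cumsum(d) / k
```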
Rapid, absolute calibration of x-ray filters employed by laser-produced plasma diagnostics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, G. V.; Beiersdorfer, P.; Emig, J.
2008-10-15
The Electron Beam Ion Trap (EBIT) facility at the Lawrence Livermore National Laboratory is being used to absolutely calibrate the transmission efficiency of x-ray filters employed by diodes and spectrometers used to diagnose laser-produced plasmas. EBIT emits strong, discrete monoenergetic lines at appropriately chosen x-ray energies. X rays are detected using the high resolution EBIT Calorimeter Spectrometer (ECS), developed for LLNL at the NASA/Goddard Space Flight Center. X-ray filter transmission efficiency is determined by dividing the x-ray counts detected when the filter is in the line of sight by those detected when it is out of the line of sight. Verification of filter thickness can be completed in only a few hours, and absolute efficiencies can be calibrated in a single day over a broad range from about 0.1 to 15 keV. The EBIT calibration lab has been used to field diagnostics (e.g., the OZSPEC instrument) with fully calibrated x-ray filters at the OMEGA laser. Extensions to use the capability for calibrating filter transmission for the DANTE instrument on the National Ignition Facility are discussed.
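The measurement reduces to a counts ratio, and a filter's thickness follows from Beer-Lambert attenuation:

```latex
T(E) = \frac{N_{\mathrm{in}}(E)}{N_{\mathrm{out}}(E)}, \qquad
T(E) = e^{-\mu(E)\,t} \;\Rightarrow\; t = -\frac{\ln T(E)}{\mu(E)}
```

where N_in and N_out are the line counts detected with the filter in and out of the line of sight, and μ(E) is the filter material's linear attenuation coefficient at the line energy.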
Integrity verification testing of the ADI International Inc. Pilot Test Unit No. 2002-09 with MEDIA G2® arsenic adsorption media filter system was conducted at the Hilltown Township Water and Sewer Authority (HTWSA) Well Station No. 1 in Sellersville, Pennsylvania from October 8...
The Environmental Technology Verification report discusses the technology and performance of the DriPak 90/95% air filter for dust and bioaerosol filtration manufactured by AAF International. The pressure drop across the filter was 104 Pa clean and 348 Pa dust loaded, and the fil...
The Environmental Technology Verification report discusses the technology and performance of the BioCel I (Type SH) air filter for dust and bioaerosol filtration manufactured by AAF International. The pressure drop across the filter was 236 Pa clean and 478 Pa dust loaded, and th...
Hierarchical Representation Learning for Kinship Verification.
Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul
2017-01-01
Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d′, and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
NASA Astrophysics Data System (ADS)
Lee, Byungjin; Lee, Young Jae; Sung, Sangkyung
2018-05-01
A novel attitude determination method is investigated that is computationally efficient and implementable on low-cost sensor and embedded platforms. A recent result on attitude reference system design is adapted to further develop a three-dimensional attitude determination algorithm based on relative velocity incremental measurements. For this, velocity incremental vectors, computed respectively from INS and GPS with different update rates, are compared to generate the filter measurement for attitude estimation. In the quaternion-based Kalman filter configuration, an Euler-like attitude perturbation angle is uniquely introduced to reduce the filter states and simplify the propagation processes. Furthermore, assuming a small angle approximation between attitude update periods, it is shown that the reduced-order filter greatly simplifies the propagation processes. For performance verification, both simulation and experimental studies are completed. A low-cost MEMS IMU and GPS receiver are employed for system integration, and comparison with the true trajectory or a high-grade navigation system demonstrates the performance of the proposed algorithm.
Verification testing of the ADI International Inc. Unit No. 2002-09 with MEDIA G2® arsenic adsorption media filter system was conducted at the Hilltown Township Water and Sewer Authority (HTWSA) Well Station No. 1 in Sellersville, Pennsylvania from October 8, 2003 through May 28,...
Acceptance Testing of Thermoluminescent Dosimeter Holders.
Romanyukha, Alexander; Grypp, Matthew D; Sharp, Thad J; DiRito, John N; Nelson, Martin E; Mavrogianis, Stanley T; Torres, Jeancarlo; Benevides, Luis A
2018-05-01
The U.S. Navy uses the Harshaw 8840/8841 dosimetric (DT-702/PD) system, which employs LiF:Mg,Cu,P thermoluminescent dosimeters (TLDs), developed and produced by Thermo Fisher Scientific (TFS). The dosimeter consists of four LiF:Mg,Cu,P elements, mounted in Teflon® on an aluminum card and placed in a plastic holder. The holder contains a unique filter for each chip made of copper, acrylonitrile butadiene styrene (ABS), Mylar®, and tin. For accredited dosimetry labs, the ISO/IEC 17025:2005(E) requires an acceptance procedure for all new equipment. The Naval Dosimetry Center (NDC) has developed and tested a new non-destructive procedure, which enables the verification and the evaluation of embedded filters in the holders. Testing is based on attenuation measurements of low-energy radiation transmitted through each filter in a representative sample group of holders to verify that the correct filter type and thickness are present. The measured response ratios are then compared with the expected response ratios. In addition, each element's measured response is compared to the mean response of the group. The test was designed and tested to identify significant nonconformities, such as missing copper or tin filters, double copper or double tin filters, or other nonconformities that may impact TLD response ratios. During the implementation of the developed procedure, testing revealed a holder with a double copper filter. To complete the evaluation, the impact of the nonconformities on proficiency testing was examined. The evaluation revealed failures in proficiency testing categories III and IV when these dosimeters were irradiated to high-energy betas.
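The acceptance logic amounts to comparing measured response ratios against expectations per filter position; a minimal sketch (the ratios below are made-up placeholders, not NDC acceptance data):

```python
def check_holder(measured, expected, tol=0.10):
    """Flag holder filter nonconformities from transmission response ratios.

    measured/expected map filter position -> response ratio, e.g.
    {"copper": 0.42, "tin": 0.18, "abs": 0.95, "mylar": 0.99}.
    A missing or doubled filter shifts its ratio outside the band.
    """
    return {pos: abs(m / expected[pos] - 1.0) > tol
            for pos, m in measured.items()}

print(check_holder({"copper": 0.21, "tin": 0.18}, {"copper": 0.42, "tin": 0.18}))
# {'copper': True, 'tin': False}  -> e.g. a double copper filter
```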
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sorokine, Alexandre
2011-10-01
Simple Ontology Format (SOFT) library and file format specification provides a set of simple tools for developing and maintaining ontologies. The library, implemented as a Perl module, supports parsing and verification of files in SOFT format, operations on ontologies (adding, removing, or filtering entities), and conversion of ontologies into other formats. SOFT allows users to quickly create an ontology using only a basic text editor, verify it, and portray it in a graph layout system using customized styles.
The Johnson Matthey PCRT2 1000, v.2 system is a partial continuously regenerating technology (PCRT) system that consists of a flow-through partial filter combined with a DOC. The system is designed for low temperature exhaust resulting from intermittent loads from medium and heav...
The Environmental Technology Verification report discusses the technology and performance of the Z-Pak Series S, Model ZPS24241295B0 air filter for dust and bioaerosol filtration manufactured by Glasfloss Industries, Inc. The pressure drop across the filter was 91 Pa clean and 34...
STELLAR: fast and exact local alignments
2011-01-01
Background Large-scale comparison of genomic sequences requires reliable tools for the search of local alignments. Practical local aligners are in general fast, but heuristic, and hence sometimes miss significant matches. Results We present here the local pairwise aligner STELLAR that has full sensitivity for ε-alignments, i.e. guarantees to report all local alignments of a given minimal length and maximal error rate. The aligner is composed of two steps, filtering and verification. We apply the SWIFT algorithm for lossless filtering, and have developed a new verification strategy that we prove to be exact. Our results on simulated and real genomic data confirm and quantify the conjecture that heuristic tools like BLAST or BLAT miss a large percentage of significant local alignments. Conclusions STELLAR is very practical and fast on very long sequences which makes it a suitable new tool for finding local alignments between genomic sequences under the edit distance model. Binaries are freely available for Linux, Windows, and Mac OS X at http://www.seqan.de/projects/stellar. The source code is freely distributed with the SeqAn C++ library version 1.3 and later at http://www.seqan.de. PMID:22151882
Online fingerprint verification.
Upendra, K; Singh, S; Kumar, V; Verma, H K
2007-01-01
As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter bank based representation, which eliminates these weaknesses, is implemented, and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
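A minimal sketch of filter-bank feature extraction in this spirit (the Gabor parameters are illustrative assumptions, not the study's tuned values):

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(theta, freq=0.1, sigma=4.0, size=33):
    """Real-valued Gabor kernel tuned to ridge orientation theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return (np.exp(-(x**2 + y**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * freq * xr))

def ridge_features(img, n_orientations=8):
    """Per-orientation response statistics capture local ridge structure
    that a minutiae list alone cannot represent."""
    feats = []
    for i in range(n_orientations):
        theta = np.pi * i / n_orientations
        resp = fftconvolve(img, gabor_kernel(theta), mode="same")
        feats.append(np.mean(np.abs(resp - resp.mean())))
    return np.array(feats)
```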
Investigation of Cleanliness Verification Techniques for Rocket Engine Hardware
NASA Technical Reports Server (NTRS)
Fritzemeier, Marilyn L.; Skowronski, Raymund P.
1994-01-01
Oxidizer propellant systems for liquid-fueled rocket engines must meet stringent cleanliness requirements for particulate and nonvolatile residue. These requirements were established to limit residual contaminants which could block small orifices or ignite in the oxidizer system during engine operation. Limiting organic residues in high pressure oxygen systems, such as in the Space Shuttle Main Engine (SSME), is particularly important. The current method of cleanliness verification for the SSME uses an organic solvent flush of the critical hardware surfaces. The solvent is filtered and analyzed for particulate matter followed by gravimetric determination of the nonvolatile residue (NVR) content of the filtered solvent. The organic solvents currently specified for use (1,1,1-trichloroethane and CFC-113) are ozone-depleting chemicals slated for elimination by December 1995. A test program is in progress to evaluate alternative methods for cleanliness verification that do not require the use of ozone-depleting chemicals and that minimize or eliminate the use of solvents regulated as hazardous air pollutants or smog precursors. Initial results from the laboratory test program to evaluate aqueous-based methods and organic solvent flush methods for NVR verification are provided and compared with results obtained using the current method. Evaluation of the alternative methods was conducted using a range of contaminants encountered in the manufacture of rocket engine hardware.
On flattening filter‐free portal dosimetry
Novais, Juan Castro; Molina López, María Yolanda; Maqueda, Sheila Ruiz
2016-01-01
Varian introduced (in 2010) the option of removing the flattening filter (FF) in their C‐Arm linacs for intensity‐modulated treatments. This mode, called flattening filter‐free (FFF), offers the advantage of a greater dose rate. Varian's “Portal Dosimetry” is an electronic portal imager device (EPID)‐based tool for IMRT verification. This tool lacks the capability of verifying flattening filter‐free (FFF) modes due to saturation and lack of an image prediction algorithm. (Note: the latest versions of this software and EPID correct these issues.) The objective of the present study is to research the feasibility of said verifications (with the older versions of the software and EPID). By placing the EPID at a greater distance, the images can be acquired without saturation, yielding a linearity similar to the flattened mode. For the image prediction, a method was optimized based on the clinically used algorithm (analytical anisotropic algorithm (AAA)) over a homogeneous phantom. The depth inside the phantom and its electronic density were tailored. An application was developed to allow the conversion of a dose plane (in DICOM format) to Varian's custom format for Portal Dosimetry. The proposed method was used for the verification of test and clinical fields for the three qualities used in our institution for IMRT: 6X, 6FFF and 10FFF. The method developed yielded a positive verification (more than 95% of the points pass a 2%/2 mm gamma) for both the clinical and test fields. This method was also capable of “predicting” static and wedged fields. A workflow for the verification of FFF fields was developed. This method relies on the clinical algorithm used for dose calculation and is able to verify the FFF modes, as well as being useful for machine quality assurance. The procedure described does not require new hardware. This method could be used as a verification of Varian's Portal Dose Image Prediction. PACS number(s): 87.53.Kn, 87.55.T‐, 87.56.bd, 87.59.‐e PMID:27455487
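The gamma criterion used throughout these verifications can be sketched with a brute-force global evaluation (edge wrap-around from np.roll is ignored for brevity; production tools interpolate and mask low-dose regions):

```python
import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm, dd=0.02, dta_mm=2.0):
    """Global 2D gamma pass rate for a dd/dta criterion (e.g. 2%/2 mm).

    ref, meas: dose planes on the same grid; spacing_mm: pixel pitch.
    """
    norm = ref.max()
    search = int(np.ceil(2 * dta_mm / spacing_mm))
    gamma2 = np.full(ref.shape, np.inf)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(meas, dy, axis=0), dx, axis=1)
            dist2 = (dy**2 + dx**2) * spacing_mm**2 / dta_mm**2
            dose2 = ((shifted - ref) / (dd * norm)) ** 2
            gamma2 = np.minimum(gamma2, dose2 + dist2)
    return float(np.mean(np.sqrt(gamma2) <= 1.0))
```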
Code of Federal Regulations, 2011 CFR
2011-07-01
... meets a minimum response time. You may use the results of this test to determine transformation time, t... you use any analog or real-time digital filters during emission testing, you must operate those... the rise time and fall time as needed. You may also configure analog or digital filters before...
NASA Technical Reports Server (NTRS)
Fields, Christina M.
2013-01-01
The Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI) is responsible for providing simulations to support test and verification of SCCS hardware and software. The Universal Coolant Transporter System (UCTS) is a piece of Ground Servicing Equipment (GSE) that supports the Space Shuttle Orbiter. The purpose of the UCTS is to provide two support services to the Space Shuttle Orbiter immediately after landing at the Shuttle Landing Facility. The Simulation uses GSE Models to stand in for the actual systems to support testing of SCCS systems during their development. As an intern at KSC, my assignment was to develop a model component for the UCTS. I was given a fluid component (drier) to model in Matlab. The drier was a Catch-All replaceable-core type filter-drier. The filter-drier provides maximum protection for the thermostatic expansion valve and solenoid valve from dirt that may be in the system. The filter-drier also protects the valves from freezing up. I researched fluid dynamics to understand the function of my component. I completed training for UNIX and Simulink to help aid in my assignment. The filter-drier was modeled by determining the effects it has on the pressure, velocity, and temperature of the system. I used Bernoulli's equation to calculate the pressure and velocity differential through the drier. I created my model filter-drier in Simulink and wrote the test script to test the component. I completed component testing and captured test data. The finalized model was sent for peer review for any improvements.
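The drier model's core relation is Bernoulli's equation between inlet and outlet; a minimal sketch with a hypothetical loss coefficient for the replaceable core:

```python
def drier_pressure_drop(rho, v_in, v_out, k_loss=0.0):
    """Pressure differential across the filter-drier (level change neglected).

    dp = rho/2 * (v_out^2 - v_in^2) plus a minor-loss term for the core;
    k_loss is an assumed empirical coefficient, not a UCTS value.
    """
    return 0.5 * rho * (v_out**2 - v_in**2) + 0.5 * rho * k_loss * v_in**2

# Example: coolant at 1000 kg/m^3 accelerating from 1.0 to 1.5 m/s
print(drier_pressure_drop(1000.0, 1.0, 1.5, k_loss=2.0))  # Pa
```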
NASA Astrophysics Data System (ADS)
Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong
2011-04-01
As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC) due to the continuous reduction of layout dimensions and the lithographic limit set by the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. In the case of model-based OPC, to cross-confirm the contour image against the target layout, post-OPC verification solutions continue to be developed - methods for generating contours and matching them to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicate patterns. Detecting only real errors by excluding false ones is the most important requirement for an accurate and fast verification process - it saves not only review time and engineering resources but also overall wafer process time. In the general case of post-OPC verification for the metal-contact/via coverage (CC) check, the verification solution outputs a huge number of errors due to borderless design, so it is too difficult to review and correct all of them. This can cause OPC engineers to miss real defects and, at the least, delay time to market. In this paper, we studied a method for increasing the efficiency of post-OPC verification, especially for the CC check. For metal layers, the final CD after the etch process shows a varying CD bias that depends on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. Through optimization of the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and decreased the review time needed to find real errors. In summary, we suggest increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of applying an etch model.
Schaffner, B; Kanai, T; Futami, Y; Shimbo, M; Urakabe, E
2000-04-01
The broad-beam three-dimensional irradiation system under development at National Institute of Radiological Sciences (NIRS) requires a small ridge filter to spread the initially monoenergetic heavy-ion beam to a small spread-out Bragg peak (SOBP). A large SOBP covering the target volume is then achieved by a superposition of differently weighted and displaced small SOBPs. Two approaches were studied for the definition of a suitable ridge filter and experimental verifications were performed. Both approaches show a good agreement between the calculated and measured dose and lead to a good homogeneity of the biological dose in the target. However, the ridge filter design that produces a Gaussian-shaped spectrum of the particle ranges was found to be more robust to small errors and uncertainties in the beam application. Furthermore, an optimization procedure for two fields was applied to compensate for the missing dose from the fragmentation tail for the case of a simple-geometry target. The optimized biological dose distributions show that a very good homogeneity is achievable in the target.
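The superposition idea can be sketched numerically: weighted, range-shifted pristine peaks (or small SOBPs) sum to a large SOBP. The Bragg-curve shape below is schematic, not a transport calculation:

```python
import numpy as np

def bragg_curve(z, r, width=3.0):
    """Schematic pristine Bragg curve with range r (illustrative only)."""
    peak = np.exp(-((z - r) ** 2) / (2 * width**2))
    return (0.3 + 0.7 * peak) * (z <= r + 2 * width)

z = np.linspace(0, 160, 801)
ranges = np.arange(100, 131, 3.0)              # displaced small peaks
weights = np.linspace(0.6, 1.0, ranges.size)   # deeper peaks weighted more

sobp = sum(w * bragg_curve(z, r) for w, r in zip(weights, ranges))
# In practice the weights are optimized (against biological dose), and the
# ridge filter realizes the per-peak range spectrum passively.
```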
Commissioning and Science Verification of JAST/T80
NASA Astrophysics Data System (ADS)
Ederoclite, A.; Cenarro, A. J.; Marín-Franch, A.; Cristóbal-Hornillos, D.; Vázquez Ramió, H.; Varela, J.; Hurier, G.; Moles, M.; Lamadrid, J. L.; Díaz-Martín, M. C.; Iglesias Marzoa, R.; Tilve, V.; Rodríguez, S.; Maícas, N.; Abri, J.
2017-03-01
Located at the Observatorio Astrofísico de Javalambre, the "Javalambre Auxiliary Survey Telescope" is an 80 cm telescope with an unvignetted 2 square degree field of view. The telescope is equipped with T80Cam, a camera with a large-format CCD and two filter wheels which can host, at any given time, 12 filters. The telescope has been designed to provide optical quality all across the field of view, which is achieved with a field corrector. In this talk, I will review the commissioning of the telescope. The optical performance in the centre of the field of view has been tested with the lucky imaging technique, yielding a telescope PSF of 0.4 arcsec, which is close to the one expected from theory. Moreover, the tracking of the telescope does not affect the image quality, as it has been shown that stars appear round even in exposures of 10 minutes obtained without guiding. Most importantly, we present the preliminary results of science verification observations which combine the two main characteristics of this telescope: the large field of view and the special filter set.
Mineral mapping in the Maherabad area, eastern Iran, using the HyMap remote sensing data
NASA Astrophysics Data System (ADS)
Molan, Yusuf Eshqi; Refahi, Davood; Tarashti, Ali Hoseinmardi
2014-04-01
This study applies matched filtering to the HyMap airborne hyperspectral data to obtain a distribution map of alteration minerals in the Maherabad area and uses virtual verification to verify the results. This paper also introduces a "moving threshold," which seeks an appropriate threshold value for converting the gray-scale images produced by mapping methods into target and background pixels. The Maherabad area, located in the eastern part of the Lut block, is a Cu-Au porphyry system in which quartz-sericite-pyrite, argillic and propylitic alteration are most common. A minimum noise fraction transform coupled with a pixel purity index was applied to the HyMap images to extract the endmembers of the alteration minerals, including kaolinite, montmorillonite, sericite (muscovite/illite), calcite, chlorite, epidote, and goethite. Since there was no access to a portable spectrometer or lab spectral measurements for verifying the remote sensing imagery results, virtual verification was performed using the USGS spectral library and showed an agreement of 83.19%. The comparison between the results of the matched filtering and X-ray diffraction (XRD) analyses also showed an agreement of 56.13%.
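The per-pixel matched-filter score applied to a hyperspectral cube has the standard form; with x the pixel spectrum, m and Σ the background mean and covariance, and t the target (alteration-mineral) spectrum:

```latex
\mathrm{MF}(x) = \frac{(x - m)^{\mathsf{T}} \Sigma^{-1} (t - m)}
                      {(t - m)^{\mathsf{T}} \Sigma^{-1} (t - m)}
```

Scores near 1 indicate pixels dominated by the target spectrum; the "moving threshold" then searches for the cut that best separates target from background pixels in this gray-scale score image.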
Optical security verification for blurred fingerprints
NASA Astrophysics Data System (ADS)
Soon, Boon Y.; Karim, Mohammad A.; Alam, Mohammad S.
1998-12-01
Optical fingerprint security verification is gaining popularity, as it has the potential to perform correlation at the speed of light. With advancement in optical security verification techniques, the authentication process can be almost foolproof and reliable for financial transactions, banking, etc. In law enforcement, when a fingerprint is obtained from a crime scene, it may be blurred and can be an unhealthy candidate for correlation purposes. Therefore, the blurred fingerprint needs to be clarified before it is used in the correlation process. There are several different types of blur, such as linear motion blur and defocus blur induced by aberrations of the imaging system. In addition, we may or may not know the blur function. In this paper, we propose non-singularity inverse filtering in the frequency/power domain for deblurring fingerprints with known motion-induced blur. This filtering process is incorporated with the power spectrum subtraction technique, a uniqueness comparison scheme, and the separated target and reference planes method in the joint transform correlator. The proposed hardware implementation is a hybrid electronic-optical correlator system. The performance of the proposed system is verified with computer simulation for both cases: with and without additive random noise corruption.
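A minimal sketch of non-singular inverse filtering for a known blur (the regularization here is a simple magnitude clamp; the paper's exact scheme also involves power spectrum subtraction):

```python
import numpy as np

def inverse_filter(blurred, psf, eps=1e-3):
    """Frequency-domain deblurring with a non-singularity guard.

    Plain inverse filtering divides by H(f), which blows up where H ~ 0;
    clamping |H| below eps keeps the inverse well defined.
    """
    H = np.fft.fft2(psf, s=blurred.shape)
    H_safe = np.where(np.abs(H) < eps, eps, H)
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) / H_safe))
```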
Sekar, Yuvaraj; Thoelking, Johannes; Eckl, Miriam; Kalichava, Irakli; Sihono, Dwi Seno Kuncoro; Lohr, Frank; Wenz, Frederik; Wertz, Hansjoerg
2018-04-01
The novel MatriXX FFF (IBA Dosimetry, Germany) detector is a new 2D ionization chamber array designed for patient-specific IMRT plan verification, including flattening-filter-free (FFF) beams. This study provides a detailed characterization and clinical evaluation of the new detector array. The verification of the MatriXX FFF was subdivided into (i) physical dosimetric tests, including dose linearity, dose-rate dependency and output factor measurements, and (ii) patient-specific IMRT pre-treatment plan verifications. The MatriXX FFF measurements were compared to the dose distribution calculated by a commissioned treatment planning system using gamma index and dose difference evaluations for 18 IMRT sequences. All IMRT sequences were measured with the original gantry angles and with all beams collapsed to 0° gantry angle to exclude the influence of the detector's angular dependency. The MatriXX FFF was found to be linear and dose-rate independent for all investigated modalities (deviations ≤0.6%). Furthermore, the output measurements of the MatriXX FFF were in very good agreement with reference measurements (deviations ≤1.8%). For the clinical evaluation, an average pixel passing rate for γ(3%, 3 mm) of (98.5±1.5)% was achieved when applying a gantry angle correction. With all beams collapsed to 0° gantry angle, excellent agreement with the calculated dose distribution was also observed (γ(3%, 3 mm) = (99.1±1.1)%). The MatriXX FFF fulfills all physical requirements in terms of dosimetric accuracy. Furthermore, the evaluation of the IMRT plan measurements showed that the detector, particularly together with the gantry angle correction, is a reliable device for IMRT plan verification including FFF.
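For readers unfamiliar with the γ(3%, 3 mm) metric quoted above, the brute-force sketch below evaluates a global 2D gamma pass rate between a measured and a calculated dose plane. The pixel pitch and low-dose cutoff are assumed values; this is not IBA's or the authors' implementation.

```python
# Global 2D gamma evaluation (dose difference normalized to max calculated dose).
import numpy as np

def gamma_pass_rate(measured, calculated, pitch_mm=7.62,
                    dd=0.03, dta_mm=3.0, cutoff=0.10):
    ref_max = calculated.max()
    r = int(np.ceil(dta_mm / pitch_mm)) + 1          # search radius in pixels
    rows, cols = measured.shape
    passed = evaluated = 0
    for i in range(rows):
        for j in range(cols):
            if calculated[i, j] < cutoff * ref_max:  # skip the low-dose region
                continue
            evaluated += 1
            best = np.inf
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    ii, jj = i + di, j + dj
                    if not (0 <= ii < rows and 0 <= jj < cols):
                        continue
                    dist = np.hypot(di, dj) * pitch_mm
                    diff = measured[i, j] - calculated[ii, jj]
                    best = min(best, (diff / (dd * ref_max)) ** 2
                                     + (dist / dta_mm) ** 2)
            passed += best <= 1.0                    # gamma <= 1 means "pass"
    return 100.0 * passed / max(evaluated, 1)
```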
Competitive region orientation code for palmprint verification and identification
NASA Astrophysics Data System (ADS)
Tang, Wenliang
2015-11-01
Orientation features of the palmprint have been widely investigated in coding-based palmprint-recognition methods. Conventional orientation-based coding methods usually use discrete filters to extract the orientation feature of the palmprint. In practice, however, the orientations of the filters are often not consistent with the lines of the palmprint. We thus propose a competitive region orientation-based coding method, together with an effective weighted balance scheme to improve the accuracy of the extracted region orientation. Compared with conventional methods, the region orientation extracted by the proposed method describes the orientation feature of the palmprint precisely and robustly. Extensive experiments on the baseline PolyU and multispectral palmprint databases show that the proposed method achieves promising performance in comparison with conventional state-of-the-art orientation-based coding methods in both palmprint verification and identification.
Hailstorms over Switzerland: Verification of Crowd-sourced Data
NASA Astrophysics Data System (ADS)
Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia
2016-04-01
The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observations of ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with a smartphone application recently developed by MeteoSwiss. The precise location and time of hail precipitation and the hailstone size are included in the crowd-sourced data, which are assessed on the basis of MeteoSwiss weather radar data. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are compared against the crowd-sourced data. The available data and investigation period span June to August 2015. Filter criteria have been applied to remove false reports from the crowd-sourced data, and neighborhood methods have been introduced to reduce the uncertainties resulting from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing a categorical verification. Verification scores (e.g., hit rate) are then calculated from a 2x2 contingency table, as sketched below. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed, and the relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.
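A minimal sketch of the categorical scores derived from a 2x2 contingency table, assuming binary arrays for the thresholded crowd reports and radar detections (the array names are hypothetical).

```python
# Categorical verification scores from hits / misses / false alarms.
import numpy as np

def contingency_scores(reports, radar):
    hits = np.sum(reports & radar)
    misses = np.sum(~reports & radar)
    false_alarms = np.sum(reports & ~radar)
    hit_rate = hits / (hits + misses)            # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return hit_rate, far, csi
```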
Nonlinear research of an image motion stabilization system embedded in a space land-survey telescope
NASA Astrophysics Data System (ADS)
Somov, Yevgeny; Butyrin, Sergey; Siguerdidjane, Houria
2017-01-01
We consider an image motion stabilization system embedded in a space telescope for scanning optoelectronic observation of terrestrial targets. A model of this system is presented that takes into account the physical hysteresis of the piezo-ceramic driver and the time delay introduced in forming the digital control. We present the elaborated algorithms for discrete filtering and digital control, results of the analysis of image-motion velocity oscillations in the telescope focal plane, and methods for terrestrial and in-flight verification of the system.
Integrity Verification for SCADA Devices Using Bloom Filters and Deep Packet Inspection
2014-03-27
This thesis develops integrity verification for SCADA devices using Bloom filters and deep packet inspection. The indexed excerpt preserves only fragments of the related-work discussion and bibliography, citing work on preventing intrusions in smart grids, Parthasarathy's anomaly-detection IDS that takes system state into account, Linda, Manic, and Vollmer on improving cyber-security of smart grid systems via anomaly detection [LMV12], and Parthasarathy and Kundur, "Bloom filter based intrusion detection for smart grid SCADA," Electrical & Computer Engineering, 2012 [PK12].
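For context, a minimal Bloom filter of the kind referenced in the cited work: membership tests may return false positives but never false negatives, which is what makes the structure attractive for fast packet screening. The sketch below is generic, not the thesis's implementation, and the packet-signature strings are hypothetical.

```python
# A generic Bloom filter: k hash positions per item over an m-bit array.
import hashlib

class BloomFilter:
    def __init__(self, m_bits=8192, k_hashes=4):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits // 8)

    def _positions(self, item):
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item):
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

# e.g. record known-good packet signatures, then flag anything unseen
bf = BloomFilter()
bf.add("read_coils:unit7")
assert "read_coils:unit7" in bf
```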
NASA Astrophysics Data System (ADS)
Vilardy, Juan M.; Giacometto, F.; Torres, C. O.; Mattos, L.
2011-01-01
The two-dimensional fast Fourier transform (2D FFT) is an essential tool in the analysis and processing of two-dimensional discrete signals and enables a large number of applications. This article presents the description and synthesis in VHDL of a 2D FFT with fixed-point binary representation using the Simulink HDL Coder tool of Matlab, showing a quick and easy way to handle overflow and underflow, to create registers, adders and multipliers for complex data in VHDL, and to generate test benches for verifying the generated code in the ModelSim tool. The main objective of developing the 2D FFT hardware architecture is the subsequent implementation of the following operations applied to images: frequency filtering, convolution and correlation. The description and synthesis of the hardware architecture target the Xilinx Spartan-3E XC3S1200E FPGA.
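A floating-point software reference for one of the target operations, frequency filtering, which the fixed-point VHDL pipeline is meant to accelerate; the circular low-pass mask is an illustrative choice.

```python
# FFT -> mask -> inverse FFT low-pass filtering of an image.
import numpy as np

def lowpass_filter(image, cutoff_frac=0.1):
    F = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    yy, xx = np.ogrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    mask = (xx ** 2 + yy ** 2) <= (cutoff_frac * min(h, w)) ** 2
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))
```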
Cleanup Verification Package for the 118-F-1 Burial Ground
DOE Office of Scientific and Technical Information (OSTI.GOV)
E. J. Farris and H. M. Sulloway
2008-01-10
This cleanup verification package documents completion of remedial action for the 118-F-1 Burial Ground on the Hanford Site. This burial ground is a combination of two locations formerly called Minor Construction Burial Ground No. 2 and Solid Waste Burial Ground No. 2. This waste site received radioactive equipment and other miscellaneous waste from 105-F Reactor operations, including dummy elements and irradiated process tubing; gun barrel tips, steel sleeves, and metal chips removed from the reactor; filter boxes containing reactor graphite chips; and miscellaneous construction solid waste.
VERIFICATION TESTING OF TECHNOLOGIES TO CLEAN OR FILTER VENTILATION AIR
Because of the importance of indoor air quality, Research Triangle Institute's Air Pollution Control Technology is adding indoor air products as a new technology category available for testing. This paper discusses RTI's participation in previous Environmental Technology Verifica...
Li, Sui-Xian
2018-05-07
Previous research has shown the effectiveness of selecting filter sets from among a large set of commercial broadband filters using a vector analysis method based on maximum linear independence (MLI). However, the traditional MLI approach is suboptimal because the first filter of the selected set must be predefined as the one with the maximum ℓ₂ norm among all available filters. An exhaustive imaging simulation with every single filter serving as the first filter was conducted to investigate the features of the most competent filter set. From the simulation, the characteristics of the most competent filter set were discovered: besides minimization of the condition number, the geometric features of the best-performing filter set comprise a distinct transmittance peak of the first filter along the wavelength axis, a generally uniform distribution of the filters' peaks, and substantial overlap of the transmittance curves of adjacent filters. Therefore, the best-performing filter sets can be recognized intuitively by simple vector analysis and just a few experimental verifications. A practical two-step framework for selecting an optimal filter set is recommended, which guarantees a significant enhancement of system performance. This work should be useful for optimizing the spectral sensitivity of broadband multispectral imaging sensors.
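A sketch of the vector-analysis view under stated assumptions: the columns of a hypothetical matrix T hold filter transmittances sampled over wavelength, and candidate subsets are scored by condition number, as in the criterion described above.

```python
# Exhaustive subset scoring by condition number (smaller is better); the
# `candidates` argument can restrict the search, as in the two-step framework.
import numpy as np
from itertools import combinations

def best_filter_set(T, n_select, candidates=None):
    cols = candidates if candidates is not None else range(T.shape[1])
    best, best_cond = None, np.inf
    for subset in combinations(cols, n_select):
        c = np.linalg.cond(T[:, list(subset)])  # conditioning of the chosen columns
        if c < best_cond:
            best, best_cond = subset, c
    return best, best_cond
```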
NASA Technical Reports Server (NTRS)
Fields, Christina M.
2013-01-01
The Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI) is responsible for providing simulations to support test and verification of SCCS hardware and software. The Universal Coolant Transporter System (UCTS) was a piece of Ground Servicing Equipment (GSE) supporting the Space Shuttle Orbiter. The initial purpose of the UCTS was to provide two support services to the Space Shuttle Orbiter immediately after landing at the Shuttle Landing Facility. The UCTS is designed with the capability of servicing future space vehicles, including all Space Station requirements necessary for the MPLM modules. The simulation uses GSE models to stand in for the actual systems to support testing of SCCS systems during their development. As an intern at Kennedy Space Center (KSC), my assignment was to develop a model component for the UCTS. I was given a fluid component (dryer) to model in Simulink, after completing training for UNIX and Simulink. The dryer is a Catch-All replaceable-core-type filter-dryer. The filter-dryer provides maximum protection for the thermostatic expansion valve and solenoid valve from dirt that may be in the system, and also protects the valves from freezing up. I researched fluid dynamics to understand the function of my component. The filter-dryer was modeled by determining the effects it has on the pressure and velocity of the system, using Bernoulli's equation to calculate the pressure and velocity differential through the dryer. I created my filter-dryer model in Simulink, wrote the test script to test the component, completed component testing and captured test data. The finalized model was sent for peer review. I participated in simulation meetings and was involved in the subsystem design process and team collaborations, gaining valuable work experience and insight into a career path as an engineer.
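A minimal illustration of the Bernoulli/continuity calculation described above, with hypothetical fluid properties and flow areas; it is not the actual Simulink model.

```python
# Outlet pressure and velocity across a flow restriction (e.g. the dryer core).
RHO = 1200.0  # assumed fluid density, kg/m^3

def dryer_outlet(p1, v1, a1, a2):
    v2 = v1 * a1 / a2                            # continuity: A1*v1 = A2*v2
    p2 = p1 + 0.5 * RHO * (v1 ** 2 - v2 ** 2)    # Bernoulli along the streamline
    return p2, v2

# Example with placeholder values: 6 bar inlet, 1.5 m/s, area contraction
p2, v2 = dryer_outlet(p1=6.0e5, v1=1.5, a1=2.0e-4, a2=1.2e-4)
```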
Ozone Contamination in Aircraft Cabins: Appendix B: Overview papers. Ozone destruction techniques
NASA Technical Reports Server (NTRS)
Wilder, R.
1979-01-01
Ozone filter test program and ozone instrumentation are presented. Tables on the flight tests, small-scale lab tests, and full-scale lab tests were reviewed. Design verification, flammability, vibration, accelerated contamination, life cycle, and cabin air quality are described.
Beam energy tracking system on Optima XEx high energy ion implanter
DOE Office of Scientific and Technical Information (OSTI.GOV)
David, Jonathan; Satoh, Shu; Wu Xiangyang
2012-11-06
The Axcelis Optima XEx high energy implanter is an RF linac-based implanter with 12 RF resonators for beam acceleration. Even though each acceleration field is an alternating, sinusoidal RF field, the well-known phase-focusing principle produces a beam with a sharp quasi-monoenergetic energy spectrum. A magnetic energy filter after the linac further attenuates the low-energy continuum in the energy spectrum often associated with RF acceleration. The final beam energy is a function of the phase and amplitude of the 12 resonators in the linac. When tuning a beam, the magnetic energy filter is set to the desired energy, and each linac parameter is tuned to maximize the transmission through the filter. Once a beam is set up, all the parameters are stored in a recipe, which can be easily tuned and has proven to be quite repeatable. The magnetic field setting of the energy filter selects the beam energy from the RF linac accelerator, and in-situ verification of beam energy in addition to the magnetic energy filter setting has long been desired. An independent energy tracking system was developed for this purpose, using the existing electrostatic beam scanner as a deflector to construct an in-situ electrostatic energy analyzer. This paper describes the system and performance of the beam energy tracking system.
TEST QA PLAN FOR THE VERIFICATION TESTING OF BAGHOUSE FILTRATION PRODUCTS
Baghouses and their accompanying filter media are a leading particulate control technique for industrial sources. Increasing emphasis on higher removal efficiencies has helped the baghouse become even more competitive when compared to other control devices. At present there is n...
The Multiple Doppler Radar Workshop, November 1979.
NASA Astrophysics Data System (ADS)
Carbone, R. E.; Harris, F. I.; Hildebrand, P. H.; Kropfli, R. A.; Miller, L. J.; Moninger, W.; Strauch, R. G.; Doviak, R. J.; Johnson, K. W.; Nelson, S. P.; Ray, P. S.; Gilet, M.
1980-10-01
The findings of the Multiple Doppler Radar Workshop are summarized in a series of six papers. Part I of this series briefly reviews the history of multiple Doppler experimentation, fundamental concepts of Doppler signal theory, and the organization and objectives of the Workshop. Invited presentations by dynamicists and cloud physicists are also summarized.

Experimental design and procedures (Part II) are shown to be of critical importance. Well-defined and limited experimental objectives are necessary in view of technological limitations. Specified radar scanning procedures that balance temporal and spatial resolution considerations are discussed in detail. Improved siting for suppression of ground clutter as well as scanning procedures to minimize errors at echo boundaries are discussed. The need for accelerated research using numerically simulated proxy data sets is emphasized.

New technology to eliminate various sampling limitations is cited in Part III as an eventual solution to many current problems. Ground clutter contamination may be curtailed by means of full spectral processing, digital filters in real time, and/or variable pulse repetition frequency. Range and velocity ambiguities also may be minimized by various pulsing options as well as random phase transmission. Sidelobe contamination can be reduced through improvements in radomes, illumination patterns, and antenna feed types. Radar volume-scan time can be sharply reduced by means of wideband transmission, phased array antennas, multiple beam antennas, and frequency agility.

Part IV deals with the synthesis of data from several radars in the context of scientific requirements in cumulus clouds, widespread precipitation, and severe convective storms. The important temporal and spatial scales are examined together with the accuracy required for vertical air motion in each phenomenon. Factors that introduce errors in the vertical velocity field are identified, and synthesis techniques are discussed separately for the dual Doppler and multiple Doppler cases. Various filters and techniques, including statistical and variational approaches, are mentioned. Emphasis is placed on the importance of experiment design and procedures, technological improvements, incorporation of all information from supporting sensors, and analysis priority for physically simple cases. Integrated reliability is proposed as an objective tool for radar siting.

Verification of multiple Doppler-derived vertical velocity is discussed in Part V. Three categories of verification are defined: direct, deductive, and theoretical/numerical. Direct verification consists of zenith-pointing radar measurements (from either airborne or ground-based systems), air motion sensing aircraft, instrumented towers, and tracking of radar chaff. Deductive sources include mesonetworks, aircraft (thermodynamic and microphysical) measurements, satellite observations, radar reflectivity, multiple Doppler consistency, and atmospheric soundings. Theoretical/numerical sources of verification include proxy data simulation, momentum checking, and numerical cloud models. New technology, principally in the form of wide bandwidth radars, is seen as a development that may reduce the need for extensive verification of multiple Doppler-derived vertical air motions. Airborne Doppler radar is perceived as the single most important source of verification within the bounds of existing technology.

Nine stages of data processing and display are identified in Part VI: field checks, archival, selection, editing, coordinate transformation, synthesis of Cartesian fields, filtering, display, and physical analysis. Display of data is considered a problem critical to assimilation of data at all stages. Interactive computing systems and software are concluded to be very important, particularly for the editing stage. Three- and four-dimensional displays are considered essential for data assimilation, particularly at the physical analysis stage. The concept of common data tape formats is approved both for data in radar spherical space and for synthesized Cartesian output.
Diagnostics and Identification of Injection Duration of Common Rail Diesel Injectors
NASA Astrophysics Data System (ADS)
Krogerus, Tomi R.; Huhtala, Kalevi J.
2018-02-01
In this paper, we study the diagnostics and identification of the injection duration of common rail (CR) diesel pilot injectors of dual-fuel engines. In these pilot injectors the injected volume is small, so shot-to-shot repeatability in every cylinder and identification of injector drift are important for a well-balanced engine and reduced overall wear. A diagnostics method based on analysis of the CR pressure signal is presented together with experimental verification results. Using the developed method, the relative duration of injection events can be identified. In the method, the pressure signal during the injection is first extracted after the control of each injection event. The signal is then normalized and filtered, and a derivative of the filtered signal is calculated. A change in the derivative of the filtered signal larger than a predefined threshold indicates an injection event, which can be detected and its relative duration identified. The efficacy of the proposed diagnostics method is demonstrated with experimental results, which show that the developed method detects drifts in injection duration and the magnitude of the drift. According to the results, a change of ≥ 10 μs (2% of a 500 μs injection) in injection time can be identified.
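A hedged sketch of the detection pipeline on a hypothetical rail-pressure trace; the moving-average window and threshold factor are assumptions, since the paper's filter design and threshold values are not given here.

```python
# Normalize, smooth, differentiate, and flag derivative excursions.
import numpy as np

def detect_injection(p, fs, win=15, k_thresh=4.0):
    x = (p - p.mean()) / p.std()                   # normalize the pressure trace
    kernel = np.ones(win) / win
    xf = np.convolve(x, kernel, mode="same")       # moving-average filtering
    dx = np.gradient(xf) * fs                      # derivative of the filtered signal
    thresh = k_thresh * np.std(dx)
    events = np.flatnonzero(np.abs(dx) > thresh)   # samples exceeding the threshold
    if events.size == 0:
        return None
    duration = (events[-1] - events[0]) / fs       # relative injection duration, s
    return events[0], duration
```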
Kim, Byeong Hak; Kim, Min Young; Chae, You Seong
2017-01-01
Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera, such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward-looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of NUC, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with mid-wave infrared (MWIR) and long-wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC.
The Environmental Technology Verification report discusses the technology and performance of the Lubrizol Engine Control Systems Purifilter SC17L manufactured by Lubrizol Engine Control Systems. The technology is a precious and base metal, passively regenerated particulate filter...
Framework for Development and Distribution of Hardware Acceleration
NASA Astrophysics Data System (ADS)
Thomas, David B.; Luk, Wayne W.
2002-07-01
This paper describes IGOL, a framework for developing reconfigurable data processing applications. While IGOL was originally designed to target imaging and graphics systems, its structure is sufficiently general to support a broad range of applications. IGOL adopts a four-layer architecture: application layer, operation layer, appliance layer and configuration layer. This architecture is intended to separate and co-ordinate both the development and execution of hardware and software components. Hardware developers can use IGOL as an instance testbed for verification and benchmarking, as well as for distribution. Software application developers can use IGOL to discover hardware-accelerated data processors and to access them in a transparent, non-hardware-specific manner. IGOL provides extensive support for the RC1000-PP board via the Handel-C language, and a wide selection of image processing filters has been developed. IGOL also supplies plug-ins to enable such filters to be incorporated in popular applications such as Premiere, Winamp, VirtualDub and DirectShow. Moreover, IGOL allows the automatic use of multiple cards to accelerate an application, demonstrated using DirectShow. To enable transparent acceleration without sacrificing performance, a three-tiered COM (Component Object Model) API has been designed and implemented. This API provides a well-defined and extensible interface which facilitates the development of hardware data processors that can accelerate multiple applications.
TPS(PET)-A TPS-based approach for in vivo dose verification with PET in proton therapy.
Frey, K; Bauer, J; Unholtz, D; Kurz, C; Krämer, M; Bortfeld, T; Parodi, K
2014-01-06
Since interest in ion irradiation for tumour therapy has increased significantly over the last few decades, intensive investigations are being performed to improve the accuracy of this form of patient treatment. One major goal is the development of methods for in vivo dose verification. In proton therapy, a PET (positron emission tomography)-based approach measuring the irradiation-induced tissue activation inside the patient has already been clinically implemented. The acquired PET images can be compared to an expectation, derived under the assumption of a correct treatment application, to validate the particle range and the lateral field position in vivo. In this work, TPSPET is introduced as a new approach to predict proton-irradiation-induced three-dimensional positron emitter distributions by means of the same algorithms as the clinical treatment planning system (TPS). In order to perform the additional activity calculations, reaction-channel-dependent input positron emitter depth distributions are necessary; these are determined by applying a modified filtering approach to the TPS reference depth dose profiles in water. This paper presents the implementation of TPSPET on the basis of the research treatment planning software "treatment planning for particles". The results are validated in phantom and patient studies against Monte Carlo simulations, and compared to β⁺-emitter distributions obtained from a slightly modified version of the originally proposed one-dimensional filtering approach applied to three-dimensional dose distributions. In contrast to previously introduced methods, TPSPET provides a faster implementation, the results show no sensitivity to lateral field extension, and the predicted β⁺-emitter densities are fully consistent with the planned treatment dose, as they are calculated by the same pencil beam algorithms. These findings suggest a large potential for the application of TPSPET to in vivo dose verification in the daily clinical routine.
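A schematic of the underlying one-dimensional filtering idea, with toy arrays standing in for the TPS depth-dose profile and the reaction-channel kernel; it only illustrates the convolution step, not the TPSPET implementation.

```python
# Positron-emitter depth profile as (depth dose) convolved with a channel kernel.
import numpy as np

depth = np.linspace(0.0, 20.0, 401)                    # depth in water, cm
dose = np.exp(-0.5 * ((depth - 15.0) / 0.4) ** 2)      # toy Bragg-peak-like profile
kernel = np.exp(-np.linspace(0.0, 5.0, 101))           # assumed filter kernel, one channel
kernel /= kernel.sum()                                 # normalize the filter
activity = np.convolve(dose, kernel, mode="same")      # predicted beta+ emitter profile
```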
Chowdhury, Debbrota Paul; Bakshi, Sambit; Guo, Guodong; Sa, Pankaj Kumar
2017-11-27
In this paper, an overall framework is presented for person verification using ear biometrics, with a tunable filter bank as the local feature extractor. The tunable filter bank, based on a half-band polynomial of 14th order, extracts distinct features from ear images while maintaining its frequency selectivity property. To demonstrate the applicability of the tunable filter bank to ear biometrics, recognition tests have been performed on the constrained databases AMI, WPUT, and IITD and the unconstrained database UERC. Experiments have been conducted applying the tunable-filter-based feature extractor to subparts of the ear, with four and six subdivisions of the ear image. Analyzing the experimental results, it was found that the tunable filter succeeds in distinguishing ear features on par with the state-of-the-art features used for ear recognition. Accuracies of 70.58%, 67.01%, 81.98%, and 57.75% have been achieved on the AMI, WPUT, IITD, and UERC databases, using the Canberra distance as the underlying measure of separation. The performance indicates that the tunable filter is a candidate for recognizing humans from ear images.
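The Canberra distance used as the measure of separation is straightforward to state; a direct sketch follows (the feature extraction itself is not shown).

```python
# Canberra distance between two feature vectors.
import numpy as np

def canberra(u, v, eps=1e-12):
    num = np.abs(u - v)
    den = np.abs(u) + np.abs(v) + eps  # eps avoids 0/0 for all-zero components
    return np.sum(num / den)
```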
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chung, J; Kim, J; Kang, S
2015-06-15
Purpose: The purpose of this study is to assess VMAT-SABR plans using flattening filter (FF) and flattening-filter-free (FFF) beams, and to compare the verification results for all pretreatment plans. Methods: SABR plans for 20 prostate patients were optimized in the Eclipse treatment planning system. The prescription dose was 42.7 Gy in 7 fractions. Four SABR plans for each patient were calculated using the Acuros XB algorithm with both FF and FFF beams of 6 and 10 MV. The dose-volume histograms (DVH) and technical parameters were recorded and compared. Pretreatment verification was performed and gamma analysis was used to quantify the agreement between calculations and measurements. Results: For each patient, the DVHs are closely similar for the plans of the four different beams; only small differences appear in the dose distributions and corresponding DVHs when comparing the plans for the same patient. Sparing of bladder and rectum was slightly better in plans with 10-MV FF and FFF than with 6-MV FF and FFF, but this difference was negligible, and there was no significant difference in the other OARs. The mean agreement at the 3%/3 mm criterion was higher than 97% in all plans. The mean MUs and delivery times were 1701±101 and 3.02±0.17 min for 6-MV FF, 1870±116 and 1.69±0.08 min for 6-MV FFF, 1471±86 and 2.68±0.14 min for 10-MV FF, and 1619±101 and 0.98±0.04 min for 10-MV FFF, respectively. Conclusion: Dose distributions of prostate SABR plans using FFF beams were similar to those generated with FF beams; however, the use of FFF beams offers a clear benefit in delivery time. Pretreatment verification also showed acceptable and comparable results in all plans using FF as well as FFF beams. Therefore, this study suggests that the use of FFF beams is a feasible and efficient technique for prostate SABR.
Baghouses are air pollution control devices used to control particulate emissions from stationary sources and are among the technologies evaluated by the APCT Center. Baghouses and their accompanying filter media have long been one of the leading particulate control techniques fo...
NASA Technical Reports Server (NTRS)
Connolly, Joseph W.; Csank, Jeffrey Thomas; Chicatelli, Amy; Kilver, Jacob
2013-01-01
This paper covers the development of a model-based engine control (MBEC) methodology featuring a self-tuning on-board model applied to an aircraft turbofan engine simulation. Here, the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS40k) serves as the MBEC application engine. CMAPSS40k is capable of modeling realistic engine performance, allowing verification of the MBEC over a wide range of operating points. The on-board model is a piece-wise linear model derived from CMAPSS40k and updated using an optimal tuner Kalman filter (OTKF) estimation routine, which enables the on-board model to self-tune to account for engine performance variations. The focus here is on developing a methodology for MBEC with direct control of estimated parameters of interest, such as thrust and stall margins. Investigations using the MBEC to provide a stall margin limit for the controller protection logic are presented that could provide benefits over the simple acceleration schedule currently used in traditional engine control architectures.
NASA Astrophysics Data System (ADS)
Vora, A. P.; Rami, J. B.; Hait, A. K.; Dewan, C. P.; Subrahmanyam, D.; Kirankumar, A. S.
2017-11-01
The next-generation Indian meteorological satellite will carry a Sounder instrument with a filter wheel subsystem measuring Ø260 mm and carrying 18 filters arranged in three concentric rings. These filters, made from germanium, are used to separate spectral channels in the IR band. The filter wheel is required to be cooled to 214 K and rotated at 600 rpm. This paper discusses the challenges faced in the mechanical design of the filter wheel, chiefly the filter mount design to protect the brittle germanium filters from failure under stresses at very low temperature, the compactness of the wheel and casings for improved thermal efficiency, survival under vibration loads, and material selection to keep the wheel light in weight. Properties of titanium, Kovar, Invar and aluminium are considered in the design. The mount has been designed to accommodate both thermal and dynamic loadings without introducing significant aberrations into the optics or incurring permanent alignment shifts. Detailed finite element analysis of the mounts was carried out for stress verification. Results of the qualification tests are discussed for the given temperature range of 100 K and vibration loads of 12 g sine and 11.8 g rms random at mount level. Results of the filter wheel qualification as mounted in the Electro-Optics Module (EOM) are also presented.
An Intrinsically Switchable Ladder-Type Ferroelectric BST-on-Si Composite FBAR Filter.
Lee, Seungku; Mortazawi, Amir
2016-03-01
This paper presents a ladder-type bulk acoustic wave (BAW) intrinsically switchable filter based on ferroelectric thin-film bulk acoustic resonators (FBARs). The switchable filter can be turned on and off by the application of an external bias voltage, owing to the electrostrictive effect in thin-film ferroelectrics. In this paper, barium strontium titanate (BST) is used as the ferroelectric material. A systematic design approach for switchable ladder-type ferroelectric filters is provided based on the required filter specifications. A switchable filter is implemented in the form of a BST-on-Si composite structure to control the effective electromechanical coupling coefficient of the FBARs. As an experimental verification, a 2.5-stage intrinsically switchable BST-on-Si composite FBAR filter is designed, fabricated, and measured. Measurement results for a typical BST-on-Si composite FBAR show a resonator mechanical quality factor (Qm) of 971, as well as a Qm × f product of 2423 GHz. The filter presented here provides a measured insertion loss of 7.8 dB, out-of-band rejection of 26 dB, and fractional bandwidth of 0.33% at 2.5827 GHz when the filter is in the on state at a dc bias of 40 V. In its off state, the filter exhibits an isolation of 31 dB.
Arithmetic Circuit Verification Based on Symbolic Computer Algebra
NASA Astrophysics Data System (ADS)
Watanabe, Yuki; Homma, Naofumi; Aoki, Takafumi; Higuchi, Tatsuo
This paper presents a formal approach to verifying arithmetic circuits using symbolic computer algebra. Our method describes arithmetic circuits directly with high-level mathematical objects based on weighted number systems and arithmetic formulae. Such circuit descriptions can be effectively verified by polynomial reduction techniques using Gröbner bases. In this paper, we describe how symbolic computer algebra can be used to describe and verify arithmetic circuits. The advantageous effects of the proposed approach are demonstrated through experimental verification of arithmetic circuits such as a multiply-accumulator and an FIR filter. The results show that the proposed approach is capable of verifying practical arithmetic circuits.
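A toy version of the principle using sympy (not the authors' tool): encode the gate equations of a half adder as polynomials, then check that the arithmetic specification reduces to zero modulo a Gröbner basis of the resulting ideal.

```python
# Half-adder verification by polynomial reduction over a Groebner basis.
from sympy import symbols, groebner

x, y, s, c = symbols("x y s c")
gates = [
    s - (x + y - 2 * x * y),  # XOR gate as a polynomial over 0/1 inputs
    c - x * y,                # AND gate
    x ** 2 - x,               # Boolean constraints on the inputs
    y ** 2 - y,
]
G = groebner(gates, x, y, s, c, order="lex")
spec = 2 * c + s - (x + y)    # weighted-sum specification of a half adder
_, remainder = G.reduce(spec)
print(remainder)              # 0  ==> the circuit implements the specification
```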
Upper Atmosphere Research Satellite (UARS) onboard attitude determination using a Kalman filter
NASA Technical Reports Server (NTRS)
Garrick, Joseph
1993-01-01
The Upper Atmosphere Research Satellite (UARS) requires highly accurate knowledge of its attitude to accomplish its mission. Propagation of the attitude state using gyro measurements is not sufficient to meet the accuracy requirements and must be supplemented by an observer/compensation process to correct for dynamics and observation anomalies. The process of amending the attitude state uses a well-known method, the discrete Kalman filter. This study is a sensitivity analysis of the discrete Kalman filter as implemented in the UARS Onboard Computer (OBC). The stability of the Kalman filter used in the normal on-orbit control mode within the OBC is investigated for the effects of corrupted observations and nonlinear errors, and a statistical analysis of the Kalman filter residuals is performed. These analyses are based on simulations using the UARS Dynamics Simulator (UARSDSIM) and compared against attitude requirements as defined by General Electric (GE). An independent verification of the expected accuracies is performed using the Attitude Determination Error Analysis System (ADEAS).
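For reference, one predict/update cycle of a generic discrete Kalman filter of the kind analyzed here; the matrices are placeholders, not the UARS gyro and observation models.

```python
# Discrete Kalman filter: predict with (F, Q), update with observation z via (H, R).
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    # Predict: propagate state and covariance
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: blend the prediction with the observation
    y = z - H @ x_pred                      # innovation (residual)
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new, y                  # the residual y feeds the statistics
```

The residual sequence returned here is exactly the quantity whose statistics such a sensitivity study examines.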
An improved algorithm of laser spot center detection in strong noise background
NASA Astrophysics Data System (ADS)
Zhang, Le; Wang, Qianqian; Cui, Xutai; Zhao, Yu; Peng, Zhong
2018-01-01
Laser spot center detection is required in many applications. Common algorithms for laser spot center detection, such as the centroid and Hough transform methods, have poor anti-interference ability and low detection accuracy under strong background noise. In this paper, median filtering is first used to remove noise while preserving the edge details of the image. Second, the laser spot image is binarized to extract the target from the background, and morphological filtering is then performed to eliminate noise points inside and outside the spot. Finally, the edge of the pre-processed spot image is extracted and the laser spot center is obtained using a circle-fitting method. On the foundation of the circle-fitting algorithm, the improved algorithm adds median filtering, morphological filtering and other processing steps. Theoretical analysis and experimental verification show that this method effectively filters background noise, which enhances the anti-interference ability of laser spot center detection and also improves the detection accuracy.
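A step-by-step sketch of the described pipeline using OpenCV; the kernel sizes, Canny thresholds, and the algebraic circle fit are illustrative choices rather than the paper's exact parameters.

```python
# Median filter -> Otsu binarization -> morphological opening -> edge -> circle fit.
import cv2
import numpy as np

def spot_center(gray):
    den = cv2.medianBlur(gray, 5)                                   # median filtering
    _, bw = cv2.threshold(den, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    kernel = np.ones((5, 5), np.uint8)
    bw = cv2.morphologyEx(bw, cv2.MORPH_OPEN, kernel)               # morphological filtering
    edges = cv2.Canny(bw, 50, 150)                                  # edge extraction
    ys, xs = np.nonzero(edges)
    # Algebraic least-squares circle fit: x^2 + y^2 = a*x + b*y + c
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    rhs = xs.astype(float) ** 2 + ys.astype(float) ** 2
    (a, b, _), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return a / 2.0, b / 2.0                                         # circle center (x, y)
```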
An effective one-dimensional anisotropic fingerprint enhancement algorithm
NASA Astrophysics Data System (ADS)
Ye, Zhendong; Xie, Mei
2012-01-01
Fingerprint identification is one of the most important biometric technologies. The performance of minutiae extraction and the speed of a fingerprint verification system rely heavily on the quality of the input fingerprint images, so enhancement of low-quality fingerprints is a critical and difficult step in a fingerprint verification system. In this paper we propose an effective algorithm for fingerprint enhancement. First, we use a normalization algorithm to reduce the variations in gray-level values along ridges and valleys. Then we utilize the structure tensor approach to estimate the ridge orientation at each pixel of the fingerprint. Finally, we propose a novel algorithm which combines the advantages of the one-dimensional Gabor filtering method and the anisotropic method to enhance the fingerprint in the recoverable region. The proposed algorithm has been evaluated on the database of the Fingerprint Verification Competition 2004, and the results show that our algorithm performs well in less time.
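A sketch of the structure-tensor orientation step under stated assumptions (the smoothing scale sigma is a free parameter, not the paper's value):

```python
# Structure-tensor ridge orientation estimation.
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def ridge_orientation(img, sigma=3.0):
    gx = sobel(img.astype(float), axis=1)
    gy = sobel(img.astype(float), axis=0)
    # Smoothed structure-tensor components
    jxx = gaussian_filter(gx * gx, sigma)
    jxy = gaussian_filter(gx * gy, sigma)
    jyy = gaussian_filter(gy * gy, sigma)
    # Dominant gradient orientation; the ridge direction is perpendicular to it
    theta = 0.5 * np.arctan2(2 * jxy, jxx - jyy)
    return theta + np.pi / 2
```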
ATM photoheliograph. [at a solar observatory
NASA Technical Reports Server (NTRS)
Prout, R. A.
1975-01-01
The design and fabrication of a 65 cm photoheliograph functional verification unit (FVU) installed in a major solar observatory are presented. The telescope is used in a daily program of solar observation while serving as a test bed for the development of instrumentation to be included in early Space Shuttle-launched solar telescopes. The 65 cm FVU was designed to be mechanically compatible with the ATM spar/canister and would be adaptable to a second ATM flight utilizing the existing spar/canister configuration. An image motion compensation breadboard and a space-hardened, remotely tuned H-alpha filter, as well as solar telescopes of different optical configurations or increased aperture, are discussed.
A Coding Method for Efficient Subgraph Querying on Vertex- and Edge-Labeled Graphs
Zhu, Lei; Song, Qinbao; Guo, Yuchen; Du, Lei; Zhu, Xiaoyan; Wang, Guangtao
2014-01-01
Labeled graphs are widely used to model complex data in many domains, so subgraph querying has been attracting more and more attention from researchers around the world. Unfortunately, subgraph querying is very time-consuming since it involves subgraph isomorphism testing, which is known to be an NP-complete problem. In this paper, we propose a novel coding method for subgraph querying based on the Laplacian spectrum and the number of walks. Our method follows the filtering-and-verification framework and works well on graph databases with frequent updates. We also propose novel two-step filtering conditions that can filter out most false positives, and prove that the two-step filtering conditions satisfy the no-false-negative requirement (no dismissal in answers). Extensive experiments on both real and synthetic graphs show that, compared with six existing counterpart methods, our method can effectively improve the efficiency of subgraph querying.
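A skeleton of the filtering-and-verification framework using networkx, with a deliberately simple no-false-negative filter (label-count containment) standing in for the paper's Laplacian-spectrum and walk-count codes:

```python
# Filter cheaply, then verify with the exact (NP-hard) subgraph isomorphism test.
from collections import Counter
import networkx as nx
from networkx.algorithms.isomorphism import GraphMatcher

def label_counts(g):
    return Counter(d.get("label") for _, d in g.nodes(data=True))

def query(database_graphs, q):
    qc = label_counts(q)
    answers = []
    for g in database_graphs:
        gc = label_counts(g)
        if any(gc[lab] < n for lab, n in qc.items()):  # filtering: prune impossible graphs
            continue
        gm = GraphMatcher(g, q,
                          node_match=lambda a, b: a.get("label") == b.get("label"))
        if gm.subgraph_is_isomorphic():                # verification: exact test
            answers.append(g)
    return answers
```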
Verification in Referral-Based Crowdsourcing
Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.
2012-01-01
Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qiu, J; Washington University in St Louis, St Louis, MO; Li, H. Harold
Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. The contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of image processing filters and parameters; they are therefore inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance 2D RT image contrast, allowing automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on the contrast-limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip-limiting parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed, and the results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by basic window-level adjustment, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
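A hedged sketch of the pipeline with OpenCV, substituting a coarse grid search for the interior-point optimization described above; the parameter ranges are assumptions.

```python
# High-pass sharpening + CLAHE, selecting parameters by maximizing entropy.
import cv2
import numpy as np

def entropy(img):
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log2(p))

def auto_enhance(img, sigmas=(1, 2, 4), clips=(1.0, 2.0, 4.0), tiles=(4, 8, 16)):
    best, best_h = img, -np.inf
    for s in sigmas:
        # High-pass component: subtract the Gaussian-smoothed image
        hp = cv2.subtract(img, cv2.GaussianBlur(img, (0, 0), s))
        base = cv2.add(img, hp)
        for clip in clips:
            for t in tiles:
                clahe = cv2.createCLAHE(clipLimit=clip, tileGridSize=(t, t))
                out = clahe.apply(base)
                h = entropy(out)
                if h > best_h:                 # keep the highest-entropy result
                    best, best_h = out, h
    return best
```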
ERIC Educational Resources Information Center
Harel, Assaf; Bentin, Shlomo
2009-01-01
The type of visual information needed for categorizing faces and nonface objects was investigated by manipulating spatial frequency scales available in the image during a category verification task addressing basic and subordinate levels. Spatial filtering had opposite effects on faces and airplanes that were modulated by categorization level. The…
Space shuttle propulsion estimation development verification
NASA Technical Reports Server (NTRS)
Rogers, Robert M.
1989-01-01
The application of extended Kalman filtering to estimating Space Shuttle propulsion performance, i.e., specific impulse, from flight data in a post-flight processing computer program is detailed. The flight data used include inertial platform acceleration, SRB head pressure, SSME chamber pressure and flow rates, and ground-based radar tracking data. The key feature in this application is the model used for the SRBs, which is a nominal or reference quasi-static internal ballistics model normalized to the propellant burn depth. Dynamic states of mass overboard and propellant burn depth are included in the filter model to account for real-time deviations from the reference model. Aerodynamic, plume, wind and main engine uncertainties are also included for an integrated system model. Assuming uncertainty within the propulsion system model and attempting to estimate its deviations represents a new application of parameter estimation for rocket-powered vehicles. Illustrations from the results of applying this estimation approach to several missions show good-quality propulsion estimates.
Verification of a rapid mooring and foundation design tool
Weller, Sam D.; Hardwick, Jon; Gomez, Steven; ...
2018-02-15
Marine renewable energy devices require mooring and foundation systems that are suitable in terms of device operation and are also robust and cost effective. In the initial stages of mooring and foundation development, a large number of possible configuration permutations exist. Filtering of unsuitable designs is possible using information specific to the deployment site (i.e., bathymetry, environmental conditions) and device (i.e., mooring and/or foundation system role and cable connection requirements). The identification of a final solution requires detailed analysis, which includes load cases based on extreme environmental statistics following certification guidance processes. Static and/or quasi-static modelling of the mooring and/or foundation system serves as an intermediate design filtering stage, enabling dynamic time-domain analysis to be focused on a small number of potential configurations. Mooring and foundation design is therefore reliant on logical decision making throughout this stage-gate process. The open-source DTOcean (Optimal Design Tools for Ocean Energy Arrays) Tool includes a mooring and foundation module, which automates the configuration selection process for fixed and floating wave and tidal energy devices. As far as the authors are aware, this is one of the first tools to be developed for the purpose of identifying potential solutions during the initial stages of marine renewable energy design. While the mooring and foundation module does not replace a full design assessment, it provides, in addition to suitable configuration solutions, assessments in terms of reliability, economics and environmental impact. This article provides insight into the solution identification approach used by the module and features the verification of both the mooring system calculations and the foundation design using commercial software. Several case studies are investigated: a floating wave energy converter and several anchoring systems. It is demonstrated that the mooring and foundation module is able to provide device and/or site developers with rapid mooring and foundation design solutions to appropriate design criteria.
Research on registration algorithm for check seal verification
NASA Astrophysics Data System (ADS)
Wang, Shuang; Liu, Tiegen
2008-03-01
Nowadays seals play an important role in China. With the development of the economy, the traditional method of manual check seal identification can no longer meet the needs of banking transactions. This paper focuses on pre-processing and registration algorithms for check seal verification, using the theory of image processing and pattern recognition. First, the complex characteristics of check seals are analyzed. To eliminate differences in producing conditions and the disturbance caused by background and writing in the check image, many methods are used in the pre-processing for check seal verification, such as color component transformation, linear transformation to a gray-scale image, median filtering, Otsu thresholding, morphological closing, and a labeling algorithm from mathematical morphology. After these steps, a clean binary seal image is obtained. On the basis of the traditional registration algorithm, a two-level registration method comprising rough and precise registration is proposed; the deflection angle resolved by the precise registration can be as fine as 0.1°. This paper also introduces the concepts of inside difference and outside difference and uses their percentages to judge whether a seal is genuine or fake. Experimental results on a large set of check seals are satisfactory, showing that the presented methods and algorithms are robust to noisy sealing conditions and tolerate within-class variation well.
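A minimal sketch of the rough/precise two-level angle registration on binary seal images; the step sizes mirror the 0.1° precision quoted above, while the overlap score is an illustrative stand-in for the paper's matching measure.

```python
# Two-level rotational registration: coarse sweep, then fine refinement.
import numpy as np
from scipy.ndimage import rotate

def score(ref, test, angle):
    rot = rotate(test.astype(np.uint8), angle, reshape=False, order=0) > 0
    return np.sum(ref & rot)  # overlap of the two binary seal images

def register_angle(ref, test):
    # Rough registration: 2-degree steps over the full circle
    rough = max(range(0, 360, 2), key=lambda t: score(ref, test, t))
    # Precise registration: 0.1-degree steps around the rough estimate
    fine = np.arange(rough - 2.0, rough + 2.0, 0.1)
    return max(fine, key=lambda t: score(ref, test, t))
```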
Pickering, Amy J.; Arnold, Benjamin F.; Dentz, Holly N.; Colford, John M.; Null, Clair
2016-01-01
Background: The recent global climate agreement in Paris aims to mitigate greenhouse gas emissions while fostering sustainable development and establishes an international trading mechanism to meet this goal. Currently, carbon offset program implementers are allowed to collect their own monitoring data to determine the number of carbon credits to be awarded. Objectives: We summarize reasons for mandating independent monitoring of greenhouse gas emission reduction projects. In support of our policy recommendations, we describe a case study of a program designed to earn carbon credits by distributing almost one million drinking water filters in rural Kenya to avert the use of fuel for boiling water. We compare results from an assessment conducted by our research team in the program area among households with pregnant women or caregivers in rural villages with low piped water access with the reported program monitoring data and discuss the implications. Discussion: Our assessment in Kenya found lower levels of household water filter usage than the internal program monitoring reported estimates used to determine carbon credits; we found 19% (n = 4,041) of households reported filter usage 2–3 years after filter distribution compared to the program stated usage rate of 81% (n = 14,988) 2.7 years after filter distribution. Although carbon financing could be a financially sustainable approach to scale up water treatment and improve health in low-income settings, these results suggest program effectiveness will remain uncertain in the absence of requiring monitoring data be collected by third-party organizations. Conclusion: Independent monitoring should be a key requirement for carbon credit verification in future international carbon trading mechanisms to ensure programs achieve benefits in line with sustainable development goals.
Bowtie filters for dedicated breast CT: Theory and computational implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kontson, Kimberly, E-mail: Kimberly.Kontson@fda.hhs.gov; Jennings, Robert J.
Purpose: To design bowtie filters with improved properties for dedicated breast CT to improve image quality and reduce dose to the patient. Methods: The authors present three different bowtie filters designed for a cylindrical 14-cm diameter phantom with a uniform composition of 40/60 breast tissue, which vary in their design objectives and performance improvements. Bowtie design #1 is based on single material spectral matching and produces nearly uniform spectral shape for radiation incident upon the detector. Bowtie design #2 uses the idea of basis material decomposition to produce the same spectral shape and intensity at the detector, using two different materials. Bowtie design #3 eliminates the beam hardening effect in the reconstructed image by adjusting the bowtie filter thickness so that the effective attenuation coefficient for every ray is the same. All three designs are obtained using analytical computational methods and linear attenuation coefficients. Thus, the designs do not take into account the effects of scatter. The authors considered this to be a reasonable approach to the filter design problem since the use of Monte Carlo methods would have been computationally intensive. The filter profiles for a cone-angle of 0° were used for the entire length of each filter because the differences between those profiles and the correct cone-beam profiles for the cone angles in our system are very small, and the constant profiles allowed construction of the filters with the facilities available to us. For evaluation of the filters, we used Monte Carlo simulation techniques and the full cone-beam geometry. Images were generated with and without each bowtie filter to analyze the effect on dose distribution, noise uniformity, and contrast-to-noise ratio (CNR) homogeneity. Line profiles through the reconstructed images generated from the simulated projection images were also used as validation for the filter designs. Results: Examples of the three designs are presented. Initial verification of performance of the designs was done using analytical computations of HVL, intensity, and effective attenuation coefficient behind the phantom as a function of fan-angle with a cone-angle of 0°. The performance of the designs depends only weakly on incident spectrum and tissue composition. For all designs, the dynamic range requirement on the detector was reduced compared to the no-bowtie-filter case. Further verification of the filter designs was achieved through analysis of reconstructed images from simulations. Simulation data also showed that the use of our bowtie filters can reduce peripheral dose to the breast by 61% and provide uniform noise and CNR distributions. The bowtie filter design concepts validated in this work were then used to create a computational realization of a 3D anthropomorphic bowtie filter capable of achieving a constant effective attenuation coefficient behind the entire field-of-view of an anthropomorphic breast phantom. Conclusions: Three different bowtie filter designs that vary in performance improvements were described and evaluated using computational and simulation techniques. Results indicate that the designs are robust against variations in breast diameter, breast composition, and tube voltage, and that the use of these filters can reduce patient dose and improve image quality compared to the no-bowtie-filter case.
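Design #3 reduces to a simple ray-wise attenuation balance for the cylindrical phantom. The sketch below is a hedged numeric illustration rather than the authors' procedure: it chooses the filter thickness t(d) so that mu_f*t(d) + mu_ph*L(d) is constant across the fan, and the attenuation coefficients are invented placeholders.

```python
import numpy as np

R = 7.0        # phantom radius [cm] (14-cm diameter, as in the paper)
mu_ph = 0.22   # assumed effective mu of 40/60 breast tissue [1/cm]
mu_f = 0.45    # assumed effective mu of the filter material [1/cm]

d = np.linspace(-R, R, 201)                        # ray distance from axis [cm]
L = 2.0 * np.sqrt(np.maximum(R**2 - d**2, 0.0))    # chord length through phantom

# Choose filter thickness so mu_f*t + mu_ph*L is the same for every ray:
t = mu_ph * (L.max() - L) / mu_f
print(f"max filter thickness at the fan edge: {t.max():.2f} cm")
```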
Trickling filter for urea and bio-waste processing - dynamic modelling of nitrogen cycle
NASA Astrophysics Data System (ADS)
Zhukov, Anton; Hauslage, Jens; Tertilt, Gerin; Bornemann, Gerhild
Mankind's exploration of the solar system requires reliable Life Support Systems (LSS) enabling long-duration manned space missions. In the absence of frequent resupply missions, closure of the LSS will play a very important role, and maximising it will to a large extent drive the selection of appropriate LSS architectures. One of the significant issues on the way to full closure is the effective utilisation of biological wastes such as urine and inedible biomass. A very promising concept for biological waste reprocessing is the use of trickling filters, currently being developed and investigated by DLR, Cologne, Germany. The concept, called Combined Regenerative Organic-Food Production (C.R.O.P.), is based on the microbiological treatment of biological wastes, reprocessing them into an aqueous fertilizer that can be used directly in a greenhouse for food production. Numerous experiments have been and are being conducted by DLR in order to fully understand and characterize the process. The human space exploration group of the Technical University of Munich (TUM), in cooperation with DLR, has started to establish a dynamic model of the trickling filter system to be able to assess its performance at the LSS level. In the first development stage the model covers the nitrogen cycle, enabling the simulation of urine processing. This paper briefly describes the C.R.O.P. concept and the status of the trickling filter model development. The model is based on enzyme-catalyzed reaction kinetics for the fundamental microbiological reaction chain and is implemented in MATLAB. Verification and correlation of the developed model against experimental results has been performed. Several predictive studies of batch sequencing behavior have been carried out, demonstrating a good capability of the C.R.O.P. concept for use in closed LSS. The achieved results are critically discussed and the way forward is presented.
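For readers unfamiliar with the modelling approach, the sketch below shows the skeleton of an enzyme-kinetics nitrogen chain (urea to ammonium to nitrite to nitrate) of the kind the abstract describes, here in Python rather than MATLAB; all rate constants are illustrative assumptions, not C.R.O.P. model values.

```python
import numpy as np
from scipy.integrate import solve_ivp

def mm(rate_max, K, s):
    """Michaelis-Menten rate for substrate concentration s."""
    return rate_max * s / (K + s)

def nitrogen_chain(t, y):
    urea, nh4, no2, no3 = y
    r1 = mm(2.0, 5.0, urea)   # ureolysis (assumed kinetics)
    r2 = mm(1.0, 2.0, nh4)    # ammonium oxidation
    r3 = mm(1.5, 1.0, no2)    # nitrite oxidation
    return [-r1, 2.0 * r1 - r2, r2 - r3, r3]  # one urea yields two NH4+

sol = solve_ivp(nitrogen_chain, (0.0, 48.0), [10.0, 0.0, 0.0, 0.0])
print("final nitrate concentration:", sol.y[3, -1])
```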
Martinek, Radek; Nedoma, Jan; Fajkus, Marcel; Kahankova, Radana; Konecny, Jaromir; Janku, Petr; Kepak, Stanislav; Bilik, Petr; Nazeran, Homer
2017-04-18
This paper focuses on the design, realization, and verification of a novel phonocardiographic-based fiber-optic sensor and adaptive signal processing system for noninvasive continuous fetal heart rate (fHR) monitoring. Our proposed system utilizes two Mach-Zehnder interferometric sensors. Based on the analysis of real measurement data, we developed a simplified dynamic model for the generation and distribution of heart sounds throughout the human body. Building on this signal model, we then designed, implemented, and verified our adaptive signal processing system by implementing two stochastic gradient-based algorithms: the Least Mean Square Algorithm (LMS), and the Normalized Least Mean Square (NLMS) Algorithm. With this system we were able to extract the fHR information from high quality fetal phonocardiograms (fPCGs), filtered from abdominal maternal phonocardiograms (mPCGs) by performing fPCG signal peak detection. Common signal processing methods such as linear filtering, signal subtraction, and others could not be used for this purpose as fPCG and mPCG signals share overlapping frequency spectra. The performance of the adaptive system was evaluated by using both qualitative (gynecological studies) and quantitative measures such as: Signal-to-Noise Ratio (SNR), Root Mean Square Error (RMSE), Sensitivity (S+), and Positive Predictive Value (PPV).
Martinek, Radek; Nedoma, Jan; Fajkus, Marcel; Kahankova, Radana; Konecny, Jaromir; Janku, Petr; Kepak, Stanislav; Bilik, Petr; Nazeran, Homer
2017-01-01
This paper focuses on the design, realization, and verification of a novel phonocardiographic-based fiber-optic sensor and adaptive signal processing system for noninvasive continuous fetal heart rate (fHR) monitoring. Our proposed system utilizes two Mach-Zehnder interferometric sensors. Based on the analysis of real measurement data, we developed a simplified dynamic model for the generation and distribution of heart sounds throughout the human body. Building on this signal model, we then designed, implemented, and verified our adaptive signal processing system by implementing two stochastic gradient-based algorithms: the Least Mean Square Algorithm (LMS), and the Normalized Least Mean Square (NLMS) Algorithm. With this system we were able to extract the fHR information from high quality fetal phonocardiograms (fPCGs), filtered from abdominal maternal phonocardiograms (mPCGs) by performing fPCG signal peak detection. Common signal processing methods such as linear filtering, signal subtraction, and others could not be used for this purpose as fPCG and mPCG signals share overlapping frequency spectra. The performance of the adaptive system was evaluated by using both qualitative (gynecological studies) and quantitative measures such as: Signal-to-Noise Ratio (SNR), Root Mean Square Error (RMSE), Sensitivity (S+), and Positive Predictive Value (PPV). PMID:28420215
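A minimal NLMS sketch in the spirit of the abstract: a reference channel dominated by the maternal PCG is adaptively cancelled from the abdominal mixture, leaving a residual that approximates the fetal PCG. Filter order and step size are generic assumptions, not the paper's settings.

```python
import numpy as np

def nlms(reference, mixture, order=32, mu=0.5, eps=1e-8):
    """Normalized LMS adaptive canceller; returns the error (residual) signal."""
    w = np.zeros(order)
    out = np.zeros(len(mixture))
    for n in range(order, len(mixture)):
        x = reference[n - order:n][::-1]   # reference tap vector
        y = w @ x                          # estimate of the maternal component
        e = mixture[n] - y                 # residual, approximately the fPCG
        w += mu * e * x / (eps + x @ x)    # power-normalised weight update
        out[n] = e
    return out
```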
Fast wavelength calibration method for spectrometers based on waveguide comb optical filter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Zhengang; Department of Physics and Astronomy, Shanghai Jiao Tong University, Shanghai 200240; Huang, Meizhen, E-mail: mzhuang@sjtu.edu.cn
2015-04-15
A novel fast wavelength calibration method for spectrometers based on a standard spectrometer and a double metal-cladding waveguide comb optical filter (WCOF) is proposed and demonstrated. By using the WCOF device, a wide-spectrum beam is comb-filtered, which is very suitable for spectrometer wavelength calibration. The influence of the waveguide filter's structural parameters and the beam incident angle on the wavelength and bandwidth of the comb absorption peaks is also discussed. Verification experiments were carried out in the wavelength range of 200-1100 nm with satisfactory results. Compared with the traditional wavelength calibration method based on discrete sparse atomic emission or absorption lines, the new method has several advantages: abundant calibration data, high accuracy, short calibration time, suitability for production processes, and stability.
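The calibration step itself amounts to fitting a pixel-to-wavelength map through the detected comb peaks. The sketch below illustrates this with a polynomial fit; the peak positions and wavelengths are made-up placeholders, and the dense comb data is what justifies a higher-order fit than a handful of atomic lines would.

```python
import numpy as np

pixel_of_peak = np.array([120.3, 410.8, 702.5, 995.1, 1288.6])  # measured peaks
lambda_of_peak = np.array([400.0, 500.0, 600.0, 700.0, 800.0])  # known, in nm

# Third-order pixel-to-wavelength map fitted through the comb peaks.
coeffs = np.polyfit(pixel_of_peak, lambda_of_peak, deg=3)

# Apply the map to every detector pixel of a hypothetical 2048-pixel array.
wavelength = np.polyval(coeffs, np.arange(2048))
```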
Zhou, Yong Jin; Yang, Bao Jia
2015-05-10
Although subwavelength planar terahertz (THz) plasmonic devices can be implemented based on planar spoof surface plasmons (SPs), they still suffer from relatively high propagation loss. Here, the dispersion and propagation characteristics of a spoof plasmonic waveguide composed of double metal strips corrugated with dumbbell-shaped grooves have been investigated. Much lower propagation loss and a longer propagation length can be achieved with this waveguide compared with the conventional spoof plasmonic waveguide with rectangular grooves. Moreover, the waveguide achieves a size reduction of about 22%. An ultra-wideband THz plasmonic filter for planar circuits has been demonstrated based on the proposed waveguide. Experimental verification at microwave frequencies was conducted by scaling up the geometry of the filter.
Storms, S M; Feltus, A; Barker, A R; Joly, M-A; Girard, M
2009-03-01
Measurement of somatropin charged variants by isoelectric focusing was replaced with capillary zone electrophoresis in the January 2006 European Pharmacopoeia Supplement 5.3, based on results from an interlaboratory collaborative study. Due to incompatibilities and method-robustness issues encountered prior to verification, a number of method parameters required optimisation. As the use of a diode array detector at 195 nm or 200 nm led to a loss of resolution, a variable wavelength detector using a 200 nm filter was employed. Improved injection repeatability was obtained by increasing the injection time and pressure, and changing the sample diluent from water to running buffer. Finally, definition of capillary pre-treatment and rinse procedures resulted in more consistent separations over time. Method verification data are presented demonstrating linearity, specificity, repeatability, intermediate precision, limit of quantitation, sample stability, solution stability, and robustness. Based on these experiments, several modifications to the current method have been recommended and incorporated into the European Pharmacopoeia to help improve method performance across laboratories globally.
NASA Astrophysics Data System (ADS)
Cianciara, Aleksander
2016-09-01
The paper presents the results of research aimed at verifying the hypothesis that the Weibull distribution is an appropriate statistical model of microseismic emission characteristics, namely the energy of phenomena and the inter-event time. The emission under consideration is understood to be induced by natural rock mass fracturing. Because the recorded emission contains noise, it is subjected to appropriate filtering. The study was conducted using statistical verification of the null hypothesis that the Weibull distribution fits the empirical cumulative distribution function. As the model describing the cumulative distribution function is given in analytical form, its verification can be performed with the Kolmogorov-Smirnov goodness-of-fit test. Specifying a correct model of the statistical distribution of the data is essential for interpretation by probabilistic methods, because such methods do not use the measurement data directly but rather their statistical distributions, e.g., in the method based on hazard analysis or in the one that uses maximum-value statistics.
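A minimal sketch of the described procedure: fit a Weibull model to (filtered) inter-event times and test the fit with the Kolmogorov-Smirnov statistic, here with SciPy on a synthetic sample standing in for real microseismic data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
inter_event_times = rng.weibull(1.3, size=500) * 60.0  # synthetic, in seconds

# Fit with the location parameter fixed at zero (times are non-negative).
shape, loc, scale = stats.weibull_min.fit(inter_event_times, floc=0.0)

# KS goodness-of-fit test against the fitted CDF.
stat, p_value = stats.kstest(inter_event_times, "weibull_min",
                             args=(shape, loc, scale))
print(f"KS statistic {stat:.3f}, p-value {p_value:.3f}")
```

Note that estimating the Weibull parameters from the same sample makes the standard KS p-value optimistic; a bootstrap or Lilliefors-style correction would be needed for a rigorous test.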
An Investigation into Solution Verification for CFD-DEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fullmer, William D.; Musser, Jordan
This report presents the study of the convergence behavior of the computational fluid dynamics-discrete element method (CFD-DEM), specifically National Energy Technology Laboratory's (NETL) open source MFiX code (MFiX-DEM) with a diffusion based particle-to-continuum filtering scheme. In particular, this study focused on determining if the numerical method had a solution in the high-resolution limit where the grid size is smaller than the particle size. To address this uncertainty, fixed particle beds of two primary configurations were studied: i) fictitious beds where the particles are seeded with a random particle generator, and ii) instantaneous snapshots from a transient simulation of an experimentally relevant problem. Both problems considered a uniform inlet boundary and a pressure outflow. The CFD grid was refined from a few particle diameters down to 1/6th of a particle diameter. The pressure drop between two vertical elevations, averaged across the bed cross-section, was considered as the system response quantity of interest. A least-squares regression method was used to extrapolate the grid-dependent results to an approximate "grid-free" solution in the limit of infinite resolution. The results show that the diffusion based scheme does yield a converging solution. However, the convergence is more complicated than encountered in simpler, single-phase flow problems, showing strong oscillations and, at times, oscillations superimposed on top of globally non-monotonic behavior. The challenging convergence behavior highlights the importance of using at least four grid resolutions in solution verification problems so that (over-determined) regression-based extrapolation methods may be applied to approximate the grid-free solution. The grid-free solution is very important in solution verification and VVUQ exercises in general, as the difference between it and the reference solution largely determines the numerical uncertainty. By testing different randomized particle configurations of the same general problem (for the fictitious case) or different instances of freezing a transient simulation, the numerical uncertainties appeared to be on the same order of magnitude as ensemble or time averaging uncertainties. By testing different drag laws, almost all cases studied show that model form uncertainty in this one, very important closure relation was larger than the numerical uncertainty, at least with a reasonable CFD grid, roughly five particle diameters. In this study, the diffusion width (filtering length scale) was mostly set at a constant of six particle diameters. A few exploratory tests were performed to show that similar convergence behavior was observed for diffusion widths greater than approximately two particle diameters. However, this subject was not investigated in great detail because determining an appropriate filter size is really a validation question which must be determined by comparison to experimental or highly accurate numerical data. Future studies are being considered targeting solution verification of transient simulations as well as validation of the filter size with direct numerical simulation data.
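The regression-based extrapolation the report describes can be illustrated in a few lines: fit srq(h) = s0 + c*h^p over several grid resolutions and read off the "grid-free" value s0. The pressure-drop numbers below are invented placeholders, not MFiX-DEM results.

```python
import numpy as np
from scipy.optimize import curve_fit

h = np.array([5.0, 2.5, 1.0, 0.5, 1.0 / 6.0])        # grid size / particle dia.
dp = np.array([812.0, 835.0, 848.0, 851.0, 853.0])   # pressure drop [Pa], fake

def model(h, s0, c, p):
    # Generalized Richardson form: converged value plus a power-law error term.
    return s0 + c * h**p

# Over-determined fit (5 points, 3 parameters), as the report recommends
# using at least four grid resolutions.
(s0, c, p), _ = curve_fit(model, h, dp, p0=(dp[-1], -10.0, 1.0))
print(f"extrapolated grid-free solution: {s0:.1f} Pa, observed order p = {p:.2f}")
```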
NASA Technical Reports Server (NTRS)
Marshall, William M.; Borowski, Stanley K.; Bulman, Mel; Joyner, Russell; Martin, Charles R.
2015-01-01
Nuclear thermal propulsion (NTP) has been recognized as an enabling technology for missions to Mars and beyond. However, one of the key challenges of developing a nuclear thermal rocket is conducting verification and development tests on the ground. A number of ground test options are presented, with the Sub-surface Active Filtration of Exhaust (SAFE) method identified as a preferred path forward for the NTP program. The SAFE concept utilizes the natural soil characteristics present at the Nevada National Security Site to provide a natural filter for nuclear rocket exhaust during ground testing. A validation method for the SAFE concept is presented, utilizing a non-nuclear sub-scale hydrogen/oxygen rocket seeded with detectable radioisotopes. Additionally, some alternative ground test concepts, based upon the SAFE concept, are presented. Finally, an overview of the ongoing discussions on developing a ground test campaign is presented.
Hydrologic data-verification management program plan
Alexander, C.W.
1982-01-01
Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, describing a master data-verification program that uses multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
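A minimal sketch of the kind of computerized screening pass proposed here: each record is checked against a "screen file" of verification criteria, illustrated as simple min/max bounds. The field names and limits are hypothetical.

```python
# Hypothetical screen file: plausible bounds per measured field.
screen_file = {"stage_ft": (0.0, 30.0), "discharge_cfs": (0.0, 5000.0)}

def verify(record):
    """Return a list of flags for values outside the screening criteria."""
    flags = []
    for field, (low, high) in screen_file.items():
        value = record.get(field)
        if value is None or not (low <= value <= high):
            flags.append(f"{field}={value!r} outside [{low}, {high}]")
    return flags

# A record failing the stage check would be held back from user files:
print(verify({"stage_ft": 42.7, "discharge_cfs": 810.0}))
```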
Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers
NASA Technical Reports Server (NTRS)
Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.
1983-01-01
A number of methodologies for verifying systems and computer based tools that assist users in verifying their systems were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: STP theorem prover; design verification of SIFT; high level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.
This ETV test/QA plan for heavy-duty diesel engine testing at the Southwest Research Institute’s Department of Emissions Research (DER) describes how the Federal Test Procedure (FTP), as listed in 40 CFR Part 86 for highway engines and 40 CFR Part 89 for nonroad engines, will be ...
The Johnson Matthey SCCRT, v.1 technology is a urea-based SCR system combined with a CCRT filter designed for on-highway light, medium, and heavy heavy-duty diesel, urban and non-urban, bus exhaust gas recirculation (EGR)-or non-EGR-equipped engines for use with commercial ultra-...
Phase-synchroniser based on gm-C all-pass filter chain with sliding mode control
NASA Astrophysics Data System (ADS)
Mitić, Darko B.; Jovanović, Goran S.; Stojčev, Mile K.; Antić, Dragan S.
2015-03-01
Phase-synchronisers have many applications in VLSI circuit designs. They are used in CMOS RF circuits including phase (de)modulators, phase recovery circuits, multiphase synthesis, etc. In this article, a phase-synchroniser based on gm-C all-pass filter chain with sliding mode control is presented. The filter chain provides good controllable delay characteristics over the full range of phase and frequency regulation, without deterioration of input signal amplitude and waveform, while the sliding mode control enables us to achieve fast and predetermined finite locking time. IHP 0.25 µm SiGe BiCMOS technology has been used in design and verification processes. The circuit operates in the frequency range from 33 MHz up to 150 MHz. Simulation results indicate that it is possible to achieve very fast synchronisation time period, which is approximately four time intervals of the input signal during normal operation, and 20 time intervals during power-on.
Banach, Marzena; Wasilewska, Agnieszka; Dlugosz, Rafal; Pauk, Jolanta
2018-05-18
Due to the problem of aging societies, there is a need for smart buildings to monitor and support people with various disabilities, including rheumatoid arthritis. The aim of this paper is to elaborate novel techniques for wireless motion capture systems for the monitoring and rehabilitation of disabled people in smart buildings. The proposed techniques are based on cross-verification of distance measurements between markers and transponders in an environment with highly variable parameters. For their verification, algorithms were developed that enable comprehensive investigation of a system with different numbers of transponders and varying ambient parameters (temperature and noise). Various linear and nonlinear filters were used in estimating the real positions of the markers. Several thousand tests were carried out for various system parameters and different marker locations. The results show that localization error may be reduced by as much as 90%. It was observed that repetition of measurements reduces localization error by as much as one order of magnitude. The proposed system, based on wireless techniques, offers high commercial potential. However, it requires extensive cooperation between teams covering hardware and software design, system modelling, and architectural design.
Wang, Mi; Fan, Chengcheng; Yang, Bo; Jin, Shuying; Pan, Jun
2016-01-01
Satellite attitude accuracy is an important factor affecting the geometric processing accuracy of high-resolution optical satellite imagery. Because the accuracy of the Yaogan-24 remote sensing satellite's on-board attitude data processing is not high enough to meet its image geometry processing requirements, we developed an approach involving on-ground attitude data processing and verification against the digital orthophoto (DOM) and digital elevation model (DEM) of a geometric calibration field. The approach focuses on three modules: on-ground processing based on a bidirectional filter, overall weighted smoothing and fitting, and evaluation in the geometric calibration field. Our experimental results demonstrate that the proposed on-ground processing method is both robust and feasible, ensuring the quality of the observation data and the convergence and stability of the parameter estimation model. In addition, both Euler angles and quaternions can be used to build the mathematical fitting model, while the orthogonal polynomial fitting model is more suitable for modeling the attitude parameters. Furthermore, compared to image geometric processing based on the on-board attitude data, the accuracy of uncontrolled and relative geometric positioning can be increased by about 50%. PMID:27483287
Multi-Stage System for Automatic Target Recognition
NASA Technical Reports Server (NTRS)
Chao, Tien-Hsin; Lu, Thomas T.; Ye, David; Edens, Weston; Johnson, Oliver
2010-01-01
A multi-stage automated target recognition (ATR) system has been designed to perform computer vision tasks with adequate proficiency in mimicking human vision. The system is able to detect, identify, and track targets of interest. Potential regions of interest (ROIs) are first identified by the detection stage using an Optimum Trade-off Maximum Average Correlation Height (OT-MACH) filter combined with a wavelet transform. False positives are then eliminated by the verification stage using feature extraction methods in conjunction with neural networks. Feature extraction transforms the ROIs using filtering and binning algorithms to create feature vectors. A feedforward back-propagation neural network (NN) is then trained to classify each feature vector and to remove false positives. A system-parameter optimization process has been developed to adapt the system to various targets and datasets. The objective was to design an efficient computer vision system that can learn to detect multiple targets in large images with unknown backgrounds. Because the target size is small relative to the image size in this problem, there are many regions of the image that could potentially contain the target. A cursory analysis of every region can be computationally efficient, but may yield too many false positives. On the other hand, a detailed analysis of every region can yield better results, but may be computationally inefficient. The multi-stage ATR system was designed to achieve an optimal balance between accuracy and computational efficiency by incorporating both models. The detection stage first identifies potential ROIs where the target may be present by performing a fast Fourier domain OT-MACH filter-based correlation. Because the threshold for this stage is chosen with the goal of detecting all true positives, a number of false positives are also detected as ROIs. The verification stage then transforms the regions of interest into feature space, and eliminates false positives using an artificial neural network classifier. The multi-stage system allows tuning the detection sensitivity and the identification specificity individually in each stage, making it easier to optimize ATR operation for a specific goal. The test results show that the system was successful in substantially reducing the false positive rate when tested on sonar and video image datasets.
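To make the detection stage concrete, the sketch below performs frequency-domain correlation of a scene with a template. A plain matched filter stands in for the OT-MACH filter, whose synthesis requires training imagery; the scene and target are synthetic.

```python
import numpy as np

def correlate_fft(scene, template):
    """Circular cross-correlation of scene with template via the FFT."""
    H = np.fft.fft2(template, s=scene.shape).conj()   # filter spectrum
    return np.fft.ifft2(np.fft.fft2(scene) * H).real

rng = np.random.default_rng(1)
scene = rng.normal(size=(256, 256))      # cluttered background
target = np.ones((8, 8))
scene[100:108, 60:68] += target          # embed a small target

corr = correlate_fft(scene, target)
peak = np.unravel_index(np.argmax(corr), corr.shape)
rois = np.argwhere(corr > 0.8 * corr.max())  # candidate ROIs for stage 2
print("correlation peak at", peak, "- candidate ROIs:", len(rois))
```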
NASA Astrophysics Data System (ADS)
Hazelaar, Colien; Dahele, Max; Mostafavi, Hassan; van der Weide, Lineke; Slotman, Ben; Verbakel, Wilko
2018-06-01
Lung tumors treated in breath-hold are subject to inter- and intra-breath-hold variations, which makes tumor position monitoring during each breath-hold important. A markerless technique is desirable, but limited tumor visibility on kV images makes this challenging. We evaluated whether template matching + triangulation of kV projection images acquired during breath-hold stereotactic treatments could determine 3D tumor position. Band-pass filtering and/or digital tomosynthesis (DTS) were used as image pre-filtering/enhancement techniques. On-board kV images continuously acquired during volumetric modulated arc irradiation of (i) a 3D-printed anthropomorphic thorax phantom with three lung tumors (n = 6 stationary datasets, n = 2 gradually moving), and (ii) four patients (13 datasets) were analyzed. 2D reference templates (filtered DRRs) were created from planning CT data. Normalized cross-correlation was used for 2D matching between templates and pre-filtered/enhanced kV images. For 3D verification, each registration was triangulated with multiple previous registrations. Generally applicable image processing/algorithm settings for lung tumors in breath-hold were identified. For the stationary phantom, the interquartile range of the 3D position vector was on average 0.25 mm for 12° DTS + band-pass filtering (average detected positions in 2D = 99.7%, 3D = 96.1%, and 3D excluding first 12° due to triangulation angle = 99.9%) compared to 0.81 mm for band-pass filtering only (55.8/52.9/55.0%). For the moving phantom, RMS errors for the lateral/longitudinal/vertical direction after 12° DTS + band-pass filtering were 1.5/0.4/1.1 mm and 2.2/0.3/3.2 mm. For the clinical data, 2D position was determined for at least 93% of each dataset and 3D position excluding first 12° for at least 82% of each dataset using 12° DTS + band-pass filtering. Template matching + triangulation using DTS + band-pass filtered images could accurately determine the position of stationary lung tumors. However, triangulation was less accurate/reliable for targets with continuous, gradual displacement in the lateral and vertical directions. This technique is therefore currently most suited to detect/monitor offsets occurring between initial setup and the start of treatment, inter-breath-hold variations, and tumors with predominantly longitudinal motion.
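The 2D matching step can be illustrated with scikit-image's normalized cross-correlation. DRR template generation and the DTS/band-pass pre-filtering are outside the scope of this sketch, and the image and template below are synthetic stand-ins.

```python
import numpy as np
from skimage.feature import match_template

rng = np.random.default_rng(2)
kv_image = rng.normal(size=(300, 300))       # simulated kV projection
template = rng.normal(size=(40, 40))         # simulated filtered DRR template
kv_image[130:170, 90:130] += 2.0 * template  # tumour appearance in the image

ncc = match_template(kv_image, template)     # normalized cross-correlation map
row, col = np.unravel_index(np.argmax(ncc), ncc.shape)
print(f"best 2D match at ({row}, {col}), NCC = {ncc.max():.2f}")
```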
Optimization and Verification of a Brushless DC-Motor for Cryogenic Mechanisms
NASA Astrophysics Data System (ADS)
Eggens, M.; van Loon, D.; Smit, H. P.; Jellema, W.; Dieleman, P.; Detrain, A.; Stokroos, M.; Nieuwenhuizen, A. C. T.
2013-09-01
In this paper we report on the results of the investigation on the feasibility of a cryogenic motor for a Filter Wheel Mechanism (FWM) for the instrument SpicA FAR-infrared Instrument (SAFARI). The maximum allowed dissipation of 1 mW is a key requirement, as a result of the limited cooling resources of the satellite. Therefore a quasi 3D electromagnetic (EM) model of a Brushless DC (BLDC) motor has been developed. To withstand the severe launch loads a mechanical concept has been designed to limit the friction torque in the bearings. The model was verified by room temperature and cryogenic measurements on an existing motor from the test setup. The model shows that the proposed BLDC motor design fulfills the requirements.
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there exists no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
Optimisation of SIW bandpass filter with wide and sharp stopband using space mapping
NASA Astrophysics Data System (ADS)
Xu, Juan; Bi, Jun Jian; Li, Zhao Long; Chen, Ru shan
2016-12-01
This work presents a substrate integrated waveguide (SIW) bandpass filter with a wide and sharp stopband, which differs from filters with a direct input/output coupling structure. Higher modes in the SIW cavities are used to generate the finite transmission zeros for improved stopband performance. The design of SIW filters requires full-wave electromagnetic simulation and extensive optimisation; if a full-wave solver is used for optimisation, the design process is very time consuming. The space mapping (SM) approach has been called upon to alleviate this problem. In this case, the coarse model is optimised using an equivalent-circuit-model-based representation of the structure for fast computation, while verification of the design is completed with an accurate fine-model full-wave simulation. A fourth-order filter with a passband of 12.0-12.5 GHz was fabricated on a single-layer Rogers RT/Duroid 5880 substrate. The return loss is better than 17.4 dB in the passband and the rejection is more than 40 dB in the stopband, which extends from 2 to 11 GHz and from 13.5 to 17.3 GHz, demonstrating wide stopband performance.
Gordine, Samantha Alex; Fedak, Michael; Boehme, Lars
2015-01-01
In southern elephant seals (Mirounga leonina), fasting- and foraging-related fluctuations in body composition are reflected by buoyancy changes. Such buoyancy changes can be monitored by measuring changes in the rate at which a seal drifts passively through the water column, i.e. when all active swimming motion ceases. Here, we present an improved knowledge-based method for detecting buoyancy changes from compressed and abstracted dive profiles received through telemetry. By step-wise filtering of the dive data, the developed algorithm identifies fragments of dives that correspond to times when animals drift. In the dive records of 11 southern elephant seals from South Georgia, this filtering method identified 0.8-2.2% of all dives as drift dives, indicating large individual variation in drift diving behaviour. The obtained drift rate time series show that, at the beginning of each migration, all individuals were strongly negatively buoyant. Over the following 75-150 days, the buoyancy of all individuals peaked close to or at neutral buoyancy, indicative of a seal's foraging success. Independent verification with visually inspected detailed high-resolution dive data confirmed that this method is capable of reliably detecting buoyancy changes in the dive records of drift diving species using abstracted data. This also affirms that abstracted dive profiles convey the geometric shape of drift dives in sufficient detail for them to be identified. Further, it suggests that, using this step-wise filtering method, buoyancy changes could be detected even in old datasets with compressed dive information, for which conventional drift dive classification previously failed. PMID:26486362
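A heavily simplified sketch of the step-wise filtering idea: from a depth-time profile, keep windows whose vertical rate is slow and nearly constant, and take that rate as a drift-rate candidate. The window length and thresholds are illustrative, not the published criteria.

```python
import numpy as np

def drift_segments(t, depth, max_rate=0.6, max_spread=0.05, window=5):
    """Yield (t_start, t_stop, rate) for candidate drift segments.

    t and depth are same-length arrays (seconds, metres); max_rate caps the
    vertical speed [m/s] and max_spread caps its variation within a window.
    """
    rate = np.diff(depth) / np.diff(t)  # vertical speed between samples
    for start in range(0, len(rate) - window, window):
        seg = rate[start:start + window]
        # Drift: slow, nearly constant sink or rise over the whole window.
        if np.all(np.abs(seg) < max_rate) and np.ptp(seg) < max_spread:
            yield t[start], t[start + window], seg.mean()
```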
Plasma Metamaterials for Arbitrary Complex-Amplitude Wave Filters
2013-09-10
plasmas as reflectors, absorbers, and antennae of electromagnetic waves. In contrast with the other materials in these devices, parameters ... are controlled using launching antennas and high-power wave sources. One of the fundamental facts we have learned in microwave plasmas is that ... "metamaterials." In this report, we demonstrate the functional composites of plasmas and metamaterials, and the focusing point is verification of
Verification testing of the Pall Corporation Microza MF System, equipped with a 3-inch filter module, took place between April 30 and August 9, 2000 in Manchester, NH. The source water was drawn from a canal connected to Lake Massabesic, the public reservoir that serves the Town...
Reconstruction based finger-knuckle-print verification with score level adaptive binary fusion.
Gao, Guangwei; Zhang, Lei; Yang, Jian; Zhang, Lin; Zhang, David
2013-12-01
Recently, a new biometric identifier, namely the finger knuckle print (FKP), has been proposed for personal authentication with very interesting results. One of the advantages of FKP verification lies in its user friendliness in data collection. However, the user's flexibility in positioning fingers also leads to a certain degree of pose variation in the collected query FKP images. The widely used Gabor-filtering-based competitive coding scheme is sensitive to such variations, resulting in many false rejections. We propose to alleviate this problem by reconstructing the query sample with a dictionary learned from the template samples in the gallery set. The reconstructed FKP image can greatly reduce the enlarged matching distance caused by finger pose variations; however, both the intra-class and inter-class distances will be reduced. We then propose a score-level adaptive binary fusion rule to adaptively fuse the matching distances before and after reconstruction, aiming to reduce the false rejections without significantly increasing the false acceptances. Experimental results on the benchmark PolyU FKP database show that the proposed method significantly improves the FKP verification accuracy.
Probabilistic verification of cloud fraction from three different products with CALIPSO
NASA Astrophysics Data System (ADS)
Jung, B. J.; Descombes, G.; Snyder, C.
2017-12-01
In this study, we present how Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) can be used for probabilistic verification of cloud fraction, and apply this probabilistic approach to three cloud fraction products: a) The Air Force Weather (AFW) World Wide Merged Cloud Analysis (WWMCA), b) Satellite Cloud Observations and Radiative Property retrieval Systems (SatCORPS) from NASA Langley Research Center, and c) Multi-sensor Advection Diffusion nowCast (MADCast) from NCAR. Although they differ in their details, both WWMCA and SatCORPS retrieve cloud fraction from satellite observations, mainly of infrared radiances. MADCast utilizes in addition a short-range forecast of cloud fraction (provided by the Model for Prediction Across Scales, assuming cloud fraction is advected as a tracer) and a column-by-column particle filter implemented within the Gridpoint Statistical Interpolation (GSI) data-assimilation system. The probabilistic verification considers the retrieved or analyzed cloud fractions as predicting the probability of cloud at any location within a grid cell and the 5-km vertical feature mask (VFM) from CALIPSO level-2 products as a point observation of cloud.
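One way to make such probabilistic verification concrete is the Brier score, treating each grid-cell cloud fraction as a probability and each CALIPSO VFM profile as a binary cloud/clear observation. The sketch below uses synthetic arrays; the paper's exact scoring choices may differ.

```python
import numpy as np

rng = np.random.default_rng(3)
forecast_prob = rng.uniform(0.0, 1.0, size=1000)  # cloud fraction per cell
# Synthetic VFM observations: 1 = cloud, 0 = clear at the CALIPSO point.
observed = (rng.uniform(size=1000) < forecast_prob).astype(float)

brier = np.mean((forecast_prob - observed) ** 2)

# Reference: always forecasting the observed base rate (climatology).
base = observed.mean()
brier_ref = np.mean((base - observed) ** 2)
skill = 1.0 - brier / brier_ref  # Brier skill score, > 0 beats climatology
print(f"Brier = {brier:.3f}, BSS = {skill:.3f}")
```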
PANIC: A General-purpose Panoramic Near-infrared Camera for the Calar Alto Observatory
NASA Astrophysics Data System (ADS)
Cárdenas Vázquez, M.-C.; Dorner, B.; Huber, A.; Sánchez-Blanco, E.; Alter, M.; Rodríguez Gómez, J. F.; Bizenberger, P.; Naranjo, V.; Ibáñez Mengual, J.-M.; Panduro, J.; García Segura, A. J.; Mall, U.; Fernández, M.; Laun, W.; Ferro Rodríguez, I. M.; Helmling, J.; Terrón, V.; Meisenheimer, K.; Fried, J. W.; Mathar, R. J.; Baumeister, H.; Rohloff, R.-R.; Storz, C.; Verdes-Montenegro, L.; Bouy, H.; Ubierna, M.; Fopp, P.; Funke, B.
2018-02-01
PANIC is the new PAnoramic Near-Infrared Camera for Calar Alto and is a project jointly developed by the MPIA in Heidelberg, Germany, and the IAA in Granada, Spain, for the German-Spanish Astronomical Center at Calar Alto Observatory (CAHA; Almería, Spain). This new instrument works with the 2.2 m and 3.5 m CAHA telescopes covering a field of view of 30 × 30 arcmin and 15 × 15 arcmin, respectively, with a sampling of 4096 × 4096 pixels. It is designed for the spectral bands from Z to Ks, and can also be equipped with narrowband filters. The instrument was delivered to the observatory in 2014 October and was commissioned at both telescopes between 2014 November and 2015 June. Science verification at the 2.2 m telescope was carried out during the second semester of 2015 and the instrument is now at full operation. We describe the design, assembly, integration, and verification process, the final laboratory tests and the PANIC instrument performance. We also present first-light data obtained during the commissioning and preliminary results of the scientific verification. The final optical model and the theoretical performance of the camera were updated according to the as-built data. The laboratory tests were made with a star simulator. Finally, the commissioning phase was done at both telescopes to validate the camera real performance on sky. The final laboratory test confirmed the expected camera performances, complying with the scientific requirements. The commissioning phase on sky has been accomplished.
NASA Astrophysics Data System (ADS)
Wang, Qi; Song, Huaqing; Wang, Xingpeng; Wang, Dongdong; Li, Li
2018-03-01
In this paper, we demonstrated thermally tunable 1-μm single-frequency fiber lasers utilizing loop mirror filters (LMFs) with unpumped Yb-doped fibers. The frequency selection and tracking were achieved by combining a fiber Bragg grating (FBG) and a dynamic grating established inside the LMF. The central emission wavelength was at 1064.07 nm with a tuning range of 1.4 nm, and the measured emission linewidth was less than 10 kHz. We also systematically studied the wavelength-tracking thermal stability of the LMF with separate thermal treatment upon the FBG and LMF, respectively. Finally, we presented a selection criterion for the minimum unpumped doped fiber length inside the LMF with experimental verification.
Application of based on improved wavelet algorithm in fiber temperature sensor
NASA Astrophysics Data System (ADS)
Qi, Hui; Tang, Wenjuan
2018-03-01
Accurate temperature measurement is crucial in distributed optical fiber temperature sensors. To address the temperature measurement errors caused by the weak Raman scattering signal and strong noise in such systems, a new algorithm based on an improved wavelet transform is presented. Building on the traditional modulus-maxima wavelet algorithm, signal correlation is considered to improve the ability to separate signal from noise, and this is combined with an adaptive choice of wavelet decomposition scale to avoid the signal loss or residual noise caused by scale mismatch. The filtering performance of the algorithm is compared with others in Matlab. Finally, a 3-km distributed optical fiber temperature sensing system is used for verification. Experimental results show that the temperature accuracy generally improves by 0.5233.
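As a simplified stand-in for the improved modulus-maxima method, the sketch below shows the decompose, threshold, and reconstruct skeleton of wavelet denoising with PyWavelets on a synthetic Raman-like trace; the wavelet choice, level, and threshold rule are generic assumptions.

```python
import numpy as np
import pywt

rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 2048)
clean = np.exp(-((t - 0.4) / 0.05) ** 2)      # synthetic Stokes-like peak
noisy = clean + 0.2 * rng.normal(size=t.size)

coeffs = pywt.wavedec(noisy, "sym8", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise level estimate
thr = sigma * np.sqrt(2.0 * np.log(noisy.size))    # universal threshold
coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "sym8")
```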
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sotiriadis, Charalampos; Hajdu, Steven David; Degrauwe, Sophie
With the increased use of implanted venous access devices (IVADs) for continuous long-term venous access, several techniques, such as percutaneous endovascular fibrin sheath removal, have been described to maintain catheter function. Most standard techniques do not capture the stripped fibrin sheath, which is subsequently released into the pulmonary circulation and may lead to symptomatic pulmonary embolism. The presented case describes an endovascular technique which includes stripping, capture, and removal of the fibrin sheath using a novel filter device. A 64-year-old woman presented with IVAD dysfunction. Stripping was performed using a snare coaxial to the filter to capture the fibrin sheath. The captured fragment was subsequently removed for visual and pathological verification. No immediate complication was observed and the patient was discharged the day of the procedure.
Study of techniques for redundancy verification without disrupting systems, phases 1-3
NASA Technical Reports Server (NTRS)
1970-01-01
The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.
Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools
NASA Technical Reports Server (NTRS)
Bis, Rachael; Maul, William A.
2015-01-01
Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
Automatic x-ray image contrast enhancement based on parameter auto-optimization.
Qiu, Jianfeng; Harold Li, H; Zhang, Tiezhi; Ma, Fangfang; Yang, Deshan
2017-11-01
Insufficient image contrast associated with radiation therapy daily setup x-ray images could negatively affect accurate patient treatment setup. We developed a method to perform automatic and user-independent contrast enhancement on 2D kilovoltage (kV) and megavoltage (MV) x-ray images. The goal was to provide tissue contrast optimized for each treatment site in order to support accurate patient daily treatment setup and the subsequent offline review. The proposed method processes the 2D x-ray images with an optimized image processing filter chain, which consists of a noise reduction filter and a high-pass filter followed by a contrast limited adaptive histogram equalization (CLAHE) filter. The most important innovation is to optimize the image processing parameters automatically to determine the required image contrast settings per disease site and imaging modality. Three major parameters controlling the image processing chain, i.e., the Gaussian smoothing weighting factor for the high-pass filter, the block size, and the clip limiting parameter for the CLAHE filter, were determined automatically using an interior-point constrained optimization algorithm. Fifty-two kV and MV x-ray images were included in this study. The results were manually evaluated and ranked with scores from 1 (worst, unacceptable) to 5 (significantly better than adequate and visually praise worthy) by physicians and physicists. The average scores for the images processed by the proposed method, the CLAHE, and the best window-level adjustment were 3.92, 2.83, and 2.27, respectively. The percentages of processed images that received a score of 5 were 48%, 29%, and 18%, respectively. The proposed method is able to outperform the standard image contrast adjustment procedures that are currently used in the commercial clinical systems. When the proposed method is implemented in the clinical systems as an automatic image processing filter, it could be useful for allowing quicker and potentially more accurate treatment setup and facilitating the subsequent offline review and verification. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
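A minimal sketch of the described filter chain (noise reduction, high-pass sharpening, then CLAHE) using scikit-image; the three tuned parameters appear as ordinary function arguments here, and the constrained optimiser itself is omitted.

```python
import numpy as np
from skimage import exposure, filters

def enhance(image, gauss_sigma=2.0, clahe_kernel=64, clip_limit=0.01):
    """Denoise -> high-pass sharpen -> CLAHE, with the tuned parameters exposed."""
    image = image.astype(np.float64)
    image = (image - image.min()) / (np.ptp(image) + 1e-12)  # scale to [0, 1]

    smoothed = filters.gaussian(image, sigma=1.0)            # noise reduction

    # Unsharp masking as the high-pass step; 'gauss_sigma' plays the role of
    # the Gaussian smoothing weighting factor the paper's optimiser tunes.
    highpass = filters.unsharp_mask(smoothed, radius=gauss_sigma, amount=1.0)

    # CLAHE with tunable block size and clip limit.
    return exposure.equalize_adapthist(np.clip(highpass, 0.0, 1.0),
                                       kernel_size=clahe_kernel,
                                       clip_limit=clip_limit)
```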
Design and experimental verification of a dual-band metamaterial filter
NASA Astrophysics Data System (ADS)
Zhu, Hong-Yang; Yao, Ai-Qin; Zhong, Min
2016-10-01
In this paper, we present the design, simulation, and experimental verification of a dual-band free-standing metamaterial filter operating in a frequency range of 1 THz-30 THz. The proposed structure consists of periodically arranged composite air holes, and exhibits two broad and flat transmission bands. To clarify the effects of the structural parameters on both resonant transmission bands, three sets of experiments are performed. The first resonant transmission band shows a shift towards higher frequency when the side width w1 of the main air hole is increased. In contrast, the second resonant transmission band displays a shift towards lower frequency when the side width w2 of the sub-holes is increased, while the first resonant transmission band is unchanged. The measured results indicate that these resonant bands can be modulated individually by simply optimizing the relevant structural parameters (w1 or w2) for the required band. In addition, these resonant bands merge into a single resonant band with a bandwidth of 7.7 THz when w1 and w2 are optimized simultaneously. The structure proposed in this paper adopts different resonant mechanisms for transmission at different frequencies and thus offers a method to achieve a dual-band and low-loss filter. Project supported by the Doctorate Scientific Research Foundation of Hezhou University, China (Grant No. HZUBS201503), the Promotion of the Basic Ability of Young and Middle-aged Teachers in Universities Project of Guangxi Zhuang Autonomous Region, China (Grant No. KY2016YB453), the Guangxi Colleges and Universities Key Laboratory Symbolic Computation, China, Engineering Data Processing and Mathematical Support Autonomous Discipline Project of Hezhou University, China (Grant No. 2016HZXYSX01).
Early Observations with the ACS Ramp Filters
NASA Astrophysics Data System (ADS)
Tsvetanov, Z.; Hartig, G.; Bohlin, R.; Tran, H. D.; Martel, A.; Sirianni, M.; Clampin, M.
2002-05-01
The Advanced Camera for Surveys (ACS) on-board the Hubble Space Telescope (HST) is equipped with a set of ramp filters which provide imaging capability at 2% and 9% bandwidth in the range 3700-10700 Å. Each ramp filter consists of three segments; the middle segment can be used with both the Wide Field Channel (WFC) and the High Resolution Channel (HRC), while the inner and outer segments can be used only with the WFC. The monochromatic field of view is approximately 40'' by 80''. We will present observations of the planetary nebula (PN) NGC 6543 (the Cat's Eye) taken with the ACS ramp filters in several key emission lines - [O II] 3727, [O III] 5007, H-alpha+[N II], and [S II] 6725. These four emission lines fall onto three separate middle ramp segments - FR388N, FR505N, and FR656N - and will allow inter-comparison between the ACS ramp filters and the fixed-bandpass narrow-band filters F502N and F658N for both the WFC and HRC detectors. These observations were taken as part of the HST Servicing Mission Orbital Verification program and were designed to test ramp filter performance. We will demonstrate our ability to obtain monochromatic (i.e., emission line) images at arbitrary wavelengths and recover the surface brightness distribution. This work was supported by a NASA contract and a NASA grant.
2002-09-01
Secure Multicast ... that is, the needs of the VE will determine what the design will look like (e.g., reliable vs. unreliable data communications). In general, there ... [Molva00] and [Abdalla00]. Message Digests and Message Authentication Codes (MACs): message digests and MACs are used for data integrity verification
The politics of verification and the control of nuclear tests, 1945-1980
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallagher, N.W.
1990-01-01
This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. Clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.
Online sales: profit without question.
Bryant, J A; Cody, M J; Murphy, S T
2002-09-01
To examine the ease with which underage smokers can purchase cigarettes online using money orders and to evaluate the effectiveness of internet filtering programs in blocking access to internet cigarette vendors (ICVs). Four young people purchased 32 money orders using 32 different names to buy one carton of cigarettes for each named individual. Each money order was subsequently mailed to a different ICV in the USA. No age related information accompanied these online orders. Two internet filtering programs ("Bess" and filtertobacco.org) were tested for their relative efficacy in blocking access to ICV sites. Of the 32 orders placed, four orders never reached the intended ICV. Of the remaining 28 orders, 20 (71%) were filled despite a lack of age verification. Only four (14%) of the orders received were rejected because they lacked proof of age. "Bess" blocked access to 84% and filtertobacco.org to 94% of the ICV sites. Although underage smokers can easily purchase cigarettes online using money orders, access to these sites can be largely blocked if appropriate filtering devices are installed.
Optimization of a matched-filter receiver for frequency hopping code acquisition in jamming
NASA Astrophysics Data System (ADS)
Pawlowski, P. R.; Polydoros, A.
A matched-filter receiver for frequency hopping (FH) code acquisition is optimized when either partial-band tone jamming or partial-band Gaussian noise jamming is present. The receiver is matched to a segment of the FH code sequence, sums hard per-channel decisions to form a test, and uses multiple tests to verify acquisition. The length of the matched filter and the number of verification tests are fixed. Optimization is then choosing thresholds to maximize performance based upon the receiver's degree of knowledge about the jammer ('side-information'). Four levels of side-information are considered, ranging from none to complete. The latter level results in a constant-false-alarm-rate (CFAR) design. At each level, performance sensitivity to threshold choice is analyzed. Robust thresholds are chosen to maximize performance as the jammer varies its power distribution, resulting in simple design rules which aid threshold selection. Performance results, which show that optimum distributions for the jammer power over the total FH bandwidth exist, are presented.
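The hard-decision test lends itself to a simple binomial analysis: with N per-hop decisions summed and compared to a threshold T, detection and false-alarm probabilities follow directly. The per-hop probabilities below are illustrative assumptions, not values from the paper.

```python
from scipy.stats import binom

N = 64        # matched-filter length in hops (assumed)
p_hit = 0.9   # per-hop decision probability, signal present (assumed)
p_fa = 0.2    # per-hop decision probability, jamming/noise only (assumed)

for T in (30, 40, 50):
    p_det = binom.sf(T - 1, N, p_hit)    # P(sum >= T | signal present)
    p_false = binom.sf(T - 1, N, p_fa)   # P(sum >= T | no signal)
    print(f"T={T}: P_detect={p_det:.3f}, P_false={p_false:.2e}")
```

Raising T trades detection probability against false alarms, which is exactly the threshold optimization the abstract describes; jammer side-information shifts the per-hop probabilities and hence the best T.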
Anabtawi, Nijad; Ferzli, Rony; Harmanani, Haidar M.
2017-01-01
This paper presents a step-down, switched-mode power converter for use in multi-standard envelope-tracking radio frequency power amplifiers (RFPAs). The converter is based on a programmable-order sigma-delta modulator that can be configured to operate with 1st-, 2nd-, 3rd- or 4th-order loop filters, eliminating the need for a bulky passive output filter. The output ripple, sideband noise and spectral emission requirements of different wireless standards can be met by configuring the modulator's filter order and the converter's sampling frequency. The proposed converter is entirely digital and is implemented in a 14 nm bulk CMOS process for post-layout verification. For an input voltage of 3.3 V, the converter's output can be regulated to any voltage level from 0.5 V to 2.5 V, at a nominal switching frequency of 150 MHz. It achieves a maximum efficiency of 94% at 1.5 W output power.
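For intuition about the modulator at the heart of such a converter, here is a minimal first-order sigma-delta loop in Python; the programmable 2nd- to 4th-order loop filters of the actual ASIC are not modeled, and the input value is an arbitrary example.

```python
import numpy as np

def sigma_delta_1st_order(x_dc, n_samples):
    """First-order sigma-delta: quantize to +/-1 and feed the error back
    through an integrator; the bit-stream mean approximates the DC input."""
    integ = 0.0
    bits = np.empty(n_samples)
    for i in range(n_samples):
        bits[i] = 1.0 if integ >= 0 else -1.0   # 1-bit quantizer
        integ += x_dc - bits[i]                 # accumulate the error
    return bits

bits = sigma_delta_1st_order(0.25, 10_000)
print(bits.mean())   # ~0.25; the quantization noise is pushed to high frequencies
```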
NASA Astrophysics Data System (ADS)
Aitomäki, Yvonne; Westin, Mikael; Korpimäki, Jani; Oksman, Kristiina
2016-07-01
In this study a model based on simple scattering is developed and used to predict the distribution of nanofibrillated cellulose in composites manufactured by resin transfer moulding (RTM) where the resin contains nanofibres. The model is a Monte Carlo based simulation where nanofibres are randomly chosen from probability density functions for length, diameter and orientation. Their movements are then tracked as they advance through a random arrangement of fibres in defined fibre bundles. The results of the model show that the fabric filters the nanofibres within the first 20 µm unless clear inter-bundle channels are available. The volume fraction of the fabric fibres, flow velocity and size of nanofibre influence this to some extent. To verify the model, an epoxy with 0.5 wt.% Kraft Birch nanofibres was made through a solvent exchange route and stained with a colouring agent. This was infused into a glass fibre fabric using an RTM process. The experimental results confirmed the filtering of the nanofibres by the fibre bundles and their penetration in the fabric via the inter-bundle channels. Hence, the model is a useful tool for visualising the distribution of the nanofibres in composites in this manufacturing process.
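The Monte Carlo idea can be caricatured in a few lines: sample nanofibres, march them through the fabric in small steps, and capture them with some probability inside fibre bundles while letting those in inter-bundle channels pass. All probabilities and dimensions below are made-up placeholders, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(1)

def penetration_depths(n_fibres=1000, step=1.0, p_capture=0.3,
                       p_channel=0.1, max_depth=100.0):
    """March each nanofibre through the fabric in 1 µm steps; inside a
    bundle it may be captured at each step, while a fibre in an
    inter-bundle channel passes freely."""
    depths = []
    for _ in range(n_fibres):
        depth = 0.0
        in_channel = rng.random() < p_channel   # fibre entered a clear channel
        while depth < max_depth:
            depth += step
            if not in_channel and rng.random() < p_capture:
                break                           # captured by the fabric
        depths.append(depth)
    return np.array(depths)

d = penetration_depths()
print(f"median penetration: {np.median(d):.1f} µm")  # bundle-filtered fibres stop early
```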
Requirements, Verification, and Compliance (RVC) Database Tool
NASA Technical Reports Server (NTRS)
Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale
2001-01-01
This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
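A toy version of such a requirements/verification/compliance store can be built on any relational database; the sketch below uses Python's sqlite3 with invented table names and sample rows purely for illustration of the traceability and status queries described above.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE requirement (id TEXT PRIMARY KEY, text TEXT, parent_id TEXT);
CREATE TABLE verification (req_id TEXT, method TEXT, success_criteria TEXT,
                           status TEXT);   -- e.g. 'open' or 'verified'
""")
con.executemany("INSERT INTO requirement VALUES (?,?,?)", [
    ("R1", "Weld chamber shall hold vacuum", None),
    ("R1.1", "Leak rate shall stay below threshold", "R1"),   # traceability via parent_id
])
con.executemany("INSERT INTO verification VALUES (?,?,?,?)", [
    ("R1.1", "test", "leak rate < spec for 1 h", "verified"),
    ("R1", "analysis", "vacuum budget closes", "open"),
])
# Compliance report: each requirement with its verification method and status.
for row in con.execute("""
    SELECT r.id, r.text, v.method, v.status
    FROM requirement r LEFT JOIN verification v ON v.req_id = r.id
    ORDER BY r.id"""):
    print(row)
```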
Optical fiber sensors measurement system and special fibers improvement
NASA Astrophysics Data System (ADS)
Jelinek, Michal; Hrabina, Jan; Hola, Miroslava; Hucl, Vaclav; Cizek, Martin; Rerucha, Simon; Lazar, Josef; Mikel, Bretislav
2017-06-01
We present a method for improving the measurement accuracy of optical frequency spectra measurements based on tunable optical filters. The optical filter was used during the design and realization of a measurement system for the inspection of fiber Bragg gratings. The system incorporates a reference block for the compensation of environmental influences, an interferometric verification subsystem, and PC-based control software implemented in LabView. The preliminary experimental verification of the measurement principle and of the measurement system functionality was carried out on a testing rig with a specially prepared concrete console at ÚJV Řež. The presented system is the laboratory version of a special system for measuring nuclear power plant containment shape deformation, which was installed in the Temelín power plant last year. Building on this research, we have started preparing other optical fiber sensors for nuclear power plant measurements. These sensors will be based on microstructured and polarization-maintaining optical fibers. We have started developing new methods and techniques for splicing and shaping optical fibers. We are able to make optical tapers ranging from ultra-short (adiabatic) tapers around 400 µm long up to long tapers up to 6 mm long. We developed new techniques for splicing standard single-mode (SM) and multimode (MM) optical fibers and for splicing optical fibers with different diameters in the wavelength range from 532 to 1550 nm. Alongside these techniques we prepared further techniques for splicing and shaping special optical fibers such as polarization-maintaining (PM) fibers or hollow-core photonic crystal fibers (PCF), and their cross-splicing methods, with a focus on minimizing back-reflection and attenuation. Splicing of special optical fibers, especially PCF fibers, with standard telecommunication and other SM fibers can be done with our developed techniques. The splicing process has to be adjusted for any new optical fibers and new fiber combinations. Splicing of the same types of fibers from different manufacturers can be adjusted by several tested changes in the splicing process. We are able to splice PCF with standard telecommunication fiber with attenuation up to 2 dB; the method is also presented. Development of these new optical fiber splicing techniques and methods is carried out with a view to using these fibers in further research and development in the fields of optical fiber sensors, laser frequency stabilization, and laser interferometry based on optical fibers. Especially for laser frequency stabilization, we have developed and present new techniques for sealing microstructured fibers with gases inside.
Simulation verification techniques study
NASA Technical Reports Server (NTRS)
Schoonmaker, P. B.; Wenglinski, T. H.
1975-01-01
The results of the simulation verification techniques study, which consisted of two tasks, are summarized: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), a survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data for validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of the verification database impact. An annotated bibliography of all documents generated during this study is provided.
Chen, Chiung-An; Chen, Shih-Lun; Huang, Hong-Yi; Luo, Ching-Hsing
2012-11-22
In this paper, a low-cost, low-power and high-performance micro control unit (MCU) core is proposed for wireless body sensor networks (WBSNs). It consists of an asynchronous interface, a register bank, a reconfigurable filter, a slope-feature forecast, a lossless data encoder, an error correction coding (ECC) encoder, a UART interface, a power management (PWM) unit, and a multi-sensor controller. To improve the system performance and expansion abilities, the asynchronous interface is added for handling signal exchanges between different clock domains. To eliminate the noise of various bio-signals, the reconfigurable filter is created to provide the functions of average, binomial and sharpen filters. The slope-feature forecast and the lossless data encoder are proposed to reduce the volume of the various biomedical signals for transmission. Furthermore, the ECC encoder is added to improve the reliability of the wireless transmission, and the UART interface is employed so that the proposed design is compatible with wireless devices. For long-term healthcare monitoring applications, a power management technique is developed to reduce the power consumption of the WBSN system. In addition, the proposed design can operate with four different bio-sensors simultaneously. The proposed design was successfully tested with an FPGA verification board. The VLSI architecture of this work contains 7.67 K gate counts and consumes 5.8 mW or 1.9 mW at a 100 MHz or 133 MHz processing rate using a TSMC 0.18 μm or 0.13 μm CMOS process, respectively. Compared with previous techniques, this design achieves higher performance, more functions, more flexibility and higher compatibility than other microcontroller designs.
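The reconfigurable filter's three modes (average, binomial, sharpen) can be modeled in software as small convolution kernels; the sharpen kernel below is a generic choice, not necessarily the coefficients implemented in the ASIC.

```python
import numpy as np

KERNELS = {
    "average":  np.array([1.0, 1.0, 1.0]) / 3.0,   # moving average
    "binomial": np.array([1.0, 2.0, 1.0]) / 4.0,   # binomial smoothing
    "sharpen":  np.array([-1.0, 3.0, -1.0]),       # assumed sharpening kernel
}

def reconfigurable_filter(signal, mode):
    """Software model of the MCU's three selectable filter modes."""
    return np.convolve(signal, KERNELS[mode], mode="same")

rng = np.random.default_rng(0)
ecg = np.sin(np.linspace(0, 6 * np.pi, 200)) + 0.2 * rng.standard_normal(200)
smoothed = reconfigurable_filter(ecg, "binomial")   # de-noised bio-signal
```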
Exploring Model Error through Post-processing and an Ensemble Kalman Filter on Fire Weather Days
NASA Astrophysics Data System (ADS)
Erickson, Michael J.
The proliferation of coupling atmospheric ensemble data to models in other related fields requires a priori knowledge of atmospheric ensemble biases specific to the desired application. In that spirit, this dissertation focuses on elucidating atmospheric ensemble model bias and error through a variety of different methods specific to fire weather days (FWDs) over the Northeast United States (NEUS). Other than a handful of studies that use models to predict fire indices for single fire seasons (Molders 2008, Simpson et al. 2014), an extensive exploration of model performance specific to FWDs has not been attempted. Two unique definitions for FWDs are proposed; one that uses pre-existing fire indices (FWD1) and another from a new statistical fire weather index (FWD2) relating fire occurrence and near-surface meteorological observations. Ensemble model verification reveals FWDs to have warmer (>1 K), moister (~0.4 g kg⁻¹) and less windy (~1 m s⁻¹) biases than the climatological average for both FWD1 and FWD2. These biases are not restricted to the near surface but exist through the entirety of the planetary boundary layer (PBL). Furthermore, post-processing methods are more effective when previous FWDs are incorporated into the statistical training, suggesting that model bias could be related to the synoptic flow pattern. An Ensemble Kalman Filter (EnKF) is used to explore the effectiveness of data assimilation during a period of extensive FWDs in April 2012. Model biases develop rapidly on FWDs, consistent with the FWD1 and FWD2 verification. However, the EnKF is effective at removing most biases for temperature, wind speed and specific humidity. Potential sources of error in the parameterized physics of the PBL are explored by rerunning the EnKF with simultaneous state and parameter estimation (SSPE) for two relevant parameters within the ACM2 PBL scheme. SSPE helps to reduce the cool temperature bias near the surface on FWDs, with the variability in parameter estimates exhibiting some relationship to model bias for temperature. This suggests the potential for structural model error within the ACM2 PBL scheme and could lead toward the future development of improved PBL parameterizations.
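For readers unfamiliar with the EnKF analysis step used here, a minimal stochastic-EnKF update for a directly observed scalar state looks like the following; the ensemble size, observation error, and warm-bias example are illustrative assumptions, not values from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, obs, obs_err):
    """Stochastic EnKF analysis step for a directly observed 1-D state."""
    Hx = ensemble                                   # identity observation operator
    Pxy = np.cov(ensemble, Hx)[0, 1]                # forecast cross-covariance
    Pyy = Hx.var(ddof=1) + obs_err ** 2             # innovation variance
    K = Pxy / Pyy                                   # Kalman gain
    perturbed = obs + obs_err * rng.standard_normal(ensemble.size)
    return ensemble + K * (perturbed - Hx)

# 20-member forecast of 2 m temperature with a warm bias on a fire weather day.
forecast = 302.0 + rng.standard_normal(20)          # kelvin, ~1 K too warm
analysis = enkf_update(forecast, obs=301.0, obs_err=0.5)
print(forecast.mean(), analysis.mean())             # analysis pulled toward the obs
```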
Tracking Algorithm of Multiple Pedestrians Based on Particle Filters in Video Sequences
Liu, Yun; Wang, Chuanxu; Zhang, Shujun; Cui, Xuehong
2016-01-01
Pedestrian tracking is a critical problem in the field of computer vision. Particle filters have been proven to be very useful in pedestrian tracking for nonlinear and non-Gaussian estimation problems. However, pedestrian tracking in complex environments still faces many problems due to changes of pedestrian posture and scale, moving backgrounds, mutual occlusion, and the appearance and disappearance of pedestrians. To surmount these difficulties, this paper presents a tracking algorithm for multiple pedestrians based on particle filters in video sequences. The algorithm acquires confidence values of the object and the background by extracting a priori knowledge, thus achieving multi-pedestrian detection; it incorporates color and texture features into the particle filter to get better observation results and then automatically adjusts the weight of each feature according to the current tracking environment. During tracking, the algorithm handles severe occlusion to prevent the drift and loss phenomena caused by object occlusion, and associates detection results with particle states to provide a discrimination method for object disappearance and emergence, thus achieving robust tracking of multiple pedestrians. Experimental verification and analysis on video sequences demonstrate that the proposed algorithm improves tracking performance and gives better tracking results.
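As a bare-bones reference for the particle-filter machinery this algorithm builds on, here is one predict-update-resample cycle for a 1-D position; the random-walk motion model and Gaussian likelihood are simplistic stand-ins for the paper's color/texture observation model.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, measurement,
                         motion_std=2.0, obs_std=3.0):
    """One predict-update-resample cycle for a 1-D pedestrian position."""
    # Predict: random-walk motion model.
    particles = particles + motion_std * rng.standard_normal(particles.size)
    # Update: Gaussian measurement likelihood re-weights the particles.
    weights = weights * np.exp(-0.5 * ((particles - measurement) / obs_std) ** 2)
    weights = weights / weights.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < particles.size / 2:
        idx = rng.choice(particles.size, particles.size, p=weights)
        particles = particles[idx]
        weights = np.full(particles.size, 1.0 / particles.size)
    return particles, weights

particles = rng.uniform(0, 100, 500)              # initial position hypotheses
weights = np.full(500, 1 / 500)
particles, weights = particle_filter_step(particles, weights, measurement=42.0)
print(np.average(particles, weights=weights))     # estimate near 42
```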
NASA Technical Reports Server (NTRS)
1989-01-01
The design and verification requirements appropriate to hardware at the detail, subassembly, component, and engine levels are defined, and these requirements are correlated to the development demonstrations which provide verification that design objectives are achieved. The high pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.
Expose : procedure and results of the joint experiment verification tests
NASA Astrophysics Data System (ADS)
Panitz, C.; Rettberg, P.; Horneck, G.; Rabbow, E.; Baglioni, P.
The International Space Station will carry the EXPOSE facility, accommodated at the universal workplace URM-D located outside the Russian Service Module. The launch will be effected in 2005, and the facility is planned to stay in space for 1.5 years. The tray-like structure will accommodate 2 chemical and 6 biological PI-experiments or experiment systems of the ROSE (Response of Organisms to Space Environment) consortium. EXPOSE will support long-term in situ studies of microbes in artificial meteorites, as well as of microbial communities from special ecological niches, such as endolithic and evaporitic ecosystems. The experiment pockets, either vented or sealed, will be covered by an optical filter system to control the intensity and spectral range of solar UV irradiation. Control of sun exposure will be achieved by the use of individual shutters. To test the compatibility of the different biological systems and their adaptation to the opportunities and constraints of space conditions, a thorough ground support program has been developed. The procedure and first results of these joint Experiment Verification Tests (EVT) will be presented. The tests are essential for the success of the EXPOSE mission and have been carried out in parallel with the development and construction of the final hardware design of the facility. The results of the mission will contribute to the understanding of organic chemistry processes in space, biological adaptation strategies to extreme conditions, e.g. on early Earth and Mars, and the distribution of life beyond its planet of origin.
Ranger, R; Butler, P; Yahnke, C; Valentino, D
2012-06-01
To develop and validate an Optically Stimulated Luminescent (OSL) dosimeter for exposure control verification of x-ray projection mammography imaging systems. The active detection element of the dosimeter is a strip of OSL material 3.0 mm wide, 0.13 mm thick and 30.0 mm long with an overlying aluminum step wedge with thicknesses of 0, 0.2, 0.4 and 0.6 mm Al, encapsulated in a light-tight plastic enclosure with outer dimensions of 10.0 mm wide, 5.4 mm thick, and 54.0 mm long. The dosimeter is used in conjunction with a breast phantom to estimate the half-value layer (HVL), entrance surface exposure (ESE), and average glandular dose (AGD) in conventional projection mammography. ESE and HVL were computed from analysis of exposure profiles obtained from the exposed strip dosimeters. The AGD was estimated by multiplying the ESE by the appropriate exposure-to-dose conversion factor for the thickness and percent glandular tissue fraction represented by the phantom and the target-filter combination employed. The accuracy and reproducibility of the ESE, HVL and AGD estimates obtained using the dosimeter, positioned on the surface of the ACR phantom at the chest wall edge, were evaluated using mammography systems with different imaging receptor technologies, i.e. screen-film (SF), computed radiography (CR) and direct radiography (DR), and compared against results obtained using a calibrated ion chamber fitted with a mammography probe. ESE, AGD and HVL results obtained using the OSL mammography QA dosimeter agreed with results obtained using an ion chamber to within 5-10%, depending on the target-filter combination used. Repeat readings were highly consistent, with a coefficient of variation of 5%. The OSL mammography QA dosimeter has been shown to effectively estimate ESE, HVL and AGD, demonstrating its usefulness for secondary monitoring of the output exposure of mammography imaging systems. © 2012 American Association of Physicists in Medicine.
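The HVL estimate from the aluminum step wedge amounts to fitting an exponential attenuation curve to the readings under the four wedge thicknesses. A sketch with invented readings:

```python
import numpy as np

def half_value_layer(thickness_mm, signal):
    """Estimate HVL by fitting ln(signal) vs. Al thickness to a line;
    the slope is the linear attenuation coefficient per mm Al."""
    mu = -np.polyfit(thickness_mm, np.log(signal), 1)[0]
    return np.log(2) / mu

t = np.array([0.0, 0.2, 0.4, 0.6])        # step-wedge thicknesses (mm Al)
s = np.array([1.00, 0.71, 0.50, 0.36])    # illustrative relative OSL readings
print(f"HVL ~= {half_value_layer(t, s):.2f} mm Al")   # ~0.4 mm Al
```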
Crew Exploration Vehicle (CEV) Potable Water System Verification Description
NASA Technical Reports Server (NTRS)
Peterson, Laurie; DeVera, Jean; Vega, Leticia; Adam, Nik; Steele, John; Gazda, Daniel; Roberts, Michael
2009-01-01
The Crew Exploration Vehicle (CEV), also known as Orion, will ferry a crew of up to six astronauts to the International Space Station (ISS), or a crew of up to four astronauts to the moon. The first launch of CEV is scheduled for approximately 2014. A stored water system on the CEV will supply the crew with potable water for various purposes: drinking and food rehydration, hygiene, medical needs, sublimation, and various contingency situations. The current baseline biocide for the stored water system is ionic silver, similar in composition to the biocide used to maintain quality of the water transferred from the Orbiter to the ISS and stored in Contingency Water Containers (CWCs). In the CEV water system, the ionic silver biocide is expected to be depleted from solution due to ionic silver plating onto the surfaces of the materials within the CEV water system, thus negating its effectiveness as a biocide. Since the biocide depletion is expected to occur within a short time after loading the water into the CEV water tanks at the Kennedy Space Center (KSC), an additional microbial control is a 0.1-micron point-of-use filter that will be used at the outlet of the Potable Water Dispenser (PWD). Because this may be the first time NASA is considering a stored water system for long-term missions that does not maintain a residual biocide, a team of experts in materials compatibility, biofilms and point-of-use filters, surface treatment and coatings, and biocides has been created to pinpoint concerns and perform testing to help alleviate those concerns related to the CEV water system. Results from the test plans laid out in the paper presented to SAE last year (Crew Exploration Vehicle (CEV) Potable Water System Verification Coordination, 2008012083) will be detailed in this paper. Additionally, recommendations for the CEV verification will be described for risk mitigation in meeting the physicochemical and microbiological requirements on the CEV PWS.
Towards the formal verification of the requirements and design of a processor interface unit
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially-developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings', including the general-purpose HOL theories and definitions that support the PIU verification as well as tactics used in the proofs.
Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Jörgen; Nyholm, Tufve; Ahnesjö, Anders; Karlsson, Mikael
2007-08-21
Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach. The physical effects modelled in the dose calculation software MUV allow accurate dose calculations in individual verification points. Independent calculations may be used to replace experimental dose verification once the IMRT programme is mature.
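The proposed acceptance criteria can be expressed as a simple either/or tolerance check; the helper below merely encodes the limits quoted in the abstract (3% or 6 cGy routinely, 5% or 10 cGy off-axis or in low-dose regions).

```python
def within_tolerance(dev_percent, dev_cgy, off_axis_or_low_dose=False):
    """Accept a verification point if either the percentage or the absolute
    deviation criterion is met (limits as quoted in the abstract)."""
    pct_limit, cgy_limit = (5.0, 10.0) if off_axis_or_low_dose else (3.0, 6.0)
    return abs(dev_percent) <= pct_limit or abs(dev_cgy) <= cgy_limit

print(within_tolerance(2.5, 8.0))           # True: 2.5% is within the 3% limit
print(within_tolerance(4.0, 8.0))           # False for a routine point
print(within_tolerance(4.0, 8.0, True))     # True with the relaxed limits
```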
A lysimeter-based approach to quantify the impact of climate change on soil hydrological processes
NASA Astrophysics Data System (ADS)
Slawitsch, Veronika; Steffen, Birk; Herndl, Markus
2016-04-01
The predicted climate change, involving increasing CO2 concentrations and increasing temperatures, will have effects on both vegetation and soil properties and thus on the soil water balance. The aim of this work is to quantify the effects of changes in these climatic factors on soil hydrological processes and parameters. For this purpose, data from six high-precision weighable lysimeters will be used. The lysimeters are part of a Lysi-T-FACE concept, in which free air is enriched with CO2 (FACE technique) and infrared heaters heat the plots for investigations on the effects of increasing temperatures (T-FACE technique). The Lysi-T-FACE concept was developed on the 'Clim Grass' site at the HBLFA Raumberg-Gumpenstein (Styria, Austria) in 2011 and 2012 with a total of 54 experimental plots. These include six plots with lysimeters where the two climatic factors are varied in different combinations. On the basis of these grassland lysimeters the soil hydraulic parameters under different experimental conditions will be investigated. The lysimeters are equipped with TDR-Trime sensors and temperature sensors combined with tensiometers at different depths. In addition, a mechanical snow cover separation system is implemented to obtain a correct water balance in winter. To be able to reliably infer differences between the lysimeters, a verification of functionalities and a plausibility check of the lysimeter data, as well as adequate data corrections, are needed. Both an automatic and a user-defined control, including the recently developed filter method AWAT (Adaptive Window and Adaptive Threshold filter), are combined with a visualization tool using the software NI DIAdem. For each lysimeter the raw data are classified into groups of matric potentials, soil water contents and lysimeter weights. Values exceeding technical thresholds are eliminated and marked automatically. Manual data control is employed every day to obtain high-precision seepage water weights. The subsequent application of the AWAT filter reduces up to 80% of the oscillations in the calculated precipitation and evapotranspiration. The filtered data of the reference plot in June 2014 yield a precipitation of about 100 mm, whereas the non-filtered raw data result in approximately 170 mm and thus an obvious overestimation of precipitation. The resulting evapotranspiration amounts to slightly more than 100 mm with the filter and 200 mm without the filter in the same time period. The total water balance (precipitation minus evapotranspiration) of the year 2014 obtained with the automatic and manual data filtering is 470 mm on the reference plot but only 358 mm on a plot where CO2 is enriched and temperature increased. In summary, these first results demonstrate that adequate data correction is a precondition for identifying changes in soil hydrological processes and properties.
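The AWAT filter itself adapts both window width and threshold to the local noise level; the sketch below captures only the adaptive-window half of that idea and is not the published algorithm, with all parameter values invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def adaptive_smooth(series, max_win=31, noise_thresh=0.05):
    """Widen the averaging window around each sample until the local
    scatter falls below a threshold (a simplified, AWAT-like idea)."""
    out = np.empty_like(series)
    for i in range(series.size):
        for half in range(1, max_win // 2 + 1):
            seg = series[max(0, i - half): i + half + 1]
            if seg.std() < noise_thresh or half == max_win // 2:
                out[i] = seg.mean()
                break
    return out

mass = 100.0 + np.cumsum(rng.normal(0, 0.02, 500))   # noisy lysimeter mass (kg)
smooth = adaptive_smooth(mass)   # differentiating this gives cleaner P and ET
```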
First SN Discoveries from the Dark Energy Survey
NASA Astrophysics Data System (ADS)
Abbott, T.; Abdalla, F.; Achitouv, I.; Ahn, E.; Aldering, G.; Allam, S.; Alonso, D.; Amara, A.; Annis, J.; Antonik, M.; Aragon-Salamanca, A.; Armstrong, R.; Ashall, C.; Asorey, J.; Bacon, D.; Balbinot, E.; Banerji, M.; Barbary, K.; Barkhouse, W.; Baruah, L.; Bauer, A.; Bechtol, K.; Becker, M.; Bender, R.; Benoist, C.; Benoit-Levy, A.; Bernardi, M.; Bernstein, G.; Bernstein, J. P.; Bernstein, R.; Bertin, E.; Beynon, E.; Bhattacharya, S.; Biesiadzinski, T.; Biswas, R.; Blake, C.; Bloom, J. S.; Bocquet, S.; Brandt, C.; Bridle, S.; Brooks, D.; Brown, P. J.; Brunner, R.; Buckley-Geer, E.; Burke, D.; Burkert, A.; Busha, M.; Campa, J.; Campbell, H.; Cane, R.; Capozzi, D.; Carlstrom, J.; Carnero Rosell, A.; Carollo, M.; Carrasco-Kind, M.; Carretero, J.; Carter, M.; Casas, R.; Castander, F. J.; Chen, Y.; Chiu, I.; Chue, C.; Clampitt, J.; Clerkin, L.; Cohn, J.; Colless, M.; Copeland, E.; Covarrubias, R. A.; Crittenden, R.; Crocce, M.; Cunha, C.; da Costa, L.; d'Andrea, C.; Das, S.; Das, R.; Davis, T. M.; Deb, S.; DePoy, D.; Derylo, G.; Desai, S.; de Simoni, F.; Devlin, M.; Diehl, H. T.; Dietrich, J.; Dodelson, S.; Doel, P.; Dolag, K.; Efstathiou, G.; Eifler, T.; Erickson, B.; Eriksen, M.; Estrada, J.; Etherington, J.; Evrard, A.; Farrens, S.; Fausti Neto, A.; Fernandez, E.; Ferreira, P. C.; Finley, D.; Fischer, J. A.; Flaugher, B.; Fosalba, P.; Frieman, J.; Furlanetto, C.; Garcia-Bellido, J.; Gaztanaga, E.; Gelman, M.; Gerdes, D.; Giannantonio, T.; Gilhool, S.; Gill, M.; Gladders, M.; Gladney, L.; Glazebrook, K.; Gray, M.; Gruen, D.; Gruendl, R.; Gupta, R.; Gutierrez, G.; Habib, S.; Hall, E.; Hansen, S.; Hao, J.; Heitmann, K.; Helsby, J.; Henderson, R.; Hennig, C.; High, W.; Hirsch, M.; Hoffmann, K.; Holhjem, K.; Honscheid, K.; Host, O.; Hoyle, B.; Hu, W.; Huff, E.; Huterer, D.; Jain, B.; James, D.; Jarvis, M.; Jarvis, M. J.; Jeltema, T.; Johnson, M.; Jouvel, S.; Kacprzak, T.; Karliner, I.; Katsaros, J.; Kent, S.; Kessler, R.; Kim, A.; Kim-Vy, T.; King, L.; Kirk, D.; Kochanek, C.; Kopp, M.; Koppenhoefer, J.; Kovacs, E.; Krause, E.; Kravtsov, A.; Kron, R.; Kuehn, K.; Kuemmel, M.; Kuhlmann, S.; Kunder, A.; Kuropatkin, N.; Kwan, J.; Lahav, O.; Leistedt, B.; Levi, M.; Lewis, P.; Liddle, A.; Lidman, C.; Lilly, S.; Lin, H.; Liu, J.; Lopez-Arenillas, C.; Lorenzon, W.; LoVerde, M.; Ma, Z.; Maartens, R.; Maccrann, N.; Macri, L.; Maia, M.; Makler, M.; Manera, M.; Maraston, C.; March, M.; Markovic, K.; Marriner, J.; Marshall, J.; Marshall, S.; Martini, P.; Marti Sanahuja, P.; Mayers, J.; McKay, T.; McMahon, R.; Melchior, P.; Merritt, K. W.; Merson, A.; Miller, C.; Miquel, R.; Mohr, J.; Moore, T.; Mortonson, M.; Mosher, J.; Mould, J.; Mukherjee, P.; Neilsen, E.; Ngeow, C.; Nichol, R.; Nidever, D.; Nord, B.; Nugent, P.; Ogando, R.; Old, L.; Olsen, J.; Ostrovski, F.; Paech, K.; Papadopoulos, A.; Papovich, C.; Patton, K.; Peacock, J.; Pellegrini, P. S. S.; Peoples, J.; Percival, W.; Perlmutter, S.; Petravick, D.; Plazas, A.; Ponce, R.; Poole, G.; Pope, A.; Refregier, A.; Reyes, R.; Ricker, P.; Roe, N.; Romer, K.; Roodman, A.; Rooney, P.; Ross, A.; Rowe, B.; Rozo, E.; Rykoff, E.; Sabiu, C.; Saglia, R.; Sako, M.; Sanchez, A.; Sanchez, C.; Sanchez, E.; Sanchez, J.; Santiago, B.; Saro, A.; Scarpine, V.; Schindler, R.; Schmidt, B. P.; Schmitt, R. L.; Schubnell, M.; Seitz, S.; Senger, R.; Sevilla, I.; Sharp, R.; Sheldon, E.; Sheth, R.; Smith, R. C.; Smith, M.; Snigula, J.; Soares-Santos, M.; Sobreira, F.; Song, J.; Soumagnac, M.; Spinka, H.; Stebbins, A.; Stoughton, C.; Suchyta, E.; Suhada, R.; Sullivan, M.; Sun, F.; Suntzeff, N.; Sutherland, W.; Swanson, M. E. C.; Sypniewski, A. J.; Szepietowski, R.; Talaga, R.; Tarle, G.; Tarrant, E.; Balan, S. Thaithara; Thaler, J.; Thomas, D.; Thomas, R. C.; Tucker, D.; Uddin, S. A.; Ural, S.; Vikram, V.; Voigt, L.; Walker, A. R.; Walker, T.; Wechsler, R.; Weinberg, D.; Weller, J.; Wester, W.; Wetzstein, M.; White, M.; Wilcox, H.; Wilman, D.; Yanny, B.; Young, J.; Zablocki, A.; Zenteno, A.; Zhang, Y.; Zuntz, J.
2012-12-01
The Dark Energy Survey (DES) reports the discovery of the first set of supernovae (SN) from the project. Images were observed as part of the DES Science Verification phase using the newly-installed 570-megapixel Dark Energy Camera on the CTIO Blanco 4-m telescope by observers J. Annis, E. Buckley-Geer, and H. Lin. SN observations are planned throughout the observing campaign on a regular cadence of 4-6 days in each of the ten 3-deg² fields in the DES griz filters.
2011-09-01
[Figure 25 residue: scatter plots of number of occurrences, panels (a) Kp 0-3 and (b) Kp 4-9.] ...dependent physics-based model that uses the Ionospheric Forecast Model (IFM) as a background model upon which perturbations are imposed via a Kalman filter... vertical output resolution as the IFM. GAIM-GM can also be run in a regional mode with a finer resolution (Scherliess et al., 2006).
Abstracting data warehousing issues in scientific research.
Tews, Cody; Bracio, Boris R
2002-01-01
This paper presents the design and implementation of the Idaho Biomedical Data Management System (IBDMS). This system preprocesses biomedical data from the IMPROVE (Improving Control of Patient Status in Critical Care) library via an Open Database Connectivity (ODBC) connection. The ODBC connection allows local and remote simulations to access filtered, joined, and sorted data using the Structured Query Language (SQL). The tool is capable of providing an overview of the available data, in addition to user-defined data subsets, for verification of models of the human respiratory system.
Space transportation system payload interface verification
NASA Technical Reports Server (NTRS)
Everline, R. T.
1977-01-01
The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).
24 CFR 960.259 - Family information and verification.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Family information and verification... URBAN DEVELOPMENT ADMISSION TO, AND OCCUPANCY OF, PUBLIC HOUSING Rent and Reexamination § 960.259 Family information and verification. (a) Family obligation to supply information. (1) The family must supply any...
The Learner Verification of Series r: The New Macmillan Reading Program; Highlights.
ERIC Educational Resources Information Center
National Evaluation Systems, Inc., Amherst, MA.
National Evaluation Systems, Inc., has developed curriculum evaluation techniques, in terms of learner verification, which may be used to help the curriculum-development efforts of publishing companies, state education departments, and universities. This document includes a summary of the learner-verification approach, with data collected about a…
24 CFR 960.259 - Family information and verification.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Family information and verification... URBAN DEVELOPMENT ADMISSION TO, AND OCCUPANCY OF, PUBLIC HOUSING Rent and Reexamination § 960.259 Family information and verification. (a) Family obligation to supply information. (1) The family must supply any...
Interpreter composition issues in the formal verification of a processor-memory module
NASA Technical Reports Server (NTRS)
Fura, David A.; Cohen, Gerald C.
1994-01-01
This report describes interpreter composition techniques suitable for the formal specification and verification of a processor-memory module using the HOL theorem proving system. The processor-memory module is a multichip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. Modeling and verification methods were developed that permit provably secure composition at the transaction level of specification, significantly reducing the complexity of the hierarchical verification of the system.
Verification and Validation Studies for the LAVA CFD Solver
NASA Technical Reports Server (NTRS)
Moini-Yekta, Shayan; Barad, Michael F.; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.
2013-01-01
The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
Portable traceability solution for ground-based calibration of optical instruments
NASA Astrophysics Data System (ADS)
El Gawhary, Omar; van Veghel, Marijn; Kenter, Pepijn; van der Leden, Natasja; Dekker, Paul; Revtova, Elena; Heemskerk, Maurice; Trarbach, André; Vink, Ramon; Doyle, Dominic
2017-11-01
We present a portable traceability solution for the ground-based optical calibration of earth observation (EO) instruments. Currently, traceability for this type of calibration is typically based on spectral irradiance sources (e.g. FEL lamps) calibrated at a national metrology institute (NMI). Disadvantages of this source-based traceability are the inflexibility in operating conditions of the source, which are limited to the settings used during calibration at the NMI, and the susceptibility to aging, which requires frequent recalibrations, and which cannot be easily checked on-site. The detector-based traceability solution presented in this work uses a portable filter radiometer to calibrate light sources onsite, immediately before and after, or even during instrument calibration. The filter radiometer itself is traceable to the primary standard of radiometry in the Netherlands. We will discuss the design and realization, calibration and performance verification.
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
24 CFR 5.512 - Verification of eligible immigration status.
Code of Federal Regulations, 2010 CFR
2010-04-01
... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...
Built-in-Test Verification Techniques
1987-02-01
report documents the results of the effort for the Rome Air Development Center Contract F30602-84-C-0021, BIT Verification Techniques. The work was...Richard Spillman of Spillman Research Associates. The principal investigators were Mike Partridge and subsequently Jeffrey Albert. The contract was...two-year effort to develop techniques for Built-In Test (BIT) verification. The objective of the contract was to develop specifications and technical
NASA Technical Reports Server (NTRS)
1986-01-01
Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.
24 CFR 985.3 - Indicators, HUD verification methods and ratings.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicators, HUD verification..., HUD verification methods and ratings. This section states the performance indicators that are used to assess PHA Section 8 management. HUD will use the verification method identified for each indicator in...
Verification test report on a solar heating and hot water system
NASA Technical Reports Server (NTRS)
1978-01-01
Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification includes the performance, the efficiencies and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.
Assessment of female breast dose for thoracic cone-beam CT using MOSFET dosimeters.
Sun, Wenzhao; Wang, Bin; Qiu, Bo; Liang, Jian; Xie, Weihao; Deng, Xiaowu; Qi, Zhenyu
2017-03-21
To assess the breast dose during a routine thoracic cone-beam CT (CBCT) scan and to explore possible dose-reduction strategies. Metal oxide semiconductor field-effect transistor (MOSFET) dosimeters were used to measure breast surface doses during a thoracic kV CBCT scan of an anthropomorphic phantom. Breast doses for different scanning protocols and breast sizes were compared. Dose reduction was attempted by using a partial-arc CBCT scan with a bowtie filter. The impact of this dose-reduction strategy on image registration accuracy was investigated. The average breast surface doses were 20.02 mGy and 11.65 mGy for thoracic CBCT without and with filtration, respectively, indicating a dose reduction of 41.8% by use of the bowtie filter. It was found that a 220° partial-arc scan significantly reduced the dose to the contralateral breast (44.4% lower than the ipsilateral breast), while the image registration accuracy was not compromised. Breast dose reduction can be achieved by using an ipsilateral 220° partial-arc scan with a bowtie filter. This strategy also provides sufficient image quality for thorax image registration in daily patient positioning verification.
24 CFR 1000.128 - Is income verification required for assistance under NAHASDA?
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Is income verification required for assistance under NAHASDA? 1000.128 Section 1000.128 Housing and Urban Development Regulations Relating to... § 1000.128 Is income verification required for assistance under NAHASDA? (a) Yes, the recipient must...
Five-equation and robust three-equation methods for solution verification of large eddy simulation
NASA Astrophysics Data System (ADS)
Dutta, Rabijit; Xing, Tao
2018-02-01
This study evaluates the recently developed general framework of solution verification methods for large eddy simulation (LES), using implicitly filtered LES of periodic channel flows at a friction Reynolds number of 395 on eight systematically refined grids. The seven-equation method shows that the coupling error based on Hypothesis I is much smaller than the numerical and modeling errors and can therefore be neglected. The authors recommend the five-equation method based on Hypothesis II, which shows a monotonic convergence behavior of the predicted numerical benchmark (S_C) and provides realistic error estimates without the need to fix the orders of accuracy for either numerical or modeling errors. Based on the results from the seven-equation and five-equation methods, less expensive three- and four-equation methods for practical LES applications were derived. It was found that the new three-equation method is robust, as it can be applied to any convergence type and reasonably predicts the error trends. It was also observed that the numerical and modeling errors usually have opposite signs, which suggests that error cancellation plays an essential role in LES. When a Reynolds-averaged Navier-Stokes (RANS) based error estimation method is applied, it shows significant error in the prediction of S_C on coarse meshes. However, it predicts reasonable S_C when the grids resolve at least 80% of the total turbulent kinetic energy.
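The error estimation underlying such solution-verification methods builds on generalized Richardson extrapolation across systematically refined grids. The sketch below shows that basic ingredient for a monotonically converging quantity; it is not the authors' full five- or seven-equation system, which additionally separates numerical and modeling errors.

```python
import numpy as np

def richardson_estimate(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order of accuracy and error estimate from three
    systematically refined grids with refinement ratio r."""
    p = np.log((f_coarse - f_medium) / (f_medium - f_fine)) / np.log(r)
    err_fine = (f_medium - f_fine) / (r ** p - 1)   # error in the fine-grid solution
    benchmark = f_fine - err_fine                   # extrapolated benchmark value
    return p, err_fine, benchmark

# Illustrative monotonically converging solutions on coarse/medium/fine grids.
print(richardson_estimate(1.20, 1.05, 1.01))        # p ~ 1.9, benchmark ~ 0.996
```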
Monocular precrash vehicle detection: features and classifiers.
Sun, Zehang; Bebis, George; Miller, Ronald
2006-07-01
Robust and reliable vehicle detection from images acquired by a moving vehicle (i.e., on-road vehicle detection) is an important problem with applications to driver assistance systems and autonomous, self-guided vehicles. The focus of this work is on the issues of feature extraction and classification for rear-view vehicle detection. Specifically, by treating the problem of vehicle detection as a two-class classification problem, we have investigated several different feature extraction methods such as principal component analysis, wavelets, and Gabor filters. To evaluate the extracted features, we have experimented with two popular classifiers, neural networks and support vector machines (SVMs). Based on our evaluation results, we have developed an on-board real-time monocular vehicle detection system that is capable of acquiring grey-scale images, using Ford's proprietary low-light camera, achieving an average detection rate of 10 Hz. Our vehicle detection algorithm consists of two main steps: a multiscale driven hypothesis generation step and an appearance-based hypothesis verification step. During the hypothesis generation step, image locations where vehicles might be present are extracted. This step uses multiscale techniques not only to speed up detection, but also to improve system robustness. The appearance-based hypothesis verification step verifies the hypotheses using Gabor features and SVMs. The system has been tested in Ford's concept vehicle under different traffic conditions (e.g., structured highway, complex urban streets, and varying weather conditions), illustrating good performance.
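A compact caricature of the hypothesis-verification stage: extract Gabor-filter statistics from an image patch and classify them with an SVM. The kernel parameters, feature choice, and random training data below are placeholders; the production system's tuned Gabor bank and training corpus are not reproduced.

```python
import numpy as np
from sklearn.svm import SVC

def gabor_kernel(size=16, theta=0.0, freq=0.2, sigma=4.0):
    """Real part of a Gabor filter at orientation theta."""
    y, x = np.mgrid[-size // 2:size // 2, -size // 2:size // 2]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2)) * np.cos(2 * np.pi * freq * xr)

def gabor_features(patch, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Mean and std of each Gabor response form a compact feature vector."""
    feats = []
    for th in thetas:
        resp = np.abs(np.fft.ifft2(np.fft.fft2(patch) *
                                   np.fft.fft2(gabor_kernel(patch.shape[0], th),
                                               s=patch.shape)))
        feats += [resp.mean(), resp.std()]
    return np.array(feats)

rng = np.random.default_rng(0)
X = np.array([gabor_features(rng.random((16, 16))) for _ in range(40)])
y = np.repeat([0, 1], 20)             # placeholder labels: non-vehicle vs. vehicle
clf = SVC(kernel="rbf").fit(X, y)     # hypothesis verification classifier
```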
Cheng, Wen-Chang
2012-01-01
In this paper we propose a robust lane detection and tracking method that combines particle filters with the particle swarm optimization method. This method mainly uses particle filters to detect and track the local optimum of the lane model in the input image and then seeks the global optimal solution of the lane model by particle swarm optimization. The particle filter can effectively complete lane detection and tracking in complicated or variable lane environments. However, the result obtained is usually a local optimal system status rather than the global optimal system status. Thus, the particle swarm optimization method is used to further refine the global optimal system status among all system statuses. Since particle swarm optimization is a global optimization algorithm based on iterative computing, it can find the global optimal lane model by simulating the food-finding behavior of fish schools or insect swarms through the mutual cooperation of all particles. In verification testing, the test environments included highways and ordinary roads as well as straight and curved lanes, uphill and downhill lanes, lane changes, etc. Our proposed method completes lane detection and tracking more accurately and effectively than existing options.
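The PSO refinement stage can be sketched independently of the lane model: start the swarm around the particle-filter estimate and iterate the standard velocity and position updates toward the best-scoring parameters. The score function, swarm size, and coefficients below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_refine(score, x0, n_particles=30, iters=50, w=0.7, c1=1.5, c2=1.5):
    """Refine a particle-filter estimate x0 by particle swarm optimization."""
    x = x0 + rng.standard_normal((n_particles, x0.size))   # swarm around PF output
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([score(p) for p in x])
    gbest = pbest[pbest_val.argmax()]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([score(p) for p in x])
        better = vals > pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmax()]
    return gbest

# Toy lane-model score peaked at parameters (2, -1); the PF output is nearby.
score = lambda p: -np.sum((p - np.array([2.0, -1.0])) ** 2)
print(pso_refine(score, x0=np.array([1.5, -0.5])))   # converges near (2, -1)
```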
NASA Astrophysics Data System (ADS)
Jeffrey, N.; Abdalla, F. B.; Lahav, O.; Lanusse, F.; Starck, J.-L.; Leonard, A.; Kirk, D.; Chang, C.; Baxter, E.; Kacprzak, T.; Seitz, S.; Vikram, V.; Whiteway, L.; Abbott, T. M. C.; Allam, S.; Avila, S.; Bertin, E.; Brooks, D.; Rosell, A. Carnero; Kind, M. Carrasco; Carretero, J.; Castander, F. J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Davis, C.; De Vicente, J.; Desai, S.; Doel, P.; Eifler, T. F.; Evrard, A. E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; Hartley, W. G.; Honscheid, K.; Hoyle, B.; James, D. J.; Jarvis, M.; Kuehn, K.; Lima, M.; Lin, H.; March, M.; Melchior, P.; Menanteau, F.; Miquel, R.; Plazas, A. A.; Reil, K.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Walker, A. R.
2018-05-01
Mapping the underlying density field, including non-visible dark matter, using weak gravitational lensing measurements is now a standard tool in cosmology. Due to its importance to the science results of current and upcoming surveys, the quality of the convergence reconstruction methods should be well understood. We compare three methods: Kaiser-Squires (KS), Wiener filter, and GLIMPSE. KS is a direct inversion, not accounting for survey masks or noise. The Wiener filter is well-motivated for Gaussian density fields in a Bayesian framework. GLIMPSE uses sparsity, aiming to reconstruct non-linearities in the density field. We compare these methods with several tests using public Dark Energy Survey (DES) Science Verification (SV) data and realistic DES simulations. The Wiener filter and GLIMPSE offer substantial improvements over smoothed KS with a range of metrics. Both the Wiener filter and GLIMPSE convergence reconstructions show a 12% improvement in Pearson correlation with the underlying truth from simulations. To compare the mapping methods' abilities to find mass peaks, we measure the difference between peak counts from simulated ΛCDM shear catalogues and catalogues with no mass fluctuations (a standard data vector when inferring cosmology from peak statistics); the maximum signal-to-noise of these peak statistics is increased by a factor of 3.5 for the Wiener filter and 9 for GLIMPSE. With simulations we measure the reconstruction of the harmonic phases; the phase residuals' concentration is improved 17% by GLIMPSE and 18% by the Wiener filter. The correlation between reconstructions from data and foreground redMaPPer clusters is increased 18% by the Wiener filter and 32% by GLIMPSE.
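Of the three methods compared, Kaiser-Squires is simple enough to write down in full: a direct inversion from shear to convergence in Fourier space. A minimal flat-sky sketch follows, with no treatment of masks or noise (exactly the limitation noted above); sign conventions for the shear vary between references.

```python
import numpy as np

def kaiser_squires(gamma1, gamma2):
    """Direct KS inversion: E-mode convergence from shear, flat sky."""
    ny, nx = gamma1.shape
    k1 = np.fft.fftfreq(nx)[None, :]
    k2 = np.fft.fftfreq(ny)[:, None]
    ksq = k1 ** 2 + k2 ** 2
    ksq[0, 0] = 1.0                        # avoid division by zero at k = 0
    g_hat = np.fft.fft2(gamma1 + 1j * gamma2)
    kappa_hat = (k1 ** 2 - k2 ** 2 - 2j * k1 * k2) / ksq * g_hat
    kappa_hat[0, 0] = 0.0                  # the mean convergence is unconstrained
    return np.fft.ifft2(kappa_hat).real

kappa = kaiser_squires(np.random.default_rng(0).normal(size=(64, 64)),
                       np.random.default_rng(1).normal(size=(64, 64)))
```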
NASA Astrophysics Data System (ADS)
Kunii, M.; Ito, K.; Wada, A.
2015-12-01
An ensemble Kalman filter (EnKF) using a regional mesoscale atmosphere-ocean coupled model was developed to represent the uncertainties of sea surface temperature (SST) in ensemble data assimilation strategies. The system was evaluated through data assimilation cycle experiments over a one-month period from July to August 2014, during which a tropical cyclone as well as severe rainfall events occurred. The results showed that the data assimilation cycle with the coupled model could reproduce SST distributions realistically even without updating SST and salinity during the data assimilation cycle. Therefore, atmospheric variables and radiation applied as a forcing to ocean models can control oceanic variables to some extent in the current data assimilation configuration. However, investigations of the forecast error covariance estimated in EnKF revealed that the correlation between atmospheric and oceanic variables could possibly lead to less flow-dependent error covariance for atmospheric variables owing to the difference in the time scales between atmospheric and oceanic variables. A verification of the analyses showed positive impacts of applying the ocean model to EnKF on precipitation forecasts. The use of EnKF with the coupled model system captured intensity changes of a tropical cyclone better than it did with an uncoupled atmosphere model, even though the impact on the track forecast was negligibly small.
Mohajeri, Parviz; Yazdani, Laya; Shahraki, Abdolrazagh Hashemi; Alvandi, Amirhoshang; Atashi, Sara; Farahani, Abbas; Almasi, Ali; Rezaei, Mansour
2017-04-01
Nontuberculous mycobacteria are inhabitants of the environment, especially of aquatic systems. Some of them cause problems in immunodeficient patients. Over the last decade, 16S rRNA gene sequencing established 45 novel species of nontuberculous mycobacteria. Experience has revealed that this method underestimates the diversity and does not distinguish between some mycobacterium subspecies. To recognize emerging rapidly growing mycobacteria and identify their subspecies, rpoB gene sequencing has been developed. To better understand the transmission of nontuberculous mycobacterial species through drinking water and to prevent the spread of illness caused by these bacteria, the aim of this study was to detect the presence of the bacteria by PCR-sequencing techniques. Drinking water samples were collected from different areas of Kermanshah city in western Iran. After decontamination with cetylpyridinium chloride, samples were filtered with 0.45-micron filters, the filters were transferred directly onto growth medium until colonies appeared, then DNA extraction and PCR were performed, and products were sent for sequencing. We found nontuberculous mycobacterial species in 35/110 (32%) of the drinking water samples; isolates included Mycobacterium goodii, Mycobacterium aurum, and Mycobacterium gastri with the greatest abundance (11.5%), followed by Mycobacterium smegmatis, Mycobacterium porcinum, Mycobacterium peregrinum, Mycobacterium mucogenicum, and Mycobacterium chelonae (8%). In this study, we recognized evidence of contamination by nontuberculous mycobacteria in corroded water pipes. Given the high prevalence of these bacteria in drinking water in Kermanshah, this is important evidence of transmission through drinking water. This finding can also help public health policy makers control these isolates in drinking water supplies in Kermanshah.
Simulation environment based on the Universal Verification Methodology
NASA Astrophysics Data System (ADS)
Fiergolski, A.
2017-01-01
Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The CDV flow differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
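UVM itself is written in SystemVerilog; purely as an illustration of the CDV flow described above (constrained-random stimulus, a self-checking scoreboard, coverage bins flagging non-exercised functionality), here is a plain-Python sketch in which a toy saturating adder stands in for the DUT.

```python
# Illustrative coverage-driven verification loop (not UVM; UVM is SystemVerilog).
import random

def dut_sat_add(a, b):                 # "device under test": 4-bit saturating add
    return min(a + b, 15)

def ref_model(a, b):                   # golden model used by the scoreboard
    return min(a + b, 15)

coverage = set()                       # coverage monitor
for _ in range(1000):
    a, b = random.randint(0, 15), random.randint(0, 15)   # legal random stimulus
    assert dut_sat_add(a, b) == ref_model(a, b), "scoreboard mismatch"
    coverage.add("saturating" if a + b > 15 else "in-range")

missing = {"saturating", "in-range"} - coverage
print("uncovered bins:", missing or "none")   # non-exercised functionality check
```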
Ada(R) Test and Verification System (ATVS)
NASA Technical Reports Server (NTRS)
Strelich, Tom
1986-01-01
The Ada Test and Verification System (ATVS) functional description and high level design are completed and summarized. The ATVS will provide a comprehensive set of test and verification capabilities specifically addressing the features of the Ada language, support for embedded system development, distributed environments, and advanced user interface capabilities. Its design emphasis was on effective software development environment integration and flexibility to ensure its long-term use in the Ada software development community.
The Environmental Technology Verification Program, established by the EPA, is designed to accelerate the development and commercialization of new or improved technologies through third-party verification and reporting of performance.
NASA Astrophysics Data System (ADS)
Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.
2017-08-01
In the article, the process of verification (calibration) of the secondary equipment of oil metering units is considered. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a software and hardware system that provides automated verification and calibration. The hardware part of this complex switches the measuring channels of the verified controller and the reference channels of the calibrator in accordance with the specified algorithm. The developed software controls the switching of channels, sets values on the calibrator, reads the measured data from the controller, calculates errors and compiles protocols. This system can be used for checking the controllers of the secondary equipment of oil metering units in automatic verification mode (with an open communication protocol) or in semi-automatic verification mode (without it). A distinctive feature of the approach is a universal signal switch operating under software control, which can be configured for various verification (calibration) methods and thus covers the entire range of controllers of metering unit secondary equipment. Automatic verification with this hardware and software system shortens the verification time by a factor of 5-10 and increases the reliability of measurements by excluding the influence of the human factor.
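A minimal sketch of the verification loop described above: step the calibrator through reference points, read the controller back, and tabulate errors for the protocol. The device I/O functions, reference points (a 4-20 mA range), and tolerance are hypothetical stand-ins, not the authors' protocol.

```python
# Sketch of an automated calibrator-vs-controller check; I/O is simulated.
import random

def verify_channel(set_calibrator, read_controller, points, tol_pct=0.1):
    """Step through reference points and tabulate controller errors."""
    report = []
    for ref in points:
        set_calibrator(ref)
        meas = read_controller()
        err_pct = 100.0 * (meas - ref) / ref
        report.append((ref, meas, err_pct, abs(err_pct) <= tol_pct))
    return report

# Simulated stand-ins for the switch/calibrator/controller I/O (hypothetical):
state = {"value": 0.0}
def set_cal(v):
    state["value"] = v                 # command the calibrator output
def read_ctl():
    return state["value"] * (1 + random.uniform(-5e-4, 5e-4))  # readback + noise

for row in verify_channel(set_cal, read_ctl, [4.0, 8.0, 12.0, 16.0, 20.0]):
    print("ref=%.3f mA  meas=%.4f mA  err=%+.4f%%  pass=%s" % row)
```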
[Verification of bacteriological safety of PCM 40 air conditioner].
Dumas, J L; Ducel, G; Rouge, J C
1991-01-01
This study assessed the bacteriological safety of the bedside air conditioner PCM 40 (Howorth Airtech), used for the prevention of intraoperative hypothermia by blowing filtered warm air through a special mattress. Downstream of the device's 3-micron bacterial filter, 2,968 +/- 5,618 particles of diameter less than 3 microns were released per m3, whereas the room air contained 78,798 +/- 37,243 such particles per m3. The bacterial count in the air pulsed from the mattress was 30 +/- 41 cfu/m3 vs 120 cfu/m3 in the ambient air, and in the hot air supply tubing it reached 6 +/- 5 cfu/m3 vs 175 +/- 77 cfu/m3. It is concluded that the bacteriological data do not contraindicate the use of this air conditioner in the operating theatre. The only limitations for use are patient position (prone or lateral) and type of surgery (neurosurgery).
NASA Astrophysics Data System (ADS)
Lavers, Chris R.; Mason, Travis
2017-07-01
High-resolution satellite imagery permits verification of human rights land clearance violations across international borders as a result of unstable regimes or socio-economic upheaval. Without direct access to these areas to validate allegations of human rights abuse, the use of remote sensing tools, techniques, and data is extremely important. Humanitarian assessment can benefit from software-based solutions, involving radiometrically calibrated normalized difference vegetation index and temporal change imagery. We discuss the introduction of a matrix filter approach for change detection studies to help assist rapid building detection over large search areas against a bright background to evaluate internally displaced people in the 2005 Porta Farm Zimbabwe clearances. Future wide-scale near real-time space-based monitoring with a range of digital filters would be of great benefit to international human rights observers and human rights networks.
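As a hedged sketch of the NDVI-plus-matrix-filter change detection described above: NDVI is the standard (NIR - Red)/(NIR + Red) index, and a small convolution kernel smooths the temporal NDVI difference before thresholding. The band arrays and the averaging kernel are illustrative assumptions, not the authors' filter.

```python
# Sketch: NDVI temporal change map smoothed with a matrix (kernel) filter.
import numpy as np
from scipy.signal import convolve2d

def ndvi(nir, red, eps=1e-9):
    return (nir - red) / (nir + red + eps)      # standard NDVI definition

def change_map(nir_t0, red_t0, nir_t1, red_t1):
    diff = ndvi(nir_t1, red_t1) - ndvi(nir_t0, red_t0)
    kernel = np.ones((3, 3)) / 9.0              # illustrative smoothing matrix
    return convolve2d(diff, kernel, mode="same", boundary="symm")

rng = np.random.default_rng(0)                  # stand-in imagery
nir0, red0 = rng.random((64, 64)), rng.random((64, 64))
nir1, red1 = rng.random((64, 64)), rng.random((64, 64))
cleared = change_map(nir0, red0, nir1, red1) < -0.2   # illustrative threshold
```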
WRAP-RIB antenna technology development
NASA Technical Reports Server (NTRS)
Freeland, R. E.; Garcia, N. F.; Iwamoto, H.
1985-01-01
The wrap-rib deployable antenna concept development is based on a combination of hardware development and testing along with extensive supporting analysis. The proof-of-concept hardware models are large enough that they address the same basic problems of design, fabrication, assembly and test as the full-scale systems, which were selected to be 100 meters at the beginning of the program. The hardware evaluation program consists of functional performance tests, design verification tests and analytical model verification tests. Functional testing consists of kinematic deployment, mesh management and verification of mechanical packaging efficiencies. Design verification consists of rib contour precision measurement, rib cross-section variation evaluation, rib materials characterization and manufacturing imperfections assessment. Analytical model verification and refinement include mesh stiffness measurement, rib static and dynamic testing, mass measurement, and rib cross-section characterization. The concept was considered for a number of potential applications, including mobile communications, VLBI, and aircraft surveillance; baseline system configurations using the appropriate wrap-rib antenna were developed by JPL for all three classes of applications.
Verification Testing of Air Pollution Control Technology Quality Management Plan Revision 2.3
The Air Pollution Control Technology Verification Center was established in 1995 as part of the EPA's Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technologies through third-party verification and reporting of performance.
ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM FOR MONITORING AND CHARACTERIZATION
The Environmental Technology Verification Program is a service of the Environmental Protection Agency designed to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of performance. The goal of ETV i...
NASA Astrophysics Data System (ADS)
Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard
2006-05-01
A GMM-based audio-visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing is accomplished on DCT-based features extracted from the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM-based classifier. Fusion of the audio and video modalities for audio-visual speaker verification is compared with the face verification and speaker verification systems alone. To improve the robustness of the multimodal biometric identity verification system, an audio-visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM-based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with a prospect of experimenting on the newly developed PDAtabase within the scope of the SecurePhone project.
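A minimal sketch of GMM-based verification: accept a claim when the log-likelihood ratio between a client model and a background (world) model exceeds a threshold. BECARS is not publicly scriptable here, so scikit-learn's GaussianMixture stands in; the feature dimensions, mixture sizes, and threshold are illustrative assumptions.

```python
# Sketch: GMM log-likelihood-ratio verification with stand-in DCT features.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
client_feats = rng.normal(0.0, 1.0, (500, 12))    # stand-in for client features
world_feats = rng.normal(0.5, 1.5, (2000, 12))    # background population

client = GaussianMixture(n_components=8, covariance_type="diag",
                         random_state=0).fit(client_feats)
world = GaussianMixture(n_components=8, covariance_type="diag",
                        random_state=0).fit(world_feats)

def verify(test_feats, threshold=0.0):
    """Accept if mean log-likelihood under the client model beats the world model."""
    llr = client.score(test_feats) - world.score(test_feats)
    return llr > threshold, llr

print(verify(rng.normal(0.0, 1.0, (200, 12))))    # genuine-like trial
print(verify(rng.normal(0.5, 1.5, (200, 12))))    # impostor-like trial
```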
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Jae-ik; Yoo, SeungHoon; Cho, Sungho
Purpose: A significant issue in particle therapy with protons and carbon ions is accurate dose delivery from the beam line to the patient. When designing the complex delivery system, Monte Carlo simulation can be used to model the various physical interactions in scatterers and filters. In this report, we present the development of a Monte Carlo simulation platform, based on Geant4, to help design the prototype particle therapy nozzle, and we show the prototype beam nozzle design for the Korea Heavy Ion Medical Accelerator (KHIMA) project at the Korea Institute of Radiological and Medical Science (KIRAMS), Republic of Korea. Methods: We developed a simulation platform for the particle therapy beam nozzle using Geant4, in which a prototype nozzle for a carbon scanning system was designed. For comparison with theoretical beam optics, the lateral beam profile at the isocenter was compared with the Monte Carlo simulation result. From this analysis, we can estimate the beam spot properties of the KHIMA system and implement spot size optimization for our spot scanning system. Results: To characterize the scanning system, various combinations of the spot size from the accelerator with the ridge filter and beam monitor were tested as a simple design for the KHIMA dose delivery system. Conclusion: In this report, we presented part of the simulation platform and the characterization study. This work is ongoing, with the goal of developing a simulation platform that includes the beam nozzle and a dose verification tool coupled with the treatment planning system; results will be presented as they become available.
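The platform above uses Geant4; as a hedged, toy illustration of the lateral-profile comparison it performs, the sketch below Monte Carlo samples a Gaussian angular spread over a drift length and checks the resulting spot size against the analytic expectation. The spot size, angular spread, and drift length are assumed values.

```python
# Toy Monte Carlo of a lateral spot profile at isocenter (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
sigma_x0 = 2.0e-3        # m, initial spot size (assumed)
sigma_theta = 1.0e-3     # rad, angular spread after scatterers/filters (assumed)
L = 1.0                  # m, drift from nozzle exit to isocenter (assumed)

# Each history: initial offset plus angular deflection propagated over L.
x = rng.normal(0, sigma_x0, n) + L * rng.normal(0, sigma_theta, n)
print(f"MC spot sigma = {x.std()*1e3:.2f} mm, "
      f"analytic = {np.hypot(sigma_x0, L*sigma_theta)*1e3:.2f} mm")
```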
Investigation of air cleaning system response to accident conditions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrae, R.W.; Bolstad, J.W.; Foster, R.D.
1980-01-01
Air cleaning system response to the stress of accident conditions is being investigated. A program overview and highlights of recent results of our investigation are presented. The program includes both analytical and experimental investigations. Computer codes for predicting the effects of tornadoes, explosions, fires, and material transport are described, as are the test facilities used to obtain supporting experimental data on the structural integrity and confinement effectiveness of ventilation system components. Examples of experimental results for code verification, blower response to tornado transients, and filter response to tornado and explosion transients are reported.
FEM and Multiphysics Applications at NASA/GSFC
NASA Technical Reports Server (NTRS)
Loughlin, James
2004-01-01
FEM software available to the Mechanical Systems Analysis and Simulation Branch at Goddard Space Flight Center (GSFC) includes: 1) MSC/Nastran; 2) Abaqus; 3) Ansys/Multiphysics; 4) COSMOS/M; 5) 'home-grown' programs; 6) pre/post processors such as Patran and FEMAP. This viewgraph presentation provides additional information on MSC/Nastran and Ansys/Multiphysics, and includes screen shots of analyzed equipment, including the Wilkinson Microwave Anisotropy Probe, a micro-mirror, a MEMS tunable filter, and a micro-shutter array. The presentation also includes information on the verification of results.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on Commerce.
This hearing addresses legislative proposals to protect children from inappropriate materials on the Internet. Among the issues discussed are federal investments and information access, defining standards for protection, child pornography and marketing to children, filtering technology and adult verification services, and freedom of speech.…
A High Power Density Single-Phase PWM Rectifier with Active Ripple Energy Storage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ning, Puqi; Wang, Ruxi; Wang, Fei
It is well known that a second-order harmonic current, and a corresponding ripple voltage, exist on the dc bus of single-phase PWM rectifiers. The low-frequency harmonic current is normally filtered using a bulk capacitor on the bus, which results in low power density. This paper proposes an active ripple energy storage method that can effectively reduce the required energy storage capacitance. The feed-forward control method and design considerations are provided. Simulation and 15 kW experimental results are provided for verification purposes.
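A back-of-envelope check of why the passive bulk capacitor is large: the second-harmonic ripple energy of a single-phase stage, E = P/omega_line, must fit within the capacitor's allowed voltage swing, giving roughly C = P / (omega_line * V_dc * dV_pp). The bus voltage and ripple allowance below are assumptions; only the 15 kW rating comes from the abstract.

```python
# Passive-capacitor sizing estimate for single-phase 2nd-harmonic ripple.
import math

P = 15e3          # W, rated power (matches the 15 kW prototype scale)
f_line = 60.0     # Hz
V_dc = 400.0      # V, nominal bus voltage (assumed)
dV = 20.0         # V, allowed peak-to-peak ripple (assumed, 5%)

omega = 2 * math.pi * f_line
E_ripple = P / omega                 # J, energy swing of the 2*f_line component
C = E_ripple / (V_dc * dV)           # F, required bulk capacitance
print(f"ripple energy = {E_ripple:.1f} J, required C = {C*1e6:.0f} uF")
```

With these numbers the estimate is on the order of 5,000 uF, which is the scale of electrolytic bank that active ripple energy storage aims to eliminate.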
Dust devil signatures in infrasound records of the International Monitoring System
NASA Astrophysics Data System (ADS)
Lorenz, Ralph D.; Christie, Douglas
2015-03-01
We explore whether dust devils have a recognizable signature in infrasound array records, since several Comprehensive Nuclear-Test-Ban Treaty verification stations conducting continuous measurements with microbarometers are in desert areas which see dust devils. The passage of dust devils (and other boundary layer vortices, whether dust laden or not) causes a local temporary drop in pressure: the high-pass time domain filtering in microbarometers results in a "heartbeat" signature, which we observe at the Warramunga station in Australia. We also observe a ~50 min pseudoperiodicity in the occurrence of these signatures and some higher-frequency infrasound. Dust devils do not significantly degrade the treaty verification capability. The pipe arrays for spatial averaging used in infrasound monitoring degrade the detection efficiency of small devils, but the long observation time may allow a useful census of large vortices, and thus, the high-sensitivity infrasonic array data from the monitoring network can be useful in studying columnar vortices in the lower atmosphere.
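As a hedged sketch of the mechanism described above, the code below passes a smooth Gaussian pressure dip through a high-pass filter, producing the biphasic "heartbeat" shape. The sample rate, dip depth and width, and corner frequency are all assumed, not instrument specifications.

```python
# Sketch: high-pass filtering turns a vortex pressure dip into a "heartbeat".
import numpy as np
from scipy.signal import butter, lfilter

fs = 20.0                                   # Hz, assumed sampling rate
t = np.arange(-60.0, 60.0, 1.0 / fs)        # s, time around closest approach
dip = -50.0 * np.exp(-(t / 10.0) ** 2)      # Pa, Gaussian pressure drop (assumed)

b, a = butter(2, 0.01 / (fs / 2), btype="high")   # ~0.01 Hz high-pass corner
heartbeat = lfilter(b, a, dip)              # causal, time-domain filtered trace
print("peak-to-peak response: %.1f Pa" % (heartbeat.max() - heartbeat.min()))
```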
Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application
NASA Technical Reports Server (NTRS)
Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond
2018-01-01
The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbo-fan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.
Engineering of the LISA Pathfinder mission—making the experiment a practical reality
NASA Astrophysics Data System (ADS)
Warren, Carl; Dunbar, Neil; Backler, Mike
2009-05-01
LISA Pathfinder represents a unique challenge in the development of scientific spacecraft—not only is the LISA Test Package (LTP) payload a complex integrated development, placing stringent requirements on its developers and the spacecraft, but the payload also acts as the core sensor and actuator for the spacecraft, making the tasks of control design, software development and system verification unusually difficult. The micro-propulsion system which provides the remaining actuation also presents substantial development and verification challenges. As the mission approaches the system critical design review, flight hardware is completing verification and the process of verification using software and hardware simulators and test benches is underway. Preparation for operations has started, but critical milestones for LTP and field effect electric propulsion (FEEP) lie ahead. This paper summarizes the status of the present development and outlines the key challenges that must be overcome on the way to launch.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matloch, L.; Vaccaro, S.; Couland, M.
The back end of the nuclear fuel cycle continues to develop. The European Commission, particularly the Nuclear Safeguards Directorate of the Directorate General for Energy, implements Euratom safeguards and needs to adapt to this situation. The verification methods for spent nuclear fuel, which EURATOM inspectors can use, require continuous improvement. Whereas the Euratom on-site laboratories provide accurate verification results for fuel undergoing reprocessing, the situation is different for spent fuel which is destined for final storage. In particular, new needs arise from the increasing number of cask loadings for interim dry storage and the advanced plans for the construction of encapsulation plants and geological repositories. Various scenarios present verification challenges. In this context, EURATOM Safeguards, often in cooperation with other stakeholders, is committed to further improvement of NDA methods for spent fuel verification. In this effort EURATOM plays various roles, ranging from definition of inspection needs to direct participation in development of measurement systems, including support of research in the framework of international agreements and via the EC Support Program to the IAEA. This paper presents recent progress in selected NDA methods. These methods have been conceived to satisfy different spent fuel verification needs, ranging from attribute testing to pin-level partial defect verification. (authors)
A Dynamic Attitude Measurement System Based on LINS
Li, Hanzhou; Pan, Quan; Wang, Xiaoxu; Zhang, Juanni; Li, Jiang; Jiang, Xiangjun
2014-01-01
A dynamic attitude measurement system (DAMS) is developed based on a laser inertial navigation system (LINS). Three factors in the dynamic attitude measurement error of LINS are analyzed: dynamic error, time synchronization and phase lag. An optimal coning-error compensation algorithm is used to reduce coning errors, and two-axis wobbling verification experiments are presented in the paper. The tests indicate that the attitude accuracy is improved 2-fold by the algorithm. To decrease coning errors further, the attitude updating frequency is increased from 200 Hz to 2000 Hz. At the same time, a novel finite impulse response (FIR) filter with three notches is designed to filter the dither frequency of the ring laser gyro (RLG). The comparison tests suggest that the new filter is five times more effective than the old one. The paper indicates that the phase-frequency characteristics of the FIR filter and the first-order holder of the navigation computer constitute the main sources of phase lag in LINS. A formula to calculate the LINS attitude phase lag is introduced, and expressions for the dynamic attitude errors induced by phase lag are derived. The paper proposes a novel synchronization mechanism that simultaneously solves the problems of dynamic test synchronization and phase compensation. A single-axis turntable and a laser interferometer are used to verify the synchronization mechanism. The experimental results show that the theoretically calculated values of phase lag and the attitude error induced by phase lag both match the test data well. The block diagram of DAMS and physical photos are presented in the paper. The final experiments demonstrate that the real-time attitude measurement accuracy of DAMS can reach 20″ (1σ) and the synchronization error is less than 0.2 ms under three-axis wobbling for 10 min. PMID:25177802
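A hedged sketch of a linear-phase FIR filter with three notches, designed by frequency sampling. The 2000 Hz rate matches the attitude update rate quoted above, but the three dither frequencies (here read as one line per gyro axis) and the notch width are assumptions, not the paper's values.

```python
# Sketch: three-notch linear-phase FIR design via frequency sampling.
import numpy as np
from scipy.signal import firwin2

fs, nyq = 2000.0, 1000.0
notches = [350.0, 400.0, 450.0]    # Hz, assumed RLG dither lines, one per axis
half_width = 10.0                  # Hz, assumed notch half-width

freq, gain = [0.0], [1.0]
for f0 in notches:                 # unity gain everywhere, zero at each notch
    freq += [(f0 - half_width) / nyq, f0 / nyq, (f0 + half_width) / nyq]
    gain += [1.0, 0.0, 1.0]
freq += [1.0]
gain += [1.0]

taps = firwin2(801, freq, gain)    # 801-tap FIR: flat passband, three notches
print(len(taps), "taps")
```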
Software development for airborne radar
NASA Astrophysics Data System (ADS)
Sundstrom, Ingvar G.
Some aspects of software development for a modern multimode airborne nose radar are described. First, an overview of where software is used in the radar units is presented. The development phases (system design, functional design, detailed design, function verification, and system verification) are then used as the starting point for the discussion. Methods, tools, and the most important documents are described. The importance of video flight recording in the early stages and the use of digital signal generators for performance verification is emphasized. Some future trends are discussed.
ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS
The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...
ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR
The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William
The purposes of the verification project are to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
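The core arithmetic of the convergence analysis mentioned above is the observed order of accuracy: if the error scales as C*h^p, two (error, spacing) pairs give p = log(e1/e2) / log(h1/h2). A minimal sketch, with illustrative numbers:

```python
# Observed order of accuracy from errors on two grid resolutions.
import math

def observed_order(err_coarse, err_fine, h_coarse, h_fine):
    """p such that err ~ C * h**p, estimated from two (error, spacing) pairs."""
    return math.log(err_coarse / err_fine) / math.log(h_coarse / h_fine)

# Illustrative numbers: halving h cuts the error ~4x => ~2nd-order scheme.
print(observed_order(1.0e-2, 2.6e-3, 0.1, 0.05))   # ~1.94
```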
NASA Astrophysics Data System (ADS)
Fu, Weihua; Dai, Jianrong; Hu, Yimin; Han, Dongsheng; Song, Yixin
2004-04-01
The treatment delivery time of intensity-modulated radiation therapy (IMRT) with a multileaf collimator (MLC) is generally longer than that of conventional radiotherapy. In theory, removing the flattening filter from the treatment head may reduce the beam-on time by enhancing the output dose rate, and thereby reduce the treatment delivery time; in practice, the required fluence distribution can be delivered by modulating the unflattened, non-uniform fluence distribution. However, the reduction in beam-on time may be offset by increases in leaf-travel time and/or verification-and-recording (V&R) time. Here we investigate the overall effect of the flattening filter on the treatment delivery time of IMRT with MLCs implemented in the step-and-shoot method, as well as with compensators, on six hybrid machines. We compared the treatment delivery time with and without the flattening filter for ten nasopharynx cases and ten prostate cases by observing how the ratios of beam-on time, segment number, leaf-travel time and treatment delivery time vary with dose rate, leaf speed and V&R time. The results show that, without the flattening filter, the beam-on time is reduced for both static MLC and compensator-based techniques; the number of segments and the leaf-travel time increase slightly for the static MLC technique; and the relative IMRT treatment delivery time decreases more with lower dose rate, higher leaf speed and shorter V&R overhead time. The absolute reduction in treatment delivery time depends on the fraction dose: it is not clinically significant at a fraction dose of 2 Gy, but becomes significant when the fraction dose is as high as that used for radiosurgery.
24 CFR 5.659 - Family information and verification.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements for...
This report is a generic verification protocol by which EPA’s Environmental Technology Verification program tests newly developed equipment for distributed generation of electric power, usually micro-turbine generators and internal combustion engine generators. The protocol will ...
24 CFR 5.659 - Family information and verification.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Family information and verification... Assisted Housing Serving Persons with Disabilities: Family Income and Family Payment; Occupancy... § 5.659 Family information and verification. (a) Applicability. This section states requirements for...
BAGHOUSE FILTRATION PRODUCTS VERIFICATION TESTING, HOW IT BENEFITS THE BOILER BAGHOUSE OPERATOR
The paper describes the Environmental Technology Verification (ETV) Program for baghouse filtration products developed by the Air Pollution Control Technology Verification Center, one of six Centers under the ETV Program, and discusses how it benefits boiler baghouse operators. A...
NASA Astrophysics Data System (ADS)
Bush, Craig R.
This dissertation presents a novel current source converter topology that is primarily intended for single-phase photovoltaic (PV) applications. In comparison with existing PV inverter technology, the salient features of the proposed topology are: a) the low-frequency (double the line frequency) ripple that is common to single-phase inverters is greatly reduced; b) the absence of low-frequency ripple enables significantly smaller passive components to achieve the necessary DC-link stiffness; and c) improved maximum power point tracking (MPPT) performance is readily achieved, due to the tightened current ripple, even with reduced-size passive components. The proposed topology does not use any electrolytic capacitors: an inductor is used as the DC-link filter, and reliable AC film capacitors are used for the filter and auxiliary capacitor, giving the topology a life expectancy on par with PV panels. The proposed modulation technique can be used for any current source inverter where unbalanced three-phase operation is desired, such as active filters and power controllers. The proposed topology is also ready for the next phase of microgrid and power system controllers in that it accepts reactive power commands. This work presents the proposed topology and its working principle, supported by numerical verification and hardware results. Conclusions and future work are also presented.
PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Frederick, J. M.
2016-12-01
In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification establishes whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describe four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References: Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
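A minimal sketch of the kind of analytical-solution benchmark described above: a 1-D solute diffusion profile checked against the closed-form erfc solution for a constant-concentration boundary, with a pass/fail tolerance. The grid, diffusivity, and tolerance are illustrative, not PFLOTRAN's.

```python
# QA-style check: numerical profile vs. closed-form erfc diffusion solution.
import numpy as np
from scipy.special import erfc

def analytical(x, t, D, c0=1.0):
    """c(x,t) for c(0,t)=c0 on a semi-infinite domain with c(x,0)=0."""
    return c0 * erfc(x / (2.0 * np.sqrt(D * t)))

def check(c_numerical, x, t, D, tol=1e-3):
    err = np.max(np.abs(c_numerical - analytical(x, t, D)))
    assert err < tol, f"benchmark FAILED: max abs error {err:.2e} >= {tol}"
    return err

x = np.linspace(0.0, 1.0, 101)
c_sim = analytical(x, t=3600.0, D=1e-6) + 1e-4   # stand-in for simulator output
print("max abs error:", check(c_sim, x, 3600.0, 1e-6))
```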
Wang, Baofeng; Qi, Zhiquan; Chen, Sizhong; Liu, Zhaodu; Ma, Guocheng
2017-01-01
Vision-based vehicle detection is an important issue for advanced driver assistance systems. In this paper, we present an improved multi-vehicle detection and tracking method using cascade Adaboost and an Adaptive Kalman filter (AKF) with target identity awareness. A cascade Adaboost classifier using Haar-like features was built for vehicle detection, followed by a more comprehensive verification process which refines the vehicle hypothesis in terms of both location and dimension. In vehicle tracking, each vehicle is tracked with an independent identity by an Adaptive Kalman filter in collaboration with a data association approach. The AKF adaptively adjusts the measurement and process noise covariance through on-line stochastic modelling to compensate for dynamics changes. The data association assigns detections to tracks using the global nearest neighbour (GNN) algorithm while considering local validation. During tracking, a temporal-context-based track management was proposed to decide whether to initiate, maintain or terminate the tracks of different objects, suppressing sparse false alarms and compensating for temporary detection failures. Finally, the proposed method was tested on various challenging real roads, and the experimental results showed that the vehicle detection performance was greatly improved, with higher accuracy and robustness.
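A hedged sketch of the adaptive part of an AKF track: a constant-velocity Kalman filter whose measurement-noise covariance R is re-estimated online from the recent innovation sequence. The model dimensions, adaptation window, and noise levels are illustrative, not the paper's.

```python
# Sketch: innovation-based adaptive Kalman filter for a 1-D position track.
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity state transition
H = np.array([[1.0, 0.0]])              # position-only measurement
Q = 1e-3 * np.eye(2)

def akf_step(z, x, P, R, innov, win=20):
    x = F @ x                           # predict
    P = F @ P @ F.T + Q
    nu = float(z - (H @ x)[0])          # innovation
    innov.append(nu)
    if len(innov) >= win:               # on-line stochastic modelling of R
        r_hat = np.var(innov[-win:]) - float(H @ P @ H.T)
        R = np.array([[max(r_hat, 1e-6)]])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)      # update
    x = x + K[:, 0] * nu
    P = (np.eye(2) - K @ H) @ P
    return x, P, R

x, P, R, innov = np.zeros(2), np.eye(2), np.array([[1.0]]), []
for k in range(100):                    # toy track: unit velocity + noise
    z = k * dt + np.random.normal(0, 0.5)
    x, P, R = akf_step(z, x, P, R, innov)
print("adapted R:", R[0, 0])
```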
Assessment of female breast dose for thoracic cone-beam CT using MOSFET dosimeters
Qiu, Bo; Liang, Jian; Xie, Weihao; Deng, Xiaowu; Qi, Zhenyu
2017-01-01
Objective: To assess the breast dose during a routine thoracic cone-beam CT (CBCT) check, with the aim of exploring possible dose reduction strategies. Materials and Methods: Metal oxide semiconductor field-effect transistor (MOSFET) dosimeters were used to measure breast surface doses during a thorax kV CBCT scan in an anthropomorphic phantom. Breast doses for different scanning protocols and breast sizes were compared. Dose reduction was attempted by using a partial-arc CBCT scan with a bowtie filter. The impact of this dose reduction strategy on image registration accuracy was investigated. Results: The average breast surface doses were 20.02 mGy and 11.65 mGy for thoracic CBCT without and with filtration, respectively, a dose reduction of 41.8% by use of the bowtie filter. It was found that 220° partial-arc scanning significantly reduced the dose to the contralateral breast (44.4% lower than the ipsilateral breast), while the image registration accuracy was not compromised. Conclusions: Breast dose reduction can be achieved by using an ipsilateral 220° partial-arc scan with a bowtie filter. This strategy also provides sufficient image quality for thorax image registration in daily patient positioning verification. PMID:28423624
Active Mirror Predictive and Requirements Verification Software (AMP-ReVS)
NASA Technical Reports Server (NTRS)
Basinger, Scott A.
2012-01-01
This software is designed to predict large active mirror performance at various stages in the fabrication lifecycle of the mirror. It was developed for 1-meter class powered mirrors for astronomical purposes, but is extensible to other geometries. The package accepts finite element model (FEM) inputs and laboratory measured data for large optical-quality mirrors with active figure control. It computes phenomenological contributions to the surface figure error using several built-in optimization techniques. These phenomena include stresses induced in the mirror by the manufacturing process and the support structure, the test procedure, high spatial frequency errors introduced by the polishing process, and other process-dependent deleterious effects due to light-weighting of the mirror. Then, depending on the maturity of the mirror, it either predicts the best surface figure error that the mirror will attain, or it verifies that the requirements for the error sources have been met once the best surface figure error has been measured. The unique feature of this software is that it ties together physical phenomenology with wavefront sensing and control techniques and various optimization methods including convex optimization, Kalman filtering, and quadratic programming to both generate predictive models and to do requirements verification. This software combines three distinct disciplines: wavefront control, predictive models based on FEM, and requirements verification using measured data in a robust, reusable code that is applicable to any large optics for ground and space telescopes. The software also includes state-of-the-art wavefront control algorithms that allow closed-loop performance to be computed. It allows for quantitative trade studies to be performed for optical systems engineering, including computing the best surface figure error under various testing and operating conditions. After the mirror manufacturing process and testing have been completed, the software package can be used to verify that the underlying requirements have been met.
Adaptive correction of ensemble forecasts
NASA Astrophysics Data System (ADS)
Pelosi, Anna; Battista Chirico, Giovanni; Van den Bergh, Joris; Vannitsem, Stephane
2017-04-01
Forecasts from numerical weather prediction (NWP) models often suffer from both systematic and non-systematic errors. These are present in both deterministic and ensemble forecasts, and originate from various sources such as model error and subgrid variability. Statistical post-processing techniques can partly remove such errors, which is particularly important when NWP outputs concerning surface weather variables are employed for site-specific applications. Many different post-processing techniques have been developed. For deterministic forecasts, adaptive methods such as the Kalman filter are often used, which sequentially post-process the forecasts by continuously updating the correction parameters as new ground observations become available. These methods are especially valuable when long training data sets do not exist. For ensemble forecasts, well-known techniques are ensemble model output statistics (EMOS) and so-called "member-by-member" approaches (MBM). Here, we introduce a new adaptive post-processing technique for ensemble predictions. The proposed method is a sequential Kalman filtering technique that fully exploits the information content of the ensemble. One correction equation is retrieved and applied to all members; however, the parameters of the regression equations are retrieved by exploiting the second-order statistics of the forecast ensemble. We compare our new method with two other techniques: a simple method that makes use of a running bias correction of the ensemble mean, and an MBM post-processing approach that rescales the ensemble mean and spread, based on minimization of the Continuous Ranked Probability Score (CRPS). We perform a verification study for the region of Campania in southern Italy. We use two years (2014-2015) of daily meteorological observations of 2-meter temperature and 10-meter wind speed from 18 ground-based automatic weather stations distributed across the region, comparing them with the corresponding COSMO-LEPS ensemble forecasts. Deterministic verification scores (e.g., mean absolute error, bias) and probabilistic scores (e.g., CRPS) are used to evaluate the post-processing techniques. We conclude that the new adaptive method outperforms the simpler running bias correction. The proposed adaptive method often outperforms the MBM method in removing bias. The MBM method has the advantage of correcting the ensemble spread, although it needs more training data.
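A minimal sketch of the simpler baseline referenced above: a recursive (exponentially weighted) bias correction of the ensemble mean. The paper's full method additionally feeds ensemble second-order statistics into a Kalman-type update, which is not reproduced here; the smoothing factor and toy numbers are assumptions.

```python
# Running bias correction of an ensemble-mean forecast (baseline method).
def running_bias(prev_bias, forecast_mean, obs, alpha=0.1):
    """Update bias estimate: b <- (1 - alpha)*b + alpha*(forecast - obs)."""
    return (1 - alpha) * prev_bias + alpha * (forecast_mean - obs)

bias = 0.0
history = [(12.3, 11.1), (13.0, 12.2), (9.8, 9.1)]   # toy (forecast, obs) pairs
for fc, ob in history:
    bias = running_bias(bias, fc, ob)

next_forecast = 14.0
corrected = next_forecast - bias      # subtract the learned bias
print(f"bias = {bias:.3f}, corrected forecast = {corrected:.2f}")
```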
The Environmental Technology Verification (ETV) Program, established by the U.S. EPA, is designed to accelerate the development and commercialization of new or improved technologies through third-party verification and reporting of performance. The Air Pollution Control Technolog...
The U.S. EPA's Office of Research and Development operates the Environmental Technology Verification (ETV) program to facilitate the deployment of innovative technologies through performance verification and information dissemination. Congress funds ETV in response to the belief ...
24 CFR 4001.112 - Income verification.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements: (a...
NASA Astrophysics Data System (ADS)
Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan
2018-02-01
The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPSs) were commissioned in the Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations and also calculated by the VA, and point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
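A simplified sketch of a TG-114-style independent MU check for a non-IMRT SAD field: MU = D / (D0 * Sc * Sp * TPR * WF * TF). All factor values below are illustrative; a clinical implementation, like the Excel-based VA above, interpolates them from commissioned beam-data tables.

```python
# Simplified TG-114-style monitor unit check (illustrative factors only).
def mu_sad(dose_cgy, D0=1.0, Sc=1.0, Sp=1.0, TPR=1.0, WF=1.0, TF=1.0):
    """dose_cgy: prescribed point dose; D0: cGy/MU at reference conditions;
    Sc/Sp: collimator/phantom scatter; TPR: tissue-phantom ratio;
    WF/TF: wedge and tray factors."""
    return dose_cgy / (D0 * Sc * Sp * TPR * WF * TF)

# Example: 200 cGy with slightly off-reference factors (all assumed values).
print(round(mu_sad(200.0, D0=1.0, Sc=0.99, Sp=1.01, TPR=0.85), 1))  # ~235.3 MU
```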
Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems
NASA Technical Reports Server (NTRS)
Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)
2003-01-01
Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.
Security Verification Techniques Applied to PatchLink COTS Software
NASA Technical Reports Server (NTRS)
Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer
2006-01-01
Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.
NASA Technical Reports Server (NTRS)
Armstrong, Jeffrey B.; Simon, Donald L.
2012-01-01
Self-tuning aircraft engine models can be applied for control and health management applications. The self-tuning feature of these models minimizes the mismatch between any given engine and the underlying engineering model describing an engine family. This paper provides details of the construction of a self-tuning engine model centered on a piecewise linear Kalman filter design. Starting from a nonlinear transient aerothermal model, a piecewise linear representation is first extracted. The linearization procedure creates a database of trim vectors and state-space matrices that are subsequently scheduled for interpolation based on engine operating point. A series of steady-state Kalman gains can next be constructed from a reduced-order form of the piecewise linear model. Reduction of the piecewise linear model to an observable dimension with respect to available sensed engine measurements can be achieved using either a subset or an optimal linear combination of "health" parameters, which describe engine performance. The resulting piecewise linear Kalman filter is then implemented for faster-than-real-time processing of sensed engine measurements, generating outputs appropriate for trending engine performance, estimating both measured and unmeasured parameters for control purposes, and performing on-board gas-path fault diagnostics. Computational efficiency is achieved by designing multidimensional interpolation algorithms that exploit the shared scheduling of multiple trim vectors and system matrices. An example application illustrates the accuracy of a self-tuning piecewise linear Kalman filter model when applied to a nonlinear turbofan engine simulation. Additional discussions focus on the issue of transient response accuracy and the advantages of a piecewise linear Kalman filter in the context of validation and verification. The techniques described provide a framework for constructing efficient self-tuning aircraft engine models from complex nonlinear simulations.
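A hedged sketch of the scheduling idea described above: trim vectors and Kalman gains stored at operating-point breakpoints and linearly interpolated at run time. A 1-D schedule with toy values is shown; the paper's model interpolates multidimensionally over the engine operating point.

```python
# Sketch: piecewise-linear scheduling of trim vectors and Kalman gains.
import numpy as np

breakpoints = np.array([0.2, 0.5, 0.8])       # e.g., normalized power level
trim = np.array([[1.0, 2.0],                  # stored trim vectors (toy values)
                 [1.5, 2.5],
                 [2.2, 3.1]])
gains = np.stack([np.eye(2) * k for k in (0.1, 0.2, 0.3)])  # scheduled K matrices

def schedule(op):
    """Interpolate the trim vector and Kalman gain at operating point `op`."""
    i = int(np.clip(np.searchsorted(breakpoints, op) - 1, 0, len(breakpoints) - 2))
    w = (op - breakpoints[i]) / (breakpoints[i + 1] - breakpoints[i])
    w = float(np.clip(w, 0.0, 1.0))           # hold ends beyond the table
    return ((1 - w) * trim[i] + w * trim[i + 1],
            (1 - w) * gains[i] + w * gains[i + 1])

x_trim, K = schedule(0.65)
print(x_trim, K, sep="\n")
```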
24 CFR 5.216 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Social Security and Employer Identification Numbers. 5.216 Section 5.216 Housing and Urban Development...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Disclosure and Verification of Social Security Numbers and...
24 CFR 5.216 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Social Security and Employer Identification Numbers. 5.216 Section 5.216 Housing and Urban Development...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Disclosure and Verification of Social Security Numbers and...
24 CFR 5.216 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Social Security and Employer Identification Numbers. 5.216 Section 5.216 Housing and Urban Development...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Disclosure and Verification of Social Security Numbers and...
24 CFR 5.216 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Social Security and Employer Identification Numbers. 5.216 Section 5.216 Housing and Urban Development...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Disclosure and Verification of Social Security Numbers and...
24 CFR 5.216 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Social Security and Employer Identification Numbers. 5.216 Section 5.216 Housing and Urban Development...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Disclosure and Verification of Social Security Numbers and...
The Verification-based Analysis of Reliable Multicast Protocol
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1996-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
Triple-Quantum Filtered NMR Imaging of Sodium -23 in the Human Brain
NASA Astrophysics Data System (ADS)
Keltner, John Robinson
In the past multiple-quantum filtered imaging of biexponential relaxation sodium-23 nuclei in the human brain has been limited by low signal to noise ratios; this thesis demonstrates that such imaging is feasible when using a modified gradient-selected triple-quantum filter at a repetition time which maximizes the signal to noise ratio. Nuclear magnetic resonance imaging of biexponential relaxation sodium-23 (^{23}Na) nuclei in the human brain may be useful for detecting ischemia, cancer, and pathophysiology related to manic-depression. Multiple -quantum filters may be used to selectively image biexponential relaxation ^{23}Na signals since these filters suppress single-exponential relaxation ^{23}Na signals. In this thesis, the typical repetition times (200 -300 ms) used for in vivo multiple-quantum filtered ^{23}Na experiments are shown to be approximately 5 times greater than the optimal repetition time which maximizes multiple-quantum filtered SNR. Calculations and experimental verification show that the gradient-selected triple-quantum (GS3Q) filtered SNR for ^ {23}Na in a 4% agarose gel increases by a factor of two as the repetition time decreases from 300 ms to 55 ms. It is observed that a simple reduction of repetition time also increases spurious single-quantum signals from GS3Q filtered experiments. Irreducible superoperator calculations have been used to design a modified GS3Q filter which more effectively suppresses the spurious single-quantum signals. The modified GS3Q filter includes a preparatory crusher gradient and two-step-phase cycling. Using the modified GS3Q filter and a repetition time of 70 ms, a three dimensional triple-quantum filtered image of a phantom modelling ^{23} Na in the brain was obtained. The phantom consisted of two 4 cm diameter spheres inside of a 8.5 cm x 7 cm ellipsoid. The two spheres contained 0.012 and 0.024 M ^{23}Na in 4% agarose gel. Surrounding the spheres and inside the ellipsoid was 0.03 M aqueous ^{23}Na. The image dimensions were 16 x 16 x 16 voxels with the dimension of a voxel being 1.5 x 1.5 x 1.5 cm^3. The signal to noise ratio for the GS3Q filtered ^ {23}Na signal from the 0.012 and 0.024 M ^{23}Na spheres was 17 and 30 for a 54 minute experiment at 2.35 T. (Abstract shortened by UMI.).
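An illustrative calculation of the TR trade-off behind the SNR-per-time optimum quoted above: with longitudinal recovery proportional to (1 - exp(-TR/T1)) and noise averaging proportional to sqrt(TR), SNR per unit scan time peaks at short TR. This is a simplified single-exponential model, not the thesis's full multiple-quantum treatment, and the sodium T1 value is an assumption.

```python
# Simplified SNR-per-unit-time optimum over repetition time TR.
import numpy as np

T1 = 0.040                                  # s, assumed 23Na longitudinal time
TR = np.linspace(0.02, 0.3, 500)            # s, candidate repetition times
snr_per_time = (1 - np.exp(-TR / T1)) / np.sqrt(TR)
print(f"optimal TR ~ {TR[np.argmax(snr_per_time)]*1000:.0f} ms")   # tens of ms
```

Under these assumptions the optimum falls near 50 ms, consistent with the short-TR regime (55-70 ms) used in the experiments above rather than the 200-300 ms typical of earlier in vivo work.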
24 CFR 203.35 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Social Security and Employer Identification Numbers. 203.35 Section 203.35 Housing and Urban Development... Requirements and Underwriting Procedures Eligible Mortgagors § 203.35 Disclosure and verification of Social... mortgagor must meet the requirements for the disclosure and verification of Social Security and Employer...
24 CFR 206.40 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Social Security and Employer Identification Numbers. 206.40 Section 206.40 Housing and Urban Development... Eligibility; Endorsement Eligible Mortgagors § 206.40 Disclosure and verification of Social Security and... verification of Social Security and Employer Identification Numbers, as provided by part 200, subpart U, of...
24 CFR 201.6 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Social Security and Employer Identification Numbers. 201.6 Section 201.6 Housing and Urban Development... HOME LOANS General § 201.6 Disclosure and verification of Social Security and Employer Identification... the disclosure and verification of Social Security and Employer Identification Numbers, as provided by...
The U.S. EPA's Office of Research and Development operates the Environmental Technology Verification (ETV) program to facilitate the deployment of innovative technologies through performance verification and information dissemination. Congress funds ETV in response to the belief ...
FORMED: Bringing Formal Methods to the Engineering Desktop
2016-02-01
integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling... input-output contract satisfaction and absence of null pointer dereferences. Subject terms: Formal Methods, Software Verification, Model-Based... Domain-specific languages (DSLs) drive both implementation and formal verification
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory..., ``Verification, Validation, Reviews, and Audits for Digital Computer Software used in Safety Systems of Nuclear... NRC regulations promoting the development of, and compliance with, software verification and...
Joint ETV/NOWATECH test plan for the Sorbisense GSW40 passive sampler
The joint test plan is the implementation of a test design developed for verification of the performance of an environmental technology following the NOWATECH ETV method. The verification is a joint verification with the US EPA ETV scheme and the Advanced Monitoring Systems Cent...
24 CFR 206.40 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Social Security and Employer Identification Numbers. 206.40 Section 206.40 Housing and Urban Development... Eligibility; Endorsement Eligible Mortgagors § 206.40 Disclosure and verification of Social Security and... verification of Social Security and Employer Identification Numbers, as provided by part 200, subpart U, of...
24 CFR 203.35 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Social Security and Employer Identification Numbers. 203.35 Section 203.35 Housing and Urban Development... Requirements and Underwriting Procedures Eligible Mortgagors § 203.35 Disclosure and verification of Social... mortgagor must meet the requirements for the disclosure and verification of Social Security and Employer...
24 CFR 206.40 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Social Security and Employer Identification Numbers. 206.40 Section 206.40 Housing and Urban Development... Eligibility; Endorsement Eligible Mortgagors § 206.40 Disclosure and verification of Social Security and... verification of Social Security and Employer Identification Numbers, as provided by part 200, subpart U, of...
24 CFR 201.6 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Social Security and Employer Identification Numbers. 201.6 Section 201.6 Housing and Urban Development... HOME LOANS General § 201.6 Disclosure and verification of Social Security and Employer Identification... the disclosure and verification of Social Security and Employer Identification Numbers, as provided by...
24 CFR 206.40 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Social Security and Employer Identification Numbers. 206.40 Section 206.40 Housing and Urban Development... Eligibility; Endorsement Eligible Mortgagors § 206.40 Disclosure and verification of Social Security and... verification of Social Security and Employer Identification Numbers, as provided by part 200, subpart U, of...
24 CFR 203.35 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Social Security and Employer Identification Numbers. 203.35 Section 203.35 Housing and Urban Development... Requirements and Underwriting Procedures Eligible Mortgagors § 203.35 Disclosure and verification of Social... mortgagor must meet the requirements for the disclosure and verification of Social Security and Employer...
24 CFR 203.35 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Social Security and Employer Identification Numbers. 203.35 Section 203.35 Housing and Urban Development... Requirements and Underwriting Procedures Eligible Mortgagors § 203.35 Disclosure and verification of Social... mortgagor must meet the requirements for the disclosure and verification of Social Security and Employer...
24 CFR 201.6 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Social Security and Employer Identification Numbers. 201.6 Section 201.6 Housing and Urban Development... HOME LOANS General § 201.6 Disclosure and verification of Social Security and Employer Identification... the disclosure and verification of Social Security and Employer Identification Numbers, as provided by...
24 CFR 201.6 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Social Security and Employer Identification Numbers. 201.6 Section 201.6 Housing and Urban Development... HOME LOANS General § 201.6 Disclosure and verification of Social Security and Employer Identification... the disclosure and verification of Social Security and Employer Identification Numbers, as provided by...
24 CFR 206.40 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Social Security and Employer Identification Numbers. 206.40 Section 206.40 Housing and Urban Development... Eligibility; Endorsement Eligible Mortgagors § 206.40 Disclosure and verification of Social Security and... verification of Social Security and Employer Identification Numbers, as provided by part 200, subpart U, of...
24 CFR 203.35 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Social Security and Employer Identification Numbers. 203.35 Section 203.35 Housing and Urban Development... Requirements and Underwriting Procedures Eligible Mortgagors § 203.35 Disclosure and verification of Social... mortgagor must meet the requirements for the disclosure and verification of Social Security and Employer...
24 CFR 201.6 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Social Security and Employer Identification Numbers. 201.6 Section 201.6 Housing and Urban Development... HOME LOANS General § 201.6 Disclosure and verification of Social Security and Employer Identification... the disclosure and verification of Social Security and Employer Identification Numbers, as provided by...
NASA Astrophysics Data System (ADS)
Williams, John L.; Maxwell, Reed M.; Monache, Luca Delle
2013-12-01
Wind power is rapidly gaining prominence as a major source of renewable energy. Harnessing this promising energy source is challenging because of the chaotic and inherently intermittent nature of wind. Accurate forecasting tools are critical to support the integration of wind energy into power grids and to maximize its impact on renewable energy portfolios. We have adapted the Data Assimilation Research Testbed (DART), a community software facility which includes the ensemble Kalman filter (EnKF) algorithm, to expand our capability to use observational data to improve forecasts produced with a fully coupled hydrologic and atmospheric modeling system: the ParFlow (PF) hydrologic model and the Weather Research and Forecasting (WRF) mesoscale atmospheric model, coupled via mass and energy fluxes across the land surface to form the PF.WRF model. Numerous studies have shown that soil moisture distribution and land surface vegetative processes profoundly influence atmospheric boundary layer development and weather processes on local and regional scales. We have used the PF.WRF model to explore the connections between the land surface and the atmosphere in terms of land surface energy flux partitioning and coupled variable fields including hydraulic conductivity, soil moisture, and wind speed, and demonstrated that reductions in uncertainty in these coupled fields realized through assimilation of soil moisture observations propagate through the hydrologic and atmospheric system. The sensitivities found in this study will enable further studies to optimize observation strategies to maximize the utility of the PF.WRF-DART forecasting system.
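A minimal sketch of the stochastic (perturbed-observations) EnKF analysis step at the core of DART-like systems, assuming a linear observation operator and uncorrelated observation errors; the function name and interfaces are illustrative, not DART's actual API.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_error_var, H):
    """Stochastic EnKF analysis step (illustrative sketch).

    ensemble      : (n_state, n_members) array of forecast states
    obs           : (n_obs,) observation vector
    obs_error_var : scalar observation error variance (assumed uncorrelated)
    H             : (n_obs, n_state) linear observation operator
    """
    n_members = ensemble.shape[1]
    # Ensemble perturbations about the forecast mean
    A = ensemble - ensemble.mean(axis=1, keepdims=True)
    # Sample covariances in observation space
    HA = H @ A
    P_hh = (HA @ HA.T) / (n_members - 1) + obs_error_var * np.eye(len(obs))
    P_xh = (A @ HA.T) / (n_members - 1)
    K = P_xh @ np.linalg.inv(P_hh)          # Kalman gain
    # Perturb observations so the analysis spread stays statistically consistent
    rng = np.random.default_rng(0)
    obs_pert = obs[:, None] + rng.normal(0.0, np.sqrt(obs_error_var),
                                         size=(len(obs), n_members))
    # Nudge each member toward its perturbed observation
    return ensemble + K @ (obs_pert - H @ ensemble)
```

In a PF.WRF-DART setting, the state vector would hold fields such as soil moisture, and H would map the state to the assimilated soil moisture observations.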
Jeffrey, N.; Abdalla, F. B.; Lahav, O.; ...
2018-05-15
Mapping the underlying density field, including non-visible dark matter, using weak gravitational lensing measurements is now a standard tool in cosmology. Due to its importance to the science results of current and upcoming surveys, the quality of the convergence reconstruction methods should be well understood. We compare three different mass map reconstruction methods: Kaiser-Squires (KS), Wiener filter, and GLIMPSE. KS is a direct inversion method, taking no account of survey masks or noise. The Wiener filter is well motivated for Gaussian density fields in a Bayesian framework. The GLIMPSE method uses sparsity, with the aim of reconstructing non-linearities in the density field. We compare these methods with a series of tests on the public Dark Energy Survey (DES) Science Verification (SV) data and on realistic DES simulations. The Wiener filter and GLIMPSE methods offer substantial improvement on the standard smoothed KS with a range of metrics. For both the Wiener filter and GLIMPSE convergence reconstructions we present a 12% improvement in Pearson correlation with the underlying truth from simulations. To compare the mapping methods' abilities to find mass peaks, we measure the difference between peak counts from simulated ΛCDM shear catalogues and catalogues with no mass fluctuations. This is a standard data vector when inferring cosmology from peak statistics. The maximum signal-to-noise value of these peak statistic data vectors was increased by a factor of 3.5 for the Wiener filter and by a factor of 9 using GLIMPSE. With simulations we measure the reconstruction of the harmonic phases, showing that the concentration of the phase residuals is improved 17% by GLIMPSE and 18% by the Wiener filter. We show that the correlation between the reconstructions from data and the foreground redMaPPer clusters is increased 18% by the Wiener filter and 32% by GLIMPSE. [Abridged]
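For reference, the KS direct inversion mentioned above has a compact closed form in Fourier space. Below is a minimal flat-sky sketch in Python/NumPy under periodic-grid assumptions; the function name and grid conventions are ours, and, exactly as the abstract notes, it ignores masks and noise.

```python
import numpy as np

def kaiser_squires(gamma1, gamma2):
    """Flat-sky Kaiser-Squires inversion of shear maps to convergence.

    gamma1, gamma2 : 2-D arrays of the two shear components on a regular grid.
    Returns the E-mode convergence map (no treatment of masks or noise).
    """
    ny, nx = gamma1.shape
    l1 = np.fft.fftfreq(nx)[None, :]
    l2 = np.fft.fftfreq(ny)[:, None]
    l_sq = l1**2 + l2**2
    l_sq[0, 0] = 1.0                      # avoid division by zero at l = 0
    g1_hat = np.fft.fft2(gamma1)
    g2_hat = np.fft.fft2(gamma2)
    kappa_hat = ((l1**2 - l2**2) * g1_hat + 2.0 * l1 * l2 * g2_hat) / l_sq
    kappa_hat[0, 0] = 0.0                 # mean convergence is unconstrained
    return np.real(np.fft.ifft2(kappa_hat))
```

The Wiener filter and GLIMPSE effectively replace this unregularized division by l_sq with noise-aware or sparsity-regularized inversions, which is where their improvements originate.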
Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, S R; Bihari, B L; Salari, K
As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.
Implementation and verification of global optimization benchmark problems
NASA Astrophysics Data System (ADS)
Posypkin, Mikhail; Usov, Alexander
2017-12-01
The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates, from a single description, the generation of the value of a function and its gradient at a given point, and of interval estimates of the function and its gradient on a given box. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification showed that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
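One way such an automatic verification can work is to cross-check each benchmark's stated global minimum against an independent numerical search. The sketch below uses SciPy's differential evolution rather than the authors' C++ library; the helper name, tolerance, and test function are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

def check_benchmark(f, bounds, stated_min, tol=1e-6):
    """Cross-check a benchmark's stated global minimum by independent search.

    f          : objective function taking a 1-D numpy array
    bounds     : list of (low, high) box constraints
    stated_min : global minimum value claimed in the benchmark description
    """
    result = differential_evolution(f, bounds, seed=1, tol=1e-12)
    # A found value clearly below the stated minimum flags a description error
    return result.fun >= stated_min - tol, result.fun

# Example: the six-hump camel function, stated minimum approx. -1.0316
camel = lambda x: ((4 - 2.1*x[0]**2 + x[0]**4/3)*x[0]**2
                   + x[0]*x[1] + (-4 + 4*x[1]**2)*x[1]**2)
ok, found = check_benchmark(camel, [(-3, 3), (-2, 2)], -1.0316285)
print(ok, found)
```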
Aqueous cleaning and verification processes for precision cleaning of small parts
NASA Technical Reports Server (NTRS)
Allen, Gale J.; Fishell, Kenneth A.
1995-01-01
The NASA Kennedy Space Center (KSC) Materials Science Laboratory (MSL) has developed a totally aqueous process for precision cleaning and verification of small components. In 1990 the Precision Cleaning Facility at KSC used approximately 228,000 kg (500,000 lbs) of chlorofluorocarbon (CFC) 113 in its cleaning operations. It is estimated that current CFC 113 usage has been reduced by 75 percent, and it is projected that a 90 percent reduction will be achieved by the end of calendar year 1994. The cleaning process developed utilizes aqueous degreasers, aqueous surfactants, and ultrasonics in the cleaning operation, and an aqueous surfactant, ultrasonics, and a Total Organic Carbon Analyzer (TOCA) in the nonvolatile residue (NVR) and particulate analysis for verification of cleanliness. The cleaning and verification process is presented in its entirety, with comparison to the CFC 113 cleaning and verification process, including economic and labor costs/savings.
Verification of a level-3 diesel emissions control strategy for transport refrigeration units
NASA Astrophysics Data System (ADS)
Shewalla, Umesh
Transport Refrigeration Units (TRUs) are refrigeration systems used to control the environment of temperature sensitive products while they are being transported from one place to another in trucks, trailers or shipping containers. The TRUs typically use an internal combustion engine to power the compressor of the refrigeration unit. In the United States TRUs are most commonly powered by diesel engines which vary from 9 to 40 horsepower. TRUs are capable of both heating and cooling. The TRU engines are relatively small, inexpensive and do not use emissions reduction techniques such as exhaust gas recirculation (EGR). A significant number of these engines operate in highly populated areas like distribution centers, truck stops, and other facilities, which makes them a potential cause of health risks to the people who live and work nearby. Diesel particulate matter (PM) is known for its adverse effects on both human beings and the environment. Considering these effects, regulatory bodies have imposed limitations on the PM emissions from a TRU engine. The objective of this study was to measure and analyze the regulated emissions from a TRU engine under both engine-out and particulate filter system-out conditions during a pre-durability test (when the filter system was new) and a post-durability test (after the filter system was subjected to a 1000 hour in-field trial). The verification program was performed by the Center for Alternative Fuel, Engines and Emissions (CAFEE) at West Virginia University (WVU). In this program, a catalyzed silicon carbide (SiC) diesel particulate filter (DPF) was evaluated and verified as a Level-3 Verified Diesel Emissions Control Strategy (VDECS) (>= 85% PM reduction) under California Air Resources Board (CARB) regulations 2702 [1]. The emissions results showed that the filter system reduced diesel PM by 96 +/- 1 percent over the ISO 8178-C1 [2] cycle and 92 +/- 5 percent over the EPA TRU [3] cycle, qualifying as a Level 3 VDECS. The emission reductions in hydrocarbons (HC) and carbon monoxide (CO) were 76.8 +/- 4.8 and 72.2 +/- 5.2 percent, respectively, over both the ISO 8178-C1 [2] and EPA TRU [3] cycles. It was also observed that there were 3.6 +/- 2.9 and 7.2 +/- 3.1 percent reductions in oxides of nitrogen (NOx) and nitric oxide (NO), respectively, with a slight increase in fuel consumption and carbon dioxide as a consequence of increased exhaust back pressure. The CARB regulations require that the diesel emissions control strategy must not increase emissions of NO2 by more than 20% by mass over the baseline value. In this study, the total increase in NO2 level was observed to be 5.6 +/- 2.6 percent, well within the limit specified by the CARB.
Formal verification of mathematical software
NASA Technical Reports Server (NTRS)
Sutherland, D.
1984-01-01
Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.
24 CFR 242.68 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Social Security and Employer Identification Numbers. 242.68 Section 242.68 Housing and Urban Development... Requirements § 242.68 Disclosure and verification of Social Security and Employer Identification Numbers. The requirements set forth in 24 CFR part 5, regarding the disclosure and verification of Social Security Numbers...
Guidelines for qualifying cleaning and verification materials
NASA Technical Reports Server (NTRS)
Webb, D.
1995-01-01
This document is intended to provide guidance in identifying technical issues which must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down-selecting a cleaning/verification medium.
This protocol was developed under the Environmental Protection Agency's Environmental Technology Verification (ETV) Program, and is intended to be used as a guide in preparing laboratory test plans for the purpose of verifying the performance of grouting materials used for infra...
24 CFR 242.68 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Social Security and Employer Identification Numbers. 242.68 Section 242.68 Housing and Urban Development... Requirements § 242.68 Disclosure and verification of Social Security and Employer Identification Numbers. The requirements set forth in 24 CFR part 5, regarding the disclosure and verification of Social Security Numbers...
24 CFR 242.68 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Social Security and Employer Identification Numbers. 242.68 Section 242.68 Housing and Urban Development... Requirements § 242.68 Disclosure and verification of Social Security and Employer Identification Numbers. The requirements set forth in 24 CFR part 5, regarding the disclosure and verification of Social Security Numbers...
24 CFR 242.68 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Social Security and Employer Identification Numbers. 242.68 Section 242.68 Housing and Urban Development... Requirements § 242.68 Disclosure and verification of Social Security and Employer Identification Numbers. The requirements set forth in 24 CFR part 5, regarding the disclosure and verification of Social Security Numbers...
24 CFR 242.68 - Disclosure and verification of Social Security and Employer Identification Numbers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Social Security and Employer Identification Numbers. 242.68 Section 242.68 Housing and Urban Development... Requirements § 242.68 Disclosure and verification of Social Security and Employer Identification Numbers. The requirements set forth in 24 CFR part 5, regarding the disclosure and verification of Social Security Numbers...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samuel, D; Testa, M; Park, Y
Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques; one example is the use of anterior fields for prostate treatment instead of the opposed lateral fields used in current practice. We have developed and commissioned an integrated system with hardware, software and workflow protocols to provide a complete solution, simultaneously, for both in-vivo dosimetry and range verification for proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, separable into three groups for flexibility in application. A special amplifier was developed to capture the extremely small signals produced at very low proton beam current. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system, for in-vivo dosimetry and for range verification, was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLDs and EBT films and were within 3% of treatment planning calculations. For range verification, a precision of 0.5 mm was achieved in homogeneous phantoms, and a precision of 2 mm for the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients.
What is the Final Verification of Engineering Requirements?
NASA Technical Reports Server (NTRS)
Poole, Eric
2010-01-01
This slide presentation reviews the process of development through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All involved need to agree upon formal requirements, including changes to the original requirements document. After the requirements have been developed, the engineering team begins to design the system, and the final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods used are test, inspection, analysis and demonstration. The plan for verification should be created once the system requirements are documented. The plan should include assurances that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. The options of having the engineering team involved in all phases of development, as opposed to having some other organization continue the process once the design is complete, are discussed.
Time-domain damping models in structural acoustics using digital filtering
NASA Astrophysics Data System (ADS)
Parret-Fréaud, Augustin; Cotté, Benjamin; Chaigne, Antoine
2016-02-01
This paper describes a new approach for formulating well-posed time-domain damping models able to represent various frequency-domain profiles of damping properties. The novelty of this approach is to represent the behavior law of a given material directly in a discrete-time framework as a digital filter, which is synthesized for each material from a discrete set of frequency-domain data, such as the complex modulus, through an optimization process. A key point is the addition of specific constraints to this process in order to guarantee stability, causality and verification of the second law of thermodynamics when transposing the resulting discrete-time behavior law into the time domain. The method thus offers a framework particularly suitable for time-domain simulations in structural dynamics and acoustics for a wide range of materials (polymers, wood, foam, etc.), making it possible to control and even reduce the distortion effects induced by time-discretization schemes on the frequency response of continuous-time behavior laws.
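As a concrete illustration of synthesizing a discrete-time behavior law from frequency-domain data, the sketch below fits a causal FIR filter to complex frequency-response samples by linear least squares. This is not the paper's optimization process (which targets more general filter structures and thermodynamic constraints); an FIR fit is simply one assumption-light way to guarantee stability and causality by construction.

```python
import numpy as np

def fit_fir_to_modulus(omega, H_target, n_taps=64):
    """Least-squares fit of a causal FIR filter to measured frequency data.

    omega    : (K,) digital frequencies in rad/sample (0..pi) of the samples
    H_target : (K,) complex frequency-response samples to reproduce
    An FIR filter is stable and causal by construction, one simple way of
    meeting the constraints discussed above.
    """
    n = np.arange(n_taps)
    # A[k, m] = exp(-j * omega_k * m): response of tap m at frequency omega_k
    A = np.exp(-1j * np.outer(omega, n))
    # Solve min ||A h - H_target|| for real taps by stacking Re/Im parts
    A_ri = np.vstack([A.real, A.imag])
    b_ri = np.concatenate([H_target.real, H_target.imag])
    h, *_ = np.linalg.lstsq(A_ri, b_ri, rcond=None)
    return h
```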
Neural Network Target Identification System for False Alarm Reduction
NASA Technical Reports Server (NTRS)
Ye, David; Edens, Weston; Lu, Thomas T.; Chao, Tien-Hsin
2009-01-01
A multi-stage automated target recognition (ATR) system has been designed to perform computer vision tasks with adequate proficiency in mimicking human vision. The system is able to detect, identify, and track targets of interest. Potential regions of interest (ROIs) are first identified by the detection stage using an Optimum Trade-off Maximum Average Correlation Height (OT-MACH) filter combined with a wavelet transform. False positives are then eliminated by the verification stage using feature extraction methods in conjunction with neural networks. Feature extraction transforms the ROIs using filtering and binning algorithms to create feature vectors. A feed-forward back-propagation neural network (NN) is then trained to classify each feature vector and remove false positives. This paper discusses tests of system performance and the parameter optimization process which adapts the system to various targets and datasets. The test results show that the system was successful in substantially reducing the false positive rate when tested on a sonar image dataset.
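The detection stage applies a correlation filter in the Fourier domain. The sketch below uses a plain matched filter as a stand-in for the OT-MACH filter (whose synthesis is more involved); the thresholding rule and names are illustrative, but the application step, pointwise multiplication in the Fourier domain followed by peak thresholding, is the same.

```python
import numpy as np

def correlate_detect(image, template, threshold=0.8):
    """Frequency-domain correlation detection (simplified matched filter).

    image, template : 2-D float arrays; threshold is a fraction of the
    correlation peak above which candidate ROIs are reported.
    """
    f_img = np.fft.fft2(image)
    # Conjugate of the template spectrum, zero-padded to the image size
    f_tpl = np.conj(np.fft.fft2(template, s=image.shape))
    corr = np.real(np.fft.ifft2(f_img * f_tpl))
    peaks = np.argwhere(corr > threshold * corr.max())
    return corr, peaks   # candidate ROIs would then go to the NN stage
```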
STS-99 Shuttle Radar Topography Mission Stability and Control
NASA Technical Reports Server (NTRS)
Hamelin, Jennifer L.; Jackson, Mark C.; Kirchwey, Christopher B.; Pileggi, Roberto A.
2001-01-01
The Shuttle Radar Topography Mission (SRTM) flew aboard Space Shuttle Endeavour in February 2000 and used interferometry to map 80% of the Earth's landmass. SRTM employed a 200-foot deployable mast structure to extend a second antenna away from the main antenna located in the Shuttle payload bay. Mapping requirements demanded precision pointing and orbital trajectories from the Shuttle on-orbit Flight Control System (FCS). Mast structural dynamics interaction with the FCS impacted stability and performance of the autopilot for attitude maneuvers and pointing during mapping operations. A damper system, added to ensure that mast tip motion remained within the limits of the outboard antenna tracking system while mapping, also helped to mitigate structural dynamic interaction with the FCS autopilot. Late changes made to the payload damper system, which actually failed on-orbit, required a redesign and verification of the FCS autopilot filtering schemes necessary to ensure rotational control stability. In-flight measurements using three sensors were used to validate models and gauge the accuracy and robustness of the pre-mission notch filter design.
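A notch (band-stop) filter of the kind used to keep structural modes out of a control loop can be sketched with SciPy. The mode frequency, sample rate, and Q below are hypothetical values for illustration, not SRTM parameters.

```python
import numpy as np
from scipy import signal

fs = 12.5            # sample rate, Hz (assumed)
f_mode = 0.12        # structural bending mode to suppress, Hz (assumed)
Q = 5.0              # notch quality factor: width vs. phase-loss trade-off

# Second-order IIR notch centered on the structural mode
b, a = signal.iirnotch(f_mode, Q, fs=fs)

# Apply to a simulated body-rate signal before it reaches the autopilot
t = np.arange(0, 200, 1/fs)
rate = 0.01*np.sin(2*np.pi*f_mode*t) + 0.001*np.random.randn(t.size)
rate_filtered = signal.lfilter(b, a, rate)
```

The Q factor trades notch width against phase loss near the mode, the usual stability-margin consideration in such autopilot filter designs.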
SPECS: Secure and Privacy Enhancing Communications Schemes for VANETs
NASA Astrophysics Data System (ADS)
Chim, T. W.; Yiu, S. M.; Hui, L. C. K.; Jiang, Zoe L.; Li, Victor O. K.
Vehicular ad hoc networks (VANETs) are an emerging type of network which facilitates communication among vehicles on roads for driving safety. The basic idea is to allow arbitrary vehicles to broadcast ad hoc messages (e.g. traffic accidents) to other vehicles. However, this raises concerns of security and privacy. Messages should be signed and verified before they are trusted, while the real identity of vehicles should not be revealed, yet should remain traceable by an authorized party. Existing solutions either rely heavily on a tamper-proof hardware device, or cannot satisfy the privacy requirement and do not have an effective message verification scheme. In this paper, we provide a software-based solution which makes use of only two shared secrets to satisfy the privacy requirement and gives lower message overhead and an at least 45% higher success rate than previous solutions in the message verification phase, using the Bloom filter and binary search techniques. We also provide the first group communication protocol to allow vehicles to authenticate and securely communicate with others in a group of known vehicles.
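A Bloom filter of the kind used in such message-verification phases can be sketched in a few lines. Sizes, hash counts, and item names below are illustrative assumptions; a deployed scheme would size the filter from the expected message volume and target false-positive rate.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter sketch for fast set-membership checks."""

    def __init__(self, n_bits=8192, n_hashes=4):
        self.n_bits = n_bits
        self.n_hashes = n_hashes
        self.bits = bytearray(n_bits // 8)

    def _positions(self, item):
        # Derive k independent bit positions from salted SHA-256 digests
        for i in range(self.n_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.n_bits

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        # May return false positives, never false negatives
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

bf = BloomFilter()
bf.add("vehicle-pseudonym-42")      # hypothetical identifier
print("vehicle-pseudonym-42" in bf, "unknown-id" in bf)
```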
Method for modeling post-mortem biometric 3D fingerprints
NASA Astrophysics Data System (ADS)
Rajeev, Srijith; Shreyas, Kamath K. M.; Agaian, Sos S.
2016-05-01
Despite the advancements of fingerprint recognition in the 2-D and 3-D domains, authenticating deformed/post-mortem fingerprints continues to be an important challenge. Prior cleansing and reconditioning of the deceased finger is required before acquisition of the fingerprint. The victim's finger must be handled precisely and carefully by a recording medium to capture the fingerprint impression. This process may damage the structure of the finger, which subsequently leads to higher false rejection rates. This paper proposes a non-invasive method to perform 3-D deformed/post-mortem finger modeling, which produces a 2-D rolled-equivalent fingerprint for automated verification. The presented novel modeling method involves masking, filtering, and unrolling. Computer simulations were conducted on finger models with different depth variations obtained from Flashscan3D LLC. Results illustrate that the modeling scheme provides a viable 2-D fingerprint of deformed models for automated verification. The quality and adaptability of the obtained unrolled 2-D fingerprints were analyzed using NIST fingerprint software. Eventually, the presented method could be extended to other biometric traits such as the palm, foot, and tongue for security and administrative applications.
24 CFR 886.305 - Disclosure and verification of Social Security and Employer Identification Numbers by owners.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Disclosure and verification of Social Security and Employer Identification Numbers by owners. 886.305 Section 886.305 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF THE ASSISTANT...
24 CFR 5.233 - Mandated use of HUD's Enterprise Income Verification (EIV) System.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Mandated use of HUD's Enterprise Income Verification (EIV) System. 5.233 Section 5.233 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development GENERAL HUD PROGRAM REQUIREMENTS; WAIVERS Disclosure...
24 CFR 5.240 - Family disclosure of income information to the responsible entity and verification.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Family disclosure of income information to the responsible entity and verification. 5.240 Section 5.240 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development GENERAL HUD PROGRAM REQUIREMENTS...
Very fast road database verification using textured 3D city models obtained from airborne imagery
NASA Astrophysics Data System (ADS)
Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie
2014-10-01
Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV images. This algorithm contains two processes which exchange input and output but basically run independently of each other: textured urban terrain reconstruction and road verification. The first process performs a dense photogrammetric reconstruction of the 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of road. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect) and the second describes the state of its underlying road model (applicable, not applicable). Based on Dempster-Shafer theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown; a small sketch of this mapping follows below. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input - together with initial road database entries - for the road verification process. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map, followed by road map extraction by means of vectorization and filtering of geometrically and topologically inconsistent objects. Depending on time constraints and the availability of a geo-database for buildings, the urban terrain reconstruction procedure outputs semantic models of buildings, trees, and ground. Buildings and ground are textured by means of the available images. This facilitates orientation in the model and the interactive verification of road objects that were initially classified as unknown. The three main modules of the texturing algorithm are pose estimation (if the videos are not geo-referenced), occlusion analysis, and texture synthesis.
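A minimal sketch of the three-state mapping, under our reading of the construction rather than the paper's exact rule: mass assigned to "model not applicable" becomes ignorance mass on the whole frame {correct, incorrect}, reported as "unknown", and Dempster's rule then combines evidence from several methods.

```python
def fuse_to_three_states(p_correct, p_applicable):
    """Map the two per-method distributions to {correct, incorrect, unknown}."""
    return {
        "correct": p_applicable * p_correct,
        "incorrect": p_applicable * (1.0 - p_correct),
        "unknown": 1.0 - p_applicable,   # model not applicable -> ignorance
    }

def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions over {correct, incorrect, unknown},
    where 'unknown' plays the role of the full frame {correct, incorrect}."""
    conflict = (m1["correct"] * m2["incorrect"]
                + m1["incorrect"] * m2["correct"])
    norm = 1.0 - conflict
    return {
        "correct": (m1["correct"] * m2["correct"]
                    + m1["correct"] * m2["unknown"]
                    + m1["unknown"] * m2["correct"]) / norm,
        "incorrect": (m1["incorrect"] * m2["incorrect"]
                      + m1["incorrect"] * m2["unknown"]
                      + m1["unknown"] * m2["incorrect"]) / norm,
        "unknown": m1["unknown"] * m2["unknown"] / norm,
    }

m_a = fuse_to_three_states(p_correct=0.9, p_applicable=0.7)
m_b = fuse_to_three_states(p_correct=0.8, p_applicable=0.5)
print(dempster_combine(m_a, m_b))
```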
Software verification plan for GCS. [guidance and control software
NASA Technical Reports Server (NTRS)
Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.
1990-01-01
This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase of the development process, contains a step-by-step description of the testing procedures, and discusses all of the tools used throughout the verification process.
Experimental Verification of Guided-Wave Lumped Circuits Using Waveguide Metamaterials
NASA Astrophysics Data System (ADS)
Li, Yue; Zhang, Zhijun
2018-04-01
Through construction and characterization at microwave frequencies, we experimentally demonstrate our recently developed theory of waveguide lumped circuits, i.e., waveguide metatronics [Sci. Adv. 2, e1501790 (2016), 10.1126/sciadv.1501790], as a method to design subwavelength-scaled analog circuits. In the waveguide metatronics paradigm, numbers of lumped inductors and capacitors are readily integrated functionally inside the waveguide, an indispensable transmission line in millimeter-wave and terahertz systems owing to its low radiation loss and low crosstalk. An example of multiple-order metatronic filters with layered structures is fabricated using the technique of substrate integrated waveguides, which can be easily constructed by the printed-circuit-board process. The materials used in the construction are also typical microwave materials with positive permittivity, low loss, and negligible dispersion, imitating the plasmonic materials with negative permittivity in the optical domain. The results verify the theory of waveguide metatronics, which provides an efficient platform of functional lumped circuit design for guided-wave processing.
Huang, Haoqian; Chen, Xiyuan; Zhou, Zhikai; Xu, Yuan; Lv, Caiping
2014-01-01
High accuracy attitude and position determination is very important for underwater gliders. The cross-coupling among the three attitude angles (heading angle, pitch angle and roll angle) becomes more serious when pitch or roll motion occurs. This cross-coupling makes attitude angles inaccurate or even erroneous. Therefore, high accuracy attitude and position determination becomes a difficult problem for a practical underwater glider. To solve this problem, this paper proposes backtracking decoupling and an adaptive extended Kalman filter (EKF) based on the quaternion expanded to the state variable (BD-AEKF). The backtracking decoupling can effectively eliminate the cross-coupling among the three attitudes when pitch or roll motion occurs. After decoupling, the adaptive extended Kalman filter (AEKF) based on the quaternion expanded to the state variable further smooths the filtering output to improve the accuracy and stability of attitude and position determination. In order to evaluate the performance of the proposed BD-AEKF method, pitch and roll motion are simulated and the proposed method's performance is analyzed and compared with the traditional method. Simulation results demonstrate that the proposed BD-AEKF performs better. Furthermore, for further verification, a new underwater navigation system is designed, and three-axis non-magnetic turntable experiments and vehicle experiments are performed. The results show that the proposed BD-AEKF is effective in eliminating cross-coupling and reducing the errors compared with the conventional method. PMID:25479331
Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System
NASA Astrophysics Data System (ADS)
Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li
Ultrasonic flaw detection equipment with a remote-control interface is investigated and an automatic verification system is developed. By using extensible markup language to build the protocol instruction set and the data analysis method database, the system software achieves a controllable design and accommodates the diversity of proprietary device interfaces and protocols. By cascading a signal generator with a fixed attenuator, a dynamic error compensation method is proposed which performs the role of the fixed attenuator in traditional verification and improves the accuracy of verification results. Operating results of the automatic verification system confirm the feasibility of the hardware and software architecture design and the correctness of the analysis method, while eliminating the cumbersome manual operations of the traditional verification process and reducing the labor intensity of test personnel.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luke, S J
2011-12-20
This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers, when used with a measurement system, allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small, it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach, coined Information Loss Analysis, might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.
Design Authority in the Test Programme Definition: The Alenia Spazio Experience
NASA Astrophysics Data System (ADS)
Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.
2004-08-01
In addition, since the Verification and Test Programme is a significant part of the spacecraft development life cycle in terms of cost and time, the discussion very often aims to optimize the verification campaign by deleting or limiting some testing activities. The increased market pressure to reduce project schedules and costs is originating a dialectic process inside project teams, involving programme management and design authorities, in order to optimize the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, drawn from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 part 2 "Verification" and ECSS-E-10 part 3 "Testing" [1]) are tailored to the specific project on the basis of its peculiar mission constraints. The Model Philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects (including, for example, test requirements and facilities), as shown in Fig. 1 (from ECSS-E-10). The considered cases are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are most significant. Considering thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, cases are indicated in which a proper Thermal Balance Test is mandatory and others, in the presence of a more recurrent design, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted. Similar considerations are summarized for mechanical testing, with particular emphasis on the importance of Modal Survey, Static and Sine Vibration Tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted.
Fig. 1. Model philosophy, Verification and Test Programme definition.
The verification of the project requirements is planned through a combination of suitable verification methods (in particular Analysis and Test) at the different verification levels (from System down to Equipment), in the proper verification stages (e.g. Qualification and Acceptance).
Verification testing of the Aquionics, Inc. bersonInLine® 4250 UV System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills Wastewater Treatment Plant test site in Parsippany, New Jersey. Two full-scale reactors were mounted in series. T...
A Framework for Evidence-Based Licensure of Adaptive Autonomous Systems
2016-03-01
insights gleaned to DoD. The autonomy community has identified significant challenges associated with test, evaluation, verification and validation of...licensure as a test, evaluation, verification, and validation (TEVV) framework that can address these challenges. IDA found that traditional...language requirements to testable (preferably machine-testable) specifications • Design of architectures that treat development and verification of
Verification testing of the Ondeo Degremont, Inc. Aquaray® 40 HO VLS Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Three reactor modules were m...
Signature Verification Using N-tuple Learning Machine.
Maneechot, Thanin; Kitjaidure, Yuttana
2005-01-01
This research presents a new algorithm for signature verification using an N-tuple learning machine. The features are taken from handwritten signatures captured on a digital tablet (on-line). The recognition algorithm uses four extracted features, namely horizontal and vertical pen-tip position (x-y position), pen-tip pressure, and pen altitude angles. Verification uses the N-tuple technique with Gaussian thresholding.
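An N-tuple learning machine can be sketched as a RAM-based classifier over binarized feature vectors. The sketch below is a generic n-tuple classifier, not the authors' exact algorithm; the tuple size, tuple count, and the upstream binarization of the four features are assumptions.

```python
import numpy as np

class NTupleClassifier:
    """Sketch of an n-tuple (RAM-based) classifier for binary feature vectors.

    Each tuple samples n fixed bit positions; training stores the observed
    patterns per class, and scoring counts matching tuples. Binarization of
    the features (x-y position, pressure, altitude angles) happens upstream.
    """

    def __init__(self, input_bits, n=8, n_tuples=50, seed=0):
        rng = np.random.default_rng(seed)
        # Random but fixed bit addresses for every tuple
        self.addresses = [rng.choice(input_bits, size=n, replace=False)
                          for _ in range(n_tuples)]
        self.memory = set()   # {(class, tuple_index, pattern)}

    def _patterns(self, bits):
        for i, addr in enumerate(self.addresses):
            yield i, tuple(bits[addr])

    def train(self, bits, label):
        for i, pat in self._patterns(bits):
            self.memory.add((label, i, pat))

    def score(self, bits, label):
        # Fraction of tuples whose sampled pattern was seen during training
        hits = sum((label, i, pat) in self.memory
                   for i, pat in self._patterns(bits))
        return hits / len(self.addresses)

clf = NTupleClassifier(input_bits=256)
genuine = np.random.default_rng(1).integers(0, 2, 256)
clf.train(genuine, "alice")
print(clf.score(genuine, "alice"))   # 1.0 on a training sample
```

A verification decision would then threshold this score, e.g. with the Gaussian thresholding the abstract mentions.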
Review and verification of CARE 3 mathematical model and code
NASA Technical Reports Server (NTRS)
Rose, D. M.; Altschul, R. E.; Manke, J. W.; Nelson, D. L.
1983-01-01
The CARE-III mathematical model and code verification performed by Boeing Computer Services were documented. The mathematical model was verified for permanent and intermittent faults. The transient fault model was not addressed. The code verification was performed on CARE-III, Version 3. A CARE III Version 4, which corrects deficiencies identified in Version 3, is being developed.
A synergistic method for vibration suppression of an elevator mechatronic system
NASA Astrophysics Data System (ADS)
Knezevic, Bojan Z.; Blanusa, Branko; Marcetic, Darko P.
2017-10-01
Modern elevators are complex mechatronic systems which have to satisfy high performance requirements in precision, safety and ride comfort. Each elevator mechatronic system (EMS) contains a mechanical subsystem which is characterized by its resonant frequency. In pursuing high performance of the whole system, the control part of the EMS inevitably excites resonances, causing vibration. This paper proposes a synergistic solution based on jerk control and the upgrade of the speed controller with a band-stop filter to restore the ride comfort and speed control lost to vibration. The band-stop filter eliminates the resonant component from the speed controller spectrum, and jerk control keeps the speed controller operating in a linear mode while increasing ride comfort. An original method for band-stop filter tuning based on the Goertzel algorithm and the Kiefer search algorithm is proposed in this paper. In order to generate the speed reference trajectory, which can be defined by different shapes and amplitudes of jerk, a unique generalized model is proposed. The proposed algorithm is integrated in the power drive control algorithm and implemented on a digital signal processor. Experimental verification on a scaled-down prototype of the EMS showed that only the synergistic effect of controlling jerk and filtering the reference torque can completely eliminate vibration.
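The Goertzel algorithm evaluates the power of a single DFT bin with a two-state recursion, which makes it well suited to scanning a handful of candidate resonant frequencies on a DSP without a full FFT. A minimal sketch, with illustrative signal and frequency values:

```python
import numpy as np

def goertzel_power(x, k, N):
    """Power of DFT bin k of x[0:N] via the Goertzel recursion."""
    w = 2.0 * np.pi * k / N
    coeff = 2.0 * np.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for n in range(N):
        s = x[n] + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # Magnitude-squared of the k-th bin from the final two states
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

# Locate the strongest component among candidate bins of a speed-error signal
fs, N = 1000, 1000
t = np.arange(N) / fs
speed_error = 0.2*np.sin(2*np.pi*37.0*t) + 0.02*np.random.randn(N)
k_best = max(range(20, 60), key=lambda k: goertzel_power(speed_error, k, N))
print("resonance near", k_best * fs / N, "Hz")   # ~37 Hz
```

The identified frequency would then seed the band-stop filter center, with a search routine (Kiefer-style in the paper) refining the estimate.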
Wang, Baofeng; Qi, Zhiquan; Chen, Sizhong; Liu, Zhaodu; Ma, Guocheng
2017-01-01
Vision-based vehicle detection is an important issue for advanced driver assistance systems. In this paper, we present an improved multi-vehicle detection and tracking method using cascade Adaboost and an Adaptive Kalman filter (AKF) with target identity awareness. A cascade Adaboost classifier using Haar-like features was built for vehicle detection, followed by a more comprehensive verification process which refines the vehicle hypotheses in terms of both location and dimension. In vehicle tracking, each vehicle is tracked with an independent identity by an Adaptive Kalman filter in collaboration with a data association approach. The AKF adaptively adjusts the measurement and process noise covariances through on-line stochastic modelling to compensate for dynamics changes. The data association correctly assigns different detections to tracks using the global nearest neighbour (GNN) algorithm while considering local validation. During tracking, a temporal-context-based track management is proposed to decide whether to initiate, maintain or terminate the tracks of different objects, thus suppressing sparse false alarms and compensating for temporary detection failures. Finally, the proposed method was tested on various challenging real roads, and the experimental results showed that the vehicle detection performance was greatly improved with higher accuracy and robustness. PMID:28296902
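The GNN data-association step can be sketched as a gated cost matrix solved with the Hungarian algorithm. The gate value and coordinates below are illustrative; in the paper's setting the cost would come from the AKF's predicted positions and innovation statistics.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def gnn_associate(tracks, detections, gate=50.0):
    """Global nearest neighbour association of detections to tracks.

    tracks, detections : (T, 2) and (D, 2) arrays of predicted and measured
    positions (e.g. pixel coordinates); `gate` is a validation distance
    beyond which a pairing is forbidden.
    """
    cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
    cost[cost > gate] = 1e6               # gating: disallow distant pairings
    rows, cols = linear_sum_assignment(cost)
    # Keep only pairings that survive the gate
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] < gate]

tracks = np.array([[100.0, 50.0], [300.0, 80.0]])
dets = np.array([[305.0, 84.0], [102.0, 47.0], [600.0, 10.0]])
print(gnn_associate(tracks, dets))   # -> [(0, 1), (1, 0)]
```

Unassociated detections would spawn tentative tracks and unassociated tracks would age toward termination, which is the track-management behavior the abstract describes.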
Development and Verification of Sputtered Thin-Film Nickel-Titanium (NiTi) Shape Memory Alloy (SMA)
2015-08-01
The development, verification, and comparison study between LC-MS libraries for two manufacturers’ instruments and a verified protocol are discussed. The LC-MS library protocol was verified through an inter-laboratory study that involved Federal, State, and private laboratories. ...
Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk
NASA Technical Reports Server (NTRS)
Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.
2014-01-01
The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification are described. Traditional verification techniques have two major problems: testing occurs at the prototype stage, where error discovery can be quite costly, and not all potential interactions can be tested, leaving some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.
Options and Risk for Qualification of Electric Propulsion System
NASA Technical Reports Server (NTRS)
Bailey, Michelle; Daniel, Charles; Cook, Steve (Technical Monitor)
2002-01-01
Electric propulsion vehicle systems encompass a wide range of propulsion alternatives, including solar and nuclear, which present unique circumstances for qualification. This paper addresses the alternatives for qualification of electric propulsion spacecraft systems. The approach taken is to address the considerations for qualification at the various levels of systems definition. Additionally, for each level of qualification, the system-level risk implications are developed. The paper also explores the implications of analysis versus test at various levels of systems definition, while retaining the objectives of a verification program. The limitations of terrestrial testing are explored, along with the risk and implications of orbital demonstration testing. The paper seeks to develop a template for structuring a verification program based on cost, risk, and value return. A successful verification program should establish controls and define the objectives of the verification compliance program. Finally, the paper addresses the political and programmatic factors that may impact options for system verification.
High-speed autoverifying technology for printed wiring boards
NASA Astrophysics Data System (ADS)
Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi
1996-10-01
We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms, so verification is needed to distinguish between minor irregularities and serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to detect pattern features on surface mount technology boards. In our system, we employ a new illumination method that uses multiple colors and multiple illumination directions. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination. This helps to search for patterns around a defect and to examine defect definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory; it verified 1,500 defective samples and detected all significant defects with a false-alarm rate of only 0.1 percent.
NASA Technical Reports Server (NTRS)
Powell, John D.
2003-01-01
This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.
The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...
Handbook: Design of automated redundancy verification
NASA Technical Reports Server (NTRS)
Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.
1971-01-01
The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.
The purpose of this SOP is to define the steps involved in data entry and data verification of physical forms. It applies to the data entry and data verification of all physical forms. The procedure defined herein was developed for use in the Arizona NHEXAS project and the "Bor...
Formal Verification for a Next-Generation Space Shuttle
NASA Technical Reports Server (NTRS)
Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)
2002-01-01
This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.
TeleOperator/telePresence System (TOPS) Concept Verification Model (CVM) development
NASA Technical Reports Server (NTRS)
Shimamoto, Mike S.
1993-01-01
The development of an anthropomorphic, undersea manipulator system, the TeleOperator/telePresence System (TOPS) Concept Verification Model (CVM), is described. The TOPS system's design philosophy, which results from NRaD's experience in the development and operation of undersea vehicles and manipulator systems, is presented, along with the TOPS design approach, the task teams, the development and results of the manipulator and vision systems, and conclusions and recommendations.
Development and Verification of the Charring Ablating Thermal Protection Implicit System Solver
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Calvert, Nathan D.; Kirk, Benjamin S.
2010-01-01
The development and verification of the Charring Ablating Thermal Protection Implicit System Solver is presented. This work concentrates on the derivation and verification of the stationary grid terms in the equations that govern three-dimensional heat and mass transfer for charring thermal protection systems including pyrolysis gas flow through the porous char layer. The governing equations are discretized according to the Galerkin finite element method with first and second order implicit time integrators. The governing equations are fully coupled and are solved in parallel via Newton's method, while the fully implicit linear system is solved with the Generalized Minimal Residual method. Verification results from exact solutions and the Method of Manufactured Solutions are presented to show spatial and temporal orders of accuracy as well as nonlinear convergence rates.
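The order-of-accuracy evidence that Method of Manufactured Solutions verification produces reduces to a simple computation on error norms from successively refined grids. A minimal sketch with hypothetical error values; a second-order discretization should yield an observed order near 2:

```python
import math

def observed_order(e_coarse, e_fine, r):
    """Observed order of accuracy from discretization errors on two grids
    with refinement ratio r (e.g. r = 2 when the mesh spacing is halved)."""
    return math.log(e_coarse / e_fine) / math.log(r)

# Hypothetical L2 error norms against a manufactured solution on
# successively refined grids.
errors = [4.1e-3, 1.0e-3, 2.6e-4]
for e1, e2 in zip(errors, errors[1:]):
    print(f"observed order p = {observed_order(e1, e2, 2.0):.2f}")
```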
Development and Verification of the Charring, Ablating Thermal Protection Implicit System Simulator
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Calvert, Nathan; Kirk, Benjamin S.
2011-01-01
The development and verification of the Charring Ablating Thermal Protection Implicit System Solver (CATPISS) is presented. This work concentrates on the derivation and verification of the stationary grid terms in the equations that govern three-dimensional heat and mass transfer for charring thermal protection systems including pyrolysis gas flow through the porous char layer. The governing equations are discretized according to the Galerkin finite element method (FEM) with first and second order fully implicit time integrators. The governing equations are fully coupled and are solved in parallel via Newton's method, while the linear system is solved via the Generalized Minimum Residual method (GMRES). Verification results from exact solutions and Method of Manufactured Solutions (MMS) are presented to show spatial and temporal orders of accuracy as well as nonlinear convergence rates.
NASA Astrophysics Data System (ADS)
Roed-Larsen, Trygve; Flach, Todd
The purpose of this chapter is to provide a review of existing national and international requirements for verification of greenhouse gas reductions and the associated accreditation of independent verifiers. The credibility of results claimed to reduce or remove anthropogenic emissions of greenhouse gases (GHG) is of utmost importance for the success of emerging schemes to reduce such emissions. Requirements include transparency, accuracy, consistency, and completeness of the GHG data. The many independent verification processes that have developed recently now make up a quite elaborate tool kit for best practices. The UN Framework Convention on Climate Change and the Kyoto Protocol specifications for project mechanisms initiated this work, but other national and international actors also work intensely on these issues. One initiative gaining wide application is that taken by the World Business Council for Sustainable Development with the World Resources Institute to develop a "GHG Protocol" to assist companies in arranging for auditable monitoring and reporting processes for their GHG activities. A set of new international standards developed by the International Organization for Standardization (ISO) provides specifications for the quantification, monitoring, and reporting of company entity and project-based activities. The ISO is also developing specifications for recognizing independent GHG verifiers. This chapter covers this background with the intent of providing a common understanding of the efforts undertaken in different parts of the world to secure the reliability of GHG emission reduction and removal activities. These verification schemes may provide valuable input to current efforts to secure a comprehensive, trustworthy, and robust framework for verification activities of CO2 capture, transport, and storage.
Requirement Specifications for a Design and Verification Unit.
ERIC Educational Resources Information Center
Pelton, Warren G.; And Others
A research and development activity to introduce new and improved education and training technology into Bureau of Medicine and Surgery training is recommended. The activity, called a design and verification unit, would be administered by the Education and Training Sciences Department. Initial research and development are centered on the…
Research on the attitude detection technology of the tetrahedron robot
NASA Astrophysics Data System (ADS)
Gong, Hao; Chen, Keshan; Ren, Wenqiang; Cai, Xin
2017-10-01
Traditional attitude detection technology cannot tackle the problem of attitude detection for a polyhedral robot. We therefore propose a novel multi-sensor data fusion algorithm based on the Kalman filter, investigated here on a tetrahedron robot. We devise an attitude detection system for the polyhedral robot and conduct verification of the data fusion algorithm. It turns out that the minimal attitude detection system we devise can capture the attitudes of the tetrahedral robot in different working conditions. Thus the kinematics model we establish for the tetrahedron robot is correct, and the feasibility of the attitude detection system is proven.
INS/EKF-based stride length, height and direction intent detection for walking assistance robots.
Brescianini, Dario; Jung, Jun-Young; Jang, In-Hun; Park, Hyun Sub; Riener, Robert
2011-01-01
We propose an algorithm to obtain information on stride length, height difference, and direction based on the user's intent during walking. For exoskeleton robots used to assist paraplegic patients' walking, this information is used to generate gait patterns on-line. To obtain this information, we attach an inertial measurement unit (IMU) to crutches and apply an extended Kalman filter-based error correction method to reduce the drift due to the bias of the IMU. The proposed method is verified in real walking scenarios, including walking, climbing stairs, and changing the direction of walking, with normal subjects. © 2011 IEEE
Smirr, Jean-Loup; Guilbaud, Sylvain; Ghalbouni, Joe; Frey, Robert; Diamanti, Eleni; Alléaume, Romain; Zaquine, Isabelle
2011-01-17
Fast characterization of pulsed spontaneous parametric down conversion (SPDC) sources is important for applications in quantum information processing and communications. We propose a simple method to perform this task, which only requires measuring the counts on the two output channels and the coincidences between them, as well as modeling the filter used to reduce the source bandwidth. The proposed method is experimentally tested and used for a complete evaluation of SPDC sources (pair emission probability, total losses, and fidelity) of various bandwidths. This method can find applications in the setting up of SPDC sources and in the continuous verification of the quality of quantum communication links.
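For a pulsed source in the low-gain regime, the singles and coincidence rates determine channel efficiencies and the mean pair number per pulse in closed form (the Klyshko relations). A sketch under those assumptions, neglecting accidental coincidences and dark counts; the numbers are hypothetical:

```python
def spdc_estimates(s1, s2, c, rep_rate):
    """Low-gain estimates for a pulsed SPDC source (accidentals and dark
    counts neglected; all rates in counts per second). With mean pair
    number mu per pulse and channel efficiencies eta1, eta2:
        s1 = R*mu*eta1,  s2 = R*mu*eta2,  c = R*mu*eta1*eta2."""
    eta1 = c / s2                   # Klyshko efficiency of channel 1
    eta2 = c / s1                   # Klyshko efficiency of channel 2
    mu = s1 * s2 / (rep_rate * c)   # mean pair number per pulse
    return eta1, eta2, mu

# Hypothetical rates for a 76 MHz pulsed pump
eta1, eta2, mu = spdc_estimates(s1=48e3, s2=52e3, c=5.2e3, rep_rate=76e6)
print(f"eta1={eta1:.2%}, eta2={eta2:.2%}, mu={mu:.4f} pairs/pulse")
```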
NASA Technical Reports Server (NTRS)
Dyer, Edward F.
1992-01-01
In view of the serious performance deficiencies inherent in conventional modular and welded shielding EMC test enclosures, in which multipath reflections and resonant standing waves can damage flight hardware during RF susceptibility tests, NASA-Goddard has undertaken the modification of a 20 x 24 ft modular-shielded enclosure through installation of steel panels to which ferrite tiles will be mounted with epoxy. The internally reflected RF energy will thereby be absorbed, and exterior power-line noise will be reduced. Isolation of power-line filters and control of 60-Hz ground connections will also be undertaken in the course of upgrading.
Ripple feedback for the resonant-filter unity-power-factor rectifier
DOE Office of Scientific and Technical Information (OSTI.GOV)
Streng, S.A.; King, R.J.
1992-07-01
An unusual buck-like unity-power-factor rectifier with a resonant load-balancing network permits current-limited operation down to zero output voltage in a single-stage topology. However, this rectifier has been found to be sensitive to ac-line voltage distortion and is potentially unstable with realistic values of ac-line impedance. In this paper, a new ripple feedback is proposed that solves both problems. A large-signal time-varying analysis is given along with incremental, quasi-static, and low-frequency approximations. Experimental verification is provided by a 500-W 50-kHz rectifier operating from the 120-V 60-Hz distribution system.
Random phase encoding for optical security
NASA Astrophysics Data System (ADS)
Wang, RuiKang K.; Watson, Ian A.; Chatwin, Christopher R.
1996-09-01
A new optical encoding method for security applications is proposed. The encoded image (encrypted into the security products) is merely a random phase image generated by a computer's random number generator; it contains no information from the reference pattern (stored for verification) or from the frequency-plane filter (a phase-only function for decoding). The phase function in the frequency plane is obtained using a modified phase retrieval algorithm. The proposed method uses two phase-only functions (images), at the input and frequency planes of the optical processor, leading to maximum optical efficiency. Computer simulation shows that the proposed method is robust for optical security applications.
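The paper's modified phase retrieval algorithm is not reproduced here, but a classic Gerchberg-Saxton-style iteration solves the same kind of problem and illustrates the structure: find a frequency-plane phase-only filter that maps a random phase-only input to a target amplitude. A sketch under that assumption:

```python
import numpy as np

def retrieve_filter_phase(phi_in, target_amp, n_iter=200):
    """Gerchberg-Saxton-style retrieval of a frequency-plane phase-only
    filter H such that |IFFT( FFT(exp(i*phi_in)) * exp(i*H) )| approximates
    target_amp. Illustrative stand-in for the paper's modified algorithm."""
    g_in = np.exp(1j * phi_in)             # random phase-only input image
    G = np.fft.fft2(g_in)                  # fixed input spectrum
    H = 2 * np.pi * np.random.rand(*phi_in.shape)  # initial filter phase
    for _ in range(n_iter):
        out = np.fft.ifft2(G * np.exp(1j * H))
        out = target_amp * np.exp(1j * np.angle(out))  # impose target amplitude
        Gp = np.fft.fft2(out)
        H = np.angle(Gp * np.conj(G))      # keep the filter phase-only
    return H

rng = np.random.default_rng(0)
phi_in = 2 * np.pi * rng.random((64, 64))            # encoded random phase image
target = np.zeros((64, 64)); target[24:40, 24:40] = 1.0  # toy reference pattern
H = retrieve_filter_phase(phi_in, target)
```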
Palmese, A.; Lahav, O.; Banerji, M.; ...
2016-08-20
We derive the stellar mass fraction in the galaxy cluster RXC J2248.7-4431 observed with the Dark Energy Survey (DES) during the Science Verification period. We compare the stellar mass results from DES (5 filters) with those from the Hubble Space Telescope CLASH (17 filters). When the cluster spectroscopic redshift is assumed, we show that stellar masses from DES can be estimated within 25% of CLASH values. We compute the stellar mass contribution coming from red and blue galaxies, and study the relation between stellar mass and the underlying dark matter using weak lensing studies with DES and CLASH. An analysis of the radial profiles of the DES total and stellar mass yields a stellar-to-total fraction of f* = (7.0 ± 2.2) × 10^-3 within a radius of r_200c ~ 3 Mpc. Our analysis also includes a comparison of photometric redshifts and star/galaxy separation efficiency for both datasets. We conclude that space-based small-field imaging can be used to calibrate the galaxy properties in DES for the much wider field of view. The technique developed to derive the stellar mass fraction in galaxy clusters can be applied to the ~100 000 clusters that will be observed within this survey. The stacking of all the DES clusters would reduce the errors on f* estimates and yield important information about galaxy evolution.
NASA Astrophysics Data System (ADS)
Palmese, A.; Lahav, O.; Banerji, M.; Gruen, D.; Jouvel, S.; Melchior, P.; Aleksić, J.; Annis, J.; Diehl, H. T.; Hartley, W. G.; Jeltema, T.; Romer, A. K.; Rozo, E.; Rykoff, E. S.; Seitz, S.; Suchyta, E.; Zhang, Y.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Capozzi, D.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Dietrich, J. P.; Doel, P.; Estrada, J.; Evrard, A. E.; Flaugher, B.; Frieman, J.; Gerdes, D. W.; Goldstein, D. A.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; James, D. J.; Kuehn, K.; Kuropatkin, N.; Li, T. S.; Lima, M.; Maia, M. A. G.; Marshall, J. L.; Miller, C. J.; Miquel, R.; Nord, B.; Ogando, R.; Plazas, A. A.; Roodman, A.; Sanchez, E.; Scarpine, V.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Tucker, D.; Vikram, V.
2016-12-01
We derive the stellar mass fraction in the galaxy cluster RXC J2248.7-4431 observed with the Dark Energy Survey (DES) during the Science Verification period. We compare the stellar mass results from DES (five filters) with those from the Hubble Space Telescope Cluster Lensing And Supernova Survey (CLASH; 17 filters). When the cluster spectroscopic redshift is assumed, we show that stellar masses from DES can be estimated within 25 per cent of CLASH values. We compute the stellar mass contribution coming from red and blue galaxies, and study the relation between stellar mass and the underlying dark matter using weak lensing studies with DES and CLASH. An analysis of the radial profiles of the DES total and stellar mass yields a stellar-to-total fraction of f⋆ = (6.8 ± 1.7) × 10^-3 within a radius of r_200c ≃ 2 Mpc. Our analysis also includes a comparison of photometric redshifts and star/galaxy separation efficiency for both data sets. We conclude that space-based small field imaging can be used to calibrate the galaxy properties in DES for the much wider field of view. The technique developed to derive the stellar mass fraction in galaxy clusters can be applied to the ~100 000 clusters that will be observed within this survey and yield important information about galaxy evolution.
Use of electronic portal imaging devices for electron treatment verification.
Kairn, T; Aland, T; Crowe, S B; Trapp, J V
2016-03-01
This study aims to help broaden the use of electronic portal imaging devices (EPIDs) for pre-treatment patient positioning verification, from photon-beam radiotherapy to photon- and electron-beam radiotherapy, by proposing and testing a method for acquiring clinically-useful EPID images of patient anatomy using electron beams, with a view to enabling and encouraging further research in this area. EPID images used in this study were acquired using all available beams from a linac configured to deliver electron beams with nominal energies of 6, 9, 12, 16 and 20 MeV, as well as photon beams with nominal energies of 6 and 10 MV. A widely-available heterogeneous, approximately-humanoid thorax phantom was used, to provide an indication of the contrast and noise produced when imaging different types of tissue with comparatively realistic thicknesses. The acquired images were automatically calibrated, correcting for variations in the sensitivity of individual photodiodes using a flood field image. For electron beam imaging, flood field EPID calibration images were acquired with and without blocks of water-equivalent plastic (with thicknesses approximately equal to the practical range of electrons in the plastic) placed upstream of the EPID, to filter out the primary electron beam and leave only the bremsstrahlung photon signal. While the electron beam images acquired using a standard (unfiltered) flood field calibration were noisy and difficult to interpret, the electron beam images acquired using the filtered flood field calibration showed tissues and bony anatomy with levels of contrast and noise similar to those seen in the clinically acceptable photon beam EPID images. The best electron beam imaging results (highest contrast, signal-to-noise and contrast-to-noise ratios) were achieved when the images were acquired using the higher-energy electron beams (16 and 20 MeV) and the EPID was calibrated using an intermediate (12 MeV) electron beam energy. These results demonstrate the feasibility of acquiring clinically-useful EPID images of patient anatomy using electron beams and suggest important avenues for future investigation. There is manifest potential for the EPID imaging method proposed in this work to lead to the clinical use of electron beam imaging for geometric verification of electron treatments in the future.
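The flood-field calibration at the heart of the method is a per-pixel sensitivity division; for electron imaging, the key change is which flood field is used. A minimal sketch (array shapes and the optional dark-field term are assumptions):

```python
import numpy as np

def flood_field_correct(raw, flood, dark=None):
    """Per-pixel EPID sensitivity correction. For electron-beam imaging,
    `flood` should be the filtered flood field acquired through
    water-equivalent plastic thicker than the electrons' practical range,
    so that only the bremsstrahlung photon signal calibrates the panel."""
    raw = raw.astype(float)
    flood = flood.astype(float)
    if dark is not None:                     # optional dark-field offset
        raw = raw - dark
        flood = flood - dark
    flood = flood / flood.mean()             # normalize to unit mean sensitivity
    return raw / np.clip(flood, 1e-6, None)  # guard against dead pixels
```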
Verification testing of the SUNTEC LPX200 UV Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Two lamp modules were mounted parallel in a 6.5-meter lon...
Formally verifying Ada programs which use real number types
NASA Technical Reports Server (NTRS)
Sutherland, David
1986-01-01
Formal verification is applied to programs which use real number arithmetic operations (mathematical programs). Formal verification of a program P consists of creating a mathematical model of P, stating the desired properties of P in a formal logical language, and proving that the mathematical model has the desired properties using a formal proof calculus. The development and verification of the mathematical model are discussed.
NASA Technical Reports Server (NTRS)
1975-01-01
The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.
Dynamic testing for shuttle design verification
NASA Technical Reports Server (NTRS)
Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.
1972-01-01
Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.
Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Angeli, George
2014-08-01
This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
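The per-requirement Verification Plan described above maps naturally onto a small record type. A sketch of one possible encoding; the field names mirror the paper's terms, while the enumeration values and types are assumptions, not the LSST SysML profile:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class Method(Enum):              # typical verification methods (assumed set)
    TEST = "test"
    ANALYSIS = "analysis"
    INSPECTION = "inspection"
    DEMONSTRATION = "demonstration"

@dataclass
class VerificationPlan:
    requirement_id: str
    verification_requirement: str
    success_criteria: str
    methods: List[Method]
    level: str                   # e.g. subsystem or system level
    owner: str

@dataclass
class VerificationEvent:
    """A collection of verification activities that can run concurrently;
    `depends_on` captures the event sequencing shown in activity diagrams."""
    name: str
    activities: List[VerificationPlan] = field(default_factory=list)
    depends_on: List[str] = field(default_factory=list)
```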
Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William; Budzien, Joanne Louise; Ferguson, Jim Michael
This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.
Verification and quality control of routine hematology analyzers.
Vis, J Y; Huisman, A
2016-05-01
Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
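One of the listed items, carryover, is commonly estimated by running three high-concentration samples followed by three low-concentration samples. A sketch using that widely used formula (acceptance thresholds are laboratory- and parameter-specific):

```python
def carryover_percent(high, low):
    """Carryover estimate from three consecutive high-sample results
    followed by three low-sample results (a commonly used protocol):
        carryover % = (low1 - low3) / (high3 - low3) * 100."""
    h1, h2, h3 = high
    l1, l2, l3 = low
    return (l1 - l3) / (h3 - l3) * 100.0

# Hypothetical WBC counts (10^9/L) on a hematology analyzer
print(f"carryover = {carryover_percent((85.1, 84.7, 85.0), (0.22, 0.18, 0.17)):.2f}%")
```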
Study on verifying the angle measurement performance of the rotary-laser system
NASA Astrophysics Data System (ADS)
Zhao, Jin; Ren, Yongjie; Lin, Jiarui; Yin, Shibin; Zhu, Jigui
2018-04-01
A verification method for the angle measurement performance of the rotary-laser system was developed. Angle measurement performance has a great impact on measuring accuracy. Although there is some previous research on verifying the angle measurement uncertainty of the rotary-laser system, it still has limitations. High-precision reference angles are used in the study of the method, and an integrated verification platform is set up to evaluate the performance of the system. This paper also probes the error that has the biggest influence on the verification system. Some errors of the verification system are avoided through the experimental method, and others are compensated for through a computational formula and curve fitting. Experimental results show that the angle measurement performance meets the requirement for coordinate measurement. The verification platform can efficiently evaluate the uncertainty of angle measurement for the rotary-laser system.
Loads and Structural Dynamics Requirements for Spaceflight Hardware
NASA Technical Reports Server (NTRS)
Schultz, Kenneth P.
2011-01-01
The purpose of this document is to establish requirements relating to the loads and structural dynamics technical discipline for NASA and commercial spaceflight launch vehicle and spacecraft hardware. Requirements are defined for the development of structural design loads, and recommendations regarding methodologies and practices for the conduct of load analyses are provided. As such, this document represents an implementation of NASA STD-5002. Requirements are also defined for structural mathematical model development and verification to ensure sufficient accuracy of predicted responses. Finally, requirements for model/data delivery and exchange are specified to facilitate interactions between Launch Vehicle Providers (LVPs), Spacecraft Providers (SCPs), and the NASA Technical Authority (TA) providing insight/oversight and serving in the Independent Verification and Validation role. In addition to the analysis-related requirements described above, a set of requirements is established concerning coupling phenomena or other interactions between structural dynamics and aerodynamic environments or control or propulsion system elements. Such requirements may reasonably be considered structure or control system design criteria, since good engineering practice dictates consideration and/or elimination of the identified conditions in the development of those subsystems. The requirements are included here, however, to ensure that such considerations are captured in the design space for launch vehicles (LV), spacecraft (SC), and the Launch Abort Vehicle (LAV). The requirements in this document are focused on analyses to be performed to develop data needed to support structural verification. As described in JSC 65828, Structural Design Requirements and Factors of Safety for Spaceflight Hardware, implementation of the structural verification requirements is expected to be described in a Structural Verification Plan (SVP), which should describe the verification of each structural item against the applicable requirements. The requirement for and expected contents of the SVP are defined in JSC 65828. The SVP may also document unique verifications that meet or exceed these requirements with Technical Authority approval.
Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey
2010-09-01
Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification, which involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a rapid, portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS), conducted with a portable Raman spectrometer and a commercially available SERS substrate. Samples of standard solutions and swab extracts were deposited onto the SERS-active surfaces, allowed to dry, and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable, making it possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS-active surface; understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.
Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) Independent Analysis
NASA Technical Reports Server (NTRS)
Davis, Mitchell L.; Aguilar, Michael L.; Mora, Victor D.; Regenie, Victoria A.; Ritz, William F.
2009-01-01
Two approaches were compared to the Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) approach: the Flat-Sat and the Shuttle Avionics Integration Laboratory (SAIL). The Flat-Sat and CAIL/SAIL approaches are two different tools designed to mitigate different risks. The Flat-Sat approach is designed to develop a mission concept into a flight avionics system and associated ground controller, while the SAIL approach is designed to aid in the flight readiness verification of the flight avionics system. The approaches are complementary, addressing both system development risks and mission verification risks. The following NESC team findings were identified: the CAIL assumption is that the flight subsystems will be matured for the system-level verification; the Flat-Sat and SAIL approaches are two different tools designed to mitigate different risks. The following NESC team recommendation was provided: define, document, and manage a detailed interface between the design and development (EDL and other integration labs) and the verification laboratory (CAIL).
Developing a NASA strategy for the verification of large space telescope observatories
NASA Astrophysics Data System (ADS)
Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie
2006-06-01
In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.
Formal methods for dependable real-time systems
NASA Technical Reports Server (NTRS)
Rushby, John
1993-01-01
The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that were proposed and used are sketched. The formal verifications of clock synchronization algorithms show that mechanically supported reasoning about complex real-time behavior is feasible. However, there has been a significant increase in the effectiveness of verification systems since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.
Formal specification and verification of Ada software
NASA Technical Reports Server (NTRS)
Hird, Geoffrey R.
1991-01-01
The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications, and Ada code, and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.
NASA Astrophysics Data System (ADS)
Da Silva, A.; Sánchez Prieto, S.; Polo, O.; Parra Espada, P.
2013-05-01
Because of the tough robustness requirements in space software development, it is imperative to carry out verification tasks at a very early development stage to ensure that the implemented exception mechanisms work properly. All this should be done long before the real hardware is available. Even when real hardware is available, the verification of software fault tolerance mechanisms can be difficult, since real faulty situations must be systematically and artificially brought about, which can be impossible on real hardware. To solve this problem, the Alcala Space Research Group (SRG) has developed a LEON2 virtual platform (Leon2ViP) with fault injection capabilities. This way it is possible to run the exact same target binary software as runs on the physical system, in a more controlled and deterministic environment, allowing stricter requirements verification. Leon2ViP enables unmanned and tightly focused fault injection campaigns, not possible otherwise, in order to expose and diagnose flaws in the software implementation early. Furthermore, the use of a virtual hardware-in-the-loop approach makes it possible to carry out preliminary integration tests with the spacecraft emulator or the sensors. The use of Leon2ViP has meant a significant improvement, in both time and cost, in the development and verification processes of the Instrument Control Unit boot software on board Solar Orbiter's Energetic Particle Detector.
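Leon2ViP's actual interfaces are not public here; the toy campaign below only illustrates the core idea of deterministic fault injection on a virtual platform: restart from a golden memory image, flip one bit, and observe the software's reaction. The `run_target` callback and outcome labels are hypothetical:

```python
import random

def inject_bit_flip(memory: bytearray, rng: random.Random):
    """Flip one random bit in the emulated target's memory image.
    On a virtual platform this happens between simulated instructions,
    so every fault is reproducible from the RNG seed."""
    addr = rng.randrange(len(memory))
    bit = rng.randrange(8)
    memory[addr] ^= 1 << bit
    return addr, bit

def campaign(run_target, image: bytes, n_faults: int, seed: int = 42):
    """Systematic campaign: each trial restarts from the same golden image,
    injects one fault, and records how the boot software reacted."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_faults):
        mem = bytearray(image)         # fresh copy of the golden image
        addr, bit = inject_bit_flip(mem, rng)
        outcome = run_target(mem)      # e.g. "ok", "trap", "hang" (hypothetical)
        results.append((addr, bit, outcome))
    return results
```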
NASA Astrophysics Data System (ADS)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.
2017-06-01
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
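This is not LIVVkit's API, but the bit-for-bit evaluation it performs amounts to exact array comparison between a test run and reference output, with a report of where values diverge. A minimal sketch:

```python
import numpy as np

def bit_for_bit(test: np.ndarray, ref: np.ndarray, name: str) -> bool:
    """Bit-for-bit check of one model variable against reference output.
    Returns True when identical; otherwise reports the mismatch extent."""
    if test.shape != ref.shape:
        print(f"{name}: shape mismatch {test.shape} vs {ref.shape}")
        return False
    if np.array_equal(test, ref):        # exact, bitwise-equivalent values
        print(f"{name}: bit-for-bit PASS")
        return True
    diff = np.abs(test.astype(float) - ref.astype(float))
    n_bad = int(np.count_nonzero(diff))
    print(f"{name}: FAIL - {n_bad}/{test.size} cells differ, "
          f"max abs diff {diff.max():.3e}")
    return False
```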
Selecting a software development methodology. [of digital flight control systems
NASA Technical Reports Server (NTRS)
Jones, R. E.
1981-01-01
The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about their effectiveness and cost, and the reasons why quantitative assessments are not possible are documented.
An assessment of space shuttle flight software development processes
NASA Technical Reports Server (NTRS)
1993-01-01
In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.
Model-Based Control of an Aircraft Engine using an Optimal Tuner Approach
NASA Technical Reports Server (NTRS)
Connolly, Joseph W.; Chicatelli, Amy; Garg, Sanjay
2012-01-01
This paper covers the development of a model-based engine control (MBEC) methodology applied to an aircraft turbofan engine. Here, a linear model extracted from the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS40k) at a cruise operating point serves as the engine and the on-board model. The on-board model is updated using an optimal tuner Kalman Filter (OTKF) estimation routine, which enables the on-board model to self-tune to account for engine performance variations. The focus here is on developing a methodology for MBEC with direct control of estimated parameters of interest such as thrust and stall margins. MBEC provides the ability for a tighter control bound of thrust over the entire life cycle of the engine that is not achievable using traditional control feedback, which uses engine pressure ratio or fan speed. CMAPSS40k is capable of modeling realistic engine performance, allowing for a verification of the MBEC tighter thrust control. In addition, investigations of using the MBEC to provide a surge limit for the controller limit logic are presented that could provide benefits over a simple acceleration schedule that is currently used in engine control architectures.
Time dependent pre-treatment EPID dosimetry for standard and FFF VMAT.
Podesta, Mark; Nijsten, Sebastiaan M J J G; Persoon, Lucas C G G; Scheib, Stefan G; Baltes, Christof; Verhaegen, Frank
2014-08-21
Methods to calibrate Megavoltage electronic portal imaging devices (EPIDs) for dosimetry have been previously documented for dynamic treatments such as intensity modulated radiotherapy (IMRT) using flattened beams and typically using integrated fields. While these methods verify the accumulated field shape and dose, the dose rate and differential fields remain unverified. The aim of this work is to provide an accurate calibration model for time dependent pre-treatment dose verification using amorphous silicon (a-Si) EPIDs in volumetric modulated arc therapy (VMAT) for both flattened and flattening filter free (FFF) beams. A general calibration model was created using a Varian TrueBeam accelerator, equipped with an aS1000 EPID, for each photon spectrum 6 MV, 10 MV, 6 MV-FFF, 10 MV-FFF. As planned VMAT treatments use control points (CPs) for optimization, measured images are separated into corresponding time intervals for direct comparison with predictions. The accuracy of the calibration model was determined for a range of treatment conditions. Measured and predicted CP dose images were compared using a time dependent gamma evaluation using criteria (3%, 3 mm, 0.5 sec). Time dependent pre-treatment dose verification is possible without an additional measurement device or phantom, using the on-board EPID. Sufficient data is present in trajectory log files and EPID frame headers to reliably synchronize and resample portal images. For the VMAT plans tested, significantly more deviation is observed when analysed in a time dependent manner for FFF and non-FFF plans than when analysed using only the integrated field. We show EPID-based pre-treatment dose verification can be performed on a CP basis for VMAT plans. This model can measure pre-treatment doses for both flattened and unflattened beams in a time dependent manner which highlights deviations that are missed in integrated field verifications.
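A full implementation of the time-resolved method is beyond a sketch, but the per-control-point comparison rests on the gamma index. Below is a simplified global 2D gamma (brute-force neighbourhood search; edge wrap-around from `np.roll` is accepted as a simplification); the criteria values mirror the abstract, everything else is an assumption:

```python
import numpy as np

def gamma_2d(ref, meas, pixel_mm, dd=0.03, dta_mm=3.0):
    """Simplified global 2D gamma between one control point's predicted
    (ref) and measured (meas) dose images. Dose criterion is a fraction
    of the reference maximum; search is brute-force within dta_mm."""
    dose_norm = dd * ref.max()
    r = int(np.ceil(dta_mm / pixel_mm))          # search radius in pixels
    gamma2 = np.full(ref.shape, np.inf, dtype=float)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            dist2 = (dy * pixel_mm) ** 2 + (dx * pixel_mm) ** 2
            if dist2 > dta_mm ** 2:
                continue
            shifted = np.roll(np.roll(meas, dy, axis=0), dx, axis=1)
            g2 = ((shifted - ref) / dose_norm) ** 2 + dist2 / dta_mm ** 2
            gamma2 = np.minimum(gamma2, g2)
    return np.sqrt(gamma2)

def pass_rate(ref, meas, pixel_mm):
    """Per-control-point pass rate, e.g. gamma(3%, 3 mm) <= 1,
    evaluated above a 10% dose threshold (threshold is an assumption)."""
    g = gamma_2d(ref, meas, pixel_mm)
    return 100.0 * np.mean(g[ref > 0.1 * ref.max()] <= 1.0)
```

Run per control point, this yields the time-resolved pass rates the abstract contrasts with integrated-field analysis; the 0.5 s criterion would additionally constrain which measured frames may be compared with which predicted control point.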
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
Microcode Verification Project.
1980-05-01
numerical constant. The internal syntax for these minimum and maximum values is REALMIN and REALMAX. ISPSSIMP is the file simplifying bitstring...To be fair, it is quite clear that much of the labor in the verification task can be reduced if verification and code development are carried out...basis of and the language we have chosen for both encoding our descriptions of machines and reasoning about the course of computations. Internally, our
Cost-Effective CNC Part Program Verification Development for Laboratory Instruction.
ERIC Educational Resources Information Center
Chen, Joseph C.; Chang, Ted C.
2000-01-01
Describes a computer numerical control program verification system that checks a part program before its execution. The system includes character recognition, word recognition, a fuzzy-nets system, and a tool path viewer. (SK)
NASA Astrophysics Data System (ADS)
Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui
2011-05-01
During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Programme short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to objectively verify these experiments and to clarify the problems associated with the current MEP systems through these shared experiences. Verification was performed using the MEP outputs interpolated onto a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spreads grew as the forecast time increased, and the ensemble mean improved on the forecast errors of the individual control forecasts in the verification against the analysis fields. However, each system exhibited individual characteristics according to the MEP method. Some participants used physical perturbation methods, and the significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that it is necessary to pay careful attention to physical perturbations.
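The headline verification quantities, ensemble spread, ensemble-mean error, and mean error (bias), are straightforward once all forecasts sit on the common 15 km grid. A minimal sketch (array layout is an assumption):

```python
import numpy as np

def ensemble_verification(members, control, analysis):
    """members: (n_members, ny, nx) forecasts on the common grid;
    control, analysis: (ny, nx). Returns RMSE of the ensemble mean and
    of the control against the analysis, the ensemble spread, and the
    mean error (bias) of the ensemble mean."""
    ens_mean = members.mean(axis=0)
    rmse_mean = np.sqrt(np.mean((ens_mean - analysis) ** 2))
    rmse_ctrl = np.sqrt(np.mean((control - analysis) ** 2))
    spread = np.sqrt(np.mean(members.var(axis=0, ddof=1)))
    me = np.mean(ens_mean - analysis)
    return rmse_mean, rmse_ctrl, spread, me
```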
2009-01-01
Background: The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results: We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions: The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075
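Neither GNA's nor NUSMV's interfaces are shown here; as an illustration of the underlying idea, checking a reachability-style temporal property ("is a state satisfying p reachable?", CTL's EF p) over a finite qualitative state graph reduces to graph search:

```python
from collections import deque

def check_EF(transitions, initial, prop):
    """Model-check EF prop on a finite state graph: is some state
    satisfying `prop` reachable from `initial`? `transitions` maps a
    state to its successor states; `prop` is a predicate on states."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        if prop(s):
            return True
        for t in transitions.get(s, ()):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return False

# Toy qualitative network: states are (gene_a_level, gene_b_level)
transitions = {(0, 0): [(1, 0)], (1, 0): [(1, 1)], (1, 1): [(0, 1)]}
print(check_EF(transitions, (0, 0), lambda s: s == (1, 1)))  # True
```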
International Space Station Requirement Verification for Commercial Visiting Vehicles
NASA Technical Reports Server (NTRS)
Garguilo, Dan
2017-01-01
The COTS program demonstrated NASA could rely on commercial providers for safe, reliable, and cost-effective cargo delivery to ISS. The ISS Program has developed a streamlined process to safely integrate commercial visiting vehicles and ensure requirements are met: levy a minimum requirement set (down from thousands to hundreds) focusing on the ISS interface and safety, reducing the level of NASA oversight/insight and the burden on the commercial partner. Partners provide a detailed verification and validation plan documenting how they will show they have met NASA requirements. NASA conducts process sampling to ensure that the established verification processes are being followed, and participates in joint verification events and analysis for requirements that both parties must verify. Verification compliance is approved by NASA, and launch readiness is certified at mission readiness reviews.
NASA Astrophysics Data System (ADS)
Vinande, Eric T.
This research proposes several means of overcoming challenges to ground vehicle global positioning system (GPS) receiver navigation performance in the urban environment through the integration of external sensor information. The effects of narrowband radio frequency interference and signal attenuation, both common in the urban environment, are examined with respect to receiver signal tracking processes. Low-cost microelectromechanical systems (MEMS) inertial sensors, suitable for the consumer market, are the focus of receiver augmentation as they provide an independent measure of motion and are independent of vehicle systems. A method for estimating the mounting angles of an inertial sensor cluster utilizing typical urban driving maneuvers is developed and is able to provide angular measurements within two degrees of truth. The integration of GPS and MEMS inertial sensors is developed utilizing a full state navigation filter. Appropriate statistical methods are developed to evaluate the urban environment navigation improvement due to the addition of MEMS inertial sensors. A receiver evaluation metric that combines accuracy, availability, and maximum error measurements is presented and evaluated over several drive tests. Following a description of proper drive test techniques, record and playback systems are evaluated as the optimal way of testing multiple receivers and/or integrated navigation systems in the urban environment, as they simplify vehicle testing requirements.
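The loosely coupled GPS/inertial idea can be illustrated with a one-dimensional Kalman filter: propagate position and velocity with an accelerometer reading, then correct with a GPS position fix. This is a hedged sketch only; the noise values, time step, and model structure are illustrative assumptions, not the dissertation's full state navigation filter.

    # 1-D GPS/MEMS fusion sketch; all noise parameters are assumed.
    import numpy as np

    dt = 0.1
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    B = np.array([[0.5 * dt**2], [dt]])     # accelerometer input matrix
    H = np.array([[1.0, 0.0]])              # GPS measures position only
    Q = np.eye(2) * 1e-3                    # process noise (assumed)
    R = np.array([[4.0]])                   # GPS variance, (2 m)^2 (assumed)

    x = np.zeros((2, 1))                    # [position; velocity]
    P = np.eye(2)

    def step(x, P, accel, gps_pos):
        # Predict with the MEMS accelerometer measurement.
        x = F @ x + B * accel
        P = F @ P @ F.T + Q
        # Update with the GPS fix.
        y = np.array([[gps_pos]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        return x, P

    x, P = step(x, P, accel=0.2, gps_pos=0.05)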
Developing a Test for Assessing Elementary Students' Comprehension of Science Texts
ERIC Educational Resources Information Center
Wang, Jing-Ru; Chen, Shin-Feng; Tsay, Reuy-Fen; Chou, Ching-Ting; Lin, Sheau-Wen; Kao, Huey-Lien
2012-01-01
This study reports on the process of developing a test to assess students' reading comprehension of scientific materials and on the statistical results of the verification study. A combination of classic test theory and item response theory approaches was used to analyze the assessment data from a verification study. Data analysis indicates the…
NASA Astrophysics Data System (ADS)
Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.
Today's software for aerospace systems is typically very complex. This is due to the increasing number of features as well as the high demands on safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking have become more popular. The latter examine the whole state space and, consequently, provide full test coverage. Nevertheless, despite the obvious advantages, this technique is still rarely used in the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use the SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy, and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.
ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) OF PAINT OVERSPRAY ARRESTORS
The paper discusses the environmental technology verification (ETV) of paint overspray arrestors undertaken as part of a program to accelerate the development and commercialization of improved environmental technologies through third-party verification and reporting of performan...
ETV PILOT FOR SOURCE WATER PROTECTION TECHNOLOGY VERIFICATION
The Environmental Technology Verification (ETV) Program, a five-year pilot, provides technology purchasers, permitters and developers with objective, quality assured performance data on new and/or improved technologies. EPA has partnered with the National Sanitation Foundation (...
In-Field Performance Testing of Stormwater Treatment Devices
The Environmental Technology Verification (ETV) Program was created by EPA’s Office of Research and Development to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program’s goal ...
ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: WET-WEATHER FLOW/SOURCE WATER PROTECTION
This paper presents an overview of the Environmental Protection Agency's (EPA) Environmental Technology Verification (ETV) program which was established to overcome the numerous impediments to commercialization experienced by developers of innovative environmental technologies. ...
Software Verification of Orion Cockpit Displays
NASA Technical Reports Server (NTRS)
Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee
2017-01-01
NASA's latest spacecraft Orion is in the development process of taking humans deeper into space. Orion is equipped with three main displays to monitor and control the spacecraft. To ensure the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center along with the University of Texas at Tyler employed a software verification tool, EggPlant Functional by TestPlant. It is an image-based test automation tool that allows users to create scripts to verify the functionality within a program. An edge key framework and a set of Common EggPlant Functions were developed to enable efficient script creation. This framework standardized how scripts are coded and how user inputs are simulated in the verification process. Moreover, the Common EggPlant Functions can be reused in the verification of different displays.
Expert system verification and validation study. Delivery 3A and 3B: Trip summaries
NASA Technical Reports Server (NTRS)
French, Scott
1991-01-01
Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how these might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods for applying mathematical techniques to the verification of rule bases and on techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation.' The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.
A Study of Feature Combination for Vehicle Detection Based on Image Processing
2014-01-01
Video analytics play a critical role in most recent traffic monitoring and driver assistance systems. In this context, the correct detection and classification of surrounding vehicles through image analysis has been the focus of extensive research in recent years. Most work reported on image-based vehicle verification makes use of supervised classification approaches and resorts to techniques such as histograms of oriented gradients (HOG), principal component analysis (PCA), and Gabor filters, among others. Unfortunately, existing approaches are lacking in two respects: first, no comparison between methods on a common body of data has been performed; second, the potential of combining popular features for vehicle classification has not been studied. In this study, the performance of the different techniques is first reviewed and compared using a common public database. Then, the combination capabilities of these techniques are explored and a methodology is presented for the fusion of classifiers built upon them, also taking into account the vehicle pose. The study unveils the limitations of single-feature based classification and makes clear that fusion of classifiers is highly beneficial for vehicle verification. PMID:24672299
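A minimal sketch of the score-level fusion idea, under assumed inputs: per-feature classifiers (e.g., trained on HOG, PCA, and Gabor outputs) each return a probability that a patch contains a vehicle, and a weighted sum combines them. The weights and scores below are hypothetical, not values from the study.

    # Score-level fusion sketch; weights and scores are illustrative.
    import numpy as np

    def fuse_scores(scores, weights):
        """scores: per-classifier vehicle probabilities; weights sum to 1."""
        scores, weights = np.asarray(scores), np.asarray(weights)
        return float(scores @ weights)

    hog_p, pca_p, gabor_p = 0.81, 0.64, 0.72   # hypothetical classifier outputs
    fused = fuse_scores([hog_p, pca_p, gabor_p], [0.5, 0.2, 0.3])
    is_vehicle = fused > 0.5

In practice the paper additionally conditions the fusion on vehicle pose; a pose-dependent weight set per viewpoint would be the natural extension of this sketch.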
Verification of target motion effects on SAR imagery using the Gotcha GMTI challenge dataset
NASA Astrophysics Data System (ADS)
Hack, Dan E.; Saville, Michael A.
2010-04-01
This paper investigates the relationship between a ground moving target's kinematic state and its SAR image. While effects such as cross-range offset, defocus, and smearing appear well understood, their derivations in the literature typically employ simplifications of the radar/target geometry and assume point scattering targets. This study adopts a geometrical model for understanding target motion effects in SAR imagery, termed the target migration path, and focuses on experimental verification of predicted motion effects using both simulated and empirical datasets based on the Gotcha GMTI challenge dataset. Specifically, moving target imagery is generated from three data sources: first, simulated phase history for a moving point target; second, simulated phase history for a moving vehicle derived from a simulated Mazda MPV X-band signature; and third, empirical phase history from the Gotcha GMTI challenge dataset. Both simulated target trajectories match the truth GPS target position history from the Gotcha GMTI challenge dataset, allowing direct comparison between all three imagery sets and the predicted target migration path. This paper concludes with a discussion of the parallels between the target migration path and the measurement model within a Kalman filtering framework, followed by conclusions.
GateKeeper: a new hardware architecture for accelerating pre-alignment in DNA short read mapping.
Alser, Mohammed; Hassan, Hasan; Xin, Hongyi; Ergin, Oguz; Mutlu, Onur; Alkan, Can
2017-11-01
High throughput DNA sequencing (HTS) technologies generate an excessive number of small DNA segments, called short reads, that cause significant computational burden. To analyze the entire genome, each of the billions of short reads must be mapped to a reference genome based on the similarity between a read and 'candidate' locations in that reference genome. The similarity measurement, called alignment, formulated as an approximate string matching problem, is the computational bottleneck because: (i) it is implemented using quadratic-time dynamic programming algorithms and (ii) the majority of candidate locations in the reference genome do not align with a given read due to high dissimilarity. Calculating the alignment of such incorrect candidate locations consumes an overwhelming majority of a modern read mapper's execution time. Therefore, it is crucial to develop a fast and effective filter that can detect incorrect candidate locations and eliminate them before invoking computationally costly alignment algorithms. We propose GateKeeper, a new hardware accelerator that functions as a pre-alignment step that quickly filters out most incorrect candidate locations. GateKeeper is the first design to accelerate pre-alignment using Field-Programmable Gate Arrays (FPGAs), which can perform pre-alignment much faster than software. When implemented on a single FPGA chip, GateKeeper maintains high accuracy (on average >96%) while providing, on average, 90-fold and 130-fold speedup over the state-of-the-art software pre-alignment techniques, Adjacency Filter and Shifted Hamming Distance (SHD), respectively. The addition of GateKeeper as a pre-alignment step can reduce the verification time of the mrFAST mapper by a factor of 10. https://github.com/BilkentCompGen/GateKeeper. mohammedalser@bilkent.edu.tr or onur.mutlu@inf.ethz.ch or calkan@cs.bilkent.edu.tr. Supplementary data are available at Bioinformatics online.
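The essence of pre-alignment can be shown in a few lines of software. This sketch is a deliberately simplified analogue, not GateKeeper's FPGA design (which, like SHD, also tolerates shifts for insertions and deletions): count mismatches between a read and a candidate reference window and skip the costly dynamic-programming alignment when they exceed the edit budget.

    # Simplified pre-alignment filter sketch (software analogue only).
    def passes_prealignment(read, ref_window, max_edits):
        mismatches = sum(a != b for a, b in zip(read, ref_window))
        return mismatches <= max_edits

    read = "ACGTACGT"
    candidate = "ACGTTCGT"
    if passes_prealignment(read, candidate, max_edits=2):
        pass  # only now invoke the quadratic-time aligner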
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baeza, J.A.; Ureba, A.; Jimenez-Ortega, E.
Purpose: Although several radiotherapy research platforms exist, such as CERR (the most widely used and referenced), SlicerRT (which allows treatment plan comparison from various sources), and MMCTP (a full MCTP system), a full MCTP toolset is still needed that provides users complete control of calculation grids, interpolation methods, and filters in order to "fairly" compare results from different TPSs, supporting verification with experimental measurements. Methods: This work presents CARMEN, a MatLab-based platform including multicore and GPGPU accelerated functions for loading RT data, designing treatment plans, and evaluating dose matrices and experimental data. CARMEN supports anatomic and functional imaging in DICOM format, as well as RTSTRUCT, RTPLAN and RTDOSE. Besides, it contains numerous tools to accomplish the MCTP process, managing egs4phant and phase space files. The CARMEN planning mode assists in designing IMRT, VMAT and MERT treatments via both inverse and direct optimization. The evaluation mode contains a comprehensive toolset (e.g. 2D/3D gamma evaluation, difference matrices, profiles, DVH, etc.) to compare datasets from commercial TPSs, MC simulations (i.e. 3ddose) and radiochromic film in a user-controlled manner. Results: CARMEN has been validated against commercial RTPs and well-established evaluation tools, showing coherent behavior of its multiple algorithms. Furthermore, the CARMEN platform has been used to generate competitive complex treatments that have been published in comparative studies. Conclusion: A new research-oriented MCTP platform with a customized validation toolset has been presented. Despite being coded in a high-level programming language, CARMEN is agile due to the use of parallel algorithms. The widespread use of MatLab provides straightforward access to CARMEN's algorithms for most researchers. Similarly, our platform can benefit from scientific developments of the MatLab community, such as filters, registration algorithms, etc. Finally, CARMEN highlights the importance of grid and filtering control in treatment plan comparison.
Integrated testing and verification system for research flight software
NASA Technical Reports Server (NTRS)
Taylor, R. N.
1979-01-01
The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification and test options are provided with special attention on real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.
van Hoof, Joris J; Gosselt, Jordy F; de Jong, Menno D T
2010-02-01
To compare traditional in-store age verification with a newly developed remote age verification system, 100 cigarette purchase attempts were made by 15-year-old "mystery shoppers." The remote system led to a strong increase in compliance (96% vs. 12%), reflecting more identification requests and more sale refusals when adolescents showed their identification cards.
C formal verification with unix communication and concurrency
NASA Technical Reports Server (NTRS)
Hoover, Doug N.
1990-01-01
The results of a NASA SBIR project are presented in which CSP-Ariel, a verification system for C programs which use Unix system calls for concurrent programming, interprocess communication, and file input and output, was developed. This project builds on ORA's Ariel C verification system by using the system of Hoare's book, Communicating Sequential Processes, to model concurrency and communication. The system runs in ORA's Clio theorem proving environment. The use of CSP to model Unix concurrency and sketch the CSP semantics of a simple concurrent program is outlined. Plans for further development of CSP-Ariel are discussed. This paper is presented in viewgraph form.
Baseline Design Compliance Matrix for the Rotary Mode Core Sampling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
LECHELT, J.A.
2000-10-17
The purpose of the design compliance matrix (DCM) is to provide a single-source document of all design requirements associated with the fifteen subsystems that make up the rotary mode core sampling (RMCS) system. It is intended to be the baseline requirement document for the RMCS system and to be used in governing all future design and design verification activities associated with it. This document is the DCM for the RMCS system used on Hanford single-shell radioactive waste storage tanks. This includes the Exhauster System, Rotary Mode Core Sample Trucks, Universal Sampling System, Diesel Generator System, Distribution Trailer, X-Ray Cart System, Breathing Air Compressor, Nitrogen Supply Trailer, Casks and Cask Truck, Service Trailer, Core Sampling Riser Equipment, Core Sampling Support Trucks, Foot Clamp, Ramps and Platforms and Purged Camera System. Excluded items are tools such as light plants and light stands. Other items such as the breather inlet filter are covered by a different design baseline. In this case, the inlet breather filter is covered by the Tank Farms Design Compliance Matrix.
NASA Astrophysics Data System (ADS)
Yuan, Shenfang; Chen, Jian; Yang, Weibo; Qiu, Lei
2017-08-01
Fatigue crack growth prognosis is important for prolonging service time, improving safety, and reducing maintenance cost in many safety-critical systems, such as aircraft, wind turbines, bridges, and nuclear plants. Combining fatigue crack growth models with the particle filter (PF) method has proved promising to deal with the uncertainties during fatigue crack growth and reach a more accurate prognosis. However, research on prognosis methods integrating on-line crack monitoring with the PF method is still lacking, as are experimental verifications. Besides, the PF methods adopted so far are almost all sequential importance resampling-based PFs, which usually encounter sample impoverishment problems and hence perform poorly. To solve these problems, in this paper, the piezoelectric transducers (PZTs)-based active Lamb wave method is adopted for on-line crack monitoring. The deterministic resampling PF (DRPF) is proposed for fatigue crack growth prognosis, as it can overcome the sample impoverishment problem. The proposed method is verified through fatigue tests of attachment lugs, which are an important kind of joint component in aerospace systems.
Simulation and analyses of the aeroassist flight experiment attitude update method
NASA Technical Reports Server (NTRS)
Carpenter, J. R.
1991-01-01
A method which will be used to update the alignment of the Aeroassist Flight Experiment's Inertial Measuring Unit is simulated and analyzed. This method, the Star Line Maneuver, uses measurements from the Space Shuttle Orbiter star trackers along with an extended Kalman filter to estimate a correction to the attitude quaternion maintained by an Inertial Measuring Unit in the Orbiter's payload bay. This quaternion is corrupted by on-orbit bending of the Orbiter payload bay with respect to the Orbiter navigation base, which is incorporated into the payload quaternion when it is initialized via a direct transfer of the Orbiter attitude state. The method of updating this quaternion is examined through verification of baseline cases and Monte Carlo analysis using a simplified simulation. The simulation uses nominal state dynamics and measurement models from the Kalman filter as its real-world models, and is programmed on a MicroVAX minicomputer using Matlab, an interactive matrix analysis tool. Results are presented which confirm and augment previous performance studies, thereby enhancing confidence in the Star Line Maneuver design methodology.
NASA Astrophysics Data System (ADS)
Zhu, Yanwei; Yi, Fajun; Meng, Songhe; Zhuo, Lijun; Pan, Weizhen
2017-11-01
Improving the surface heat load measurement technique for vehicles in aerodynamic heating environments is imperative, in terms of both apparatus design and identification efficiency. A simple novel apparatus is designed for heat load identification, taking into account the lessons learned from several aerodynamic heating measurement devices. An inverse finite difference scheme (invFDM) for the apparatus is studied to identify its surface heat flux from interior temperature measurements with high efficiency. A weighted piecewise regression filter is also proposed for temperature measurement prefiltering. Preliminary verification of the invFDM scheme and the filter is accomplished via numerical simulation experiments. Three specific pieces of apparatus have been designed and fabricated using different sensing materials. The aerodynamic heating process is simulated by an inductively coupled plasma wind tunnel facility. The identification of surface temperature and heat flux from the temperature measurements is performed by invFDM. The results validate the high efficiency, reliability and feasibility of heat load measurements at different heat flux levels utilizing the designed apparatus and proposed method.
False star detection and isolation during star tracking based on improved chi-square tests.
Zhang, Hao; Niu, Yanxiong; Lu, Jiazhen; Yang, Yanqiang; Su, Guohua
2017-08-01
The star sensor is a precise attitude measurement device for a spacecraft. Star tracking is the main and key working mode for a star sensor. However, during star tracking, false stars are an inevitable interference for star sensor applications and may result in degraded measurement accuracy. A false star detection and isolation algorithm for star tracking, based on improved chi-square tests, is proposed in this paper. Two estimations are established based on a Kalman filter and on a priori information, respectively. False star detection is performed by applying a global state chi-square test within the Kalman filter. False star isolation is achieved using a local state chi-square test. Semi-physical experiments under different trajectories with various false stars are designed for verification. Experiment results show that various false stars can be detected and isolated from navigation stars during star tracking, and the attitude measurement accuracy is hardly influenced by false stars. The proposed algorithm is proved to have excellent performance in terms of speed, stability, and robustness.
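The chi-square gating idea generalizes well beyond star trackers, so a short sketch helps: the normalized innovation squared of a Kalman filter measurement is compared against a chi-square threshold. This is a generic illustration, not the paper's improved test; the 2-degree-of-freedom, 99%-level threshold is an assumption.

    # Innovation-based chi-square gating sketch (generic, not the
    # paper's exact improved test).
    import numpy as np

    CHI2_THRESHOLD = 9.21  # scipy.stats.chi2.ppf(0.99, df=2)

    def is_false_star(z, z_pred, H, P, R):
        innovation = z - z_pred
        S = H @ P @ H.T + R                              # innovation covariance
        d2 = innovation @ np.linalg.inv(S) @ innovation  # normalized innovation squared
        return float(d2) > CHI2_THRESHOLD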
Definition of ground test for Large Space Structure (LSS) control verification
NASA Technical Reports Server (NTRS)
Waites, H. B.; Doane, G. B., III; Tollison, D. K.
1984-01-01
An overview for the definition of a ground test for the verification of Large Space Structure (LSS) control is given. The definition contains information on the description of the LSS ground verification experiment, the project management scheme, the design, development, fabrication and checkout of the subsystems, the systems engineering and integration, the hardware subsystems, the software, and a summary which includes future LSS ground test plans. Upon completion of these items, NASA/Marshall Space Flight Center will have an LSS ground test facility which will provide sufficient data on dynamics and control verification of LSS so that LSS flight system operations can be reasonably ensured.
Potluri, Chandrasekhar; Anugolu, Madhavi; Schoen, Marco P; Subbaram Naidu, D; Urfer, Alex; Chiu, Steve
2013-11-01
Estimating skeletal muscle (finger) forces using surface electromyography (sEMG) signals poses many challenges. In general, sEMG measurements are based on single-sensor data. In this paper, two novel hybrid fusion techniques for estimating skeletal muscle force from sEMG array sensors are proposed. The sEMG signals are pre-processed using five different filters: Butterworth, Chebychev Type II, Exponential, Half-Gaussian, and Wavelet transforms. Dynamic models are extracted from the acquired data using system identification techniques based on Nonlinear Wiener Hammerstein (NLWH) models and Spectral Analysis Frequency Dependent Resolution (SPAFDR) models. A detailed comparison is provided for the proposed filters and models using 18 healthy subjects. Wavelet transforms give a higher mean correlation of 72.6 ± 1.7 (mean ± SD) and 70.4 ± 1.5 (mean ± SD) for NLWH and SPAFDR models, respectively, when compared to the other filters used in this work. Experimental verification of the fusion-based hybrid models with wavelet transform shows a 96% mean correlation and a 3.9% mean relative error, with standard deviations of ± 1.3 and ± 0.9 respectively, between the force estimated by the overall hybrid fusion algorithm and the actual force for the 18 test subjects' k-fold cross-validation data.
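As a concrete illustration of one of the listed pre-processing steps, here is a zero-phase Butterworth band-pass applied to a raw sEMG channel. The 20-450 Hz band, 1 kHz sampling rate, and filter order are common choices assumed for the sketch, not parameters reported by the paper.

    # sEMG band-pass pre-processing sketch; parameters are assumed.
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 1000.0
    b, a = butter(N=4, Wn=[20 / (fs / 2), 450 / (fs / 2)], btype="band")

    raw_semg = np.random.randn(5000)        # stand-in for an acquired channel
    filtered = filtfilt(b, a, raw_semg)     # zero-phase (forward-backward) filtering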
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parchevsky, K. V.; Zhao, J.; Hartlep, T.
We performed three-dimensional numerical simulations of the solar surface acoustic wave field for the quiet Sun and for three models with different localized sound-speed perturbations in the interior with deep, shallow, and two-layer structures. We used the simulated data generated by two solar acoustics codes that employ the same standard solar model as a background model, but utilize different integration techniques and different models of stochastic wave excitation. Acoustic travel times were measured using a time-distance helioseismology technique, and compared with predictions from ray theory frequently used for helioseismic travel-time inversions. It is found that the measured travel-time shifts agree well with the helioseismic theory for sound-speed perturbations, and for the measurement procedure with and without phase-speed filtering of the oscillation signals. This testing verifies the whole measuring-filtering-inversion procedure for static sound-speed anomalies with small amplitude inside the Sun outside regions of strong magnetic field. It is shown that the phase-speed filtering, frequently used to extract specific wave packets and improve the signal-to-noise ratio, does not introduce significant systematic errors. Results of the sound-speed inversion procedure show good agreement with the perturbation models in all cases. Due to its smoothing nature, the inversion procedure may overestimate sound-speed variations in regions with sharp gradients of the sound-speed profile.
Assessment of replicate bias in 454 pyrosequencing and a multi-purpose read-filtering tool.
Jérôme, Mariette; Noirot, Céline; Klopp, Christophe
2011-05-26
The Roche 454 pyrosequencing platform is often considered the most versatile of the Next Generation Sequencing technology platforms, permitting the sequencing of large genomes, the analysis of variations, or the study of transcriptomes. A recently reported bias leads to the production of multiple reads for a unique DNA fragment in a random manner within a run. This bias has a direct impact on the quality of the measurement of the representation of the fragments using the reads. Other cleaning steps are usually performed on the reads before assembly or alignment. PyroCleaner is a software module intended to clean 454 pyrosequencing reads in order to ease the assembly process. The program is free software and is distributed under the terms of the GNU General Public License as published by the Free Software Foundation. It implements several filters using criteria such as read duplication, length, complexity, base-pair quality, and number of undetermined bases. It can also clean flowgram (.sff) files of paired-end sequences, generating a validated paired-end file on the one hand and a single-read file on the other. Read cleaning has always been an important step in sequence analysis. The PyroCleaner Python module is a Swiss Army knife dedicated to cleaning 454 reads. It includes commonly used filters as well as specialized ones such as duplicated-read removal and paired-end read verification.
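A simplified sketch of the duplicated-read filter idea (not PyroCleaner's actual code): 454 duplicates of a fragment typically share their starting sequence, so reads with an identical prefix can be collapsed to one representative. The prefix length is a tunable assumption.

    # Duplicate-read collapsing sketch; prefix length is illustrative.
    def drop_duplicates(reads, prefix_len=50):
        seen, kept = set(), []
        for read in reads:
            key = read[:prefix_len]
            if key not in seen:
                seen.add(key)
                kept.append(read)
        return kept

    reads = ["ACGTACGTA", "ACGTACGTA", "TTGCAAGGC"]
    print(len(drop_duplicates(reads, prefix_len=9)))  # 2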
A New Objective Technique for Verifying Mesoscale Numerical Weather Prediction Models
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Manobianco, John; Lane, John E.; Immer, Christopher D.
2003-01-01
This report presents a new objective technique to verify predictions of the sea-breeze phenomenon over east-central Florida by the Regional Atmospheric Modeling System (RAMS) mesoscale numerical weather prediction (NWP) model. The Contour Error Map (CEM) technique identifies sea-breeze transition times in objectively-analyzed grids of observed and forecast wind, verifies the forecast sea-breeze transition times against the observed times, and computes the mean post-sea-breeze wind direction and speed to compare the observed and forecast winds behind the sea-breeze front. The CEM technique is superior to traditional objective verification techniques and previously-used subjective verification methodologies because: it is automated, requiring little manual intervention; it accounts for both spatial and temporal scales and variations; it accurately identifies and verifies the sea-breeze transition times; and it provides verification contour maps and simple statistical parameters for easy interpretation. The CEM uses a parallel lowpass boxcar filter and a high-order bandpass filter to identify the sea-breeze transition times at the observed and model grid points. Once the transition times are identified, the CEM fits a Gaussian function to the histogram of transition time differences between the model and observations. The fitted parameters of the Gaussian function then describe the timing bias and the variance of the timing differences across the valid comparison domain. Once the transition times are all identified at each grid point, the CEM computes the mean wind direction and speed during the remainder of the day for all times and grid points after the sea-breeze transition time. The CEM technique performed quite well when compared to independent meteorological assessments of the sea-breeze transition times and results from a previously published subjective evaluation. The algorithm correctly identified a forecast or observed sea-breeze occurrence or absence 93% of the time during the two-month evaluation period from July and August 2000. Nearly all failures of the CEM were the result of complex precipitation features (observed or forecast) that contaminated the wind field, resulting in a false identification of a sea-breeze transition. A qualitative comparison between the CEM timing errors and the subjectively determined observed and forecast transition times indicates that the algorithm performed very well overall. Most discrepancies between the CEM results and the subjective analysis were again caused by observed or forecast areas of precipitation that led to complex wind patterns. The CEM also failed on a day when the observed sea-breeze transition affected only a very small portion of the verification domain. Based on the results of the CEM, the RAMS tended to predict the onset and movement of the sea-breeze transition too early and/or too quickly. The domain-wide timing biases provided by the CEM indicated an early bias on 30 out of 37 days when both an observed and forecast sea breeze occurred over the same portions of the analysis domain. These results are consistent with previous subjective verifications of the RAMS sea-breeze predictions. A comparison of the mean post-sea-breeze winds indicates that RAMS has a positive wind-speed bias for all days, which is also consistent with the early bias in the sea-breeze transition time, since the higher wind speeds resulted in a faster inland penetration of the sea breeze compared to reality.
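The final statistical step of the CEM can be sketched compactly: fit a Gaussian to the histogram of forecast-minus-observed transition times to recover a timing bias and spread. The data below are synthetic and the fitting choices (bin count, initial guesses) are assumptions, not the report's configuration.

    # Gaussian fit to a histogram of timing differences (synthetic data).
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian(t, amp, mu, sigma):
        return amp * np.exp(-((t - mu) ** 2) / (2 * sigma**2))

    diffs = np.random.normal(loc=-30.0, scale=20.0, size=500)  # minutes, synthetic
    counts, edges = np.histogram(diffs, bins=25)
    centers = 0.5 * (edges[:-1] + edges[1:])

    (amp, mu, sigma), _ = curve_fit(
        gaussian, centers, counts,
        p0=[counts.max(), centers[np.argmax(counts)], 15.0])
    print(f"timing bias ~ {mu:.1f} min, spread ~ {abs(sigma):.1f} min")

A negative fitted bias here would correspond to the early-onset tendency the report found in the RAMS forecasts.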
Environmental Technology Verification Program Fact Sheet
This is a Fact Sheet for the ETV Program. The EPA Environmental Technology Verification Program (ETV) develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The program ...
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
This technical report contains the Higher-Order Logic (HOL) listings of the partial verification of the requirements and design for a commercially developed processor interface unit (PIU). The PIU is an interface chip performing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault tolerant computer system. This system, the Fault Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. This report contains the actual HOL listings of the PIU verification as it currently exists. Section two of this report contains general-purpose HOL theories and definitions that support the PIU verification. These include arithmetic theories dealing with inequalities and associativity, and a collection of tactics used in the PIU proofs. Section three contains the HOL listings for the completed PIU design verification. Section four contains the HOL listings for the partial requirements verification of the P-Port.
Development of a three-dimensional high-order strand-grids approach
NASA Astrophysics Data System (ADS)
Tong, Oisin
Development of a novel high-order flux correction method on strand grids is presented. The method uses a combination of flux correction in the unstructured plane and summation-by-parts operators in the strand direction to achieve high-fidelity solutions. Low-order truncation errors are cancelled with accurate flux and solution gradients in the flux correction method, thereby achieving a formal order of accuracy of 3, although higher orders are often obtained, especially for highly viscous flows. In this work, the scheme is extended to high-Reynolds number computations in both two and three dimensions. Turbulence closure is achieved with a robust version of the Spalart-Allmaras turbulence model that accommodates negative values of the turbulence working variable, and the Menter SST turbulence model, which blends the k-epsilon and k-omega turbulence models for better accuracy. A major advantage of this high-order formulation is the ability to implement traditional finite volume-like limiters to cleanly capture shocked and discontinuous flows. In this work, this approach is explored via a symmetric limited positive (SLIP) limiter. Extensive verification and validation is conducted in two and three dimensions to determine the accuracy and fidelity of the scheme for a number of different cases. Verification studies show that the scheme achieves better than third-order accuracy for low and high-Reynolds number flows. Cost studies show that in three dimensions, the third-order flux correction scheme requires only 30% more wall time than a traditional second-order scheme on strand grids to achieve the same level of convergence. In order to overcome meshing issues at sharp corners and other small-scale features, a unique approach to traditional geometry, coined "asymptotic geometry," is explored. Asymptotic geometry is achieved by filtering out small-scale features in a level set domain through min/max flow. This approach is combined with a curvature-based strand shortening strategy in order to qualitatively improve strand grid mesh quality.
NASA Astrophysics Data System (ADS)
Meneguz, Elena; Turp, Debi; Wells, Helen
2015-04-01
It is well known that encounters with moderate or severe turbulence can lead to passenger injuries and incur high costs for airlines from compensation and litigation. As one of two World Area Forecast Centres (WAFCs), the Met Office has responsibility for forecasting en-route weather hazards worldwide for aviation above a height of 10,000 ft. Observations from commercial aircraft provide a basis for gaining a better understanding of turbulence and for improving turbulence forecasts through verification. However, there is currently a lack of information regarding the possible cause of the observed turbulence, or whether the turbulence occurred within cloud. Such information would be invaluable for the development of forecasting techniques for particular types of turbulence and for forecast verification. Of all the possible sources of turbulence, convective activity is believed to be a major cause. Its relative importance over the Europe and North Atlantic area has not yet been quantified in a systematic way; in this study, a new approach is developed to automate the identification of turbulence encounters in the proximity of convective clouds. Observations of convection are provided by two independent sources: a surface-based lightning network and satellite imagery. Lightning observations are taken from the Met Office Arrival Time Detections network (ATDnet). ATDnet has been designed to identify cloud-to-ground flashes over Europe but also detects (a smaller fraction of) strikes over the North Atlantic. Meteosat Second Generation (MSG) satellite products are used to identify convective clouds by applying a brightness temperature filtering technique. The morphological features of cold cloud tops are also investigated. The system is run for all in situ turbulence reports received from airlines over a total of 12 months during summer 2013 and 2014 for the domain of interest. Results of this preliminary short-term climatological study show significant intra-seasonal variability; on average, 15% of all aircraft encounters with turbulence are found in the proximity of convective clouds.
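A minimal sketch of the brightness-temperature filtering idea, under assumed values: flag satellite pixels colder than a threshold as convective cloud tops and test whether a turbulence report falls within a proximity radius. The 235 K threshold and 50 km radius are illustrative, not the study's calibrated values.

    # Convective-proximity check sketch; threshold and radius are assumed.
    import numpy as np

    def near_convection(report_xy, pixel_xy, bt_kelvin,
                        bt_threshold=235.0, radius_km=50.0):
        cold = bt_kelvin < bt_threshold                 # convective-top mask
        d = np.hypot(pixel_xy[:, 0] - report_xy[0],
                     pixel_xy[:, 1] - report_xy[1])     # distances in km
        return bool(np.any(cold & (d <= radius_km)))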
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
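A hedged sketch of a bit-for-bit regression check in the LIVVkit spirit (not LIVVkit's actual API): compare a test run's output field against a reference field and report where any differences occur.

    # Bit-for-bit comparison sketch (illustrative, not LIVVkit code).
    import numpy as np

    def bit_for_bit(test, reference):
        test, reference = np.asarray(test), np.asarray(reference)
        if np.array_equal(test, reference):
            return True, None
        diff = test - reference
        return False, np.argwhere(diff != 0)   # indices of differing cells

    ok, where = bit_for_bit([1.0, 2.0, 3.0], [1.0, 2.0, 3.0000001])
    # ok is False; `where` points at the third cell.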
Schlaberg, Robert; Mitchell, Michael J; Taggart, Edward W; She, Rosemary C
2012-01-01
US Food and Drug Administration (FDA)-approved diagnostic tests based on molecular genetic technologies are becoming available for an increasing number of microbial pathogens. Advances in technology and lower costs have moved molecular diagnostic tests formerly performed for research purposes only into much wider use in clinical microbiology laboratories. Our objective is to provide an example of the laboratory studies performed to verify the performance of an FDA-approved assay for the detection of Clostridium difficile cytotoxin B against the manufacturer's performance standards. We describe the process and protocols used by a laboratory for verification of an FDA-approved assay, assess data from the verification studies, and implement the assay after verification. Performance data from the verification studies conducted by the laboratory were consistent with the manufacturer's performance standards, and the assay was implemented into the laboratory's test menu. Verification studies are required for FDA-approved diagnostic assays prior to use in patient care. Laboratories should develop a standardized approach to verification studies that can be adapted and applied to different types of assays. We describe the verification of an FDA-approved real-time polymerase chain reaction assay for the detection of a toxin gene in a bacterial pathogen.
NASA Astrophysics Data System (ADS)
Faghihi, F.; Khalili, S.
2013-08-01
This article has two aims related to BNCT. The first is the design of a beam shaping assembly for a D-T neutron source to produce the epithermal neutrons that are the goal in BNCT. The second is the calculation of percent depth dose in the adult Snyder head phantom. Monte Carlo simulations and verification of a suggested beam shaping assembly (including internal neutron multiplier, moderator, filter, external neutron multiplier, collimator, and reflector dimensions) for thermalizing a D-T neutron source as well as increasing neutron flux are carried out, and our results are given herein. Finally, we have simulated the corresponding doses for treatment planning of a deeply seated tumor.
CT reconstruction from portal images acquired during volumetric-modulated arc therapy
NASA Astrophysics Data System (ADS)
Poludniowski, G.; Thomas, M. D. R.; Evans, P. M.; Webb, S.
2010-10-01
Volumetric-modulated arc therapy (VMAT), a form of intensity-modulated arc therapy (IMAT), has become a topic of research and clinical activity in recent years. As a form of arc therapy, portal images acquired during the treatment fraction form a (partial) Radon transform of the patient. We show that these portal images, when used in a modified global cone-beam filtered backprojection (FBP) algorithm, allow a surprisingly recognizable CT-volume to be reconstructed. The possibility of distinguishing anatomy in such VMAT-CT reconstructions suggests that this could prove to be a valuable treatment position-verification tool. Further, some potential for local-tomography techniques to improve image quality is shown.
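The underlying reconstruction idea can be sketched with an off-the-shelf filtered backprojection: treat the per-gantry-angle portal images as projections of the patient and invert them. This sketch assumes scikit-image is available and uses its parallel-beam radon/iradon pair as a stand-in for the paper's modified global cone-beam FBP algorithm.

    # Toy FBP sketch; skimage's parallel-beam iradon stands in for the
    # modified cone-beam algorithm described in the paper.
    import numpy as np
    from skimage.transform import radon, iradon

    phantom = np.zeros((128, 128))
    phantom[40:90, 50:80] = 1.0                      # toy "anatomy"
    angles = np.linspace(0.0, 180.0, 90, endpoint=False)
    sinogram = radon(phantom, theta=angles)          # simulated projections
    recon = iradon(sinogram, theta=angles, filter_name="ramp")

With only a partial arc of projections, as in a VMAT fraction, the reconstruction degrades but can remain recognizable, which is the paper's central observation.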
Fast ℓ1-regularized space-time adaptive processing using alternating direction method of multipliers
NASA Astrophysics Data System (ADS)
Qin, Lilong; Wu, Manqing; Wang, Xuan; Dong, Zhen
2017-04-01
Motivated by the sparsity of filter coefficients in full-dimension space-time adaptive processing (STAP) algorithms, this paper proposes a fast ℓ1-regularized STAP algorithm based on the alternating direction method of multipliers to accelerate convergence and reduce the computational cost. The proposed algorithm uses a splitting variable to obtain an equivalent optimization formulation, which is addressed with an augmented Lagrangian method. Using the alternating recursive algorithm, the method rapidly attains a low minimum mean-square error without a large number of calculations. Through theoretical analysis and experimental verification, we demonstrate that the proposed algorithm provides better output signal-to-clutter-noise ratio performance than other algorithms.
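The splitting-variable idea is the standard ADMM treatment of an ℓ1-regularized least-squares problem, sketched below for the real-valued case (STAP filter weights are complex in practice, and this is a generic lasso solver, not the paper's exact algorithm): minimize 0.5*||A w - b||^2 + lam*||w||_1 by splitting w = z.

    # Generic ADMM lasso sketch; rho, lam, and sizes are illustrative.
    import numpy as np

    def soft_threshold(v, k):
        return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

    def admm_lasso(A, b, lam, rho=1.0, iters=100):
        n = A.shape[1]
        AtA, Atb = A.T @ A, A.T @ b
        L = np.linalg.inv(AtA + rho * np.eye(n))   # cached solve (small n)
        w, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
        for _ in range(iters):
            w = L @ (Atb + rho * (z - u))          # quadratic subproblem
            z = soft_threshold(w + u, lam / rho)   # l1 subproblem
            u = u + w - z                          # dual (scaled) update
        return z

    A, b = np.random.randn(40, 20), np.random.randn(40)
    w_sparse = admm_lasso(A, b, lam=0.5)

The per-iteration cost is dominated by one cached linear solve and one elementwise soft-threshold, which is why the splitting accelerates sparse STAP weight computation.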
Pols, David H.J.; Bramer, Wichor M.; Bindels, Patrick J.E.; van de Laar, Floris A.; Bohnen, Arthur M.
2015-01-01
Physicians and researchers in the field of family medicine often need to find relevant articles in online medical databases for a variety of reasons. Because a search filter may help improve the efficiency and quality of such searches, we aimed to develop and validate search filters to identify research studies of relevance to family medicine. Using a new and objective method for search filter development, we developed and validated 2 search filters for family medicine. The sensitive filter had a sensitivity of 96.8% and a specificity of 74.9%. The specific filter had a specificity of 97.4% and a sensitivity of 90.3%. Our new filters should aid literature searches in the family medicine field. The sensitive filter may help researchers conducting systematic reviews, whereas the specific filter may help family physicians find answers to clinical questions at the point of care when time is limited. PMID:26195683
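The reported metrics follow directly from a retrieval confusion matrix; the counts below are hypothetical, chosen only to reproduce the sensitive filter's published rates.

    # Worked check of sensitivity/specificity from hypothetical counts.
    tp, fn = 968, 32      # relevant records retrieved / missed
    tn, fp = 749, 251     # irrelevant records excluded / retrieved
    sensitivity = tp / (tp + fn)   # 0.968, matching the sensitive filter
    specificity = tn / (tn + fp)   # 0.749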
A study of applications scribe frame data verifications using design rule check
NASA Astrophysics Data System (ADS)
Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki
2013-06-01
In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection and customer-specified marks. At the end, we check that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in COT (customer owned tooling) business or new technology development, there has been no effective verification method for scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We would like to show the scheme of scribe frame data verification using DRC that we tried to apply. First, verification rules are created based on the specifications of the scanner, inspection and others, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; the DRC verification includes pattern matching using the mark library. As a result, our experiments demonstrated that by use of pattern matching and DRC verification our new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
Generic Kalman Filter Software
NASA Technical Reports Server (NTRS)
Lisano, Michael E., II; Crues, Edwin Z.
2005-01-01
The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains a code for a generic Kalman filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on the basis of the aforementioned templates. The GKF software can be used to develop many different types of unfactorized Kalman filters. A developer can choose to implement either a linearized or an extended Kalman filter algorithm, without having to modify the GKF software. Control dynamics can be taken into account or neglected in the filter-dynamics model. Filter programs developed by use of the GKF software can be made to propagate equations of motion for linear or nonlinear dynamical systems that are deterministic or stochastic. In addition, filter programs can be made to operate in user-selectable "covariance analysis" and "propagation-only" modes that are useful in design and development stages.
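The "generic filter plus user-supplied subfunctions" pattern the GKF describes can be rendered compactly in Python (the real GKF is ANSI C; all names below are illustrative, not the GKF's actual interfaces). The caller supplies the dynamics, measurement model, and their Jacobians as callbacks, and the generic routine runs one extended-Kalman-filter cycle.

    # Hedged rendering of the GKF's callback pattern; not GKF code.
    import numpy as np

    def ekf_step(x, P, u, z, f, F_jac, h, H_jac, Q, R):
        """One EKF cycle built from user-supplied model callbacks."""
        # Propagation: Jacobian at the prior state, then nonlinear dynamics.
        F = F_jac(x, u)
        x = f(x, u)
        P = F @ P @ F.T + Q
        # Update: user-supplied measurement model h and Jacobian H_jac.
        y = z - h(x)
        H = H_jac(x)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        return x + K @ y, (np.eye(len(x)) - K @ H) @ P

Keeping the model-specific pieces behind callbacks is precisely what lets one filter core serve linear, linearized, and extended variants without modification.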
Design verification test matrix development for the STME thrust chamber assembly
NASA Technical Reports Server (NTRS)
Dexter, Carol E.; Elam, Sandra K.; Sparks, David L.
1993-01-01
This report presents the results of the test matrix development for design verification at the component level for the National Launch System (NLS) space transportation main engine (STME) thrust chamber assembly (TCA) components including the following: injector, combustion chamber, and nozzle. A systematic approach was used in the development of the minimum recommended TCA matrix resulting in a minimum number of hardware units and a minimum number of hot fire tests.
NASA Technical Reports Server (NTRS)
1995-01-01
The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.
Assume-Guarantee Verification of Source Code with Design-Level Assumptions
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.
2004-01-01
Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.
Development of emergent processing loops as a system of systems concept
NASA Astrophysics Data System (ADS)
Gainey, James C., Jr.; Blasch, Erik P.
1999-03-01
This paper describes an engineering approach toward implementing the current neuroscientific understanding of how the primate brain fuses, or integrates, 'information' in the decision-making process. We describe a System of Systems (SoS) design for improving the overall performance, capabilities, operational robustness, and user confidence in Identification (ID) systems and show how it could be applied to biometrics security. We use the Physio-associative temporal sensor integration algorithm (PATSIA), which is motivated by observed functions and interactions of the thalamus, hippocampus, and cortical structures in the brain. PATSIA utilizes signal theory mathematics to model how the human efficiently perceives and uses information from the environment. The hybrid architecture implements a possible SoS-level realization of the functional description of the Joint Directors of US Laboratories for Fusion Working Group, involving five levels of fusion and their associated definitions. This SoS architecture proposes dynamic sensor and knowledge-source integration by implementing multiple Emergent Processing Loops for predicting, feature extraction, matching, and searching both static and dynamic databases, like MSTAR's PEMS loops. Biologically, this effort demonstrates these objectives by modeling similar processes from the eyes, ears, and somatosensory channels, through the thalamus, and to the cortices as appropriate, while using the hippocampus for short-term memory search and storage as necessary. The particular approach demonstrated incorporates commercially available speaker verification and face recognition software and hardware to collect data and extract features for the PATSIA. The PATSIA maximizes the confidence levels for target identification or verification in dynamic situations using a belief filter. The proof of concept described here is easily adaptable and scalable to other military and nonmilitary sensor fusion applications.
Environmental Technology Verification (ETV) Quality Program (Poster)
This is a poster created for the ETV Quality Program. The EPA Environmental Technology Verification Program (ETV) develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The...
PROMOTING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATION
The paper discusses the promotion of improved air quality through environmental technology verifications (ETVs). In 1995, the U.S. EPA's Office of Research and Development began the ETV Program in response to President Clinton's "Bridge to a Sustainable Future" and Vice Presiden...
GENERIC VERIFICATION PROTOCOL FOR AQUEOUS CLEANER RECYCLING TECHNOLOGIES
This generic verification protocol has been structured based on a format developed for ETV-MF projects. This document describes the intended approach and explains plans for testing with respect to areas such as test methodology, procedures, parameters, and instrumentation. Also ...
Speaker diarization system on the 2007 NIST rich transcription meeting recognition evaluation
NASA Astrophysics Data System (ADS)
Sun, Hanwu; Nwe, Tin Lay; Koh, Eugene Chin Wei; Bin, Ma; Li, Haizhou
2007-09-01
This paper presents a speaker diarization system developed at the Institute for Infocomm Research (I2R) for the NIST Rich Transcription 2007 (RT-07) evaluation task. We describe in detail our primary approaches to speaker diarization under the Multiple Distant Microphones (MDM) conditions in the conference room scenario. Our proposed system consists of six modules: (1) a normalized least-mean-square (NLMS) adaptive filter for speaker direction estimation via Time Difference of Arrival (TDOA); (2) initial speaker clustering via a two-stage TDOA histogram distribution quantization approach; (3) multiple-microphone speaker data alignment via GCC-PHAT Time Delay Estimation (TDE) among all the distant microphone channel signals; (4) a speaker clustering algorithm based on a GMM modeling approach; (5) non-speech removal via a speech/non-speech verification mechanism; and (6) silence removal via a "Double-Layer Windowing" (DLW) method. We achieve an error rate of 31.02% on the 2006 Spring (RT-06s) MDM evaluation task and a competitive overall error rate of 15.32% on the NIST Rich Transcription 2007 (RT-07) MDM evaluation task.
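A minimal sketch of the GCC-PHAT time-delay estimate used in module (3) may clarify the idea. This is an illustrative Python function under assumed inputs (two equal-length 1-D NumPy channel arrays and a sample rate), not the I2R implementation.

import numpy as np

def gcc_phat_delay(x, y, fs):
    """Estimate the delay of y relative to x (in seconds) via GCC-PHAT."""
    n = 2 * len(x)                      # zero-pad to avoid circular wrap-around
    X = np.fft.rfft(x, n=n)
    Y = np.fft.rfft(y, n=n)
    R = X * np.conj(Y)
    R /= np.abs(R) + 1e-12              # PHAT weighting: keep phase, drop magnitude
    cc = np.fft.irfft(R, n=n)
    cc = np.concatenate((cc[-len(x):], cc[:len(x)]))  # center zero lag
    lag = np.argmax(np.abs(cc)) - len(x)
    return lag / fs

The PHAT weighting whitens the cross-spectrum so the peak location depends on phase alone, which is what makes this family of estimators comparatively robust in reverberant rooms.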
INEL BNCT Research Program Annual Report 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venhuizen, J.R.
1994-08-01
This report is a summary of the progress and research produced for the Idaho National Engineering Laboratory Boron Neutron Capture Therapy Research Program for calendar year 1993. Contributions from all the principal investigators are included, covering chemistry (pituitary tumor studies, boron drug development including liposomes, lipoproteins, and carboranylalanine derivatives), pharmacology (murine screenings, toxicity testing, boron drug analysis), physics (radiation dosimetry software, neutron beam and filter design, neutron beam measurement dosimetry), and radiation biology (tissue and efficacy studies of small and large animal models). Information on the potential toxicity of borocaptate sodium and boronophenylalanine is presented. Results of 21 spontaneous-tumor-bearing dogs that have been treated with boron neutron capture therapy at the Brookhaven National Laboratory are updated. Boron-containing drug purity verification is discussed in some detail. Advances in magnetic resonance imaging of boron in vivo are discussed. Several boron-carrying drugs exhibiting good tumor uptake are described. Significant progress in the potential of treating pituitary tumors is presented. Measurement of the epithermal-neutron flux of the Petten (The Netherlands) High Flux Reactor beam (HFB11B), and comparison to predictions, are shown.
NASA Astrophysics Data System (ADS)
Aa, Ercha; Huang, Wengeng; Yu, Shimei; Liu, Siqing; Shi, Liqin; Gong, Jiancun; Chen, Yanhong; Shen, Hua
2015-06-01
In this paper, a regional total electron content (TEC) mapping technique over China and adjacent areas (70°E-140°E and 15°N-55°N) is developed on the basis of a Kalman filter data assimilation scheme driven by Global Navigation Satellite Systems (GNSS) data from the Crustal Movement Observation Network of China and International GNSS Service. The regional TEC maps can be generated accordingly with the spatial and temporal resolution being 1°×1° and 5 min, respectively. The accuracy and quality of the TEC mapping technique have been validated through the comparison with GNSS observations, the International Reference Ionosphere model values, the global ionosphere maps from Center for Orbit Determination of Europe, and the Massachusetts Institute of Technology Automated Processing of GPS TEC data from Madrigal database. The verification results indicate that great systematic improvements can be obtained when data are assimilated into the background model, which demonstrates the effectiveness of this technique in providing accurate regional specification of the ionospheric TEC over China and adjacent areas.
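For readers unfamiliar with the assimilation step, the following is a minimal sketch of a single Kalman analysis update with hypothetical names; the operational scheme described above also involves a background ionospheric model and time propagation that this omits.

import numpy as np

def kalman_assimilate(x_b, P_b, y, H, R):
    """One Kalman analysis step: blend background state x_b (covariance P_b)
    with observations y made through operator H (error covariance R)."""
    S = H @ P_b @ H.T + R                   # innovation covariance
    K = P_b @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_a = x_b + K @ (y - H @ x_b)           # analysis state
    P_a = (np.eye(len(x_b)) - K @ H) @ P_b  # analysis covariance
    return x_a, P_a

In a TEC mapping setting, x_b would hold the background TEC grid flattened to a vector, H would map grid TEC to the GNSS observations, and R would encode measurement noise.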
Agents Based e-Commerce and Securing Exchanged Information
NASA Astrophysics Data System (ADS)
Al-Jaljouli, Raja; Abawajy, Jemal
Mobile agents have been implemented in e-Commerce to search and filter information of interest from electronic markets. When the information is very sensitive and critical, it is important to develop a novel security protocol that can efficiently protect the information from malicious tampering as well as unauthorized disclosure, or at least detect any malicious act of intruders. In this chapter, we describe robust security techniques that ensure sound security of information gathered throughout an agent's itinerary against various security attacks, as well as truncation attacks. A sound security protocol is described, which implements the various security techniques that would jointly prevent, or at least detect, any malicious act of intruders. We reason about the soundness of the protocol using Symbolic Trace Analyzer (STA), a formal verification tool that is based on symbolic techniques. We analyze the protocol in key configurations and show that it is free of flaws. We also show that the protocol fulfils the various security requirements of exchanged information in MAS, including data integrity, data confidentiality, data authenticity, origin confidentiality and data non-repudiability.
Considerations in STS payload environmental verification
NASA Technical Reports Server (NTRS)
Keegan, W. B.
1978-01-01
Considerations regarding the Space Transportation System (STS) payload environmental verification are reviewed. It is noted that emphasis is placed on testing at the subassembly level and that the basic objective of structural dynamic payload verification is to ensure reliability in a cost-effective manner. Structural analyses consist of: (1) stress analysis for critical loading conditions, (2) modal analysis for launch and orbital configurations, (3) flight loads analysis, (4) test simulation analysis to verify models, (5) kinematic analysis of deployment/retraction sequences, and (6) structural-thermal-optical program analysis. In addition to these approaches, payload verification programs are being developed in the thermal-vacuum area. These include exposure to extreme temperatures, temperature cycling, thermal-balance testing and thermal-vacuum testing.
A digital flight control system verification laboratory
NASA Technical Reports Server (NTRS)
De Feo, P.; Saib, S.
1982-01-01
A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of use of the test environment, software verification tools can be applied. Tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus in particular on expanding the set of software test tools and on assessing cost effectiveness.
VERIFYING CLEANER TECHNOLOGIES WITH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM
The US EPA's Office of Research and Development Environmental Technology Verification (ETV) Program completed its five-year pilot period in 2001. Now in 2002 lessons learned in the pilot period are being incorporated seamlessly into six operating ETV Centers which cover technolo...
NASA Technical Reports Server (NTRS)
Fisher, Marcus S.; Northey, Jeffrey; Stanton, William
2014-01-01
The purpose of this presentation is to outline how the NASA Independent Verification and Validation (IVV) Program helps to build reliability into the Space Mission Software Systems (SMSSs) that its customers develop.
DOE Office of Scientific and Technical Information (OSTI.GOV)
MCGREW, D.L.
2001-10-31
This Requirements Verification Report provides the traceability of how Project W-314 fulfilled the Project Development Specification requirements for the AN Farm to 200E Waste Transfer System Upgrade package.
A Comparative Analysis of Kalman Filters Using a Hypervelocity Missile Simulation.
1981-12-01
[Table of contents fragment: Chapter III, Kalman Filter Development — introduction and assumptions, development of line-of-sight filters, and development of inertial filters.]
2005-05-01
[Contents fragment: verification of bonded-joint analyses for homogeneous isotropic and orthotropic cases (six examples from the Delale and Erdogan publication); adhesive stress comparisons between BondJo, Ansys solid-model FEA, and Delale and Erdogan plate theory.]
Shuttle payload interface verification equipment study. Volume 2: Technical document, part 1
NASA Technical Reports Server (NTRS)
1976-01-01
The technical analysis performed during the shuttle payload interface verification equipment study is reported. It describes: (1) the background and intent of the study; (2) study approach and philosophy covering all facets of shuttle payload/cargo integration; (3) shuttle payload integration requirements; (4) preliminary design of the horizontal IVE; (5) vertical IVE concept; and (6) IVE program development plans, schedule and cost. Also included is a payload integration analysis task to identify potential uses in addition to payload interface verification.
Formal Methods for Life-Critical Software
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Johnson, Sally C.
1993-01-01
The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.
Mechanical verification of a schematic Byzantine clock synchronization algorithm
NASA Technical Reports Server (NTRS)
Shankar, Natarajan
1991-01-01
Schneider generalizes a number of protocols for Byzantine fault tolerant clock synchronization and presents a uniform proof for their correctness. The authors present a machine checked proof of this schematic protocol that revises some of the details in Schneider's original analysis. The verification was carried out with the EHDM system developed at the SRI Computer Science Laboratory. The mechanically checked proofs include the verification that the egocentric mean function used in Lamport and Melliar-Smith's Interactive Convergence Algorithm satisfies the requirements of Schneider's protocol.
2013-04-01
... project was to provide the Royal Canadian Navy (RCN) with a set of guidelines on analysis, design, and verification processes for effective room ... design, and verification processes that should be used in the development of effective room layouts for Royal Canadian Navy (RCN) ships. The primary ... designed CSC; however, the guidelines could be applied to the design of any multiple-operator space in any RCN vessel. Results: The development of ...
RELAP5-3D Resolution of Known Restart/Backup Issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mesina, George L.; Anderson, Nolan A.
2014-12-01
The state-of-the-art nuclear reactor system safety analysis computer program developed at the Idaho National Laboratory (INL), RELAP5-3D, continues to adapt to changes in computer hardware and software and to develop to meet the ever-expanding needs of the nuclear industry. To continue at the forefront, code testing must evolve with both code and industry developments, and it must work correctly. To best ensure this, the processes of Software Verification and Validation (V&V) are applied. Verification compares coding against its documented algorithms and equations and compares its calculations against analytical solutions and the method of manufactured solutions. A form of this, sequential verification, checks code specifications against coding only when originally written, then applies regression testing, which compares code calculations between consecutive updates or versions on a set of test cases to check that the performance does not change. A sequential verification testing system was specially constructed for RELAP5-3D to both detect errors with extreme accuracy and cover all nuclear-plant-relevant code features. Detection is provided through a “verification file” that records double precision sums of key variables. Coverage is provided by a test suite of input decks that exercise code features and capabilities necessary to model a nuclear power plant. A matrix of test features and short-running cases that exercise them is presented. This testing system is used to test base cases (called null testing) as well as restart and backup cases. It can test RELAP5-3D performance in both standalone and coupled (through PVM to other codes) runs. Application of verification testing revealed numerous restart and backup issues in both standalone and coupled modes. This document reports the resolution of these issues.
[Validation and verification of microbiology methods].
Camaró-Sala, María Luisa; Martínez-García, Rosana; Olmos-Martínez, Piedad; Catalá-Cuenca, Vicente; Ocete-Mochón, María Dolores; Gimeno-Cardona, Concepción
2015-01-01
Clinical microbiologists should ensure, to the maximum level allowed by scientific and technical development, the reliability of their results. This implies that, in addition to meeting the technical criteria that ensure their validity, tests must be performed under conditions that allow comparable results to be obtained regardless of the laboratory performing them. In this sense, the use of recognized and accepted reference methods is the most effective tool for providing these guarantees. The activities related to verification and validation of analytical methods have become very important, given the continuous development and updating of techniques, increasingly complex analytical equipment, and the interest of professionals in ensuring quality processes and results. The definitions of validation and verification are described, along with the different types of validation/verification, the types of methods, and the level of validation necessary depending on the degree of standardization. The situations in which validation/verification is mandatory and/or recommended are discussed, including those particularly related to validation in Microbiology. The importance of promoting the use of reference strains as controls in Microbiology and of using standard controls is stressed, as well as the importance of participating in External Quality Assessment programs to demonstrate technical competence. Emphasis is placed on how to calculate some of the parameters required for validation/verification, such as accuracy and precision. The development of these concepts can be found in the SEIMC microbiological procedure number 48: «Validation and verification of microbiological methods» www.seimc.org/protocols/microbiology. Copyright © 2013 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
HDM/PASCAL Verification System User's Manual
NASA Technical Reports Server (NTRS)
Hare, D.
1983-01-01
The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; Cramer, Nicholas O.; Benz, Jacob M.
While international nonproliferation and arms control verification capabilities have their foundations in physical and chemical sensors, state declarations, and on-site inspections, verification experts are beginning to consider the importance of open source data to complement and support traditional means of verification. One of those new, and increasingly expanding, sources of open source information is social media, which can be ingested and understood through social media analytics (SMA). Pacific Northwest National Laboratory (PNNL) is conducting research to further our ability to identify, visualize, and fuse social media data to support nonproliferation and arms control treaty verification efforts. This paper will describe our preliminary research to examine social media signatures of nonproliferation or arms control proxy events. We will describe the development of our preliminary nonproliferation and arms control proxy events, outline our initial findings, and propose ideas for future work.
Pols, David H J; Bramer, Wichor M; Bindels, Patrick J E; van de Laar, Floris A; Bohnen, Arthur M
2015-01-01
Physicians and researchers in the field of family medicine often need to find relevant articles in online medical databases for a variety of reasons. Because a search filter may help improve the efficiency and quality of such searches, we aimed to develop and validate search filters to identify research studies of relevance to family medicine. Using a new and objective method for search filter development, we developed and validated 2 search filters for family medicine. The sensitive filter had a sensitivity of 96.8% and a specificity of 74.9%. The specific filter had a specificity of 97.4% and a sensitivity of 90.3%. Our new filters should aid literature searches in the family medicine field. The sensitive filter may help researchers conducting systematic reviews, whereas the specific filter may help family physicians find answers to clinical questions at the point of care when time is limited. © 2015 Annals of Family Medicine, Inc.
Regenerative particulate filter development
NASA Technical Reports Server (NTRS)
Descamp, V. A.; Boex, M. W.; Hussey, M. W.; Larson, T. P.
1972-01-01
Development, design, and fabrication of a prototype filter regeneration unit for regenerating clean fluid particle filter elements by using a backflush/jet impingement technique are reported. Development tests were also conducted on a vortex particle separator designed for use in zero gravity environment. A maintainable filter was designed, fabricated and tested that allows filter element replacement without any leakage or spillage of system fluid. Also described are spacecraft fluid system design and filter maintenance techniques with respect to inflight maintenance for the space shuttle and space station.
NASA Astrophysics Data System (ADS)
do Lago, Naydson Emmerson S. P.; Kardec Barros, Allan; Sousa, Nilviane Pires S.; Junior, Carlos Magno S.; Oliveira, Guilherme; Guimares Polisel, Camila; Eder Carvalho Santana, Ewaldo
2018-01-01
This study aims to develop an adaptive-filter algorithm to determine the percentage of body fat from anthropometric indicators in adolescents. Measurements such as body mass, height and waist circumference were collected for the analysis. The filter design was based on the Wiener filter, which produces an estimate of a random process by minimizing the mean square error between the estimated and the desired process. The LMS algorithm was also studied for the development of the filter because of its simplicity and ease of computation. Excellent results were obtained with the developed filter; these results were analyzed and compared against the collected data.
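A minimal sketch of the LMS update that such a filter builds on is given below, assuming rows of X hold the anthropometric predictors (e.g., body mass, height, waist circumference) and d holds reference body-fat percentages; the function name, step size, and epoch count are illustrative, not from the study.

import numpy as np

def lms_fit(X, d, mu=0.01, epochs=50):
    """Fit weights w so that X @ w tracks d, minimizing mean-square error."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_n, d_n in zip(X, d):
            e = d_n - np.dot(w, x_n)   # instantaneous prediction error
            w += mu * e * x_n          # LMS weight update
    return w

Standardizing the predictor columns first usually helps the update converge with a single step size.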
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samuvel, K; Yadav, G; Bhushan, M
2016-06-15
Purpose: To quantify the dosimetric accuracy of junction dose in double-isocenter flattened-beam and flattening-filter-free (FFF) intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) plan delivery using a pelvis phantom. Methods: Five large-field pelvis patients were selected for this study. Double-isocenter IMRT and VMAT treatment plans were generated in the Eclipse Treatment Planning System (V.11.0) using 6MV FB and FFF beams. For all the plans, the same distance of 17.0 cm was kept between the two isocenters. IMRT plans were made with 7 coplanar fields and VMAT plans with full double arcs. Dose calculation was performed using AAA algorithms with a dose grid size of 0.25 cm. Verification plans were calculated on a Scanditronix Wellhofer pelvis slab phantom. A measurement point was selected where the fields of the two isocenter plans overlap, at a distance of 8.5 cm from both isocenters. The plans were delivered using a Varian TrueBeam™ machine on the pelvis slab phantom. Point dose measurements were carried out using a CC13 ion chamber with a volume of 0.13 cm3. Results: The measured junction point doses were compared with the TPS-calculated dose. The mean difference observed was 4.5%, 6.0%, 4.0% and 7.0% for IMRT-FB, IMRT-FFF, VMAT-FB and VMAT-FFF respectively. The measured doses show closer agreement with the calculated dose for flattened-beam planning in both IMRT and VMAT, whereas the dose differences for FFF-beam plans are larger. Conclusion: Large-field junction dose differences were found to be smaller for flattened-beam than for FFF-beam plan delivery. More dosimetric studies are required to analyse junction dose for FFF-beam planning using multiple point dose measurements and fluence map verification in the field junction area.
This paper presents a brief overview of the EPA's ETV program which was established in 1995 to overcome the numerous impediments to commercialization experienced by developers of innovative environmental technologies. Among those most frequently mentioned is the lack of credible ...
This paper presents a brief overview of EPA's ETV program established in 1995 to overcome the numerous impediments to commercialization experienced by developers of innovative environmental technologies. Among those most frequently mentioned is the lack of credible performance da...
Verification of space weather forecasts at the UK Met Office
NASA Astrophysics Data System (ADS)
Bingham, S.; Sharpe, M.; Jackson, D.; Murray, S.
2017-12-01
The UK Met Office Space Weather Operations Centre (MOSWOC) has produced space weather guidance twice a day since its official opening in 2014. Guidance includes 4-day probabilistic forecasts of X-ray flares, geomagnetic storms, high-energy electron events and high-energy proton events. Evaluation of such forecasts is important to forecasters, stakeholders, model developers and users, both to understand forecast performance and to identify strengths and weaknesses that can guide further development. Met Office terrestrial near-real-time verification systems have been adapted to provide verification of X-ray flare and geomagnetic storm forecasts. Verification is updated daily to produce Relative Operating Characteristic (ROC) curves, Reliability diagrams, and rolling Ranked Probability Skill Scores (RPSSs), thus providing understanding of forecast performance and skill. Results suggest that the MOSWOC issued X-ray flare forecasts are usually not statistically significantly better than a benchmark climatological forecast (where the climatology is based on observations from the previous few months). By contrast, the issued geomagnetic storm activity forecast typically performs better against this climatological benchmark.
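As an illustration of skill measured against a climatological benchmark, here is a minimal Python sketch for a binary event; the RPSS used in such verification generalizes the same idea to multi-category probabilistic forecasts. Names and the score choice (Brier) are assumptions for illustration.

import numpy as np

def brier_score(p, o):
    """Brier score for probabilistic forecasts p against binary outcomes o."""
    return np.mean((np.asarray(p, float) - np.asarray(o, float)) ** 2)

def brier_skill_score(p, o, p_clim):
    """Skill relative to a constant climatological forecast; > 0 beats climatology."""
    bs = brier_score(p, o)
    bs_ref = brier_score(np.full(len(o), p_clim), o)
    return 1.0 - bs / bs_ref

A forecast set that cannot beat the recent-observation climatology, as reported above for the X-ray flare guidance, would show a skill score at or below zero here.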
Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K
2013-03-04
The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A plan strategy similar to published studies was adopted. The PTV was divided into head and neck, chest, and pelvic regions, with a separate plan for each region composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study is to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verification, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junction between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D was developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.
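The gamma evaluation referred to above combines a dose-difference criterion (here 5%) with a distance-to-agreement criterion (here 3 mm). A minimal 1-D sketch follows, with illustrative names and a global (maximum-dose) normalization assumed; clinical tools evaluate the same quantity on 2-D film planes or 3-D grids.

import numpy as np

def gamma_1d(positions, dose_ref, dose_eval, dta_mm=3.0, dd_frac=0.05):
    """1-D global gamma index; a point passes when gamma <= 1."""
    d_max = dose_ref.max()
    gammas = []
    for xr, dr in zip(positions, dose_ref):
        # squared distance and dose-difference terms against every evaluated point
        dist2 = ((positions - xr) / dta_mm) ** 2
        dd2 = ((dose_eval - dr) / (dd_frac * d_max)) ** 2
        gammas.append(np.sqrt(np.min(dist2 + dd2)))
    return np.array(gammas)

# passing rate over the profile:
# np.mean(gamma_1d(x_mm, measured, calculated) <= 1.0)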
Cognitive Bias in the Verification and Validation of Space Flight Systems
NASA Technical Reports Server (NTRS)
Larson, Steve
2012-01-01
Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of future systems.
Validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Gilstrap, Lewey
1991-01-01
Validation and verification (V&V) are procedures used to evaluate system structure or behavior with respect to a set of requirements. Although expert systems are often developed as a series of prototypes without requirements, it is not possible to perform V&V on any system for which requirements have not been prepared. In addition, there are special problems associated with the evaluation of expert systems that do not arise in the evaluation of conventional systems, such as verification of the completeness and accuracy of the knowledge base. The criticality of most NASA missions makes it important to be able to certify the performance of the expert systems used to support these missions. Recommendations for the most appropriate method for integrating V&V into the Expert System Development Methodology (ESDM) and suggestions for the most suitable approaches for each stage of ESDM development are presented.
Hard and Soft Safety Verifications
NASA Technical Reports Server (NTRS)
Wetherholt, Jon; Anderson, Brenda
2012-01-01
The purpose of this paper is to examine the differences between and the effects of hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum which demonstrates how a safety control is enacted; an example is relief valve testing. A soft safety verification is usually described as nice to have, but not necessary, for proving safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination of the casings and nozzle for erosion or wear. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.
Online 3D EPID-based dose verification: Proof of concept.
Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel
2016-07-01
Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s irradiation time. A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
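A minimal sketch of the volume statistics being compared in that verification step, assuming a NumPy dose grid and a boolean region mask; D2, the near-maximum dose, is taken here as the 98th percentile of the in-region dose, a common convention rather than a detail confirmed by the paper.

import numpy as np

def dose_stats(dose, mask):
    """Mean dose and near-maximum dose D2 (dose exceeded in only the
    hottest 2% of the region) for the voxels selected by mask."""
    d = dose[mask]
    return d.mean(), np.percentile(d, 98.0)

# verification compares (mean, D2) of the reconstructed dose against the plan,
# e.g. within the target volume and within the nontarget volume >= 10 cGy.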
Validation of search filters for identifying pediatric studies in PubMed.
Leclercq, Edith; Leeflang, Mariska M G; van Dalen, Elvira C; Kremer, Leontien C M
2013-03-01
To identify and validate PubMed search filters for retrieving studies including children and to develop a new pediatric search filter for PubMed. We developed 2 different datasets of studies to evaluate the performance of the identified pediatric search filters, expressed in terms of sensitivity, precision, specificity, accuracy, and number needed to read (NNR). An optimal search filter will have a high sensitivity and high precision with a low NNR. In addition to the PubMed Limits: All Child: 0-18 years filter (in May 2012 renamed to PubMed Filter Child: 0-18 years), 6 search filters for identifying studies including children were identified: 3 developed by Kastner et al, 1 developed by BestBets, 1 by the Child Health Field, and 1 by the Cochrane Childhood Cancer Group. Three search filters (Cochrane Childhood Cancer Group, Child Health Field, and BestBets) had the highest sensitivity (99.3%, 99.5%, and 99.3%, respectively) but a lower precision (64.5%, 68.4%, and 66.6%, respectively) compared with the other search filters. Two Kastner search filters had a high precision (93.0% and 93.7%, respectively) but a low sensitivity (58.5% and 44.8%, respectively). They failed to identify many pediatric studies in our datasets. The search terms responsible for false-positive results in the reference dataset were determined. With these data, we developed a new search filter for identifying studies with children in PubMed with an optimal sensitivity (99.5%) and precision (69.0%). Search filters to identify studies including children either have a low sensitivity or a low precision with a high NNR. A new pediatric search filter with a high sensitivity and a low NNR has been developed. Copyright © 2013 Mosby, Inc. All rights reserved.
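The performance measures quoted above all follow from the standard 2x2 retrieval counts against a reference dataset; a minimal sketch (illustrative names only) is:

def filter_metrics(tp, fp, fn, tn):
    """Performance of a search filter given true/false positive/negative counts."""
    sensitivity = tp / (tp + fn)   # fraction of relevant records retrieved
    specificity = tn / (tn + fp)   # fraction of irrelevant records excluded
    precision = tp / (tp + fp)     # fraction of retrieved records that are relevant
    nnr = 1.0 / precision          # number needed to read per relevant record
    return sensitivity, specificity, precision, nnr

The trade-off reported in the abstract is visible directly: raising sensitivity typically admits more false positives, lowering precision and raising the NNR.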
NASA Astrophysics Data System (ADS)
Wood, Andy; Clark, Elizabeth; Mendoza, Pablo; Nijssen, Bart; Newman, Andy; Clark, Martyn; Nowak, Kenneth; Arnold, Jeffrey
2017-04-01
Many if not most national operational streamflow prediction systems rely on a forecaster-in-the-loop approach that requires the hands-on effort of an experienced human forecaster. This approach evolved from the need to correct for long-standing deficiencies in the models and datasets used in forecasting, and the practice often leads to skillful flow predictions despite the use of relatively simple, conceptual models. Yet the 'in-the-loop' forecast process is not reproducible, which limits opportunities to assess and incorporate new techniques systematically, and the effort required to make forecasts in this way is an obstacle to expanding forecast services - e.g., through adding new forecast locations or more frequent forecast updates, running more complex models, or producing forecasts and hindcasts that can support verification. In the last decade, the hydrologic forecasting community has begun to develop more centralized, 'over-the-loop' systems. The quality of these new forecast products will depend on their ability to leverage research in areas including earth system modeling, parameter estimation, data assimilation, statistical post-processing, weather and climate prediction, verification, and uncertainty estimation through the use of ensembles. Currently, many national operational streamflow forecasting and water management communities have little experience with the strengths and weaknesses of over-the-loop approaches, even as such systems are beginning to be deployed operationally in centers such as ECMWF. There is thus a need both to evaluate these forecasting advances and to demonstrate their potential in a public arena, raising awareness in forecast user communities and development programs alike. To address this need, the US National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation and the US Army Corps of Engineers, using the NCAR 'System for Hydromet Analysis Research and Prediction Applications' (SHARP) to implement, assess and demonstrate real-time over-the-loop ensemble flow forecasts in a range of US watersheds. The system relies on fully ensemble techniques, including: a 100-member ensemble of meteorological model forcings and an ensemble particle filter data assimilation for initializing watershed states; analog/regression-based downscaling of ensemble weather forecasts from GEFS; and statistical post-processing of ensemble forecast outputs, all of which run in real time within a workflow managed by ECMWF's ecFlow libraries over large US regional domains. We describe SHARP and present early hindcast and verification results for short- to seasonal-range streamflow forecasts in a number of US case study watersheds.
Guidelines for mission integration, a summary report
NASA Technical Reports Server (NTRS)
1979-01-01
Guidelines are presented for instrument/experiment developers concerning hardware design, flight verification, and operations and mission implementation requirements. Interface requirements between the STS and instruments/experiments are defined. Interface constraints and design guidelines are presented along with integrated payload requirements for Spacelab Missions 1, 2, and 3. Interim data are suggested for use during hardware development until more detailed information becomes available once a complete mission and an integrated payload system are defined. Safety requirements, flight verification requirements, and operations procedures are defined.
Test/QA Plan For Verification Of Anaerobic Digester For Energy Production And Pollution Prevention
The ETV-ESTE Program conducts third-party verification testing of commercially available technologies that improve the environmental conditions in the U.S. A stakeholder committee of buyers and users of such technologies guided the development of this test on anaerobic digesters...
The paper discusses greenhouse gas (GHG) mitigation and monitoring technology performance activities of the GHG Technology Verification Center. The Center is a public/private partnership between Southern Research Institute and the U.S. EPA's Office of Research and Development. It...
As part of a U.S. Environmental Protection Agency Environmental Technology Verification program, the Research Triangle Institute (RTI) developed a test protocol for measuring volatile organic compounds and aldehydes in a large chamber. RTI convened stakeholders for the commercial...
The Environmental Technology Verification Program (ETV) was established in 1995 by the U.S. Environmental Protection Agency to encourage the development and commercialization of new environmental technologies through third-party testing and reporting of performance data. By ensur...
U.S. EPA Environmental Technology Verification Program, the Founder of the ETV Concept
The U.S. EPA Environmental Technology Verification (ETV) Program develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The program was created in 1995 to help accelerate t...
Bimodal Biometric Verification Using the Fusion of Palmprint and Infrared Palm-Dorsum Vein Images.
Lin, Chih-Lung; Wang, Shih-Hung; Cheng, Hsu-Yung; Fan, Kuo-Chin; Hsu, Wei-Lieh; Lai, Chin-Rong
2015-12-12
In this paper, we present a reliable and robust biometric verification method based on bimodal physiological characteristics of palms, including the palmprint and palm-dorsum vein patterns. The proposed method consists of five steps: (1) automatically aligning and cropping the same region of interest from different palm or palm-dorsum images; (2) applying the digital wavelet transform and inverse wavelet transform to fuse palmprint and vein pattern images; (3) extracting the line-like features (LLFs) from the fused image; (4) obtaining multiresolution representations of the LLFs by using a multiresolution filter; and (5) using a support vector machine to verify the multiresolution representations of the LLFs. The proposed method possesses four advantages. First, images of both modalities are captured in peg-free scenarios to improve the user-friendliness of the verification device. Second, palmprint and vein pattern images are captured using a low-resolution digital scanner and infrared (IR) camera. The use of low-resolution images results in a smaller database. In addition, the vein pattern images are captured through the invisible IR spectrum, which improves antispoofing. Third, since the physiological characteristics of palmprint and vein pattern images are different, a hybrid fusing rule can be introduced to fuse the decomposition coefficients of different bands. The proposed method fuses decomposition coefficients at different decomposition levels, with different image sizes, captured from different sensor devices. Finally, the proposed method operates automatically, and hence no parameters need to be set manually. In total, 3000 palmprint images and 3000 vein pattern images were collected from 100 volunteers to verify the validity of the proposed method. The results show a false rejection rate of 1.20% and a false acceptance rate of 1.56%, demonstrating the validity and excellent performance of the proposed method compared with other methods.
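A minimal sketch of step (2), wavelet-domain image fusion, using PyWavelets (assumed available as pywt). The fusion rule here, averaging approximation coefficients and keeping the larger-magnitude detail coefficients, is one common choice, not necessarily the hybrid rule the authors use.

import numpy as np
import pywt

def fuse_images(img_a, img_b, wavelet="haar"):
    """Fuse two same-sized grayscale images in the wavelet domain."""
    cA_a, details_a = pywt.dwt2(img_a.astype(float), wavelet)
    cA_b, details_b = pywt.dwt2(img_b.astype(float), wavelet)
    cA = 0.5 * (cA_a + cA_b)                        # blend low-frequency bands
    details = tuple(
        np.where(np.abs(da) >= np.abs(db), da, db)  # keep the stronger detail
        for da, db in zip(details_a, details_b)
    )
    return pywt.idwt2((cA, details), wavelet)

Keeping the stronger detail coefficient tends to preserve the lines and edges from whichever modality expresses them more clearly, which is what the line-like feature extraction in step (3) relies on.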
Retina verification system based on biometric graph matching.
Lajevardi, Seyed Mehdi; Arakala, Arathi; Davis, Stephen A; Horadam, Kathy J
2013-09-01
This paper presents an automatic retina verification framework based on the biometric graph matching (BGM) algorithm. The retinal vasculature is extracted using a family of matched filters in the frequency domain and morphological operators. Retinal templates are then defined as formal spatial graphs derived from the retinal vasculature. The BGM algorithm, a noisy graph matching algorithm robust to translation, non-linear distortion, and small rotations, is used to compare retinal templates. The BGM algorithm uses graph topology to define three distance measures between a pair of graphs, two of which are new. A support vector machine (SVM) classifier is used to distinguish between genuine and imposter comparisons. Using single as well as multiple graph measures, the classifier achieves complete separation on a training set of images from the VARIA database (60% of the data), equaling the state of the art for retina verification. Because the available dataset is small, kernel density estimation (KDE) of the genuine and imposter score distributions of the training set is used to measure performance of the BGM algorithm. In the one-dimensional case, the KDE model is validated with the testing set, and an EER of 0 on testing shows that the KDE model is a good fit for the empirical distribution. For the multiple graph measures, a novel combination of the SVM boundary and the KDE model is used to obtain a fair comparison with the KDE model for the single measure. A clear benefit in using multiple graph measures over a single measure to distinguish genuine and imposter comparisons is demonstrated by a drop in theoretical error of between 60% and more than two orders of magnitude.
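A minimal sketch of KDE-based score modeling for a verification system, assuming higher scores indicate genuine comparisons and using SciPy's Gaussian KDE; the threshold scan and names are illustrative, not the authors' procedure.

import numpy as np
from scipy.stats import gaussian_kde

def eer_from_scores(genuine, imposter):
    """Estimate the equal error rate from KDE models of the score distributions."""
    g_kde, i_kde = gaussian_kde(genuine), gaussian_kde(imposter)
    thresholds = np.linspace(min(genuine.min(), imposter.min()),
                             max(genuine.max(), imposter.max()), 500)
    best = (np.inf, None)
    for t in thresholds:
        far = i_kde.integrate_box_1d(t, np.inf)    # imposters accepted
        frr = g_kde.integrate_box_1d(-np.inf, t)   # genuines rejected
        if abs(far - frr) < best[0]:
            best = (abs(far - frr), 0.5 * (far + frr))
    return best[1]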
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rambo, Patrick; Schwarz, Jens; Kimmel, Mark
2016-09-27
We have developed high damage threshold filters to modify the spatial profile of a high energy laser beam. The filters are formed by laser ablation of a transmissive window. The ablation sites constitute scattering centers which can be filtered in a subsequent spatial filter. Finally, by creating the filters in dielectric materials, we see an increased laser-induced damage threshold from previous filters created using ‘metal on glass’ lithography.
NASA Technical Reports Server (NTRS)
1993-01-01
The Aquaspace H2OME Guardian Water Filter, available through Western Water International, Inc., reduces lead in water supplies. The filter is mounted on the faucet and the filter cartridge is placed in the "dead space" between sink and wall. This filter is one of several new filtration devices using the Aquaspace compound filter media, which combines company developed and NASA technology. Aquaspace filters are used in industrial, commercial, residential, and recreational environments as well as by developing nations where water is highly contaminated.
Automated Installation Verification of COMSOL via LiveLink for MATLAB
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crowell, Michael W
Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory’s High Flux Isotope Reactor (HFIR).
NASA Technical Reports Server (NTRS)
Mccllough, J. R.; Sharpe, A.; Doetsch, K. H.
1980-01-01
The SIMFAC has played a vital role in the design, development, and performance verification of the shuttle remote manipulator system (SRMS) to be installed in the space shuttle orbiter. The facility provides for realistic man-in-the-loop operation of the SRMS by an operator in the operator complex, a flightlike crew station patterned after the orbiter aft flight deck with all necessary man machine interface elements, including SRMS displays and controls and simulated out-of-the-window and CCTV scenes. The characteristics of the manipulator system, including arm and joint servo dynamics and control algorithms, are simulated by a comprehensive mathematical model within the simulation subsystem of the facility. Major studies carried out using SIMFAC include: SRMS parameter sensitivity evaluations; the development, evaluation, and verification of operating procedures; and malfunction simulation and analysis of malfunction performance. Among the most important and comprehensive man-in-the-loop simulations carried out to date on SIMFAC are those which support SRMS performance verification and certification when the SRMS is part of the integrated orbiter-manipulator system.
Enrichment Assay Methods Development for the Integrated Cylinder Verification System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.
2009-10-22
International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marleau, Peter; Brubaker, Erik; Deland, Sharon M.
This report summarizes the discussion and conclusions reached during a table top exercise held at Sandia National Laboratories, Albuquerque on September 3, 2014 regarding a recently described approach for nuclear warhead verification based on the cryptographic concept of a zero-knowledge protocol (ZKP) presented in a recent paper authored by Glaser, Barak, and Goldston. A panel of Sandia National Laboratories researchers, whose expertise includes radiation instrumentation design and development, cryptography, and arms control verification implementation, jointly reviewed the paper and identified specific challenges to implementing the approach as well as some opportunities. It was noted that ZKP as used in cryptography is a useful model for the arms control verification problem, but the direct analogy to arms control breaks down quickly. The ZKP methodology for warhead verification fits within the general class of template-based verification techniques, where a reference measurement is used to confirm that a given object is like another object that has already been accepted as a warhead by some other means. This can be a powerful verification approach, but requires independent means to trust the authenticity of the reference warhead - a standard that may be difficult to achieve, which the ZKP authors do not directly address. Despite some technical challenges, the concept of last-minute selection of the pre-loads and equipment could be a valuable component of a verification regime.
Survey of Verification and Validation Techniques for Small Satellite Software Development
NASA Technical Reports Server (NTRS)
Jacklin, Stephen A.
2015-01-01
The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
Extremely accurate sequential verification of RELAP5-3D
Mesina, George L.; Aumiller, David L.; Buschman, Francis X.
2015-11-19
Large computer programs like RELAP5-3D solve complex systems of governing, closure, and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also tests that no unintended consequences result from code development in the following capabilities: repeating a timestep advancement, continuing a run from a restart file, running multiple cases in a single code execution, and coupled/uncoupled modes of operation. Finally, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
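Sequential verification of this kind reduces, in practice, to an automated comparison of successive versions' calculations against a stored baseline, flagging any drift beyond round-off tolerance. The Python sketch below illustrates the idea only; the CSV file names and column layout are hypothetical, and RELAP5-3D's actual tooling is considerably more elaborate.

```python
# Sketch of a sequential-verification check: compare plot variables produced by
# consecutive code versions and flag any drift beyond round-off tolerance.
# File names and the CSV layout (time plus variable columns) are hypothetical.
import csv

def load_run(path):
    """Read a run's plot variables as {column_name: [values...]}."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    return {k: [float(r[k]) for r in rows] for k in rows[0]}

def compare_runs(old, new, rel_tol=1e-12):
    """Return the entries whose values differ between versions."""
    diffs = []
    for var in old:
        for i, (a, b) in enumerate(zip(old[var], new[var])):
            denom = max(abs(a), abs(b), 1e-300)
            if abs(a - b) / denom > rel_tol:
                diffs.append((var, i, a, b))
    return diffs

if __name__ == "__main__":
    baseline = load_run("run_v4.3.4.csv")   # previous accepted version
    candidate = load_run("run_v4.3.5.csv")  # new development version
    for var, i, a, b in compare_runs(baseline, candidate):
        print(f"{var}[{i}]: {a!r} -> {b!r}")
```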
Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D
2014-03-01
Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (e.g., sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy-to-apply and accurate method for addressing verification bias in studies of multiple screening methods.
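The core of the correction is inverse-probability weighting: each verified screen-negative stands in for the unverified negatives sampled at the same rate. The simulated Python sketch below shows that principle only; the prevalence and accuracy values are toy numbers, not from the study, and the full method in the paper is a weighted GEE over multiple assays.

```python
# Minimal sketch of inverse-probability weighting for verification bias:
# screen-positives are all verified (weight 1), while only a fraction p of
# screen-negatives receive the gold standard (weight 1/p).
import random

random.seed(0)
p_verify_neg = 0.1          # sampling fraction among screen-negatives
records = []                # (test_result, disease, weight)
for _ in range(100_000):
    disease = random.random() < 0.05
    test_pos = random.random() < (0.9 if disease else 0.1)  # true Se=0.9, Sp=0.9
    if test_pos:
        records.append((1, disease, 1.0))
    elif random.random() < p_verify_neg:
        records.append((0, disease, 1.0 / p_verify_neg))

tp = sum(w for t, d, w in records if t and d)
fn = sum(w for t, d, w in records if not t and d)
tn = sum(w for t, d, w in records if not t and not d)
fp = sum(w for t, d, w in records if t and not d)
print(f"weighted sensitivity = {tp / (tp + fn):.3f}")   # recovers ~0.90
print(f"weighted specificity = {tn / (tn + fp):.3f}")   # recovers ~0.90
```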
Development of GPS Receiver Kalman Filter Algorithms for Stationary, Low-Dynamics, and High-Dynamics Applications
Sarunic, Peter W.
2016-06-01
This report determines instantaneous estimates of receiver position and then goes on to develop three Kalman-filter-based estimators covering stationary, low-dynamics, and high-dynamics applications, of the kind used in actual GPS receivers.
Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, a combined heat and power system was evaluated based on the Capstone 30kW Microturbine developed by Cain Ind...
78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-12
... developed CBSV as a user-friendly, internet-based application with safeguards that protect the public's information. In addition to the benefit of providing high volume, centralized SSN verification services to users in a secure manner, CBSV provides us with cost and workload management benefits. New Information...
Formal verification and testing: An integrated approach to validating Ada programs
NASA Technical Reports Server (NTRS)
Cohen, Norman H.
1986-01-01
An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described, which provides a context in which formal verification can fit into the industrial development of Ada software.
ERIC Educational Resources Information Center
Garrett, Gary L.; Zinsmeister, Joanne T.
This document reports research focusing on physical therapist and physical therapist assistant role delineation refinement and verification; entry-level role determinations; and translation of these roles into an examination development protocol and examination blueprint specifications. Following an introduction, section 2 describes the survey…
Developing topic-specific search filters for PubMed with click-through data.
Li, J; Lu, Z
2013-01-01
Search filters have been developed and demonstrated for better information access to the immense and ever-growing body of publications in the biomedical domain. However, to date the number of filters remains quite limited because the current filter development methods require significant human efforts in manual document review and filter term selection. In this regard, we aim to investigate automatic methods for generating search filters. We present an automated method to develop topic-specific filters on the basis of users' search logs in PubMed. Specifically, for a given topic, we first detect its relevant user queries and then include their corresponding clicked articles to serve as the topic-relevant document set accordingly. Next, we statistically identify informative terms that best represent the topic-relevant document set using a background set composed of topic irrelevant articles. Lastly, the selected representative terms are combined with Boolean operators and evaluated on benchmark datasets to derive the final filter with the best performance. We applied our method to develop filters for four clinical topics: nephrology, diabetes, pregnancy, and depression. For the nephrology filter, our method obtained performance comparable to the state of the art (sensitivity of 91.3%, specificity of 98.7%, precision of 94.6%, and accuracy of 97.2%). Similarly, high-performing results (over 90% in all measures) were obtained for the other three search filters. Based on PubMed click-through data, we successfully developed a high-performance method for generating topic-specific search filters that is significantly more efficient than existing manual methods. All data sets (topic-relevant and irrelevant document sets) used in this study and a demonstration system are publicly available at http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/downloads/CQ_filter/
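As a rough illustration of the pipeline described above, the Python sketch below scores terms by their over-representation in topic-relevant documents versus a background set, ORs the top terms into a Boolean filter, and evaluates it on a labeled benchmark. The toy documents and the log-ratio scoring rule are illustrative stand-ins, not the authors' exact statistics.

```python
# Hedged sketch of automatic search-filter generation: select terms that are
# over-represented in topic-relevant documents, combine them with OR, and
# measure sensitivity on a small labeled benchmark.
from collections import Counter
from math import log

relevant = ["renal dialysis patient", "kidney failure dialysis", "renal transplant"]
background = ["asthma inhaler trial", "diabetes insulin dose", "kidney stone diet"]

def term_counts(docs):
    """Count, for each term, how many documents contain it."""
    c = Counter()
    for d in docs:
        c.update(set(d.split()))
    return c

rel_c, bg_c = term_counts(relevant), term_counts(background)
scores = {t: log((rel_c[t] + 0.5) / (bg_c[t] + 0.5)) for t in rel_c}
filter_terms = [t for t, s in sorted(scores.items(), key=lambda x: -x[1])[:3]]
print("filter:", " OR ".join(filter_terms))

def matches(doc):
    """A document passes the filter if it contains any filter term."""
    return any(t in doc.split() for t in filter_terms)

benchmark = [("chronic renal failure", 1), ("insulin pump study", 0)]  # 1 = relevant
tp = sum(1 for d, y in benchmark if y and matches(d))
print("sensitivity on benchmark:", tp / sum(y for _, y in benchmark))
```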
Debris control design achievements of the booster separation motors
NASA Technical Reports Server (NTRS)
Smith, G. W.; Chase, C. A.
1985-01-01
The stringent debris control requirements imposed on the design of the Space Shuttle booster separation motor are described, along with the verification program implemented to ensure compliance with debris control objectives. The principal areas emphasized in the design and development of the Booster Separation Motor (BSM) relative to debris control were the propellant formulation and the nozzle closures, which protect the motors from aerodynamic heating and moisture. A description of the motor design requirements, the propellant formulation and verification program, and the nozzle closure design and verification is presented.
NASA Astrophysics Data System (ADS)
Wang, Y. S.; Shen, G. Q.; Guo, H.; Tang, X. L.; Hamade, T.
2013-08-01
In this paper, a roughness model based on human auditory perception (HAP), known as HAP-RM, is developed for the sound quality evaluation (SQE) of vehicle noise. First, interior noise signals are measured for a sample vehicle and prepared for roughness modelling. The HAP-RM model follows the process of sound transfer and perception in the human auditory system by combining the structural filtering function and the nonlinear perception characteristics of the ear. The model is applied to the measured vehicle interior noise signals by considering the factors that affect hearing, such as the modulation and carrier frequencies, time and frequency masking effects, and the correlations of the critical bands. The HAP-RM model is validated by jury tests, in which an anchor-scaled scoring method (ASM) is used for the subjective evaluations. The verification results show that the newly developed model can accurately calculate vehicle noise roughness below 0.6 asper. Further investigation shows that the total roughness of the vehicle interior noise can mainly be attributed to frequency components below 12 Bark. The time-masking effects in the modelling procedure enable the application of the HAP-RM model to stationary and nonstationary vehicle noise signals and to the SQE of other sound-related signals in engineering problems.
NASA Astrophysics Data System (ADS)
Erazo, Kalil; Nagarajaiah, Satish
2017-06-01
In this paper an offline approach for output-only Bayesian identification of stochastic nonlinear systems is presented. The approach is based on a re-parameterization of the joint posterior distribution of the parameters that define a postulated state-space stochastic model class. In the re-parameterization the state predictive distribution is included, marginalized, and estimated recursively in a state estimation step using an unscented Kalman filter, bypassing the state augmentation required by existing online methods. In applications, expectations of functions of the parameters are of interest, which requires the evaluation of potentially high-dimensional integrals; Markov chain Monte Carlo is adopted to sample the posterior distribution and estimate the expectations. The proposed approach is suitable for nonlinear systems subjected to non-stationary inputs whose realization is unknown and that are modeled as stochastic processes. Numerical verification and experimental validation examples illustrate the effectiveness and advantages of the approach, including (i) increased numerical stability with respect to augmented-state unscented Kalman filtering, avoiding divergence of the estimates when the forcing input is unmeasured, and (ii) the ability to handle arbitrary prior and posterior distributions. The experimental validation of the approach is conducted using data from a large-scale structure tested on a shake table. It is shown that the approach is robust to inherent modeling errors in the description of the system and forcing input, providing accurate prediction of the dynamic response when the excitation history is unknown.
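The estimation structure - marginalizing the state with a filter inside an MCMC sampler over the parameters - can be sketched compactly. The Python example below substitutes a linear-Gaussian random walk so that an ordinary Kalman filter supplies the exact marginal likelihood that the paper obtains from an unscented Kalman filter in the nonlinear case; all numerical values are illustrative.

```python
# Sketch: Metropolis sampling over a model parameter with the state marginalized
# by a filter. Here the model is a 1-D random walk observed in Gaussian noise,
# so a plain Kalman filter gives the exact marginal likelihood; the paper's
# nonlinear setting would use an unscented Kalman filter instead.
import math, random

random.seed(1)
R = 0.5 ** 2                                  # known measurement noise variance
true_q = 0.2 ** 2                             # process noise variance to infer
x, ys = 0.0, []
for _ in range(200):                          # simulate observations
    x += random.gauss(0.0, math.sqrt(true_q))
    ys.append(x + random.gauss(0.0, math.sqrt(R)))

def log_lik(q):
    """Kalman-filter marginal log-likelihood of ys for process variance q."""
    m, P, ll = 0.0, 1.0, 0.0
    for y in ys:
        P += q                                # predict
        S = P + R                             # innovation variance
        ll += -0.5 * (math.log(2 * math.pi * S) + (y - m) ** 2 / S)
        K = P / S                             # update
        m += K * (y - m)
        P *= (1 - K)
    return ll

q = 0.1
lq = log_lik(q)
samples = []
for _ in range(5000):                         # random-walk Metropolis on log q
    q_new = q * math.exp(random.gauss(0.0, 0.2))
    l_new = log_lik(q_new)
    # log(q_new/q) is the Jacobian term for the log-scale proposal, flat prior on q
    if math.log(random.random()) < l_new - lq + math.log(q_new / q):
        q, lq = q_new, l_new
    samples.append(q)
print("posterior mean of q:", sum(samples[1000:]) / len(samples[1000:]))
```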
Ghamari, M; Soltanpur, C; Cabrera, S; Romero, R; Martinek, R; Nazeran, H
2016-08-01
Heart rate variability (HRV) signal analysis provides a quantitative marker of autonomic nervous system (ANS) function. A wristband-type wireless photoplethysmographic (PPG) device was custom-designed to collect and analyze the arterial pulse in the wrist. The proposed device comprises an optical sensor to monitor the arterial pulse, a signal conditioning unit to filter and amplify the analog PPG signal, a microcontroller to digitize the analog PPG signal, and a Bluetooth module to transfer the data to a smart device. This paper proposes a novel model that represents the PPG signal as the summation of two Gaussian functions. The paper concludes with a verification procedure for HRV signal analysis during sedentary activities.
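A minimal sketch of the two-Gaussian representation follows: a single beat is fit as the sum of a systolic and a reflected (diastolic) Gaussian. The synthetic beat, the parameter values, and the use of SciPy's least-squares fitter are illustrative assumptions; the device's actual beat segmentation is not shown.

```python
# Sketch: fit one PPG beat as the sum of two Gaussian functions.
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(t, a1, m1, s1, a2, m2, s2):
    """Sum of two Gaussians: amplitudes a, centers m, widths s."""
    return (a1 * np.exp(-(t - m1) ** 2 / (2 * s1 ** 2))
            + a2 * np.exp(-(t - m2) ** 2 / (2 * s2 ** 2)))

t = np.linspace(0.0, 1.0, 200)                      # one beat, normalized time
beat = two_gaussians(t, 1.0, 0.3, 0.08, 0.45, 0.6, 0.12)   # synthetic beat
beat += 0.01 * np.random.default_rng(0).standard_normal(t.size)

p0 = [0.8, 0.25, 0.1, 0.4, 0.65, 0.1]               # rough initial guesses
params, _ = curve_fit(two_gaussians, t, beat, p0=p0)
print("fitted (a1, m1, s1, a2, m2, s2):", np.round(params, 3))
```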
Decomposed Photo Response Non-Uniformity for Digital Forensic Analysis
NASA Astrophysics Data System (ADS)
Li, Yue; Li, Chang-Tsun
The last few years have seen the application of Photo Response Non-Uniformity noise (PRNU) - a unique stochastic fingerprint of image sensors - to various types of digital forensic investigations such as source device identification and integrity verification. In this work we propose a new way of extracting the PRNU noise pattern, called Decomposed PRNU (DPRNU), by exploiting the difference between the physical and artificial color components of photos taken by digital cameras that use a Color Filter Array for interpolating artificial components from physical ones. Experimental results presented in this work show the superiority of the proposed DPRNU over the commonly used version. We also propose a new performance metric, Corrected Positive Rate (CPR), to evaluate the performance of the common PRNU and the proposed DPRNU.
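For orientation, the sketch below implements the common (non-decomposed) PRNU pipeline that DPRNU refines: each image's noise residual is the image minus a denoised copy, and the camera fingerprint is the average residual over many images. The Wiener denoiser and the random stand-in images are illustrative choices, not the authors' method.

```python
# Sketch of basic PRNU fingerprinting: average noise residuals to estimate the
# sensor fingerprint, then correlate a query image's residual against it.
import numpy as np
from scipy.signal import wiener

def noise_residual(img):
    """Residual = image - denoised(image); Wiener filtering is one common choice."""
    return img - wiener(img, (5, 5))

rng = np.random.default_rng(0)
images = [rng.normal(128, 2, (64, 64)) for _ in range(8)]  # stand-in flat frames
fingerprint = np.mean([noise_residual(im) for im in images], axis=0)

# Source attribution: correlate a query image's residual with the fingerprint.
query = images[0]
r = noise_residual(query)
corr = np.corrcoef(r.ravel(), fingerprint.ravel())[0, 1]
print(f"correlation with fingerprint: {corr:.3f}")
```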
Comparison of Nonlinear Filtering Techniques for Lunar Surface Roving Navigation
NASA Technical Reports Server (NTRS)
Kimber, Lemon; Welch, Bryan W.
2008-01-01
Leading up to the Apollo missions, the Extended Kalman Filter, a modified version of the Kalman Filter, was developed to estimate the state of a nonlinear system. Throughout the Apollo missions, Potter's Square Root Filter was used for lunar navigation. Now that NASA is returning to the Moon, the filters used during the Apollo missions must be compared to the filters developed since that time: the Bierman-Thornton Filter (UD) and the Unscented Kalman Filter (UKF). The UD Filter involves factoring the covariance matrix into UDU^T and has accuracy similar to the Square Root Filter while requiring less computation time. Conversely, the UKF, which uses sigma points, is much more computationally intensive than any of the other filters, but it produces the most accurate results. The Extended Kalman Filter, Potter's Square Root Filter, the Bierman-Thornton UD Filter, and the Unscented Kalman Filter each prove to be the most accurate filter depending on the specific conditions of the navigation system.
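The UDU^T factorization at the heart of the Bierman-Thornton filter can be shown in a few lines. The sketch below is a textbook factorization routine, not flight code; propagating U and D in place of P is what preserves the symmetry and positive-definiteness of the covariance in finite precision.

```python
# Sketch of the covariance factorization behind the Bierman-Thornton UD filter:
# P = U D U^T with U unit upper-triangular and D diagonal.
import numpy as np

def ud_factor(P):
    """Return (U, D) with P = U @ np.diag(D) @ U.T, U unit upper-triangular."""
    n = P.shape[0]
    U, D = np.eye(n), np.zeros(n)
    P = P.astype(float).copy()
    for j in range(n - 1, -1, -1):            # work from the last column back
        D[j] = P[j, j]
        for i in range(j):
            U[i, j] = P[i, j] / D[j]
            for k in range(i + 1):            # deflate the remaining block
                P[k, i] -= U[k, j] * D[j] * U[i, j]
    return U, D

P = np.array([[4.0, 2.0, 0.6], [2.0, 3.0, 0.4], [0.6, 0.4, 1.0]])
U, D = ud_factor(P)
print(np.allclose(U @ np.diag(D) @ U.T, P))   # True: factorization reproduces P
```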
Electron/proton spectrometer certification documentation analyses
NASA Technical Reports Server (NTRS)
Gleeson, P.
1972-01-01
A compilation of analyses generated during the development of the electron-proton spectrometer for the Skylab program is presented. The data documents the analyses required by the electron-proton spectrometer verification plan. The verification plan was generated to satisfy the ancillary hardware requirements of the Apollo Applications program. The certification of the spectrometer requires that various tests, inspections, and analyses be documented, approved, and accepted by reliability and quality control personnel of the spectrometer development program.
Component Verification and Certification in NASA Missions
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)
2001-01-01
Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.
A new technique for measuring listening and reading literacy in developing countries
NASA Astrophysics Data System (ADS)
Greene, Barbara A.; Royer, James M.; Anzalone, Stephen
1990-03-01
One problem in evaluating educational interventions in developing countries is the absence of tests that adequately reflect the culture and curriculum. The Sentence Verification Technique is a new procedure for measuring reading and listening comprehension that allows for the development of tests based on materials indigenous to a given culture. The validity of using the Sentence Verification Technique to measure reading comprehension in Grenada was evaluated in the present study. The study involved 786 students at standards 3, 4, and 5. The tests for each standard consisted of passages that varied in difficulty. The students identified as high-ability students in all three standards performed better than those identified as low ability. All students performed better with easier passages. Additionally, students in higher standards performed better than students in lower standards on a given passage. These results supported the claim that the Sentence Verification Technique is a valid measure of reading comprehension in Grenada.
Development and evaluation of evidence-based nursing (EBN) filters and related databases*
Lavin, Mary A.; Krieger, Mary M.; Meyer, Geralyn A.; Spasser, Mark A.; Cvitan, Tome; Reese, Cordie G.; Carlson, Judith H.; Perry, Anne G.; McNary, Patricia
2005-01-01
Objectives: Difficulties encountered in the retrieval of evidence-based nursing (EBN) literature, together with recognition of terminology, research focus, and design differences between evidence-based medicine and nursing, led to the realization that nursing needs its own filter strategies for evidence-based practice. This article describes the development and evaluation of filters that facilitate evidence-based nursing searches. Methods: An inductive, multistep methodology was employed. A sleep search strategy was developed for uniform application to all filters for filter development and evaluation purposes. An EBN matrix was next developed as a framework to illustrate conceptually the placement of nursing-sensitive filters along two axes: horizontally, an adapted nursing process, and vertically, levels of evidence. Nursing diagnosis, patient outcomes, and primary data filters were developed recursively. Through an interface with the PubMed search engine, the EBN matrix filters were inserted into a database that executes filter searches, retrieves citations, and stores and updates retrieved citation sets hourly. For evaluation purposes, the filters were subjected to sensitivity and specificity analyses and retrieval set comparisons. Once the evaluation was complete, hyperlinks were created in the EBN matrix providing access to any one, or a combination, of the completed filters. Subject searches on any topic may be applied to the filters, which interface with PubMed. Results: Sensitivity and specificity for the combined nursing diagnosis and primary data filter were 64% and 99%, respectively; for the patient outcomes filter, the results were 75% and 71%, respectively. Comparisons were made between the EBN matrix filters (nursing diagnosis and primary data) and PubMed's Clinical Queries (diagnosis and sensitivity) filters. Additional comparisons examined publication types and indexing differences. Review articles accounted for the majority of the publication-type differences, because "review" was accepted by the CQ filter but excluded ("NOT'd") by the EBN filter. Indexing comparisons revealed that although the term "nursing diagnosis" is in Medical Subject Headings (MeSH), the nursing diagnoses themselves (e.g., sleep deprivation, disturbed sleep pattern) are not indexed as nursing diagnoses. As a result, abstracts deemed to contain appropriate nursing diagnoses by the EBN filter were not accepted by the CQ diagnosis filter. Conclusions: The EBN filter capture of desired articles may be enhanced by further refinement to achieve a greater degree of filter sensitivity. Retrieval set comparisons revealed publication-type differences and indexing issues: the EBN matrix filter excluded "review" while the CQ filter did not, and indexing issues explained the retrieval of articles deemed appropriate by the EBN filter matrix but not included in the CQ retrieval. These results have MeSH definition and indexing implications as well as implications for clinical decision support in nursing practice. PMID:15685282
Software engineering and automatic continuous verification of scientific software
NASA Astrophysics Data System (ADS)
Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.
2011-12-01
Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness among scientists of the pitfalls of software engineering. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employs best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications, from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and that is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code, and it has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting, give the group, partners, and other Fluidity users an easy-to-use platform to collaborate, and allow the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification, and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. Testing code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparison to analytical solutions via the method of manufactured solutions. By developing and verifying code in tandem we avoid a number of pitfalls in scientific software development, and we advocate similar procedures for other scientific code applications.
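The method of manufactured solutions mentioned above admits a compact illustration: pick an exact solution, derive the matching source term analytically, and confirm that the observed convergence order matches the scheme's design order. The self-contained Python sketch below does this for a second-order Poisson discretization; it is a generic demonstration, unrelated to Fluidity's own test harness.

```python
# Sketch of verification by manufactured solutions: for -u'' = f on (0,1) with
# u(0)=u(1)=0, manufacture u(x)=sin(pi x), so f = pi^2 sin(pi x); then check
# that the centered-difference error shrinks at second order under refinement.
import numpy as np

def solve(n):
    """Return the max error of the n-cell centered-difference solution."""
    h = 1.0 / n
    x = np.linspace(0, 1, n + 1)[1:-1]          # interior grid points
    f = np.pi ** 2 * np.sin(np.pi * x)          # source from manufactured u
    A = (np.diag(2 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h ** 2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x)))

e1, e2 = solve(32), solve(64)
print("observed order ~", np.log2(e1 / e2))     # ~2 for this 2nd-order scheme
```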
The FoReVer Methodology: A MBSE Framework for Formal Verification
NASA Astrophysics Data System (ADS)
Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald
2013-08-01
The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the space industry and has been addressed so far through rigorous system and software development processes and stringent verification and validation regimes. The Model Based Space System Engineering process (MBSSE), derived in the System and Software Functional Requirement Techniques study (SSFRT), focused on the application of model-based engineering technologies to support the space system and software development processes, from mission-level requirements to software implementation, through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, which aims to develop methodological, theoretical, and technological support for a systematic approach to space avionics system development in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties at different stages, from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.
A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation
NASA Astrophysics Data System (ADS)
Yoshida, Toshio
In the software development process for embedded real-time systems, the design of the task cooperation process is very important. The cooperating process of such tasks is specified by task cooperation patterns, and adoption of unsuitable patterns has a fatal influence on system performance, quality, and extendibility. In order to prevent repetitive work caused by a shortage of task cooperation performance, it is necessary to verify task cooperation patterns at an early software development stage. However, this is very difficult while the task program code is not yet complete. Therefore, we propose a verification method that uses skeleton task program code together with a real-time kernel that records all events during software execution, such as system calls issued by task program code, external interrupts, and timer interrupts. In order to evaluate the proposed verification method, we applied it to the software development process of a mechatronics control system.
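The idea can be illustrated with a toy trace checker: skeleton tasks emit events into a recorded trace, and the intended cooperation pattern is verified as an ordered subsequence of that trace before the real task bodies exist. The event names, the expected pattern, and the trace below are hypothetical, loosely styled on ITRON-like system calls.

```python
# Illustrative sketch: verify a task cooperation pattern against a recorded
# event trace. All task names, system-call names, and events are hypothetical.
EXPECTED_PATTERN = [
    ("task_A", "snd_dtq"),     # A sends data to the queue...
    ("task_B", "rcv_dtq"),     # ...B receives it...
    ("task_B", "sig_sem"),     # ...and signals completion.
]

recorded_trace = [
    ("timer", "isr_enter"),
    ("task_A", "snd_dtq"),
    ("task_B", "rcv_dtq"),
    ("task_B", "sig_sem"),
    ("timer", "isr_enter"),
]

def pattern_occurs(trace, pattern):
    """True if the pattern's events appear in the trace in order (not
    necessarily contiguously) - the cooperation property being checked."""
    it = iter(trace)
    return all(any(ev == want for ev in it) for want in pattern)

print("cooperation pattern satisfied:",
      pattern_occurs(recorded_trace, EXPECTED_PATTERN))
```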
Guided-mode resonance nanophotonics in materially sparse architectures
NASA Astrophysics Data System (ADS)
Magnusson, Robert; Niraula, Manoj; Yoon, Jae W.; Ko, Yeong H.; Lee, Kyu J.
2016-03-01
The guided-mode resonance (GMR) concept refers to lateral quasi-guided waveguide modes induced in periodic layers. Whereas these effects have been known for a long time, new attributes and innovations continue to appear. Here, we review some recent progress in this field with emphasis on sparse, or minimal, device embodiments. We discuss properties of wideband resonant reflectors designed with gratings in which the grating ridges are matched to an identical material to eliminate local reflections and phase changes. This critical interface therefore possesses zero refractive-index contrast; hence we call them "zero-contrast gratings." Applying this architecture, we present single-layer, wideband reflectors that are robust under experimentally realistic parametric variations. We introduce a new class of reflectors and polarizers fashioned with dielectric nanowire grids that are mostly empty space. Computed results predict high reflection and attendant polarization extinction for these sparse lattices. Experimental verification with Si nanowire grids yields a ~200-nm-wide band of high reflection for one polarization state and free transmission of the orthogonal state. Finally, we present bandpass filters using all-dielectric resonant gratings. We design, fabricate, and test nanostructured single-layer filters exhibiting high efficiency and sub-nanometer-wide passbands surrounded by 100-nm-wide stopbands.
Intelligent voltage control strategy for three-phase UPS inverters with output LC filter
NASA Astrophysics Data System (ADS)
Jung, J. W.; Leu, V. Q.; Dang, D. Q.; Do, T. D.; Mwasilu, F.; Choi, H. H.
2015-08-01
This paper presents a supervisory fuzzy neural network control (SFNNC) method for a three-phase inverter for uninterruptible power supplies (UPSs). The proposed voltage controller comprises a fuzzy neural network control (FNNC) term and a supervisory control term. The FNNC term is employed to estimate the uncertain terms, and the supervisory control term is designed based on the sliding mode technique to stabilise the system dynamic errors. To improve the learning capability, the FNNC term incorporates an online parameter training methodology using the gradient descent method and Lyapunov stability theory. In addition, a linear load-current observer that estimates the load currents is used to eliminate the load-current sensors. The proposed SFNN controller and the observer are robust to filter inductance variations, and their stability analyses are described in detail. Experimental results obtained on a prototype UPS test bed with a TMS320F28335 DSP are presented to validate the feasibility of the proposed scheme. Verification results demonstrate that the proposed control strategy achieves smaller steady-state error and lower total harmonic distortion under nonlinear or unbalanced loads compared to the conventional sliding mode control method.
Tracking of multiple targets using online learning for reference model adaptation.
Pernkopf, Franz
2008-12-01
Recently, much work has been done on multiple-object tracking on the one hand and on reference model adaptation for single-object trackers on the other. In this paper, we do both: tracking of multiple objects (faces of people) in a meeting scenario, and online learning to incrementally update the models of the tracked objects to account for appearance changes during tracking. Additionally, we automatically initialize and terminate tracking of individual objects based on low-level features, i.e., face color, face size, and object movement. Unlike our approach, many methods assume that the target region has been initialized by hand in the first frame. For tracking, a particle filter is incorporated to propagate sample distributions over time. We discuss the close relationship between our implemented tracker based on particle filters and genetic algorithms. Numerous experiments on meeting data demonstrate the capabilities of our tracking approach. Additionally, we provide an empirical verification of the reference model learning during tracking of indoor and outdoor scenes, which supports more robust tracking; to this end, we report the average standard deviation of the trajectories over numerous tracking runs as a function of the learning rate.
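The propagate/weight/resample loop of such a particle filter is easy to sketch. The Python example below tracks a drifting 1-D position, with a Gaussian measurement model standing in for the tracker's color likelihood over face regions; all parameters are illustrative.

```python
# Minimal particle-filter sketch: propagate particles with a motion model,
# re-weight by the measurement likelihood, and resample when weights degenerate.
import numpy as np

rng = np.random.default_rng(0)
n, sigma_q, sigma_r = 500, 0.5, 1.0
particles = rng.normal(0.0, 1.0, n)             # initial sample distribution
weights = np.full(n, 1.0 / n)

truth = 0.0
for step in range(20):
    truth += 0.3                                 # object drifts to the right
    z = truth + rng.normal(0.0, sigma_r)         # noisy observation
    particles += rng.normal(0.0, sigma_q, n)     # propagate (motion model)
    weights *= np.exp(-0.5 * ((z - particles) / sigma_r) ** 2)  # re-weight
    weights /= weights.sum()
    if 1.0 / np.sum(weights ** 2) < n / 2:       # resample when ESS collapses
        idx = rng.choice(n, n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)

print(f"truth={truth:.2f}  estimate={np.sum(weights * particles):.2f}")
```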
Naval Research Laboratory Industrial Chemical Analysis and Respiratory Filter Standards Development
2017-09-29
Sutto, Thomas E.
Approved for public release; distribution is unlimited. The standards-development approach, developed by NRL, is tested by examining filter behavior against a number of chemicals to determine if the NRL approach resulted in the ...
A Hybrid On-line Verification Method of Relay Setting
NASA Astrophysics Data System (ADS)
Gao, Wangyuan; Chen, Qing; Si, Ji; Huang, Xin
2017-05-01
Along with the rapid development of the power industry, grid structures are becoming more sophisticated, and the validity and rationality of protective relay settings are vital to the security of power systems. To increase that security, it is essential to verify the setting values of relays online. Traditional verification methods mainly include comparison of protection ranges and comparison of calculated setting values. For on-line verification, verifying speed is the key. Comparing protection ranges gives accurate results, but the computational burden is heavy and the verification is slow; comparing calculated setting values is much faster, but the result is conservative and inaccurate. Taking overcurrent protection as an example, this paper analyses the advantages and disadvantages of the two traditional methods and proposes a hybrid on-line verification method that combines the advantages of both. This hybrid method can meet the requirements of accurate on-line verification.
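As an illustration of the faster "calculated setting value" comparison, the Python sketch below checks a stage-III overcurrent pickup against the maximum load current and the minimum end-of-line fault current. The coefficient values are typical textbook choices, not values from the paper.

```python
# Hedged sketch of a calculated-setting-value check for an overcurrent relay:
# the pickup must ride above maximum load current (with reliability and return
# coefficients) yet remain sensitive to the weakest expected fault.
def verify_overcurrent_setting(i_set, i_load_max, i_fault_min,
                               k_rel=1.2, k_re=0.85, k_sen_min=1.25):
    """Return (ok, messages) for a pickup current i_set in amperes."""
    msgs = []
    i_required = k_rel * i_load_max / k_re       # ride-through of maximum load
    if i_set < i_required:
        msgs.append(f"pickup {i_set:.0f} A below required {i_required:.0f} A")
    k_sen = i_fault_min / i_set                  # sensitivity to weakest fault
    if k_sen < k_sen_min:
        msgs.append(f"sensitivity {k_sen:.2f} below minimum {k_sen_min}")
    return (not msgs, msgs)

ok, msgs = verify_overcurrent_setting(i_set=480, i_load_max=300, i_fault_min=900)
print("setting OK" if ok else "; ".join(msgs))
```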
NASA Astrophysics Data System (ADS)
McKellip, Rodney; Yuan, Ding; Graham, William; Holland, Donald E.; Stone, David; Walser, William E.; Mao, Chengye
1997-06-01
The number of available spaceborne and airborne systems will dramatically increase over the next few years. A common systematic approach toward verification of these systems will become important for comparing the systems' operational performance. The Commercial Remote Sensing Program at the John C. Stennis Space Center (SSC) in Mississippi has developed design requirements for a remote sensing verification target range to provide a means to evaluate spatial, spectral, and radiometric performance of optical digital remote sensing systems. The verification target range consists of spatial, spectral, and radiometric targets painted on a 150- by 150-meter concrete pad located at SSC. The design criteria for this target range are based upon work over a smaller, prototypical target range at SSC during 1996. This paper outlines the purpose and design of the verification target range based upon an understanding of the systems to be evaluated as well as data analysis results from the prototypical target range.