Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Angeli, George
2014-08-01
This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependencies and sequences are modeled using Activity Diagrams. The methodology employed also ties into the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource-loaded PMCS task-based activities, ensuring all requirements will be verified.
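The plan structure described above (a Verification Plan per requirement, one Verification Activity per method, activities grouped into Verification Events) can be sketched as a small data model. This is only an illustrative sketch; the class names, fields, and the "REQ-0042" identifier are invented for the example and are not the project's actual SysML stereotypes.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class Method(Enum):
    TEST = "Test"
    ANALYSIS = "Analysis"
    INSPECTION = "Inspection"
    DEMONSTRATION = "Demonstration"

@dataclass
class VerificationPlan:
    requirement_id: str
    verification_requirement: str
    success_criteria: str
    methods: List[Method]          # one or more verification methods
    level: str                     # e.g. "System", "Subsystem"
    owner: str

@dataclass
class VerificationActivity:        # one activity per method per requirement
    plan: VerificationPlan
    method: Method

@dataclass
class VerificationEvent:           # activities that can be executed together
    name: str
    activities: List[VerificationActivity] = field(default_factory=list)

def activities_for(plan: VerificationPlan) -> List[VerificationActivity]:
    # Each verification method of a requirement becomes its own activity.
    return [VerificationActivity(plan, m) for m in plan.methods]

plan = VerificationPlan(
    requirement_id="REQ-0042",     # hypothetical identifier
    verification_requirement="Verify the image-quality budget is met",
    success_criteria="Measured value within the allocated budget",
    methods=[Method.TEST, Method.ANALYSIS],
    level="System",
    owner="Systems Engineering")
event = VerificationEvent("Integration test campaign", activities_for(plan))
```

Mapping each `VerificationActivity` onto a scheduled PMCS task step would then close the traceability chain the paper describes.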
DOE Office of Scientific and Technical Information (OSTI.GOV)
MCGREW, D.L.
2001-10-31
This Requirements Verification Report provides the traceability of how Project W-314 fulfilled the Project Development Specification requirements for the AN Farm to 200E Waste Transfer System Upgrade package.
Requirements, Verification, and Compliance (RVC) Database Tool
NASA Technical Reports Server (NTRS)
Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale
2001-01-01
This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information, normally contained within documents (e.g., specifications and plans), is instead maintained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meet the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
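The core of such a tool is a traceability join between requirements and their verification records. A minimal sketch, assuming nothing about the actual RVC schema (the table layout and requirement texts below are invented), shows how a compliance report falls out of a single query:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE requirement (
    id   TEXT PRIMARY KEY,
    text TEXT NOT NULL);
CREATE TABLE verification (
    req_id           TEXT REFERENCES requirement(id),
    method           TEXT,      -- Test, Analysis, Inspection, Demonstration
    success_criteria TEXT,
    status           TEXT);     -- e.g. Open, Closed
""")
con.executemany("INSERT INTO requirement VALUES (?, ?)", [
    ("R-001", "Weld current shall not exceed the qualified range."),
    ("R-002", "Chamber pressure shall be monitored continuously.")])
con.execute("INSERT INTO verification VALUES "
            "('R-001', 'Test', 'Current within range on all runs', 'Closed')")

# Compliance report: every requirement, marked 'Open' when no
# verification record exists for it yet.
report = dict(con.execute("""
    SELECT r.id, COALESCE(v.status, 'Open')
    FROM requirement r LEFT JOIN verification v ON v.req_id = r.id
""").fetchall())
```

The `LEFT JOIN` is what makes unverified requirements visible, which is exactly the status-tracking benefit the abstract attributes to moving from documents to a database.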
Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation
NASA Technical Reports Server (NTRS)
Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna
2000-01-01
This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.
Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk
NASA Technical Reports Server (NTRS)
Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.
2014-01-01
The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification were described. Traditional verification techniques have two major problems: testing occurs at the prototype stage, where error discovery can be quite costly, and it is impossible to test all potential interactions, leaving some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, 1 research action, 5 operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.
Formal System Verification for Trustworthy Embedded Systems
2011-04-19
microkernel basis. We had previously achieved code-level formal verification of the seL4 microkernel [3]. In the present project, over 12 months with 0.6 FTE...project, we designed and implemented a secure network access device (SAC) on top of the verified seL4 microkernel. The device allows a trusted front...Engelhardt, Rafal Kolanski, Michael Norrish, Thomas Sewell, Harvey Tuch, and Simon Winwood. seL4: Formal verification of an OS kernel. CACM, 53(6):107
C formal verification with unix communication and concurrency
NASA Technical Reports Server (NTRS)
Hoover, Doug N.
1990-01-01
The results of a NASA SBIR project are presented in which CSP-Ariel, a verification system for C programs which use Unix system calls for concurrent programming, interprocess communication, and file input and output, was developed. This project builds on ORA's Ariel C verification system, using the formalism of Hoare's book, Communicating Sequential Processes (CSP), to model concurrency and communication. The system runs in ORA's Clio theorem proving environment. The use of CSP to model Unix concurrency and sketch the CSP semantics of a simple concurrent program is outlined. Plans for further development of CSP-Ariel are discussed. This paper is presented in viewgraph form.
Formal Verification of the AAMP-FV Microcode
NASA Technical Reports Server (NTRS)
Miller, Steven P.; Greve, David A.; Wilding, Matthew M.; Srivas, Mandayam
1999-01-01
This report describes the experiences of Collins Avionics & Communications and SRI International in formally specifying and verifying the microcode in a Rockwell proprietary microprocessor, the AAMP-FV, using the PVS verification system. This project built extensively on earlier experiences using PVS to verify the microcode in the AAMP5, a complex, pipelined microprocessor designed for use in avionics displays and global positioning systems. While the AAMP5 experiment demonstrated the technical feasibility of formal verification of microcode, the steep learning curve encountered left unanswered the question of whether it could be performed at reasonable cost. The AAMP-FV project was conducted to determine whether the experience gained on the AAMP5 project could be used to make formal verification of microcode cost effective for safety-critical and high volume devices.
Idaho out-of-service verification field operational test
DOT National Transportation Integrated Search
2000-02-01
The Out-of-Service Verification Field Operational Test Project was initiated in 1994. The purpose of the project was to test the feasibility of using sensors and a computerized tracking system to augment the ability of inspectors to monitor and contr...
NASA Astrophysics Data System (ADS)
Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui
2011-05-01
During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Program short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to objectively verify these experiments and to clarify the problems associated with the current MEP systems through these shared experiments. Verification was performed using the MEP outputs interpolated into a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spreads grew as the forecast time increased, and the ensemble mean reduced the forecast errors compared with the individual control forecasts in the verification against the analysis fields. However, each system exhibited individual characteristics according to the MEP method. Some participants used physical perturbation methods. The significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that it is necessary to pay careful attention to physical perturbations.
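The central comparison in this kind of verification, ensemble-mean error versus control-forecast error against an analysis field, can be illustrated with synthetic data. All numbers below are invented for the sketch; with 20 members carrying independent errors, averaging shrinks the error roughly by a factor of the square root of the member count:

```python
import numpy as np

rng = np.random.default_rng(0)
truth = rng.normal(size=500)                        # stand-in for the analysis field
control = truth + rng.normal(0.0, 1.0, 500)         # one control forecast, unit error
members = truth + rng.normal(0.0, 1.0, (20, 500))   # 20 members, independent errors

ens_mean = members.mean(axis=0)                     # ensemble mean forecast
spread = float(members.std(axis=0, ddof=1).mean())  # mean ensemble spread

def rmse(forecast):
    # root-mean-square error against the verifying "analysis"
    return float(np.sqrt(np.mean((forecast - truth) ** 2)))
```

A well-calibrated system also keeps the spread comparable to the ensemble-mean error; the ME deficiency noted in the abstract is a bias that averaging alone cannot remove.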
Formal verification of an avionics microprocessor
NASA Technical Reports Server (NTRS)
Srivas, Mandayam K.; Miller, Steven P.
1995-01-01
Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.
NASA Astrophysics Data System (ADS)
Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard
2006-05-01
A GMM-based audio-visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing are accomplished on DCT-based extracted features of the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM-based classifier. Fusion of both audio and video modalities for audio-visual speaker verification is compared with face verification and speaker verification systems. To improve the robustness of the multimodal biometric identity verification system, an audio-visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM-based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with a prospect of experimenting on the newly developed PDAtabase created within the scope of the SecurePhone project.
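A GMM-based verification score is conventionally a log-likelihood ratio between a client model and a background (world) model, with a threshold deciding accept/reject. The sketch below is a deliberate simplification, not the BECARS system: it uses a single diagonal Gaussian per model as a one-component stand-in for a GMM, and the "features" are invented toy data.

```python
import numpy as np

rng = np.random.default_rng(1)
client_train = rng.normal(2.0, 1.0, (500, 4))   # toy client features
world_train  = rng.normal(0.0, 1.5, (2000, 4))  # toy background features

def fit_gauss(x):
    # Single diagonal Gaussian: a one-component stand-in for a GMM.
    return x.mean(axis=0), x.var(axis=0)

def loglik(x, mu, var):
    # Average per-sample, per-dimension Gaussian log-likelihood.
    return float(np.mean(-0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)))

cm, cv = fit_gauss(client_train)
wm, wv = fit_gauss(world_train)

def llr(x):
    # Accept the claimed identity when llr(x) exceeds a tuned threshold.
    return loglik(x, cm, cv) - loglik(x, wm, wv)

genuine  = llr(rng.normal(2.0, 1.0, (100, 4)))  # trial from the true client
impostor = llr(rng.normal(0.0, 1.5, (100, 4)))  # trial from an impostor
```

The voice-transformation attack studied in the paper amounts to an impostor shifting their features toward the client distribution, which raises `llr` for impostor trials and degrades the margin between the two scores.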
Documentation requirements for Applications Systems Verification and Transfer projects (ASVTs)
NASA Technical Reports Server (NTRS)
Suchy, J. T.
1977-01-01
NASA's Application Systems Verification and Transfer Projects (ASVTs) are deliberate efforts to facilitate the transfer of applications of NASA-developed space technology to users such as federal agencies, state and local governments, regional planning groups, public service institutions, and private industry. This study focused on the role of documentation in facilitating technology transfer both to primary users identified during project planning and to others with similar information needs. It was understood that documentation can be used effectively when it is combined with informal (primarily verbal) communication within each user community and with other formal techniques such as organized demonstrations and training programs. Documentation examples from eight ASVT projects and one potential project were examined to give scope to the investigation.
Guidance and Control Software Project Data - Volume 3: Verification Documents
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J. (Editor)
2008-01-01
The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.
Design Authority in the Test Programme Definition: The Alenia Spazio Experience
NASA Astrophysics Data System (ADS)
Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.
2004-08-01
In addition, since the Verification and Test Programme is a significant part of the spacecraft development life cycle in terms of cost and time, the discussion in question very often aims to optimize the verification campaign by deleting or limiting some testing activities. The increasing market pressure to reduce project schedules and costs is giving rise to a dialectic process inside project teams, involving programme management and design authorities, in order to optimize the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, drawn from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 part 2 "Verification" and ECSS-E-10 part 3 "Testing" [1]) are tailored to the specific project on the basis of its peculiar mission constraints. The Model Philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects (including, for example, test requirements and facilities), as shown in Fig. 1 (from ECSS-E-10). The cases considered are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are most significant. Considering thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, the cases are indicated in which a proper Thermal Balance Test is mandatory, and others, in the presence of a more recurrent design, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted.
Similar considerations are summarized for mechanical testing, with particular emphasis on the importance of Modal Survey, Static and Sine Vibration Tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted.
Fig. 1. Model philosophy, Verification and Test Programme definition.
The verification of the project requirements is planned through a combination of suitable verification methods (in particular Analysis and Test) at the different verification levels (from System down to Equipment), in the proper verification stages (e.g. in Qualification and Acceptance).
Space shuttle engineering and operations support. Avionics system engineering
NASA Technical Reports Server (NTRS)
Broome, P. A.; Neubaur, R. J.; Welsh, R. T.
1976-01-01
The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.
Intermediate Experimental Vehicle (IXV): Avionics and Software of the ESA Reentry Demonstrator
NASA Astrophysics Data System (ADS)
Malucchi, Giovanni; Dussy, Stephane; Camuffo, Fabrizio
2012-08-01
The IXV project is conceived as a technology platform that takes a step forward with respect to the Atmospheric Reentry Demonstrator (ARD), by increasing the system maneuverability and verifying the critical technology performances against a wider reentry corridor. The main objective is to design, develop and perform an in-flight verification of an autonomous lifting, aerodynamically controlled (by a combined use of thrusters and aerodynamic surfaces) reentry system. The project also includes the verification and experimentation of a set of critical reentry technologies and disciplines: Thermal Protection System (TPS), for verification and characterization of thermal protection technologies in a representative operational environment; Aerodynamics-Aerothermodynamics (AED-ATD), for understanding and validation of aerodynamic and aerothermodynamic phenomena with improvement of design tools; Guidance, Navigation and Control (GNC), for verification of guidance, navigation and control techniques in a representative operational environment (i.e. reentry from Low Earth Orbit); Flight Dynamics, to update and validate the vehicle model during actual flight, focused on stability and control derivatives. The above activities are being performed through the implementation of a strict system design-to-cost approach with a proto-flight model development philosophy. In 2008 and 2009, the IXV project activities reached the successful completion of the project Phase-B, including the System PDR, and early project Phase-C. In 2010, following a re-organization of the industrial consortium, the IXV project successfully completed a design consolidation leading to an optimization of the technical baseline, including the GNC, avionics (i.e. power, data handling, radio frequency and telemetry), measurement sensors, hot and cold composite structures, and thermal protections and control, with significant improvements of the main system budgets. The project successfully closed the System CDR during 2011 and is currently running Phase-D, with the target to be launched with Vega from Kourou in 2014. The paper will provide an overview of the IXV design and mission objectives in the frame of the overall atmospheric reentry activities, focusing on the avionics and software architecture and design.
Definition of ground test for Large Space Structure (LSS) control verification
NASA Technical Reports Server (NTRS)
Waites, H. B.; Doane, G. B., III; Tollison, D. K.
1984-01-01
An overview for the definition of a ground test for the verification of Large Space Structure (LSS) control is given. The definition contains information on the description of the LSS ground verification experiment, the project management scheme, the design, development, fabrication and checkout of the subsystems, the systems engineering and integration, the hardware subsystems, the software, and a summary which includes future LSS ground test plans. Upon completion of these items, NASA/Marshall Space Flight Center will have an LSS ground test facility which will provide sufficient data on dynamics and control verification of LSS so that LSS flight system operations can be reasonably ensured.
Cleared for Launch - Lessons Learned from the OSIRIS-REx System Requirements Verification Program
NASA Technical Reports Server (NTRS)
Stevens, Craig; Adams, Angela; Williams, Bradley; Goodloe, Colby
2017-01-01
Requirements verification of a large flight system is a challenge. It is especially challenging for engineers taking on their first role in space systems engineering. This paper describes our approach to verification of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) system requirements. It also captures lessons learned along the way from developing systems engineers embroiled in this process. We begin with an overview of the mission and science objectives as well as the project requirements verification program strategy. A description of the requirements flow down is presented, including our implementation for managing the thousands of program and element level requirements and associated verification data. We discuss both successes and methods to improve the management of this data across multiple organizational interfaces. Our approach to verifying system requirements at multiple levels of assembly is presented using examples from our work at instrument, spacecraft, and ground segment levels. We include a discussion of system end-to-end testing limitations and their impacts to the verification program. Finally, we describe lessons learned that are applicable to all emerging space systems engineers using our unique perspectives across multiple organizations of a large NASA program.
SU-E-I-56: Scan Angle Reduction for a Limited-Angle Intrafraction Verification (LIVE) System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, L; Zhang, Y; Yin, F
Purpose: To develop a novel adaptive reconstruction strategy to further reduce the scanning angle required by the limited-angle intrafraction verification (LIVE) system for intrafraction verification. Methods: LIVE acquires limited-angle MV projections from the exit fluence of the arc treatment beam or during gantry rotation between static beams. Orthogonal limited-angle kV projections are also acquired simultaneously to provide additional information. LIVE considers the on-board 4D-CBCT images as a deformation of the prior 4D-CT images, and solves the deformation field based on deformation models and a data fidelity constraint. LIVE reaches a checkpoint after a limited-angle scan, and reconstructs 4D-CBCT for intrafraction verification at the checkpoint. In the adaptive reconstruction strategy, a larger scanning angle of 30° is used for the first checkpoint, and smaller scanning angles of 15° are used for subsequent checkpoints. The on-board images reconstructed at the previous adjacent checkpoint are used as the prior images for reconstruction at the current checkpoint. As the algorithm only needs to reconstruct the small deformation that occurred between adjacent checkpoints, projections from a smaller scan angle provide enough information for the reconstruction. XCAT was used to simulate a tumor motion baseline drift of 2 mm along the sup-inf direction at every subsequent checkpoint, which are 15° apart. The adaptive reconstruction strategy was used to reconstruct the images at each checkpoint using orthogonal 15° kV and MV projections. Results: Results showed that LIVE reconstructed the tumor volumes accurately using orthogonal 15° kV-MV projections. Volume percentage differences (VPDs) were within 5% and center-of-mass shifts (COMS) were within 1 mm for reconstruction at all checkpoints.
Conclusion: It is feasible to use an adaptive reconstruction strategy to further reduce the scan angle needed by LIVE, allowing faster and more frequent intrafraction verification to minimize treatment errors in lung cancer treatments. Grant from Varian Medical Systems.
System verification and validation: a fundamental systems engineering task
NASA Astrophysics Data System (ADS)
Ansorge, Wolfgang R.
2004-09-01
Systems Engineering (SE) is the discipline in a project management team which transfers the user's operational needs and justifications for an Extremely Large Telescope (ELT), or any other telescope, into a set of validated required system performance characteristics. It subsequently transfers these validated required system performance characteristics into a validated system configuration, and eventually into the assembled, integrated telescope system with verified performance characteristics, providing "objective evidence that the particular requirements for the specified intended use are fulfilled". The latter is the ISO Standard 8402 definition of "Validation". This presentation describes the verification and validation processes of an ELT project and outlines the key role Systems Engineering plays in these processes throughout all project phases. If these processes are implemented correctly into the project execution and are started at the proper time, namely at the very beginning of the project, and if all capabilities of experienced system engineers are used, the project costs and the life-cycle costs of the telescope system can be reduced by 25 to 50%. The intention of this article is to motivate and encourage project managers of astronomical telescopes and scientific instruments to involve the entire spectrum of Systems Engineering capabilities, performed by trained and experienced SYSTEM engineers, for the benefit of the project, by explaining the importance of Systems Engineering in the AIV and validation processes.
Verification of micro-scale photogrammetry for smooth three-dimensional object measurement
NASA Astrophysics Data System (ADS)
Sims-Waterhouse, Danny; Piano, Samanta; Leach, Richard
2017-05-01
By using sub-millimetre laser speckle pattern projection, we show that photogrammetry systems are able to measure smooth three-dimensional objects with surface height deviations of less than 1 μm. The projection of laser speckle patterns allows correspondences on the surface of smooth spheres to be found and, as a result, verification artefacts with low surface height deviations were measured. A combination of VDI/VDE and ISO standards was also utilised to provide a complete verification method and determine the quality parameters for the system under test. Applying the proposed method to a photogrammetry system, a 5 mm radius sphere was measured with an expanded uncertainty of 8.5 μm for sizing errors, and 16.6 μm for form errors, at a 95% confidence interval. Sphere spacing lengths between 6 mm and 10 mm were also measured by the photogrammetry system and were found to have expanded uncertainties of around 20 μm at a 95% confidence interval.
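Expanded uncertainties at roughly 95% confidence, like those quoted above, are conventionally obtained by combining the standard uncertainty contributions in quadrature and multiplying by a coverage factor k = 2 (following the GUM). The sketch below is generic, not this paper's uncertainty budget; the readings and the Type B value are invented for illustration:

```python
import math
import statistics

# Repeated radius measurements of a reference sphere, in mm (invented values).
readings = [5.004, 4.997, 5.010, 5.002, 4.995, 5.006, 5.001, 4.999]

# Type A: standard uncertainty of the mean from repeated observations.
u_repeat = statistics.stdev(readings) / math.sqrt(len(readings))

# Type B: assumed calibration-certificate contribution, in mm (hypothetical).
u_cal = 0.003

u_c = math.hypot(u_repeat, u_cal)  # combined standard uncertainty (quadrature)
U = 2 * u_c                        # expanded uncertainty, k = 2 (~95 % coverage)
```

A real budget for a photogrammetry system would add further terms (scale-bar calibration, thermal drift, matching error) in the same quadrature sum before applying k.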
Formal Verification for a Next-Generation Space Shuttle
NASA Technical Reports Server (NTRS)
Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)
2002-01-01
This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.
MESA: Message-Based System Analysis Using Runtime Verification
NASA Technical Reports Server (NTRS)
Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter
2017-01-01
In this paper, we present a novel approach and framework for runtime verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a nonintrusive approach, by connecting to a variety of network interfaces. Due to a large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
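Duplicate and out-of-order messages are exactly the kind of trace property such a monitor checks. A minimal stand-alone sketch (plain Python, not the TraceContract DSL; the assumption that each message carries a unique, increasing sequence number is mine, made for the example):

```python
def check_stream(messages):
    """Flag duplicate and out-of-order messages in a sequenced stream.

    Each message is a (sequence_number, payload) pair; sequence numbers are
    assumed unique and increasing within a stream (an assumption of this sketch).
    """
    seen, errors, highest = set(), [], None
    for seq, _payload in messages:
        if seq in seen:
            errors.append(("duplicate", seq))
        elif highest is not None and seq < highest:
            errors.append(("out-of-order", seq))
        seen.add(seq)
        highest = seq if highest is None else max(highest, seq)
    return errors

# Message 2 arrives twice, and message 4 arrives after message 5.
errs = check_stream([(1, "a"), (2, "b"), (2, "b"), (5, "c"), (4, "d")])
```

In a nonintrusive deployment like MESA's, a checker of this shape would be fed from the message-queue tap rather than from instrumented application code.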
Towards the Formal Verification of a Distributed Real-Time Automotive System
NASA Technical Reports Server (NTRS)
Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey
2010-01-01
We present the status of a project which aims at building, and formally and pervasively verifying, a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).
Simulation environment based on the Universal Verification Methodology
NASA Astrophysics Data System (ADS)
Fiergolski, A.
2017-01-01
Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment. In this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
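The CDV loop described above (random legal stimuli, a scoreboard comparing the DUT against a reference model, and coverage bins) can be sketched in a few lines of Python. The "DUT" here is a stand-in toy adder, not any of the designs mentioned, and the coverage model is deliberately minimal:

```python
import random

# Minimal coverage-driven verification loop: random stimuli drive a DUT
# model, a scoreboard compares it against a golden reference, and coverage
# bins record which input ranges were exercised.

def dut(a, b):          # device under test (toy 8-bit adder model)
    return (a + b) & 0xFF

def reference(a, b):    # golden reference model
    return (a + b) % 256

coverage = {"low": 0, "high": 0}   # coverage bins on operand 'a'
mismatches = 0
random.seed(0)                     # reproducible stimulus stream
for _ in range(1000):
    a, b = random.randrange(256), random.randrange(256)   # legal stimuli
    coverage["high" if a >= 128 else "low"] += 1
    if dut(a, b) != reference(a, b):                      # scoreboard check
        mismatches += 1

print(mismatches, all(v > 0 for v in coverage.values()))  # 0 True
```

When a bin stays empty, the stimulus constraints are tightened until the goal is hit, which is the essence of closing coverage in CDV.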
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Y; Yin, F; Ren, L
Purpose: To develop an adaptive prior knowledge based image estimation method to reduce the scan angle needed in the LIVE system to reconstruct 4D-CBCT for intrafraction verification. Methods: The LIVE system has been previously proposed to reconstruct 4D volumetric images on-the-fly during arc treatment for intrafraction target verification and dose calculation. This system uses limited-angle beam’s eye view (BEV) MV cine images acquired from the treatment beam together with the orthogonally acquired limited-angle kV projections to reconstruct 4D-CBCT images for target verification during treatment. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scanning angle needed to reconstruct the CBCT images. This technique uses free-form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on projections acquired over a limited angle (orthogonal 6°) during the treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment, to exploit the continuity of patient motion. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the efficacy of this technique with the LIVE system. A lung patient was simulated with different scenarios, including baseline drift, amplitude change, and phase shift. Limited-angle orthogonal kV and beam’s eye view (BEV) MV projections were generated for each scenario. The CBCT images reconstructed from these projections were compared with the ground truth generated in XCAT. Volume-percentage-difference (VPD) and center-of-mass-shift (COMS) were calculated between the reconstructed and ground-truth tumors to evaluate the reconstruction accuracy. Results: Using orthogonal-view 6° kV and BEV MV projections, the VPD/COMS values were 12.7±4.0%/0.7±0.5 mm, 13.0±5.1%/0.8±0.5 mm, and 11.4±5.4%/0.5±0.3 mm for the three scenarios, respectively.
Conclusion: The technique enables LIVE to accurately reconstruct 4D-CBCT images using only an orthogonal 6° scan angle, which greatly improves the efficiency and reduces the imaging dose of LIVE for intrafraction verification.
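For illustration, the two reported metrics can be computed on binary tumor masks as follows. The VPD definition used here (mismatched voxels over true volume) and the voxel size are assumptions for the sketch, not taken from the paper:

```python
import numpy as np

# Sketch of the two evaluation metrics: volume-percentage-difference (VPD)
# and center-of-mass shift (COMS), computed on binary tumor masks.

def vpd(recon, truth):
    """Volume percentage difference: mismatched voxels / true volume (assumed definition)."""
    return 100.0 * np.logical_xor(recon, truth).sum() / truth.sum()

def coms(recon, truth, voxel_mm=1.0):
    """Center-of-mass shift between the two masks, in millimeters."""
    c1 = np.array(np.nonzero(recon)).mean(axis=1)
    c2 = np.array(np.nonzero(truth)).mean(axis=1)
    return voxel_mm * np.linalg.norm(c1 - c2)

truth = np.zeros((20, 20, 20), dtype=bool)
truth[8:12, 8:12, 8:12] = True          # 4x4x4-voxel "tumor"
recon = np.roll(truth, 1, axis=0)       # reconstruction shifted by one voxel

print(round(vpd(recon, truth), 1), round(coms(recon, truth), 1))  # 50.0 1.0
```

A perfect reconstruction gives VPD = 0% and COMS = 0 mm; the one-voxel shift above mismatches two of the four slices along the shifted axis, hence 50%.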
Submicron Systems Architecture Project
1981-11-01
This project is concerned with the architecture, design, and testing of VLSI systems. The principal activities in this report period include: The Tree Machine; COPE, The Homogeneous Machine; Computational Arrays; Switch-Level Model for MOS Logic Design; Testing; Local Network and Designer Workstations; Self-timed Systems; Characterization of Deadlock Free Resource Contention; Concurrency Algebra; Language Design and Logic for Program Verification.
Damage Detection and Verification System (DDVS) for In-Situ Health Monitoring
NASA Technical Reports Server (NTRS)
Williams, Martha K.; Lewis, Mark; Szafran, J.; Shelton, C.; Ludwig, L.; Gibson, T.; Lane, J.; Trautwein, T.
2015-01-01
Project presentation for the Game Changing Program Smart Book Release. The Damage Detection and Verification System (DDVS) expands the Flat Surface Damage Detection System (FSDDS) sensory panels' damage detection capabilities and adds an autonomous inspection capability utilizing cameras and dynamic computer vision algorithms to verify system health. The objectives of this formulation task are to establish the concept of operations, formulate the system requirements for a potential ISS flight experiment, and develop a preliminary design of an autonomous inspection capability system that will be demonstrated as a proof-of-concept, ground-based damage detection and inspection system.
NASA Technical Reports Server (NTRS)
Lewis, James L.
2011-01-01
The NASA Docking System (NDS) is NASA's implementation for the emerging International Docking System Standard (IDSS) using low impact docking technology. The NASA Docking System Project (NDSP) is the International Space Station (ISS) Program's project to produce the NDS, Common Docking Adapter (CDA) and Docking Hub. The NDS design evolved from the Low Impact Docking System (LIDS). The acronym international Low Impact Docking System (iLIDS) is also used to describe this system as well as the Government Furnished Equipment (GFE) project designing the NDS for the NDSP. NDS and iLIDS may be used interchangeably. This document will use the acronym iLIDS. Some of the heritage documentation and implementations (e.g., software command names, requirement identification (ID), figures, etc.) used on NDS will continue to use the LIDS acronym. This specification defines the technical requirements for the iLIDS GFE delivered to the NDSP by the iLIDS project. This document contains requirements for two iLIDS configurations, SEZ29101800-301 and SEZ29101800-302. Requirements with the statement "iLIDS shall" are for all configurations. Examples of requirements that are unique to a single configuration may be identified as "iLIDS (-301) shall" or "iLIDS (-302) shall". Furthermore, to allow a requirement to encompass all configurations with an exception, the requirement may be designated as "iLIDS (excluding -302) shall". Verification requirements for the iLIDS project are identified in the Verification Matrix (VM) provided in the iLIDS Verification and Validation Document, JSC-63966. The following definitions differentiate between requirements and other statements: Shall: This is the only verb used for the binding requirements. Should/May: These verbs are used for stating non-mandatory goals. Will: This verb is used for stating facts or declaration of purpose. A Definition of Terms table is provided in Appendix B to define those terms with specific tailored uses in this document.
IT Project Success with 7120 and 7123 NPRs to Achieve Project Success
NASA Technical Reports Server (NTRS)
Walley, Tina L.
2009-01-01
This slide presentation reviews management techniques to assure information technology development project success. Details include the work products, the work breakdown structure (WBS), system integration, independent verification and validation (IV&V), and deployment and operations. An example, the NASA Consolidated Active Directory (NCAD), is reviewed.
Projects in an expert system class
NASA Technical Reports Server (NTRS)
Whitson, George M.
1991-01-01
Many universities now teach courses in expert systems. In these courses students study the architecture of an expert system, knowledge acquisition techniques, methods of implementing systems and verification and validation techniques. A major component of any such course is a class project consisting of the design and implementation of an expert system. Discussed here are a number of techniques that we have used at the University of Texas at Tyler to develop meaningful projects that could be completed in a semester course.
Model-based verification and validation of the SMAP uplink processes
NASA Astrophysics Data System (ADS)
Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.
Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned; which are intended to benefit future model-based and simulation-based development efforts.
Design, analysis and test verification of advanced encapsulation systems
NASA Technical Reports Server (NTRS)
Garcia, A.; Minning, C.
1982-01-01
Analytical models were developed to perform optical, thermal, electrical and structural analyses on candidate encapsulation systems. Qualification testing of specimens of various types and a finalized optimum design are projected.
SMAP Verification and Validation Project - Final Report
NASA Technical Reports Server (NTRS)
Murry, Michael
2012-01-01
In 2007, the National Research Council (NRC) released the Decadal Survey of Earth science. For the coming decade, the survey identified 15 new space missions of significant scientific and application value for the National Aeronautics and Space Administration (NASA) to undertake. One of these missions was the Soil Moisture Active Passive (SMAP) mission that NASA assigned to the Jet Propulsion Laboratory (JPL) in 2008. The goal of SMAP is to provide global, high resolution mapping of soil moisture and its freeze/thaw states. The SMAP project recently passed its Critical Design Review and is proceeding with its fabrication and testing phase. Verification and Validation (V&V) is widely recognized as a critical component in system engineering and is vital to the success of any space mission. V&V is a process that is used to check that a system meets its design requirements and specifications in order to fulfill its intended purpose. Verification often refers to the question "Have we built the system right?" whereas Validation asks "Have we built the right system?" Currently the SMAP V&V team is verifying design requirements through inspection, demonstration, analysis, or testing. An example of the SMAP V&V process is the verification of the antenna pointing accuracy with mathematical models, since it is not possible to provide the appropriate micro-gravity environment for testing the antenna on Earth before launch.
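The four verification methods named above (inspection, demonstration, analysis, test) are typically tracked per requirement in a verification matrix. A minimal sketch of such a structure, with invented requirement IDs and wording:

```python
# Illustrative verification-matrix structure: each requirement carries a
# verification method and a closure status. All entries are invented.

METHODS = {"inspection", "demonstration", "analysis", "test"}

verification_matrix = [
    {"id": "REQ-001", "text": "Antenna pointing accuracy within budget",
     "method": "analysis", "closed": True},    # e.g. closed via math models
    {"id": "REQ-002", "text": "Instrument survives launch vibration",
     "method": "test", "closed": False},       # still open
]

def open_items(matrix):
    """Return the IDs of requirements not yet verified."""
    assert all(r["method"] in METHODS for r in matrix), "unknown method"
    return [r["id"] for r in matrix if not r["closed"]]

print(open_items(verification_matrix))  # ['REQ-002']
```

A mission is ready for its verification closure reviews only when this list is empty, which is why such matrices are maintained continuously rather than assembled at the end.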
Optimum-AIV: A planning and scheduling system for spacecraft AIV
NASA Technical Reports Server (NTRS)
Arentoft, M. M.; Fuchs, Jens J.; Parrod, Y.; Gasquet, Andre; Stader, J.; Stokes, I.; Vadon, H.
1991-01-01
A project undertaken for the European Space Agency (ESA) is presented. The project is developing a knowledge based software system for planning and scheduling of activities for spacecraft assembly, integration, and verification (AIV). The system extends into the monitoring of plan execution and the plan repair phase. The objectives are to develop an operational kernel of a planning, scheduling, and plan repair tool, called OPTIMUM-AIV, and to provide facilities which will allow individual projects to customize the kernel to suit its specific needs. The kernel shall consist of a set of software functionalities for assistance in initial specification of the AIV plan, in verification and generation of valid plans and schedules for the AIV activities, and in interactive monitoring and execution problem recovery for the detailed AIV plans. Embedded in OPTIMUM-AIV are external interfaces which allow integration with alternative scheduling systems and project databases. The current status of the OPTIMUM-AIV project, as of Jan. 1991, is that a further analysis of the AIV domain has taken place through interviews with satellite AIV experts, a software requirement document (SRD) for the full operational tool was approved, and an architectural design document (ADD) for the kernel excluding external interfaces is ready for review.
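At its core, generating a valid AIV plan requires ordering activities so that every dependency precedes its dependents. A minimal topological-sort sketch of that kernel idea, with invented activity names; OPTIMUM-AIV's actual planner additionally handles resources, monitoring, and plan repair:

```python
from collections import deque

# Order AIV activities so every dependency precedes its dependents
# (Kahn's algorithm for topological sorting).

def schedule(activities, depends_on):
    indegree = {a: 0 for a in activities}
    dependents = {a: [] for a in activities}
    for act, deps in depends_on.items():
        for d in deps:
            indegree[act] += 1
            dependents[d].append(act)
    ready = deque(a for a in activities if indegree[a] == 0)
    order = []
    while ready:
        a = ready.popleft()
        order.append(a)
        for nxt in dependents[a]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(activities):
        raise ValueError("cyclic dependency in AIV plan")
    return order

acts = ["integrate_payload", "assemble_bus", "thermal_vac_test", "ship"]
deps = {"integrate_payload": ["assemble_bus"],
        "thermal_vac_test": ["integrate_payload"],
        "ship": ["thermal_vac_test"]}
print(schedule(acts, deps))
```

The cycle check matters in practice: a circular dependency in an AIV plan is exactly the kind of inconsistency such a tool must flag before execution begins.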
NASA Technical Reports Server (NTRS)
1995-01-01
The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.
Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities
NASA Technical Reports Server (NTRS)
Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.
2009-01-01
Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current generation aviation software which has been implicated in anomalous in-flight behavior. This paper describes the research focused on enabling capabilities for verification and validation underway within NASA's Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.
Non-Proliferation, the IAEA Safeguards System, and the importance of nuclear material measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, Rebecca S.
2017-09-18
The objective of this project is to explain the contribution of nuclear material measurements to the system of international verification of State declarations and the non-proliferation of nuclear weapons.
Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier
2017-03-14
Results-based financing (RBF) has been introduced in many countries across Africa and a growing literature is building around the assessment of their impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was designed originally. We explore the consequences of this on the operation of the scheme and on its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward.
Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the actual scheme's implementation. The implications are relevant at the methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at the operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to iteratively modify it during implementation. They also question whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.
System engineering of the Atacama Large Millimeter/submillimeter Array
NASA Astrophysics Data System (ADS)
Bhatia, Ravinder; Marti, Javier; Sugimoto, Masahiro; Sramek, Richard; Miccolis, Maurizio; Morita, Koh-Ichiro; Arancibia, Demián.; Araya, Andrea; Asayama, Shin'ichiro; Barkats, Denis; Brito, Rodrigo; Brundage, William; Grammer, Wes; Haupt, Christoph; Kurlandczyk, Herve; Mizuno, Norikazu; Napier, Peter; Pizarro, Eduardo; Saini, Kamaljeet; Stahlman, Gretchen; Verzichelli, Gianluca; Whyborn, Nick; Yagoubov, Pavel
2012-09-01
The Atacama Large Millimeter/submillimeter Array (ALMA) will be composed of 66 high precision antennae located at 5000 meters altitude in northern Chile. This paper will present the methodology, tools and processes adopted to system engineer a project of high technical complexity, by system engineering teams that are remotely located and from different cultures, and in accordance with a demanding schedule and within tight financial constraints. The technical and organizational complexity of ALMA requires a disciplined approach to the definition, implementation and verification of the ALMA requirements. During the development phase, System Engineering chairs all technical reviews and facilitates the resolution of technical conflicts. We have developed analysis tools to analyze the system performance, incorporating key parameters that contribute to the ultimate performance, and are modeled using best estimates and/or measured values obtained during test campaigns. Strict tracking and control of the technical budgets ensures that the different parts of the system can operate together as a whole within ALMA boundary conditions. System Engineering is responsible for acceptances of the thousands of hardware items delivered to Chile, and also supports the software acceptance process. In addition, System Engineering leads the troubleshooting efforts during testing phases of the construction project. Finally, the team is conducting System level verification and diagnostics activities to assess the overall performance of the observatory. This paper will also share lessons learned from these system engineering and verification approaches.
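Technical budget tracking of the kind described above is commonly done by combining independent error contributors via root-sum-square (RSS) and checking the result against the system-level allocation. A sketch with invented numbers, not ALMA's actual budget:

```python
import math

# Combine independent error contributors by root-sum-square (RSS) and
# compare against an assumed system-level allocation. All values invented.

def rss(contributions):
    """Root-sum-square of independent error terms."""
    return math.sqrt(sum(c * c for c in contributions.values()))

pointing_budget_arcsec = 2.0          # assumed system-level allocation
contributors = {                       # assumed best-estimate terms
    "antenna_structure": 1.0,
    "metrology": 0.8,
    "thermal_drift": 0.6,
    "servo": 0.5,
}

total = rss(contributors)
print(round(total, 2), total <= pointing_budget_arcsec)  # 1.5 True
```

Replacing a best estimate with a measured value from a test campaign, as the paper describes, is a one-line update here, and the margin (budget minus RSS total) is what system engineering tracks over time.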
An analysis of random projection for changeable and privacy-preserving biometric verification.
Wang, Yongjin; Plataniotis, Konstantinos N
2010-10-01
Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
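The core transform analyzed above can be sketched directly: project feature vectors through a Gaussian random matrix and check that pairwise distances are approximately preserved. Dimensions, seed, and tolerance are illustrative choices, not the paper's parameters:

```python
import numpy as np

# Random projection (RP) of biometric feature vectors: multiply by a
# Gaussian random matrix. Distances are approximately preserved, and
# replacing the matrix yields a new ("changeable") template.

rng = np.random.default_rng(42)
d, k = 1024, 128                      # original and projected dimensions
R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, d))  # RP matrix

x = rng.normal(size=d)                # enrolled biometric feature vector
y = x + 0.05 * rng.normal(size=d)     # genuine probe (small perturbation)

tx, ty = R @ x, R @ y                 # protected templates
orig = np.linalg.norm(x - y)
proj = np.linalg.norm(tx - ty)
print(abs(proj - orig) / orig < 0.5)  # distances roughly preserved
```

Privacy follows from the many-to-one nature of the projection (k < d), and changeability from re-drawing R: the same x enrolled under a fresh matrix produces an unrelated template.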
Project W-314 specific test and evaluation plan for AZ tank farm upgrades
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hays, W.H.
1998-08-12
The purpose of this Specific Test and Evaluation Plan (STEP) is to provide a detailed written plan for the systematic testing of modifications made by the addition of the SN-631 transfer line from the AZ-01A pit to the AZ-02A pit by the W-314 Project. The STEP develops the outline for test procedures that verify the system's performance against the established Project design criteria. The STEP is a lower tier document based on the W-314 Test and Evaluation Plan (TEP). Testing includes Validations and Verifications (e.g., Commercial Grade Item Dedication activities, etc.), Factory Tests and Inspections (FTIs), installation tests and inspections, Construction Tests and Inspections (CTIs), Acceptance Test Procedures (ATPs), Pre-Operational Test Procedures (POTPs), and Operational Test Procedures (OTPs). The STEP will be utilized in conjunction with the TEP for verification and validation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott William
The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
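The convergence analysis at the heart of code verification is often summarized by the observed order of accuracy, estimated from errors on successively refined grids and compared with the scheme's theoretical order. A sketch with synthetic errors for a notional second-order method:

```python
import math

# Estimate the observed order of accuracy from errors on successively
# refined grids (Richardson-style): p = log(e_coarse/e_fine) / log(r).

def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
    """Observed order of convergence from two grid levels."""
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

# Synthetic errors ~ C*h^2 with a small higher-order contamination,
# standing in for errors measured against an exact or manufactured solution.
errors = {0.1: 1.0e-2 + 1e-5, 0.05: 2.5e-3 + 1.2e-6, 0.025: 6.25e-4 + 1.6e-7}
hs = sorted(errors, reverse=True)
for h_c, h_f in zip(hs, hs[1:]):
    p = observed_order(errors[h_c], errors[h_f], h_c / h_f)
    print(round(p, 2))   # approaches 2 for a correct second-order implementation
```

An observed order well below the theoretical one on a problem with a known (or manufactured) exact solution is the classic signature of an implementation defect, which is precisely what code verification is designed to expose.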
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lavietes, A.; Kalkhoran, N.
The overall goal of this project was to demonstrate a compact gamma-ray spectroscopic system with better energy resolution and lower costs than scintillator-based detector systems for uranium enrichment analysis applications.
NASA Technical Reports Server (NTRS)
Hill, Gerald M.; Evans, Richard K.
2009-01-01
A large-scale, distributed, high-speed data acquisition system (HSDAS) is currently being installed at the Space Power Facility (SPF) at NASA Glenn Research Center's Plum Brook Station in Sandusky, OH. This installation is being done as part of a facility construction project to add Vibro-acoustic Test Capabilities (VTC) to the current thermal-vacuum testing capability of SPF in support of the Orion Project's requirement for Space Environments Testing (SET). The HSDAS architecture is a modular design, which utilizes fully-remotely managed components, enables the system to support multiple test locations with a wide range of measurement types and a very large system channel count. The architecture of the system is presented along with details on system scalability and measurement verification. In addition, the ability of the system to automate many of its processes such as measurement verification and measurement system analysis is also discussed.
Securely Partitioning Spacecraft Computing Resources: Validation of a Separation Kernel
NASA Astrophysics Data System (ADS)
Bremer, Leon; Schreutelkamp, Erwin
2011-08-01
The F-35 Lightning II, also known as the Joint Strike Fighter, will be the first operational fighter aircraft equipped with an operational MultiShip Embedded Training capability. This onboard training system allows teams of fighter pilots to jointly operate their F-35 in flight against virtual threats, avoiding the need for real adversary air threats and surface threat systems in their training. The European Real-time Operations Simulator (EuroSim) framework is well known in the space domain, particularly in support of engineering and test phases of space system development. In the MultiShip Embedded Training project, EuroSim is not only the essential tool for development and verification throughout the project but is also the engine of the final embedded simulator on board the F-35 aircraft. The novel ways in which EuroSim is applied in the project in relation to distributed simulation problems, team collaboration, tool chains and embedded systems can benefit many projects and applications. The paper describes the application of EuroSim as the simulation engine of the F-35 Embedded Training solution, the extensions to the EuroSim product that enable this application, and its usage in development and verification of the whole project as carried out at the sites of Dutch Space and the National Aerospace Laboratory (NLR).
Development of automated optical verification technologies for control systems
NASA Astrophysics Data System (ADS)
Volegov, Peter L.; Podgornov, Vladimir A.
1999-08-01
The report considers optical techniques for automated verification of an object's identity, designed for control systems for nuclear objects. Results are presented from experimental research and from the development of pattern recognition techniques carried out under ISTC project number 772, with the purpose of identifying unique features of the surface structure of a controlled object and the effects of its random treatment. Possibilities for industrial introduction of the developed technologies within the framework of US and Russian laboratories' lab-to-lab cooperation, including development of up-to-date systems for nuclear material control and accounting, are examined.
Proving autonomous vehicle and advanced driver assistance systems safety : final research report.
DOT National Transportation Integrated Search
2016-02-15
The main objective of this project was to provide technology for answering crucial safety and correctness questions about verification of autonomous vehicle and advanced driver assistance systems based on logic. In synergistic activities, we ha...
Alternative Nonvolatile Residue Analysis with Contaminant Identification Project
NASA Technical Reports Server (NTRS)
Loftin, Kathleen (Compiler); Summerfield, Burton (Compiler); Thompson, Karen (Compiler); Mullenix, Pamela (Compiler); Zeitlin, Nancy (Compiler)
2015-01-01
Cleanliness verification is required in numerous industries, including spaceflight ground support, electronics, medical, and aerospace. Currently at KSC, requirements for cleanliness verification use solvents that are environmentally unfriendly. The goal of this project is to produce an alternative cleanliness verification technique that is both environmentally friendly and more cost effective.
Oversight of this investigation will be provided by the U.S. Environmental Protection Agency through the Environmental Technology Verification (ETV) Program. This project will be performed by Battelle, which manages the ETV Advanced Monitoring Systems (AMS) Center through a coop...
CPAS Parachute Testing, Model Development, & Verification
NASA Technical Reports Server (NTRS)
Romero, Leah M.
2013-01-01
Capsule Parachute Assembly System (CPAS) is the human-rated parachute system for the Orion vehicle used during re-entry. The design is similar to the Apollo parachute design, but human rating requires additional system redundancy. CPAS is a Government Furnished Equipment (GFE) project responsible for design, development testing, performance modeling, fabrication, qualification, and delivery.
The PLAID graphics analysis impact on the space program
NASA Technical Reports Server (NTRS)
Nguyen, Jennifer P.; Wheaton, Aneice L.; Maida, James C.
1994-01-01
An ongoing project design often requires visual verification at various stages. These requirements are critically important because the subsequent phases of that project might depend on the complete verification of a particular stage. Currently, there are several software packages at JSC that provide such simulation capabilities. We present the simulation capabilities of the PLAID modeling system used in the Flight Crew Support Division for human factors analyses. We summarize some ongoing studies in kinematics, lighting, and EVA activities, and discuss various applications in the mission planning of the current Space Shuttle flights and the assembly sequence of the Space Station Freedom, with emphasis on the redesign effort.
The FoReVer Methodology: A MBSE Framework for Formal Verification
NASA Astrophysics Data System (ADS)
Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald
2013-08-01
The need for high levels of confidence and operational integrity in critical space (software) systems is well recognized in the Space industry and has been addressed so far through rigorous System and Software Development Processes and stringent Verification and Validation regimes. The Model Based Space System Engineering process (MBSSE) derived in the System and Software Functional Requirement Techniques study (SSFRT) focused on the application of model based engineering technologies to support the space system and software development processes, from mission level requirements to software implementation through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, where we aim at developing methodological, theoretical and technological support for a systematic approach to space avionics system development, in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.
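Contract-based refinement of the kind FoReVer supports can be caricatured with sets standing in for temporal-logic properties: a concrete contract refines an abstract one if its assumption is no stronger and its guarantee no weaker. A toy sketch, not the actual FoReVer formalism:

```python
# Toy assume/guarantee contracts: each contract is a pair of sets of
# admissible behaviors. Behavior labels are invented for illustration.

system_contract = (frozenset({"nominal", "degraded"}),        # assumption
                   frozenset({"safe"}))                       # guarantee
component_contract = (frozenset({"nominal", "degraded", "off"}),
                      frozenset({"safe"}))

def refines(concrete, abstract):
    """concrete refines abstract: weaker-or-equal assumption, stronger-or-equal guarantee."""
    a_c, g_c = concrete
    a_a, g_a = abstract
    return a_a <= a_c and g_c <= g_a

print(refines(component_contract, system_contract))  # True
```

In the real methodology, the set-inclusion checks become entailment checks between temporal-logic formulas discharged by a model checker, applied step-wise as the design is refined from system to software.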
Project W-314 specific test and evaluation plan for transfer line SN-633 (241-AX-B to 241-AY-02A)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hays, W.H.
1998-03-20
The purpose of this Specific Test and Evaluation Plan (STEP) is to provide a detailed written plan for the systematic testing of modifications made by the addition of the SN-633 transfer line by the W-314 Project. The STEP develops the outline for test procedures that verify the system's performance to the established Project design criteria. The STEP is a lower tier document based on the W-314 Test and Evaluation Plan (TEP). This STEP encompasses all testing activities required to demonstrate compliance to the project design criteria as it relates to the addition of transfer line SN-633. The Project Design Specifications (PDS) identify the specific testing activities required for the Project. Testing includes Validations and Verifications (e.g., Commercial Grade Item Dedication activities), Factory Acceptance Tests (FATs), installation tests and inspections, Construction Acceptance Tests (CATs), Acceptance Test Procedures (ATPs), Pre-Operational Test Procedures (POTPs), and Operational Test Procedures (OTPs). It should be noted that POTPs are not required for testing of the transfer line addition. The STEP will be utilized in conjunction with the TEP for verification and validation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendt, Fabian F; Yu, Yi-Hsiang; Nielsen, Kim
This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5 conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.
Acceptance test procedure for the L-070 project mechanical equipment and instrumentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loll, C.M.
1996-04-19
This document contains the acceptance test procedure for the mechanical equipment and instrumentation installed per the L-070 Project. The specific systems to be tested are the pump controls for the 3906 Lift Station and the 350-A Lift Station. In addition, verification that signals are being received by the 300 Area Treated Effluent Disposal Facility control system is also performed.
NASA Technical Reports Server (NTRS)
Reichert, J. D.
1980-01-01
The Analog Design Verification System (ADVS), the largest single solar collector built, was tested. Referred to as the Solar Gridiron or Bowl Concept, it employs a stationary mirror, with tracking accomplished by moving the receiver rather than the mirror.
Cognitive Bias in the Verification and Validation of Space Flight Systems
NASA Technical Reports Server (NTRS)
Larson, Steve
2012-01-01
Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. 
A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of future systems.
Verification Games: Crowd-Sourced Formal Verification
2016-03-01
Verification Games: Crowd-Sourced Formal Verification. University of Washington, March 2016, final technical report. Dates covered: June 2012 to September 2015. Contract number: FA8750... Distribution per clarification memorandum dated 16 Jan 09. Abstract: Over the more than three years of the project Verification Games: Crowd-sourced
Validation of the F-18 high alpha research vehicle flight control and avionics systems modifications
NASA Technical Reports Server (NTRS)
Chacon, Vince; Pahle, Joseph W.; Regenie, Victoria A.
1990-01-01
The verification and validation process is a critical portion of the development of a flight system. Verification, the steps taken to assure the system meets the design specification, has become a reasonably understood and straightforward process. Validation is the method used to ensure that the system design meets the needs of the project. As systems become more integrated and more critical in their functions, the validation process becomes more complex and important. The tests, tools, and techniques which are being used for the validation of the high alpha research vehicle (HARV) turning vane control system (TVCS) are discussed, and the problems and their solutions are documented. The emphasis of this paper is on the validation of integrated systems.
NASA Astrophysics Data System (ADS)
Heyer, H.-V.; Föckersperger, S.; Lattner, K.; Moldenhauer, W.; Schmolke, J.; Turk, M.; Willemsen, P.; Schlicker, M.; Westerdorff, K.
2008-08-01
The technology verification satellite TET (Technologie ErprobungsTräger) is the core element of the German On-Orbit Verification (OOV) program for new technologies and techniques. The goal of this program is to support German space industry and research facilities in the on-orbit verification of satellite technologies. The TET satellite is a small satellite developed and built in Germany under the leadership of Kayser-Threde. The satellite bus is based on the successfully operated satellite BIRD and a newly developed payload platform with the new payload handling system called NVS (Nutzlastversorgungssystem). The NVS comprises three major parts: the power supply, the processor boards, and the I/O interfaces. The NVS is realized as several Europe-format PCBs connected to each other via an integrated backplane. The payloads are connected to the NVS by front connectors. This paper describes the concept, architecture, and hard- and software of the NVS. Phase B of this project was successfully finished last year.
This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1987-01-01
The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.
NASA Astrophysics Data System (ADS)
Kuseler, Torben; Lami, Ihsan; Jassim, Sabah; Sellahewa, Harin
2010-04-01
The use of mobile communication devices with advanced sensors is growing rapidly. These sensors enable functions such as image capture, location-based applications, and biometric authentication such as fingerprint verification and face and handwritten-signature recognition. Such ubiquitous devices are essential tools in today's global economic activities, enabling anywhere-anytime financial and business transactions. Cryptographic functions and biometric-based authentication can enhance the security and confidentiality of mobile transactions. Biometric template security techniques in real-time biometric-based authentication are key factors for successful identity verification solutions, but they are vulnerable to determined attacks by both fraudulent software and hardware. The EU-funded SecurePhone project has designed and implemented a multimodal biometric user authentication system on a prototype mobile communication device. However, various implementations of this project have resulted in long verification times or reduced accuracy and/or security. This paper proposes the use of built-in-self-test techniques to ensure no tampering has taken place in the verification process prior to performing the actual biometric authentication. These techniques utilise the user's personal identification number as a seed to generate a unique signature, which is then used to test the integrity of the verification process. This study also proposes the use of a combination of biometric modalities to provide application-specific authentication in a secure environment, achieving an optimum security level with effective processing time, i.e., ensuring that the necessary authentication steps and algorithms running on the mobile device's application processor cannot be undermined or modified by an impostor to gain unauthorized access to the secure system.
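The built-in-self-test idea described above, a PIN-seeded signature checked before any biometric matching runs, can be sketched minimally. The key derivation and HMAC construction below are illustrative assumptions, not the SecurePhone project's actual scheme:

```python
import hashlib
import hmac

def make_signature(pin: str, verifier_image: bytes) -> bytes:
    """Derive a key from the user's PIN and sign the verification
    routine's binary image (construction chosen for illustration)."""
    key = hashlib.sha256(pin.encode("utf-8")).digest()
    return hmac.new(key, verifier_image, hashlib.sha256).digest()

def integrity_ok(pin: str, verifier_image: bytes, expected: bytes) -> bool:
    # Built-in self-test: recompute the signature and compare in
    # constant time before allowing biometric authentication to run.
    return hmac.compare_digest(make_signature(pin, verifier_image), expected)

code = b"verify_fingerprint_v1"           # stand-in for the verifier binary
reference = make_signature("1234", code)  # stored at enrolment time

print(integrity_ok("1234", code, reference))               # True (untampered)
print(integrity_ok("1234", code + b"patched", reference))  # False (tampered)
```

Any modification of the verifier image (or use of the wrong PIN) changes the recomputed signature, so the self-test fails before the biometric step is reached.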
Multibody modeling and verification
NASA Technical Reports Server (NTRS)
Wiens, Gloria J.
1989-01-01
A summary of a ten week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted for gathering information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research on modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.
Specific test and evaluation plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hays, W.H.
1998-03-20
The purpose of this Specific Test and Evaluation Plan (STEP) is to provide a detailed written plan for the systematic testing of modifications made to the 241-AX-B Valve Pit by the W-314 Project. The STEP develops the outline for test procedures that verify the system's performance to the established Project design criteria. The STEP is a lower tier document based on the W-314 Test and Evaluation Plan (TEP). Testing includes Validations and Verifications (e.g., Commercial Grade Item Dedication activities), Factory Acceptance Tests (FATs), installation tests and inspections, Construction Acceptance Tests (CATs), Acceptance Test Procedures (ATPs), Pre-Operational Test Procedures (POTPs), and Operational Test Procedures (OTPs). It should be noted that POTPs are not required for testing of the transfer line addition. The STEP will be utilized in conjunction with the TEP for verification and validation.
Verification and Validation of Adaptive and Intelligent Systems with Flight Test Results
NASA Technical Reports Server (NTRS)
Burken, John J.; Larson, Richard R.
2009-01-01
F-15 IFCS project goals are: (a) demonstrate control approaches that can efficiently optimize aircraft performance in both normal and failure conditions ([A] and [B] failures); (b) advance neural-network-based flight control technology for new aerospace system designs with a pilot in the loop. Gen II objectives include: (a) implement and fly a direct adaptive neural-network-based flight controller; (b) demonstrate the ability of the system to adapt to simulated system failures, i.e., (1) suppress transients associated with failure and (2) re-establish sufficient control and handling of the vehicle for safe recovery; (c) provide flight experience for development of verification and validation processes for flight-critical neural network software.
Enrichment Assay Methods Development for the Integrated Cylinder Verification System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.
2009-10-22
International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.
NASA Astrophysics Data System (ADS)
Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich
2016-04-01
MiKlip is a project for medium-term climate prediction funded by the German Federal Ministry of Education and Research (BMBF); it aims to create a model system able to provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) during the MiKlip project. Throughout the first project phase, various sub-projects developed over 25 analysis tools, so-called plugins, for the CES. The main focus of these plugins is the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope and target a wide range of scientific questions. They range from preprocessing tools like the "LeadtimeSelector", which creates lead-time-dependent time series from decadal hindcast sets, through tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET) developed at NCAR. We will show the theoretical background, technical implementation strategies, and some results of the evaluation of the MiKlip Prototype decadal prediction system for a selected set of these tools.
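Deterministic hindcast-verification plugins of this kind are typically built around mean-squared-error skill scores. A minimal sketch of such a score against a climatological reference forecast, using hypothetical data and not the MurCSS code itself, looks like:

```python
def msss(hindcast, obs, reference=None):
    """Mean-squared-error skill score: 1 - MSE(hindcast)/MSE(reference).
    The reference forecast defaults to the observed climatology, a common
    baseline in decadal-prediction evaluation (illustrative sketch)."""
    n = len(obs)
    if reference is None:
        clim = sum(obs) / n
        reference = [clim] * n
    def mse(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) / n
    return 1.0 - mse(hindcast, obs) / mse(reference, obs)

obs = [0.1, 0.3, 0.2, 0.5, 0.4]
print(msss(obs, obs))        # perfect hindcast -> 1.0
print(msss([0.3] * 5, obs))  # the climatology itself scores ~0.0
```

A score of 1 means a perfect hindcast, 0 means no improvement over climatology, and negative values mean the hindcast is worse than the reference.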
NASA Astrophysics Data System (ADS)
Torres-Xirau, I.; Olaciregui-Ruiz, I.; Rozendaal, R. A.; González, P.; Mijnheer, B. J.; Sonke, J.-J.; van der Heide, U. A.; Mans, A.
2017-08-01
In external beam radiotherapy, electronic portal imaging devices (EPIDs) are frequently used for pre-treatment and in vivo dose verification. Currently, various MR-guided radiotherapy systems are being developed and clinically implemented, for which independent dosimetric verification is highly desirable. For this purpose we adapted our EPID-based dose verification system for use with the MR-Linac combination developed by Elekta in cooperation with UMC Utrecht and Philips. In this study we extended our back-projection method to cope with the presence of an extra attenuating medium between the patient and the EPID. Experiments were performed at a conventional linac, using an aluminum mock-up of the MRI scanner housing between the phantom and the EPID. For a 10 cm square field, the attenuation by the mock-up was 72%, while 16% of the remaining EPID signal resulted from scattered radiation. 58 IMRT fields were delivered to a 20 cm slab phantom with and without the mock-up. EPID-reconstructed dose distributions were compared to planned dose distributions using the γ-evaluation method (global, 3%, 3 mm). With our adapted back-projection algorithm the average γ_mean was 0.27 ± 0.06, while with the conventional algorithm it was 0.28 ± 0.06. Dose profiles of several square fields reconstructed with the adapted algorithm showed excellent agreement with TPS calculations.
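The γ-evaluation combines a dose-difference criterion with a distance-to-agreement criterion. A toy 1-D version of the global 3%/3 mm test, simplified from the 2-D comparison used in EPID dosimetry and written only to show the structure of the metric, can be sketched as:

```python
def gamma_1d(ref, ev, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    """Global gamma index per reference point: the minimum over evaluated
    points of sqrt(dose_term^2 + distance_term^2); gamma <= 1 passes.
    1-D simplification for illustration only."""
    d_norm = dose_tol * max(ref)  # global dose criterion: 3% of max dose
    out = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(ev):
            dose_term = (de - dr) / d_norm
            dist_term = (j - i) * spacing_mm / dta_mm
            best = min(best, (dose_term ** 2 + dist_term ** 2) ** 0.5)
        out.append(best)
    return out

ref = [0.0, 0.2, 1.0, 0.2, 0.0]
print(gamma_1d(ref, ref, spacing_mm=1.0))  # identical profiles -> all zeros
scaled = [x * 1.02 for x in ref]           # 2% global scaling: within 3%
print(all(g <= 1 for g in gamma_1d(ref, scaled, 1.0)))
```

Summary statistics such as the γ_mean quoted above are then just averages of these per-point gamma values over the field.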
Output Control Technologies for a Large-scale PV System Considering Impacts on a Power Grid
NASA Astrophysics Data System (ADS)
Kuwayama, Akira
The mega-solar demonstration project named “Verification of Grid Stabilization with Large-scale PV Power Generation Systems” was completed in March 2011 at Wakkanai, the northernmost city of Japan. The major objectives of this project were to evaluate adverse impacts of large-scale PV power generation systems connected to the power grid and to develop output control technologies with an integrated battery storage system. This paper describes the outline and results of the project. The results show the effectiveness of the battery storage system and of the proposed output control methods for a large-scale PV system in ensuring stable operation of power grids. NEDO, the New Energy and Industrial Technology Development Organization of Japan, conducted this project, and HEPCO, Hokkaido Electric Power Co., Inc., managed the overall project.
Cooling Tower (Evaporative Cooling System) Measurement and Verification Protocol
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W.; Boyd, Brian; Stoughton, Kate M.
This measurement and verification (M and V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings resulting from water conservation measures (WCMs) in energy performance contracts associated with cooling tower efficiency projects. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M and V plan, and details the procedures to use to determine water savings.
2012-04-01
[Report documentation page fragment] Acronyms: SET, Sensors and Electronics Technology; SISO, Simulation Interoperability Standards Organization; SIW, Simulation Interoperability Workshop. Chronology: held in conjunction with the 2006 Fall SIW; September 2006, SISO Standards Activity Committee approved beginning IEEE balloting; October 2006, IEEE Project ...; ...-019 published; June 2008, Edinburgh, UK, held in conjunction with the 2008 Euro-SIW; September 2008, Laurel, MD, US, work on Composite Model; December 2008 ...
Sohl, Terry L.; Dwyer, John L.
1998-01-01
The North American Landscape Characterization (NALC) project is a component of the National Aeronautics and Space Administration (NASA) Landsat Pathfinder program. Pathfinder projects are focused on the investigation of global change utilizing current remote sensing technologies. The NALC project is a cooperative effort between the U.S. Environmental Protection Agency (EPA), the U.S. Geological Survey (USGS), and NASA to make Landsat data available to the widest possible user community for scientific research and general public interest. The NALC project is principally funded by the EPA Office of Research and Development and the USGS's Earth Resources Observation Systems (EROS) Data Center (EDC). The objectives of the NALC project are to produce standardized remote sensing data sets, develop standardized analysis methods, and derive standardized land cover change products for a large portion of the North American continent (the conterminous United States and Mexico) (Lunetta and Sturdevant, 1993). The standard product is the NALC “triplicate,” consisting of co‐registered Landsat multispectral scanner data for the years 1973, 1986, and 1991 (plus or minus one year), plus co‐registered 3 arc-second digital terrain elevation data. Processing began with the 1986 scene, which was precision corrected (with full terrain correction) to a 60 meter Universal Transverse Mercator base. Automated cross‐correlation procedures were used to co‐register the 1970s and 1990s data to the 1980s base, and independent verifications of registration quality were performed on all triplicate components. The pertinent metadata were compiled in a relational database, which includes WRS2 path/rows, scene IDs, image dates, solar azimuth and elevation, verification RMSEs, and the number of verification control points.
NALC triplicate data sets are being used for a number of applications, including the analysis of urbanization patterns, dynamics of climatic fluctuations, deforestation studies, and vegetation classification and mapping. These data are being distributed through the Earth Observing System Data and Information System (EOSDIS) Information Management System (IMS) at a cost of $15(U.S.) for each triplicate.
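The automated cross-correlation co-registration step mentioned above can be illustrated with a toy 1-D version: slide one signal over the other and keep the integer offset with the highest correlation score. The real procedure operates on 2-D image chips and refines to sub-pixel accuracy; this sketch only shows the principle.

```python
def best_shift(base, scene, max_shift=5):
    """Return the integer shift s that best aligns scene to base,
    i.e. scene[k] ~ base[k - s], by maximizing the raw
    cross-correlation score (toy 1-D registration sketch)."""
    def score(s):
        return sum(base[i] * scene[i + s]
                   for i in range(len(base)) if 0 <= i + s < len(scene))
    return max(range(-max_shift, max_shift + 1), key=score)

base = [0, 1, 4, 9, 4, 1, 0, 0, 0, 0]
scene = [0, 0, 0, 1, 4, 9, 4, 1, 0, 0]  # base shifted right by 2 samples
print(best_shift(base, scene))           # -> 2
```

The residual misalignment left after applying the recovered shifts is what the project's verification RMSE over control points measures.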
Quantitative reactive modeling and verification.
Henzinger, Thomas A
Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.
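The contrast between a boolean verdict and a quantitative fitness measure can be made concrete on a request/grant trace: the boolean specification only says whether every request is eventually granted, while a quantitative measure also scores how quickly. This is a toy example of the distinction, not the QUAREM formalism:

```python
def evaluate(trace):
    """trace: sequence of events, 'r' = request, 'g' = grant, '.' = idle.
    Returns (boolean verdict: every request was granted,
             quantitative fitness: mean request-to-grant delay)."""
    delays, pending = [], None
    for t, ev in enumerate(trace):
        if ev == 'r' and pending is None:
            pending = t
        elif ev == 'g' and pending is not None:
            delays.append(t - pending)
            pending = None
    ok = pending is None
    mean_delay = sum(delays) / len(delays) if delays else 0.0
    return ok, mean_delay

print(evaluate("rg"))   # (True, 1.0): correct, fast
print(evaluate("r.g"))  # (True, 2.0): equally "correct", lower fitness
print(evaluate("r.."))  # (False, ...): boolean spec violated
```

Both of the first two traces are indistinguishable under the boolean notion of correctness; the quantitative measure separates them.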
ENVIRONMENTAL TECHNOLOGY VERIFICATION EVALUATIONS OF ARSENIC REDUCTION SYSTEMS FOR DRINKING WATER
Arsenic in drinking water is a known carcinogen with additional adverse human health impacts. For the 4,100 affected systems, 37 to 55 cancer cases per year are estimated, with about half projected to result in deaths if treatment solutions are not implemented. The Environmenta...
NASA Technical Reports Server (NTRS)
Bhat, Biliyar N.
2008-01-01
Ares I Crew Launch Vehicle Upper Stage is designed and developed based on sound systems engineering principles. Systems engineering starts with the Concept of Operations and mission requirements, which in turn determine the launch system architecture and its performance requirements. The Ares I Upper Stage is designed and developed to meet these requirements. Designers depend on support from materials, processes, and manufacturing during the design, development, and verification of subsystems and components. The requirements relative to reliability, safety, operability, and availability are also dependent on materials availability, characterization, process maturation, and vendor support. This paper discusses the roles and responsibilities of materials and manufacturing engineering during the various phases of Ares I Upper Stage development, including design and analysis, hardware development, and test and verification. Emphasis is placed on how materials, processes, and manufacturing support is integrated across the Upper Stage Project, both horizontally and vertically. In addition, the paper describes the approach used to ensure compliance with materials, processes, and manufacturing requirements during the project cycle, with a focus on hardware systems design and development.
A neural-visualization IDS for honeynet data.
Herrero, Álvaro; Zurutuza, Urko; Corchado, Emilio
2012-04-01
Neural intelligent systems can provide a visualization of the network traffic for security staff, in order to reduce the widely known high false-positive rate associated with misuse-based Intrusion Detection Systems (IDSs). Unlike previous work, this study proposes unsupervised neural models that generate an intuitive visualization of the captured traffic, rather than network statistics. These snapshots of network events are immensely useful for security personnel who monitor network behavior. The system is based on the use of different neural projection and unsupervised methods for the visual inspection of honeypot data, and may be seen as a complementary network security tool that sheds light on internal data structures through visual inspection of the traffic itself. Furthermore, it is intended to facilitate verification and assessment of Snort performance (a well-known and widely-used misuse-based IDS), through the visualization of attack patterns. Empirical verification and comparison of the proposed projection methods are performed in a real domain, where two different case studies are defined and analyzed.
NASA Astrophysics Data System (ADS)
Rausch, Peter; Verpoort, Sven; Wittrock, Ulrich
2017-11-01
Concepts for future large space telescopes require an active optics system to mitigate aberrations caused by thermal deformation and gravitational release. Such a system would allow on-site correction of wave-front errors and ease the requirements for thermal and gravitational stability of the optical train. In the course of the ESA project "Development of Adaptive Deformable Mirrors for Space Instruments" we have developed a unimorph deformable mirror designed to correct for low-order aberrations and dedicated to be used in space environment. We briefly report on design and manufacturing of the deformable mirror and present results from performance verifications and environmental testing.
1976-01-01
Environmental Systems Laboratory, P.O. Box 631, Vicksburg, Mississippi 39100. 10. Program Element, Project, Task Area and Work Unit Numbers: Project ... The phases of the study were under the general supervision of Messrs. W. G. Shockley, Chief, Mobility and Environmental Systems Laboratory (MESL), and W. E. Grabau, former Chief, Environmental Systems Division (ESD) and now Special Assistant, MESL, and under the direct supervision of Mr. J. K
Operation of the Computer Software Management and Information Center (COSMIC)
NASA Technical Reports Server (NTRS)
1983-01-01
The major operational areas of the COSMIC center are described. Quantitative data on the software submittals, program verification, and evaluation are presented. The dissemination activities are summarized. Customer services and marketing activities of the center for the calendar year are described. Those activities devoted to the maintenance and support of selected programs are described. A Customer Information system, the COSMIC Abstract Recording System Project, and the COSMIC Microfiche Project are summarized. Operational cost data are summarized.
Proceedings of the Second NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Munoz, Cesar (Editor)
2010-01-01
This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; 
and Data-flow based Model Analysis.
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Fischer, Bernd
2009-01-01
Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.
ERIC Educational Resources Information Center
Applied Management Sciences, Inc., Silver Spring, MD.
Presented in this report are selected findings of the Income Verification Pilot Project (IVPP), an investigation examining misreporting of applicant income and family size on applications for government-sponsored school meal benefits. As reported here, Phase II of the project provided for a comprehensive assessment of specific quality assurance…
NASA Technical Reports Server (NTRS)
Fey, M. G.
1981-01-01
The experimental verification system for the production of silicon via the arc heater-sodium reduction of SiCl4 was designed, fabricated, installed, and operated. Each of the attendant subsystems was checked out and operated to ensure that performance requirements were met. These subsystems included the arc heaters/reactor, cooling water system, gas system, power system, control and instrumentation system, Na injection system, SiCl4 injection system, effluent disposal system, and gas burnoff system. Before the reactants (Na and SiCl4) were introduced to the arc heater/reactor, a series of gas-only power tests was conducted to establish the operating parameters of the system's three arc heaters. Following the successful completion of the gas-only power tests and the readiness tests of the sodium and SiCl4 injection systems, a shakedown test of the complete experimental verification system was conducted.
The Stanford how things work project
NASA Technical Reports Server (NTRS)
Fikes, Richard; Gruber, Tom; Iwasaki, Yumi
1994-01-01
We provide an overview of the Stanford How Things Work (HTW) project, an ongoing integrated collection of research activities in the Knowledge Systems Laboratory at Stanford University. The project is developing technology for representing knowledge about engineered devices in a form that enables the knowledge to be used in multiple systems for multiple reasoning tasks and reasoning methods that enable the represented knowledge to be effectively applied to the performance of the core engineering task of simulating and analyzing device behavior. The central new capabilities currently being developed in the project are automated assistance with model formulation and with verification that a design for an electro-mechanical device satisfies its functional specification.
Investigation of high-strength bolt-tightening verification techniques : tech transfer summary.
DOT National Transportation Integrated Search
2016-03-01
The primary objective of this project was to explore the current state-of-practice and the state-of-the-art techniques for high-strength bolt tightening and verification in structural steel connections. This project was completed so that insight coul...
DOT National Transportation Integrated Search
2014-04-01
The objective of this project was to quantify the effectiveness of the rail inspection ground verification process. More specifically, : the project focused on comparing the effectiveness of conventional versus phased array probes to manually detect ...
NASA Technical Reports Server (NTRS)
Shull, Forrest; Bechtel, Andre; Feldmann, Raimund L.; Regardie, Myrna; Seaman, Carolyn
2008-01-01
This viewgraph presentation addresses the effectiveness of inspection and of verification and validation (V&V) in developing computer systems. A specific question is the relation between V&V effectiveness early in the development life cycle and the later testing of the developed system.
ERIC Educational Resources Information Center
Watson, Jeffery; Witham, Peter; St. Louis, Timothy
2010-01-01
The U.S. Department of Education Teacher Incentive Fund (TIF) seeks to transform education compensation systems so that principal and teacher performance (measured through classroom productivity measures) connects to compensation. Classroom-level productivity measures require robust student-teacher linkage data. Organizations such as the…
This project will contribute valuable information on the performance characteristics of new technology for use in infrastructure rehabilitation, and will provide additional credibility to the U.S. Environment Protection Agency’s (EPA) Office of Research and Development’s (ORD) fo...
Model Transformation for a System of Systems Dependability Safety Case
NASA Technical Reports Server (NTRS)
Murphy, Judy; Driskell, Steve
2011-01-01
The presentation reviews the dependability and safety effort of NASA's Independent Verification and Validation Facility. Topics include: safety engineering process, applications to non-space environment, Phase I overview, process creation, sample SRM artifact, Phase I end result, Phase II model transformation, fault management, and applying Phase II to individual projects.
Model-Based Verification and Validation of the SMAP Uplink Processes
NASA Technical Reports Server (NTRS)
Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun
2013-01-01
This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes allow by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.
Apollo Soyuz Test Project Weights and Mass Properties Operational Management System
NASA Technical Reports Server (NTRS)
Collins, M. A., Jr.; Hischke, E. R.
1975-01-01
The Apollo Soyuz Test Project (ASTP) Weights and Mass Properties Operational Management System was established to assure a timely and authoritative method of acquiring, controlling, generating, and disseminating an official set of vehicle weights and mass properties data. This paper provides an overview of the system and its interaction with the various aspects of vehicle and component design, mission planning, hardware and software simulations and verification, and real-time mission support activities. The effect of vehicle configuration, design maturity, and consumables updates is discussed in the context of weight control.
Off-line robot programming and graphical verification of path planning
NASA Technical Reports Server (NTRS)
Tonkay, Gregory L.
1989-01-01
The objective of this project was to develop or specify an integrated environment for off-line programming, graphical path verification, and debugging for robotic systems. Two alternatives were compared. The first was the integration of the ASEA Off-line Programming package with ROBSIM, a robotic simulation program. The second alternative was the purchase of the commercial product IGRIP. The needs of the RADL (Robotics Applications Development Laboratory) were explored and the alternatives were evaluated based on these needs. As a result, IGRIP was proposed as the best solution to the problem.
Array automated assembly task, phase 2. Low cost silicon solar array project
NASA Technical Reports Server (NTRS)
Rhee, S. S.; Jones, G. T.; Allison, K. T.
1978-01-01
Several modifications to the wafer surface preparation process significantly reduced the process cost, to 1.55 cents per peak watt in 1975 cents. Performance verification tests of a laser scanning system showed a limited capability to detect hidden cracks or defects, but with potential equipment modifications this cost-effective system could be rendered suitable for the application. Installation of the electroless nickel plating system was completed, along with optimization of the wafer plating process. The solder coating and flux removal process verification test was completed. An optimum temperature range of 500-550 C was found to produce a uniform solder coating, with the restriction that a modified dipping procedure be used. Finally, construction of the spray-on dopant equipment was completed.
DOT National Transportation Integrated Search
2000-01-01
Although the effects of climatic factors on pavement performance have long been recognized as important, those effects remain largely unquantified because individual pavement research projects to date generally have been restricted to limited geograp...
Video Vehicle Detector Verification System (V2DVS) operators manual and project final report.
DOT National Transportation Integrated Search
2012-03-01
The accurate detection of the presence, speed and/or length of vehicles on roadways is recognized as critical for : effective roadway congestion management and safety. Vehicle presence sensors are commonly used for traffic : volume measurement and co...
18 CFR 12.13 - Verification form.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 18 Conservation of Power and Water Resources 1 (2010-04-01) Verification form. Section 12.13, Conservation of Power and Water Resources, FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY, REGULATIONS UNDER THE FEDERAL POWER ACT, SAFETY OF WATER POWER PROJECTS AND PROJECT WORKS...
Discriminative Projection Selection Based Face Image Hashing
NASA Astrophysics Data System (ADS)
Karabat, Cagatay; Erdogan, Hakan
Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.
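The abstract's projection-selection idea can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the Fisher-style score, the sign-based quantizer (a simple stand-in for the paper's bimodal-GMM quantizer), and all dimensions and data below are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def select_discriminative_rows(P, user_feats, impostor_feats, k):
    """Rank rows of random projection matrix P by a Fisher-style
    separability score for one user and keep the k best rows."""
    u = user_feats @ P.T        # (n_user, m) projections
    v = impostor_feats @ P.T    # (n_impostor, m) projections
    fisher = (u.mean(0) - v.mean(0)) ** 2 / (u.var(0) + v.var(0) + 1e-12)
    return P[np.argsort(fisher)[::-1][:k]]

def binary_hash(P_sel, feat):
    """Quantize the selected projections to bits by sign."""
    return (P_sel @ feat > 0).astype(np.uint8)

# toy data: 64-dim face features, 256 candidate projections, keep 32
d, m, k = 64, 256, 32
P = rng.standard_normal((m, d))
user = rng.standard_normal((20, d)) + 1.0   # one user's enrollment samples
others = rng.standard_normal((200, d))      # impostor samples
P_sel = select_discriminative_rows(P, user, others, k)
h1 = binary_hash(P_sel, user[0])
h2 = binary_hash(P_sel, user[1])
print("hash length:", len(h1), "Hamming distance:", int((h1 != h2).sum()))
```

Because the rows are chosen per user, two samples from the same user tend to produce nearby hashes, which is the property a verification system thresholds on.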
The evaluation of OSTA's APT and ASVT programs
NASA Technical Reports Server (NTRS)
1981-01-01
The results of an evaluation of NASA's Applications Pilot Test (APT) and Applications System Verification and Transfer (ASVT) Programs are presented. These programs sponsor cooperative projects between NASA and potential users of remote sensing (primarily LANDSAT) technology from federal and state government and the private sector. Fifteen specific projects, seven APTs and eight ASVTs, are examined as mechanisms for technology development, test, and transfer by comparing their results against stated objectives. Interviews with project managers from NASA field centers and user agency representatives provide the basis for project evaluation from NASA and user perspectives.
Design verification and cold-flow modeling test report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-07-01
This report presents a compilation of the following three test reports prepared by TRW for the Alaska Industrial Development and Export Authority (AIDEA) as part of the Healy Clean Coal Project, Phase 1 Design of the TRW Combustor and Auxiliary Systems, which is co-sponsored by the Department of Energy under the Clean Coal Technology 3 Program: (1) Design Verification Test Report, dated April 1993; (2) Combustor Cold Flow Model Report, dated August 28, 1992; (3) Coal Feed System Cold Flow Model Report, dated October 28, 1992. In this compilation, the three reports are included in one volume consisting of three parts, and TRW proprietary information has been excluded.
Yourkavitch, Jennifer; Zalisk, Kirsten; Prosnitz, Debra; Luhanga, Misheck; Nsona, Humphreys
2016-11-01
The World Health Organization contracted annual data quality assessments of Rapid Access Expansion (RAcE) projects to review integrated community case management (iCCM) data quality and the monitoring and evaluation (M&E) system for iCCM, and to suggest ways to improve data quality. The first RAcE data quality assessment was conducted in Malawi in January 2014 and we present findings pertaining to data from the health management information system at the community, facility and other sub-national levels because RAcE grantees rely on that for most of their monitoring data. We randomly selected 10 health facilities (10% of eligible facilities) from the four RAcE project districts, and collected quantitative data with an adapted and comprehensive tool that included an assessment of Malawi's M&E system for iCCM data and a data verification exercise that traced selected indicators through the reporting system. We rated the iCCM M&E system across five function areas based on interviews and observations, and calculated verification ratios for each data reporting level. We also conducted key informant interviews with Health Surveillance Assistants and facility, district and central Ministry of Health staff. Scores show a high-functioning M&E system for iCCM with some deficiencies in data management processes. The system lacks quality controls, including data entry verification, a protocol for addressing errors, and written procedures for data collection, entry, analysis and management. Data availability was generally high except for supervision data. The data verification process identified gaps in completeness and consistency, particularly in Health Surveillance Assistants' record keeping. Staff at all levels would like more training in data management. This data quality assessment illuminates where an otherwise strong M&E system for iCCM fails to ensure some aspects of data quality. 
Prioritizing data management with documented protocols, additional training and approaches to create efficient supervision practices may improve iCCM data quality.
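The data verification exercise described above typically reduces to a ratio computed at each reporting level: the value re-counted from source documents divided by the value reported upward. A minimal sketch with entirely hypothetical iCCM counts (the RAcE assessment's actual indicators and thresholds are not reproduced here):

```python
def verification_ratio(recounted, reported):
    """Verification ratio: value re-counted from source documents
    divided by the value reported upward. 1.0 = perfect agreement;
    < 1 suggests over-reporting, > 1 suggests under-reporting."""
    if reported == 0:
        return float("nan")
    return recounted / reported

# hypothetical iCCM indicator traced through three reporting levels
levels = {
    "community (HSA registers)": (118, 125),
    "facility summary":          (125, 120),
    "district HMIS":             (120, 120),
}
for level, (recounted, reported) in levels.items():
    print(f"{level}: {verification_ratio(recounted, reported):.2f}")
```

Tracing the same indicator level by level, as the assessment did, localizes where completeness or consistency breaks down in the reporting chain.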
Upgrades at the NASA Langley Research Center National Transonic Facility
NASA Technical Reports Server (NTRS)
Paryz, Roman W.
2012-01-01
Several projects have been completed or are nearing completion at the NASA Langley Research Center (LaRC) National Transonic Facility (NTF). The addition of a Model Flow-Control/Propulsion Simulation test capability to the NTF provides a unique, transonic, high-Reynolds-number test capability that is well suited for research in propulsion airframe integration studies, circulation control high-lift concepts, powered lift, and cruise separation flow control. A 1992-vintage Facility Automation System (FAS) that performs the control functions for tunnel pressure, temperature, Mach number, model position, safety interlocks, and supervisory controls was replaced using current, commercially available components. The FAS upgrade also involved a design study for the replacement of the facility's Mach measurement system and the development of a software-based simulation model of NTF processes and control systems. The FAS upgrades were validated by a post-upgrade verification wind tunnel test. The data acquisition system (DAS) upgrade project involves the design, purchase, build, integration, installation, and verification of a new DAS, replacing several early-1990s-vintage computer systems with state-of-the-art hardware and software. This paper provides an update on the progress made in these efforts. See reference 1.
Critical Surface Cleaning and Verification Alternatives
NASA Technical Reports Server (NTRS)
Melton, Donald M.; McCool, A. (Technical Monitor)
2000-01-01
As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and Trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long-term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE, and HCFC 225. The main effort focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency, and qualification testing on flight hardware.
Merlyn J. Paulson
1979-01-01
This paper outlines a project level process (V.I.S.) which utilizes very accurate and flexible computer algorithms in combination with contemporary site analysis and design techniques for visual evaluation, design and management. The process provides logical direction and connecting bridges through problem identification, information collection and verification, visual...
WTS-4 system verification unit for wind/hydroelectric integration study
NASA Technical Reports Server (NTRS)
Watts, A. W.
1982-01-01
The Bureau of Reclamation (Reclamation) initiated a study to investigate the concept of integrating 100 MW of wind energy from megawatt-size wind turbines with the Federal hydroelectric system. As a part of the study, one large wind turbine was purchased through the competitive bid process and is now being installed to serve as a system verification unit (SVU). Reclamation negotiated an agreement with NASA to provide technical management of the project for the design, fabrication, installation, testing, and initial operation. Hamilton Standard was awarded a contract to furnish and install its WTS-4 wind turbine rated at 4 MW at a site near Medicine Bow, Wyoming. The purposes for installing the SVU are to fully evaluate the wind/hydro integration concept, make technical evaluation of the hardware design, train personnel in the technology, evaluate operation and maintenance aspects, and evaluate associated environmental impacts. The SVU will be operational in June 1982. Data from the WTS-4 and from a second SVU, Boeing's MOD-2, will be used to prepare a final design for a 100-MW farm if Congress authorizes the project.
Validation of mesoscale models
NASA Technical Reports Server (NTRS)
Kuo, Bill; Warner, Tom; Benjamin, Stan; Koch, Steve; Staniforth, Andrew
1993-01-01
The topics discussed include the following: verification of cloud prediction from the PSU/NCAR mesoscale model; results from MAPS/NGM verification comparisons and MAPS observation sensitivity tests to ACARS and profiler data; systematic errors and mesoscale verification for a mesoscale model; and the COMPARE Project and the CME.
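Model verification of the kind surveyed above usually starts from simple station-based scores such as mean error (bias) and root-mean-square error against observations. A minimal sketch with hypothetical 2 m temperatures; the projects' actual verification metrics and datasets are not reproduced here:

```python
import math

def bias(forecast, observed):
    """Mean error (forecast minus observed); sign shows systematic drift."""
    return sum(f - o for f, o in zip(forecast, observed)) / len(forecast)

def rmse(forecast, observed):
    """Root-mean-square error; penalizes large misses."""
    return math.sqrt(
        sum((f - o) ** 2 for f, o in zip(forecast, observed)) / len(forecast)
    )

# hypothetical 2 m temperatures (deg C) at five verification stations
fcst = [12.1, 14.3, 9.8, 11.0, 13.5]
obs  = [11.5, 14.0, 10.2, 10.4, 13.9]
print(f"bias: {bias(fcst, obs):+.2f} C, RMSE: {rmse(fcst, obs):.2f} C")
```

Systematic-error studies like those cited typically track the bias component separately from RMSE, since a nonzero bias points to a correctable model deficiency.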
NASA Technical Reports Server (NTRS)
Kubat, Gregory
2016-01-01
This report describes and characterizes the performance of the large-scale, Relay-architecture UAS communications simulation capability developed for the NASA GRC UAS in the NAS Project. The system uses a validated model of the GRC Gen5 CNPC Flight-Test Radio. The report contains a description of the simulation system and its model components, recent changes made to the system to improve performance, descriptions and objectives of sample simulations used for test and verification, and a sampling of results and performance data with observations.
An Integrated Approach to Exploration Launch Office Requirements Development
NASA Technical Reports Server (NTRS)
Holladay, Jon B.; Langford, Gary
2006-01-01
The proposed paper will focus on the project management and systems engineering approach used to develop an integrated, cohesive set of requirements for the Exploration Launch Office within the Constellation Program. A summary of the programmatic drivers that influenced the approach will be discussed, along with details of the resulting implementation and metrics evaluating the efficiency and accuracy of the various requirements development activities. The discussion of requirements development will focus on the procedures used to ensure that technical content was valid and mature in preparation for the Crew Launch Vehicle and Constellation Systems Requirements Reviews. It will begin with initial requirements development during the Exploration Systems Architecture Study and progress through formal development of the program structure. Specific emphasis will be given to development and validation of the requirements: approaches for engaging the appropriate requirement owners (or customers), the project infrastructure used to emphasize proper integration, and the procedure to technically mature, verify, and validate the requirements. Examples of requirements being implemented on the launch vehicle (systems, interfaces, test and verification) will be used to demonstrate the various processes and to provide a top-level understanding of the launch vehicle(s) performance goals. Details may also be provided on the approaches for verification, which range from typical aerospace hardware development (qualification/acceptance) through flight certification (flight test, etc.). The primary intent of this paper is to present a demonstrated procedure for developing a mature, effective, integrated set of requirements on a complex system that also has the added intricacy of integrating both heritage and new hardware development.
An ancillary focus of the paper will be discussion of test and verification approaches, along with top-level system/element performance capabilities.
Definition of optical systems payloads
NASA Technical Reports Server (NTRS)
Downey, J. A., III
1981-01-01
The various phases in the formulation of a major NASA project include the inception of the project, planning of the concept, and the project definition. A baseline configuration is established during the planning stage, which serves as a basis for engineering trade studies. Basic technological problems should be recognized early, and a technological verification plan prepared before development of a project begins. A progressive series of iterations is required during the definition phase, illustrating the complex interdependence of existing subsystems. A systems error budget should be established to assess the overall systems performance, identify key performance drivers, and guide performance trades and iterations around these drivers, thus decreasing final systems requirements. Unnecessary interfaces should be avoided, and reasonable design and cost margins maintained. Certain aspects of the definition of the Advanced X-ray Astrophysics Facility are used as an example.
NASA Technical Reports Server (NTRS)
Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.
2010-01-01
Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, within a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability, and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used to calculate failure rates for legacy subsystems or comparative components that will support Constellation. The results of the verification analysis will be used to assess compliance with requirements and to highlight design or performance shortcomings for further decision making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability, and maintainability analysis, and present findings and observation based on analysis leading to the Ground Operations Project Preliminary Design Review milestone.
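One common textbook way to turn failure and repair data into an availability estimate, of the sort allocated to ground subsystems above, is the steady-state (inherent) availability A = MTBF / (MTBF + MTTR), with subsystems that must all be up combined by series logic. The sketch below uses hypothetical numbers and is not the Ground Operations Project's actual model:

```python
def steady_state_availability(mtbf_hours, mttr_hours):
    """Inherent availability A = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series_availability(subsystems):
    """Availability of subsystems that must all be up (series logic):
    the product of the individual availabilities."""
    a = 1.0
    for mtbf, mttr in subsystems:
        a *= steady_state_availability(mtbf, mttr)
    return a

# hypothetical ground subsystems: (MTBF in hours, MTTR in hours)
ground = [(2000.0, 4.0), (5000.0, 8.0), (1000.0, 2.0)]
print(f"launch-chain availability: {series_availability(ground):.4f}")
```

Series logic makes the allocation problem concrete: the chain's availability is always worse than its weakest subsystem, which is why only subsystems that can affect the second launch received quantitative allocations.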
A limited-angle intrafraction verification (LIVE) system for radiation therapy.
Ren, Lei; Zhang, You; Yin, Fang-Fang
2014-02-01
Currently, no 3D or 4D volumetric x-ray imaging techniques are available for intrafraction verification of target position during actual treatment delivery or in-between treatment beams, which is critical for stereotactic radiosurgery (SRS) and stereotactic body radiation therapy (SBRT) treatments. This study aims to develop a limited-angle intrafraction verification (LIVE) system to use prior information, deformation models, and limited angle kV-MV projections to verify target position intrafractionally. The LIVE system acquires limited-angle kV projections simultaneously during arc treatment delivery or in-between static 3D/IMRT treatment beams as the gantry moves from one beam to the next. Orthogonal limited-angle MV projections are acquired from the beam's eye view (BEV) exit fluence of arc treatment beam or in-between static beams to provide additional anatomical information. MV projections are converted to kV projections using a linear conversion function. Patient prior planning CT at one phase is used as the prior information, and the on-board patient volume is considered as a deformation of the prior images. The deformation field is solved using the data fidelity constraint, a breathing motion model extracted from the planning 4D-CT based on principal component analysis (PCA) and a free-form deformation (FD) model. LIVE was evaluated using a 4D digital extended cardiac torso phantom (XCAT) and a CIRS 008A dynamic thoracic phantom. In the XCAT study, patient breathing pattern and tumor size changes were simulated from CT to treatment position. In the CIRS phantom study, the artificial target in the lung region experienced both size change and position shift from CT to treatment position. Varian Truebeam research mode was used to acquire kV and MV projections simultaneously during the delivery of a dynamic conformal arc plan. 
The reconstruction accuracy was evaluated by calculating the 3D volume percentage difference (VPD) and the center of mass (COM) difference of the tumor in the true on-board images and reconstructed images. In both simulation and phantom studies, LIVE achieved substantially better reconstruction accuracy than reconstruction using PCA or FD deformation model alone. In the XCAT study, the average VPD and COM differences among different patient scenarios for LIVE system using orthogonal 30° scan angles were 4.3% and 0.3 mm when using kV+BEV MV. Reducing scan angle to 15° increased the average VPD and COM differences to 15.1% and 1.7 mm. In the CIRS phantom study, the VPD and COM differences for the LIVE system using orthogonal 30° scan angles were 6.4% and 1.4 mm. Reducing scan angle to 15° increased the VPD and COM differences to 51.9% and 3.8 mm. The LIVE system has the potential to substantially improve intrafraction target localization accuracy by providing volumetric verification of tumor position simultaneously during arc treatment delivery or in-between static treatment beams. With this improvement, LIVE opens up a new avenue for margin reduction and dose escalation in both fractionated treatments and SRS and SBRT treatments.
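The PCA-based deformation model at the heart of LIVE can be sketched in a few lines: principal components are extracted from the planning 4D-CT deformation fields, and an on-board deformation is represented as the mean field plus a weighted sum of components. In LIVE the weights are solved against limited-angle kV/MV projection data; the toy below instead fits them to a known target field, and all array sizes and data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in: 10 breathing phases from a planning 4D-CT,
# each deformation vector field flattened to one row (3 * 500 voxels).
n_phases, n_dof = 10, 1500
dvfs = rng.standard_normal((n_phases, n_dof))

# PCA of the deformation fields: mean field + principal components
mean = dvfs.mean(axis=0)
_, s, vt = np.linalg.svd(dvfs - mean, full_matrices=False)
n_comp = 3
components = vt[:n_comp]          # dominant breathing-motion modes

# An on-board deformation is modeled as mean + sum_k w_k * component_k.
# Here we fit the weights w to a known target by projection (the rows
# of `components` are orthonormal, so this is the least-squares fit).
target = mean + 0.8 * components[0] - 0.3 * components[1]
w = components @ (target - mean)
recon = mean + w @ components
print("fit error:", float(np.linalg.norm(recon - target)))
```

Reducing the unknowns from millions of voxel displacements to a handful of weights is what makes reconstruction from only 15-30° of projection data tractable.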
NASA Astrophysics Data System (ADS)
Kostyuchenko, V. I.; Makarova, A. S.; Ryazantsev, O. B.; Samarin, S. I.; Uglov, A. S.
2014-06-01
A great breakthrough in proton therapy has happened in the new century: several tens of dedicated centers are now operated throughout the world, and their number increases every year. An important component of proton therapy is the treatment planning system. To make calculations faster, these systems usually use analytical methods whose reliability and accuracy do not allow the advantages of this treatment method to be realized to the full extent. Predictions by the Monte Carlo (MC) method are the "gold" standard for verifying calculations with these systems. At the Institute for Theoretical and Experimental Physics (ITEP), one of the oldest proton therapy centers in the world, an MC code is an integral part of the treatment planning system. This code, called IThMC, was developed by scientists from RFNC-VNIITF (Snezhinsk) under ISTC Project 3563.
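The role of Monte Carlo as a "gold standard" check on a fast analytical result can be illustrated with a toy example: estimate a quantity whose analytic answer is known and confirm agreement within the statistical uncertainty. This illustrates only the verification idea, not IThMC or proton transport physics; all numbers are hypothetical.

```python
import random

# Mean free path of an exponentially attenuated beam: the analytic
# answer is 1/mu, and the MC estimate should agree within its own
# statistical uncertainty (standard error of the mean).
mu = 0.5                      # attenuation coefficient, 1/cm
analytic_mean = 1.0 / mu      # 2.0 cm

random.seed(0)
n = 200_000
samples = [random.expovariate(mu) for _ in range(n)]
mc_mean = sum(samples) / n
sigma = (sum((x - mc_mean) ** 2 for x in samples) / (n - 1)) ** 0.5 / n ** 0.5

print(f"analytic: {analytic_mean:.3f} cm, MC: {mc_mean:.3f} +/- {sigma:.3f} cm")
```

In practice the comparison runs the other way: the slow MC code is trusted, and deviations beyond a few sigma flag regions where the analytical dose engine needs correction.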
ONAV - An Expert System for the Space Shuttle Mission Control Center
NASA Technical Reports Server (NTRS)
Mills, Malise; Wang, Lui
1992-01-01
The ONAV (Onboard Navigation) Expert System is being developed as a real-time console assistant to the ONAV flight controller for use in the Mission Control Center at the Johnson Space Center. Currently (October 1991), the entry and ascent systems have been certified for use on console as support tools and were used for STS-48. The rendezvous system is in verification, with the goal of having the system certified for STS-49, the Intelsat retrieval mission. To arrive at this stage, from a prototype to real-world application, the ONAV project has had to deal with not only AI issues but also operating environment issues. The AI issues included the maturity of AI languages and debugging tools, verification, and the availability, stability, and size of the expert pool. The environmental issues included real-time data acquisition, hardware suitability, and how to achieve acceptance by users and management.
Metamorphoses of ONAV console operations: From prototype to real time application
NASA Technical Reports Server (NTRS)
Millis, Malise; Wang, Lui
1991-01-01
The ONAV (Onboard Navigation) Expert System is being developed as a real-time console assistant to the ONAV flight controller for use in the Mission Control Center at the Johnson Space Center. Currently the entry and rendezvous systems are in verification, and the ascent system is being prototyped. To arrive at this stage, from a prototype to real-world application, the ONAV project has had to deal with not only AI issues but also operating environment issues. The AI issues included the maturity of AI languages and debugging tools, what verification means, and the availability, stability, and size of the expert pool. The environmental issues included real-time data acquisition, hardware stability, and how to achieve acceptance by users and management.
Scalable and Accurate SMT-based Model Checking of Data Flow Systems
2013-10-30
guided by the semantics of the description language. In this project we developed instead a complementary and novel approach based on a somewhat brute...believe that our approach could help considerably in expanding the reach of abstract interpretation techniques to a variety of target languages, as...project. We worked on developing a framework for compositional verification that capitalizes on the fact that data-flow languages, such as Lustre, have
Student-Teacher Linkage Verification: Model Process and Recommendations
ERIC Educational Resources Information Center
Watson, Jeffery; Graham, Matthew; Thorn, Christopher A.
2012-01-01
As momentum grows for tracking the role of individual educators in student performance, school districts across the country are implementing projects that involve linking teachers to their students. Programs that link teachers to student outcomes require a verification process for student-teacher linkages. Linkage verification improves accuracy by…
Marshall Space Flight Center Ground Systems Development and Integration
NASA Technical Reports Server (NTRS)
Wade, Gina
2016-01-01
Ground Systems Development and Integration performs a variety of tasks in support of the Mission Operations Laboratory (MOL) and other Center and Agency projects. These tasks include systems engineering processes such as system requirements development, system architecture design, integration, verification and validation, software development, and sustaining engineering of mission operations systems; this work has evolved the Huntsville Operations Support Center (HOSC) into a leader in remote operations for current and future NASA space projects. The group is also responsible for developing and managing telemetry and command configuration and calibration databases. Personnel are responsible for maintaining and enhancing their disciplinary skills in the areas of project management, software engineering, software development, software process improvement, telecommunications, networking, and systems management. Domain expertise in the ground systems area is also maintained, including detailed proficiency in real-time telemetry systems, command systems, voice, video, data networks, and mission planning systems.
Software risk management through independent verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John R.; Zhou, Tong C.; Wood, Ralph
1995-01-01
Software project managers need tools to estimate and track project goals in a continuous fashion before, during, and after development of a system. In addition, they need an ability to compare the current project status with past project profiles to validate management intuition, identify problems, and then direct appropriate resources to the sources of problems. This paper describes a measurement-based approach to calculating the risk inherent in meeting project goals that leverages past project metrics and existing estimation and tracking models. We introduce the IV&V Goal/Questions/Metrics model, explain its use in the software development life cycle, and describe our attempts to validate the model through the reverse engineering of existing projects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mather, Barry
2015-07-09
The objective of this project is to use field verification to improve DOE’s ability to model and understand the impacts of, as well as develop solutions for, high penetration PV deployments in electrical utility distribution systems. The Participant will work with NREL to assess the existing distribution system at SCE facilities and assess adding additional PV systems into the electric power system.
This is an ESTE project summary brief. Many of the finished interior surfaces of homes and buildings are composed of materials that are prone to mold growth. These surfaces include gypsum board, wood flooring, insulation, and components of the heating and air conditioning system...
NASA Technical Reports Server (NTRS)
Gavin, Thomas R.
2006-01-01
This viewgraph presentation reviews the many parts of the JPL mission planning process that the project manager has to work with. Some of them are: NASA & JPL's institutional requirements, the mission systems design requirements, the science interactions, the technical interactions, financial requirements, verification and validation, safety and mission assurance, and independent assessment, review and reporting.
EPA's ETV Program, through the NRMRL has partnered with the California Department of Toxic Substances Control (DTSC) under an ETV Pilot Project to verify pollution prevention, recycling, and waste treatment technologies. This report and statement provides documentation of perfor...
Site systems engineering fiscal year 1999 multi-year work plan (MYWP) update for WBS 1.8.2.2
DOE Office of Scientific and Technical Information (OSTI.GOV)
GRYGIEL, M.L.
1998-10-08
Manage the Site Systems Engineering process to provide a traceable, integrated, requirements-driven, and technically defensible baseline. Through the Site Integration Group (SIG), Systems Engineering ensures integration of technical activities across all site projects. Systems Engineering's primary interfaces are with the RL Project Managers, the Project Direction Office, and the Project Major Subcontractors, as well as with the Site Planning organization. Systems Implementation: (1) Develops, maintains, and controls the site integrated technical baseline, ensures the Systems Engineering interfaces between projects are documented, and maintains the Site Environmental Management Specification. (2) Develops and uses dynamic simulation models for verification of the baseline and analysis of alternatives. (3) Performs and documents functional and requirements analyses. (4) Works with projects, technology management, and the SIG to identify and resolve technical issues. (5) Supports technical baseline information for the planning and budgeting of the Accelerated Cleanup Plan, Multi-Year Work Plans, and Project Baseline Summaries, as well as performance measure reporting. (6) Works with projects to ensure the quality of data in the technical baseline. (7) Develops, maintains, and implements the site configuration management system.
The VATES-Diamond as a Verifier's Best Friend
NASA Astrophysics Data System (ADS)
Glesner, Sabine; Bartels, Björn; Göthel, Thomas; Kleine, Moritz
Within a model-based software engineering process it needs to be ensured that properties of abstract specifications are preserved by transformations down to executable code. This is even more important in the area of safety-critical real-time systems where additionally non-functional properties are crucial. In the VATES project, we develop formal methods for the construction and verification of embedded systems. We follow a novel approach that allows us to formally relate abstract process algebraic specifications to their implementation in a compiler intermediate representation. The idea is to extract a low-level process algebraic description from the intermediate code and to formally relate it to previously developed abstract specifications. We apply this approach to a case study from the area of real-time operating systems and show that this approach has the potential to seamlessly integrate modeling, implementation, transformation and verification stages of embedded system development.
Command system output bit verification
NASA Technical Reports Server (NTRS)
Odd, C. W.; Abbate, S. F.
1981-01-01
An automatic test was developed to verify the ability of the deep space station (DSS) command subsystem and exciter to generate and radiate, from the exciter, the correct idle bit sequence for a given flight project, and to store and radiate received command data elements and files without alteration. This test, called the command system output bit verification test, is an extension of the command system performance test (SPT) and can be selected as an SPT option. The test compares the bit stream radiated from the DSS exciter with reference sequences generated by the SPT software program. The command subsystem and exciter are verified when the bit stream and reference sequences are identical. The test is a key element of the acceptance testing conducted on the command processor assembly (CPA) operational program (DMC-0584-OP-G) prior to its transfer from development to operations.
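The pass/fail criterion described above, that the radiated bit stream must be identical to the SPT-generated reference, amounts to an exact sequence comparison that also reports where the first divergence occurs. A minimal sketch in Python (hypothetical function name; not the DSS or SPT implementation):

```python
def verify_bit_stream(radiated, reference):
    """Compare a radiated bit sequence against a reference sequence.

    Returns (verified, first_mismatch_index); the command subsystem and
    exciter are verified only when the two sequences are identical.
    """
    if len(radiated) != len(reference):
        # Length mismatch: report the index where one stream ran out.
        return False, min(len(radiated), len(reference))
    for i, (r, ref) in enumerate(zip(radiated, reference)):
        if r != ref:
            return False, i
    return True, None

# Example: an alternating idle-bit sequence with one corrupted bit
reference = [0, 1] * 8
radiated = list(reference)
radiated[5] ^= 1                                # flip one bit
print(verify_bit_stream(radiated, reference))   # (False, 5)
```

In practice the comparison would run over framed command elements rather than Python lists, but the verification criterion is the same.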
NASA Technical Reports Server (NTRS)
Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.
2010-01-01
Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, in a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability, and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used for legacy subsystems or comparable components that will support Constellation. The results of the verification analysis will be used to verify compliance with requirements and to highlight design or performance shortcomings for further decision-making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability, and maintainability analysis, and present findings and observations from the analysis leading to the Ground Systems Preliminary Design Review milestone.
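Availability estimates of this kind are commonly built from steady-state (inherent) availability, A = MTBF / (MTBF + MTTR), with the subsystems required for launch combined in series. A sketch under that assumption (illustrative numbers, not Constellation data):

```python
def steady_state_availability(mtbf_hours, mttr_hours):
    """Inherent availability: long-run fraction of time a subsystem is up."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series_availability(subsystems):
    """Subsystems in series: every one must be up for launch to proceed."""
    availability = 1.0
    for mtbf, mttr in subsystems:
        availability *= steady_state_availability(mtbf, mttr)
    return availability

# Illustrative ground subsystems as (MTBF, MTTR) pairs in hours
ground = [(2000.0, 8.0), (5000.0, 24.0), (1000.0, 4.0)]
print(round(series_availability(ground), 4))
```

The series product makes explicit why a single low-availability subsystem can dominate the overall launch availability allocation.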
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barahona, B.; Jonkman, J.; Damiani, R.
2014-12-01
Coupled dynamic analysis has an important role in the design of offshore wind turbines because the systems are subject to complex operating conditions from the combined action of waves and wind. The aero-hydro-servo-elastic tool FAST v8 is framed in a novel modularization scheme that facilitates such analysis. Here, we present the verification of new capabilities of FAST v8 to model fixed-bottom offshore wind turbines. We analyze a series of load cases with both wind and wave loads and compare the results against those from the previous international code comparison projects: the International Energy Agency (IEA) Wind Task 23 Subtask 2 Offshore Code Comparison Collaboration (OC3) and the IEA Wind Task 30 OC3 Continued (OC4) projects. The verification is performed using the NREL 5-MW reference turbine supported by monopile, tripod, and jacket substructures. The substructure structural-dynamics models are built within the new SubDyn module of FAST v8, which uses a linear finite-element beam model with Craig-Bampton dynamic system reduction. This allows the modal properties of the substructure to be synthesized and coupled to hydrodynamic loads and tower dynamics. The hydrodynamic loads are calculated using a new strip theory approach for multimember substructures in the updated HydroDyn module of FAST v8. These modules are linked to the rest of FAST through the new coupling scheme involving mapping between module-independent spatial discretizations and a numerically rigorous implicit solver. The results show that the new structural dynamics, hydrodynamics, and coupled solutions compare well to the results from the previous code comparison projects.
Geographic Information System Data Analysis
NASA Technical Reports Server (NTRS)
Billings, Chad; Casad, Christopher; Floriano, Luis G.; Hill, Tracie; Johnson, Rashida K.; Locklear, J. Mark; Penn, Stephen; Rhoulac, Tori; Shay, Adam H.; Taylor, Antone;
1995-01-01
Data were collected in order to further NASA Langley Research Center's Geographic Information System (GIS). Information on LaRC's communication, electrical, and facility configurations was collected. Existing data were corrected through verification, resulting in more accurate databases. In addition, Global Positioning System (GPS) points were used in order to accurately impose buildings on digitized images. Overall, this project will help the Imaging and CADD Technology Team (ICTT) prove GIS to be a valuable resource for LaRC.
Climate Projections and Drought: Verification for the Colorado River Basin
NASA Astrophysics Data System (ADS)
Santos, N. I.; Piechota, T. C.; Miller, W. P.; Ahmad, S.
2017-12-01
The Colorado River Basin has experienced the driest 17-year period (2000-2016) in over 100 years of historical record keeping. While the Colorado River reservoir system began the current drought at near 100% capacity, reservoir storage has fallen to just above 50% during the drought. Even though federal and state water agencies have worked together to mitigate the impact of the drought and have collaboratively sponsored conservation programs and drought contingency plans, the 17 years of observed data raise the question of whether the most recent climate projections would have been able to project the current drought's severity. The objective of this study is to analyze observations and ensemble projections (e.g., temperature, precipitation, streamflow) from the CMIP3 and CMIP5 archives in the Colorado River Basin and compare metrics related to skill scores, the Palmer Drought Severity Index, and a water supply sustainability index. Furthermore, a sub-ensemble of CMIP3/CMIP5 projections, developed using a teleconnection replication verification technique developed by the author, will also be compared to the observed record to assist in further validating the technique as a usable process to increase skill in climatological projections. In the end, this study will help better inform water resource managers about the ability of climate ensembles to project hydroclimatic variability and the appearance of decadal drought periods.
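Skill scores of the kind compared in such studies are typically expressed as relative improvement over a reference forecast such as climatology. A minimal sketch using the mean-squared-error skill score (toy numbers, not CMIP data):

```python
def mse(pred, obs):
    """Mean squared error between two equal-length sequences."""
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)

def skill_score(forecast, reference, obs):
    """MSE skill score: 1 = perfect, 0 = no better than reference, <0 = worse."""
    return 1.0 - mse(forecast, obs) / mse(reference, obs)

obs = [10.0, 12.0, 9.0, 15.0, 11.0]
climatology = [sum(obs) / len(obs)] * len(obs)   # reference: long-term mean
forecast = [10.5, 11.0, 9.5, 14.0, 11.5]
print(round(skill_score(forecast, climatology, obs), 3))
```

A positive score means the projection adds information beyond climatology; a drought the ensemble misses entirely drives the score toward zero or below.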
Guidance and Control Software Project Data - Volume 1: Planning Documents
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J. (Editor)
2008-01-01
The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul, J. N.; Chin, M. R.; Sjoden, G. E.
2013-07-01
A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1 year period to create optimal design specifications, including creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used to determine the expected reaction rates in the detectors, using transport theory, when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection to evaluate moving source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)
NASA Astrophysics Data System (ADS)
Ament, F.; Weusthoff, T.; Arpagaus, M.; Rotach, M.
2009-04-01
The main aim of the WWRP Forecast Demonstration Project MAP D-PHASE is to demonstrate the performance of today's models in forecasting heavy precipitation and flood events in the Alpine region. To this end, an end-to-end, real-time forecasting system was installed and operated during the D-PHASE Operations Period from June to November 2007. Part of this system are 30 numerical weather prediction models (deterministic as well as ensemble systems) operated by weather services and research institutes, which issue alerts if predicted precipitation accumulations exceed critical thresholds. In addition to the real-time alerts, all relevant model fields of these simulations are stored in a central data archive. This comprehensive data set allows a detailed assessment of today's quantitative precipitation forecast (QPF) performance in the Alpine region. We will present results of QPF verification against Swiss radar and rain gauge data, both from a qualitative point of view, in terms of alerts, and from a quantitative perspective, in terms of precipitation rate. Various influencing factors such as lead time, accumulation time, selection of warning thresholds, and bias corrections will be discussed. In addition to traditional verification of area-average precipitation amounts, the ability of the models to predict the correct precipitation statistics without requiring a point-to-point match will be assessed using modern fuzzy verification techniques. Both analyses reveal significant advantages of deep-convection-resolving models compared to coarser models with parameterized convection. An intercomparison of the model forecasts themselves reveals a remarkably high variability between different models and makes it worthwhile to evaluate the potential of a multi-model ensemble. Various multi-model ensemble strategies will be tested by combining D-PHASE models into virtual ensemble systems.
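Fuzzy (neighborhood) verification of this kind rewards a forecast for reproducing the precipitation statistics within a window rather than matching point by point; the Fractions Skill Score (FSS) is a widely used example. A minimal one-dimensional sketch (toy data, not D-PHASE model output):

```python
def fractions(binary, half_width):
    """Fraction of exceedance points in a moving window around each point."""
    n = len(binary)
    out = []
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        window = binary[lo:hi]
        out.append(sum(window) / len(window))
    return out

def fss(forecast, observed, threshold, half_width):
    """Fractions Skill Score: 1 = perfect, 0 = no skill."""
    fb = [1 if v >= threshold else 0 for v in forecast]
    ob = [1 if v >= threshold else 0 for v in observed]
    pf, po = fractions(fb, half_width), fractions(ob, half_width)
    mse = sum((f - o) ** 2 for f, o in zip(pf, po)) / len(pf)
    ref = sum(f * f + o * o for f, o in zip(pf, po)) / len(pf)
    return 1.0 - mse / ref if ref > 0 else 1.0

# Forecast places the rain band one grid point off the observation
obs = [0, 0, 5, 8, 6, 0, 0, 0]
fcst = [0, 0, 0, 5, 8, 6, 0, 0]
print(round(fss(fcst, obs, threshold=5, half_width=1), 3))  # high despite the displacement
```

A strict point-to-point score would heavily penalize the one-cell displacement; the neighborhood score credits the model for getting the rain band's location and amplitude statistics nearly right.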
Verification and Implementation of Operations Safety Controls for Flight Missions
NASA Technical Reports Server (NTRS)
Jones, Cheryl L.; Smalls, James R.; Carrier, Alicia S.
2010-01-01
Approximately eleven years ago, the International Space Station launched its first module from Russia, the Functional Cargo Block (FGB). Safety and Mission Assurance (S&MA) Operations (Ops) engineers played an integral part in that endeavor by executing strict flight product verification as well as continued staffing of S&MA's console in the Mission Evaluation Room (MER) for that flight mission. How were these engineers able to conduct such a complicated task? They did so through product verification that consisted of ensuring that safety requirements were adequately contained in all flight products that affected crew safety. S&MA Ops engineers apply both systems engineering and project management principles in order to gain an appropriate level of technical knowledge necessary to perform thorough reviews covering the affected subsystem(s). They also ensured that mission priorities were carried out with great detail and success.
A limited-angle intrafraction verification (LIVE) system for radiation therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Lei, E-mail: lei.ren@duke.edu; Yin, Fang-Fang; Zhang, You
Purpose: Currently, no 3D or 4D volumetric x-ray imaging techniques are available for intrafraction verification of target position during actual treatment delivery or in-between treatment beams, which is critical for stereotactic radiosurgery (SRS) and stereotactic body radiation therapy (SBRT) treatments. This study aims to develop a limited-angle intrafraction verification (LIVE) system to use prior information, deformation models, and limited angle kV-MV projections to verify target position intrafractionally. Methods: The LIVE system acquires limited-angle kV projections simultaneously during arc treatment delivery or in-between static 3D/IMRT treatment beams as the gantry moves from one beam to the next. Orthogonal limited-angle MV projections are acquired from the beam's eye view (BEV) exit fluence of the arc treatment beam or in-between static beams to provide additional anatomical information. MV projections are converted to kV projections using a linear conversion function. The patient's prior planning CT at one phase is used as the prior information, and the on-board patient volume is considered as a deformation of the prior images. The deformation field is solved using the data fidelity constraint, a breathing motion model extracted from the planning 4D-CT based on principal component analysis (PCA), and a free-form deformation (FD) model. LIVE was evaluated using a 4D digital extended cardiac torso phantom (XCAT) and a CIRS 008A dynamic thoracic phantom. In the XCAT study, patient breathing pattern and tumor size changes were simulated from CT to treatment position. In the CIRS phantom study, the artificial target in the lung region experienced both size change and position shift from CT to treatment position. Varian Truebeam research mode was used to acquire kV and MV projections simultaneously during the delivery of a dynamic conformal arc plan.
The reconstruction accuracy was evaluated by calculating the 3D volume percentage difference (VPD) and the center of mass (COM) difference of the tumor in the true on-board images and reconstructed images. Results: In both simulation and phantom studies, LIVE achieved substantially better reconstruction accuracy than reconstruction using the PCA or FD deformation model alone. In the XCAT study, the average VPD and COM differences among different patient scenarios for the LIVE system using orthogonal 30° scan angles were 4.3% and 0.3 mm when using kV+BEV MV. Reducing the scan angle to 15° increased the average VPD and COM differences to 15.1% and 1.7 mm. In the CIRS phantom study, the VPD and COM differences for the LIVE system using orthogonal 30° scan angles were 6.4% and 1.4 mm. Reducing the scan angle to 15° increased the VPD and COM differences to 51.9% and 3.8 mm. Conclusions: The LIVE system has the potential to substantially improve intrafraction target localization accuracy by providing volumetric verification of tumor position simultaneously during arc treatment delivery or in-between static treatment beams. With this improvement, LIVE opens up a new avenue for margin reduction and dose escalation in both fractionated treatments and SRS and SBRT treatments.
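The VPD and COM metrics quoted in the results can be computed directly from binary tumor masks; one plausible reading is that VPD is the non-overlapping volume as a percentage of the true volume, and the COM difference is the distance between mask centroids. A NumPy sketch under that assumption (toy arrays, not the XCAT or CIRS data):

```python
import numpy as np

def volume_percent_difference(recon, truth):
    """VPD: non-overlapping volume between two masks, as % of the true volume."""
    recon, truth = recon.astype(bool), truth.astype(bool)
    return 100.0 * np.logical_xor(recon, truth).sum() / truth.sum()

def center_of_mass_difference(recon, truth, voxel_size=1.0):
    """COM difference: Euclidean distance between the mask centroids."""
    c_r = np.mean(np.argwhere(recon), axis=0)
    c_t = np.mean(np.argwhere(truth), axis=0)
    return float(np.linalg.norm((c_r - c_t) * voxel_size))

truth = np.zeros((20, 20, 20), dtype=bool)
truth[8:12, 8:12, 8:12] = True        # 4x4x4 "tumor"
recon = np.roll(truth, 1, axis=0)     # reconstruction shifted by one voxel
print(volume_percent_difference(recon, truth),
      center_of_mass_difference(recon, truth))
```

A one-voxel shift of a small target already produces a large VPD, which is why the abstract reports both metrics: COM captures localization error while VPD is sensitive to shape and overlap.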
Assuring NASA's Safety and Mission Critical Software
NASA Technical Reports Server (NTRS)
Deadrick, Wesley
2015-01-01
What is IV&V? Independent Verification and Validation (IV&V) is an objective examination of safety- and mission-critical software processes and products. Independence rests on three key parameters: technical independence, managerial independence, and financial independence. From the NASA IV&V perspective, the questions are whether the system's software will do what it is supposed to do, not do what it is not supposed to do, and respond as expected under adverse conditions. Systems engineering determines whether the right system has been built and whether it has been built correctly. The IV&V technical approaches are aligned with IEEE 1012, captured in a Catalog of Methods, and span the full project lifecycle. The IV&V Assurance Strategy is the IV&V project's strategy for providing mission assurance; it is driven by the specific needs of an individual project, implemented via an Assurance Design, and communicated via Assurance Statements.
NASA Astrophysics Data System (ADS)
Claver, C. F.; Selvy, Brian M.; Angeli, George; Delgado, Francisco; Dubois-Felsmann, Gregory; Hascall, Patrick; Lotz, Paul; Marshall, Stuart; Schumacher, German; Sebag, Jacques
2014-08-01
The Large Synoptic Survey Telescope project was an early adopter of SysML and Model Based Systems Engineering practices. The LSST project began using MBSE for requirements engineering beginning in 2006 shortly after the initial release of the first SysML standard. Out of this early work the LSST's MBSE effort has grown to include system requirements, operational use cases, physical system definition, interfaces, and system states along with behavior sequences and activities. In this paper we describe our approach and methodology for cross-linking these system elements over the three classical systems engineering domains - requirement, functional and physical - into the LSST System Architecture model. We also show how this model is used as the central element to the overall project systems engineering effort. More recently we have begun to use the cross-linked modeled system architecture to develop and plan the system verification and test process. In presenting this work we also describe "lessons learned" from several missteps the project has had with MBSE. Lastly, we conclude by summarizing the overall status of the LSST's System Architecture model and our plans for the future as the LSST heads toward construction.
Kumada, Hiroaki; Kurihara, Toshikazu; Yoshioka, Masakazu; Kobayashi, Hitoshi; Matsumoto, Hiroshi; Sugano, Tomei; Sakurai, Hideyuki; Sakae, Takeji; Matsumura, Akira
2015-12-01
The iBNCT project team with the University of Tsukuba is developing an accelerator-based neutron source. For the neutron target material, the project selected beryllium. To deal with the large heat load and blistering of the target system, we developed a three-layer structure for the target system that includes a blistering-mitigation material between the beryllium used as the neutron generator and the copper heat sink. The three materials were bonded through diffusion bonding using a hot isostatic pressing method. Based on several verifications, the project chose palladium as the intermediate layer. A prototype of the neutron target system was produced. We will verify that sufficient neutrons for BNCT treatment are generated by the device in the near future. Copyright © 2015 Elsevier Ltd. All rights reserved.
Verification of Meteorological and Oceanographic Ensemble Forecasts in the U.S. Navy
NASA Astrophysics Data System (ADS)
Klotz, S.; Hansen, J.; Pauley, P.; Sestak, M.; Wittmann, P.; Skupniewicz, C.; Nelson, G.
2013-12-01
The Navy Ensemble Forecast Verification System (NEFVS) has been promoted recently to operational status at the U.S. Navy's Fleet Numerical Meteorology and Oceanography Center (FNMOC). NEFVS processes FNMOC and National Centers for Environmental Prediction (NCEP) meteorological and ocean wave ensemble forecasts, gridded forecast analyses, and innovation (observational) data output by FNMOC's data assimilation system. The NEFVS framework consists of statistical analysis routines, a variety of pre- and post-processing scripts to manage data and plot verification metrics, and a master script to control application workflow. NEFVS computes metrics that include forecast bias, mean-squared error, conditional error, conditional rank probability score, and Brier score. The system also generates reliability and Receiver Operating Characteristic diagrams. In this presentation we describe the operational framework of NEFVS and show examples of verification products computed from ensemble forecasts, meteorological observations, and forecast analyses. The construction and deployment of NEFVS addresses important operational and scientific requirements within Navy Meteorology and Oceanography. These include computational capabilities for assessing the reliability and accuracy of meteorological and ocean wave forecasts in an operational environment, for quantifying effects of changes and potential improvements to the Navy's forecast models, and for comparing the skill of forecasts from different forecast systems. NEFVS also supports the Navy's collaboration with the U.S. Air Force, NCEP, and Environment Canada in the North American Ensemble Forecast System (NAEFS) project and with the Air Force and the National Oceanic and Atmospheric Administration (NOAA) in the National Unified Operational Prediction Capability (NUOPC) program. 
This program is tasked with eliminating unnecessary duplication within the three agencies, accelerating the transition of new technology, such as multi-model ensemble forecasting, to U.S. Department of Defense use, and creating a superior U.S. global meteorological and oceanographic prediction capability. Forecast verification is an important component of NAEFS and NUOPC. Distribution Statement A: Approved for Public Release; distribution is unlimited
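Several of the NEFVS metrics listed in this record are standard probabilistic verification scores. The Brier score, for example, is the mean squared difference between the forecast probability of an event and its 0/1 outcome, with the probability commonly taken as the fraction of ensemble members predicting the event. A minimal sketch (toy ensemble values, not FNMOC output):

```python
def brier_score(probabilities, outcomes):
    """Mean squared error of probability forecasts against 0/1 outcomes.

    0 is perfect; a constant 0.5 forecast scores 0.25.
    """
    return sum((p - o) ** 2 for p, o in zip(probabilities, outcomes)) / len(outcomes)

def ensemble_probability(members, threshold):
    """Forecast probability = fraction of ensemble members at/above threshold."""
    return sum(1 for m in members if m >= threshold) / len(members)

# Probability of wave height >= 4 m from a 5-member ensemble, 4 forecast cycles
ensembles = [[3.1, 4.2, 4.5, 3.8, 4.1], [2.0, 2.5, 3.0, 2.2, 2.8],
             [4.5, 4.8, 5.0, 4.1, 3.9], [3.0, 3.5, 4.0, 4.2, 2.9]]
observed = [1, 0, 1, 0]   # did observed height reach 4 m?
probs = [ensemble_probability(e, 4.0) for e in ensembles]
print(probs, round(brier_score(probs, observed), 3))
```

The same probability/outcome pairs feed the reliability diagrams and ROC curves mentioned above, which is why a verification system computes them from a common data pipeline.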
Verification of Meteorological and Oceanographic Ensemble Forecasts in the U.S. Navy
NASA Astrophysics Data System (ADS)
Klotz, S. P.; Hansen, J.; Pauley, P.; Sestak, M.; Wittmann, P.; Skupniewicz, C.; Nelson, G.
2012-12-01
The Navy Ensemble Forecast Verification System (NEFVS) has been promoted recently to operational status at the U.S. Navy's Fleet Numerical Meteorology and Oceanography Center (FNMOC). NEFVS processes FNMOC and National Centers for Environmental Prediction (NCEP) meteorological and ocean wave ensemble forecasts, gridded forecast analyses, and innovation (observational) data output by FNMOC's data assimilation system. The NEFVS framework consists of statistical analysis routines, a variety of pre- and post-processing scripts to manage data and plot verification metrics, and a master script to control application workflow. NEFVS computes metrics that include forecast bias, mean-squared error, conditional error, conditional rank probability score, and Brier score. The system also generates reliability and Receiver Operating Characteristic diagrams. In this presentation we describe the operational framework of NEFVS and show examples of verification products computed from ensemble forecasts, meteorological observations, and forecast analyses. The construction and deployment of NEFVS addresses important operational and scientific requirements within Navy Meteorology and Oceanography (METOC). These include computational capabilities for assessing the reliability and accuracy of meteorological and ocean wave forecasts in an operational environment, for quantifying effects of changes and potential improvements to the Navy's forecast models, and for comparing the skill of forecasts from different forecast systems. NEFVS also supports the Navy's collaboration with the U.S. Air Force, NCEP, and Environment Canada in the North American Ensemble Forecast System (NAEFS) project and with the Air Force and the National Oceanic and Atmospheric Administration (NOAA) in the National Unified Operational Prediction Capability (NUOPC) program. 
This program is tasked with eliminating unnecessary duplication within the three agencies, accelerating the transition of new technology, such as multi-model ensemble forecasting, to U.S. Department of Defense use, and creating a superior U.S. global meteorological and oceanographic prediction capability. Forecast verification is an important component of NAEFS and NUOPC.
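Among the metrics listed above, the Brier score is the simplest to state: the mean squared difference between the forecast probability of a binary event and its observed outcome. A minimal illustrative sketch (not NEFVS code; the function name and inputs are assumptions):

```python
# Illustrative sketch: Brier score for probabilistic forecasts of a binary
# event (e.g. "wind speed exceeds 15 kt"). Lower is better; 0 is perfect.
def brier_score(probs, outcomes):
    """Mean squared difference between forecast probabilities (0..1)
    and observed binary outcomes (0 or 1)."""
    if len(probs) != len(outcomes):
        raise ValueError("probs and outcomes must have equal length")
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)
```

A constant 0.5 "climatology" forecast scores 0.25 on balanced outcomes, which is why reliability diagrams are typically examined alongside the score.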
Randell, Edward W; Short, Garry; Lee, Natasha; Beresford, Allison; Spencer, Margaret; Kennell, Marina; Moores, Zoë; Parry, David
2018-06-01
Six Sigma involves a structured process improvement strategy that places processes on a pathway to continued improvement. The data presented here summarize a project that took three clinical laboratories from autoverification processes that released roughly 40% to 60% of tests automatically to processes auto-verifying more than 90% of tests and samples. The project schedule, metrics and targets, a description of the previous system, and detailed information on the changes made to achieve greater than 90% auto-verification are presented for this Six Sigma DMAIC (Define, Measure, Analyze, Improve, Control) process improvement project.
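Autoverification of the kind described above is typically a cascade of rule checks — technical limits, delta checks against a prior result — that decides whether a result can be released without manual review. A minimal hypothetical sketch; the rule structure and limits are illustrative, not the laboratories' actual configuration:

```python
# Hypothetical autoverification pass: a result is auto-released only if it
# falls within the analyte's technical limits and, when a prior result is
# available, does not deviate from it by more than the delta limit.
def autoverify(result, analyte_limits, delta_limit=None, previous=None):
    """Return True if a result can be released without manual review."""
    low, high = analyte_limits
    if not (low <= result <= high):            # outside technical limits -> hold
        return False
    if delta_limit is not None and previous is not None:
        if abs(result - previous) > delta_limit:   # failed delta check -> hold
            return False
    return True
```

Raising the auto-verified fraction from ~50% to >90%, as in the project above, largely amounts to widening and refining such rule sets so that only genuinely suspect results are held for review.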
The purpose of this SOP is to define the steps involved in data entry and data verification of physical forms. It applies to the data entry and data verification of all physical forms. The procedure defined herein was developed for use in the Arizona NHEXAS project and the "Bor...
Requirements Development for the NASA Advanced Engineering Environment (AEE)
NASA Technical Reports Server (NTRS)
Rogers, Eric; Hale, Joseph P.; Zook, Keith; Gowda, Sanjay; Salas, Andrea O.
2003-01-01
The requirements development process for the Advanced Engineering Environment (AEE) is presented. This environment has been developed to allow NASA to perform independent analysis and design of space transportation architectures and technologies. Given the highly collaborative and distributed nature of AEE, a variety of organizations are involved in the development, operations, and management of the system. Furthermore, there are additional organizations involved representing external customers and stakeholders. Thorough coordination and effective communication are essential to translate desired expectations of the system into requirements. Functional, verifiable requirements for this (and indeed any) system are necessary to fulfill several roles. Requirements serve as a contractual tool, a configuration management tool, and an engineering tool, sometimes simultaneously. The role of requirements as an engineering tool is particularly important because a stable set of requirements for a system provides a common framework of system scope and characterization among team members. Furthermore, the requirements provide the basis for checking completion of system elements and form the basis for system verification. Requirements are at the core of systems engineering. The AEE Project has undertaken a thorough process to translate the desires and expectations of external customers and stakeholders into functional system-level requirements that are captured with sufficient rigor to allow development planning, resource allocation, and system-level design, development, implementation, and verification. These requirements are maintained in an integrated, relational database that provides traceability to governing Program requirements and also to verification methods and subsystem-level requirements.
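The traceability idea described above — each system requirement linked up to a governing program requirement and down to verification methods — lends itself to mechanical coverage checks. A hedged sketch; the record layout and identifiers are illustrative assumptions, not the AEE database schema:

```python
# Illustrative requirements store: each entry links to a parent program
# requirement and lists its verification methods. A requirement with no
# verification method is a gap in the verification basis.
requirements = {
    "SYS-012": {"parent": "PROG-003", "verification": ["analysis", "test"]},
    "SYS-013": {"parent": "PROG-003", "verification": []},
}

def untraced(reqs):
    """Return IDs of requirements lacking any verification method."""
    return sorted(rid for rid, rec in reqs.items() if not rec["verification"])
```

Running such a check routinely is what makes a relational requirements database more than a document: incomplete verification planning surfaces as a query result rather than a review finding.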
Control structural interaction testbed: A model for multiple flexible body verification
NASA Technical Reports Server (NTRS)
Chory, M. A.; Cohen, A. L.; Manning, R. A.; Narigon, M. L.; Spector, V. A.
1993-01-01
Conventional end-to-end ground tests for verification of control system performance become increasingly complicated with the development of large, multiple flexible body spacecraft structures. The expense of accurately reproducing the on-orbit dynamic environment and the attendant difficulties in reducing and accounting for ground test effects limit the value of these tests. TRW has developed a building block approach whereby a combination of analysis, simulation, and test has replaced end-to-end performance verification by ground test. Tests are performed at the component, subsystem, and system level on engineering testbeds. These tests are aimed at authenticating models to be used in end-to-end performance verification simulations: component and subassembly engineering tests and analyses establish models and critical parameters, unit-level engineering and acceptance tests refine models, and subsystem-level tests confirm the models' overall behavior. The Precision Control of Agile Spacecraft (PCAS) project has developed a control structural interaction testbed with a multibody flexible structure to investigate new methods of precision control. This testbed is a model for TRW's approach to verifying control system performance. This approach has several advantages: (1) no allocation for test measurement errors is required, increasing flight hardware design allocations; (2) the approach permits greater latitude in investigating off-nominal conditions and parametric sensitivities; and (3) the simulation approach is cost effective, because the investment is in understanding the root behavior of the flight hardware and not in the ground test equipment and environment.
2011-01-01
Purpose: To verify the dose distribution and number of monitor units (MU) for dynamic treatment techniques such as volumetric modulated single arc radiation therapy (Rapid Arc), each patient treatment plan has to be verified prior to the first treatment. The purpose of this study was to develop a patient-related treatment plan verification protocol using a two-dimensional ionization chamber array (MatriXX, IBA, Schwarzenbruck, Germany). Method: Measurements were done to determine the dependence of the response of the 2D ionization chamber array on beam direction and field size. The reproducibility of the measurements was also checked. For the patient-related verifications, the original patient Rapid Arc treatment plan was projected onto a CT dataset of the MatriXX and the dose distribution was calculated. After irradiation of the Rapid Arc verification plans, measured and calculated 2D dose distributions were compared using the gamma evaluation method implemented in the measuring software OmniPro (version 1.5, IBA, Schwarzenbruck, Germany). Results: The response of the 2D ionization chamber array showed a passing rate of 99% for field sizes between 7 cm × 7 cm and 24 cm × 24 cm for single-arc measurements. For field sizes smaller than 7 cm × 7 cm or larger than 24 cm × 24 cm, the passing rate was less than 99%. The reproducibility was within a passing rate of 99% to 100%. The accuracy of the whole process, including the uncertainties of the measuring system, treatment planning system, linear accelerator, and isocentric laser system in the treatment room, was acceptable for treatment plan verification using gamma criteria of 3% and 3 mm (2D global gamma index). Conclusion: It was possible to verify the 2D dose distribution and MU of Rapid Arc treatment plans using the MatriXX, and its use for Rapid Arc treatment plan verification in clinical routine is reasonable. If the passing rate is at least 99%, the verification protocol is able to detect clinically significant errors. PMID:21342509
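The gamma evaluation referred to above combines a dose-difference criterion (3%) with a distance-to-agreement criterion (3 mm); a reference point passes if the combined metric is ≤ 1 for some nearby evaluated point. A simplified one-dimensional sketch of a global gamma index — the MatriXX/OmniPro implementation is considerably more elaborate:

```python
# Simplified 1-D global gamma index. For each reference dose point, take the
# minimum over evaluated points of the combined dose/distance metric; a point
# passes if its gamma is <= 1.
import math

def gamma_index(ref, evl, spacing_mm, dose_crit=0.03, dist_mm=3.0):
    """Per-point gamma values. 'Global' means the dose criterion is a
    fraction of the reference maximum dose."""
    d_norm = dose_crit * max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = min(
            math.sqrt(((de - dr) / d_norm) ** 2
                      + (((j - i) * spacing_mm) / dist_mm) ** 2)
            for j, de in enumerate(evl)
        )
        gammas.append(best)
    return gammas
```

The passing rate quoted in such protocols is then the fraction of points with gamma ≤ 1; identical measured and calculated profiles yield gamma = 0 everywhere.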
Building the Qualification File of EGNOS with DOORS
NASA Astrophysics Data System (ADS)
Fabre, J.
2008-08-01
EGNOS, the European Satellite-Based Augmentation System (SBAS) to GPS, is reaching its final deployment and being initially operated towards qualification and certification, with operational capability expected by 2008/2009. A very important milestone in the development process is the System Qualification Review (QR). As the verification phase aims at demonstrating that the EGNOS System design meets the applicable requirements, the QR declares the completion of verification activities. The main document to present at QR is a consolidated, consistent, and complete Qualification file. The information included shall give confidence to the QR reviewers that the performed qualification activities are complete. Therefore, an important issue for the project team is to focus on concise and consistent information, and to make the presentation as clear as possible. Traceability to applicable requirements shall be systematically presented. Moreover, in order to support verification justification, references to details shall be available, and the reviewer shall have the possibility of linking automatically to the documents containing this detailed information. In that frame, Thales Alenia Space has implemented strong methodology and tool support, providing the System Engineering and Verification teams with a single reference technical database, in which all team members consult the applicable requirements, compliance, justification, and design data, and record the information necessary to build the final Qualification file. This paper presents the EGNOS context, the Qualification file contents, and the methodology implemented, based on Thales Alenia Space practices and in line with ECSS. Finally, it shows how the Qualification file is built in a DOORS environment.
Comment on 'Propellant production from the Martian atmosphere'
NASA Technical Reports Server (NTRS)
Ruppe, H. O.
1993-01-01
The optimism of the Tauber et al. (1992) note on photosynthetic production of spacecraft fuels from Martian atmospheric gases is presently noted, in conjunction with the need for prior missions' verification of such a system. Two of the original authors reply that their solar cell array assumptions are conservative in light of plausible performance projections for 2010-decade technology.
Design and Verification of Critical Pressurised Windows for Manned Spaceflight
NASA Astrophysics Data System (ADS)
Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.
2014-06-01
The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials, and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection, and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the 'Glass and Façade Technology Research Group' at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed, and subsequently applied to the design of a breadboard window demonstrator. Two fused silica glass window panes were procured and subjected to dedicated analyses, inspection, and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.
NASA. Lewis Research Center Advanced Modulation and Coding Project: Introduction and overview
NASA Technical Reports Server (NTRS)
Budinger, James M.
1992-01-01
The Advanced Modulation and Coding Project at LeRC is sponsored by the Office of Space Science and Applications, Communications Division, Code EC, at NASA Headquarters and conducted by the Digital Systems Technology Branch of the Space Electronics Division. Advanced Modulation and Coding is one of three focused technology development projects within the branch's overall Processing and Switching Program. The program consists of industry contracts for developing proof-of-concept (POC) and demonstration model hardware, university grants for analyzing advanced techniques, and in-house integration and testing of performance verification and systems evaluation. The Advanced Modulation and Coding Project is broken into five elements: (1) bandwidth- and power-efficient modems; (2) high-speed codecs; (3) digital modems; (4) multichannel demodulators; and (5) very high-data-rate modems. At least one contract and one grant were awarded for each element.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, G; Yin, F; Ren, L
Purpose: In order to track tumor movement for patient positioning verification during arc treatment delivery, or in between 3D/IMRT beams for stereotactic body radiation therapy (SBRT), limited-angle kV projection acquisition simultaneously with arc treatment delivery, or in between static treatment beams as the gantry moves to the next beam angle, was proposed. The purpose of this study is to estimate the additional imaging dose resulting from multiple tomosynthesis acquisitions in between static treatment beams and to compare it with that of a conventional kV-CBCT acquisition. Methods: The kV imaging system integrated into Varian TrueBeam accelerators was modeled using the EGSnrc Monte Carlo user code BEAMnrc, and the DOSXYZnrc code was used in dose calculations. The simulated realistic kV beams from the Varian TrueBeam OBI 1.5 system were used to calculate dose to the patient based on CT images. Organ doses were analyzed using DVHs. The imaging dose to the patient resulting from realistic multiple tomosynthesis acquisitions, each with a 25–30 degree kV source rotation, between 6 treatment beam gantry angles was studied. Results: For a typical lung SBRT treatment delivery, the kV imaging dose summed over six realistic tomosynthesis acquisitions, each with a 25–30 degree x-ray source rotation between the six treatment beam gantry angles, was much lower (20–50%) than that from a single CBCT image acquisition. Conclusion: This work indicates that the kV imaging in the proposed Limited-angle Intra-fractional Verification (LIVE) System for SBRT treatments adds negligible imaging dose. It is worth noting that the MV imaging dose caused by MV projection acquisition in between static beams in LIVE can be minimized by restricting the imaging to the target region and reducing the number of projections acquired. For arc treatments, MV imaging acquisition in LIVE does not add additional imaging dose, as the MV images are acquired directly from the treatment beams during the treatment.
Design and Testing of a Prototype Lunar or Planetary Surface Landing Research Vehicle (LPSLRV)
NASA Technical Reports Server (NTRS)
Murphy, Gloria A.
2010-01-01
This handbook describes a two-semester senior design course sponsored by the NASA Office of Education, the Exploration Systems Mission Directorate (ESMD), and the NASA Space Grant Consortium. The course was developed and implemented by the Mechanical and Aerospace Engineering Department (MAE) at Utah State University. The course's final outcome is a packaged senior design course that can be readily incorporated into the instructional curriculum at universities across the country. The course materials adhere to the standards of the Accreditation Board for Engineering and Technology (ABET) and are constructed to be relevant to key research areas identified by ESMD. The design project challenged students to apply systems engineering concepts to define research and training requirements for a terrestrial-based lunar landing simulator. This project developed a flying prototype for a Lunar or Planetary Surface Landing Research Vehicle (LPSLRV). Per NASA specifications, the concept accounts for reduced lunar gravity and allows the terminal stage of lunar descent to be flown either by remote pilot or autonomously. This free-flying platform was designed to be sufficiently flexible to allow both sensor evaluation and pilot training. This handbook outlines the course materials and describes the systems engineering processes developed to facilitate design, fabrication, integration, and testing. This handbook presents sufficient details of the final design configuration to allow an independent group to reproduce the design. The design evolution and details regarding the verification testing used to characterize the system are presented in a separate project final design report. Details of the experimental apparatus used for system characterization may be found in Appendices F, G, and I of that report. A brief summary of the ground testing and systems verification is also included in Appendix A of this report.
Details of the flight tests will be documented in a separate flight test report, which serves as a complement to the course handbook presented here. This project was extremely ambitious, and achieving all of the design and test objectives was a daunting task. The schedule ran slightly longer than a single academic year, with complete design closure not occurring until early April. Integration and verification testing spilled over into late May, and the first flight did not occur until mid to late June. The academic year at Utah State University ended on May 8, 2010. Following the end of the academic year, testing and integration were performed by the faculty advisor, paid research assistants, and volunteer student help.
NASA Astrophysics Data System (ADS)
Kadow, C.; Illing, S.; Kunst, O.; Cubasch, U.
2014-12-01
The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard using the meta information of the self-describing model, reanalysis, and observational data sets. Apache Solr is used for indexing the different data projects into one common search environment. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers, and their tools in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and helps identify discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via shell or web system. Therefore, plugged-in tools automatically gain transparency and reproducibility.
Furthermore, when configurations match while starting an evaluation tool, the system suggests using results already produced by other users, saving CPU time, I/O, and disk space. This study presents the different techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. Website: www-miklip.dkrz.de (visitor login: guest, password: miklip)
NASA Astrophysics Data System (ADS)
Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Ulbrich, Uwe; Cubasch, Ulrich
2015-04-01
The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard using the meta information of the self-describing model, reanalysis, and observational data sets. Apache Solr is used for indexing the different data projects into one common search environment. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers, and their tools in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and helps identify discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via shell or web system. Therefore, plugged-in tools automatically gain transparency and reproducibility.
Furthermore, when configurations match while starting an evaluation tool, the system suggests using results already produced by other users, saving CPU time, I/O, and disk space. This study presents the different techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. Website: www-miklip.dkrz.de (visitor login: click on "Guest")
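The result-reuse behaviour described in both abstracts — a matching configuration returns a stored result instead of recomputing — can be sketched as a cache keyed by tool name and configuration. Class and method names here are illustrative assumptions, not the MiKlip system's API:

```python
# Illustrative result-reuse store: tool runs are keyed by (tool name, frozen
# configuration), so a repeated configuration returns the stored result.
class EvaluationHistory:
    def __init__(self):
        self._store = {}   # (tool_name, frozenset of config items) -> result

    def run(self, tool_name, config, tool_fn):
        key = (tool_name, frozenset(config.items()))
        if key in self._store:            # configuration seen before:
            return self._store[key]       # reuse, saving CPU, I/O, disk
        result = tool_fn(**config)        # otherwise compute and record it
        self._store[key] = result
        return result
```

A production system would persist such records (the abstracts mention a MySQL database) and attach provenance, but the cache-by-configuration idea is the core of the reproducibility and resource-saving claims.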
A web-based system for supporting global land cover data production
NASA Astrophysics Data System (ADS)
Han, Gang; Chen, Jun; He, Chaoying; Li, Songnian; Wu, Hao; Liao, Anping; Peng, Shu
2015-05-01
Global land cover (GLC) data production and verification is a complicated, time-consuming, and labor-intensive process, requiring huge amounts of imagery and ancillary data and involving many people, often from different geographic locations. The efficient integration of various kinds of ancillary data and effective collaborative classification in large-area land cover mapping require advanced supporting tools. This paper presents the design and development of a web-based system for supporting 30-m resolution GLC data production by combining geospatial web services and Computer Supported Collaborative Work (CSCW) technology. Based on an analysis of the functional and non-functional requirements of GLC mapping, a three-tier system model is proposed with four major parts, i.e., multisource data resources, data and function services, interactive mapping, and production management. The prototyping and implementation of the system have been realised with a combination of Open Source Software (OSS) and commercially available off-the-shelf systems. This web-based system not only facilitates the integration of the heterogeneous data and services required by GLC data production, but also provides online access, visualization, and analysis of the images, ancillary data, and interim 30-m global land-cover maps. The system further supports online collaborative quality check and verification workflows. It has been successfully applied to China's 30-m resolution GLC mapping project and has significantly improved the efficiency of GLC data production and verification. The concepts developed through this study should also benefit other GLC or regional land-cover data production efforts.
Validation of Mission Plans Through Simulation
NASA Astrophysics Data System (ADS)
St-Pierre, J.; Melanson, P.; Brunet, C.; Crabtree, D.
2002-01-01
The purpose of a spacecraft mission planning system is to automatically generate safe and optimized mission plans for a single spacecraft, or for several functioning in unison. The system verifies user input syntax, conformance to commanding constraints, absence of duty cycle violations, timing conflicts, state conflicts, etc. Present-day constraint-based systems with state-based predictive models use verification rules derived from expert knowledge. A familiar solution found in Mission Operations Centers is to complement the planning system with a high-fidelity spacecraft simulator. Often a dedicated workstation, the simulator is frequently used for operator training and procedure validation, and may be interfaced to actual control stations with command and telemetry links. While there are distinct advantages to having a planning system offer realistic operator training using the actual flight control console, physical verification of data transfer across layers, and procedure validation, experience has revealed some drawbacks and inefficiencies in ground segment operations. With these considerations, two simulation-based mission plan validation projects are under way at the Canadian Space Agency (CSA): RVMP and ViSION. The tools proposed in these projects will automatically run scenarios and provide execution reports to operations planning personnel prior to actual command upload. This can provide an important safeguard against system or human errors that can only be detected with high-fidelity, interdependent spacecraft models running concurrently. The core element common to these projects is a spacecraft simulator, built with off-the-shelf components such as CAE's Real-Time Object-Based Simulation Environment (ROSE) technology, MathWorks' MATLAB/Simulink, and Analytical Graphics' Satellite Tool Kit (STK).
To complement these tools, additional components were developed, such as an emulated Spacecraft Test and Operations Language (STOL) interpreter and CCSDS TM/TC encoders and decoders. This paper discusses the use of simulation in the context of space mission planning, describes the projects under way, and proposes additional avenues of investigation and development.
The concept verification testing of materials science payloads
NASA Technical Reports Server (NTRS)
Griner, C. S.; Johnston, M. H.; Whitaker, A.
1976-01-01
The Concept Verification Testing (CVT) project at the Marshall Space Flight Center, Alabama, is a developmental activity that supports Shuttle payload projects such as Spacelab. It provides an operational 1-g environment for testing NASA and other agency experiment and support system concepts that may be used in the Shuttle. A dedicated Materials Science Payload was tested in the General Purpose Laboratory to assess the requirements of a space processing payload on a Spacelab-type facility. Physical and functional integration of the experiments into the facility was studied, and the impact of the experiments on the facility (and vice versa) was evaluated. A follow-up test, designated CVT Test IVA, was also held. The purpose of this test was to repeat the Test IV experiments with a crew composed of selected and trained scientists. These personnel were not required to have prior knowledge of the materials science disciplines, but were required to have a basic knowledge of science and the scientific method.
GENERIC VERIFICATION PROTOCOL FOR AQUEOUS CLEANER RECYCLING TECHNOLOGIES
This generic verification protocol has been structured based on a format developed for ETV-MF projects. This document describes the intended approach and explains plans for testing with respect to areas such as test methodology, procedures, parameters, and instrumentation. Also ...
7 CFR 272.8 - State income and eligibility verification system.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 4 2010-01-01 2010-01-01 false State income and eligibility verification system. 272... PARTICIPATING STATE AGENCIES § 272.8 State income and eligibility verification system. (a) General. (1) State agencies may maintain and use an income and eligibility verification system (IEVS), as specified in this...
Microcode Verification Project.
1980-05-01
numerical constant. The internal syntax for these minimum and maximum values is REALMIN and REALMAX. ISPSSIMP is the file simplifying bitstring... To be fair, it is quite clear that much of the labor in the verification task can be reduced if verification and code development are carried out... and the language we have chosen for both encoding our descriptions of machines and reasoning about the course of computations. Internally, our
NASA Technical Reports Server (NTRS)
Bowley, C. J.; Barnes, J. C.; Rango, A.
1981-01-01
The purpose of the handbook is to update the various snowcover interpretation techniques, document the snow mapping techniques used in the various ASVT study areas, and describe the ways snowcover data have been applied to runoff prediction. Through documentation in handbook form, the methodology developed in the Snow Mapping ASVT can be applied to other areas.
2013-04-01
The project was to provide the Royal Canadian Navy (RCN) with a set of guidelines on analysis, design, and verification processes for effective room... design, and verification processes that should be used in the development of effective room layouts for Royal Canadian Navy (RCN) ships. The primary... designed CSC; however, the guidelines could be applied to the design of any multiple-operator space in any RCN vessel. Results: The development of
Requirement Assurance: A Verification Process
NASA Technical Reports Server (NTRS)
Alexander, Michael G.
2011-01-01
Requirement Assurance is an act of requirement verification that assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "Did the product meet the stated specification, performance, or design documentation?" In order to ensure the system was built correctly, the practicing systems engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts", which are the objective evidence needed to prove that the product requirements meet the verification success criteria. Institutional direction is given to the systems engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
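The closure logic described above — a requirement counts as verified only when a recognized method has produced objective closure artifacts — can be sketched as a small data structure. The field names are illustrative, not drawn from NPR 7123.1A:

```python
# Illustrative closure-tracking record: a requirement is verified only when
# its verification method is one of the four recognized methods and at least
# one closure artifact (objective evidence) has been recorded.
from dataclasses import dataclass, field

VALID_METHODS = {"inspection", "analysis", "demonstration", "test"}

@dataclass
class Requirement:
    req_id: str
    method: str
    artifacts: list = field(default_factory=list)   # closure evidence

    def is_verified(self) -> bool:
        return self.method in VALID_METHODS and len(self.artifacts) > 0
```

Iterating such records gives the kind of requirement-assurance status report the abstract describes: every open requirement is one with an empty artifact list or an unrecognized method.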
1988-01-01
infiltration studies (Westerdahl and Skogerboe 1982). Extensive field verification studies have been conducted with the WES Rainfall Simulator... Lysimeter System on a wide range of USACE project sites (Westerdahl and Skogerboe 1982, Lee and Skogerboe 1984, Skogerboe et al. 1987). The WES Rainfall... "Criteria for Water 1986," Criteria and Standards Division, Washington, DC. Westerdahl, H. E., and Skogerboe, J. G. 1982. "Realistic Rainfall and Water
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, Phillip A.; O'Hagan, Ryan; Shumaker, Brent
The Advanced Test Reactor (ATR) has always had a comprehensive procedure to verify the performance of its critical transmitters and sensors, including RTDs and pressure, level, and flow transmitters. These transmitters and sensors have been periodically tested for response time and calibration verification to ensure accuracy. With the implementation of online monitoring (OLM) techniques at ATR, the calibration verification and response time testing of these transmitters and sensors are performed remotely, automatically, and hands-off, cover more portions of the system, and can be carried out at almost any time during process operations. The work was done under a DOE-funded SBIR project carried out by AMS. As a result, ATR is now able to save the manpower that has been spent over the years on manual calibration verification and response time testing of its temperature and pressure sensors and refocus those resources towards other equipment reliability needs. More importantly, the implementation of OLM will help enhance overall availability, safety, and efficiency. Together with the equipment reliability programs of ATR, the integration of OLM will also help with the I&C aging management goals of the Department of Energy and the long-term operation of ATR.
The role of criteria in design and management of space systems
NASA Technical Reports Server (NTRS)
Blair, J. C.; Ryan, R. S.
1992-01-01
Explicit requirements and standards arising in connection with space systems management serve as a framework for technical management and furnish legally binding control of development, verification, and operations. As a project develops, additional requirements unique to the system in question are derived; these are designated 'derived requirements'. The reliability and cost-effectiveness of a space system are best ensured where a balance has been struck between formal (legally binding) and informal requirements. Attention is presently given to the development of criteria consistent with total quality management.
Zhang, Yawei; Deng, Xinchen; Yin, Fang-Fang; Ren, Lei
2018-01-01
Limited-angle intrafraction verification (LIVE) has been previously developed for four-dimensional (4D) intrafraction target verification either during arc delivery or between three-dimensional (3D)/IMRT beams. Preliminary studies showed that LIVE can accurately estimate the target volume using kV/MV projections acquired over orthogonal view 30° scan angles. Currently, the LIVE imaging acquisition requires slow gantry rotation and is not clinically optimized. The goal of this study is to optimize the image acquisition parameters of LIVE for different patient respiratory periods and gantry rotation speeds for the effective clinical implementation of the system. Limited-angle intrafraction verification imaging acquisition was optimized using a digital anthropomorphic phantom (XCAT) with simulated respiratory periods varying from 3 s to 6 s and gantry rotation speeds varying from 1°/s to 6°/s. LIVE scanning time was optimized by minimizing the number of respiratory cycles needed for the four-dimensional scan, and imaging dose was optimized by minimizing the number of kV and MV projections needed for four-dimensional estimation. The estimation accuracy was evaluated by calculating both the center-of-mass-shift (COMS) and three-dimensional volume-percentage-difference (VPD) between the tumor in estimated images and the ground truth images. The robustness of LIVE was evaluated with varied respiratory patterns, tumor sizes, and tumor locations in XCAT simulation. A dynamic thoracic phantom (CIRS) was used to further validate the optimized imaging schemes from XCAT study with changes of respiratory patterns, tumor sizes, and imaging scanning directions. Respiratory periods, gantry rotation speeds, number of respiratory cycles scanned and number of kV/MV projections acquired were all positively correlated with the estimation accuracy of LIVE. 
Faster gantry rotation speeds or longer respiratory periods allowed fewer respiratory cycles to be scanned and fewer kV/MV projections to be acquired while still estimating the target volume accurately. To minimize scanning time, for patient respiratory periods of 3-4 s, gantry rotation speeds of 1°/s, 2°/s, and 3-6°/s required scanning five, four, and three respiratory cycles, respectively. For patient respiratory periods of 5-6 s, the required cycles dropped to four, three, and two, respectively. To minimize imaging dose, for patient respiratory periods of 3-4 s, gantry rotation speeds of 1°/s, 2-4°/s, and 5-6°/s required acquiring 7, 5, and 4 kV and MV projections, respectively. For patient respiratory periods of 5-6 s, 5 kV and 5 MV projections were sufficient at all gantry rotation speeds. The optimized LIVE system was robust against changes in breathing pattern, tumor size, and tumor location. In the CIRS study, the optimized LIVE system achieved an average center-of-mass-shift (COMS)/volume-percentage-difference (VPD) of 0.3 ± 0.1 mm/7.7 ± 2.0% for the scanning-time-priority case and 0.2 ± 0.1 mm/6.1 ± 1.2% for the imaging-dose-priority case, across all gantry rotation speeds tested. LIVE was also robust across the scanning directions investigated. The LIVE system has been preliminarily optimized for different patient respiratory periods and treatment gantry rotation speeds using digital and physical phantoms. The optimized imaging parameters, including the number of respiratory cycles scanned and the number of kV/MV projections acquired, provide guidelines for optimizing the scanning time and imaging dose of the LIVE system for future evaluations and clinical implementation through patient studies. © 2017 American Association of Physicists in Medicine.
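The scan-time-priority results quoted above amount to a small lookup table; the sketch below encodes them as a function (the function name and interface are illustrative, not from the paper):

```python
def required_cycles(resp_period_s: float, gantry_speed_dps: float) -> int:
    """Respiratory cycles to scan, per the scan-time-priority results:
    periods of 3-4 s need 5/4/3 cycles at 1, 2, and 3-6 deg/s;
    periods of 5-6 s need 4/3/2 cycles at the same speeds."""
    if 3 <= resp_period_s <= 4:
        if gantry_speed_dps < 2:
            return 5
        if gantry_speed_dps < 3:
            return 4
        return 3
    if 5 <= resp_period_s <= 6:
        if gantry_speed_dps < 2:
            return 4
        if gantry_speed_dps < 3:
            return 3
        return 2
    raise ValueError("respiratory period outside the studied range (3-6 s)")

assert required_cycles(3, 1) == 5
assert required_cycles(4, 2) == 4
assert required_cycles(5, 6) == 2
```

Encoding published parameter tables this way makes the clinical guideline executable and checkable, rather than something to re-read from the paper at the console.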
NASA Technical Reports Server (NTRS)
Powell, John D.
2003-01-01
This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.
Sustainable, Reliable Mission-Systems Architecture
NASA Technical Reports Server (NTRS)
O'Neil, Graham; Orr, James K.; Watson, Steve
2005-01-01
A mission-systems architecture, based on a highly modular infrastructure utilizing open-standards hardware and software interfaces as the enabling technology, is essential for affordable and sustainable space exploration programs. This mission-systems architecture requires (a) robust communication between heterogeneous systems, (b) high reliability, (c) minimal mission-to-mission reconfiguration, (d) affordable development, system integration, and verification of systems, and (e) minimal sustaining engineering. This paper proposes such an architecture. Lessons learned from the Space Shuttle program and Earthbound complex engineered systems are applied to define the model. Technology projections reaching out 5 years are made to refine model details.
Sustainable, Reliable Mission-Systems Architecture
NASA Technical Reports Server (NTRS)
O'Neil, Graham; Orr, James K.; Watson, Steve
2007-01-01
A mission-systems architecture, based on a highly modular infrastructure utilizing open-standards hardware and software interfaces as the enabling technology, is essential for affordable and sustainable space exploration programs. This mission-systems architecture requires (a) robust communication between heterogeneous systems, (b) high reliability, (c) minimal mission-to-mission reconfiguration, (d) affordable development, system integration, and verification of systems, and (e) minimal sustaining engineering. This paper proposes such an architecture. Lessons learned from the Space Shuttle program and Earthbound complex engineered systems are applied to define the model. Technology projections reaching out 5 years are made to refine model details.
Zhang, Yawei; Yin, Fang-Fang; Zhang, You; Ren, Lei
2017-05-07
The purpose of this study is to develop an adaptive prior knowledge guided image estimation technique to reduce the scan angle needed in the limited-angle intrafraction verification (LIVE) system for 4D-CBCT reconstruction. The LIVE system has been previously developed to reconstruct 4D volumetric images on-the-fly during arc treatment for intrafraction target verification and dose calculation. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scanning angle needed to reconstruct the 4D-CBCT images for faster intrafraction verification. This technique uses free form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on kV-MV projections acquired in extremely limited angle (orthogonal 3°) during the treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment to utilize the continuity of the respiratory motion. The 4D digital extended-cardiac-torso (XCAT) phantom and a CIRS 008A dynamic thoracic phantom were used to evaluate the effectiveness of this technique. The reconstruction accuracy of the technique was evaluated by calculating both the center-of-mass-shift (COMS) and 3D volume-percentage-difference (VPD) of the tumor in reconstructed images and the true on-board images. The performance of the technique was also assessed with varied breathing signals against scanning angle, lesion size, lesion location, projection sampling interval, and scanning direction. In the XCAT study, using orthogonal-view of 3° kV and portal MV projections, this technique achieved an average tumor COMS/VPD of 0.4 ± 0.1 mm/5.5 ± 2.2%, 0.6 ± 0.3 mm/7.2 ± 2.8%, 0.5 ± 0.2 mm/7.1 ± 2.6%, 0.6 ± 0.2 mm/8.3 ± 2.4%, for baseline drift, amplitude variation, phase shift, and patient breathing signal variation, respectively. 
In the CIRS phantom study, this technique achieved an average tumor COMS/VPD of 0.7 ± 0.1 mm/7.5 ± 1.3% for a 3 cm lesion and 0.6 ± 0.2 mm/11.4 ± 1.5% for a 2 cm lesion in the baseline drift case. The average tumor COMS/VPD were 0.5 ± 0.2 mm/10.8 ± 1.4%, 0.4 ± 0.3 mm/7.3 ± 2.9%, 0.4 ± 0.2 mm/7.4 ± 2.5%, and 0.4 ± 0.2 mm/7.3 ± 2.8% for the four real patient breathing signals, respectively. Results demonstrated that the adaptive prior knowledge guided image estimation technique in the LIVE system is robust against scanning angle, lesion size, lesion location, and scanning direction. It can estimate on-board images accurately with as few as 6 projections over an orthogonal-view 3° angle. In conclusion, the adaptive prior knowledge guided image reconstruction technique accurately estimates 4D-CBCT images using extremely limited angles and projections. This technique greatly improves the efficiency and accuracy of the LIVE system for ultrafast 4D intrafraction verification of lung SBRT treatments.
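The two accuracy metrics used throughout these studies, center-of-mass shift (COMS) and volume percentage difference (VPD), can be sketched on binary voxel volumes. This is a Python illustration; the VPD formula shown is one common definition (non-overlapping volume relative to the ground truth) and may not match the authors' exact implementation:

```python
import math

def center_of_mass(voxels):
    """Centroid of a binary volume given as a set of (x, y, z) voxels."""
    n = len(voxels)
    return tuple(sum(v[i] for v in voxels) / n for i in range(3))

def coms(vol_a, vol_b):
    """Center-of-mass shift (voxel units) between two binary volumes."""
    return math.dist(center_of_mass(vol_a), center_of_mass(vol_b))

def vpd(estimated, truth):
    """Volume percentage difference: symmetric-difference volume
    relative to the ground-truth volume."""
    return 100.0 * len(estimated ^ truth) / len(truth)

# A 4x4x4 cube shifted by one voxel along x: COMS = 1 voxel, VPD = 50%.
truth = {(x, y, z) for x in range(4) for y in range(4) for z in range(4)}
shifted = {(x + 1, y, z) for (x, y, z) in truth}
assert coms(truth, shifted) == 1.0
assert vpd(shifted, truth) == 50.0
```

With a real CT geometry one would multiply by the voxel size to report COMS in millimetres, as the papers do.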
Static Verification for Code Contracts
NASA Astrophysics Data System (ADS)
Fähndrich, Manuel
The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.
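The runtime contract checking mentioned above can be imitated in miniature. The sketch below is a Python analogue of a Contract.Requires-style precondition, written as a decorator; it is not the actual .NET Code Contracts API, only an illustration of the idea of authoring specifications in the host language:

```python
import functools

def requires(pred, msg="precondition failed"):
    """Attach a runtime-checked precondition to a function
    (loosely analogous to Code Contracts' Contract.Requires)."""
    def deco(fn):
        @functools.wraps(fn)
        def wrapped(*args, **kwargs):
            if not pred(*args, **kwargs):
                raise AssertionError(msg)
            return fn(*args, **kwargs)
        return wrapped
    return deco

@requires(lambda xs: len(xs) > 0, "list must be non-empty")
def head(xs):
    return xs[0]

assert head([3, 1]) == 3
```

A static checker, as in the Code Contracts tooling, would try to discharge the same predicate at compile time instead of raising at run time.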
NASA Technical Reports Server (NTRS)
Culbert, Chris; French, Scott W.; Hamilton, David
1994-01-01
Knowledge-based systems (KBSs) are in general use in a wide variety of domains, both commercial and government. As reliance on these types of systems grows, the need to assess their quality and validity reaches critical importance. As with any software, the reliability of a KBS can be directly attributed to the application of disciplined programming and testing practices throughout the development life-cycle. However, there are some essential differences between conventional software and KBSs, both in construction and use. The identification of these differences affects the verification and validation (V&V) process and the development of techniques to handle them. The recognition of these differences is the basis of considerable on-going research in this field. For the past three years IBM (Federal Systems Company - Houston) and the Software Technology Branch (STB) of NASA/Johnson Space Center have been working to improve the 'state of the practice' in V&V of knowledge-based systems. This work was motivated by the need to maintain NASA's ability to produce high quality software while taking advantage of new KBS technology. To date, the primary accomplishment has been the development and teaching of a four-day workshop on KBS V&V. With the hope of improving the impact of these workshops, we also worked directly with NASA KBS projects to employ concepts taught in the workshop. This paper describes two projects that were part of this effort. In addition to describing each project, this paper describes problems encountered and solutions proposed in each case, with particular emphasis on implications for transferring KBS V&V technology beyond the NASA domain.
Comparative verification between GEM model and official aviation terminal forecasts
NASA Technical Reports Server (NTRS)
Miller, Robert G.
1988-01-01
The Generalized Exponential Markov (GEM) model uses the local standard airways observation (SAO) to predict hour-by-hour the following elements: temperature, pressure, dew point depression, first and second cloud-layer height and amount, ceiling, total cloud amount, visibility, wind, and present weather conditions. GEM is superior to persistence at all projections for all elements in a large independent sample. A minute-by-minute GEM forecasting system utilizing the Automated Weather Observation System (AWOS) is under development.
Multi-Mission System Architecture Platform: Design and Verification of the Remote Engineering Unit
NASA Technical Reports Server (NTRS)
Sartori, John
2005-01-01
The Multi-Mission System Architecture Platform (MSAP) represents an effort to bolster efficiency in the spacecraft design process. By incorporating essential spacecraft functionality into a modular, expandable system, the MSAP provides a foundation on which future spacecraft missions can be developed. Once completed, the MSAP will provide support for missions with varying objectives, while maintaining a level of standardization that will minimize redesign of general system components. One subsystem of the MSAP, the Remote Engineering Unit (REU), functions by gathering engineering telemetry from strategic points on the spacecraft and providing these measurements to the spacecraft's Command and Data Handling (C&DH) subsystem. Before the MSAP Project reaches completion, all hardware, including the REU, must be verified. However, the speed and complexity of the REU circuitry rules out the possibility of physical prototyping. Instead, the MSAP hardware is designed and verified using the Verilog Hardware Description Language (HDL). An increasingly popular means of digital design, HDL programming provides a level of abstraction that allows the designer to focus on functionality while logic synthesis tools take care of gate-level design and optimization. As verification of the REU proceeds, errors are quickly remedied, preventing costly changes during hardware validation. After undergoing the careful, iterative processes of verification and validation, the REU and MSAP will prove their readiness for use in a multitude of spacecraft missions.
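The abstraction the abstract describes, a behavioral description checked against a gate-level realization, can be illustrated outside Verilog. Below is a Python sketch comparing a behavioral 4-bit adder with a ripple-carry gate-level one and verifying them equivalent exhaustively (the names and the 4-bit width are illustrative, not from the MSAP project):

```python
def adder_behavioral(a: int, b: int, width: int = 4) -> int:
    """Behavioral model: arithmetic with wrap-around at `width` bits."""
    return (a + b) % (1 << width)

def full_adder(a: int, b: int, cin: int):
    """One-bit full adder built only from gate primitives."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def adder_gate_level(a: int, b: int, width: int = 4) -> int:
    """Gate-level model: a ripple-carry chain of full adders."""
    result, carry = 0, 0
    for i in range(width):
        s, carry = full_adder((a >> i) & 1, (b >> i) & 1, carry)
        result |= s << i
    return result

# Exhaustive equivalence check, feasible at this small width.
assert all(adder_behavioral(a, b) == adder_gate_level(a, b)
           for a in range(16) for b in range(16))
```

For real HDL designs the same idea scales into simulation-based regression and formal equivalence checking, since exhaustive enumeration stops being feasible.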
International Space Station Passive Thermal Control System Analysis, Top Ten Lessons-Learned
NASA Technical Reports Server (NTRS)
Iovine, John
2011-01-01
The International Space Station (ISS) has been on-orbit for over 10 years, and there have been numerous technical challenges along the way from design to assembly to on-orbit anomalies and repairs. The Passive Thermal Control System (PTCS) management team has been a key player in successfully dealing with these challenges. The PTCS team performs thermal analysis in support of design and verification, launch and assembly constraints, integration, sustaining engineering, failure response, and model validation. This analysis is a significant body of work and provides a unique opportunity to compile a wealth of real world engineering and analysis knowledge and the corresponding lessons-learned. The analysis lessons encompass the full life cycle of flight hardware from design to on-orbit performance and sustaining engineering. These lessons can provide significant insight for new projects and programs. Key areas to be presented include thermal model fidelity, verification methods, analysis uncertainty, and operations support.
NASA Astrophysics Data System (ADS)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.
2017-06-01
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
Multi-canister overpack project -- verification and validation, MCNP 4A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldmann, L.H.
This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of verification run(s): This software must be compiled specifically for the machine on which it is to be used. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison of the new output file and the old output file. Any difference between the files will cause a verification error. Because of the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
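The installation-verification scheme described, rerunning the sample problems and file-comparing outputs against references, can be sketched as follows. The directory layout and `.out` extension are hypothetical; this is not the actual MCNP tooling:

```python
import filecmp
from pathlib import Path

def verify_installation(new_dir: str, ref_dir: str) -> list:
    """Compare each freshly generated sample-problem output against its
    reference copy; return the names of files that are missing or differ.
    An empty list means the installation is verified."""
    failures = []
    for ref in sorted(Path(ref_dir).glob("*.out")):
        new = Path(new_dir) / ref.name
        # shallow=False forces a byte-by-byte comparison, not just stat().
        if not new.exists() or not filecmp.cmp(new, ref, shallow=False):
            failures.append(ref.name)
    return failures
```

As the abstract notes, a reported difference is a prompt to inspect the files, not necessarily a defect: benign variations (timestamps, machine-dependent formatting) also fail a byte-wise comparison.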
Concept Verification Test - Evaluation of Spacelab/Payload operation concepts
NASA Technical Reports Server (NTRS)
Mcbrayer, R. O.; Watters, H. H.
1977-01-01
The Concept Verification Test (CVT) procedure is used to study Spacelab operational concepts by conducting mission simulations in a General Purpose Laboratory (GPL) which represents a possible design of Spacelab. In conjunction with the laboratory, a Mission Development Simulator, a Data Management System Simulator, a Spacelab Simulator, and a Shuttle Interface Simulator have been designed. (The Spacelab Simulator is more functionally and physically representative of the Spacelab than the GPL.) Four simulations of Spacelab mission experimentation were performed: two involving several scientific disciplines, one involving life sciences, and the last involving material sciences. The purpose of the CVT project is to support the pre-design and development of payload carriers and payloads, and to coordinate the hardware, software, and operational concepts of different developers and users.
Independent Verification and Validation of Complex User Interfaces: A Human Factors Approach
NASA Technical Reports Server (NTRS)
Whitmore, Mihriban; Berman, Andrea; Chmielewski, Cynthia
1996-01-01
The Usability Testing and Analysis Facility (UTAF) at the NASA Johnson Space Center has identified and evaluated a potential automated software interface inspection tool capable of assessing the degree to which space-related critical and high-risk software system user interfaces meet objective human factors standards across each NASA program and project. Testing consisted of two distinct phases. Phase 1 compared analysis times and similarity of results for the automated tool and for human-computer interface (HCI) experts. In Phase 2, HCI experts critiqued the prototype tool's user interface. Based on this evaluation, it appears that a more fully developed version of the tool will be a promising complement to a human factors-oriented independent verification and validation (IV&V) process.
Exomars Mission Verification Approach
NASA Astrophysics Data System (ADS)
Cassi, Carlo; Gilardi, Franco; Bethge, Boris
According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. The first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an Entry, Descent & Landing Demonstrator Module (EDM) and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. The second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples, investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration, and verification of the ESA ExoMars modules, i.e. the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process for the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints.
The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the flow of verification activities and the sharing of tests between the different levels (system, modules, subsystems, etc.), and giving an overview of the main tests defined at spacecraft level. The paper focuses mainly on the verification aspects of the EDL Demonstrator Module and the Rover Module, for which an intense testing activity without previous heritage in Europe is foreseen. In particular, the Descent Module has to survive Mars atmospheric entry and landing, and its surface platform has to stay operational for 8 sols on the Martian surface, transmitting scientific data to the Orbiter. The Rover Module has to perform a 180-sol mission in the Mars surface environment. These operative conditions cannot be verified by analysis alone; consequently, a test campaign is defined, including mechanical tests to simulate the entry loads, thermal tests in a Mars environment, and the simulation of Rover operations on a 'Mars-like' terrain. Finally, the paper presents an overview of the documentation flow defined to ensure the correct translation of the mission requirements into verification activities (test, analysis, review of design) up to the final verification close-out of the above requirements with the final verification reports.
Interface Management for a NASA Flight Project Using Model-Based Systems Engineering (MBSE)
NASA Technical Reports Server (NTRS)
Vipavetz, Kevin; Shull, Thomas A.; Infeld, Samatha; Price, Jim
2016-01-01
The goal of interface management is to identify, define, control, and verify interfaces; ensure compatibility; provide an efficient system development; be on time and within budget; while meeting stakeholder requirements. This paper will present a successful seven-step approach to interface management used in several NASA flight projects. The seven-step approach using Model Based Systems Engineering will be illustrated by interface examples from the Materials International Space Station Experiment-X (MISSE-X) project. The MISSE-X was being developed as an International Space Station (ISS) external platform for space environmental studies, designed to advance the technology readiness of materials and devices critical for future space exploration. Emphasis will be given to best practices covering key areas such as interface definition, writing good interface requirements, utilizing interface working groups, developing and controlling interface documents, handling interface agreements, the use of shadow documents, the importance of interface requirement ownership, interface verification, and product transition.
A Model-Based Approach to Engineering Behavior of Complex Aerospace Systems
NASA Technical Reports Server (NTRS)
Ingham, Michel; Day, John; Donahue, Kenneth; Kadesch, Alex; Kennedy, Andrew; Khan, Mohammed Omair; Post, Ethan; Standley, Shaun
2012-01-01
One of the most challenging yet poorly defined aspects of engineering a complex aerospace system is behavior engineering, including definition, specification, design, implementation, and verification and validation of the system's behaviors. This is especially true for behaviors of highly autonomous and intelligent systems. Behavior engineering is more of an art than a science. As a process it is generally ad-hoc, poorly specified, and inconsistently applied from one project to the next. It uses largely informal representations, and results in system behavior being documented in a wide variety of disparate documents. To address this problem, JPL has undertaken a pilot project to apply its institutional capabilities in Model-Based Systems Engineering to the challenge of specifying complex spacecraft system behavior. This paper describes the results of the work in progress on this project. In particular, we discuss our approach to modeling spacecraft behavior including 1) requirements and design flowdown from system-level to subsystem-level, 2) patterns for behavior decomposition, 3) allocation of behaviors to physical elements in the system, and 4) patterns for capturing V&V activities associated with behavioral requirements. We provide examples of interesting behavior specification patterns, and discuss findings from the pilot project.
Study of techniques for redundancy verification without disrupting systems, phases 1-3
NASA Technical Reports Server (NTRS)
1970-01-01
The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.
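One classic device for verifying redundant equipment without disrupting it, consistent with the report's theme, is a majority voter with a disagreement flag: the voted output keeps the system running while the flag reveals that a redundant channel has failed. A minimal sketch of triple-modular-redundancy voting (illustrative, not taken from the report):

```python
def tmr_vote(a: int, b: int, c: int):
    """Bitwise majority vote over three redundant channels.
    Returns (voted_output, disagreement): the voted output masks a
    single-channel fault, while the disagreement flag lets monitoring
    detect that fault without interrupting operation."""
    voted = (a & b) | (b & c) | (a & c)
    disagreement = not (a == b == c)
    return voted, disagreement

# All channels healthy: output passes through, no flag.
assert tmr_vote(0b1010, 0b1010, 0b1010) == (0b1010, False)
# One channel faulted: output is still correct, but the flag is raised.
assert tmr_vote(0b1010, 0b1110, 0b1010) == (0b1010, True)
```

The disagreement flag is the non-disruptive verification signal: it can be logged or telemetered while the voted output continues to serve the system.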
Hydrologic data-verification management program plan
Alexander, C.W.
1982-01-01
Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program, using multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
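A screen file of verification criteria driving an automated check, as proposed above, can be sketched very simply. The record layout, parameter codes, and limits here are hypothetical, chosen only to illustrate the idea:

```python
def screen(records, criteria):
    """Flag records whose value falls outside the screen-file limits.
    `criteria` maps a parameter code to an allowed (low, high) range;
    `records` are (station, parameter, value) tuples. Records for
    parameters without criteria pass through unflagged."""
    flagged = []
    for station, param, value in records:
        lo, hi = criteria.get(param, (float("-inf"), float("inf")))
        if not (lo <= value <= hi):
            flagged.append((station, param, value))
    return flagged

criteria = {"stage_ft": (0.0, 30.0)}          # hypothetical screen file
data = [("01646500", "stage_ft", 12.3),
        ("01646500", "stage_ft", 45.0)]        # out of range
assert screen(data, criteria) == [("01646500", "stage_ft", 45.0)]
```

Flagged records would then go to manual review before release to user-accessible files, which matches the workflow the abstract proposes: the computer screens, a hydrographer decides.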
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, R.; Grace, W.
1996-07-01
This is the final report of a one-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). We won a 1994 R&D 100 Award for inventing the Bartas Iris Verification System. The system has been delivered to a sponsor and is no longer available to us. This technology can verify the identity of a person for purposes of access control, national security, law enforcement, forensics, counter-terrorism, and medical, financial, or scholastic records. The technique is non-invasive, psychologically acceptable, works in real-time, and obtains more biometric data than any other biometric except DNA analysis. This project sought to develop a new, second-generation prototype instrument.
Formalizing New Navigation Requirements for NASA's Space Shuttle
NASA Technical Reports Server (NTRS)
DiVito, Ben L.
1996-01-01
We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CRs) were selected as promising targets to demonstrate the utility of formal methods in this demanding application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this industrial usage report. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During a limited analysis conducted on the formal specifications, numerous requirements issues were discovered. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.
Performance verification testing of the UltraStrip Systems, Inc., Mobile Emergency Filtration System (MEFS) was conducted under EPA's Environmental Technology Verification (ETV) Program at the EPA Test and Evaluation (T&E) Facility in Cincinnati, Ohio, during November, 2003, thr...
Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers
NASA Technical Reports Server (NTRS)
Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.
1983-01-01
A number of methodologies for verifying systems and computer based tools that assist users in verifying their systems were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: STP theorem prover; design verification of SIFT; high level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. J. Appel
This cleanup verification package documents completion of remedial action for the 118-F-3, Minor Construction Burial Ground waste site. This site was an open field covered with cobbles, with no vegetation growing on the surface. The site received irradiated reactor parts removed during conversion of the 105-F Reactor safety systems from the Liquid 3X to the Ball 3X Project, mostly vertical safety rod thimbles and step plugs.
2017-09-01
Report AFRL-RI-RS-TR-2017-178 (sponsor/monitor: AFRL/RI). The report was cleared for public release by the 88th ABW, Wright-Patterson AFB Public Affairs Office and is available to the general public. Distribution statement: Approved for Public Release; Distribution Unlimited. Keywords: Formal Verification, Red Team, High Assurance Cyber Military Systems.
To support the Nation's Homeland Security Program, this U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) project is conducted to verify the performance of commercially available products, methods, and equipment for decontamination of hard and...
DOT National Transportation Integrated Search
2008-06-30
The following Independent Verification and Validation (IV&V) report documents and presents the results of a study of the Washington State Ferries Prototype Wireless High Speed Data Network. The purpose of the study was to evaluate and determine if re...
Research on key technology of the verification system of steel rule based on vision measurement
NASA Astrophysics Data System (ADS)
Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun
2018-01-01
The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, yields low precision and low efficiency. A machine-vision-based verification system for steel rules was designed with reference to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measuring results strongly prove that these methods not only meet the precision requirements of the verification regulation, but also improve the reliability and efficiency of the verification system.
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there is no standardized and accepted means of formal property definition to serve as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed, based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
This is an ESTE project summary brief. EPA's Environmental Technology Verification Program (ETV) is verifying the performance of portable optical and thermal imaging devices for leak detection at petroleum refineries and chemical plants. Industrial facilities, such as chemical p...
Quality Assurance Project Plan for Verification of Sediment Ecotoxicity Assessment Ring(SEA Ring)
The objective of the verification is to test the efficacy and ability of the Sediment Ecotoxicity Assessment Ring (SEA Ring) to evaluate the toxicity of contaminants in the sediment, at the sediment-water interface, and in the water column (WC) to organisms that live in those respective environments.
Verification of Space Weather Forecasts using Terrestrial Weather Approaches
NASA Astrophysics Data System (ADS)
Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.
2015-12-01
The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of the forecasts, and to assess the added value the forecaster provides. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times and assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modeling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete-time Markov chains to assess and improve the performance of our geomagnetic storm forecasts.
We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help MOSWOC forecasters view verification results in near real-time; plans to objectively assess flare forecasts under the EU Horizon 2020 FLARECAST project; and summarise ISES efforts to achieve consensus on verification.
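As an illustrative aside (not drawn from the abstract above), the 2x2 contingency-table evaluation it describes reduces to a few standard scores; the counts below are invented sample data, and the exact MOSWOC computation is not reproduced here:

```python
# Illustrative sketch: verification scores from a 2x2 contingency table,
# as used for hit/miss evaluation such as CME arrival-time forecasts.
# Table entries: a = hits, b = false alarms, c = misses, d = correct rejections.

def contingency_scores(a, b, c, d):
    """Return common categorical verification scores for a 2x2 table."""
    n = a + b + c + d
    pod = a / (a + c)  # probability of detection (hit rate)
    far = b / (a + b)  # false alarm ratio
    # Heidke skill score: fraction correct relative to random chance.
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    hss = (a + d - expected) / (n - expected)
    return {"POD": pod, "FAR": far, "HSS": hss}

# Invented sample counts for demonstration only.
scores = contingency_scores(a=18, b=7, c=5, d=20)
print(scores)  # POD ≈ 0.78, FAR = 0.28, HSS = 0.52
```

Skill scores such as HSS are useful precisely because they discount forecasts that would verify by chance, which is what makes climatology and persistence meaningful benchmarks.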
Evaluation on Cost Overrun Risks of Long-distance Water Diversion Project Based on SPA-IAHP Method
NASA Astrophysics Data System (ADS)
Yuanyue, Yang; Huimin, Li
2018-02-01
Large investment, long routes, and frequent change orders are the main causes of cost overrun in long-distance water diversion projects. Building on existing research, this paper constructs a full-process cost overrun risk evaluation index system for water diversion projects, applies the SPA-IAHP method to build a cost overrun risk evaluation model, and calculates and ranks the weight of each risk evaluation index. Finally, the cost overrun risks are comprehensively evaluated by calculating the linkage measure, and a comprehensive risk level is obtained. The SPA-IAHP method evaluates risks accurately and with high reliability. Case calculation and verification show that it can provide valid cost overrun decision-making information to construction companies.
Study on verifying the angle measurement performance of the rotary-laser system
NASA Astrophysics Data System (ADS)
Zhao, Jin; Ren, Yongjie; Lin, Jiarui; Yin, Shibin; Zhu, Jigui
2018-04-01
An angle verification method was developed to verify the angle measurement performance of the rotary-laser system. Angle measurement performance has a great impact on measuring accuracy. Although there is some previous research on verifying the angle-measuring uncertainty of the rotary-laser system, it still has limitations. High-precision reference angles are used in the method, and an integrated verification platform is set up to evaluate the performance of the system. This paper also probes the error that has the biggest influence on the verification system. Some errors of the verification system are avoided via the experimental method, and some are compensated through a computational formula and curve fitting. Experimental results show that the angle measurement performance meets the requirements for coordinate measurement. The verification platform can efficiently evaluate the uncertainty of angle measurement for the rotary-laser system.
Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System
NASA Astrophysics Data System (ADS)
Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li
Ultrasonic flaw detection equipment with a remote control interface is studied, and an automatic verification system for it is developed. By using Extensible Markup Language to build a protocol instruction set and a data-analysis method database in the system software, the design becomes configurable and accommodates the diversity of undisclosed device interfaces and protocols. By cascading a signal generator with a fixed attenuator, a dynamic error compensation method is proposed that performs the role of the fixed attenuator in traditional verification and improves the accuracy of the verification results. Operating results from the automatic verification system confirm the feasibility of the hardware and software architecture design and the correctness of the analysis method, while replacing the cumbersome operations of the traditional verification process and reducing the labor intensity of test personnel.
Supersonic Gas-Liquid Cleaning System
NASA Technical Reports Server (NTRS)
Kinney, Frank
1996-01-01
The Supersonic Gas-Liquid Cleaning System Research Project consisted mainly of a feasibility study, including theoretical and engineering analysis, of a proof-of-concept prototype of this particular cleaning system developed by NASA-KSC. The cleaning system utilizes gas-liquid supersonic nozzles to generate high impingement velocities at the surface of the device to be cleaned. The cleaning fluid being accelerated to these high velocities may consist of any solvent or liquid, including water. Compressed air or any inert gas is used to provide the conveying medium for the liquid, as well as substantially reduce the total amount of liquid needed to perform adequate surface cleaning and cleanliness verification. This type of aqueous cleaning system is considered to be an excellent way of conducting cleaning and cleanliness verification operations as replacements for the use of CFC 113 which must be discontinued by 1995. To utilize this particular cleaning system in various cleaning applications for both the Space Program and the commercial market, it is essential that the cleaning system, especially the supersonic nozzle, be characterized for such applications. This characterization consisted of performing theoretical and engineering analysis, identifying desirable modifications/extensions to the basic concept, evaluating effects of variations in operating parameters, and optimizing hardware design for specific applications.
A Quantitative Approach to the Formal Verification of Real-Time Systems.
1996-09-01
A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos, September 1996, CMU-CS-96-199, Carnegie Mellon University School of Computer Science. The views and conclusions are those of the author and should not be interpreted as representing those of NSF, the Semiconductor Research Corporation, ARPA, or the U.S. government. Keywords: real-time systems, formal verification, symbolic
2012-10-23
Coherent mechanical oscillators: the laser rate equation theory and experimental verification. J B Khurgin, M W Pruessner, T H Stievater, and W S
Systems Integration Processes for NASA Ares I Crew Launch Vehicle
NASA Technical Reports Server (NTRS)
Taylor, James L.; Reuter, James L.; Sexton, Jeffrey D.
2006-01-01
NASA's Exploration Initiative will require development of many new elements to constitute a robust system of systems. New launch vehicles are needed to place cargo and crew in stable Low Earth Orbit (LEO). This paper examines the systems integration processes NASA is utilizing to ensure integration and control of propulsion and nonpropulsion elements within NASA's Crew Launch Vehicle (CLV), now known as the Ares I. The objective of the Ares I is to provide the transportation capabilities to meet the Constellation Program requirements for delivering a Crew Exploration Vehicle (CEV) or other payload to LEO in support of the lunar and Mars missions. The Ares I must successfully provide this capability within cost and schedule, and with an acceptable risk approach. This paper will describe the systems engineering management processes that will be applied to assure Ares I Project success through complete and efficient technical integration. Discussion of technical review and management processes for requirements development and verification, integrated design and analysis, integrated simulation and testing, and the integration of reliability, maintainability and supportability (RMS) into the design will also be included. The Ares I Project is logically divided into elements by the major hardware groupings, and associated management, system engineering, and integration functions. The processes to be described herein are designed to integrate within these Ares I elements and among the other Constellation projects. Also discussed is launch vehicle stack integration (Ares I to CEV, and Ground and Flight Operations integration) throughout the life cycle, including integrated vehicle performance through orbital insertion, recovery of the first stage, and reentry of the upper stage. 
The processes for decomposing requirements to the elements and ensuring that requirements have been correctly validated, decomposed, and allocated, and that the verification requirements are properly defined to ensure that the system design meets requirements, will be discussed.
NASA Technical Reports Server (NTRS)
1978-01-01
The verification process and requirements for the ascent guidance interfaces and the ascent integrated guidance, navigation and control system for the space shuttle orbiter are defined as well as portions of supporting systems which directly interface with the system. The ascent phase of verification covers the normal and ATO ascent through the final OMS-2 circularization burn (all of OPS-1), the AOA ascent through the OMS-1 burn, and the RTLS ascent through ET separation (all of MM 601). In addition, OPS translation verification is defined. Verification trees and roadmaps are given.
NASA Technical Reports Server (NTRS)
1986-01-01
Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.
Selected Examples of LDRD Projects Supporting Test Ban Treaty Verification and Nonproliferation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jackson, K.; Al-Ayat, R.; Walter, W. R.
The Laboratory Directed Research and Development (LDRD) Program at the DOE National Laboratories was established to ensure the scientific and technical vitality of these institutions and to enhance their ability to respond to evolving missions and anticipate national needs. LDRD allows the Laboratory directors to invest a percentage of their total annual budget in cutting-edge research and development projects within their mission areas. We highlight a selected set of LDRD-funded projects, in chronological order, that have helped provide capabilities, people, and infrastructure that contributed greatly to our ability to respond to technical challenges in support of test ban treaty verification and nonproliferation.
Independent verification of plutonium decontamination on Johnston Atoll (1992--1996)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson-Nichols, M.J.; Wilson, J.E.; McDowell-Boyer, L.M.
1998-05-01
The Field Command, Defense Special Weapons Agency (FCDSWA) (formerly FCDNA) contracted Oak Ridge National Laboratory (ORNL) Environmental Technology Section (ETS) to conduct an independent verification (IV) of the Johnston Atoll (JA) Plutonium Decontamination Project by an interagency agreement with the US Department of Energy in 1992. The main island is contaminated with the transuranic elements plutonium and americium, and soil decontamination activities have been ongoing since 1984. FCDSWA has selected a remedy that employs a system of sorting contaminated particles from the coral/soil matrix, allowing uncontaminated soil to be reused. The objective of IV is to evaluate the effectiveness of remedial action. The IV contractor's task is to determine whether the remedial action contractor has effectively reduced contamination to levels within established criteria and whether the supporting documentation describing the remedial action is adequate. ORNL conducted four interrelated tasks from 1992 through 1996 to accomplish the IV mission. This document is a compilation and summary of those activities, in addition to a comprehensive review of the history of the project.
NASA Technical Reports Server (NTRS)
Bordano, Aldo; Uhde-Lacovara, JO; Devall, Ray; Partin, Charles; Sugano, Jeff; Doane, Kent; Compton, Jim
1993-01-01
The Navigation, Control and Aeronautics Division (NCAD) at NASA-JSC is exploring ways of producing Guidance, Navigation and Control (GN&C) flight software faster, better, and cheaper. To achieve these goals NCAD established two hardware/software facilities that take an avionics design project from initial inception through high fidelity real-time hardware-in-the-loop testing. Commercially available software products are used to develop the GN&C algorithms in block diagram form and then automatically generate source code from these diagrams. A high fidelity real-time hardware-in-the-loop laboratory provides users with the capability to analyze mass memory usage within the targeted flight computer, verify hardware interfaces, conduct system level verification, performance, acceptance testing, as well as mission verification using reconfigurable and mission unique data. To evaluate these concepts and tools, NCAD embarked on a project to build a real-time 6 DOF simulation of the Soyuz Assured Crew Return Vehicle flight software. To date, a productivity increase of 185 percent has been seen over traditional NASA methods for developing flight software.
A Low Power SOC Architecture for the V2.0+EDR Bluetooth Using a Unified Verification Platform
NASA Astrophysics Data System (ADS)
Kim, Jeonghun; Kim, Suki; Baek, Kwang-Hyun
This paper presents a low-power System on Chip (SOC) architecture for the v2.0+EDR (Enhanced Data Rate) Bluetooth and its applications. Our design includes a link controller, modem, RF transceiver, Sub-Band Codec (SBC), Expanded Instruction Set Computer (ESIC) processor, and peripherals. To decrease the power consumption of the proposed SOC, we reduce data transfers using a dual-port memory, include a power management unit, and apply clock gating. We also address some of the issues and benefits of a reusable, unified environment built on a centralized data structure and SOC verification platform. This includes flexibility in meeting the final requirements by using technology-independent tools wherever possible across processes and projects. Further aims of this work are to minimize design effort by avoiding the same work being done twice by different people, and to reuse a similar environment and platform across projects. The chip occupies a die size of 30 mm² in 0.18 µm CMOS, and the worst-case current of the total chip is 54 mA.
An unattended verification station for UF6 cylinders: Field trial findings
NASA Astrophysics Data System (ADS)
Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.
2017-12-01
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
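As a hedged illustration of the bit-for-bit evaluation mentioned above — this is a generic sketch, not LIVVkit's actual API or file layout — a minimal byte-identity check between a test run and a reference data set could look like:

```python
# Generic sketch of a bit-for-bit regression check between a test run's output
# and a reference data set. File names here are invented for the example.
import hashlib
from pathlib import Path

def bit_for_bit(test_file, reference_file):
    """Return True when the two output files are byte-identical."""
    digest = lambda p: hashlib.sha256(Path(p).read_bytes()).hexdigest()
    return digest(test_file) == digest(reference_file)

# Example: write two small stand-in "model output" files and compare them.
Path("run.out").write_bytes(b"\x00\x01\x02")
Path("ref.out").write_bytes(b"\x00\x01\x02")
print(bit_for_bit("run.out", "ref.out"))  # True when outputs match exactly
```

Hashing whole files keeps the check cheap even for large outputs; a real toolkit would additionally report where differences occur, not just that they exist.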
Code of Federal Regulations, 2010 CFR
2010-04-01
... Social Security and Employer Identification Numbers by owners. 884.117 Section 884.117 Housing and Urban... PROJECTS Applicability, Scope and Basic Policies § 884.117 Disclosure and verification of Social Security... Social Security and Employer Identification Numbers, as provided by 24 CFR part 5. (Approved by the...
Verification test report on a solar heating and hot water system
NASA Technical Reports Server (NTRS)
1978-01-01
Information is provided on the development, qualification, and acceptance verification of commercial solar heating and hot water systems and components. The verification covers the performance, the efficiencies, and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.
46 CFR 61.40-3 - Design verification testing.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...
A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables
NASA Astrophysics Data System (ADS)
Huang, Laura X.; Isaac, George A.; Sheng, Grant
2014-01-01
This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as background gridded data for generating the integrated nowcasts. Seven forecast variables of temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models among 15 sites, the integrated weighted model was found to produce more accurate forecasts for the 7 selected forecast variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
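A minimal sketch of the multi-categorical Heidke skill score referred to above (the categories and counts below are invented illustration data, not SNOW-V10 results):

```python
# Illustrative sketch: multi-category Heidke skill score for a K-category
# forecast variable (e.g., visibility or precipitation-rate classes).
# table[i][j] counts cases forecast in category i and observed in category j.

def heidke_multicategory(table):
    """Heidke skill score from a K x K contingency table."""
    n = sum(sum(row) for row in table)
    k = len(table)
    correct = sum(table[i][i] for i in range(k))  # forecasts on the diagonal
    # Marginal totals of forecasts (rows) and observations (columns).
    f = [sum(table[i][j] for j in range(k)) for i in range(k)]
    o = [sum(table[i][j] for i in range(k)) for j in range(k)]
    expected = sum(f[i] * o[i] for i in range(k)) / n  # chance-expected correct
    return (correct - expected) / (n - expected)

# Invented 3-category example table for demonstration only.
table = [[30, 5, 2],
         [4, 25, 6],
         [1, 3, 24]]
print(round(heidke_multicategory(table), 3))
```

A score of 1 indicates perfect categorical forecasts and 0 indicates no skill beyond chance, which is what makes the score comparable across sites with different climatologies.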
Towards composition of verified hardware devices
NASA Technical Reports Server (NTRS)
Schubert, E. Thomas; Levitt, K.; Cohen, G. C.
1991-01-01
Computers are being used where no affordable level of testing is adequate. Safety- and life-critical systems must find a replacement for exhaustive testing to guarantee their correctness. Through mathematical proof, hardware verification research has focused on device verification but has largely ignored system composition verification. To address these deficiencies, we examine how the current hardware verification methodology can be extended to verify complete systems.
GCS plan for software aspects of certification
NASA Technical Reports Server (NTRS)
Shagnea, Anita M.; Lowman, Douglas S.; Withers, B. Edward
1990-01-01
As part of the Guidance and Control Software (GCS) research project being sponsored by NASA to evaluate the failure processes of software, standard industry software development procedures are being employed. To ensure that these procedures are authentic, the guidelines outlined in the Radio Technical Commission for Aeronautics (RTCA) document DO-178A, entitled Software Considerations in Airborne Systems and Equipment Certification, were adopted. A major aspect of these guidelines is proper documentation. As such, this report, the Plan for Software Aspects of Certification, was produced in accordance with DO-178A. An overview is given of the GCS research project, including the goals of the project, project organization, and project schedules. The report also specifies the plans for all aspects of the project which relate to the certification of the GCS implementations developed under a NASA contract. These plans include decisions made regarding the software specification, accuracy requirements, configuration management, implementation development and verification, and the development of the GCS simulator.
A Low-Cost Data Acquisition System for Automobile Dynamics Applications.
González, Alejandro; Olazagoitia, José Luis; Vinolas, Jordi
2018-01-27
This project addresses the need for the implementation of low-cost acquisition technology in the field of vehicle engineering: the design, development, manufacture, and verification of a low-cost Arduino-based data acquisition platform to be used in <80 Hz data acquisition in vehicle dynamics, using low-cost accelerometers. In addition to this, a comparative study is carried out of professional vibration acquisition technologies and low-cost systems, obtaining optimum results for low- and medium-frequency operations with an error of 2.19% on road tests. It is therefore concluded that these technologies are applicable to the automobile industry, thereby allowing the project costs to be reduced and thus facilitating access to this kind of research that requires limited resources.
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
NASA Astrophysics Data System (ADS)
Lorenzo Alvarez, Jose; Metselaar, Harold; Amiaux, Jerome; Saavedra Criado, Gonzalo; Gaspar Venancio, Luis M.; Salvignol, Jean-Christophe; Laureijs, René J.; Vavrek, Roland
2016-08-01
In recent years, the systems engineering field has been coming to terms with a paradigm change in its approach to complexity management. Different strategies, such as system-of-systems approaches and collaborative systems engineering, have been proposed to cope with highly interrelated systems, and a significant effort is being invested in standardization and ontology definition. In particular, Model Based System Engineering (MBSE) intends to introduce methodologies for systematic system definition, development, validation, deployment, operation and decommissioning, based on logical and visual relationship mapping rather than traditional 'document based' information management. Practical implementation in real large-scale projects is not uniform across fields. In space science missions, usage has been limited to subsystems or sample projects, with modeling performed 'a posteriori' in many instances. The main hurdle for the introduction of MBSE practices in new projects is still the difficulty of demonstrating their added value to a project and whether their benefit is commensurate with the level of effort required to put them in place. In this paper we present the implemented Euclid system modeling activities, together with an analysis of the benefits and limitations identified, in particular to support requirement break-down and allocation and verification planning at mission level.
Temporal Specification and Verification of Real-Time Systems.
1991-08-30
of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.
78 FR 22522 - Privacy Act of 1974: New System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-16
... Privacy Act of 1974 (5 U.S.C. 552a), as amended, titled "Biometric Verification System (CSOSA-20)." This... Biometric Verification System allows individuals under supervision to electronically check in for office... determination. ADDRESSES: You may submit written comments, identified by "Biometric Verification System, CSOSA...
Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools
NASA Technical Reports Server (NTRS)
Bis, Rachael; Maul, William A.
2015-01-01
Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
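As a rough illustration of the directed-graph propagation that FFMs encode, a reachability check of failure effects might look like this. The graph, node names, and check are hypothetical, not the NASA tools described above:

```python
def propagate(graph, fault):
    """Return the set of effects reachable from a fault node in a
    functional fault model represented as an adjacency dict."""
    seen, stack = set(), [fault]
    while stack:
        node = stack.pop()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# Toy model: a stuck valve propagates to low pressure, which raises an alarm.
ffm = {"valve_stuck": ["low_pressure"], "low_pressure": ["alarm"]}
assert propagate(ffm, "valve_stuck") == {"low_pressure", "alarm"}
```

Automated verification in this spirit compares the computed reachable-effect sets against the effects expected from the physical system, flagging mismatches that a manual review might miss.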
HDM/PASCAL Verification System User's Manual
NASA Technical Reports Server (NTRS)
Hare, D.
1983-01-01
The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fujii, T; Fujii, Y; Shimizu, S
Purpose: To acquire correct information about the inside of the body during patient positioning for Real-time-image Gated spot scanning Proton Therapy (RGPT), the use of tomographic images at the exhale phase of patient respiration, obtained from 4-dimensional cone-beam CT (4D-CBCT), has been desired. We developed software named "Image Analysis Platform" for 4D-CBCT research, which includes a technique to segment projection images based on the 3D marker position in the body. The 3D marker position can be obtained using the two-axis CBCT system at Hokkaido University Hospital Proton Therapy Center. Performance verification of the software was implemented. Methods: The software calculates the 3D marker position retrospectively by using matching positions on pairs of projection images obtained by the two-axis fluoroscopy mode of the CBCT system. Log data of the 3D marker tracking are output after the tracking. By linking the log data and the gantry-angle file of the projection images, all projection images are equally segmented into five spatial phases according to the marker's 3D position in the SI direction and saved to the specified phase folder. Segmented projection images are used for CBCT reconstruction of each phase. As performance verification of the software, a test of the segmented projection images was implemented for a sample CT phantom (Catphan) image acquired by the two-axis fluoroscopy mode of the CBCT. A dummy marker was added on the images, with its motion modeled in 3D space as a sin⁴ wave function with amplitude 10.0 mm/5.0 mm/0 mm and cycle 4 s/4 s/0 s in the SI/AP/RL directions. Results: The marker was tracked within 0.58 mm accuracy in 3D for all images, and it was confirmed that all projection images were segmented and saved to each phase folder correctly. Conclusion: We developed software for 4D-CBCT research which can segment projection images based on the 3D marker position. It will be helpful for creating high-quality 4D-CBCT reconstruction images for RGPT.
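The spatial five-phase segmentation described here amounts to binning each projection image by the marker's SI position. A minimal sketch follows; the function name and range handling are assumptions, not the actual Image Analysis Platform code:

```python
def phase_bin(si_position, si_min, si_max, n_phases=5):
    """Map a marker superior-inferior (SI) position to one of n equal
    spatial phase bins spanning [si_min, si_max]."""
    if si_max == si_min:
        return 0
    frac = (si_position - si_min) / (si_max - si_min)
    frac = min(max(frac, 0.0), 1.0)           # clamp positions outside the range
    return min(int(frac * n_phases), n_phases - 1)

# Each projection image would then be saved to the folder for its bin.
assert phase_bin(0.0, 0.0, 10.0) == 0
assert phase_bin(10.0, 0.0, 10.0) == 4
```

Binning on position rather than time is what makes the segmentation "spatial": images taken at the same marker displacement land in the same reconstruction phase regardless of respiratory period.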
NASA Project Constellation Systems Engineering Approach
NASA Technical Reports Server (NTRS)
Dumbacher, Daniel L.
2005-01-01
NASA's Office of Exploration Systems (OExS) is organized to empower the Vision for Space Exploration with transportation systems that result in achievable, affordable, and sustainable human and robotic journeys to the Moon, Mars, and beyond. In the process of delivering these capabilities, the systems engineering function is key to implementing policies, managing mission requirements, and ensuring technical integration and verification of hardware and support systems in a timely, cost-effective manner. The OExS Development Programs Division includes three main areas: (1) human and robotic technology, (2) Project Prometheus for nuclear propulsion development, and (3) Constellation Systems for space transportation systems development, including a Crew Exploration Vehicle (CEV). Constellation Systems include Earth-to-orbit, in-space, and surface transportation systems; maintenance and science instrumentation; and robotic investigators and assistants. In parallel with development of the CEV, robotic explorers will serve as trailblazers to reduce the risk and costs of future human operations on the Moon, as well as missions to other destinations, including Mars. Additional information is included in the original extended abstract.
Simscape Modeling Verification in the Simulink Development Environment
NASA Technical Reports Server (NTRS)
Volle, Christopher E. E.
2014-01-01
The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using MathWorks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time consuming and costly due to the rigorous testing and peer reviews required for each custom-built block. Using MathWorks Simscape tools, modeling time can be reduced since no custom code would be developed. After careful research, the group came to the conclusion that it is feasible to use Simscape's blocks in MATLAB's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.
Finger vein verification system based on sparse representation.
Xin, Yang; Liu, Zhi; Zhang, Haixia; Zhang, Hong
2012-09-01
Finger vein verification is a promising biometric pattern for personal identification in terms of security and convenience. The recognition performance of this technology heavily relies on the quality of finger vein images and on the recognition algorithm. To achieve efficient recognition performance, a special finger vein imaging device is developed, and a finger vein recognition method based on sparse representation is proposed. The motivation for the proposed method is that finger vein images exhibit a sparse property. In the proposed system, the regions of interest (ROIs) in the finger vein images are segmented and enhanced. Sparse representation and sparsity preserving projection on ROIs are performed to obtain the features. Finally, the features are measured for recognition. An equal error rate of 0.017% was achieved based on the finger vein image database, which contains images that were captured by using the near-IR imaging device that was developed in this study. The experimental results demonstrate that the proposed method is faster and more robust than previous methods.
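Sparse codes of the kind this method relies on are often computed greedily. The toy orthogonal matching pursuit below is a stand-in sketch, not the authors' algorithm; it recovers a k-sparse code against a dictionary:

```python
import numpy as np

def omp(dictionary, signal, k):
    """Greedy orthogonal matching pursuit: find a code x with at most k
    nonzeros such that dictionary @ x approximates the signal."""
    A = np.asarray(dictionary, dtype=float)
    y = np.asarray(signal, dtype=float)
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))    # pick best-correlated atom
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

# With an identity dictionary, a 1-sparse signal is recovered exactly.
assert np.allclose(omp(np.eye(3), [0.0, 2.0, 0.0], 1), [0.0, 2.0, 0.0])
```

In sparse-representation recognition, the identity is then typically assigned to the class whose dictionary atoms reconstruct the probe with the smallest residual.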
The Tools That Help Systems Engineering
NASA Technical Reports Server (NTRS)
Gamertsfelder, Jacob O.
2017-01-01
There are many tools that systems engineers use in today's space programs. In my time in the Commercial Crew Program I sought to improve one of the vital tools for the verification and validation team. This was my main project but only a small part of what I have done in the department. I have also had the chance to learn from the best and see actual hardware, this real world experience will help me be a better aerospace engineer when I enter the workforce. I look forward to seeing the Commercial Crew Program progress to launch.
The purpose of this SOP is to define the procedures for the initial and periodic verification and validation of computer programs. The programs are used during the Arizona NHEXAS project and Border study at the Illinois Institute of Technology (IIT) site. Keywords: computers; s...
The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the "Border" study. Keywords: Computers; Software; QA/QC.
The National Human Exposure Assessment Sur...
Formulating face verification with semidefinite programming.
Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S
2007-11-01
This paper presents a unified solution to three unsolved problems existing in face verification with subspace learning techniques: selection of verification threshold, automatic determination of subspace dimension, and deducing feature fusing weights. In contrast to previous algorithms which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). With a certain verification threshold, this matrix is learned by a semidefinite programming approach, along with the constraints of the kindred pairs with similarity larger than the threshold, and inhomogeneous pairs with similarity smaller than the threshold. Then, the subspace dimension and the feature fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, the weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.
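Once an SMM has been learned (the semidefinite-programming step is omitted here), the verification decision itself reduces to thresholding a similarity. A hedged sketch with hypothetical names and an identity SMM for illustration:

```python
import numpy as np

def verify_pair(x, y, smm, threshold):
    """Decide 'same subject' when the similarity under a learned similarity
    metric matrix (SMM) meets the verification threshold; here similarity is
    the negated quadratic form of the feature difference."""
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    similarity = -(diff @ smm @ diff)   # identical features give similarity 0
    return similarity >= threshold

# With an identity SMM this reduces to thresholded squared Euclidean distance.
assert verify_pair([1.0, 2.0], [1.0, 2.0], np.eye(2), -0.5)
assert not verify_pair([3.0, 4.0], [0.0, 0.0], np.eye(2), -0.5)
```

The paper's contribution is learning the SMM jointly with the threshold so that kindred pairs fall above it and inhomogeneous pairs below it; the decision rule shown here stays the same.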
Task Listings Resulting from the Vocational Competency Measures Project. Memorandum Report.
ERIC Educational Resources Information Center
American Institutes for Research in the Behavioral Sciences, Palo Alto, CA.
This memorandum report consists of 14 task listings resulting from the Vocational Competency Measures Project. (The Vocational Competency Measures Project was a test development project that involved the writing and verification of task listings for 14 vocational occupational areas through over 225 interviews conducted in 27 states.) Provided in…
The classification of LANDSAT data for the Orlando, Florida, urban fringe area
NASA Technical Reports Server (NTRS)
Walthall, C. L.; Knapp, E. M.
1978-01-01
Procedures used to map residential land cover in the Orlando, Florida, urban fringe zone are detailed. The NASA/Bureau of the Census Applications Systems Verification and Transfer project and the test site are described, as well as the LANDSAT data used as the land cover information sources. Both single-date LANDSAT data processing and multitemporal principal components LANDSAT data processing are described. A summary of significant findings is included.
1992-10-01
infiltration studies (Westerdahl and Skogerboe 1982). Extensive field verification studies have been conducted with the WES Rainfall Simulator...Lysimeter System on a wide range of Corps project sites (Westerdahl and Skogerboe 1982, Lee and Skogerboe 1984, Skogerboe et al. 1987). The WES Rainfall...Vicksburg, MS. Winer, B. J. 1971. Statistical Principles in Experimental Design, McGraw-Hill Book Company, New York. Westerdahl, H. E., and Skogerboe, J
2012-01-30
CFRP LAMINATES FOR MARINE USE. Grant number: N00014-06-1-1139. Author: Miyano, Yasushi. ...prediction of CFRP laminates proposed and confirmed experimentally in the previous ONR project of Grant # N000140110949 was verified theoretically and refined... DURABILITY OF CFRP LAMINATES FOR MARINE USE. Principal Investigator: Yasushi Miyano; Co-principal Investigator: Isao Kimpara.
1968-01-01
AS-204, the fourth Saturn IB launch vehicle, developed by the Marshall Space Flight Center (MSFC), awaits its January 22, 1968 liftoff from Cape Canaveral, Florida for the unmanned Apollo 5 mission. Primary mission objectives included the verification of the Apollo Lunar Module's (LM) ascent and descent propulsion systems and an evaluation of the S-IVB stage instrument unit performance. In all, nine Saturn IB flights were made, ending with the Apollo-Soyuz Test Project in July 1975.
Long life reliability thermal control systems study
NASA Technical Reports Server (NTRS)
Scollon, T. R., Jr.; Killen, R. E.
1972-01-01
The results of a program undertaken to conceptually design and evaluate a passive, high reliability, long life thermal control system for space station application are presented. The program consisted of four steps: (1) investigate and select potential thermal system elements; (2) conceive, evaluate and select a thermal control system using these elements; (3) conduct a verification test of a prototype segment of the selected system; and (4) evaluate the utilization of waste heat from the power supply. The result of this project is a conceptual thermal control system design which employs heat pipes as primary components, both for heat transport and temperature control. The system, its evaluation, and the test results are described.
NASA Technical Reports Server (NTRS)
Martinez, Pedro A.; Dunn, Kevin W.
1987-01-01
This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.
Performance verification testing of the Arkal Pressurized Stormwater Filtration System was conducted under EPA's Environmental Technology Verification Program on a 5.5-acre parking lot and grounds of St. Mary's Hospital in Milwaukee, Wisconsin. The system consists of a water sto...
Test development for the thermionic system evaluation test (TSET) project
NASA Astrophysics Data System (ADS)
Morris, D. Brent; Standley, Vaughn H.; Schuller, Michael J.
1992-01-01
The arrival of a Soviet TOPAZ-II space nuclear reactor affords the US space nuclear power (SNP) community the opportunity to study an assembled thermionic conversion power system. The TOPAZ-II will be studied via the Thermionic System Evaluation Test (TSET) Project. This paper is devoted to the discussion of TSET test development as related to the objectives contained in the TSET Project Plan (Standley et al. 1991). The objectives contained in the Project Plan are the foundation for scheduled TSET tests on TOPAZ-II and are derived from the needs of the Air Force thermionic SNP program. Our ability to meet the objectives is bounded by unique constraints, such as procurement requirements, operational limitations, and necessary interaction between US and Soviet scientists and engineers. The fulfillment of the test objectives involves a thorough methodology of test scheduling and data management. The overall goals of the TSET program are gaining technical understanding of a thermionic SNP system and demonstrating the capabilities and limitations of such a system while assisting in the training of US scientists and engineers in preparation for US SNP system testing. Tests presently scheduled as part of TSET include setup, demonstration, and verification tests; normal and off-normal operating tests; and system and component performance tests.
Evaluation of areas prepared for planting using LANDSAT data. M.S. Thesis; [Ribeirao Preto, Brazil
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Deassuncao, G. V.; Duarte, V.
1983-01-01
Three different algorithms (SINGLE-CELL, MAXVER and MEDIA K) were used to automatically interpret data from LANDSAT observations of an area of Ribeirao Preto, Brazil. Photographic transparencies were obtained, projected and visually interpreted. The results show that: (1) the MAXVER algorithm presented the best classification performance; (2) verification of the changes in cultivated areas using data from the three different acquisition dates was possible; (3) water bodies, degraded lands, urban areas, and fallow fields were frequently mistaken for cultivated soils; and (4) the use of projected photographic transparencies furnished satisfactory results, besides reducing the time spent on the Image-100 system.
Interpreter composition issues in the formal verification of a processor-memory module
NASA Technical Reports Server (NTRS)
Fura, David A.; Cohen, Gerald C.
1994-01-01
This report describes interpreter composition techniques suitable for the formal specification and verification of a processor-memory module using the HOL theorem proving system. The processor-memory module is a multichip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. Modeling and verification methods were developed that permit provably secure composition at the transaction-level of specification, significantly reducing the complexity of the hierarchical verification of the system.
Formal verification of medical monitoring software using Z language: a representative sample.
Babamir, Seyed Morteza; Borhani, Mehdi
2012-08-01
Medical monitoring systems are useful aids that assist physicians in keeping patients under constant surveillance; however, ensuring that such systems take sound decisions is a physician's concern. As a result, verification of the systems' behavior in monitoring patients is a matter of significance. In modern medical systems, patient monitoring is undertaken by software, so software verification of modern medical systems has received attention. Such verification can be achieved by formal languages having mathematical foundations. Among others, the Z language is a suitable formal language that has been used for the formal verification of systems. This study presents a constructive method to verify a representative sample of a medical system, by which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modeling the Continuous Infusion Insulin Pump (CIIP), we take the CIIP system as a representative sample of medical systems in this study. The system is responsible for monitoring a diabetic's blood sugar.
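The flavor of patient constraint verified against the CIIP can be illustrated by a runtime check mirroring the kind of invariant one would state in a Z schema. All names and thresholds here are hypothetical, not taken from the paper:

```python
def infusion_safe(blood_glucose_mmol, dose_units, max_dose=10.0, hypo_threshold=4.0):
    """Toy invariant in the spirit of a Z schema constraint for an insulin
    pump: never infuse when glucose is already below the hypoglycemia
    threshold, and never exceed the maximum single dose."""
    if blood_glucose_mmol < hypo_threshold:
        return dose_units == 0.0
    return 0.0 <= dose_units <= max_dose

assert infusion_safe(3.5, 0.0)       # low glucose, no infusion: safe
assert not infusion_safe(3.5, 1.0)   # low glucose, infusing: violates invariant
assert infusion_safe(6.0, 5.0)       # normal glucose, dose within limit
```

Formal verification in Z proves such constraints hold for every reachable system state rather than spot-checking them at runtime; the predicate above only shows the shape of the property being proved.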
Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.
1997-09-30
set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort...to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has
47 CFR 73.151 - Field strength measurements to establish performance of directional antennas.
Code of Federal Regulations, 2010 CFR
2010-10-01
... verified either by field strength measurement or by computer modeling and sampling system verification. (a... specifically identified by the Commission. (c) Computer modeling and sample system verification of modeled... performance verified by computer modeling and sample system verification. (1) A matrix of impedance...
This report is a product of the U.S. EPA's Environmental Technoloy Verification (ETV) Program and is focused on the Smart Sonics Ultrasonic Aqueous Cleaning Systems. The verification is based on three main objectives. (1) The Smart Sonic Aqueous Cleaning Systems, Model 2000 and...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-03-01
The certification and verification of the Northrup Model NSC-01-0732 Fresnel lens tracking solar collector are presented. A certification statement is included with signatures and a separate report on the structural analysis of the collector system. System verification against the Interim Performance Criteria are indicated by matrices with verification discussion, analysis, and enclosed test results.
NASA Space Radiation Risk Project: Overview and Recent Results
NASA Technical Reports Server (NTRS)
Blattnig, Steve R.; Chappell, Lori J.; George, Kerry A.; Hada, Megumi; Hu, Shaowen; Kidane, Yared H.; Kim, Myung-Hee Y.; Kovyrshina, Tatiana; Norman, Ryan B.; Nounu, Hatem N.;
2015-01-01
The NASA Space Radiation Risk project is responsible for integrating new experimental and computational results into models to predict the risk of cancer and acute radiation syndrome (ARS) for use in mission planning and systems design, as well as current space operations. The project has several parallel efforts focused on improving NASA's radiation risk projection capability in both the near and long term. This presentation will give an overview, with select results from these efforts, including the following topics: verification, validation, and streamlining the transition of models to use in decision making; relative biological effectiveness and dose rate effect estimation using a combination of stochastic track structure simulations, DNA damage model calculations, and experimental data; ARS model improvements; pathway analysis from gene expression data sets; and solar particle event probabilistic exposure calculation including correlated uncertainties for use in design optimization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, Phyllis C.
A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity was representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse.
Building an outpatient imaging center: A case study at genesis healthcare system, part 2.
Yanci, Jim
2006-01-01
In the second of 2 parts, this article will focus on process improvement projects utilizing a case study at Genesis HealthCare System located in Zanesville, OH. Operational efficiency is a key step in developing a freestanding diagnostic imaging center. The process improvement projects began with an Expert Improvement Session (EIS) on the scheduling process. An EIS is a facilitated meeting that can last anywhere from 3 hours to 2 days. Its intention is to take a group of people involved with the problem or operational process and work to understand current failures or breakdowns in the process. Recommendations are jointly developed to overcome any current deficiencies, and a work plan is structured to create ownership over the changes. A total of 11 EIS sessions occurred over the course of this project, covering five sections: scheduling/telephone call process, pre-registration, verification/pre-certification, MRI throughput, and CT throughput. Following is a single example of a project focused on the process improvement efforts. All of the process improvement projects utilized a quasi-methodology of "DMAIC" (Define, Measure, Analyze, Improve, and Control).
NASA Technical Reports Server (NTRS)
Thurmond, Beverly A.; Gillan, Douglas J.; Perchonok, Michele G.; Marcus, Beth A.; Bourland, Charles T.
1986-01-01
A team of engineers and food scientists from NASA, the aerospace industry, food companies, and academia are defining the Space Station Food System. The team identified the system requirements based on an analysis of past and current space food systems, food systems from isolated environment communities that resemble Space Station, and the projected Space Station parameters. The team is resolving conflicts among requirements through the use of trade-off analyses. The requirements will give rise to a set of specifications which, in turn, will be used to produce concepts. Concept verification will include testing of prototypes, both in 1-g and microgravity. The end-item specification provides an overall guide for assembling a functional food system for Space Station.
Stennis Space Center Verification & Validation Capabilities
NASA Technical Reports Server (NTRS)
Pagnutti, Mary; Ryan, Robert E.; Holekamp, Kara; O'Neal, Duane; Knowlton, Kelly; Ross, Kenton; Blonski, Slawomir
2007-01-01
Scientists within NASA's Applied Research & Technology Project Office (formerly the Applied Sciences Directorate) have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site enables the in-flight characterization of satellite and airborne high spatial resolution remote sensing systems and their products. The smaller scale of the newer high resolution remote sensing systems allows scientists to characterize geometric, spatial, and radiometric data properties using a single V&V site. The targets and techniques used to characterize data from these newer systems can differ significantly from the techniques used to characterize data from the earlier, coarser spatial resolution systems. Scientists have used the SSC V&V site to characterize thermal infrared systems. Enhancements are being considered to characterize active lidar systems. SSC employs geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment, and thermal calibration ponds to characterize remote sensing data products. Similar techniques are used to characterize moderate spatial resolution sensing systems at selected nearby locations. The SSC Instrument Validation Lab is a key component of the V&V capability and is used to calibrate field instrumentation and to provide National Institute of Standards and Technology traceability. This poster presents a description of the SSC characterization capabilities and examples of calibration data.
Aqueous cleaning and verification processes for precision cleaning of small parts
NASA Technical Reports Server (NTRS)
Allen, Gale J.; Fishell, Kenneth A.
1995-01-01
The NASA Kennedy Space Center (KSC) Materials Science Laboratory (MSL) has developed a totally aqueous process for precision cleaning and verification of small components. In 1990 the Precision Cleaning Facility at KSC used approximately 228,000 kg (500,000 lbs) of chlorofluorocarbon (CFC) 113 in the cleaning operations. It is estimated that current CFC 113 usage has been reduced by 75 percent and it is projected that a 90 percent reduction will be achieved by the end of calendar year 1994. The cleaning process developed utilizes aqueous degreasers, aqueous surfactants, and ultrasonics in the cleaning operation and an aqueous surfactant, ultrasonics, and Total Organic Carbon Analyzer (TOCA) in the nonvolatile residue (NVR) and particulate analysis for verification of cleanliness. The cleaning and verification process is presented in its entirety, with comparison to the CFC 113 cleaning and verification process, including economic and labor costs/savings.
An unattended verification station for UF 6 cylinders: Field trial findings
Smith, L. E.; Miller, K. A.; McDonald, B. S.; ...
2017-08-26
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA) and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.
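The "NDA Fingerprint" idea — periodically re-measuring a cylinder and comparing against a stored signature — can be illustrated with a minimal, hypothetical sketch. The function name, channel values, and tolerance below are illustrative assumptions, not the IAEA's actual algorithm:

```python
import math

def fingerprint_matches(baseline, measured, rel_tol=0.05):
    """Hypothetical 'NDA fingerprint' check: declare the cylinder unchanged
    if every channel of the new assay agrees with the stored baseline
    to within a relative tolerance."""
    return all(
        math.isclose(m, b, rel_tol=rel_tol)
        for b, m in zip(baseline, measured)
    )

baseline = [1.02, 0.98, 1.10]   # stored signature (arbitrary units)
assert fingerprint_matches(baseline, [1.03, 0.97, 1.12])       # within tolerance
assert not fingerprint_matches(baseline, [1.03, 0.97, 1.35])   # flag for review
```

A real implementation would account for detector drift and measurement uncertainty rather than a fixed relative tolerance; the sketch only conveys the comparison step.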
The capability of lithography simulation based on MVM-SEM® system
NASA Astrophysics Data System (ADS)
Yoshikawa, Shingo; Fujii, Nobuaki; Kanno, Koichi; Imai, Hidemichi; Hayano, Katsuya; Miyashita, Hiroyuki; Shida, Soichi; Murakawa, Tsutomu; Kuribara, Masayuki; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hara, Daisuke; Pang, Linyong
2015-10-01
Lithography at the 1X-nm technology node uses SMO-ILT, NTD, and other complex patterns. In mask defect inspection, defect verification therefore becomes more difficult because many nuisance defects are detected in these aggressive mask features. A key technology of mask manufacture is defect verification using an aerial image simulator or other printability simulation. AIMS™ technology correlates well with wafer printing and is the standard tool for defect verification, but it is impractical when hundreds of defects or more must be verified. We previously reported a defect verification capability based on lithography simulation with a SEM system whose architecture and software correlate well for simple line-and-space patterns [1]. In this paper, we combine a next-generation SEM system with a lithography simulation tool for SMO-ILT, NTD, and other complex-pattern lithography. Furthermore, we use three-dimensional (3D) lithography simulation based on the Multi Vision Metrology SEM (MVM-SEM®) system. Finally, we confirm the performance of the 2D and 3D lithography simulation based on the SEM system for photomask verification.
Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth A.
2016-01-01
We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
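The property that correct TMR insertion must preserve — a 2-of-3 majority voter masking any single faulty replica — can be sketched generically. This is a textbook illustration of the TMR principle, not the paper's search-detect-and-verify tool:

```python
def majority(a, b, c):
    """Bitwise 2-of-3 majority voter, the core element of a TMR topology."""
    return (a & b) | (a & c) | (b & c)

def masks_single_faults(bit):
    """Exhaustively confirm the voter output equals the intended bit
    whenever at most one of the three replicas is faulty."""
    for faulty in (None, 0, 1, 2):
        replicas = [bit] * 3
        if faulty is not None:
            replicas[faulty] ^= 1  # flip one replica (a single upset)
        if majority(*replicas) != bit:
            return False
    return True

# The voter tolerates any single fault for either logic value.
assert all(masks_single_faults(b) for b in (0, 1))
```

Verifying TMR insertion in a real netlist means confirming that every protected signal is tripled and feeds such a voter, which is what makes automated confirmation hard.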
Interfacing and Verifying ALHAT Safe Precision Landing Systems with the Morpheus Vehicle
NASA Technical Reports Server (NTRS)
Carson, John M., III; Hirsh, Robert L.; Roback, Vincent E.; Villalpando, Carlos; Busa, Joseph L.; Pierrottet, Diego F.; Trawny, Nikolas; Martin, Keith E.; Hines, Glenn D.
2015-01-01
The NASA Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project developed a suite of prototype sensors to enable autonomous and safe precision landing of robotic or crewed vehicles under any terrain lighting conditions. Development of the ALHAT sensor suite was a cross-NASA effort, culminating in integration and testing on-board a variety of terrestrial vehicles toward infusion into future spaceflight applications. Terrestrial tests were conducted on specialized test gantries, moving trucks, helicopter flights, and a flight test onboard the NASA Morpheus free-flying, rocket-propulsive flight-test vehicle. To accomplish these tests, a tedious integration process was developed and followed, which included both command and telemetry interfacing, as well as sensor alignment and calibration verification to ensure valid test data to analyze ALHAT and Guidance, Navigation and Control (GNC) performance. This was especially true for the flight test campaign of ALHAT onboard Morpheus. For interfacing of ALHAT sensors to the Morpheus flight system, an adaptable command and telemetry architecture was developed to allow for the evolution of per-sensor Interface Control Design/Documents (ICDs). Additionally, individual-sensor and on-vehicle verification testing was developed to ensure functional operation of the ALHAT sensors onboard the vehicle, as well as precision-measurement validity for each ALHAT sensor when integrated within the Morpheus GNC system. This paper provides some insight into the interface development and the integrated-systems verification that were a part of the build-up toward success of the ALHAT and Morpheus flight test campaigns in 2014. These campaigns provided valuable performance data that is refining the path toward spaceflight infusion of the ALHAT sensor suite.
Development of a software safety process and a case study of its use
NASA Technical Reports Server (NTRS)
Knight, John C.
1993-01-01
The goal of this research is to continue the development of a comprehensive approach to software safety and to evaluate the approach with a case study. The case study is a major part of the project, and it involves the analysis of a specific safety-critical system from the medical equipment domain. The particular application being used was selected because of the availability of a suitable candidate system. We consider the results to be generally applicable and in no way particularly limited by the domain. The research is concentrating on issues raised by the specification and verification phases of the software lifecycle since they are central to our previously-developed rigorous definitions of software safety. The theoretical research is based on our framework of definitions for software safety. In the area of specification, the main topics being investigated are the development of techniques for building system fault trees that correctly incorporate software issues and the development of rigorous techniques for the preparation of software safety specifications. The research results are documented. Another area of theoretical investigation is the development of verification methods tailored to the characteristics of safety requirements. Verification of the correct implementation of the safety specification is central to the goal of establishing safe software. The empirical component of this research is focusing on a case study in order to provide detailed characterizations of the issues as they appear in practice, and to provide a testbed for the evaluation of various existing and new theoretical results, tools, and techniques. The Magnetic Stereotaxis System is summarized.
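A system fault tree that incorporates software issues, as discussed above, can be evaluated by a simple recursive traversal. The gate structure and event names below are hypothetical, chosen only to echo the medical-equipment domain of the case study:

```python
def evaluate(node, events):
    """Recursively evaluate a fault tree given the set of basic events
    that occurred. A node is either a basic-event name (string) or a
    ('AND'|'OR', [children]) gate."""
    if isinstance(node, str):
        return node in events
    gate, children = node
    results = [evaluate(c, events) for c in children]
    return all(results) if gate == "AND" else any(results)

# Hypothetical tree: the hazard occurs if the hardware interlock fails
# AND (the software dose check fails OR an operator override is active).
tree = ("AND", ["interlock_fail",
                ("OR", ["sw_dose_check_fail", "operator_override"])])

assert not evaluate(tree, {"sw_dose_check_fail"})  # interlock still holds
assert evaluate(tree, {"interlock_fail", "operator_override"})
```

The point of such trees in a safety specification is that software failure events appear as first-class basic events alongside hardware ones.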
NASA Technical Reports Server (NTRS)
1979-01-01
Structural analysis and certification of the collector system is presented. System verification against the interim performance criteria is presented and indicated by matrices. The verification discussion, analysis, and test results are also given.
30 CFR 250.911 - If my platform is subject to the Platform Verification Program, what must I do?
Code of Federal Regulations, 2010 CFR
2010-07-01
... a project management timeline, Gantt Chart, that depicts when interim and final reports required by... 30 Mineral Resources 2 2010-07-01 false If my platform is subject to the Platform Verification Program, what must I do? 250.911 Section 250.911 Mineral Resources MINERALS MANAGEMENT SERVICE...
ERIC Educational Resources Information Center
Wu, Peter Y.; Manohar, Priyadarshan A.; Acharya, Sushil
2016-01-01
It is well known that interesting questions can stimulate thinking and invite participation. Class exercises are designed to make use of questions to engage students in active learning. In a project toward building a community skilled in software verification and validation (SV&V), we critically review and further develop course materials in…
The purpose of this SOP is to define the coding strategy for coding and coding verification of hand-entered data. It applies to the coding of all physical forms, especially those coded by hand. The strategy was developed for use in the Arizona NHEXAS project and the "Border" st...
The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the Border study. Keywords: Computers; Software; QA/QC.
The U.S.-Mexico Border Program is sponsored ...
This verification study was a special project designed to determine the efficacy of a draft standard operating procedure (SOP) developed by US EPA Region 3 for the determination of selected glycols in drinking waters that may have been impacted by active unconventional oil and ga...
WEC3: Wave Energy Converter Code Comparison Project: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien
This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases: Phase I consists of code-to-code verification, and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency-domain modelling tools were not included in the WEC3 project.
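The class of mid-fidelity, time-domain model that WEC3 compares can be sketched in miniature as a single-degree-of-freedom heave oscillator: (m + A) z&#x308; + B z&#x307; + C z = F(t), where A, B, and C are hydrodynamic coefficients. All numbers below are illustrative assumptions; this is not WEC3 code:

```python
import math

def simulate_heave(mass, added_mass, damping, stiffness,
                   wave_force, dt=0.01, steps=5000):
    """Minimal time-domain sketch of a single-body WEC in heave:
    (m + A) z'' + B z' + C z = F(t), integrated with semi-implicit Euler."""
    z, v = 0.0, 0.0
    history = []
    for n in range(steps):
        t = n * dt
        a = (wave_force(t) - damping * v - stiffness * z) / (mass + added_mass)
        v += a * dt          # update velocity first (symplectic Euler)
        z += v * dt          # then position, using the new velocity
        history.append(z)
    return history

# Regular-wave excitation with made-up magnitudes and coefficients.
force = lambda t: 1.0e4 * math.sin(0.8 * t)
z = simulate_heave(mass=1.0e5, added_mass=5.0e4, damping=2.0e4,
                   stiffness=3.0e5, wave_force=force)
assert max(abs(x) for x in z) > 0.0  # the body responds to the forcing
```

Real mid-fidelity codes add multibody coupling, frequency-dependent radiation forces, and power take-off models on top of this basic structure.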
NASA Technical Reports Server (NTRS)
Landano, M. R.; Easter, R. W.
1984-01-01
Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties which require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.
NASA Technical Reports Server (NTRS)
Davis, Robert E.
2002-01-01
The presentation provides an overview of requirement and interpretation letters, mechanical systems safety interpretation letter, design and verification provisions, and mechanical systems verification plan.
Configuration Management Process Assessment Strategy
NASA Technical Reports Server (NTRS)
Henry, Thad
2014-01-01
Purpose: To propose a strategy for assessing the development and effectiveness of configuration management (CM) systems within programs, projects, and design activities performed by technical organizations and their supporting development contractors. Scope: The CM systems of various entities will be assessed depending on project scope (DDT&E), support services, and acquisition agreements. Approach: A model-based assessment structured against the organization's CM requirements, including best-practice maturity criteria; the model is tailored to the entity being assessed depending on its CM system. The assessment approach provides objective feedback to engineering and project management on the observed CM system's maturity state versus the ideal state of the configuration management processes and outcomes.
• Identifies strengths and risks rather than audit "gotchas" (findings/observations).
• Used recursively and iteratively throughout the program life cycle at select points of need (typical assessment timing is post-PDR/post-CDR).
• Ideal-state criteria and maturity targets are reviewed with the assessed entity prior to an assessment (tailoring) and depend on the assessed phase of the CM system.
• Supports exit success criteria for Preliminary and Critical Design Reviews.
• Gives a comprehensive CM system assessment, which ultimately supports configuration verification activities.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-26
... Verification (EIV) System User Access Authorization Form and Rules of Behavior and User Agreement AGENCY... lists the following information: Title of Proposal: Enterprise Income Verification (EIV) System User Access, Authorization Form and Rules Of Behavior and User Agreement. OMB Approval Number: 2577-New. Form...
Control and Non-Payload Communications (CNPC) Prototype Radio Verification Test Report
NASA Technical Reports Server (NTRS)
Bishop, William D.; Frantz, Brian D.; Thadhani, Suresh K.; Young, Daniel P.
2017-01-01
This report provides an overview and results from the verification of the specifications that define the operational capabilities of the airborne and ground, L-band and C-band, Control and Non-Payload Communications radio link system. An overview of system verification is provided along with an overview of the operation of the radio. Measurement results are presented for verification of the radio's operation.
General Environmental Verification Specification
NASA Technical Reports Server (NTRS)
Milne, J. Scott, Jr.; Kaufman, Daniel S.
2003-01-01
The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates are presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning, tracking of the verification programs.
NASA Astrophysics Data System (ADS)
Chow, L.; Fai, S.
2017-08-01
The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.
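The element-level verification system described above — recording the data source behind each model element so viewers can judge its provenance — can be sketched as a simple provenance map. The element IDs and source categories below are illustrative, not CIMS's actual implementation:

```python
# Hypothetical source categories, mirroring the heterogeneous datasets
# named in the abstract (point clouds, archival material, reports).
SOURCES = {"laser_scan", "archival_drawing", "photograph", "technical_report"}

def tag_element(model, element_id, source):
    """Attach a data-source tag to a BIM element so model users can see
    how each element was derived."""
    if source not in SOURCES:
        raise ValueError(f"unknown source: {source}")
    model[element_id] = source
    return model

model = {}
tag_element(model, "wall_N01", "laser_scan")         # captured by TLS
tag_element(model, "cornice_E12", "archival_drawing")  # no point cloud coverage
assert model["wall_N01"] == "laser_scan"
```

In a BIM authoring tool such tags would typically live in a shared parameter on each element rather than an external table; the sketch only conveys the bookkeeping idea.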
Software Development Technologies for Reactive, Real-Time, and Hybrid Systems
NASA Technical Reports Server (NTRS)
Manna, Zohar
1996-01-01
The research is directed towards the design and implementation of a comprehensive deductive environment for the development of high-assurance systems, especially reactive (concurrent, real-time, and hybrid) systems. Reactive systems maintain an ongoing interaction with their environment, and are among the most difficult to design and verify. The project aims to provide engineers with a wide variety of tools within a single, general, formal framework in which the tools will be most effective. The entire development process is considered, including the construction, transformation, validation, verification, debugging, and maintenance of computer systems. The goal is to automate the process as much as possible and reduce the errors that pervade hardware and software development.
A Generic Modeling Process to Support Functional Fault Model Development
NASA Technical Reports Server (NTRS)
Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.
2016-01-01
Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
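The failure-effect propagation that an FFM simulates — tracing paths from a failure mode to the observation points that would see it — amounts to a reachability search over a directed graph. The component graph below is hypothetical:

```python
from collections import deque

def affected_observations(graph, failure_mode, observation_points):
    """Breadth-first propagation of a failure effect through a directed
    component graph; returns which observation points the effect reaches."""
    seen, queue = {failure_mode}, deque([failure_mode])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen & set(observation_points)

# Hypothetical propagation paths: a stuck valve reaches the pressure
# sensor through the feed line, but never the temperature sensor.
graph = {"valve_stuck": ["feed_line"],
         "feed_line": ["press_sensor"],
         "heater_fail": ["temp_sensor"]}
assert affected_observations(graph, "valve_stuck",
                             ["press_sensor", "temp_sensor"]) == {"press_sensor"}
```

Diagnosis then runs this mapping in reverse: given which observation points report anomalies, infer which failure modes could explain them.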
Communication Architecture in Mixed-Reality Simulations of Unmanned Systems
Selecký, Martin; Faigl, Jan; Rollo, Milan
2018-03-14
Verification of the correct functionality of multi-vehicle systems in high-fidelity scenarios is required before any deployment of such a complex system, e.g., in missions of remote sensing or in mobile sensor networks. Mixed-reality simulations where both virtual and physical entities can coexist and interact have been shown to be beneficial for development, testing, and verification of such systems. This paper deals with the problems of designing a certain communication subsystem for such highly desirable realistic simulations. Requirements of this communication subsystem, including proper addressing, transparent routing, visibility modeling, or message management, are specified prior to designing an appropriate solution. Then, a suitable architecture of this communication subsystem is proposed together with solutions to the challenges that arise when simultaneous virtual and physical message transmissions occur. The proposed architecture can be utilized as a high-fidelity network simulator for vehicular systems with implicit mobility models that are given by real trajectories of the vehicles. The architecture has been utilized within multiple projects dealing with the development and practical deployment of multi-UAV systems, which support the architecture’s viability and advantages. The provided experimental results show the achieved similarity of the communication characteristics of the fully deployed hardware setup to the setup utilizing the proposed mixed-reality architecture. PMID:29538290
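The visibility-modeling requirement named in the abstract can be sketched minimally as a distance-based range check applied uniformly to virtual and physical entities. The entity names and range are made up; the actual architecture is far richer:

```python
import math

def deliver(sender, receiver, positions, visibility_range):
    """Visibility-model sketch: a message between two entities (virtual or
    physical) is delivered only if their simulated positions are within
    the modelled radio range."""
    (x1, y1), (x2, y2) = positions[sender], positions[receiver]
    return math.hypot(x2 - x1, y2 - y1) <= visibility_range

# One physical UAV and one virtual UAV sharing a common coordinate frame.
positions = {"uav_real": (0.0, 0.0), "uav_virtual": (30.0, 40.0)}
assert deliver("uav_real", "uav_virtual", positions, visibility_range=50.0)
assert not deliver("uav_real", "uav_virtual", positions, visibility_range=49.0)
```

Because the check is applied to simulated positions regardless of whether an entity is real or virtual, the same code path governs hardware-in-the-loop and purely simulated transmissions.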
Sequence verification of synthetic DNA by assembly of sequencing reads
Wilson, Mandy L.; Cai, Yizhi; Hanlon, Regina; Taylor, Samantha; Chevreux, Bastien; Setubal, João C.; Tyler, Brett M.; Peccoud, Jean
2013-01-01
Gene synthesis attempts to assemble user-defined DNA sequences with base-level precision. Verifying the sequences of construction intermediates and the final product of a gene synthesis project is a critical part of the workflow, yet one that has received the least attention. Sequence validation is equally important for other kinds of curated clone collections. Ensuring that the physical sequence of a clone matches its published sequence is a common quality control step performed at least once over the course of a research project. GenoREAD is a web-based application that breaks the sequence verification process into two steps: the assembly of sequencing reads and the alignment of the resulting contig with a reference sequence. GenoREAD can determine if a clone matches its reference sequence. Its sophisticated reporting features help identify and troubleshoot problems that arise during the sequence verification process. GenoREAD has been experimentally validated on thousands of gene-sized constructs from an ORFeome project, and on longer sequences including whole plasmids and synthetic chromosomes. Comparing GenoREAD results with those from manual analysis of the sequencing data demonstrates that GenoREAD tends to be conservative in its diagnostic. GenoREAD is available at www.genoread.org. PMID:23042248
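The second of GenoREAD's two steps — comparing an assembled contig against a reference sequence — can be illustrated with a naive, gap-free comparison. This is a sketch only (real sequence verification uses proper alignment that handles insertions and deletions):

```python
def verify_clone(contig, reference):
    """Position-by-position comparison of an assembled contig against a
    reference sequence; reports mismatching positions (no gap handling)."""
    if len(contig) != len(reference):
        return {"match": False, "reason": "length differs", "mismatches": None}
    mismatches = [i for i, (c, r) in enumerate(zip(contig, reference)) if c != r]
    return {"match": not mismatches, "mismatches": mismatches}

assert verify_clone("ACGTACGT", "ACGTACGT")["match"]
assert verify_clone("ACGAACGT", "ACGTACGT")["mismatches"] == [3]  # one SNP
```

Reporting the mismatch positions, rather than a bare pass/fail, is what makes such a tool useful for troubleshooting a synthesis workflow.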
Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems
NASA Technical Reports Server (NTRS)
Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)
2003-01-01
Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.
Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana
2011-01-01
The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties. PMID:21713128
ROVER : prototype roving verification van : transportation project summary
DOT National Transportation Integrated Search
1997-06-01
The purpose of this project is to verify the safety and legality of commercial vehicles at both fixed and mobile roadside sites, improving the efficiency, safety, and effectiveness of commercial vehicle operations through the use of timely, accurate ...
The use of robots for arms control treaty verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michalowski, S.J.
1991-01-01
Many aspects of the superpower relationship now present a new set of challenges and opportunities, including the vital area of arms control. This report addresses one such possibility: the use of robots for the verification of arms control treaties. The central idea of this report is far from commonly accepted; in fact, it was encountered only once in the bibliographic review phase of the project. Nonetheless, the incentive for using robots is simple and coincides with that of industrial applications: to replace or supplement human activity in the performance of tasks for which human participation is unnecessary, undesirable, impossible, too dangerous, or too expensive. As in industry, robots should replace workers (in this case, arms control inspectors) only when questions of efficiency, reliability, safety, security, and cost-effectiveness have been answered satisfactorily. In writing this report, it is not our purpose to strongly advocate the application of robots in verification. Rather, we wish to explore the significant aspects, pro and con, of applying experience from the field of flexible automation to the complex task of assuring arms control treaty compliance. We want to establish a framework for further discussion of this topic and to define criteria for evaluating future proposals. The author's expertise is in robots, not arms control. His practical experience has been in developing systems for use in the rehabilitation of severely disabled persons (such as quadriplegics), who can use robots for assistance during activities of everyday living, as well as in vocational applications. This creates a special interest in implementations that, in some way, include a human operator in the control scheme of the robot. As we hope to show in this report, such interactive systems offer the greatest promise of making a contribution to the challenging problems of treaty verification. 15 refs.
Long-Term Durability Analysis of a 100,000+ Hr Stirling Power Convertor Heater Head
NASA Technical Reports Server (NTRS)
Bartolotta, Paul A.; Bowman, Randy R.; Krause, David L.; Halford, Gary R.
2000-01-01
DOE and NASA have identified Stirling Radioisotope Power Systems (SRPS) as the power supply for the deep space exploration missions Europa Orbiter and Solar Probe. As a part of this effort, NASA has initiated a long-term durability project for critical hot-section components of the Stirling power convertor to qualify flight hardware. This project will develop a life prediction methodology that utilizes short-term (t < 20,000 hr) test data to verify long-term (t > 100,000 hr) design life. The project consists of generating a materials database for the specific heater head alloy, evaluation of critical hermetic sealed joints, life model characterization, and model verification. This paper will describe the qualification methodology being developed and provide a status for this effort.
Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...
Verification of an on line in vivo semiconductor dosimetry system for TBI with two TLD procedures.
Sánchez-Doblado, F; Terrón, J A; Sánchez-Nieto, B; Arráns, R; Errazquin, L; Biggs, D; Lee, C; Núñez, L; Delgado, A; Muñiz, J L
1995-01-01
This work presents the verification of an on-line in vivo dosimetry system based on semiconductors. Software and hardware have been designed to convert the diode signal into absorbed dose. Final verification was made in the form of an intercomparison with two independent thermoluminescent (TLD) dosimetry systems, under TBI conditions.
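The signal-to-dose conversion described in this abstract can be sketched minimally as a calibration factor plus multiplicative corrections; the function, factor names, and numbers below are illustrative assumptions, not the paper's actual calibration.

```python
def diode_signal_to_dose(signal_nc, calibration_factor, correction_factors=()):
    """Convert a diode reading (nC) to absorbed dose (Gy).

    calibration_factor: Gy per nC, typically obtained by cross-calibration
    against an ionization chamber (hypothetical value in the example).
    correction_factors: multiplicative corrections, e.g. for temperature
    or source-to-skin distance (also hypothetical here).
    """
    dose = signal_nc * calibration_factor
    for factor in correction_factors:
        dose *= factor
    return dose

# Hypothetical example: 25.0 nC reading, 0.04 Gy/nC calibration,
# and a 2% temperature correction.
dose = diode_signal_to_dose(25.0, 0.04, (1.02,))
```

In a real system the calibration factor would be re-derived periodically, since diode sensitivity drifts with accumulated dose.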
Code of Federal Regulations, 2012 CFR
2012-10-01
... Third-Party Assessment of PTC System Safety Verification and Validation F Appendix F to Part 236... Safety Verification and Validation (a) This appendix provides minimum requirements for mandatory independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or I...
Code of Federal Regulations, 2014 CFR
2014-10-01
... Third-Party Assessment of PTC System Safety Verification and Validation F Appendix F to Part 236... Safety Verification and Validation (a) This appendix provides minimum requirements for mandatory independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or I...
Code of Federal Regulations, 2011 CFR
2011-10-01
... Third-Party Assessment of PTC System Safety Verification and Validation F Appendix F to Part 236... Safety Verification and Validation (a) This appendix provides minimum requirements for mandatory independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or I...
Code of Federal Regulations, 2013 CFR
2013-10-01
... Third-Party Assessment of PTC System Safety Verification and Validation F Appendix F to Part 236... Safety Verification and Validation (a) This appendix provides minimum requirements for mandatory independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or I...
System Engineering Infrastructure Evolution Galileo IOV and the Steps Beyond
NASA Astrophysics Data System (ADS)
Eickhoff, J.; Herpel, H.-J.; Steinle, T.; Birn, R.; Steiner, W.-D.; Eisenmann, H.; Ludwig, T.
2009-05-01
The trend toward increasingly constrained financial budgets in satellite engineering requires permanent optimization of the S/C system engineering processes and infrastructure. In recent years, Astrium has built up a system simulation infrastructure - the "Model-based Development & Verification Environment" - which is meanwhile well known all over Europe and is established as Astrium's standard approach for ESA and DLR projects, and now even for the EU/ESA project Galileo IOV. The key feature of the MDVE/FVE approach is to provide an entire S/C simulation (with full-featured OBC simulation) already in early phases, so that OBSW code tests can start on a simulated S/C, with hardware later added in the loop step by step up to an entire "Engineering Functional Model (EFM)" or "FlatSat". The subsequent enhancements to this simulator infrastructure w.r.t. spacecraft design data handling are reported in the following sections.
DOE Office of Scientific and Technical Information (OSTI.GOV)
CARTER, R.P.
1999-11-19
The U.S. Department of Energy (DOE) commits to accomplishing its mission safely. To ensure this objective is met, DOE issued DOE P 450.4, Safety Management System Policy, and incorporated safety management into the DOE Acquisition Regulations ([DEAR] 48 CFR 970.5204-2 and 90.5204-78). Integrated Safety Management (ISM) requires contractors to integrate safety into management and work practices at all levels so that missions are achieved while protecting the public, the worker, and the environment. The contractor is required to describe the Integrated Safety Management System (ISMS) to be used to implement the safety performance objective.
A combined system for 3D printing cybersecurity
NASA Astrophysics Data System (ADS)
Straub, Jeremy
2017-06-01
Previous work has discussed the impact of cybersecurity breaches on 3D printed objects. Multiple attack types that could weaken objects, make them unsuitable for certain applications and even create safety hazards have been presented. This paper considers a visible light sensing-based verification system's efficacy as a means of thwarting cybersecurity threats to 3D printing. This system detects discrepancies between expected and actual printed objects (based on an independent pristine CAD model). Whether reliance on an independent CAD model is appropriate is also considered. The future of 3D printing is projected and the importance of cybersecurity in this future is discussed.
DOT National Transportation Integrated Search
2016-11-21
Work presented herein is an addendum to the final report for NCDOT Project 2011-05, entitled "Field Verification of Undercut Criteria and Alternatives for Subgrade Stabilization in the Piedmont Area." The objective of the addendum work is to p...
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the ...
46 CFR 61.40-3 - Design verification testing.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...
The DPACS project at the University of Trieste.
Fioravanti, F; Inchingolo, P; Valenzin, G; Dalla Palma, L
1997-01-01
The DPACS project (Data and Picture Archiving and Communication System) was undertaken at the University of Trieste by the Institute of Radiology and the DEEI (Dipartimento di Elettrotecnica, Elettronica ed Informatica), in collaboration with the CRSTBS (Centro Ricerche e Studi Tecnologie Biomediche Sanitarie) of the Area Science Park and the Azienda Ospedaliera of Trieste. The main objective of this project is to create an open system for the management of clinical data and images and for the integration of health care services. The first phase is oriented toward finding an implementation strategy for the creation of a prototype DPACS system, to serve as a starting point for the realization of a distributed structure for the extension of the service, firstly to the entire structure of the Cattinara Hospital and subsequently to all the Public Health units in Trieste. After local testing, the service will finally be expanded to a wider geographical level. The intensive computerization of the Institute of Radiology furnished the most favourable environment for the verification of the prototype, as the service provided by the existing RIS (Radiology Information System) and PACS (Picture and Archiving Communication System) has long been consolidated. One of the main goals of the project, in particular, is to replace the old, by now obsolete, PACS with the DPACS services.
Options and Risk for Qualification of Electric Propulsion System
NASA Technical Reports Server (NTRS)
Bailey, Michelle; Daniel, Charles; Cook, Steve (Technical Monitor)
2002-01-01
Electric propulsion vehicle systems encompass a wide range of propulsion alternatives, including solar and nuclear, which present unique circumstances for qualification. This paper will address the alternatives for qualification of electric propulsion spacecraft systems. The approach taken will be to address the considerations for qualification at the various levels of systems definition. Additionally, for each level of qualification the system level risk implications will be developed. Also, the paper will explore the implications of analysis versus test for various levels of systems definition, while retaining the objectives of a verification program. The limitations of terrestrial testing will be explored along with the risk and implications of orbital demonstration testing. The paper will seek to develop a template for structuring a verification program based on cost, risk, and value return. A successful verification program should establish controls and define objectives of the verification compliance program. Finally, the paper will seek to address the political and programmatic factors which may impact options for system verification.
NASA Astrophysics Data System (ADS)
Roed-Larsen, Trygve; Flach, Todd
The purpose of this chapter is to provide a review of existing national and international requirements for verification of greenhouse gas reductions and associated accreditation of independent verifiers. The credibility of results claimed to reduce or remove anthropogenic emissions of greenhouse gases (GHG) is of utmost importance for the success of emerging schemes to reduce such emissions. Requirements include transparency, accuracy, consistency, and completeness of the GHG data. The many independent verification processes that have developed recently now make up a quite elaborate tool kit for best practices. The UN Framework Convention on Climate Change and the Kyoto Protocol specifications for project mechanisms initiated this work, but other national and international actors also work intensely with these issues. One initiative gaining wide application is that taken by the World Business Council for Sustainable Development with the World Resources Institute to develop a "GHG Protocol" to assist companies in arranging for auditable monitoring and reporting processes of their GHG activities. A set of new international standards developed by the International Organization for Standardization (ISO) provides specifications for the quantification, monitoring, and reporting of company entity and project-based activities. The ISO is also developing specifications for recognizing independent GHG verifiers. This chapter covers this background with the intent of providing a common understanding of all efforts undertaken in different parts of the world to secure the reliability of GHG emission reduction and removal activities. These verification schemes may provide valuable input to current efforts of securing a comprehensive, trustworthy, and robust framework for verification activities of CO2 capture, transport, and storage.
Verification and Implementation of Operations Safety Controls for Flight Missions
NASA Technical Reports Server (NTRS)
Smalls, James R.; Jones, Cheryl L.; Carrier, Alicia S.
2010-01-01
There are several engineering disciplines, such as reliability, supportability, quality assurance, human factors, risk management, and safety. Safety is an extremely important engineering specialty within NASA, and a loss of crew is considered a catastrophic event. Safety is not difficult to achieve when it is properly integrated at the beginning of each space systems project, at the start of mission planning. The key is to ensure proper handling of safety verification throughout each flight/mission phase. Today, Safety and Mission Assurance (S&MA) operations engineers continue to conduct these flight product reviews across all open flight products. As such, these reviews help ensure that each mission is accomplished with safety requirements and controls heavily embedded in applicable flight products. Most importantly, the S&MA operations engineers are required to look for important design and operations controls so that safety is strictly adhered to and reflected in the final flight product.
Avionics Technology Contract Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Sappe', Hoyt; Squires, Shiela S.
This document reports on Phase I of a project that examined the occupation of avionics technician, established appropriate committees, and conducted task verification. Results of this phase provide the basic information required to develop program standards and to set up the committee structure to guide the project. Section 1…
Face verification with balanced thresholds.
Yan, Shuicheng; Xu, Dong; Tang, Xiaoou
2007-01-01
The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved by the following aspects. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction, and, hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification which are directly transplanted from face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.
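The constrained transform described in this abstract (an affine map factored as an orthogonal matrix times a diagonal matrix, with verification decided by a distance threshold) can be sketched as follows; the matrices and threshold here are illustrative placeholders, not the result of TBT's iterative optimization.

```python
import math

# Constrain the map as A = D @ Q: Q orthogonal (a 2-D rotation here) and
# D diagonal scaling, mirroring the paper's factorization. The numbers
# are hypothetical placeholders, not a learned TBT solution.
theta = math.pi / 6
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]
D = [[1.5, 0.0],
     [0.0, 0.8]]

def matvec(M, v):
    """Multiply matrix M by vector v."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def transform(x):
    """Apply the factored transform: scale after rotating."""
    return matvec(D, matvec(Q, x))

def verify(x1, x2, threshold):
    """Accept the pair as the same identity if the distance between
    the transformed features is below the (class-balanced) threshold."""
    return math.dist(transform(x1), transform(x2)) < threshold

assert verify([1.0, 2.0], [1.0, 2.0], 1e-9)  # identical features always verify
```

The point of the factorization is that the orthogonal part preserves geometry while the diagonal part rescales dimensions, which is what the iterative optimization in the paper adjusts to balance class-specific thresholds.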
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Lei, E-mail: lei.ren@duke.edu; Yin, Fang-Fang; Zhang, You
Purpose: Currently, no 3D or 4D volumetric x-ray imaging techniques are available for intrafraction verification of target position during actual treatment delivery or in-between treatment beams, which is critical for stereotactic radiosurgery (SRS) and stereotactic body radiation therapy (SBRT) treatments. This study aims to develop a limited-angle intrafraction verification (LIVE) system to use prior information, deformation models, and limited angle kV-MV projections to verify target position intrafractionally. Methods: The LIVE system acquires limited-angle kV projections simultaneously during arc treatment delivery or in-between static 3D/IMRT treatment beams as the gantry moves from one beam to the next. Orthogonal limited-angle MV projections are acquired from the beam's eye view (BEV) exit fluence of arc treatment beam or in-between static beams to provide additional anatomical information. MV projections are converted to kV projections using a linear conversion function. Patient prior planning CT at one phase is used as the prior information, and the on-board patient volume is considered as a deformation of the prior images. The deformation field is solved using the data fidelity constraint, a breathing motion model extracted from the planning 4D-CT based on principal component analysis (PCA) and a free-form deformation (FD) model. LIVE was evaluated using a 4D digital extended cardiac torso phantom (XCAT) and a CIRS 008A dynamic thoracic phantom. In the XCAT study, patient breathing pattern and tumor size changes were simulated from CT to treatment position. In the CIRS phantom study, the artificial target in the lung region experienced both size change and position shift from CT to treatment position. Varian Truebeam research mode was used to acquire kV and MV projections simultaneously during the delivery of a dynamic conformal arc plan. 
The reconstruction accuracy was evaluated by calculating the 3D volume percentage difference (VPD) and the center of mass (COM) difference of the tumor in the true on-board images and reconstructed images. Results: In both simulation and phantom studies, LIVE achieved substantially better reconstruction accuracy than reconstruction using PCA or FD deformation model alone. In the XCAT study, the average VPD and COM differences among different patient scenarios for LIVE system using orthogonal 30° scan angles were 4.3% and 0.3 mm when using kV+BEV MV. Reducing scan angle to 15° increased the average VPD and COM differences to 15.1% and 1.7 mm. In the CIRS phantom study, the VPD and COM differences for the LIVE system using orthogonal 30° scan angles were 6.4% and 1.4 mm. Reducing scan angle to 15° increased the VPD and COM differences to 51.9% and 3.8 mm. Conclusions: The LIVE system has the potential to substantially improve intrafraction target localization accuracy by providing volumetric verification of tumor position simultaneously during arc treatment delivery or in-between static treatment beams. With this improvement, LIVE opens up a new avenue for margin reduction and dose escalation in both fractionated treatments and SRS and SBRT treatments.
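The two accuracy metrics named in this abstract can be made concrete with a small sketch; the exact formulas used in the study are not given in the abstract, so the definitions below, based on binary voxel sets, are assumptions.

```python
def vpd(truth, recon):
    """Volume percentage difference between two binary voxel sets:
    voxels in exactly one of the two volumes, as a percentage of the
    true volume (one common definition; the paper's may differ)."""
    return 100.0 * len(truth ^ recon) / len(truth)

def com(voxels):
    """Center of mass of a set of (x, y, z) voxel coordinates."""
    n = len(voxels)
    return tuple(sum(v[i] for v in voxels) / n for i in range(3))

def com_shift(truth, recon):
    """Euclidean distance between the two centers of mass, in voxels."""
    a, b = com(truth), com(recon)
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

# Toy example: a 2x2x2 'tumor' versus a reconstruction shifted one voxel in x.
truth = {(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)}
recon = {(x + 1, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)}
```

For the toy case above, half the shifted volume still overlaps the truth, so the symmetric difference equals the full true volume (VPD of 100%) while the COM shift is exactly the one-voxel translation.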
Tethered Satellite System Project Overview
NASA Technical Reports Server (NTRS)
Laue, J. H.
1985-01-01
The Skyhook concept is reviewed and the use of a tethered satellite system (TSS) to enable scientific investigations from the Shuttle using a closed-loop control system is examined. The tethered satellite system has capabilities for deployment toward or away from Earth, for multiple round-trip missions, and for deployment at distances up to 100 km from the orbiter. The deployer, which consists of an extendable boom, a reel for the tether, and the tether itself, permits deployment and retrieval at a safe distance, allows alignment of the force vector of the tether through the center of gravity of the Shuttle, and gives some initial gravity-gradient separation to aid in deployment and ultimate retrieval of the tethered satellite. Charts show TSS activities in terms of systems studies, key guidelines, Italian and U.S. responsibilities, user activities, and major science and applications accommodation features. Scientific objectives for TSS-1 and TSS-2 verification missions and the current status of the project are also given.
Thermal hydraulic feasibility assessment of the hot conditioning system and process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heard, F.J.
1996-10-10
The Spent Nuclear Fuel Project was established to develop engineered solutions for the expedited removal, stabilization, and storage of spent nuclear fuel from the K Basins at the U.S. Department of Energy's Hanford Site in Richland, Washington. A series of analyses have been completed investigating the thermal-hydraulic performance and feasibility of the proposed Hot Conditioning System and process for the Spent Nuclear Fuel Project. The analyses were performed using a series of thermal-hydraulic models that could respond to all process and safety-related issues that may arise pertaining to the Hot Conditioning System. The subject efforts focus on independently investigating, quantifying, and establishing the governing heat production and removal mechanisms, flow distributions within the multi-canister overpack, and performing process simulations for various purge gases under consideration for the Hot Conditioning System, as well as obtaining preliminary results for comparison with and verification of other analyses, and providing technology-based recommendations for consideration and incorporation into the Hot Conditioning System design bases.
Firing Room Remote Application Software Development
NASA Technical Reports Server (NTRS)
Liu, Kan
2014-01-01
The Engineering and Technology Directorate (NE) at National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of the Space Launch System (SLS) and future rockets. The purposes of the semester-long internship as a remote application software developer include the design, development, integration, and verification of the software and hardware in the firing rooms, in particular with the Mobile Launcher (ML) Launch Accessories subsystem. In addition, a Conversion Fusion project was created to show specific approved checkout and launch engineering data for public-friendly display purposes.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-06
...] Domestic Origin Verification System Questionnaire and Regulations Governing Inspection and Certification of... inspection at the above office during regular business hours. Please be advised that all comments submitted... submitting the comments will be made public. SUPPLEMENTARY INFORMATION: Title: ``Domestic Origin Verification...
AQUEOUS CLEANING OF PRINTED CIRCUIT BOARD STENCILS
The USEPA, through NRMRL, has partnered with the California Dept. of Toxic Substance Control under an ETV Pilot Project to verify pollution prevention, recycling, and waste treatment technologies. One of the projects selected for verification was the ultrasonic aqueous cleaning tec...
SMART-1, Platform Design and Project Status
NASA Astrophysics Data System (ADS)
Sjoberg, F.
SMART-1 is the first of the Small Missions for Advanced Research and Technology (SMART), an element of ESA's Horizons 2000 plan for scientific projects. These missions aim at testing key technologies for future Cornerstone missions. The mission of SMART-1 is the flight demonstration of Electric Primary Propulsion for a scientifically relevant deep space trajectory. More specifically, SMART-1 will be launched into a geostationary transfer orbit and use a single ion thruster to achieve lunar orbit. Platform features include: a modern avionics architecture with a clean-cut control hierarchy; extensive Failure Detection, Isolation and Recovery (FDIR) capabilities following the control hierarchy; an advanced power control and distribution system; and a newly developed gimbal mechanism for the orientation of the electric ion thruster. The project is currently in the FM AIT phase, scheduled for launch in late 2002. The paper will describe the SMART-1 spacecraft platform design as well as the current project and spacecraft verification status.
Verification testing of the ReCip® RTS-500 System was conducted over a 12-month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located on Otis Air National Guard Base in Bourne, Massachusetts. A nine-week startup period preceded the verification test t...
Overview of the TOPEX/Poseidon Platform Harvest Verification Experiment
NASA Technical Reports Server (NTRS)
Morris, Charles S.; DiNardo, Steven J.; Christensen, Edward J.
1995-01-01
An overview is given of the in situ measurement system installed on Texaco's Platform Harvest for verification of the sea level measurement from the TOPEX/Poseidon satellite. The prelaunch error budget suggested that the total root mean square (RMS) error due to measurements made at this verification site would be less than 4 cm. The actual error budget for the verification site is within these original specifications. However, evaluation of the sea level data from three measurement systems at the platform has resulted in unexpectedly large differences between the systems. Comparison of the sea level measurements from the different tide gauge systems has led to a better understanding of the problems of measuring sea level in relatively deep ocean. As of May 1994, the Platform Harvest verification site has successfully supported 60 TOPEX/Poseidon overflights.
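A prelaunch error budget of the kind described here is typically formed by combining independent error sources in quadrature (root sum square). A minimal sketch with hypothetical component values, not the actual TOPEX/Poseidon budget terms:

```python
import math

# Hypothetical, illustrative component errors (cm) for a sea-level
# verification site; the real budget terms and values differ.
components_cm = {
    "tide gauge measurement": 2.0,
    "geodetic height tie": 2.5,
    "environmental corrections": 1.5,
}

# Independent error sources combine in quadrature (root sum square),
# so the total is smaller than the simple sum of the components.
total_rms_cm = math.sqrt(sum(e ** 2 for e in components_cm.values()))
assert total_rms_cm < 4.0  # within the stated 4 cm budget
```

The quadrature rule is what lets several centimeter-level error sources still fit under a 4 cm total, provided they are genuinely uncorrelated.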
NASA Technical Reports Server (NTRS)
Srivas, Mandayam; Bickford, Mark
1991-01-01
The design and formal verification of a hardware system for a task that is an important component of a fault-tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (Byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.
Towards the formal verification of the requirements and design of a processor interface unit
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings', which includes the general-purpose HOL theories and definitions that support the PIU verification as well as tactics used in the proofs.
NASA Technical Reports Server (NTRS)
Ryan, R. S.; Bullock, T.; Holland, W. B.; Kross, D. A.; Kiefling, L. A.
1981-01-01
The achievement of an optimized design from the system standpoint under the low-cost, high-risk constraints of the present-day environment was analyzed. The Space Shuttle illustrates the requirement for an analysis approach that considers all major disciplines simultaneously (coupling between structures, control, propulsion, thermal, aeroelastic, and performance). The Space Shuttle and certain payloads, Space Telescope and Spacelab, are examined. The requirements for system analysis approaches and criteria, including dynamic modeling requirements, test requirements, control requirements, and the resulting design verification approaches, are illustrated. A survey of the problem, potential approaches available as solutions, implications for future systems, and projected technology development areas are addressed.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory..., ``Verification, Validation, Reviews, and Audits for Digital Computer Software used in Safety Systems of Nuclear... NRC regulations promoting the development of, and compliance with, software verification and...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.
This paper summarizes the findings from Phase Ib of the Offshore Code Comparison, Collaboration, Continued with Correlation (OC5) project. OC5 is a project run under the International Energy Agency (IEA) Wind Research Task 30, and is focused on validating the tools used for modelling offshore wind systems through the comparison of simulated responses of select offshore wind systems (and components) to physical test data. For Phase Ib of the project, simulated hydrodynamic loads on a flexible cylinder fixed to a sloped bed were validated against test measurements made in the shallow water basin at the Danish Hydraulic Institute (DHI) with support from the Technical University of Denmark (DTU). The first phase of OC5 examined two simple cylinder structures (Phase Ia and Ib) to focus on validation of hydrodynamic models used in the various tools before moving on to more complex offshore wind systems and the associated coupled physics. As a result, verification and validation activities such as these lead to improvement of offshore wind modelling tools, which will enable the development of more innovative and cost-effective offshore wind designs.
Development of an Intelligent Monitoring System for Geological Carbon Sequestration (GCS) Systems
NASA Astrophysics Data System (ADS)
Sun, A. Y.; Jeong, H.; Xu, W.; Hovorka, S. D.; Zhu, T.; Templeton, T.; Arctur, D. K.
2016-12-01
Providing stakeholders timely evidence that GCS repositories are operating safely and efficiently requires integrated monitoring to assess the performance of the storage reservoir as the CO2 plume moves within it. GCS projects can therefore be data intensive, owing to the proliferation of digital instrumentation and smart-sensing technologies. GCS projects are also resource intensive, often requiring multidisciplinary teams performing different monitoring, verification, and accounting (MVA) tasks throughout the lifecycle of a project to ensure secure containment of injected CO2. How can an anomaly detected by one sensor be correlated with events observed by other devices to verify a leakage incident? How can resources be optimally allocated for task-oriented monitoring when reservoir integrity is in question? These are issues that warrant further investigation before real integration can take place. In this work, we are building a web-based data integration, assimilation, and learning framework for geologic carbon sequestration projects (DIAL-GCS). DIAL-GCS will be an intelligent monitoring system (IMS) for automating GCS closed-loop management by leveraging recent developments in high-throughput database, complex event processing, data assimilation, and machine learning technologies. Results will be demonstrated using realistic data and models derived from a GCS site.
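The anomaly-correlation question raised in this abstract can be illustrated with a minimal sketch: flag a possible leakage event only when anomalies from at least two distinct sensors fall within a common time window. The sensor names, window length, and two-sensor rule below are hypothetical illustrations, not DIAL-GCS design details.

```python
from collections import namedtuple

Anomaly = namedtuple("Anomaly", ["sensor", "time_s"])

def correlate(anomalies, window_s=600.0, min_sensors=2):
    """Return True if anomalies from at least `min_sensors` distinct
    sensors occur within any `window_s`-second window."""
    events = sorted(anomalies, key=lambda a: a.time_s)
    for i, first in enumerate(events):
        in_window = {a.sensor for a in events[i:]
                     if a.time_s - first.time_s <= window_s}
        if len(in_window) >= min_sensors:
            return True
    return False

# Hypothetical readings: a pressure blip and a geochemical anomaly
# five minutes apart corroborate each other; a lone blip does not.
corroborated = correlate([Anomaly("pressure", 0.0), Anomaly("geochem", 300.0)])
lone = correlate([Anomaly("pressure", 0.0)])
```

A production complex-event-processing engine would apply the same windowed-join idea continuously over streaming sensor data rather than over a finished list.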
Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.; ...
2016-10-13
This paper summarizes the findings from Phase Ib of the Offshore Code Comparison, Collaboration, Continued with Correlation (OC5) project. OC5 is a project run under the International Energy Agency (IEA) Wind Research Task 30 and is focused on validating the tools used for modelling offshore wind systems through the comparison of simulated responses of select offshore wind systems (and components) to physical test data. For Phase Ib of the project, simulated hydrodynamic loads on a flexible cylinder fixed to a sloped bed were validated against test measurements made in the shallow water basin at the Danish Hydraulic Institute (DHI) with support from the Technical University of Denmark (DTU). The first phase of OC5 examined two simple cylinder structures (Phases Ia and Ib) to focus on validation of the hydrodynamic models used in the various tools before moving on to more complex offshore wind systems and the associated coupled physics. Verification and validation activities such as these lead to improvement of offshore wind modelling tools, which will enable the development of more innovative and cost-effective offshore wind designs.
NASA Astrophysics Data System (ADS)
McKellip, Rodney; Yuan, Ding; Graham, William; Holland, Donald E.; Stone, David; Walser, William E.; Mao, Chengye
1997-06-01
The number of available spaceborne and airborne systems will dramatically increase over the next few years. A common systematic approach toward verification of these systems will become important for comparing the systems' operational performance. The Commercial Remote Sensing Program at the John C. Stennis Space Center (SSC) in Mississippi has developed design requirements for a remote sensing verification target range to provide a means to evaluate spatial, spectral, and radiometric performance of optical digital remote sensing systems. The verification target range consists of spatial, spectral, and radiometric targets painted on a 150- by 150-meter concrete pad located at SSC. The design criteria for this target range are based upon work over a smaller, prototypical target range at SSC during 1996. This paper outlines the purpose and design of the verification target range based upon an understanding of the systems to be evaluated as well as data analysis results from the prototypical target range.
40 CFR 1066.240 - Torque transducer verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270 and... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066...
Missile and Space Systems Reliability versus Cost Trade-Off Study
1983-01-01
Robert C. Schneider ... Boeing Aerospace Company ... reliability problems, which have the real bearing on program effectiveness. A well planned and funded reliability effort can prevent or ferret out ... failure analysis, and the incorporation and verification of design corrections to prevent recurrence of failures. 302.2.2 A TMJ test plan shall be ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
A partial acceptance test was conducted on the El Toro Library Solar Energy System, and the detailed results of the various mode acceptance tests are given. All the modes tested function as designed. Collector array efficiencies were calculated at approximately 40%. Chiller COP was estimated at 0.50, with chiller loop flow rates at approximately 85 to 90% of design flow. The acceptance test included visual inspection, preoperational testing and procedure verification, operational mode checkout, and performance testing. (LEW)
Multimodal biometric approach for cancelable face template generation
NASA Astrophysics Data System (ADS)
Paul, Padma Polash; Gavrilova, Marina
2012-06-01
Due to the rapid growth of biometric technology, template protection becomes crucial to secure the integrity of the biometric security system and prevent unauthorized access. Cancelable biometrics is emerging as one of the best solutions for securing biometric identification and verification systems. We present a novel, robust cancelable template generation technique that takes advantage of multimodal biometrics using feature-level fusion. Feature-level fusion of different facial features is applied to generate the cancelable template. A proposed algorithm based on multi-fold random projection and a fuzzy commitment scheme is used for this purpose. In cancelable template generation, one of the main difficulties is preserving the interclass variance of the features. We have found that interclass variations of the features that are lost during multi-fold random projection can be recovered using fusion of different feature subsets and projection into a new feature domain. By applying the multimodal technique at the feature level, we enhance interclass variability and hence improve the performance of the system. We have tested the system with classifier fusion over different feature subsets and with fusion of different cancelable templates. Experiments have shown that the cancelable template improves the performance of the biometric system compared with the original template.
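The core mechanism here, user-keyed random projection, can be sketched briefly. This is a generic illustration of cancelable templates, not the authors' multi-fold algorithm or their fuzzy commitment stage; the fold count, dimensions, and key handling are made-up assumptions:

```python
import numpy as np

def cancelable_template(features, user_key, n_folds=4, out_dim=16):
    """Project a feature vector through several user-keyed random
    matrices (one per 'fold') and concatenate the results.
    Reissuing with a new key yields an unlinkable replacement template."""
    rng = np.random.default_rng(user_key)
    folds = []
    for _ in range(n_folds):
        # Gaussian random projection approximately preserves distances
        # (Johnson-Lindenstrauss), keeping interclass structure usable.
        r = rng.normal(size=(out_dim, features.size)) / np.sqrt(out_dim)
        folds.append(r @ features)
    return np.concatenate(folds)

face = np.random.default_rng(0).normal(size=64)   # stand-in feature vector
t1 = cancelable_template(face, user_key=1234)
t2 = cancelable_template(face, user_key=9999)     # revoked and reissued
```

Matching is then done between projected templates; if a template is compromised, only the key changes, not the underlying biometric.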
NASA Astrophysics Data System (ADS)
Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.
2017-08-01
In this article, the process of verification (calibration) of the secondary equipment of oil metering units is considered. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a software and hardware system that provides automated verification and calibration. The hardware part of this complex switches the measuring channels of the verified controller and the reference channels of the calibrator in accordance with the programmed algorithm. The developed software controls the switching of channels, sets values on the calibrator, reads the measured data from the controller, calculates errors, and compiles protocols. This system can be used for checking the controllers of the secondary equipment of oil metering units in automatic verification mode (with an open communication protocol) or in semi-automatic verification mode (without one). A distinctive feature of the approach is a universal signal switch operating under software control, which can be configured for various verification (calibration) methods, allowing it to cover the entire range of controllers of metering unit secondary equipment. Automatic verification with the hardware and software system shortens verification time by a factor of 5-10 and increases the reliability of measurements by excluding the influence of the human factor.
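The error-calculation and protocol step described above reduces to comparing controller readings against calibrator setpoints. A minimal sketch, assuming hypothetical setpoints and a made-up 0.25% tolerance (real limits come from the applicable verification method):

```python
def verification_protocol(setpoints, readings, tolerance_pct=0.25):
    """For each calibrator setpoint, compare the controller's reading,
    compute the relative error, and flag pass/fail. The tolerance here
    is illustrative, not a value from the article."""
    rows = []
    for ref, meas in zip(setpoints, readings):
        err_pct = 100.0 * (meas - ref) / ref
        rows.append({"ref": ref, "meas": meas,
                     "error_pct": round(err_pct, 3),
                     "pass": abs(err_pct) <= tolerance_pct})
    return rows

# Hypothetical 4-20 mA style check at three points:
protocol = verification_protocol([4.0, 12.0, 20.0], [4.004, 11.985, 20.06])
```

Each row becomes one line of the compiled verification protocol.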
Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth
2016-01-01
If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.
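For context, the functional property a correct TMR insertion must establish is bitwise majority voting across the three redundant domains. A minimal sketch of such a voter (illustrative only; not the verification method the paper proposes):

```python
def tmr_vote(a, b, c):
    """Bitwise majority vote over three redundant copies: each output
    bit is 1 iff at least two of the inputs have that bit set. This
    masks any single-domain fault."""
    return (a & b) | (a & c) | (b & c)

# A single corrupted copy is outvoted:
golden = 0b1011_0101
faulty = golden ^ 0b0000_1000   # one bit flipped in one domain
voted = tmr_vote(golden, golden, faulty)
```

Verifying TMR insertion then amounts to confirming that every protected signal in the netlist is triplicated and feeds such a voter, with no synthesis optimization having collapsed the redundancy.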
Molecular Verification of Cryptops hortensis (Scolopendromorpha: Cryptopidae) in the Nearctic Region
2018-01-29
Journal Article. Dates covered: March - April 2016. Title: Molecular Verification of Cryptops hortensis ... Performing organization: USAF School of Aerospace Medicine, Public Health and Preventive Medicine Dept/PHR, 2510 Fifth St., Bldg. 840, Wright-Patterson AFB, OH 45433-7913 ...
Innovative safety valve selection techniques and data.
Miller, Curt; Bredemyer, Lindsey
2007-04-11
The new valve data resources and modeling tools that are available today are instrumental in verifying that safety levels are being met in both current installations and project designs. If the new ISA 84 functional safety practices are followed closely, good industry-validated data are used, and a user's maintenance integrity program is strictly enforced, plants should feel confident that their design has been quantitatively reinforced. After 2 years of exhaustive reliability studies, techniques and data are now available to address this safety system component deficiency. Everyone who has gone through the process of safety integrity level (SIL) verification (i.e., the reliability math) will appreciate the progress made in this area. The benefits of these advancements are improved safety with lower lifecycle costs, such as lower capital investment and/or longer testing intervals. This discussion will start with a review of the different valve, actuator, and solenoid/positioner combinations that can be used and their associated application restraints. Failure rate reliability studies (i.e., FMEDA) and data associated with the final combinations will then be discussed. Finally, the impact of the selections on each safety system's SIL verification will be reviewed.
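The "reliability math" of SIL verification mentioned above centers on the average probability of failure on demand. A minimal sketch using the common IEC 61508 low-demand approximation for a single (1oo1) final element; the failure rate and test interval are illustrative numbers, not data from the studies discussed:

```python
def pfd_avg_1oo1(lambda_du_per_hr, test_interval_hr):
    """Average probability of failure on demand for a single (1oo1)
    final element, using the widely used low-demand approximation
    PFDavg ~= lambda_DU * TI / 2 from IEC 61508's simplified equations."""
    return lambda_du_per_hr * test_interval_hr / 2.0

# Illustrative only: a valve/actuator/solenoid assembly with a dangerous
# undetected failure rate of 2e-6 per hour, proof-tested yearly.
pfd = pfd_avg_1oo1(2e-6, 8760)
sil = 2 if 1e-3 <= pfd < 1e-2 else None   # SIL 2 band: [1e-3, 1e-2)
```

Longer test intervals raise PFDavg linearly, which is why better failure-rate data can justify either a higher achieved SIL or less frequent proof testing.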
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2010 CFR
2010-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2012 CFR
2012-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2011 CFR
2011-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2013 CFR
2013-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
48 CFR 4.1302 - Acquisition of approved products and services for personal identity verification.
Code of Federal Regulations, 2014 CFR
2014-10-01
... products and services for personal identity verification. 4.1302 Section 4.1302 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Personal Identity Verification 4.1302 Acquisition of approved products and services for personal identity verification. (a) In...
Verification testing of the Waterloo Biofilter Systems (WBS), Inc. Waterloo Biofilter® Model 4-Bedroom system was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at Otis Air National Guard Base in Bourne, Mas...
Verification testing of the SeptiTech Model 400 System was conducted over a twelve month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary sewerage from the base residential housing was u...
Current status of verification practices in clinical biochemistry in Spain.
Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè
2013-09-01
Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of each verification criterion varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta Check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
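An autoverification pass of the kind surveyed combines verification-limit checks with rules such as Delta Check against a prior result. A minimal sketch; the limits and the 20% delta threshold are invented for illustration, not values reported by the survey:

```python
def autoverify(result, limits, previous=None, delta_pct=20.0):
    """Release a result automatically only if it passes a
    verification-limit check and, when a prior value exists, a delta
    check against it. Thresholds here are hypothetical."""
    lo, hi = limits
    if not (lo <= result <= hi):
        return "hold: outside verification limits"
    if previous is not None and previous != 0:
        if abs(result - previous) / abs(previous) * 100.0 > delta_pct:
            return "hold: delta check failed"
    return "release"

# Potassium-like example with made-up limits in mmol/L:
decision = autoverify(5.1, (3.5, 5.5), previous=4.8)
```

Held results fall through to technical or medical review, which is the tiered structure (automatic, technical, medical) the survey distinguishes.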
NASA Astrophysics Data System (ADS)
Foresti, L.; Reyniers, M.; Seed, A.; Delobbe, L.
2016-01-01
The Short-Term Ensemble Prediction System (STEPS) is implemented in real-time at the Royal Meteorological Institute (RMI) of Belgium. The main idea behind STEPS is to quantify the forecast uncertainty by adding stochastic perturbations to the deterministic Lagrangian extrapolation of radar images. The stochastic perturbations are designed to account for the unpredictable precipitation growth and decay processes and to reproduce the dynamic scaling of precipitation fields, i.e., the observation that large-scale rainfall structures are more persistent and predictable than small-scale convective cells. This paper presents the development, adaptation and verification of the STEPS system for Belgium (STEPS-BE). STEPS-BE provides in real-time 20-member ensemble precipitation nowcasts at 1 km and 5 min resolutions up to 2 h lead time using a 4 C-band radar composite as input. In the context of the PLURISK project, STEPS forecasts were generated to be used as input in sewer system hydraulic models for nowcasting urban inundations in the cities of Ghent and Leuven. Comprehensive forecast verification was performed in order to detect systematic biases over the given urban areas and to analyze the reliability of probabilistic forecasts for a set of case studies in 2013 and 2014. The forecast biases over the cities of Leuven and Ghent were found to be small, which is encouraging for future integration of STEPS nowcasts into the hydraulic models. Probabilistic forecasts of exceeding 0.5 mm h-1 are reliable up to 60-90 min lead time, while the ones of exceeding 5.0 mm h-1 are only reliable up to 30 min. The STEPS ensembles are slightly under-dispersive and represent only 75-90 % of the forecast errors.
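The idea of adding stochastic perturbations to a deterministic Lagrangian extrapolation can be sketched in toy form. Real STEPS shapes the noise to the precipitation field's spatial power spectrum across scales; the white noise, stand-in advection, and grid size below are deliberate simplifications:

```python
import numpy as np

def ensemble_nowcast(field, advect, n_members=20, noise_std=0.15, seed=0):
    """Toy analogue of the STEPS idea: one deterministic Lagrangian
    extrapolation of the rain field, plus per-member stochastic
    perturbations representing unpredictable growth and decay."""
    rng = np.random.default_rng(seed)
    deterministic = advect(field)
    members = [np.clip(deterministic
                       + rng.normal(0.0, noise_std, field.shape), 0, None)
               for _ in range(n_members)]
    return np.stack(members)

rain = np.zeros((8, 8)); rain[2:5, 2:5] = 1.0     # toy rain cell, mm/h
def shift_east(f): return np.roll(f, 1, axis=1)   # stand-in advection
ens = ensemble_nowcast(rain, shift_east)
prob_exceed = (ens > 0.5).mean(axis=0)            # exceedance probability
```

The exceedance-probability field is the kind of probabilistic product whose reliability (e.g., for the 0.5 and 5.0 mm/h thresholds) the paper verifies.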
NASA Astrophysics Data System (ADS)
Foresti, L.; Reyniers, M.; Seed, A.; Delobbe, L.
2015-07-01
The Short-Term Ensemble Prediction System (STEPS) is implemented in real-time at the Royal Meteorological Institute (RMI) of Belgium. The main idea behind STEPS is to quantify the forecast uncertainty by adding stochastic perturbations to the deterministic Lagrangian extrapolation of radar images. The stochastic perturbations are designed to account for the unpredictable precipitation growth and decay processes and to reproduce the dynamic scaling of precipitation fields, i.e., the observation that large-scale rainfall structures are more persistent and predictable than small-scale convective cells. This paper presents the development, adaptation and verification of the STEPS system for Belgium (STEPS-BE). STEPS-BE provides in real-time 20-member ensemble precipitation nowcasts at 1 km and 5 min resolutions up to 2 h lead time using a 4 C-band radar composite as input. In the context of the PLURISK project, STEPS forecasts were generated to be used as input in sewer system hydraulic models for nowcasting urban inundations in the cities of Ghent and Leuven. Comprehensive forecast verification was performed in order to detect systematic biases over the given urban areas and to analyze the reliability of probabilistic forecasts for a set of case studies in 2013 and 2014. The forecast biases over the cities of Leuven and Ghent were found to be small, which is encouraging for future integration of STEPS nowcasts into the hydraulic models. Probabilistic forecasts of exceeding 0.5 mm h-1 are reliable up to 60-90 min lead time, while the ones of exceeding 5.0 mm h-1 are only reliable up to 30 min. The STEPS ensembles are slightly under-dispersive and represent only 80-90 % of the forecast errors.
Use of metaknowledge in the verification of knowledge-based systems
NASA Technical Reports Server (NTRS)
Morell, Larry J.
1989-01-01
Knowledge-based systems are modeled as deductive systems. The model indicates that the two primary areas of concern in verification are demonstrating consistency and completeness. A system is inconsistent if it asserts something that is not true of the modeled domain. A system is incomplete if it lacks deductive capability. Two forms of consistency are discussed along with appropriate verification methods. Three forms of incompleteness are discussed. The use of metaknowledge, knowledge about knowledge, is explored in connection with each form of incompleteness.
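Under the deductive-system model, an inconsistency is witnessed by deriving both a literal and its negation from the same facts. A minimal forward-chaining sketch (the rules and negation encoding are illustrative, not taken from the paper):

```python
def find_inconsistencies(rules, facts):
    """Forward-chain over simple (premises -> conclusion) rules and
    report any derived literal whose negation ('not <literal>') is also
    derived; such a pair witnesses inconsistency."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return {p for p in derived if ("not " + p) in derived}

rules = [(["bird"], "flies"),
         (["penguin"], "bird"),
         (["penguin"], "not flies")]
conflicts = find_inconsistencies(rules, facts=["penguin"])
```

Completeness checking is the dual question: whether some expected conclusion cannot be derived at all, which is where the paper brings in metaknowledge.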
NASA Astrophysics Data System (ADS)
Hirano, Ryoichi; Iida, Susumu; Amano, Tsuyoshi; Watanabe, Hidehiro; Hatakeyama, Masahiro; Murakami, Takeshi; Suematsu, Kenichi; Terao, Kenji
2016-03-01
Novel projection electron microscope optics have been developed and integrated into a new inspection system named EBEYE-V30 ("Model EBEYE" is an EBARA model code), and the resulting system shows promise for application to half-pitch (hp) 16-nm node extreme ultraviolet lithography (EUVL) patterned mask inspection. To improve the system's inspection throughput for 11-nm hp generation defect detection, a new electron-sensitive area image sensor with a high-speed data processing unit, a bright and stable electron source, and an image capture area deflector that operates simultaneously with the mask scanning motion have been developed. A learning system has been used with the mask inspection tool to meet the requirements of hp 11-nm node EUV patterned mask inspection. Defects are identified by the projection electron microscope system using the "defectivity" derived from the characteristics of the acquired image. The learning system has been developed to reduce the labor and costs associated with adjusting the detection capability to cope with newly defined mask defects. We describe the integration of the developed elements into the inspection tool and the verification of the designed specification. We have also verified the effectiveness of the learning system, which shows enhanced detection capability for the hp 11-nm node.
Pathfinding the Flight Advanced Stirling Convertor Design with the ASC-E3
NASA Technical Reports Server (NTRS)
Wong, Wayne A.; Wilson, Kyle; Smith, Eddie; Collins, Josh
2012-01-01
The Advanced Stirling Convertor (ASC) was initially developed by Sunpower, Inc. under contract to NASA Glenn Research Center (GRC) as a technology development project. The ASC technology fulfills NASA's need for high-efficiency power convertors for future Radioisotope Power Systems (RPS). Early successful technology demonstrations between 2003 and 2005 eventually led to the expansion of the project, including the decision in 2006 to use the ASC technology on the Advanced Stirling Radioisotope Generator (ASRG). To date, Sunpower has delivered 22 ASC convertors of progressively mature designs to GRC. Currently, Sunpower, with support from GRC, Lockheed Martin Space Systems Company (LMSSC), and the Department of Energy (DOE), is developing the flight ASC-F in parallel with the ASC-E3 pathfinders. Sunpower will deliver four pairs of ASC-E3 convertors to GRC, which will be used for extended-operation reliability assessment, independent validation and verification testing, system interaction tests, and to support LMSSC controller verification. The ASC-E3 and -F convertors are being built to the same design and processing documentation and the same product specification. The initial two pairs of ASC-E3 are built before the flight units and will validate design and processing changes prior to implementation on the ASC-F flight convertors. This paper provides a summary of the development of the ASC technology and the status of the ASC-E3 build, and describes how these convertors serve the vital pathfinder role ahead of the flight build for ASRG. The ASRG is part of two of the three candidate missions being considered for selection for the Discovery 12 mission.
Code of Federal Regulations, 2013 CFR
2013-07-01
... OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE RENEWABLE ENERGY AND ALTERNATE USES... 30 Mineral Resources 2 2013-07-01 2013-07-01 false What are the CVA's or project engineer's... Certified Verification Agent § 585.708 What are the CVA's or project engineer's primary duties for...
Code of Federal Regulations, 2013 CFR
2013-07-01
..., what must the CVA or project engineer verify? 585.709 Section 585.709 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE RENEWABLE ENERGY AND ALTERNATE USES OF EXISTING... Verification Agent § 585.709 When conducting onsite fabrication inspections, what must the CVA or project...
Code of Federal Regulations, 2014 CFR
2014-07-01
..., what must the CVA or project engineer do? 585.710 Section 585.710 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE RENEWABLE ENERGY AND ALTERNATE USES OF EXISTING... Verification Agent § 585.710 When conducting onsite installation inspections, what must the CVA or project...
Code of Federal Regulations, 2012 CFR
2012-07-01
... OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE RENEWABLE ENERGY AND ALTERNATE USES... 30 Mineral Resources 2 2012-07-01 2012-07-01 false What are the CVA's or project engineer's... Certified Verification Agent § 585.708 What are the CVA's or project engineer's primary duties for...
Code of Federal Regulations, 2014 CFR
2014-07-01
..., what must the CVA or project engineer verify? 585.709 Section 585.709 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE RENEWABLE ENERGY AND ALTERNATE USES OF EXISTING... Verification Agent § 585.709 When conducting onsite fabrication inspections, what must the CVA or project...
Code of Federal Regulations, 2013 CFR
2013-07-01
..., what must the CVA or project engineer do? 585.710 Section 585.710 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE RENEWABLE ENERGY AND ALTERNATE USES OF EXISTING... Verification Agent § 585.710 When conducting onsite installation inspections, what must the CVA or project...
Code of Federal Regulations, 2012 CFR
2012-07-01
..., what must the CVA or project engineer do? 585.710 Section 585.710 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE RENEWABLE ENERGY AND ALTERNATE USES OF EXISTING... Verification Agent § 585.710 When conducting onsite installation inspections, what must the CVA or project...
Code of Federal Regulations, 2012 CFR
2012-07-01
..., what must the CVA or project engineer verify? 585.709 Section 585.709 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE RENEWABLE ENERGY AND ALTERNATE USES OF EXISTING... Verification Agent § 585.709 When conducting onsite fabrication inspections, what must the CVA or project...
Code of Federal Regulations, 2014 CFR
2014-07-01
... OF OCEAN ENERGY MANAGEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE RENEWABLE ENERGY AND ALTERNATE USES... 30 Mineral Resources 2 2014-07-01 2014-07-01 false What are the CVA's or project engineer's... Certified Verification Agent § 585.708 What are the CVA's or project engineer's primary duties for...
2010-09-01
... what level of detail is needed to build their teams, and they can add more detailed items from the model in order to tap deeper into the performance of ... (verification of the model and the instrument) This Technical Report documents the findings of a project on 'Command Team Effectiveness' by Task Group 127 for the RTO Human Factors and Medicine Panel (RTG HFM-127).
The Air Pollution Control Technology Verification Center (APCT Center) is operated by RTI International (RTI), in cooperation with EPA's National Risk Management Research Laboratory. The APCT Center conducts verifications of technologies that clean air in ventilation systems, inc...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
45 CFR 95.626 - Independent Verification and Validation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...
24 CFR 5.512 - Verification of eligible immigration status.
Code of Federal Regulations, 2010 CFR
2010-04-01
... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...
High-speed autoverifying technology for printed wiring boards
NASA Astrophysics Data System (ADS)
Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi
1996-10-01
We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms, so verification is needed to distinguish between minor irregularities and serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to detect pattern features on surface-mount technology boards. In our system, we employ a new illumination method that uses multiple colors and multiple illumination directions. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination. This helps to search for patterns around a defect and to examine defect definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory. It verified 1,500 defective samples and detected all significant defects with an error (false alarm) rate of only 0.1 percent.
Verification of Autonomous Systems for Space Applications
NASA Technical Reports Server (NTRS)
Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.
2006-01-01
Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.
Verification testing of the F.R. Mahoney Amphidrome System was conducted over a twelve month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary sewerage from the base residential housing w...
NASA Data for Water Resources Applications
NASA Technical Reports Server (NTRS)
Toll, David; Houser, Paul; Arsenault, Kristi; Entin, Jared
2004-01-01
Water Management Applications is one of twelve elements in the Earth Science Enterprise National Applications Program. NASA Goddard Space Flight Center is supporting the Applications Program by partnering with other organizations to use NASA project results, such as those from satellite instruments and Earth system models, to address the organizations' critical needs. The focus thus far has been: 1) estimating water storage, including snowpack and soil moisture; 2) modeling and predicting water fluxes such as evapotranspiration (ET), precipitation, and river runoff; and 3) remote sensing of water quality, including both point sources (e.g., turbidity and productivity) and non-point sources (e.g., land cover conversion such as forest to agriculture yielding higher nutrient runoff). The objectives of the partnering cover three steps: 1) Evaluation, 2) Verification and Validation, and 3) Benchmark Report. We are working with U.S. federal agencies including the Environmental Protection Agency (EPA), the Bureau of Reclamation (USBR), and the Department of Agriculture (USDA), and are using several of their Decision Support System (DSS) tools, including BASINS used by EPA, Riverware and the AWARDS ET ToolBox by USBR, and SWAT by USDA and EPA. Regional application sites using NASA data across the U.S. are currently being evaluated for the DSS tools. The NASA data emphasized thus far are from the Land Data Assimilation Systems (LDAS) and MODIS satellite products. We are currently in the first two steps of evaluation and verification and validation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klüter, Sebastian, E-mail: sebastian.klueter@med.uni-heidelberg.de; Schubert, Kai; Lissner, Steffen
Purpose: The dosimetric verification of treatment plans in helical tomotherapy is usually carried out via verification measurements. In this study, a method for independent dose calculation of tomotherapy treatment plans is presented that uses a conventional treatment planning system with a pencil-kernel dose calculation algorithm to generate verification dose distributions based on patient CT data. Methods: A pencil beam algorithm that directly uses measured beam data was configured for dose calculation for a tomotherapy machine. Tomotherapy treatment plans were converted into a format readable by an in-house treatment planning system by assigning each projection to one static treatment field and shifting the calculation isocenter for each field in order to account for the couch movement. The modulation of the fluence for each projection is read out of the delivery sinogram, and with the kernel-based dose calculation, this information can directly be used for dose calculation without the need for decomposition of the sinogram. The sinogram values are only corrected for leaf output and leaf latency. Using the converted treatment plans, dose was recalculated with the independent treatment planning system. Multiple treatment plans ranging from simple static fields to real patient treatment plans were calculated using the new approach and either compared to actual measurements or to the 3D dose distribution calculated by the tomotherapy treatment planning system. In addition, dose–volume histograms were calculated for the patient plans. Results: Except for minor deviations at the maximum field size, the pencil beam dose calculation for static beams agreed with measurements in a water tank within 2%/2 mm. A mean deviation from point dose measurements in the cheese phantom of 0.89% ± 0.81% was found for unmodulated helical plans.
A mean voxel-based deviation of −0.67% ± 1.11% for all voxels in the respective high dose region (dose values >80%), and a mean local voxel-based deviation of −2.41% ± 0.75% for all voxels with dose values >20%, were found for 11 modulated plans in the cheese phantom. Averaged over nine patient plans, the deviations amounted to −0.14% ± 1.97% (voxels >80%) and −0.95% ± 2.27% (>20%, local deviations). For a lung case, mean voxel-based deviations of more than 4% were found, while for all other patient plans, all mean voxel-based deviations were within ±2.4%. Conclusions: The presented method is suitable for independent dose calculation for helical tomotherapy within the known limitations of the pencil beam algorithm. It can serve as verification of the primary dose calculation and thereby reduce the need for time-consuming measurements. By using the patient anatomy and generating full 3D dose data, and combined with measurements of additional machine parameters, it can substantially contribute to overall patient safety.
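The voxel-based comparison metrics quoted above can be illustrated with a short sketch. This is a hedged reconstruction, not the authors' code: the 80%/20% cuts follow the abstract, but normalizing global deviations to the maximum dose and local deviations voxel-by-voxel is an assumption about their definitions.

```python
def voxel_deviations(d_indep, d_tps, hi=0.8, lo=0.2):
    """Mean voxel-based deviations between an independent recalculation
    and the primary (TPS) dose. Global deviations are normalized to the
    maximum dose and restricted to the high-dose region (> hi of max);
    local deviations are normalized voxel-by-voxel and restricted to
    voxels above lo of max."""
    d_max = max(d_tps)
    glob = [100.0 * (a - b) / d_max
            for a, b in zip(d_indep, d_tps) if b > hi * d_max]
    loc = [100.0 * (a - b) / b
           for a, b in zip(d_indep, d_tps) if b > lo * d_max]
    mean = lambda xs: sum(xs) / len(xs)
    return mean(glob), mean(loc)

# Toy check: an independent calculation that reads 1% low everywhere
# shows a local deviation of exactly -1%.
tps = [0.1 * i for i in range(1, 21)]      # TPS dose values, Gy
indep = [0.99 * d for d in tps]            # independent recalculation
glob_mean, loc_mean = voxel_deviations(indep, tps)
```

On this toy input the local deviation is −1% by construction, while the global deviation is smaller in magnitude because it is normalized to the maximum dose rather than each voxel's own dose.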
Positron emission imaging device and method of using the same
Bingham, Philip R.; Mullens, James Allen
2013-01-15
An imaging system and method of imaging are disclosed. The imaging system can include an external radiation source producing pairs of substantially simultaneous radiation emissions, a picturization emission and a verification emission, at an emission angle. The imaging system can also include a plurality of picturization sensors and at least one verification sensor for detecting the picturization and verification emissions, respectively. The imaging system also includes an object stage arranged such that a picturization emission can pass through an object supported on the object stage before being detected by one of the plurality of picturization sensors. A coincidence system and a reconstruction system can also be included. The coincidence system can receive information from the picturization and verification sensors and determine whether a detected picturization emission is direct radiation or scattered radiation. The reconstruction system can produce a multi-dimensional representation of an object imaged with the imaging system.
NASA Technical Reports Server (NTRS)
Dabney, James B.; Arthur, James Douglas
2017-01-01
Agile methods have gained wide acceptance over the past several years, to the point that they are now a standard management and execution approach for small-scale software development projects. While conventional Agile methods are not generally applicable to large multi-year and mission-critical systems, Agile hybrids are now being developed (such as SAFe) to exploit the productivity improvements of Agile while retaining the process rigor and coordination these projects need. From the perspective of Independent Verification and Validation (IVV), however, the adoption of these hybrid Agile frameworks is becoming somewhat problematic. Hence, we find it prudent to question the compatibility of conventional IVV techniques with (hybrid) Agile practices. This paper documents our investigation of (a) relevant literature, (b) the modification and adoption of Agile frameworks to accommodate the development of large-scale, mission-critical systems, and (c) the compatibility of standard IVV techniques within hybrid Agile development frameworks. Specific to the latter, we found that the IVV methods employed within a hybrid Agile process can be divided into three groups: (1) early-lifecycle IVV techniques that are fully compatible with the hybrid lifecycles, (2) IVV techniques that focus on tracing requirements, test objectives, etc., which are somewhat incompatible but can be tailored with modest effort, and (3) IVV techniques involving an assessment requiring artifact completeness that are simply not compatible with hybrid Agile processes, e.g., those that assume a complete requirement specification early in the development lifecycle.
Verification testing of the Aquionics, Inc. bersonInLine® 4250 UV System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills Wastewater Treatment Plant test site in Parsippany, New Jersey. Two full-scale reactors were mounted in series. T...
Verification testing of the Ondeo Degremont, Inc. Aquaray® 40 HO VLS Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Three reactor modules were m...
NASA Technical Reports Server (NTRS)
Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.
2012-01-01
A pulsed 2-micron coherent Doppler lidar system from NASA Langley Research Center in Virginia flew on NASA's DC-8 aircraft during the NASA Genesis and Rapid Intensification Processes (GRIP) campaign in the summer of 2010. The participation was part of the Doppler Aerosol Wind Lidar (DAWN) Air project. Selected results of airborne wind profiling are presented and compared with dropsonde data for verification purposes. Panoramic presentations of different wind parameters over a nominal observation time span are also presented for selected GRIP data sets. The real-time data acquisition and analysis software employed during the GRIP campaign is introduced along with its unique features.
A measurement system for large, complex software programs
NASA Technical Reports Server (NTRS)
Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.
1994-01-01
This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
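The cost/quality relationship described above can be sketched as a toy model. The coefficients and the Rayleigh-shaped discovery curve below are illustrative assumptions only; the abstract does not give the paper's calibrated models.

```python
import math

# Assumed latent-error densities per KSLOC by criticality level
# (higher criticality implies a more rigorous process, hence fewer
# latent errors). Invented numbers, for illustration only.
ERRORS_PER_KSLOC = {"low": 6.0, "medium": 4.0, "high": 2.0}

def expected_errors(ksloc, criticality):
    """Latent-error estimate: size times a criticality-dependent density."""
    return ksloc * ERRORS_PER_KSLOC[criticality]

def discovered_by(total_errors, month, peak_month):
    """Cumulative Rayleigh discovery curve, a common shape for error
    discovery rate over a project lifecycle."""
    return total_errors * (1.0 - math.exp(-((month / peak_month) ** 2) / 2.0))

latent = expected_errors(120.0, "high")          # 120 KSLOC, high criticality
found_by_12 = discovered_by(latent, 12.0, 12.0)  # errors found by month 12
```

A model of this shape lets managers compare the expected discovery rate against actual error reports and flag schedule or quality risk early.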
Study for verification testing of the helmet-mounted display in the Japanese Experimental Module.
Nakajima, I; Yamamoto, I; Kato, H; Inokuchi, S; Nemoto, M
2000-02-01
Our purpose is to propose a research and development project in the field of telemedicine. The proposed Multimedia Telemedicine Experiment for Extra-Vehicular Activity will entail experiments designed to support astronaut health management during Extra-Vehicular Activity (EVA). Experiments will have relevant applications to the Japanese Experimental Module (JEM) operated by National Space Development Agency of Japan (NASDA) for the International Space Station (ISS). In essence, this is a proposal for verification testing of the Helmet-Mounted Display (HMD), which enables astronauts to verify their own blood pressures and electrocardiograms, and to view a display of instructions from the ground station and listings of work procedures. Specifically, HMD is a device designed to project images and data inside the astronaut's helmet. We consider this R&D proposal to be one of the most suitable projects under consideration in response to NASDA's open invitation calling for medical experiments to be conducted on JEM.
Experiencing the "Community" in Community College Teaching through Mural Making.
ERIC Educational Resources Information Center
Elmes, Ellen
2002-01-01
Uses the five stages of creativity described by psychologist Jacob Getzels--first insight, saturation, incubation, illumination, and verification--to describe a mural project in a rural community college. Reports that the project successfully paired students with community members to paint two local murals. (NB)
Ada(R) Test and Verification System (ATVS)
NASA Technical Reports Server (NTRS)
Strelich, Tom
1986-01-01
The Ada Test and Verification System (ATVS) functional description and high level design are completed and summarized. The ATVS will provide a comprehensive set of test and verification capabilities specifically addressing the features of the Ada language, support for embedded system development, distributed environments, and advanced user interface capabilities. Its design emphasis was on effective software development environment integration and flexibility to ensure its long-term use in the Ada software development community.
Experimental evaluation of fingerprint verification system based on double random phase encoding
NASA Astrophysics Data System (ADS)
Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi
2006-03-01
We proposed a smart card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of accurate verification of an authorized individual decreases when the fingerprint is shifted significantly. In this paper, a review of the proposed system is presented and preprocessing for improving the false rejection rate is proposed. In the proposed method, the position difference between two fingerprint images is estimated by using an optimized template for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment; its results show that the false rejection rate is improved.
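The double random phase encoding scheme underlying the system can be sketched in one dimension. This is a generic textbook illustration of the technique, not the authors' implementation; the naive DFT, mask generation, and signal are invented for the example.

```python
import cmath
import random

def dft(x, inverse=False):
    # Naive O(N^2) discrete Fourier transform; fine for a small sketch.
    n_pts = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[n] * cmath.exp(sign * 2j * cmath.pi * k * n / n_pts)
               for n in range(n_pts)) for k in range(n_pts)]
    return [v / n_pts for v in out] if inverse else out

def drpe_encrypt(f, mask_input, mask_fourier):
    # Multiply by a random phase mask in the input plane, transform,
    # multiply by a second random phase mask in the Fourier plane,
    # then inverse-transform to obtain the encrypted field.
    g = [a * m for a, m in zip(f, mask_input)]
    spectrum = [a * m for a, m in zip(dft(g), mask_fourier)]
    return dft(spectrum, inverse=True)

def drpe_decrypt(c, mask_input, mask_fourier):
    # Reverse the steps with the conjugate phase masks (the "keys").
    spectrum = [a * m.conjugate() for a, m in zip(dft(c), mask_fourier)]
    g = dft(spectrum, inverse=True)
    return [a * m.conjugate() for a, m in zip(g, mask_input)]

random.seed(0)
N = 16
rand_phase = lambda: cmath.exp(2j * cmath.pi * random.random())
mask1 = [rand_phase() for _ in range(N)]
mask2 = [rand_phase() for _ in range(N)]
signal = [float(i % 4) for i in range(N)]   # stand-in for one scan line
cipher = drpe_encrypt(signal, mask1, mask2)
recovered = drpe_decrypt(cipher, mask1, mask2)
```

Only a holder of both phase masks can invert the encoding, which is what lets the scheme bind fingerprint data to a PIN-protected key.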
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dodds, K.; Daley, T.; Freifeld, B.
2009-05-01
The Australian Cooperative Research Centre for Greenhouse Gas Technologies (CO2CRC) is currently injecting 100,000 tons of CO2 in a large-scale test of storage technology in a pilot project in southeastern Australia called the CO2CRC Otway Project. The Otway Basin, with its natural CO2 accumulations and many depleted gas fields, offers an appropriate site for such a pilot project. An 80% CO2 stream is produced from a well (Buttress) near the depleted gas reservoir (Naylor) used for storage (Figure 1). The goal of this project is to demonstrate that CO2 can be safely transported, stored underground, and its behavior tracked and monitored. The monitoring and verification framework has been developed to monitor for the presence and behavior of CO2 in the subsurface reservoir, near surface, and atmosphere. This monitoring framework addresses areas, identified by a rigorous risk assessment, to verify conformance to clearly identifiable performance criteria. These criteria have been agreed with the regulatory authorities to manage the project through all phases, addressing responsibilities and liabilities, and to assure the public of safe storage.
Static and Dynamic Verification of Critical Software for Space Applications
NASA Astrophysics Data System (ADS)
Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.
Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management and vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade, at least when compared with "mainstream" software development, the importance of ensuring its correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. But, traditionally, these have been applied in isolation. One of the main reasons is the immaturity of this field as concerns its application to the growing software content within space systems. This paper presents an innovative way of combining static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques.
The case study herein described is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA, and the Xception tool for fault injection. Keywords: Verification & Validation, RAMS, Onboard software, SFMEA, SFTA, Fault-injection. This work is being performed under the project STADY (Applied Static And Dynamic Verification Of Critical Software), ESA/ESTEC Contract Nr. 15751/02/NL/LvH.
Engineering design aspects of the heat-pipe power system
NASA Technical Reports Server (NTRS)
Capell, B. M.; Houts, M. G.; Poston, D. I.; Berte, M.
1997-01-01
The Heat-pipe Power System (HPS) is a near-term, low-cost space power system designed at Los Alamos that can provide up to 1,000 kWt for many space nuclear applications. The design of the reactor is simple, modular, and adaptable. The basic design allows for the use of a variety of power conversion systems and reactor materials (including the fuel, clad, and heat pipes). This paper describes a project that was undertaken to develop a database supporting many engineering aspects of the HPS design. The specific tasks discussed in this paper are: the development of an HPS materials database, the creation of finite element models that will allow a wide variety of investigations, and the verification of past calculations.
Cost-Effective CNC Part Program Verification Development for Laboratory Instruction.
ERIC Educational Resources Information Center
Chen, Joseph C.; Chang, Ted C.
2000-01-01
Describes a computer numerical control program verification system that checks a part program before its execution. The system includes character recognition, word recognition, a fuzzy-nets system, and a tool path viewer. (SK)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samuel, D; Testa, M; Park, Y
Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which, for example, could be the use of anterior fields for prostate treatment instead of the opposed lateral fields used in current practice. We have developed and commissioned an integrated system, with hardware, software, and workflow protocols, to provide a complete solution simultaneously for both in-vivo dosimetry and range verification in proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, but separable into three groups for flexibility in application. A special amplifier was developed to capture extremely small signals from very low proton beam current. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system for in-vivo dosimetry and for range verification was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLDs and EBT films and were within 3% of treatment planning calculations. For range verification, a precision of 0.5 mm was achieved in homogeneous phantoms, and a precision of 2 mm in an anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients.
ETV REPORT AND VERIFICATION STATEMENT - KASELCO POSI-FLO ELECTROCOAGULATION TREATMENT SYSTEM
The Kaselco Electrocoagulation Treatment System (Kaselco system) in combination with an ion exchange polishing system was tested, under actual production conditions, processing metal finishing wastewater at Gull Industries in Houston, Texas. The verification test evaluated the a...
The Environmental Technology Verification report discusses the technology and performance of the Xonon Cool Combustion System manufactured by Catalytica Energy Systems, Inc., formerly Catalytica Combustion Systems, Inc., to control NOx emissions from gas turbines that operate wit...
NASA Astrophysics Data System (ADS)
Claver, Chuck F.; Dubois-Felsmann, G. P.; Delgado, F.; Hascall, P.; Horn, D.; Marshall, S.; Nordby, M.; Schalk, T. L.; Schumacher, G.; Sebag, J.; LSST Project Team
2010-01-01
The LSST is a complete observing system that acquires and archives images, processes and analyzes them, and publishes reduced images and catalogs of sources and objects. The LSST will operate over a ten-year period producing a survey of 20,000 square degrees over the entire southern sky in 6 filters (ugrizy), with each field visited several hundred times, enabling a wide spectrum of science from fast transients to exploration of dark matter and dark energy. The LSST itself is a complex system of systems consisting of the 8.4m three-mirror telescope, a 3.2 billion pixel camera, and a peta-scale data management system. The LSST project uses a Model Based Systems Engineering (MBSE) methodology to ensure an integrated approach to system design and rigorous definition of system interfaces and specifications. The MBSE methodology is applied through modeling of the LSST's systems with the Systems Modeling Language (SysML). The SysML modeling recursively establishes the threefold relationship between requirements, logical & physical functional decomposition and definition, and system and component behavior at successively deeper levels of abstraction and detail. The MBSE approach is applied throughout all stages of the project, from design, to validation and verification, through to commissioning.
NASA Technical Reports Server (NTRS)
Kapoor, Manju M.; Mehta, Manju
2010-01-01
The goal of this paper is to emphasize the importance of developing complete and unambiguous requirements early in the project cycle (prior to the Preliminary Design Phase). Having a complete set of requirements early in the project cycle allows sufficient time to generate a traceability matrix. Requirements traceability and analysis are the key elements in improving the verification and validation process, and thus overall software quality. Traceability can be most beneficial when the system changes: if changes are made to high-level requirements, the low-level requirements that trace to them need to be modified as well. Traceability ensures that requirements are appropriately and efficiently verified at various levels, whereas analysis ensures that a correctly interpreted set of requirements is produced.
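A traceability matrix of the kind advocated above reduces to a simple coverage check. The requirement IDs and data layout below are hypothetical, purely to illustrate how traceability flags untraced or unverified high-level requirements when the system changes.

```python
def coverage_gaps(trace, verified):
    # A high-level requirement is a gap if it traces to no low-level
    # requirements, or if any of its low-level requirements is unverified.
    gaps = []
    for hlr, llrs in trace.items():
        if not llrs or not all(verified.get(llr, False) for llr in llrs):
            gaps.append(hlr)
    return sorted(gaps)

# Hypothetical traceability matrix: high-level -> low-level requirements.
trace = {
    "HLR-1": ["LLR-1.1", "LLR-1.2"],
    "HLR-2": ["LLR-2.1"],
    "HLR-3": [],                      # no low-level coverage yet
}
# Verification status of each low-level requirement.
verified = {"LLR-1.1": True, "LLR-1.2": True, "LLR-2.1": False}
gaps = coverage_gaps(trace, verified)
```

Re-running the check after any requirements change immediately shows which high-level requirements need their verification revisited.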
Practices in Adequate Structural Design
NASA Technical Reports Server (NTRS)
Ryan, Robert S.
1989-01-01
Structural design and verification of space vehicles and space systems is a very tricky and awe-inspiring business, particularly for manned missions. Failure of a mission with loss of life is devastating personally and nationally. The scope of the problem is driven by high performance requirements which push state-of-the-art technologies, creating high sensitivities to small variations and uncertainties. Ensuring safe, reliable flight dictates the use of sound principles, procedures, analysis, and testing. Many of the principles that were refocused by the Space Shuttle Challenger (51-L) accident on January 28, 1986, and the activities conducted to ensure safe shuttle reflights, are discussed. The emphasis is on engineering, while recognizing that program and project management are also key to success.
Practices in adequate structural design
NASA Astrophysics Data System (ADS)
Ryan, Robert S.
1989-01-01
Structural design and verification of space vehicles and space systems is a very tricky and awe-inspiring business, particularly for manned missions. Failure of a mission with loss of life is devastating personally and nationally. The scope of the problem is driven by high performance requirements which push state-of-the-art technologies, creating high sensitivities to small variations and uncertainties. Ensuring safe, reliable flight dictates the use of sound principles, procedures, analysis, and testing. Many of the principles that were refocused by the Space Shuttle Challenger (51-L) accident on January 28, 1986, and the activities conducted to ensure safe shuttle reflights, are discussed. The emphasis is on engineering, while recognizing that program and project management are also key to success.
Verification testing of the Triton Systems, LLC Solid Bowl Centrifuge Model TS-5000 (TS-5000) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The TS-5000 was 48" in diameter and 30" deep, with a bowl capacity of 16 ft3. ...
NASA Technical Reports Server (NTRS)
Rango, A.
1981-01-01
Both LANDSAT and NOAA satellite data were used in improving snowmelt runoff forecasts. When the satellite snow cover data were tested in both empirical seasonal runoff estimation and short term modeling approaches, a definite potential for reducing forecast error was evident. A cost benefit analysis run in conjunction with the snow mapping indicated a $36.5 million annual benefit accruing from a one percent improvement in forecast accuracy using the snow cover data for the western United States. The annual cost of employing the system would be $505,000. The snow mapping has proven that satellite snow cover data can be used to reduce snowmelt runoff forecast error in a cost effective manner once all operational satellite data are available within 72 hours after acquisition. Executive summaries of the individual snow mapping projects are presented.
NASA Astrophysics Data System (ADS)
Talamonti, C.; Bucciolini, M.; Marrazzo, L.; Menichelli, D.; Bruzzi, M.; Cirrone, G. A. P.; Cuttone, G.; LoJacono, P.
2008-10-01
Due to the features of modern radiotherapy techniques, namely intensity modulated radiation therapy and proton therapy, where high spatial dose gradients are often present, detectors to be employed for 2D dose verification have to satisfy very narrow requirements. In particular, they have to show high spatial resolution. In the framework of the European Integrated Project "Methods and Advanced Equipment for Simulation and Treatment in Radio-Oncology" (MAESTRO, no. LSHC-CT-2004-503564), a dosimetric detector adequate for 2D pre-treatment dose verification was developed. It is a modular detector, based on a monolithic silicon-segmented sensor with an n-type implantation on an epitaxial p-type layer. Each pixel element is 2 × 2 mm², and the center-to-center distance is 3 mm. The sensor is composed of 21 × 21 pixels. In this paper, we report the dosimetric characterization of the system with a proton beam. The sensor was irradiated with the 62 MeV protons used for clinical treatments at INFN-Laboratori Nazionali del Sud (LNS), Catania. The studied parameters were repeatability for a given pixel, response linearity versus absorbed dose and dose rate, and dependence on field size. The obtained results are promising since the performances are within the project specifications.
Independent Verification and Validation (IV and V) Criteria
NASA Technical Reports Server (NTRS)
McGill, Kenneth
2000-01-01
The purpose of this appendix is to establish quantifiable criteria for determining whether IV&V should be applied to a given software development. Since IV&V should begin in the Formulation Subprocess of a project, the process here described is based on metrics which are available before project approval.
Dental Laboratory Technology. Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Sappe', Hoyt; Smith, Debra S.
This report provides results of Phase I of a project that researched the occupational area of dental laboratory technology, established appropriate committees, and conducted task verification. These results are intended to guide development of a program designed to train dental laboratory technicians. Section 1 contains general information:…
Environmental Horticulture. Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Bachler, Mike; Sappe', Hoyt
This report provides results of Phase I of a project that researched the occupational area of environmental horticulture, established appropriate committees, and conducted task verification. These results are intended to guide development of a program designed to address the needs of the horticulture field. Section 1 contains general information:…
Using Replication Projects in Teaching Research Methods
ERIC Educational Resources Information Center
Standing, Lionel G.; Grenier, Manuel; Lane, Erica A.; Roberts, Meigan S.; Sykes, Sarah J.
2014-01-01
It is suggested that replication projects may be valuable in teaching research methods, and also address the current need in psychology for more independent verification of published studies. Their use in an undergraduate methods course is described, involving student teams who performed direct replications of four well-known experiments, yielding…
Commercial Art. Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Brown, Ted; Sappe', Hoyt
This report provides results of Phase I of a project that researched the occupational area of commercial art, established appropriate committees, and conducted task verification. These results are intended to guide development of a program designed to train commercial artists. Section 1 contains general information: purpose of Phase I; description…
Instrumentation Technology. Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Sappe', Hoyt; Squires, Sheila S.
This report provides results of Phase I of a project that researched the occupational area of instrumentation technology, established appropriate committees, and conducted task verification. These results are intended to guide development of a program designed to train instrumentation technicians. Section 1 contains general information: purpose of…
Code of Federal Regulations, 2011 CFR
2011-04-01
... Reports for all projects using funds to be closed out have been submitted. HUD will use data contained in the project completion reports in the preparation of the Closeout Reports; (3) The grantee has been... grantee by the Area ONAP must include verification of data reflected in the Closeout Report and...
Time trend of injection drug errors before and after implementation of bar-code verification system.
Sakushima, Ken; Umeki, Reona; Endoh, Akira; Ito, Yoichi M; Nasuhara, Yasuyuki
2015-01-01
Bar-code technology, used for verification of patients and their medication, could prevent medication errors in clinical practice. A retrospective analysis of electronically stored medical error reports was conducted in a university hospital. The number of reported medication errors involving injected drugs, including wrong drug administration and administration to the wrong patient, was compared before and after implementation of the bar-code verification system for inpatient care. A total of 2867 error reports associated with injection drugs were extracted. Wrong patient errors decreased significantly after implementation of the bar-code verification system (17.4/year vs. 4.5/year, p < 0.05), although wrong drug errors did not decrease sufficiently (24.2/year vs. 20.3/year). The source of medication errors due to wrong drugs was drug preparation in hospital wards. Bar-code medication administration is effective for prevention of wrong patient errors. However, ordinary bar-code verification systems are limited in their ability to prevent incorrect drug preparation in hospital wards.
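The bedside check a bar-code verification system performs can be sketched minimally. The order keys and codes below are invented for illustration; a real system verifies more attributes (dose, route, time) than this toy lookup, which is why it catches wrong-patient errors but not preparation errors that happen before scanning.

```python
# Active medication orders keyed by (patient bar code, drug bar code).
# Hypothetical identifiers, for illustration only.
orders = {
    ("PT-042", "NDC-7701"): {"drug": "heparin", "dose_mg": 50, "route": "IV"},
}

def verify_scan(orders, patient_code, drug_code):
    # Both scans must match an active order; otherwise the
    # administration is blocked and an alert is raised (here: None).
    return orders.get((patient_code, drug_code))

ok = verify_scan(orders, "PT-042", "NDC-7701")             # both scans match
wrong_patient = verify_scan(orders, "PT-043", "NDC-7701")  # blocked
```

The lookup succeeds only when patient and drug scans jointly match an order, mirroring the wrong-patient protection reported in the study.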
NASA Astrophysics Data System (ADS)
Cohen, K. K.; Klara, S. M.; Srivastava, R. D.
2004-12-01
The U.S. Department of Energy's (U.S. DOE's) Carbon Sequestration Program is developing state-of-the-science technologies for measurement, mitigation, and verification (MM&V) in field operations of geologic sequestration. MM&V of geologic carbon sequestration operations will play an integral role in the pre-injection, injection, and post-injection phases of carbon capture and storage projects to reduce anthropogenic greenhouse gas emissions. Effective MM&V is critical to the success of CO2 storage projects and will be used by operators, regulators, and stakeholders to ensure safe and permanent storage of CO2. In the U.S. DOE's Program, Carbon sequestration MM&V has numerous instrumental roles: Measurement of a site's characteristics and capability for sequestration; Monitoring of the site to ensure the storage integrity; Verification that the CO2 is safely stored; and Protection of ecosystems. Other drivers for MM&V technology development include cost-effectiveness, measurement precision, and frequency of measurements required. As sequestration operations are implemented in the future, it is anticipated that measurements over long time periods and at different scales will be required; this will present a significant challenge. MM&V sequestration technologies generally utilize one of the following approaches: below ground measurements; surface/near-surface measurements; aerial and satellite imagery; and modeling/simulations. Advanced subsurface geophysical technologies will play a primary role for MM&V. 
It is likely that successful MM&V programs will incorporate multiple technologies including but not limited to: reservoir modeling and simulations; geophysical techniques (a wide variety of seismic methods, microgravity, electrical, and electromagnetic techniques); subsurface fluid movement monitoring methods such as injection of tracers, borehole and wellhead pressure sensors, and tiltmeters; surface/near-surface methods such as soil gas monitoring and infrared sensors; and aerial and satellite imagery. This abstract will describe results, similarities, and contrasts for funded studies from the U.S. DOE's Carbon Sequestration Program, including examples from the Sleipner North Sea Project, the Canadian Weyburn Field/Dakota Gasification Plant Project, the Frio Formation Texas Project, and the Yolo County Bioreactor Landfill Project. The abstract will also address the following: How are the terms ``measurement,'' ``mitigation,'' and ``verification'' defined in the Program? What is the U.S. DOE's Carbon Sequestration Program Roadmap and what are the Roadmap goals for MM&V? What is the current status of MM&V technologies?
NASA Astrophysics Data System (ADS)
Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong
2007-03-01
As the minimum transistor length shrinks, the variation and uniformity of transistor length seriously affect device performance, so the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil for device performance. In fact, every group involved, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design-based metrology systems have become capable of detecting differences between the design database and wafer SEM images, and can extract whole-chip CD variation information. According to the results, OPC abnormalities were identified and design feedback items were also disclosed. Other approaches have been pursued by EDA companies, such as model-based OPC verification. Model-based verification is performed for the full chip area using a well-calibrated model; its object is the prediction of potential weak points on the wafer and fast feedback to OPC and design before reticle fabrication. In order to achieve robust design and sufficient device margin, an appropriate combination of a design-based metrology system and model-based verification tools is very important. Therefore, we evaluated a design-based metrology system and a matched model-based verification system to find the optimum combination of the two. In our study, a huge amount of wafer-result data was classified and analyzed by statistical methods and sorted into OPC-feedback and design-feedback items. Additionally, a novel DFM flow is proposed using the combination of design-based metrology and model-based verification tools.
Pella, A; Riboldi, M; Tagaste, B; Bianculli, D; Desplanques, M; Fontana, G; Cerveri, P; Seregni, M; Fattori, G; Orecchia, R; Baroni, G
2014-08-01
In an increasing number of clinical indications, radiotherapy with accelerated particles shows relevant advantages compared with high-energy X-ray irradiation. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The purpose of this work is to describe the commissioning and the design of the quality assurance procedures for the patient positioning and setup verification systems at the Italian National Center for Oncological Hadrontherapy (CNAO). The accuracy of the systems installed at CNAO for patient positioning and setup verification was assessed using a laser tracking device. The accuracy of calibration and of image-based setup verification relying on the in-room X-ray imaging system was also quantified. Quality assurance tests to check the integration among all patient setup systems were designed, and records of daily QA tests since the start of clinical operation (2011) are presented. The overall motion accuracy of the patient positioning system and the patient verification system was proved to be below 0.5 mm under all the examined conditions, with median values below the 0.3 mm threshold. Image-based registration in phantom studies exhibited sub-millimetric accuracy in setup verification at both cranial and extra-cranial sites. The calibration residuals of the OTS were found to be consistent with expectations, with peak values below 0.3 mm. Quality assurance tests, performed daily before clinical operation, confirm adequate integration and sub-millimetric setup accuracy. Robotic patient positioning was successfully integrated with optical tracking and stereoscopic X-ray verification for patient setup in particle therapy. Sub-millimetric setup accuracy was achieved and consistently verified in daily clinical operation.
1982-01-29
Computer Program User's Manual for FIREFINDER Digital Topographic Data Verification Library Dubbing System, Volume II: Dubbing, 29 January... Keywords: Software Library, FIREFINDER, Dubbing. This manual describes the computer
2003-03-01
...Different?," Journal of Experimental & Theoretical Artificial Intelligence, Special Issue on AI for Systems Validation and Verification, 12(4), 2000, pp. ... Hamilton, D., "Experiences in Improving the State of Practice in Verification and Validation of Knowledge-Based Systems," Workshop Notes of the AAAI... "Unsuspected Power of the Standard Turing Test," Journal of Experimental & Theoretical Artificial Intelligence, 12, 2000, pp. 331-340. [30] Gaschnig
Advancing the practice of systems engineering at JPL
NASA Technical Reports Server (NTRS)
Jansma, Patti A.; Jones, Ross M.
2006-01-01
In FY 2004, JPL launched an initiative to improve the way it practices systems engineering. The Lab's senior management formed the Systems Engineering Advancement (SEA) Project in order to "significantly advance the practice and organizational capabilities of systems engineering at JPL on flight projects and ground support tasks." The scope of the SEA Project includes the systems engineering work performed in all three dimensions of a program, project, or task: 1. the full life-cycle, i.e., concept through end of operations; 2. the full depth, i.e., Program, Project, System, Subsystem, Element (SE Levels 1 to 5); 3. the full technical scope, e.g., the flight, ground and launch systems, avionics, power, propulsion, telecommunications, thermal, etc. The initial focus of their efforts defined the following basic systems engineering functions at JPL: systems architecture, requirements management, interface definition, technical resource management, system design and analysis, system verification and validation, risk management, technical peer reviews, design process management, and systems engineering task management. They also developed a list of highly valued personal behaviors of systems engineers, and are working to inculcate those behaviors into members of their systems engineering community. The SEA Project is developing products, services, and training to support managers and practitioners throughout the entire system lifecycle. As these are developed, each one needs to be systematically deployed. Hence, the SEA Project developed a deployment process that includes four aspects: infrastructure and operations, communication and outreach, education and training, and consulting support. In addition, the SEA Project has taken a proactive approach to organizational change management and customer relationship management, both concepts and approaches not usually invoked in an engineering environment.
This paper describes JPL's approach to advancing the practice of systems engineering at the Lab. It describes the general approach used and how they addressed the three key aspects of change: people, process, and technology. It highlights a list of highly valued personal behaviors of systems engineers, discusses the various products, services, and training that were developed, describes the deployment approach used, and concludes with several lessons learned.
NASA Astrophysics Data System (ADS)
Detsis, Emmanouil; Brodsky, Yuval; Knudtson, Peter; Cuba, Manuel; Fuqua, Heidi; Szalai, Bianca
2012-11-01
Space assets have a unique opportunity to play a more active role in global resource management. There is a clear need to develop resource management tools in a global framework. Illegal, Unregulated and Unreported (IUU) fishing is placing pressure on the health and size of fishing stocks around the world. Earth observation systems can provide fishery management organizations with cost-effective monitoring of large swaths of ocean. Project Catch is a fisheries management project based upon the complementary but independent Catch-VMS and Catch-GIS systems. Catch-VMS is a Vessel Monitoring System with increased fidelity over existing offerings. Catch-GIS is a Geographical Information System that combines VMS information with existing Earth observation data and other data sources to identify IUU fishing. Project Catch was undertaken by 19 Masters students from the 2010 class of the International Space University. In this paper, the space-based system architecture of Project Catch is presented and analyzed. The rationale for the creation of the system, as well as the engineering trade-off studies in its creation, are discussed. The Catch-VMS proposal was envisaged to address two specific problems: (1) the expansion of illegal fishing to high-latitude regions where existing satellite systems' coverage is an issue, and (2) the lack of coverage in remote oceanic regions due to reliance on coastal-based monitoring. Catch-VMS utilizes ship-borne transponders and hosted-payload receivers on a Global Navigation Satellite System in order to monitor the position and activity of compliant fishing vessels. Coverage is global and continuous, with multiple satellites in view providing positional verification through multilateration techniques. The second part of the paper briefly describes the Catch-GIS system and investigates its cost of implementation.
A Research Program in Computer Technology
1976-07-01
PROGRAM VERIFICATION ... [Shaw76b] Shaw, M., W. A. Wulf, and R. L. London, "Abstraction and Verification in Alphard: Iteration and Generators" ... millisecond frame of speech: pitch, gain, and 10 k-parameters (often called reflection coefficients). The 12 parameters from each frame are encoded into ... Marina del Rey, CA 90291 ... Defense Advanced Research Projects Agency, July 1976
Distilling the Verification Process for Prognostics Algorithms
NASA Technical Reports Server (NTRS)
Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai
2013-01-01
The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.
Development and Verification of the Charring Ablating Thermal Protection Implicit System Solver
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Calvert, Nathan D.; Kirk, Benjamin S.
2010-01-01
The development and verification of the Charring Ablating Thermal Protection Implicit System Solver is presented. This work concentrates on the derivation and verification of the stationary grid terms in the equations that govern three-dimensional heat and mass transfer for charring thermal protection systems including pyrolysis gas flow through the porous char layer. The governing equations are discretized according to the Galerkin finite element method with first and second order implicit time integrators. The governing equations are fully coupled and are solved in parallel via Newton's method, while the fully implicit linear system is solved with the Generalized Minimal Residual method. Verification results from exact solutions and the Method of Manufactured Solutions are presented to show spatial and temporal orders of accuracy as well as nonlinear convergence rates.
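The Method of Manufactured Solutions check described above can be illustrated with a deliberately simplified sketch: a 1-D steady diffusion problem discretized with second-order central differences (not the paper's 3-D coupled finite element solver), where the observed spatial order of accuracy is recovered from the errors at two grid resolutions. The problem, grid sizes, and tolerances here are illustrative assumptions.

```python
import math

def solve_mms(n):
    """Steady 1-D diffusion u'' = f on [0,1], u(0)=u(1)=0, with the
    manufactured solution u_exact = sin(pi x), hence f = -pi^2 sin(pi x).
    Second-order central differences; tridiagonal (Thomas) solve.
    Returns the max-norm error against the manufactured solution."""
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    f = [-math.pi**2 * math.sin(math.pi * xi) for xi in x]
    # Interior unknowns u_1..u_{n-1}: (u_{i-1} - 2 u_i + u_{i+1}) / h^2 = f_i
    a = [1.0] * (n - 1)    # sub-diagonal
    b = [-2.0] * (n - 1)   # diagonal
    c = [1.0] * (n - 1)    # super-diagonal
    d = [f[i] * h * h for i in range(1, n)]
    for i in range(1, n - 1):          # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    u = [0.0] * (n - 1)
    u[-1] = d[-1] / b[-1]
    for i in range(n - 3, -1, -1):     # back substitution
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return max(abs(ui - math.sin(math.pi * x[i + 1])) for i, ui in enumerate(u))

# Halving h should quarter the error for a second-order scheme:
e1, e2 = solve_mms(32), solve_mms(64)
order = math.log(e1 / e2, 2)   # observed spatial order of accuracy, ~2
```

The same grid-refinement logic extends to temporal order and, with considerably more machinery, to the coupled nonlinear systems the paper verifies.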
Development and Verification of the Charring, Ablating Thermal Protection Implicit System Simulator
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Calvert, Nathan; Kirk, Benjamin S.
2011-01-01
The development and verification of the Charring Ablating Thermal Protection Implicit System Solver (CATPISS) is presented. This work concentrates on the derivation and verification of the stationary grid terms in the equations that govern three-dimensional heat and mass transfer for charring thermal protection systems, including pyrolysis gas flow through the porous char layer. The governing equations are discretized according to the Galerkin finite element method (FEM) with first and second order fully implicit time integrators. The governing equations are fully coupled and are solved in parallel via Newton's method, while the linear system is solved via the Generalized Minimal Residual method (GMRES). Verification results from exact solutions and the Method of Manufactured Solutions (MMS) are presented to show spatial and temporal orders of accuracy as well as nonlinear convergence rates.
A system for automatic evaluation of simulation software
NASA Technical Reports Server (NTRS)
Ryan, J. P.; Hodges, B. C.
1976-01-01
Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From the standpoint of software requirements methodology, each component of the verification system has some element of simulation in it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential to be used effectively in a simulation environment.
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Divito, Ben L.
1992-01-01
The design and formal verification of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications, is presented. The RCP uses N-Modular Redundant (NMR) style redundancy to mask faults and internal majority voting to flush the effects of transient faults. The system is formally specified and verified using the EHDM verification system. A major goal of this work is to provide the system with significant capability to withstand the effects of High Intensity Radiated Fields (HIRF).
Verification Testing: Meet User Needs Figure of Merit
NASA Technical Reports Server (NTRS)
Kelly, Bryan W.; Welch, Bryan W.
2017-01-01
Verification is the process through which Modeling and Simulation (M&S) software goes to ensure that it has been rigorously tested and debugged for its intended use. Validation confirms that said software accurately models and represents the real-world system. Credibility gives an assessment of the development and testing effort that the software has gone through, as well as how accurate and reliable the test results are. Together, these three components form Verification, Validation, and Credibility (VV&C), the process by which all NASA modeling software is to be tested to ensure that it is ready for implementation. NASA created this process following the CAIB (Columbia Accident Investigation Board) report, which sought to understand the reasons the Columbia space shuttle failed during reentry. The report's conclusion was that the accident was fully avoidable; however, among other issues, the data necessary to make an informed decision were not available, and the result was complete loss of the shuttle and crew. In an effort to mitigate this problem, NASA put out its Standard for Models and Simulations, currently in version NASA-STD-7009A, in which it detailed its recommendations, requirements, and rationale for the different components of VV&C. This was done with the intention that people receiving M&S software could clearly understand, and have data from, the past development effort. This in turn would allow people who had not worked with the M&S software before to move forward with greater confidence and efficiency in their work. This particular project looks to perform Verification on several MATLAB (Registered Trademark) (The MathWorks, Inc.) scripts that will later be implemented in a website interface. It seeks to note and define the limits of operation, the units and significance, and the expected datatype and format of the inputs and outputs of each of the scripts.
This is intended to prevent the code from attempting to make incorrect or impossible calculations. Additionally, this project will look at the coding generally and note inconsistencies, redundancies, and other aspects that may become problematic or slow down the code's run time. Certain scripts lacking documentation will also be commented and cataloged.
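The kind of input checking described above (limits of operation, expected datatype, units) might be sketched as follows. The parameter names, ranges, and units here are hypothetical illustrations, not taken from the actual NASA scripts:

```python
def validate_inputs(spec, values):
    """Check each input against its documented type and range before the
    model runs, instead of letting it attempt impossible calculations.
    `spec` maps name -> (type, (lo, hi), units); returns a list of
    human-readable violations (empty list means all inputs are valid)."""
    errors = []
    for name, (typ, (lo, hi), units) in spec.items():
        v = values.get(name)
        if not isinstance(v, typ):
            errors.append(f"{name}: expected {typ.__name__}, got {type(v).__name__}")
        elif not (lo <= v <= hi):
            errors.append(f"{name}: {v} outside [{lo}, {hi}] {units}")
    return errors

# Hypothetical link-budget-style inputs, purely for illustration:
spec = {"frequency_GHz": (float, (1.0, 40.0), "GHz"),
        "range_km": (float, (200.0, 400000.0), "km")}
ok = validate_inputs(spec, {"frequency_GHz": 26.5, "range_km": 384400.0})
bad = validate_inputs(spec, {"frequency_GHz": 90.0, "range_km": 100.0})
```

A wrapper like this, run before the scripts' calculations, fails fast with a diagnostic instead of propagating nonsense values into the results.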
Application of computer vision to automatic prescription verification in pharmaceutical mail order
NASA Astrophysics Data System (ADS)
Alouani, Ali T.
2005-05-01
In large-volume pharmaceutical mail order, before shipping out prescriptions, licensed pharmacists ensure that the drug in the bottle matches the information provided in the patient prescription. Typically, the pharmacist has about 2 sec to complete the verification of one prescription. Performing about 1800 prescription verifications per hour is tedious and can generate human errors as a result of visual and brain fatigue. Available automatic drug verification systems are limited to a single pill at a time. This is not suitable for large-volume pharmaceutical mail order, where a prescription can have as many as 60 pills and where thousands of prescriptions are filled every day. In an attempt to reduce human fatigue and cost, and to limit human error, the automatic prescription verification system (APVS) was invented to meet the needs of large-scale pharmaceutical mail order. This paper deals with the design and implementation of the first prototype online automatic prescription verification machine to perform the same task currently done by a pharmacist. The emphasis here is on the visual aspects of the machine. The system has been successfully tested on 43,000 prescriptions.
Future of Mechatronics and Human
NASA Astrophysics Data System (ADS)
Harashima, Fumio; Suzuki, Satoshi
This paper describes the circumstances of the mechatronics that sustain our human society and introduces the HAM (Human Adaptive Mechatronics) project as one of the research projects aimed at creating new human-machine systems. The key point of HAM is skill; the analysis of skill and the establishment of assist methods to enhance the total performance of human-machine systems are the main research concerns. As the study of skill is an elucidation of the human itself, analyses of higher human functions are significant. In this paper, after surveying research on human brain functions, an experimental analysis of human characteristics in machine operation is shown as one example of our research activities. We used a hovercraft simulator as a verification system involving observation, voluntary motion control, and machine operation, which are needed for general machine operation. The process and factors involved in becoming skilled were investigated by identifying human control characteristics while measuring the operator's line of sight. It was confirmed that early switching of sub-controllers/reference signals in the human and enhancement of space perception are significant.
ERTS program of the US Army Corps of Engineers. [water resources]
NASA Technical Reports Server (NTRS)
Jarman, J. W.
1974-01-01
The Army Corps of Engineers research and development efforts associated with the ERTS Program are confined to applications of investigation, design, construction, operation, and maintenance of water resource projects. Problems investigated covered: (1) resource inventory; (2) environmental impact; (3) pollution monitoring; (4) water circulation; (5) sediment transport; (6) data collection systems; (7) engineering; and (8) model verification. These problem areas were investigated in relation to bays, reservoirs, lakes, rivers, coasts, and regions. ERTS-1 imagery has been extremely valuable in developing techniques and is now being used in everyday applications.
A Case Study of IV&V Cost Effectiveness
NASA Technical Reports Server (NTRS)
Neal, Ralph D.; McCaugherty, Dan; Joshi, Tulasi; Callahan, John
1997-01-01
This paper looks at the Independent Verification and Validation (IV&V) of NASA's Space Shuttle Day of Launch I-Load Update (DoLILU) project. IV&V is defined. The system's development life cycle is explained. Data collection and analysis are described. DoLILU Issue Tracking Reports (DITRs) authored by IV&V personnel are analyzed to determine the effectiveness of IV&V in finding errors before the code, testing, and integration phase of the software development life cycle. The study's findings are reported along with the limitations of the study and planned future research.
Middleware Trade Study for NASA Domain
NASA Technical Reports Server (NTRS)
Bowman, Dan
2007-01-01
This presentation gives preliminary results of a trade study designed to assess three distributed simulation middleware technologies for support of the NASA Constellation Distributed Space Exploration Simulation (DSES) project and the Test and Verification Distributed System Integration Laboratory (DSIL). The technologies are: the High Level Architecture (HLA), the Test and Training Enabling Architecture (TENA), and an XML-based variant of Distributed Interactive Simulation (DIS-XML) coupled with the Extensible Messaging and Presence Protocol (XMPP). According to the criteria and weights determined in this study, HLA scores better than the other two for both DSES and the DSIL.
NASA Technical Reports Server (NTRS)
Dunham, J. R. (Editor); Knight, J. C. (Editor)
1982-01-01
The state of the art in the production of crucial software for flight control applications was addressed. The association between reliability metrics and software is considered. Thirteen software development projects are discussed. A short-term need for research in the areas of tool development and software fault tolerance was indicated. For the long term, research in formal verification or proof methods was recommended. Formal specification and software reliability modeling were recommended as topics for both short- and long-term research.
Verification testing of the SUNTEC LPX200 UV Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Two lamp modules were mounted parallel in a 6.5-meter lon...
NASA Technical Reports Server (NTRS)
1975-01-01
The findings of investigations on concepts and techniques in automated performance verification are presented. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.
1998-01-01
Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.
ETV REPORT AND VERIFICATION STATEMENT; EVALUATION OF LOBO LIQUIDS RINSE WATER RECOVERY SYSTEM
The Lobo Liquids Rinse Water Recovery System (Lobo Liquids system) was tested, under actual production conditions, processing metal finishing wastewater, at Gull Industries in Houston, Texas. The verification test evaluated the ability of the ion exchange (IX) treatment system t...
DOT National Transportation Integrated Search
2005-09-01
This document describes a procedure for verifying a dynamic testing system (closed-loop servohydraulic). The procedure is divided into three general phases: (1) electronic system performance verification, (2) calibration check and overall system perf...
Cluster man/system design requirements and verification. [for Skylab program]
NASA Technical Reports Server (NTRS)
Watters, H. H.
1974-01-01
Discussion of the procedures employed for determining the man/system requirements that guided Skylab design, and review of the techniques used for implementing man/system design verification. The foremost lesson learned from this experience of anticipating design needs and verifying the design is the necessity of allowing for human capabilities of in-flight maintenance and repair. It is now known that the entire program was salvaged by a series of unplanned maintenance and repair events, which were implemented in spite of poor design provisions for maintenance.
NASA Technical Reports Server (NTRS)
Probst, D.; Jensen, L.
1991-01-01
Delay-insensitive VLSI systems have a certain appeal on the ground due to difficulties with clocks; they are even more attractive in space. We answer the question: is it possible to control the state explosion arising from various sources during automatic verification (model checking) of delay-insensitive systems? State explosion due to concurrency is handled by introducing a partial-order representation for systems and defining system correctness as a simple relation between two partial orders on the same set of system events (a graph problem). State explosion due to nondeterminism (chiefly arbitration) is handled when the system to be verified has a clean, finite recurrence structure. Backwards branching is a further optimization. The heart of this approach is the ability, during model checking, to discover a compact finite presentation of the verified system without prior composition of system components. The fully implemented POM verification system has polynomial space and time performance on traditional asynchronous-circuit benchmarks that are exponential in space and time for other verification systems. We also sketch the generalization of this approach to handle delay-constrained VLSI systems.
Formal methods for dependable real-time systems
NASA Technical Reports Server (NTRS)
Rushby, John
1993-01-01
The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that have been proposed and used are sketched. The formal verifications of clock synchronization algorithms are taken to show that mechanically supported reasoning about complex real-time behavior is feasible. Moreover, there has been a significant increase in the effectiveness of verification systems since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.
The Pilot Phase of the Global Soil Wetness Project Phase 3
NASA Astrophysics Data System (ADS)
Kim, H.; Oki, T.
2015-12-01
After the second phase of the Global Soil Wetness Project (GSWP2), an early global continuous gridded multi-model analysis, a comprehensive set of land surface fluxes and state variables became available. It has been broadly utilized in the hydrology community, and building on its success the project has evolved to take advantage of recent scientific progress and to extend the relatively short time span (1986-1995) of the previous phase. In the third phase proposed here (GSWP3), an extensive set of quantities for hydro-energy-eco systems will be produced to investigate their long-term (1901-2010) changes. The energy-water-carbon cycles and their interactions are also examined subcomponent by subcomponent, with appropriate model verifications, in ensemble land simulations. In this study, the preliminary results and problems found in the first-round analysis of the GSWP3 pilot study are shown. It is also discussed how this global offline simulation activity contributes to wider communities and a bigger scope, such as the Climate Model Intercomparison Project Phase 6 (CMIP6).
The Environmental Technology Verification report discusses the technology and performance of a gaseous-emissions monitoring system for large, natural-gas-fired internal combustion engines. The device tested is the Parametric Emissions Monitoring System (PEMS) manufactured by ANR ...
Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder
NASA Technical Reports Server (NTRS)
Lindsey, A. E.; Pecheur, Charles
2004-01-01
AI software is often used as a means for providing greater autonomy to automated systems, capable of coping with harsh and unpredictable environments. Due in part to the enormous space of possible situations that they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state space exploration algorithms to an instrumented testbed, consisting of the controller embedded in a simulated operating environment. Although LPF has focused on applications of NASA's Livingstone model-based diagnosis system, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem for a prototype space vehicle.
Autonomous and Autonomic Swarms
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy
2005-01-01
A watershed in systems engineering is represented by the advent of swarm-based systems that accomplish missions through cooperative action by a (large) group of autonomous individuals, each having simple capabilities and no global knowledge of the group's objective. Such systems, with individuals capable of surviving in hostile environments, pose unprecedented challenges to system developers. Design, testing, and verification at much higher levels will be required, together with the corresponding tools, to bring such systems to fruition. Concepts for possible future NASA space exploration missions include autonomous, autonomic swarms. Engineering swarm-based missions begins with understanding autonomy and autonomicity and how to design, test, and verify systems that have those properties and, simultaneously, the capability to accomplish prescribed mission goals. Formal methods-based technologies, both projected and in development, are described in terms of their potential utility to swarm-based system developers.
Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo
2012-01-01
Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which can lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named ‘Industrial Methodology for Process Verification in Research’ (IMPROVER), consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by ‘crowd-sourcing’ to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044
Outdoor Irrigation Measurement and Verification Protocol
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W.; Stoughton, Kate M.; Figueroa, Jorge
This measurement and verification (M&V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings resulting from water conservation measures (WCMs) in energy performance contracts associated with outdoor irrigation efficiency projects. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M&V plan, and details the procedures to use to determine water savings.
2010-03-01
The goal of this project is to develop a novel, clinically useful delivered-dose verification protocol for modern prostate VMAT using an Electronic Portal Imaging Device (EPID) and onboard cone-beam computed tomography (CBCT). A number of important milestones have been accomplished, including (i) a calibrated CBCT HU vs. electron density curve; (ii) …
Apollo experience report: Guidance and control systems. Engineering simulation program
NASA Technical Reports Server (NTRS)
Gilbert, D. W.
1973-01-01
The Apollo Program experience from early 1962 to July 1969 with respect to the engineering-simulation support and the problems encountered is summarized in this report. Engineering simulation in support of the Apollo guidance and control system is discussed in terms of design analysis and verification, certification of hardware in closed-loop operation, verification of hardware/software compatibility, and verification of both software and procedures for each mission. The magnitude, time, and cost of the engineering simulations are described with respect to hardware availability, NASA and contractor facilities (for verification of the command module, the lunar module, and the primary guidance, navigation, and control system), and scheduling and planning considerations. Recommendations are made regarding implementation of similar, large-scale simulations for future programs.
Sheet Metal Contract. Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Kirkpatrick, Thomas; Sappe', Hoyt
This report provides results of Phase I of a project that researched the occupational area of sheet metal, established appropriate committees, and conducted task verification. These results are intended to guide development of a program designed to train sheet metal workers. Section 1 contains general information: purpose of Phase I; description…
NASA Astrophysics Data System (ADS)
Murtiyoso, A.; Grussenmeyer, P.; Börlin, N.
2017-11-01
Photogrammetry has recently seen a rapid increase in many applications, thanks to developments in computing power and algorithms. Furthermore, with the democratisation of UAVs (Unmanned Aerial Vehicles), close range photogrammetry has seen more and more use due to the easier capability to acquire aerial close range images. In terms of photogrammetric processing, many commercial software solutions exist in the market that offer results from user-friendly environments. However, in most commercial solutions, a black-box approach to photogrammetric calculations is often used. This is understandable in light of the proprietary nature of the algorithms, but it may pose a problem if the results need to be validated in an independent manner. In this paper, the Damped Bundle Adjustment Toolbox (DBAT), developed for Matlab, was used to reprocess some photogrammetric projects that had been processed using the commercial software Agisoft PhotoScan (PS). Several scenarios were tested in order to assess the performance of DBAT in reprocessing terrestrial and UAV close range photogrammetric projects in several configurations of self-calibration settings. Results show that DBAT managed to reprocess PS projects and generate metrics which can be useful for project verification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amadio, G.; et al.
An intensive R&D and programming effort is required to accomplish the new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting particles in parallel in complex geometries, exploiting instruction-level microparallelism (SIMD and SIMT), task-level parallelism (multithreading), and high-level parallelism (MPI), leveraging both multi-core and many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics models effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
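The specific statistical tests used in GeantV are not listed in the abstract; a two-sample Kolmogorov-Smirnov comparison is one standard way to check that samples drawn from a vectorized model are consistent with a reference model. A NumPy sketch, with synthetic stand-in samples in place of real simulation output:

```python
import numpy as np

def ks_two_sample(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap between
    the empirical CDFs of the two samples."""
    a, b = np.sort(np.asarray(a, float)), np.sort(np.asarray(b, float))
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

# Stand-ins for samples from a Geant4 model and its vectorized port.
rng = np.random.default_rng(1)
reference = rng.exponential(1.0, size=10_000)
vectorized = rng.exponential(1.0, size=10_000)
print(ks_two_sample(reference, vectorized))  # small statistic: distributions agree
```

A large statistic would flag a vectorized physics model whose sampled distribution has drifted from the scalar reference, which is the kind of consistency check the abstract describes automating.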
Fluorescence guided lymph node biopsy in large animals using direct image projection device
NASA Astrophysics Data System (ADS)
Ringhausen, Elizabeth; Wang, Tylon; Pitts, Jonathan; Akers, Walter J.
2016-03-01
The use of fluorescence imaging for aiding oncologic surgery is a fast growing field in biomedical imaging, revolutionizing open and minimally invasive surgery practices. We have designed, constructed, and tested a system for fluorescence image acquisition and direct display on the surgical field for fluorescence guided surgery. The system uses a near-infrared sensitive CMOS camera for image acquisition, a near-infrared LED light source for excitation, and a DLP digital projector for projection of fluorescence image data onto the operating field in real time. Instrument control was implemented in Matlab for image capture, processing of acquired data, and alignment of image parameters with the projected pattern. Accuracy of alignment was evaluated statistically to demonstrate sensitivity to small objects and alignment throughout the imaging field. After verification of accurate alignment, feasibility for clinical application was demonstrated in large animal models of sentinel lymph node biopsy. Indocyanine green was injected subcutaneously in Yorkshire pigs at various locations to model sentinel lymph node biopsy in gynecologic cancers, head and neck cancer, and melanoma. Fluorescence was detected by the camera system during operations and projected onto the imaging field, accurately identifying tissues containing the fluorescent tracer at up to 15 frames per second. Fluorescence information was projected as binary green regions after thresholding and denoising raw intensity data. This initial clinical-scale prototype provided encouraging results for the feasibility of optical projection of acquired luminescence during open oncologic surgeries.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards... AGRICULTURAL COMMODITIES (QUALITY SYSTEMS VERIFICATION PROGRAMS) Quality Systems Verification Programs Definitions Service § 62.209 Reassessment. Approved programs are subject to periodic reassessments to ensure...
Integrating MBSE into Ongoing Projects: Requirements Validation and Test Planning for the ISS SAFER
NASA Technical Reports Server (NTRS)
Anderson, Herbert A.; Williams, Antony; Pierce, Gregory
2016-01-01
The International Space Station (ISS) Simplified Aid for Extra Vehicular Activity (EVA) Rescue (SAFER) is the spacewalking astronaut's final safety measure against separating from the ISS and being unable to return safely. Since the late 1990s, the SAFER has been a standard element of the spacewalking astronaut's equipment. The ISS SAFER project was chartered to develop a new block of SAFER units using a highly similar design to the legacy SAFER (known as the USA SAFER). An on-orbit test module was also included in the project to enable periodic maintenance/propulsion system checkout on the ISS SAFER. On the ISS SAFER project, model-based systems engineering (MBSE) was not the initial systems engineering (SE) approach, given the volume of heritage systems engineering and integration (SE&I) products. The initial emphasis was ensuring traceability to ISS program standards as well as to legacy USA SAFER requirements. The requirements management capabilities of the Cradle systems engineering tool were to be utilized to that end. During development, however, MBSE approaches were applied selectively to address specific challenges in requirements validation and test and verification (T&V) planning, which provided measurable efficiencies to the project. From an MBSE perspective, ISS SAFER development presented a challenge and an opportunity. Addressing the challenge first, the project was tasked to use the original USA SAFER operational and design requirements baseline, with a number of additional ISS program requirements to address evolving certification expectations for systems operating on the ISS. Additionally, a need to redesign the ISS SAFER avionics architecture resulted in a set of changes to the design requirements baseline. Finally, the project added an entirely new functionality for on-orbit maintenance. After initial requirements integration, the system requirements count was approaching 1000, which represented a growth of 4x over the original USA SAFER system. 
This presented the challenge: how to confirm that this new requirements set would result in the creation of the desired capability.
Computational control of flexible aerospace systems
NASA Technical Reports Server (NTRS)
Sharpe, Lonnie, Jr.; Shen, Ji Yao
1994-01-01
The main objective of this project is to establish a distributed parameter modeling technique for structural analysis, parameter estimation, vibration suppression and control synthesis of large flexible aerospace structures. This report concentrates on the research outputs produced in the last two years. The main accomplishments can be summarized as follows. A new version of the PDEMOD Code had been completed based on several incomplete versions. The verification of the code had been conducted by comparing the results with those examples for which the exact theoretical solutions can be obtained. The theoretical background of the package and the verification examples has been reported in a technical paper submitted to the Joint Applied Mechanics & Material Conference, ASME. A brief USER'S MANUAL had been compiled, which includes three parts: (1) Input data preparation; (2) Explanation of the Subroutines; and (3) Specification of control variables. Meanwhile, a theoretical investigation of the NASA MSFC two-dimensional ground-based manipulator facility by using distributed parameter modeling technique has been conducted. A new mathematical treatment for dynamic analysis and control of large flexible manipulator systems has been conceived, which may provide an embryonic form of a more sophisticated mathematical model for future modified versions of the PDEMOD Codes.
Formalizing procedures for operations automation, operator training and spacecraft autonomy
NASA Technical Reports Server (NTRS)
Lecouat, Francois; Desaintvincent, Arnaud
1994-01-01
The generation and validation of operations procedures is a key task of mission preparation that is quite complex and costly. This has motivated the development of software applications providing support for procedure preparation. Several applications have been developed at MATRA MARCONI SPACE (MMS) over the last five years. They are presented in the first section of this paper. The main idea is that if procedures are represented in a formal language, they can be managed more easily with a computer tool and some automatic verifications can be performed. One difficulty is to define a formal language that is easy to use for operators and operations engineers. From the experience of the various procedure management tools developed over the last five years (including the POM, EOA, and CSS projects), MMS has derived OPSMAKER, a generic tool for procedure elaboration and validation. It has been applied to quite different types of missions, ranging from crew procedures (the PREVISE system) and ground control center management procedures (the PROCSU system) to - most relevant to the present paper - satellite operation procedures (PROCSAT, developed for CNES to support the preparation and verification of SPOT 4 operation procedures, and OPSAT for MMS telecom satellite operation procedures).
Clinical Experience and Evaluation of Patient Treatment Verification With a Transit Dosimeter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricketts, Kate, E-mail: k.ricketts@ucl.ac.uk; Department of Radiotherapy Physics, Royal Berkshire NHS Foundation Trust, Reading; Navarro, Clara
2016-08-01
Purpose: To prospectively evaluate a protocol for transit dosimetry on a patient population undergoing intensity modulated radiation therapy (IMRT) and to assess the issues in clinical implementation of electronic portal imaging devices (EPIDs) for treatment verification. Methods and Materials: Fifty-eight patients were enrolled in the study. Amorphous silicon EPIDs were calibrated for dose and used to acquire images of delivered fields. Measured EPID dose maps were back-projected using the planning computed tomographic (CT) images to calculate dose at prespecified points within the patient and compared with treatment planning system dose offline using point dose difference and point γ analysis. The deviation of the results was used to inform future action levels. Results: Two hundred twenty-five transit images were analyzed, composed of breast, prostate, and head and neck IMRT fields. Patient measurements demonstrated the potential of the dose verification protocol to model dose well under complex conditions: 83.8% of all delivered beams achieved the initial set tolerance level of ΔD of 0 ± 5 cGy or %ΔD of 0% ± 5%. Importantly, the protocol was also sensitive to anatomic changes and spotted that 3 patients from 20 measured prostate patients had undergone anatomic change in comparison with the planning CT. Patient data suggested an EPID-reconstructed versus treatment planning system dose difference action level of 0% ± 7% for breast fields. Asymmetric action levels were more appropriate for inversed IMRT fields, using absolute dose difference (−2 ± 5 cGy) or summed field percentage dose difference (−6% ± 7%). Conclusions: The in vivo dose verification method was easy to use and simple to implement, and it could detect patient anatomic changes that impacted dose delivery. The system required no extra dose to the patient or treatment time delay and so could be used throughout the course of treatment to identify and limit systematic and random errors in dose delivery for patient groups.
Clinical Experience and Evaluation of Patient Treatment Verification With a Transit Dosimeter.
Ricketts, Kate; Navarro, Clara; Lane, Katherine; Blowfield, Claire; Cotten, Gary; Tomala, Dee; Lord, Christine; Jones, Joanne; Adeyemi, Abiodun
2016-08-01
To prospectively evaluate a protocol for transit dosimetry on a patient population undergoing intensity modulated radiation therapy (IMRT) and to assess the issues in clinical implementation of electronic portal imaging devices (EPIDs) for treatment verification. Fifty-eight patients were enrolled in the study. Amorphous silicon EPIDs were calibrated for dose and used to acquire images of delivered fields. Measured EPID dose maps were back-projected using the planning computed tomographic (CT) images to calculate dose at prespecified points within the patient and compared with treatment planning system dose offline using point dose difference and point γ analysis. The deviation of the results was used to inform future action levels. Two hundred twenty-five transit images were analyzed, composed of breast, prostate, and head and neck IMRT fields. Patient measurements demonstrated the potential of the dose verification protocol to model dose well under complex conditions: 83.8% of all delivered beams achieved the initial set tolerance level of ΔD of 0 ± 5 cGy or %ΔD of 0% ± 5%. Importantly, the protocol was also sensitive to anatomic changes and spotted that 3 patients from 20 measured prostate patients had undergone anatomic change in comparison with the planning CT. Patient data suggested an EPID-reconstructed versus treatment planning system dose difference action level of 0% ± 7% for breast fields. Asymmetric action levels were more appropriate for inversed IMRT fields, using absolute dose difference (-2 ± 5 cGy) or summed field percentage dose difference (-6% ± 7%). The in vivo dose verification method was easy to use and simple to implement, and it could detect patient anatomic changes that impacted dose delivery. The system required no extra dose to the patient or treatment time delay and so could be used throughout the course of treatment to identify and limit systematic and random errors in dose delivery for patient groups. Copyright © 2016 Elsevier Inc. 
All rights reserved.
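As an illustration of the action levels reported in the transit-dosimetry study above, the per-point comparison reduces to a range check against a possibly asymmetric tolerance. The function name and call shape below are assumptions for the sketch, not the authors' code:

```python
def within_action_level(measured, planned, low, high, percentage=False):
    """Flag a point-dose comparison against a (possibly asymmetric) action level.
    `low`/`high` bound the dose difference, in cGy or in percent of planned dose."""
    diff = measured - planned
    if percentage:
        diff = 100.0 * diff / planned
    return low <= diff <= high

# Breast fields: symmetric percentage action level of 0% ± 7%.
print(within_action_level(101.0, 100.0, -7.0, 7.0, percentage=True))  # True
# Inverse-planned IMRT: asymmetric absolute level (−2 ± 5) cGy, i.e. [−7, 3] cGy.
print(within_action_level(96.0, 100.0, -7.0, 3.0))                    # True
```

An asymmetric window like [−7, 3] cGy encodes the study's observation that EPID-reconstructed doses for inverse-planned fields tend to sit systematically below the planned dose.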
Compressive sensing using optimized sensing matrix for face verification
NASA Astrophysics Data System (ADS)
Oey, Endra; Jeffry; Wongso, Kelvin; Tommy
2017-12-01
Biometrics offers a solution to common problems with password-based data access, such as forgotten passwords and the difficulty of recalling many different ones. With biometrics, a person's physical characteristics can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether a user is authorized to access the data. Facial biometrics was chosen for its low implementation cost and reasonably accurate identification results. The face verification system adopted in this research is based on the Compressive Sensing (CS) technique, which reduces the dimensionality of the facial test image and encrypts it by representing the image as a sparse signal. The encrypted data can be reconstructed using a sparse coding algorithm; two such algorithms, Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are compared in this face verification research. The Euclidean norm between the reconstructed sparse signal and the user's sparse signal previously saved in the system then determines the validity of the facial test image. With a non-optimized sensing matrix, system accuracy is 99% for IRLS (verification response time 4.917 seconds) and 96.33% for OMP (0.4046 seconds); with an optimized sensing matrix, it is 99% for IRLS (13.4791 seconds) and 98.33% for OMP (3.1571 seconds).
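The recovery-and-matching step described above can be sketched in NumPy. The paper's actual sensing matrix, dictionary, and acceptance threshold are not specified in the abstract; the random matrix, 3-sparse toy signal, and `verify` threshold below are illustrative assumptions around a minimal OMP implementation:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily recover a k-sparse x from y = A @ x."""
    residual = y.astype(float)
    support, coef = [], np.array([])
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))  # atom most correlated with residual
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)  # refit on support
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

def verify(x_recovered, x_enrolled, threshold=0.5):
    """Accept when the Euclidean distance to the enrolled sparse code is small."""
    return bool(np.linalg.norm(x_recovered - x_enrolled) < threshold)

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))      # random sensing matrix, m << n
A /= np.linalg.norm(A, axis=0)          # unit-norm columns
x_true = np.zeros(100)
x_true[[5, 40, 77]] = [1.0, -0.8, 0.5]  # toy 3-sparse "facial" feature vector
y = A @ x_true                          # compressed (and thus obscured) measurement
x_hat = omp(A, y, k=3)
print(verify(x_hat, x_true))            # accept/reject decision
```

In the noiseless, well-conditioned regime shown here, OMP recovers the support exactly and the least-squares refit makes the reconstruction exact to machine precision, so the Euclidean-norm test is decisive.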
FY17 ISCR Scholar End-of-Assignment Report - Robbie Sadre
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sadre, R.
2017-10-20
Throughout this internship assignment, I did various tasks that contributed towards the starting of the SASEDS (Safe Active Scanning for Energy Delivery Systems) and CES-21 (California Energy Systems for the 21st Century) projects in the SKYFALL laboratory. The goal of the SKYFALL laboratory is to perform modeling and simulation verification of transmission power system devices, while integrating with high-performance computing. The first thing I needed to do was acquire official online LabVIEW training from National Instruments. Through these online tutorial modules, I learned the basics of LabVIEW, gaining experience in connecting to NI devices through the DAQmx API as well as LabVIEW basic programming techniques (structures, loops, state machines, front panel GUI design, etc.).
A Verification System for Distributed Objects with Asynchronous Method Calls
NASA Astrophysics Data System (ADS)
Ahrendt, Wolfgang; Dylla, Maximilian
We present a verification system for Creol, an object-oriented modeling language for concurrent distributed applications. The system is an instance of KeY, a framework for object-oriented software verification, which has so far been applied foremost to sequential Java. Building on characteristic KeY concepts, like dynamic logic, sequent calculus, explicit substitutions, and the taclet rule language, the system presented in this paper addresses functional correctness of Creol models featuring local cooperative thread parallelism and global communication via asynchronous method calls. The calculus operates heavily on communication histories, which describe the interfaces of Creol units. Two example scenarios demonstrate the usage of the system.
40 CFR 1065.920 - PEMS calibrations and verifications.
Code of Federal Regulations, 2012 CFR
2012-07-01
... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...
40 CFR 1065.920 - PEMS calibrations and verifications.
Code of Federal Regulations, 2013 CFR
2013-07-01
... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...
40 CFR 1065.920 - PEMS calibrations and verifications.
Code of Federal Regulations, 2011 CFR
2011-07-01
... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...
A new verification film system for routine quality control of radiation fields: Kodak EC-L.
Hermann, A; Bratengeier, K; Priske, A; Flentje, M
2000-06-01
The use of modern irradiation techniques requires better verification films for determining set-up deviations and patient movements during the course of radiation treatment. This is an investigation of the image quality and time requirement of a new verification film system compared to a conventional portal film system. For conventional verifications we used Agfa Curix HT 1000 films, which were compared to the new Kodak EC-L film system. 344 Agfa Curix HT 1000 and 381 Kodak EC-L portal films of different tumor sites (prostate, rectum, head and neck) were visually judged on a light box by 2 experienced physicians. Subjective judgement of image quality, masking of films, and time requirement were checked. In this investigation, 68% of 175 Kodak EC-L ap/pa films were judged "good", 18% "moderate", and 14% "poor", whereas only 22% of 173 conventional ap/pa verification films (Agfa Curix HT 1000) were judged to be "good". The image quality, detail perception, and time required for film inspection of the new Kodak EC-L film system were significantly improved compared with standard portal films. The films could be read more accurately, and the detection of set-up deviations was facilitated.
Casado, María; de Lecuona, Itziar
2013-01-01
This paper identifies problems and analyzes the conflicts posed by the evaluation of research projects involving the collection and use of human induced pluripotent stem cells (iPS) in Spain. Current legislation is causing problems of interpretation, circular and unnecessary referrals, legal uncertainty, and undue delays. This situation may cause a lack of control and monitoring, and even some paralysis, in regenerative medicine and cell therapy research, which is a priority nowadays. The analysis of the current legislation and its bioethical implications led us to conclude that the review of iPS research projects cannot be assimilated to the evaluation of research projects that involve human embryonic stem cells (hESC). In this context, our proposal is based on review by the Research Ethics Committees and checkout by the Spanish Commission of Guarantees for Donation and Use of Human Cells and Tissues (CGDUCTH) of human iPS cell research projects. Moreover, this article calls for a more transparent research system, achieved by effectively articulating the Registry of Research Projects. Finally, a model verification protocol (checklist) for reviewing biomedical research projects involving human iPS cells is suggested.
7 CFR 62.213 - Official identification.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards... AGRICULTURAL COMMODITIES (QUALITY SYSTEMS VERIFICATION PROGRAMS) Quality Systems Verification Programs Definitions Service § 62.213 Official identification. The following, as shown in figure 1, constitutes...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Services. 62.200 Section 62.200 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards... AGRICULTURAL COMMODITIES (QUALITY SYSTEMS VERIFICATION PROGRAMS) Quality Systems Verification Programs...
7 CFR 62.207 - Official assessment.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards... AGRICULTURAL COMMODITIES (QUALITY SYSTEMS VERIFICATION PROGRAMS) Quality Systems Verification Programs Definitions Service § 62.207 Official assessment. Official assessment of an applicant's program shall include...
Proceedings of the First NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)
2009-01-01
Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.
Remote geologic structural analysis of Yucca Flat
NASA Astrophysics Data System (ADS)
Foley, M. G.; Heasler, P. G.; Hoover, K. A.; Rynes, N. J.; Thiessen, R. L.; Alfaro, J. L.
1991-12-01
The Remote Geologic Analysis (RGA) system was developed by Pacific Northwest Laboratory (PNL) to identify crustal structures that may affect seismic wave propagation from nuclear tests. Using automated methods, the RGA system identifies all valleys in a digital elevation model (DEM), fits three-dimensional vectors to valley bottoms, and catalogs all potential fracture or fault planes defined by coplanar pairs of valley vectors. The system generates a cluster hierarchy of planar features having greater-than-random density that may represent areas of anomalous topography manifesting structural control of erosional drainage development. Because RGA uses computer methods to identify zones of hypothesized control of topography, ground truth using a well-characterized test site was critical in our evaluation of RGA's characterization of inaccessible test sites for seismic verification studies. Therefore, we applied RGA to a study area centered on Yucca Flat at the Nevada Test Site (NTS) and compared our results with both mapped geology and geologic structures and with seismic yield-magnitude models. This is the final report of PNL's RGA development project for peer review within the U.S. Department of Energy Office of Arms Control (OAC) seismic-verification community. In this report, we discuss the Yucca Flat study area, the analytical basis of the RGA system and its application to Yucca Flat, the results of the analysis, and the relation of the analytical results to known topography, geology, and geologic structures.
NASA Technical Reports Server (NTRS)
Mitchell, Mark A.; Lowrey, Nikki M.
2015-01-01
Since the 1990s, when the Class I Ozone Depleting Substance (ODS) chlorofluorocarbon-113 (CFC-113) was banned, NASA's rocket propulsion test facilities at Marshall Space Flight Center (MSFC) and Stennis Space Center (SSC) have relied upon hydrochlorofluorocarbon-225 (HCFC-225) to safely clean and verify the cleanliness of large scale propulsion oxygen systems. Effective January 1, 2015, the production, import, export, and new use of HCFC-225, a Class II ODS, was prohibited by the Clean Air Act. From 2012 through 2014, leveraging resources from both the NASA Rocket Propulsion Test Program and the Defense Logistics Agency - Aviation Hazardous Minimization and Green Products Branch, test labs at MSFC, SSC, and Johnson Space Center's White Sands Test Facility (WSTF) collaborated to seek out, test, and qualify a replacement for HCFC-225 that is both an effective cleaner and safe for use with oxygen systems. Candidate solvents were selected and a test plan was developed following the guidelines of ASTM G127, Standard Guide for the Selection of Cleaning Agents for Oxygen Systems. Solvents were evaluated for materials compatibility, oxygen compatibility, cleaning effectiveness, and suitability for use in cleanliness verification and field cleaning operations. Two solvents were determined to be acceptable for cleaning oxygen systems and one was chosen for implementation at NASA's rocket propulsion test facilities. The test program and results are summarized. This project also demonstrated the benefits of cross-agency collaboration in a time of limited resources.
Tool for Human-Systems Integration Assessment: HSI Scorecard
NASA Technical Reports Server (NTRS)
Whitmore, Nihriban; Sandor, Aniko; McGuire, Kerry M.; Berdich, Debbie
2009-01-01
This paper describes the development and rationale for a human-systems integration (HSI) scorecard that can be used in reviews of vehicle specification and design. This tool can be used to assess whether specific HSI-related criteria have been met as part of a project milestone or critical event, such as technical reviews, crew station reviews, mockup evaluations, or even reviews of major plans or processes. Examples of HSI-related criteria include Human Performance Capabilities, Health Management, Human System Interfaces, Anthropometry and Biomechanics, and Natural and Induced Environments. The tool is not intended to evaluate requirements compliance and verification, but to review how well the human-related systems have been considered for the specific event and to identify gaps and vulnerabilities from an HSI perspective. The scorecard offers a common basis and criteria for discussions among system managers, evaluators, and design engineers. Furthermore, the scorecard items highlight the main areas of system development that need to be followed during the system lifecycle. The ratings provide a repeatable quantitative measure for what has often been seen as only subjective commentary. Thus, the scorecard is anticipated to be a useful HSI tool for communicating review results to the institutional and project office management.
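The scorecard idea described above can be sketched in a few lines of code. The criterion names below come from the abstract; the 0-4 rating scale and the mean/gap aggregation are illustrative assumptions, not the scheme used in the actual NASA tool.

```python
# Minimal sketch of an HSI review scorecard: each criterion receives a
# rating, and the aggregate yields a repeatable quantitative measure.
CRITERIA = [
    "Human Performance Capabilities",
    "Health Management",
    "Human System Interfaces",
    "Anthropometry and Biomechanics",
    "Natural and Induced Environments",
]

def score_review(ratings):
    """`ratings` maps criterion -> rating on an assumed 0-4 scale.
    Returns the mean score and the criteria flagged as gaps (rated < 2)."""
    missing = [c for c in CRITERIA if c not in ratings]
    if missing:
        raise ValueError(f"unrated criteria: {missing}")
    mean = sum(ratings[c] for c in CRITERIA) / len(CRITERIA)
    gaps = [c for c in CRITERIA if ratings[c] < 2]
    return mean, gaps
```

The point of such a structure is the one the abstract makes: forcing every criterion to be rated turns subjective commentary into a repeatable measure and makes gaps explicit.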
Verification of S&D Solutions for Network Communications and Devices
NASA Astrophysics Data System (ADS)
Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen
This chapter describes the tool-supported verification of S&D Solutions on the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH Verification Tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that complementary approaches relevant to the security analysis of network and device S&D Patterns exist and can be used.
Online 3D EPID-based dose verification: Proof of concept.
Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel
2016-07-01
Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. 
The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s irradiation time. A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
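The per-volume comparison described in the abstract (mean dose in both volumes, near-maximum dose D2 in the non-target volume) can be sketched as follows. The function names, the 3% tolerances, and the use of the 98th percentile as a stand-in for D2 are illustrative assumptions, not the clinical values or algorithms used at the authors' institute.

```python
import numpy as np

def verify_dose(planned, reconstructed, target_mask, nontarget_mask,
                mean_tol=0.03, d2_tol=0.03):
    """Toy sketch of the per-volume dose comparison.

    `planned` and `reconstructed` are 3D dose grids; the masks select
    the target volume and the non-target volume receiving >= 10 cGy.
    Returns a dict of pass/fail flags for each check.
    """
    checks = {}
    for name, mask in (("target", target_mask), ("nontarget", nontarget_mask)):
        p, r = planned[mask], reconstructed[mask]
        # Compare the mean dose in each volume against a relative tolerance.
        checks[f"{name}_mean_ok"] = abs(r.mean() - p.mean()) <= mean_tol * p.mean()
    # Near-maximum dose D2: dose received by the hottest 2% of the volume
    # (approximated here as the 98th percentile).
    p_d2 = np.percentile(planned[nontarget_mask], 98)
    r_d2 = np.percentile(reconstructed[nontarget_mask], 98)
    checks["nontarget_d2_ok"] = abs(r_d2 - p_d2) <= d2_tol * p_d2
    return checks
```

In an online setting, a check of this kind would run once per reconstructed EPID frame, with any failed flag triggering the linac halt described in the abstract.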
Formal verification of automated teller machine systems using SPIN
NASA Astrophysics Data System (ADS)
Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam
2017-08-01
Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of the Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formula. This model checker accepts models written in Process Meta Language (PROMELA), and its specifications are specified in LTL formulas.
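SPIN itself consumes PROMELA models and LTL formulas, but the core idea, exhaustively exploring a state transition diagram and checking a property on every reachable state, can be illustrated in plain Python. The ATM states, transitions, and the safety property below are made-up stand-ins, not the model from the paper; the check shown corresponds to an LTL "globally" (safety) claim, verified here by breadth-first reachability.

```python
from collections import deque

# Toy ATM transition system: states are (screen, authenticated) pairs.
TRANSITIONS = {
    ("idle", False): [("card_inserted", False)],
    ("card_inserted", False): [("pin_ok", True), ("pin_bad", False)],
    ("pin_ok", True): [("dispense", True), ("idle", False)],
    ("pin_bad", False): [("idle", False)],
    ("dispense", True): [("idle", False)],
}

def invariant(state):
    screen, authenticated = state
    # Safety property: the machine never dispenses cash unauthenticated.
    return not (screen == "dispense" and not authenticated)

def check_invariant(initial):
    """Breadth-first exploration of the reachable state space,
    returning the first invariant violation found, or None."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            return state
        for nxt in TRANSITIONS.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None  # invariant holds on all reachable states
```

A model checker like SPIN does essentially this at scale, with partial-order reduction and compiled PROMELA models, and additionally handles liveness formulas that a simple reachability check cannot.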
Integrated testing and verification system for research flight software
NASA Technical Reports Server (NTRS)
Taylor, R. N.
1979-01-01
The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification and test options are provided with special attention on real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.
van Hoof, Joris J; Gosselt, Jordy F; de Jong, Menno D T
2010-02-01
To compare traditional in-store age verification with a newly developed remote age verification system, 100 cigarette purchase attempts were made by 15-year-old "mystery shoppers." The remote system led to a strong increase in compliance (96% vs. 12%), reflecting more identification requests and more sale refusals when adolescents showed their identification cards.
The U.S. EPA operates the Environmental Technology Verification program to facilitate the deployment of innovative technologies through performance verification and information dissemination. A technology area of interest is distributed electrical power generation, particularly w...
The Greenhouse Gas Technology Center (GHG Center), one of six verification organizations under the Environmental Technology Verification (ETV) program, evaluated the performance of the Parallon 75 kW Turbogenerator (Turbogenerator) with carbon monoxide (CO) emissions control syst...
40 CFR 1065.920 - PEMS Calibrations and verifications.
Code of Federal Regulations, 2010 CFR
2010-07-01
... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... verification. The verification consists of operating an engine over a duty cycle in the laboratory and... by laboratory equipment as follows: (1) Mount an engine on a dynamometer for laboratory testing...
48 CFR 552.204-9 - Personal Identity Verification requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Personal Identity....204-9 Personal Identity Verification requirements. As prescribed in 504.1303, insert the following clause: Personal Identity Verification Requirements (OCT 2012) (a) The contractor shall comply with GSA...
48 CFR 552.204-9 - Personal Identity Verification requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Personal Identity....204-9 Personal Identity Verification requirements. As prescribed in 504.1303, insert the following clause: Personal Identity Verification Requirements (OCT 2012) (a) The contractor shall comply with GSA...
48 CFR 552.204-9 - Personal Identity Verification requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Personal Identity....204-9 Personal Identity Verification requirements. As prescribed in 504.1303, insert the following clause: Personal Identity Verification Requirements (OCT 2012) (a) The contractor shall comply with GSA...
40 CFR 1066.240 - Torque transducer verification and calibration.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification and calibration. Calibrate torque-measurement systems as described in 40 CFR 1065.310. ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Torque transducer verification and...
40 CFR 1066.240 - Torque transducer verification and calibration.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification and calibration. Calibrate torque-measurement systems as described in 40 CFR 1065.310. ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Torque transducer verification and...
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
This technical report contains the Higher-Order Logic (HOL) listings of the partial verification of the requirements and design for a commercially developed processor interface unit (PIU). The PIU is an interface chip performing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault tolerant computer system. This system, the Fault Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. This report contains the actual HOL listings of the PIU verification as it currently exists. Section two of this report contains general-purpose HOL theories and definitions that support the PIU verification. These include arithmetic theories dealing with inequalities and associativity, and a collection of tactics used in the PIU proofs. Section three contains the HOL listings for the completed PIU design verification. Section four contains the HOL listings for the partial requirements verification of the P-Port.
A field study of the accuracy and reliability of a biometric iris recognition system.
Latman, Neal S; Herb, Emily
2013-06-01
The iris of the eye appears to satisfy the criteria for a good anatomical characteristic for use in a biometric system. The purpose of this study was to evaluate a biometric iris recognition system: Mobile-Eyes™. The enrollment, verification, and identification applications were evaluated in a field study for accuracy and reliability using both irises of 277 subjects. Independent variables included a wide range of subject demographics, ambient light, and ambient temperature. A sub-set of 35 subjects had alcohol-induced nystagmus. There were 2710 identification and verification attempts, which resulted in 1,501,340 and 5540 iris comparisons respectively. In this study, the system successfully enrolled all subjects on the first attempt. All 277 subjects were successfully verified and identified on the first day of enrollment. None of the current or prior eye conditions prevented enrollment, verification, or identification. All 35 subjects with alcohol-induced nystagmus were successfully verified and identified. There were no false verifications or false identifications. Two conditions were identified that potentially could circumvent the use of iris recognition systems in general. The Mobile-Eyes™ iris recognition system exhibited accurate and reliable enrollment, verification, and identification applications in this study. It may have special applications in subjects with nystagmus.
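The internals of the commercial system above are not public, but iris verification is classically done Daugman-style: encode each iris as a binary code plus a validity mask, then accept a match when the normalized Hamming distance between codes falls below a threshold. The code length, mask handling, and the 0.32 threshold below are generic illustrative choices, not Mobile-Eyes™ parameters.

```python
import numpy as np

def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Normalized Hamming distance between two binary iris codes,
    counting only bits marked valid in both masks (occlusion by
    eyelids/eyelashes is typically masked out)."""
    valid = mask_a & mask_b
    disagreements = (code_a ^ code_b) & valid
    return disagreements.sum() / valid.sum()

def verify(code_a, code_b, mask_a, mask_b, threshold=0.32):
    # Decision rule: genuine pairs cluster near distance 0, impostor
    # pairs near 0.5, so a threshold in between separates them.
    return hamming_distance(code_a, code_b, mask_a, mask_b) < threshold
```

Identification (one-to-many, as in the study's 1,501,340 comparisons) is then just this verification test repeated against every enrolled template.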
NASA Astrophysics Data System (ADS)
Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.
2004-11-01
The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determines the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, verifications are periodically implemented by a PC running a National Instruments LabVIEW virtual instrument (VI) to verify continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system will be the topic of this paper.
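The log-ratio technique mentioned in the abstract derives beam position from the signals of two opposing pickup electrodes: near the axis, the position is proportional to the logarithm of their ratio. The sketch below is a minimal illustration; the sensitivity constant and the linear calibration terms are placeholders, since real BPM modules calibrate gain and offset per electrode pair (which is exactly what the automated calibration described above compensates for).

```python
import math

def log_ratio_position(v_a, v_b, sensitivity_mm=1.0, offset_mm=0.0):
    """Beam position (mm) from two opposing electrode signals via the
    log-ratio technique: x = k * log10(A/B) + offset.  `sensitivity_mm`
    and `offset_mm` are assumed calibration constants."""
    if v_a <= 0 or v_b <= 0:
        raise ValueError("electrode signals must be positive")
    return sensitivity_mm * math.log10(v_a / v_b) + offset_mm
```

A centered beam gives equal signals and hence zero position; a beam displaced toward electrode A makes the ratio, and thus the reported position, positive.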
Cooperative Networked Control of Dynamical Peer-to-Peer Vehicle Systems
2007-12-28
Topics include dynamic deployment and task allocation; verification and hybrid systems; and information management for cooperative control. In the area of verification and hybrid systems (including decidability results on discrete, hybrid, and switched systems), the program has produced significant advances in the theory of hybrid input-output automata (HIOA).
The Environmental Technology Verification report discusses the technology and performance of the IR PowerWorks 70kW Microturbine System manufactured by Ingersoll-Rand Energy Systems. This system is a 70 kW electrical generator that puts out 480 v AC at 60 Hz and that is driven by...
An elementary tutorial on formal specification and verification using PVS
NASA Technical Reports Server (NTRS)
Butler, Ricky W.
1993-01-01
A tutorial on the development of a formal specification and its verification using the Prototype Verification System (PVS) is presented. The tutorial presents the formal specification and verification techniques by way of a specific example: an airline reservation system. The airline reservation system is modeled as a simple state machine with two basic operations. These operations are shown to preserve a state invariant using the theorem proving capabilities of PVS. The technique of validating a specification via 'putative theorem proving' is also discussed and illustrated in detail. This paper is intended for the novice and assumes only some of the basic concepts of logic. A complete description of user inputs and the PVS output is provided, and thus the tutorial can be used effectively while sitting at a computer terminal.
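The tutorial's pattern, a state machine whose operations provably preserve an invariant, can be mirrored in executable form. The reservation state, the two operations, and the invariant below are illustrative analogues, not the actual PVS specification; where PVS proves the invariant as a theorem over all states, plain code can only test it on particular runs.

```python
def invariant(state):
    """Assumed invariant: no passenger holds more than one seat.
    `state` maps seat labels to passenger names."""
    passengers = list(state.values())
    return len(passengers) == len(set(passengers))

def make_assignment(state, seat, passenger):
    """Operation 1: assign a seat.  The precondition (seat free,
    passenger not yet seated) is what makes the invariant preserved."""
    if seat in state or passenger in state.values():
        return state  # precondition fails: state unchanged
    new_state = dict(state)
    new_state[seat] = passenger
    return new_state

def cancel_assignment(state, seat):
    """Operation 2: cancel a seat assignment (trivially preserves
    the invariant, since it only removes entries)."""
    new_state = dict(state)
    new_state.pop(seat, None)
    return new_state
```

The value of the PVS approach is precisely that the preservation argument sketched in these comments becomes a machine-checked proof rather than a test.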
Active alignment/contact verification system
Greenbaum, William M.
2000-01-01
A system involving an active (i.e., electrical) technique for the verification of: (1) close-tolerance mechanical alignment between two components, and (2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board: two extremely small, high-density mating parts that require alignment within a fraction of a mil, as well as a specified point of engagement at the interface between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.
Software Development Technologies for Reactive, Real-Time, and Hybrid Systems: Summary of Research
NASA Technical Reports Server (NTRS)
Manna, Zohar
1998-01-01
This research is directed towards the implementation of a comprehensive deductive-algorithmic environment (toolkit) for the development and verification of high assurance reactive systems, especially concurrent, real-time, and hybrid systems. For this, we have designed and implemented the STCP (Stanford Temporal Prover) verification system. Reactive systems have an ongoing interaction with their environment, and their computations are infinite sequences of states. A large number of systems can be seen as reactive systems, including hardware, concurrent programs, network protocols, and embedded systems. Temporal logic provides a convenient language for expressing properties of reactive systems. A temporal verification methodology provides procedures for proving that a given system satisfies a given temporal property. The research covered necessary theoretical foundations as well as implementation and application issues.
Identity Verification Systems as a Critical Infrastructure
2012-03-01
Master's thesis. The thesis examines identity verification systems as a critical infrastructure, covering topics including cybercrime and the uses of fictitious or stolen identities.
Space telescope observatory management system preliminary test and verification plan
NASA Technical Reports Server (NTRS)
Fritz, J. S.; Kaldenbach, C. F.; Williams, W. B.
1982-01-01
The preliminary plan for the Space Telescope Observatory Management System Test and Verification (TAV) is provided. Methodology, test scenarios, test plans and procedure formats, schedules, and the TAV organization are included. Supporting information is provided.
Code of Federal Regulations, 2010 CFR
2010-01-01
... of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections... AGRICULTURAL COMMODITIES (QUALITY SYSTEMS VERIFICATION PROGRAMS) Quality Systems Verification Programs Definitions Service § 62.211 Appeals. Appeals of adverse decisions under this part, may be made in writing to...
7 CFR 62.206 - Access to program documents and activities.
Code of Federal Regulations, 2010 CFR
2010-01-01
... (CONTINUED) LIVESTOCK, MEAT, AND OTHER AGRICULTURAL COMMODITIES (QUALITY SYSTEMS VERIFICATION PROGRAMS) Quality Systems Verification Programs Definitions Service § 62.206 Access to program documents and... SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS...
7 CFR 62.212 - Official assessment reports.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards... AGRICULTURAL COMMODITIES (QUALITY SYSTEMS VERIFICATION PROGRAMS) Quality Systems Verification Programs Definitions Service § 62.212 Official assessment reports. Official QSVP assessment reports shall be generated...
NASA Technical Reports Server (NTRS)
Defeo, P.; Doane, D.; Saito, J.
1982-01-01
A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.
Exploring system interconnection architectures with VIPACES: from direct connections to NOCs
NASA Astrophysics Data System (ADS)
Sánchez-Peña, Armando; Carballo, Pedro P.; Núñez, Antonio
2007-05-01
This paper presents a simple environment for the verification of AMBA 3 AXI systems in Verification IP (VIP) production, called VIPACES (Verification Interface Primitives for the development of AXI Compliant Elements and Systems). These primitives are presented as an uncompiled library written in SystemC in which interfaces are the core of the library. Defining interfaces instead of generic modules lets the user construct custom modules, reducing the resources spent during the verification phase and making it easy to adapt modules to the AMBA 3 AXI protocol; this is the central theme of the VIPACES library. The paper focuses on comparing and contrasting the main interconnection schemes for AMBA 3 AXI as modeled by VIPACES. To assess these results, we propose a validation scenario with a particular architecture belonging to the domain of MPEG4 video decoding, composed of an AXI bus connecting an IDCT and other processing resources.
The Environmental Technology Verification report discusses the technology and performance of the Parallon 75kW Turbogenerator manufactured by Honeywell Power Systems, Inc., formerly AlliedSignal Power Systems, Inc. The unit uses a natural-gas-fired turbine to power an electric ge...
NASA Astrophysics Data System (ADS)
Chaljub, E. O.; Bard, P.; Tsuno, S.; Kristek, J.; Moczo, P.; Franek, P.; Hollender, F.; Manakou, M.; Raptakis, D.; Pitilakis, K.
2009-12-01
During the last decades, an important effort has been dedicated to develop accurate and computationally efficient numerical methods to predict earthquake ground motion in heterogeneous 3D media. The progress in methods and increasing capability of computers have made it technically feasible to calculate realistic seismograms for frequencies of interest in seismic design applications. In order to foster the use of numerical simulation in practical prediction, it is important to (1) evaluate the accuracy of current numerical methods when applied to realistic 3D applications where no reference solution exists (verification) and (2) quantify the agreement between recorded and numerically simulated earthquake ground motion (validation). Here we report the results of the Euroseistest verification and validation project - an ongoing international collaborative work organized jointly by the Aristotle University of Thessaloniki, Greece, the Cashima research project (supported by the French nuclear agency, CEA, and the Laue-Langevin institute, ILL, Grenoble), and the Joseph Fourier University, Grenoble, France. The project involves more than 10 international teams from Europe, Japan and USA. The teams employ the Finite Difference Method (FDM), the Finite Element Method (FEM), the Global Pseudospectral Method (GPSM), the Spectral Element Method (SEM) and the Discrete Element Method (DEM). The project makes use of a new detailed 3D model of the Mygdonian basin (about 5 km wide, 15 km long, sediments reach about 400 m depth, surface S-wave velocity is 200 m/s). The prime target is to simulate 8 local earthquakes with magnitude from 3 to 5. In the verification, numerical predictions for frequencies up to 4 Hz for a series of models with increasing structural and rheological complexity are analyzed and compared using quantitative time-frequency goodness-of-fit criteria. 
Predictions obtained by one FDM team and the SEM team are close and different from other predictions (consistent with the ESG2006 exercise which targeted the Grenoble Valley). Diffractions off the basin edges and induced surface-wave propagation mainly contribute to differences between predictions. The differences are particularly large in the elastic models but remain important also in models with attenuation. In the validation, predictions are compared with the recordings by a local array of 19 surface and borehole accelerometers. The level of agreement is found event-dependent. For the largest-magnitude event the agreement is surprisingly good even at high frequencies.
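The quantitative goodness-of-fit criteria mentioned in the abstract (Kristekova-style time-frequency misfits) compare envelope and phase across a time-frequency decomposition of the signals; the sketch below is a deliberately simplified stand-in that scores only the overall time-domain mismatch, mapped onto the conventional 0-10 scale (10 = perfect agreement). The scale mapping and the misfit definition are illustrative assumptions.

```python
import numpy as np

def goodness_of_fit(reference, simulated):
    """Simplified waveform goodness-of-fit: relative L2 misfit between
    a reference seismogram and a simulated one, mapped to a 0-10 score.
    Real verification/validation exercises use time-frequency envelope
    and phase misfits instead of this single global number."""
    ref = np.asarray(reference, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    misfit = np.linalg.norm(sim - ref) / np.linalg.norm(ref)
    return 10.0 * np.exp(-misfit)  # 10 for identical signals
```

In a verification exercise (code vs. code) the "reference" is another team's prediction; in validation it is the recorded accelerogram, which is why the abstract can report event-dependent levels of agreement with a single comparable score.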
Dynamic Modeling of the SMAP Rotating Flexible Antenna
NASA Technical Reports Server (NTRS)
Nayeri, Reza D.
2012-01-01
Dynamic model development in ADAMS for the SMAP project is explained. The main objectives of the dynamic models are pointing-error assessment and verification of the control/stability margin requirements.
36 CFR 218.8 - Filing an objection.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) Signature or other verification of authorship upon request (a scanned signature for electronic mail may be... related to the proposed project; if applicable, how the objector believes the environmental analysis or...
36 CFR 218.8 - Filing an objection.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) Signature or other verification of authorship upon request (a scanned signature for electronic mail may be... related to the proposed project; if applicable, how the objector believes the environmental analysis or...
Cleanroom Software Engineering Reference Model. Version 1.0.
1996-11-01
The Cleanroom Reference Model (CRM) serves as a baseline for continued evolution of Cleanroom practice. The scope of the CRM is software management and specification. In addition to project staff, participants include management, peer-organization representatives, and customer representatives as appropriate. Verification activities include reviewing the status of the process with management, the project team, peer groups, and the customer.
Test/QA Plan (TQAP) for Verification of Semi-Continuous Ambient Air Monitoring Systems
The purpose of the semi-continuous ambient air monitoring technology (or MARGA) test and quality assurance plan is to specify procedures for a verification test applicable to commercial semi-continuous ambient air monitoring technologies. The purpose of the verification test is ...
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...