Monitoring and verification R&D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.
2011-01-01
The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs, and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options for sensitive problems and for addressing other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.
A Framework for Evidence-Based Licensure of Adaptive Autonomous Systems
2016-03-01
insights gleaned to DoD. The autonomy community has identified significant challenges associated with test, evaluation, verification and validation of ... licensure as a test, evaluation, verification, and validation (TEVV) framework that can address these challenges. IDA found that traditional ... language requirements to testable (preferably machine-testable) specifications • Design of architectures that treat development and verification of
Challenges in High-Assurance Runtime Verification
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn E.
2016-01-01
Safety-critical systems are growing more complex and becoming increasingly autonomous. Runtime Verification (RV) has the potential to provide protections when a system cannot be assured by conventional means, but only if the RV itself can be trusted. In this paper, we proffer a number of challenges to realizing high-assurance RV and illustrate how we have addressed them in our research. We argue that high-assurance RV provides a rich target for automated verification tools in hope of fostering closer collaboration among the communities.
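As an illustration of the kind of runtime monitor discussed in this abstract, the following minimal sketch observes a stream of state samples and flags violations of a safety property as they occur. The property, field names and class here are hypothetical illustrations, not the authors' verified framework.

```python
# Minimal runtime-verification sketch: a monitor observes a stream of state
# samples and flags any violation of a simple safety property.
# The property and field names here are illustrative, not from the paper.

from dataclasses import dataclass

@dataclass
class State:
    engaged: bool
    altitude: float

class CeilingMonitor:
    """Safety property: while engaged, altitude must stay below a ceiling."""
    def __init__(self, ceiling: float):
        self.ceiling = ceiling
        self.violations = []

    def step(self, t: int, s: State) -> bool:
        ok = (not s.engaged) or (s.altitude < self.ceiling)
        if not ok:
            self.violations.append(t)
        return ok

if __name__ == "__main__":
    mon = CeilingMonitor(ceiling=1000.0)
    trace = [State(True, 950.0), State(True, 1005.0), State(False, 1200.0)]
    results = [mon.step(i, s) for i, s in enumerate(trace)]
    print(results)            # [True, False, True]
    print(mon.violations)     # [1]
```

High-assurance RV, as the paper argues, additionally requires that such a monitor itself be verified and that its integration cannot perturb the monitored system.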
Verification Challenges at Low Numbers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.
2013-07-16
This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking post New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at 100’s of warheads, and then 10’s of warheads before final elimination could be considered of the last few remaining warheads and weapons. This paper will focus on these three threshold reduction levels, 1000, 100’s, 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.
NASA Technical Reports Server (NTRS)
Coppolino, Robert N.
2018-01-01
Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.
Comments for A Conference on Verification in the 21st Century
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doyle, James E.
2012-06-12
The author offers 5 points for the discussion of Verification and Technology: (1) Experience with the implementation of arms limitation and arms reduction agreements confirms that technology alone has never been relied upon to provide effective verification. (2) The historical practice of verification of arms control treaties between Cold War rivals may constrain the cooperative and innovative use of technology for transparency, verification and confidence building in the future. (3) An area that has been identified by many, including the US State Department and NNSA, as being rich for exploration for potential uses of technology for transparency and verification is information and communications technology (ICT). This includes social media, crowd-sourcing, the internet of things, and the concept of societal verification, but there are issues. (4) On the issue of the extent to which verification technologies are keeping pace with the demands of future protocols and agreements, I think the more direct question is "are they effective in supporting the objectives of the treaty or agreement?" In this regard it is important to acknowledge that there is a verification grand challenge at our doorstep: "how does one verify limitations on nuclear warheads in national stockpiles?" (5) Finally, while recognizing the daunting political and security challenges of such an approach, multilateral engagement and cooperation at the conceptual and technical levels provides benefits for addressing future verification challenges.
Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth A.
2016-01-01
We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
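For readers unfamiliar with TMR itself, the following behavioural sketch shows the majority-voting idea whose correct insertion such a verification flow must confirm; it is illustrative only and is not the authors' search-detect-and-verify tool.

```python
# Behavioural sketch of triple modular redundancy: three replicas of a logic
# function feed a bitwise majority voter, masking a single faulty replica.

def majority(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 majority vote."""
    return (a & b) | (a & c) | (b & c)

def tmr_eval(f, x: int, fault_mask: int = 0, faulty_replica: int = -1) -> int:
    """Evaluate f(x) on three replicas; optionally corrupt one replica."""
    outs = [f(x), f(x), f(x)]
    if faulty_replica >= 0:
        outs[faulty_replica] ^= fault_mask  # inject a single-replica fault
    return majority(*outs)

if __name__ == "__main__":
    f = lambda x: (x + 3) & 0xFF
    # A fault in any single replica is masked by the voter.
    assert tmr_eval(f, 10) == f(10)
    assert tmr_eval(f, 10, fault_mask=0b100, faulty_replica=1) == f(10)
    print("single-replica faults masked")
```

The verification problem the paper addresses is confirming that synthesis and optimization have not collapsed the three replicas or the voters in the actual netlist.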
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marleau, Peter; Brubaker, Erik; Deland, Sharon M.
This report summarizes the discussion and conclusions reached during a table top exercise held at Sandia National Laboratories, Albuquerque on September 3, 2014 regarding a recently described approach for nuclear warhead verification based on the cryptographic concept of a zero-knowledge protocol (ZKP) presented in a recent paper authored by Glaser, Barak, and Goldston. A panel of Sandia National Laboratories researchers, whose expertise includes radiation instrumentation design and development, cryptography, and arms control verification implementation, jointly reviewed the paper and identified specific challenges to implementing the approach as well as some opportunities. It was noted that ZKP as used in cryptography is a useful model for the arms control verification problem, but the direct analogy to arms control breaks down quickly. The ZKP methodology for warhead verification fits within the general class of template-based verification techniques, where a reference measurement is used to confirm that a given object is like another object that has already been accepted as a warhead by some other means. This can be a powerful verification approach, but requires independent means to trust the authenticity of the reference warhead - a standard that may be difficult to achieve, which the ZKP authors do not directly address. Despite some technical challenges, the concept of last-minute selection of the pre-loads and equipment could be a valuable component of a verification regime.
Verification Challenges at Low Numbers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.
2013-06-01
Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions, and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking post New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, 100’s of warheads, and then 10’s of warheads before final elimination could be considered of the last few remaining warheads and weapons. This paper will focus on these three threshold reduction levels, 1000, 100’s, 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.
A Secure Framework for Location Verification in Pervasive Computing
NASA Astrophysics Data System (ADS)
Liu, Dawei; Lee, Moon-Chuen; Wu, Dan
The relatively new pervasive computing paradigm has changed the way people use computing devices. For example, a person can use a mobile device to obtain location information anytime and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, and may impersonate other users by eavesdropping on their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework VerPer for secure location verification in a pervasive computing environment. Real world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.
NASA Technical Reports Server (NTRS)
Venkatapathy, Ethiraj; Gage, Peter; Wright, Michael J.
2017-01-01
Mars Sample Return is our Grand Challenge for the coming decade. TPS (Thermal Protection System) nominal performance is not the key challenge. The main difficulty for designers is the need to verify unprecedented reliability for the entry system: current guidelines for prevention of backward contamination require that the probability of spores larger than 1 micron diameter escaping into the Earth environment be lower than one in a million for the entire system, and the allocation to TPS would be more stringent than that. For reference, the reliability allocation for Orion TPS is closer to 1/1000, and the demonstrated reliability for previous human Earth return systems was closer to 1/100. Improving reliability by more than three orders of magnitude is a grand challenge indeed. The TPS community must embrace the possibility of new architectures that are focused on reliability above thermal performance and mass efficiency. The MSR (Mars Sample Return) EEV (Earth Entry Vehicle) will be hit by MMOD (Micrometeoroid and Orbital Debris) prior to reentry. A chute-less aero-shell design which allows for a self-righting shape was baselined in prior MSR studies, with the assumption that a passive system will maximize EEV robustness. Hence the aero-shell, along with the TPS, has to take ground impact and not break apart. System verification will require testing to establish ablative performance and thermal failure, but also testing of damage from MMOD and of structural performance at ground impact. Mission requirements will demand analysis, testing and verification that are focused on establishing reliability of the design. In this proposed talk, we will focus on the grand challenge of MSR EEV TPS and the need for innovative approaches to address challenges in modeling, testing, manufacturing and verification.
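As a quick check on the figures quoted above (as reconstructed here; the exact allocations are mission-specific assumptions), the gap between an Orion-class TPS allocation and the MSR containment requirement is indeed roughly three orders of magnitude:

```latex
% Illustrative arithmetic; reconstructed figures, not official allocations.
\[
\frac{P_{\mathrm{fail}}^{\text{Orion-class allocation}}}{P_{\mathrm{fail}}^{\text{MSR containment requirement}}}
\approx \frac{10^{-3}}{10^{-6}} = 10^{3},
\quad \text{i.e. an improvement of about three orders of magnitude.}
\]
```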
2017-04-17
Keywords: cyberphysical systems, formal methods, requirements patterns, AADL, Assume Guarantee Reasoning Environment. ... Rockwell Collins has been addressing these challenges by developing compositional reasoning methods that permit the verification of systems that exceed
P-8A Poseidon strategy for modeling & simulation verification validation & accreditation (VV&A)
NASA Astrophysics Data System (ADS)
Kropp, Derek L.
2009-05-01
One of the first challenges in addressing the need for Modeling & Simulation (M&S) Verification, Validation, & Accreditation (VV&A) is to develop an approach for applying structured and formalized VV&A processes. The P-8A Poseidon Multi-Mission Maritime Aircraft (MMA) Program Modeling and Simulation Accreditation Strategy documents the P-8A program's approach to VV&A. The P-8A strategy tailors a risk-based approach and leverages existing bodies of knowledge, such as the Defense Modeling and Simulation Office Recommended Practice Guide (DMSO RPG), to make the process practical and efficient. As the program progresses, the M&S team must continue to look for ways to streamline the process, add supplemental steps to enhance the process, and identify and overcome procedural, organizational, and cultural challenges. This paper includes some of the basics of the overall strategy, examples of specific approaches that have worked well, and examples of challenges that the M&S team has faced.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsao, Jeffrey Y.; Trucano, Timothy G.; Kleban, Stephen D.
This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?
25 CFR 61.8 - Verification forms.
Code of Federal Regulations, 2010 CFR
2010-04-01
... using the last address of record. The verification form will be used to ascertain the previous enrollee... death. Name and/or address changes will only be made if the verification form is signed by an adult...
Building confidence and credibility amid growing model and computing complexity
NASA Astrophysics Data System (ADS)
Evans, K. J.; Mahajan, S.; Veneziani, C.; Kennedy, J. H.
2017-12-01
As global Earth system models are developed to answer an ever-wider range of science questions, software products that provide robust verification, validation, and evaluation must evolve in tandem. Measuring the degree to which these new models capture past behavior, predict the future, and quantify the certainty of their predictions is becoming ever more challenging, for reasons that are generally well known yet still difficult to address. Two specific and divergent needs for analysis of the Accelerated Climate Model for Energy (ACME) model - but with a similar software philosophy - are presented to show how a model developer-based focus can address analysis needs during expansive model changes to provide greater fidelity and execute on multi-petascale computing facilities. A-PRIME is a Python script-based quick-look overview of a fully-coupled global model configuration to determine quickly if it captures specific behavior before significant computer time and expense is invested. EVE is an ensemble-based software framework that focuses on verification of performance-based ACME model development, such as compiler or machine settings, to determine the equivalence of relevant climate statistics. The challenges and solutions for analysis of multi-petabyte output data are highlighted from the aspect of the scientist using the software, with the aim of fostering discussion and further input from the community about improving developer confidence and community credibility.
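A minimal sketch of the kind of statistical equivalence check EVE is described as performing (the abstract does not specify the tests; the data, significance level and use of a Kolmogorov-Smirnov test here are assumptions for illustration):

```python
# Sketch of an ensemble-based equivalence check: compare the distribution of a
# climate statistic (e.g., a global-mean temperature metric) between a baseline
# ensemble and an ensemble built with a new compiler/machine configuration.
# Illustrative only; the actual tests used by EVE may differ.

import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
baseline = rng.normal(loc=287.50, scale=0.05, size=30)   # K, hypothetical
new_build = rng.normal(loc=287.51, scale=0.05, size=30)  # K, hypothetical

stat, p_value = ks_2samp(baseline, new_build)
print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}")
if p_value > 0.05:
    print("No evidence the new build changes the climate statistic.")
else:
    print("Distributions differ; flag the build for closer inspection.")
```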
Validation of a SysML based design for wireless sensor networks
NASA Astrophysics Data System (ADS)
Berrachedi, Amel; Rahim, Messaoud; Ioualalen, Malika; Hammad, Ahmed
2017-07-01
When developing complex systems, the verification of the system design is one of the main challenges. Wireless Sensor Networks (WSNs) are examples of such systems. We address the problem of how WSNs must be designed to fulfil the system requirements. Using the SysML language, we propose a Model Based System Engineering (MBSE) specification and verification methodology for designing WSNs. This methodology uses SysML to describe the WSN requirements, structure and behaviour. It then translates the SysML elements to an analytic model, specifically a Deterministic Stochastic Petri Net. The proposed approach supports the design of WSNs and the study of their behaviour and energy performance.
Carbon sequestration and its role in the global carbon cycle
McPherson, Brian J.; Sundquist, Eric T.
2009-01-01
For carbon sequestration, the issues of monitoring, risk assessment, and verification of carbon content and storage efficacy are perhaps the most uncertain. Yet these issues are also the most critical challenges facing the broader context of carbon sequestration as a means for addressing climate change. In response to these challenges, Carbon Sequestration and Its Role in the Global Carbon Cycle presents current perspectives and research that combine five major areas:
• The global carbon cycle and verification and assessment of global carbon sources and sinks
• Potential capacity and temporal/spatial scales of terrestrial, oceanic, and geologic carbon storage
• Assessing risks and benefits associated with terrestrial, oceanic, and geologic carbon storage
• Predicting, monitoring, and verifying effectiveness of different forms of carbon storage
• Suggested new CO2 sequestration research and management paradigms for the future.
The volume is based on a Chapman Conference and will appeal to the rapidly growing group of scientists and engineers examining methods for deliberate carbon sequestration through storage in plants, soils, the oceans, and geological repositories.
Guidelines for qualifying cleaning and verification materials
NASA Technical Reports Server (NTRS)
Webb, D.
1995-01-01
This document is intended to provide guidance in identifying technical issues which must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down-selecting cleaning/verification media.
NASA's Approach to Software Assurance
NASA Technical Reports Server (NTRS)
Wetherholt, Martha
2015-01-01
NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. As an umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is better structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.
Specification, Validation and Verification of Mobile Application Behavior
2013-03-01
Specification, Validation and Verification of Mobile Application Behavior, by Christopher B. Bonine, Lieutenant, United States Navy (B.S., Southern Polytechnic State ...), Naval Postgraduate School, Monterey, CA 93943-5000, March 2013. Thesis Advisor: Man-Tak Shing; Thesis Co-...
Spatial Evaluation and Verification of Earthquake Simulators
NASA Astrophysics Data System (ADS)
Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.
2017-06-01
In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
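A minimal sketch of the power-law smoothing idea described above (grid, events and the parameters d0 and q are illustrative assumptions, not the authors' implementation): each simulated epicenter contributes to every cell of the test region at a rate that decays with epicentral distance.

```python
# Sketch of power-law (ETAS-like) smoothing of simulated epicenters onto a
# rate map. Parameters d0 and q are illustrative, not those used in the paper.

import numpy as np

def powerlaw_rate_map(epicenters, grid_x, grid_y, d0=5.0, q=1.5):
    """Spread each simulated event over the whole grid with a rate that
    decays as (d + d0)**(-q) with epicentral distance d (km)."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    rate = np.zeros_like(gx, dtype=float)
    for ex, ey in epicenters:
        d = np.hypot(gx - ex, gy - ey)
        rate += (d + d0) ** (-q)
    return rate / rate.sum()  # normalize to a spatial probability map

if __name__ == "__main__":
    grid = np.arange(0.0, 100.0, 10.0)        # km, toy test region
    simulated = [(20.0, 30.0), (22.0, 28.0), (70.0, 80.0)]
    m = powerlaw_rate_map(simulated, grid, grid)
    print(m.shape, m.sum())                   # (10, 10) and ~1.0
```

The resulting rate map can then be scored against observed epicenters, for example with the receiver operating characteristic analysis described in the abstract.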
NASA Technical Reports Server (NTRS)
Thomas, Danny; Hartway, Bobby; Hale, Joe
2006-01-01
Throughout its rich history, NASA has invested heavily in sophisticated simulation capabilities. These capabilities reside in NASA facilities across the country - and with partners around the world. NASA's Exploration Systems Mission Directorate (ESMD) has the opportunity to leverage these considerable investments to resolve technical questions relating to its missions. The distributed nature of the assets, both in terms of geography and organization, presents challenges to their combined and coordinated use, but precedents of geographically distributed real-time simulations exist. This paper will show how technological advances in simulation can be employed to address the issues associated with netting NASA simulation assets.
Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models
NASA Technical Reports Server (NTRS)
Coppolino, Robert N.
2018-01-01
Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
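For reference, the classical Guyan (static) reduction that MGR modifies can be written compactly for a symmetric stiffness matrix partitioned into master (m) and slave (s) degrees of freedom; the specific modifications introduced by MGR and HR are not reproduced here.

```latex
% Classical Guyan (static) condensation: slave DOFs are eliminated assuming
% negligible inertia forces on the slave partition (K symmetric).
\[
\begin{bmatrix} K_{mm} & K_{ms} \\ K_{sm} & K_{ss} \end{bmatrix}
\begin{Bmatrix} u_m \\ u_s \end{Bmatrix}
=
\begin{Bmatrix} f_m \\ 0 \end{Bmatrix}
\;\Rightarrow\;
u_s = -K_{ss}^{-1} K_{sm}\, u_m ,
\qquad
T = \begin{bmatrix} I \\ -K_{ss}^{-1} K_{sm} \end{bmatrix},
\]
\[
K_R = T^{\mathsf T} K\, T = K_{mm} - K_{ms} K_{ss}^{-1} K_{sm},
\qquad
M_R = T^{\mathsf T} M\, T .
\]
```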
Rhrissorrakrai, Kahn; Belcastro, Vincenzo; Bilal, Erhan; Norel, Raquel; Poussin, Carine; Mathis, Carole; Dulize, Rémi H J; Ivanov, Nikolai V; Alexopoulos, Leonidas; Rice, J Jeremy; Peitsch, Manuel C; Stolovitzky, Gustavo; Meyer, Pablo; Hoeng, Julia
2015-02-15
Inferring how humans respond to external cues such as drugs, chemicals, viruses or hormones is an essential question in biomedicine. Very often, however, this question cannot be addressed because it is not possible to perform experiments in humans. A reasonable alternative consists of generating responses in animal models and 'translating' those results to humans. The limitations of such translation, however, are far from clear, and systematic assessments of its actual potential are urgently needed. sbv IMPROVER (systems biology verification for Industrial Methodology for PROcess VErification in Research) was designed as a series of challenges to address translatability between humans and rodents. This collaborative crowd-sourcing initiative invited scientists from around the world to apply their own computational methodologies on a multilayer systems biology dataset composed of phosphoproteomics, transcriptomics and cytokine data derived from normal human and rat bronchial epithelial cells exposed in parallel to 52 different stimuli under identical conditions. Our aim was to understand the limits of species-to-species translatability at different levels of biological organization: signaling, transcriptional and release of secreted factors (such as cytokines). Participating teams submitted 49 different solutions across the sub-challenges, two-thirds of which were statistically significantly better than random. Additionally, similar computational methods were found to range widely in their performance within the same challenge, and no single method emerged as a clear winner across all sub-challenges. Finally, computational methods were able to effectively translate some specific stimuli and biological processes in the lung epithelial system, such as DNA synthesis, cytoskeleton and extracellular matrix, translation, immune/inflammation and growth factor/proliferation pathways, better than the expected response similarity between species. Contact: pmeyerr@us.ibm.com or Julia.Hoeng@pmi.com. Supplementary data are available at Bioinformatics online.
Options and Risk for Qualification of Electric Propulsion System
NASA Technical Reports Server (NTRS)
Bailey, Michelle; Daniel, Charles; Cook, Steve (Technical Monitor)
2002-01-01
Electric propulsion vehicle systems encompass a wide range of propulsion alternatives, including solar and nuclear, which present unique circumstances for qualification. This paper will address the alternatives for qualification of electric propulsion spacecraft systems. The approach taken will be to address the considerations for qualification at the various levels of systems definition. Additionally, for each level of qualification the system level risk implications will be developed. The paper will also explore the implications of analysis versus test for various levels of systems definition, while retaining the objectives of a verification program. The limitations of terrestrial testing will be explored along with the risk and implications of orbital demonstration testing. The paper will seek to develop a template for structuring of a verification program based on cost, risk and value return. A successful verification program should establish controls and define objectives of the verification compliance program. Finally, the paper will seek to address the political and programmatic factors which may impact options for system verification.
Property-driven functional verification technique for high-speed vision system-on-chip processor
NASA Astrophysics Data System (ADS)
Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian
2017-04-01
The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Functional verification of the design is not explicitly considered at an earlier stage, at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can reduce the verification effort by up to 20% for a complex vision chip design while also reducing the simulation and debugging overheads.
Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model
NASA Astrophysics Data System (ADS)
Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal
How can digital evidence be captured and preserved securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene is of vital importance. On one side, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of providing the integrity and authenticity needed to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is no previous work proposing a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
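The integrity/authenticity idea at the core of such a model can be sketched in a few lines. In this illustration (not the PKIDEV implementation), an HMAC stands in for a public-key signature purely for brevity, and the key and evidence blob are hypothetical; the actual model uses PKI digital signatures and secure time-stamping.

```python
# Minimal sketch: hash the evidence at capture time and bind the digest to a
# key and a timestamp, so later tampering with the evidence or the record is
# detectable. HMAC is a stand-in for a PKI signature here.

import hashlib
import hmac
import json
import time

def seal_evidence(data: bytes, key: bytes) -> dict:
    digest = hashlib.sha256(data).hexdigest()
    record = {"sha256": digest, "timestamp": int(time.time())}
    payload = json.dumps(record, sort_keys=True).encode()
    record["tag"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return record

def verify_evidence(data: bytes, record: dict, key: bytes) -> bool:
    payload = json.dumps(
        {"sha256": record["sha256"], "timestamp": record["timestamp"]},
        sort_keys=True).encode()
    tag_ok = hmac.compare_digest(
        record["tag"], hmac.new(key, payload, hashlib.sha256).hexdigest())
    data_ok = hashlib.sha256(data).hexdigest() == record["sha256"]
    return tag_ok and data_ok

if __name__ == "__main__":
    key = b"examiner-secret"           # hypothetical key material
    evidence = b"disk image bytes..."  # hypothetical evidence blob
    rec = seal_evidence(evidence, key)
    print(verify_evidence(evidence, rec, key))          # True
    print(verify_evidence(evidence + b"x", rec, key))   # False
```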
The use of robots for arms control treaty verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michalowski, S.J.
1991-01-01
Many aspects of the superpower relationship now present a new set of challenges and opportunities, including the vital area of arms control. This report addresses one such possibility: the use of robots for the verification of arms control treaties. The central idea of this report is far from commonly accepted. In fact, it was only encountered once in the bibliographic review phase of the project. Nonetheless, the incentive for using robots is simple and coincides with that of industrial applications: to replace or supplement human activity in the performance of tasks for which human participation is unnecessary, undesirable, impossible, too dangerous or too expensive. As in industry, robots should replace workers (in this case, arms control inspectors) only when questions of efficiency, reliability, safety, security and cost-effectiveness have been answered satisfactorily. In writing this report, it is not our purpose to strongly advocate the application of robots in verification. Rather, we wish to explore the significant aspects, pro and con, of applying experience from the field of flexible automation to the complex task of assuring arms control treaty compliance. We want to establish a framework for further discussion of this topic and to define criteria for evaluating future proposals. The author's expertise is in robots, not arms control. His practical experience has been in developing systems for use in the rehabilitation of severely disabled persons (such as quadriplegics), who can use robots for assistance during activities of everyday living, as well as in vocational applications. This creates a special interest in implementations that, in some way, include a human operator in the control scheme of the robot. As we hope to show in this report, such interactive systems offer the greatest promise of making a contribution to the challenging problems of treaty verification. 15 refs.
Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk
NASA Technical Reports Server (NTRS)
Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.
2014-01-01
The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies project on aviation safety risk was assessed. Software and compositional verification were described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, leaving some errors undetected until used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.
Effect of verification cores on tip capacity of drilled shafts.
DOT National Transportation Integrated Search
2009-02-01
This research addressed two key issues: 1) Will verification core holes fill during concrete backfilling? If so, what are the mechanical properties of the filling material? In dry conditions, verification core holes always completely fill with c...
An unattended verification station for UF6 cylinders: Field trial findings
NASA Astrophysics Data System (ADS)
Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.
2017-12-01
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.
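A minimal sketch of what an "NDA Fingerprint" comparison could look like (channel meanings, values, uncertainties and the 3-sigma threshold are illustrative assumptions, not the deployed HEVA/PNEM analysis):

```python
# Illustrative sketch of an "NDA fingerprint" comparison: a cylinder's new
# multi-channel NDA signature is compared against its stored reference, and a
# per-channel z-score flags changes beyond measurement uncertainty.

import numpy as np

def fingerprint_consistent(reference, sigma, new_measurement, z_max=3.0):
    """Return (consistent, z): consistent is True if every channel of the new
    measurement lies within z_max standard deviations of the reference."""
    z = np.abs((np.asarray(new_measurement) - np.asarray(reference))
               / np.asarray(sigma))
    return bool(np.all(z <= z_max)), z

if __name__ == "__main__":
    reference = [1520.0, 88.4, 0.043]   # e.g. neutron rate, gamma ratio, ...
    sigma = [15.0, 1.2, 0.002]          # 1-sigma measurement uncertainties
    revisit = [1508.0, 89.1, 0.044]     # later re-measurement of the cylinder
    ok, z = fingerprint_consistent(reference, sigma, revisit)
    print(ok, np.round(z, 2))           # True [0.8  0.58 0.5 ]
```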
Challenges in the Verification of Reinforcement Learning Algorithms
NASA Technical Reports Server (NTRS)
Van Wesel, Perry; Goodloe, Alwyn E.
2017-01-01
Machine learning (ML) is increasingly being applied to a wide array of domains from search engines to autonomous vehicles. These algorithms, however, are notoriously complex and hard to verify. This work looks at the assumptions underlying machine learning algorithms as well as some of the challenges in trying to verify ML algorithms. Furthermore, we focus on the specific challenges of verifying reinforcement learning algorithms. These are highlighted using a specific example. Ultimately, we do not offer a solution to the complex problem of ML verification, but point out possible approaches for verification and interesting research opportunities.
2017-01-23
Performing organization: RDECOM-TARDEC-ACT. Keywords: occupant work space, central 90% of the Soldier population, encumbrance, posture and position, verification and validation, computer aided design. ... factors engineers could benefit by working with vehicle designers to perform virtual assessments in CAD when there is not enough time and/or funding to
Biehl, Michael; Sadowski, Peter; Bhanot, Gyan; Bilal, Erhan; Dayarian, Adel; Meyer, Pablo; Norel, Raquel; Rhrissorrakrai, Kahn; Zeller, Michael D.; Hormoz, Sahand
2015-01-01
Motivation: Animal models are widely used in biomedical research for reasons ranging from practical to ethical. An important issue is whether rodent models are predictive of human biology. This has been addressed recently in the framework of a series of challenges designed by the systems biology verification for Industrial Methodology for Process Verification in Research (sbv IMPROVER) initiative. In particular, one of the sub-challenges was devoted to the prediction of protein phosphorylation responses in human bronchial epithelial cells, exposed to a number of different chemical stimuli, given the responses in rat bronchial epithelial cells. Participating teams were asked to make inter-species predictions on the basis of available training examples, comprising transcriptomics and phosphoproteomics data. Results: Here, the two best performing teams present their data-driven approaches and computational methods. In addition, post hoc analyses of the datasets and challenge results were performed by the participants and challenge organizers. The challenge outcome indicates that successful prediction of protein phosphorylation status in human based on rat phosphorylation levels is feasible. However, within the limitations of the computational tools used, the inclusion of gene expression data does not improve the prediction quality. The post hoc analysis of time-specific measurements sheds light on the signaling pathways in both species. Availability and implementation: A detailed description of the dataset, challenge design and outcome is available at www.sbvimprover.com. The code used by team IGB is provided under http://github.com/uci-igb/improver2013. Implementations of the algorithms applied by team AMG are available at http://bhanot.biomaps.rutgers.edu/wiki/AMG-sc2-code.zip. Contact: meikelbiehl@gmail.com PMID:24994890
Generic Protocol for the Verification of Ballast Water Treatment Technology
In anticipation of the need to address performance verification and subsequent approval of new and innovative ballast water treatment technologies for shipboard installation, the U.S. Coast Guard and the Environmental Protection Agency's Environmental Technology Verification Progr...
Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D
2014-03-01
Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (eg, sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy to apply and accurate method for addressing verification bias in studies of multiple screening methods. Copyright © 2014 Elsevier Inc. All rights reserved.
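The weighting idea behind such corrections can be illustrated with a simplified inverse-probability-weighted estimator (a sketch under assumed verification probabilities, not the paper's weighted generalized estimating equation model, which additionally handles multiple assays in a single model):

```python
# Sketch of verification-bias correction by inverse-probability weighting:
# subjects verified by the gold standard are weighted by the inverse of their
# verification probability (which depends on the screening result), recovering
# approximately unbiased sensitivity and specificity.

import numpy as np

def corrected_accuracy(test, disease, verified, p_verify):
    """test, verified: 0/1 arrays; disease: 0/1 with NaN if unverified;
    p_verify: verification probability for each subject."""
    w = verified / p_verify                 # IPW weights (0 if unverified)
    d = np.nan_to_num(disease)              # NaN -> 0; harmless since w = 0
    sens = np.sum(w * d * test) / np.sum(w * d)
    spec = np.sum(w * (1 - d) * (1 - test)) / np.sum(w * (1 - d))
    return sens, spec

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 20000
    disease = rng.binomial(1, 0.10, n)
    # Imperfect screening test: sensitivity 0.90, specificity 0.80.
    test = np.where(disease == 1, rng.binomial(1, 0.90, n),
                                  rng.binomial(1, 0.20, n))
    # Verify all screen-positives but only 10% of screen-negatives.
    p_verify = np.where(test == 1, 1.0, 0.10)
    verified = rng.binomial(1, p_verify)
    disease_obs = np.where(verified == 1, disease, np.nan).astype(float)
    print(corrected_accuracy(test, disease_obs, verified, p_verify))
    # approximately (0.90, 0.80); the naive verified-only estimate is biased
```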
Towards Open-World Person Re-Identification by One-Shot Group-Based Verification.
Zheng, Wei-Shi; Gong, Shaogang; Xiang, Tao
2016-03-01
Solving the problem of matching people across non-overlapping multi-camera views, known as person re-identification (re-id), has received increasing interest in computer vision. In a real-world application scenario, a watch-list (gallery set) of a handful of known target people is provided with very few (in many cases only a single) image(s) (shots) per target. Existing re-id methods are largely unsuitable to address this open-world re-id challenge because they are designed for (1) a closed-world scenario where the gallery and probe sets are assumed to contain exactly the same people, (2) person-wise identification whereby the model attempts to verify exhaustively against each individual in the gallery set, and (3) learning a matching model using multi-shots. In this paper, a novel transfer local relative distance comparison (t-LRDC) model is formulated to address the open-world person re-identification problem by one-shot group-based verification. The model is designed to mine and transfer useful information from a labelled open-world non-target dataset. Extensive experiments demonstrate that the proposed approach outperforms both non-transfer learning and existing transfer learning based re-id methods.
Kabir, Muhammad N.; Alginahi, Yasser M.
2014-01-01
This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues were largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover media to achieve its goal. While many such complex schemes with resource redundancies are sufficient in offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in anyway. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints. PMID:25254247
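A minimal sketch of the zero-watermarking idea (the feature choices here are illustrative, not the paper's algorithm): a watermark is derived from intrinsic features of the cover text and registered with a trusted party, so nothing is embedded in the text itself, and tampering is detected when the regenerated watermark no longer matches the registered one.

```python
# Sketch of zero-watermarking for text: derive a "watermark" from intrinsic
# text features (hashed word-length and letter-frequency profiles here),
# register it, and later regenerate it to check integrity.

import hashlib
from collections import Counter

def zero_watermark(text: str) -> str:
    words = text.split()
    features = [
        str(len(words)),
        ",".join(str(len(w)) for w in words[:50]),   # word-length profile
        ",".join(f"{c}{n}" for c, n in
                 sorted(Counter(text.lower()).items()) if c.isalpha()),
    ]
    return hashlib.sha256("|".join(features).encode()).hexdigest()

if __name__ == "__main__":
    original = "The quick brown fox jumps over the lazy dog."
    registered = zero_watermark(original)          # lodged with a trusted party
    print(zero_watermark(original) == registered)  # True  -> intact
    tampered = original.replace("lazy", "sleepy")
    print(zero_watermark(tampered) == registered)  # False -> tampering detected
```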
The space shuttle launch vehicle aerodynamic verification challenges
NASA Technical Reports Server (NTRS)
Wallace, R. O.; Austin, L. D.; Hondros, J. G.; Surber, T. E.; Gaines, L. M.; Hamilton, J. T.
1985-01-01
The Space Shuttle aerodynamics and performance communities were challenged to verify the Space Shuttle vehicle (SSV) aerodynamics and system performance by flight measurements. Historically, launch vehicle flight test programs which faced these same challenges were unmanned instrumented flights of simple aerodynamically shaped vehicles. However, the manned SSV flight test program made these challenges more complex because of the unique aerodynamic configuration powered by the first man-rated solid rocket boosters (SRB). The analyses of flight data did not verify the aerodynamics or performance preflight predictions of the first flight of the Space Transportation System (STS-1). However, these analyses have defined the SSV aerodynamics and verified system performance. The aerodynamics community also was challenged to understand the discrepancy between the wind tunnel and flight defined aerodynamics. The preflight analysis challenges, the aerodynamic extraction challenges, and the postflight analyses challenges which led to the SSV system performance verification and which will lead to the verification of the operational ascent aerodynamics data base are presented.
Towards composition of verified hardware devices
NASA Technical Reports Server (NTRS)
Schubert, E. Thomas; Levitt, K.; Cohen, G. C.
1991-01-01
Computers are being used where no affordable level of testing is adequate. Safety- and life-critical systems must find a replacement for exhaustive testing to guarantee their correctness. Hardware verification research, which establishes correctness through mathematical proof, has focused on device verification and has largely ignored the verification of system composition. To address these deficiencies, we examine how the current hardware verification methodology can be extended to verify complete systems.
78 FR 22522 - Privacy Act of 1974: New System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-16
... Privacy Act of 1974 (5 U.S.C. 552a), as amended, titled "Biometric Verification System (CSOSA-20)." This... Biometric Verification System allows individuals under supervision to electronically check in for office... determination. ADDRESSES: You may submit written comments, identified by "Biometric Verification System, CSOSA...
Engineering of the LISA Pathfinder mission—making the experiment a practical reality
NASA Astrophysics Data System (ADS)
Warren, Carl; Dunbar, Neil; Backler, Mike
2009-05-01
LISA Pathfinder represents a unique challenge in the development of scientific spacecraft—not only is the LISA Test Package (LTP) payload a complex integrated development, placing stringent requirements on its developers and the spacecraft, but the payload also acts as the core sensor and actuator for the spacecraft, making the tasks of control design, software development and system verification unusually difficult. The micro-propulsion system which provides the remaining actuation also presents substantial development and verification challenges. As the mission approaches the system critical design review, flight hardware is completing verification and the process of verification using software and hardware simulators and test benches is underway. Preparation for operations has started, but critical milestones for LTP and field effect electric propulsion (FEEP) lie ahead. This paper summarizes the status of the present development and outlines the key challenges that must be overcome on the way to launch.
CASL Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mousseau, Vincent Andrew; Dinh, Nam
2016-06-30
This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.
An unattended verification station for UF 6 cylinders: Field trial findings
Smith, L. E.; Miller, K. A.; McDonald, B. S.; ...
2017-08-26
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.
A comparative verification of high resolution precipitation forecasts using model output statistics
NASA Astrophysics Data System (ADS)
van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees
2017-04-01
Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution meso-scale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both double penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, like a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the 3 post-processed models, but larger differences for individual lead times. Besides, the Fractions Skill Score is computed using the 3 deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they only perform similarly or somewhat better than precipitation forecasts from the 2 lower resolution models, at least in the Netherlands.
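As a concrete illustration of the kind of probabilistic verification measure discussed in this abstract, the following Python sketch computes a Brier score and a climatology-referenced Brier skill score; the forecast probabilities and outcomes are invented, and the snippet is not taken from the study itself.

```python
# Hypothetical sketch of the Brier (skill) score used to compare probability
# forecasts from post-processed NWP models; all sample values are invented.
import numpy as np

def brier_score(p_forecast, obs):
    """Mean squared error of probability forecasts against binary outcomes."""
    p_forecast = np.asarray(p_forecast, dtype=float)
    obs = np.asarray(obs, dtype=float)
    return np.mean((p_forecast - obs) ** 2)

def brier_skill_score(p_forecast, obs):
    """Skill relative to a climatological (sample-mean) reference forecast."""
    climatology = np.full(len(obs), np.mean(obs))
    return 1.0 - brier_score(p_forecast, obs) / brier_score(climatology, obs)

# Example with made-up exceedance probabilities and rain/no-rain outcomes.
probs = [0.1, 0.7, 0.4, 0.9, 0.2]
outcomes = [0, 1, 0, 1, 0]
print(brier_score(probs, outcomes), brier_skill_score(probs, outcomes))
```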
Formal verification of a set of memory management units
NASA Technical Reports Server (NTRS)
Schubert, E. Thomas; Levitt, K.; Cohen, Gerald C.
1992-01-01
This document describes the verification of a set of memory management units (MMU). The verification effort demonstrates the use of hierarchical decomposition and abstract theories. The MMUs can be organized into a complexity hierarchy. Each new level in the hierarchy adds a few significant features or modifications to the lower level MMU. The units described include: (1) a page check translation look-aside module (TLM); (2) a page check TLM with supervisor line; (3) a base bounds MMU; (4) a virtual address translation MMU; and (5) a virtual address translation MMU with memory resident segment table.
Formal verification of an MMU and MMU cache
NASA Technical Reports Server (NTRS)
Schubert, E. T.
1991-01-01
We describe the formal verification of a hardware subsystem consisting of a memory management unit and a cache. These devices are verified independently and then shown to interact correctly when composed. The MMU authorizes memory requests and translates virtual addresses to real addresses. The cache improves performance by maintaining a LRU (least recently used) list from the memory resident segment table.
Testing an online, dynamic consent portal for large population biobank research.
Thiel, Daniel B; Platt, Jodyn; Platt, Tevah; King, Susan B; Fisher, Nicole; Shelton, Robert; Kardia, Sharon L R
2015-01-01
Michigan's BioTrust for Health, a public health research biobank comprising residual dried bloodspot (DBS) cards from newborn screening, contains over 4 million samples collected without written consent. Participant-centric initiatives are IT tools that hold great promise to address the consent challenges in biobank research. Working with Private Access Inc., a pioneer in patient-centric web solutions, we created and pilot tested a dynamic informed consent simulation, paired with an educational website, focusing on consent for research utilizing DBSs in Michigan's BioTrust for Health. Out of 187 pilot testers recruited in 2 groups, 137 completed the consent simulation and exit survey. Over 50% indicated their willingness to set up an account if the simulation went live and to recommend it to others. Participants raised concerns about the process of identity verification and appeared to have little experience with sharing health information online. Applying online, dynamic approaches to address the consent challenges raised by biobanks with legacy sample collections should be explored, given the positive reaction to our pilot test and the strong preference for active consent. Balancing security and privacy with accessibility and ease of use will continue to be a challenge. © 2014 S. Karger AG, Basel.
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural language formalization, there is no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed, based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
2016-02-02
Report fragment: "… understanding is the experimental verification of a new model of light-induced loss spectra, employing continuum-dressed basis states, which agrees in shape and magnitude with all of our …"
Identity Verification, Control, and Aggression in Marriage
ERIC Educational Resources Information Center
Stets, Jan E.; Burke, Peter J.
2005-01-01
In this research we study the identity verification process and its effects in marriage. Drawing on identity control theory, we hypothesize that a lack of verification in the spouse identity (1) threatens stable self-meanings and interaction patterns between spouses, and (2) challenges a (nonverified) spouse's perception of control over the…
Review and verification of CARE 3 mathematical model and code
NASA Technical Reports Server (NTRS)
Rose, D. M.; Altschul, R. E.; Manke, J. W.; Nelson, D. L.
1983-01-01
The CARE-III mathematical model and code verification performed by Boeing Computer Services were documented. The mathematical model was verified for permanent and intermittent faults. The transient fault model was not addressed. The code verification was performed on CARE-III, Version 3. A CARE III Version 4, which corrects deficiencies identified in Version 3, is being developed.
The 2014 Sandia Verification and Validation Challenge: Problem statement
Hu, Kenneth; Orient, George
2016-01-18
This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and this resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.
Molecular Verification of Cryptops hortensis (Scolopendromorpha: Cryptopidae) in theNearctic Region
2018-01-29
Report documentation fragment: journal article covering March–April 2016; performing organization: USAF School of Aerospace Medicine, Public Health and Preventive Medicine Dept/PHR, 2510 Fifth St., Bldg. 840, Wright-Patterson AFB, OH 45433-7913.
Verification and Validation Challenges for Adaptive Flight Control of Complex Autonomous Systems
NASA Technical Reports Server (NTRS)
Nguyen, Nhan T.
2018-01-01
Autonomy of aerospace systems requires the ability for flight control systems to adapt to complex, uncertain dynamic environments. Despite five decades of research in adaptive control, no adaptive control system has yet been deployed on any safety-critical or human-rated production system such as a passenger transport aircraft. The problem lies in the difficulty with the certification of adaptive control systems since existing certification methods cannot readily be used for nonlinear adaptive control systems. Research to address the notion of metrics for adaptive control began to appear in recent years. These metrics, if accepted, could pave a path towards certification that would potentially lead to the adoption of adaptive control as a future control technology for safety-critical and human-rated production systems. Development of certifiable adaptive control systems represents a major challenge to overcome. Adaptive control systems with learning algorithms will never become part of the future unless it can be proven that they are highly safe and reliable. Rigorous methods for adaptive control software verification and validation must therefore be developed to ensure that adaptive control system software failures will not occur, to verify that the adaptive control system functions as required, to eliminate unintended functionality, and to demonstrate that certification requirements imposed by regulatory bodies such as the Federal Aviation Administration (FAA) can be satisfied. This presentation will discuss some of the technical issues with adaptive flight control and related V&V challenges.
A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem
Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour
2018-01-01
The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem. PMID:29597286
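To illustrate how cross-sensor degradation of this kind might be summarized, the sketch below estimates an equal error rate (EER) from genuine and impostor similarity scores; the scores and the threshold sweep are illustrative assumptions, not the paper's evaluation protocol.

```python
# Illustrative sketch (not from the paper) of summarizing matcher performance
# with an equal error rate, for same-sensor vs. cross-sensor conditions.
import numpy as np

def equal_error_rate(genuine_scores, impostor_scores):
    """Approximate the EER by sweeping a decision threshold over all scores."""
    genuine = np.asarray(genuine_scores, dtype=float)
    impostor = np.asarray(impostor_scores, dtype=float)
    best_gap, eer = 1.0, None
    for t in np.unique(np.concatenate([genuine, impostor])):
        frr = np.mean(genuine < t)       # genuine pairs rejected
        far = np.mean(impostor >= t)     # impostor pairs accepted
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2.0
    return eer

# Made-up similarity scores for same-sensor and cross-sensor trials.
same_sensor_eer = equal_error_rate([0.8, 0.9, 0.85, 0.7], [0.2, 0.3, 0.4, 0.1])
cross_sensor_eer = equal_error_rate([0.6, 0.5, 0.7, 0.4], [0.3, 0.35, 0.45, 0.2])
print(same_sensor_eer, cross_sensor_eer)  # cross-sensor EER is typically higher
```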
Verification Challenges of Dynamic Testing of Space Flight Hardware
NASA Technical Reports Server (NTRS)
Winnitoy, Susan
2010-01-01
The Six Degree-of-Freedom Dynamic Test System (SDTS) is a test facility at the National Aeronautics and Space Administration (NASA) Johnson Space Center in Houston, Texas, for performing dynamic verification of space structures and hardware. Some examples of past and current tests include the verification of on-orbit robotic inspection systems, space vehicle assembly procedures and docking/berthing systems. The facility is able to integrate a dynamic simulation of on-orbit spacecraft mating or demating using flight-like mechanical interface hardware. A force moment sensor is utilized for input to the simulation during the contact phase, thus simulating the contact dynamics. While the verification of flight hardware presents many unique challenges, one particular area of interest is the use of external measurement systems to ensure accurate feedback of dynamic contact. There are many commercial off-the-shelf (COTS) measurement systems available on the market, and the test facility measurement systems have evolved over time to include two separate COTS systems. The first system incorporates infra-red sensing cameras, while the second system employs a laser interferometer to determine position and orientation data. The specific technical challenges with the measurement systems in a large dynamic environment include changing thermal and humidity levels, operational area and measurement volume, dynamic tracking, and data synchronization. The facility is located in an expansive high-bay area that is occasionally exposed to outside temperature when large retractable doors at each end of the building are opened. The laser interferometer system, in particular, is vulnerable to the environmental changes in the building. The operational area of the test facility itself is sizeable, ranging from seven meters wide and five meters deep to as much as seven meters high. Both facility measurement systems have desirable measurement volumes and the accuracies vary within the respective volumes. In addition, because this is a dynamic facility with a moving test bed, direct line-of-sight may not be available at all times between the measurement sensors and the tracking targets. Finally, the feedback data from the active test bed along with the two external measurement systems must be synchronized to allow for data correlation. To ensure the desired accuracy and resolution of these systems, calibration of the systems must be performed regularly. New innovations in sensor technology itself are periodically incorporated into the facility's overall measurement scheme. In addressing the challenges of the measurement systems, the facility is able to provide essential position and orientation data to verify the dynamic performance of space flight hardware.
Measles and rubella elimination in the WHO Region for Europe: progress and challenges.
O'Connor, P; Jankovic, D; Muscat, M; Ben-Mamou, M; Reef, S; Papania, M; Singh, S; Kaloumenos, T; Butler, R; Datta, S
2017-08-01
Globally measles remains one of the leading causes of death among young children even though a safe and cost-effective vaccine is available. The World Health Organization (WHO) European Region has seen a decline in measles and rubella cases in recent years. The recent outbreaks have primarily affected adolescents and young adults with no vaccination or an incomplete vaccination history. Eliminating measles and rubella is one of the top immunization priorities of the European Region as outlined in the European Vaccine Action Plan 2015-2020. Following the 2010 decision by the Member States in the Region to initiate the process of verifying elimination, the European Regional Verification Commission for Measles and Rubella Elimination (RVC) was established in 2011. The RVC meets every year to evaluate the status of measles and rubella elimination in the Region based on documentation submitted by each country's National Verification Committees. The verification process was however modified in late 2014 to assess the elimination status at the individual country level instead of at regional level. The WHO European Region has made substantial progress towards measles and rubella elimination over the past 5 years. The RVC's conclusion in 2016 that 70% and 66% of the 53 Member States in the Region had interrupted the endemic transmission of measles and rubella, respectively, by 2015 is a testament to this progress. Nevertheless, where measles and rubella remain endemic, challenges in vaccination service delivery and disease surveillance will need to be addressed through focused technical assistance from WHO and development partners. Copyright © 2017 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
Verification of S&D Solutions for Network Communications and Devices
NASA Astrophysics Data System (ADS)
Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen
This chapter describes the tool-supported verification of S&D Solutions on the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH Verification Tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of network and device S&D Patterns, relevant complementary approaches exist and can be used.
The politics of verification and the control of nuclear tests, 1945-1980
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallagher, N.W.
1990-01-01
This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. Clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.
Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier
2017-03-14
Results-based financing (RBF) has been introduced in many countries across Africa, and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was originally designed. We explore the consequences of this on the operation of the scheme and on its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the actual scheme's implementation. The implications are relevant at the methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at the operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to iteratively modify it during implementation. They also question whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.
Gaia challenging performances verification: combination of spacecraft models and test results
NASA Astrophysics Data System (ADS)
Ecale, Eric; Faye, Frédéric; Chassat, François
2016-08-01
To achieve the ambitious scientific objectives of the Gaia mission, extremely stringent performance requirements have been given to the spacecraft contractor (Airbus Defence and Space). For a set of those key performance requirements (e.g. end-of-mission parallax, maximum detectable magnitude, maximum sky density or attitude control system stability), this paper describes how they are engineered during the whole spacecraft development process, with a focus on the end-to-end performance verification. As far as possible, performance is verified by end-to-end tests on ground (i.e. before launch). However, the challenging Gaia requirements are not verifiable by such a strategy, principally because no test facility exists to reproduce the expected flight conditions. The Gaia performance verification strategy is therefore based on a mix between analyses (based on spacecraft models) and tests (used to directly feed the models or to correlate them). Emphasis is placed on how to maximize the test contribution to performance verification while keeping the test feasible within an affordable effort. In particular, the paper highlights the contribution of the Gaia Payload Module Thermal Vacuum test to the performance verification before launch. Finally, an overview of the in-flight payload calibration and in-flight performance verification is provided.
Cleared for Launch - Lessons Learned from the OSIRIS-REx System Requirements Verification Program
NASA Technical Reports Server (NTRS)
Stevens, Craig; Adams, Angela; Williams, Bradley; Goodloe, Colby
2017-01-01
Requirements verification of a large flight system is a challenge. It is especially challenging for engineers taking on their first role in space systems engineering. This paper describes our approach to verification of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) system requirements. It also captures lessons learned along the way from developing systems engineers embroiled in this process. We begin with an overview of the mission and science objectives as well as the project requirements verification program strategy. A description of the requirements flow down is presented including our implementation for managing the thousands of program and element level requirements and associated verification data. We discuss both successes and methods to improve the managing of this data across multiple organizational interfaces. Our approach to verifying system requirements at multiple levels of assembly is presented using examples from our work at instrument, spacecraft, and ground segment levels. We include a discussion of system end-to-end testing limitations and their impacts to the verification program. Finally, we describe lessons learned that are applicable to all emerging space systems engineers using our unique perspectives across multiple organizations of a large NASA program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katsumi Marukawa; Kazuki Nakashima; Masashi Koga
1994-12-31
This paper presents a paper form processing system with an error-correcting function for reading handwritten kanji strings. In the paper form processing system, names and addresses are important key data, and this paper specifically takes up an error-correcting method for name and address recognition. The method automatically corrects errors of the kanji OCR (Optical Character Reader) with the help of word dictionaries and other knowledge. Moreover, it allows names and addresses to be written in any style. The method consists of word matching and "furigana" verification for name strings, and address approval for address strings. For word matching, kanji name candidates are extracted by automaton-type word matching. In "furigana" verification, kana candidate characters recognized by the kana OCR are compared with kana readings searched from the name dictionary based on the kanji name candidates given by the word matching. The correct name is selected from the results of word matching and furigana verification. The address approval step efficiently searches for the right address using a bottom-up procedure that follows hierarchical relations from a lower place name to an upper one, using the positional condition among the place names. We ascertained that the error-correcting method substantially improves the recognition rate and processing speed in experiments on 5,032 forms.
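A toy sketch of the general idea of combining kanji word matching with furigana verification follows; the dictionary, names, and readings are invented, and the code is not the system described in the paper.

```python
# Toy sketch (all names and readings are invented) of picking a kanji name
# candidate that is consistent with both the kanji OCR and the kana OCR.
NAME_DICT = {"山田": "やまだ", "山口": "やまぐち", "田中": "たなか"}

def correct_name(kanji_candidates, kana_candidates):
    """Select the dictionary name best supported by both OCR outputs, if any."""
    scored = []
    for kanji in kanji_candidates:
        reading = NAME_DICT.get(kanji)
        if reading is None:
            continue
        # Furigana verification: overlap between the dictionary reading and
        # the kana characters proposed by the kana OCR.
        overlap = sum(1 for ch in reading if ch in kana_candidates)
        scored.append((overlap, kanji))
    return max(scored)[1] if scored else None

print(correct_name(["山田", "山口"], set("やまだ")))  # -> "山田"
```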
Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools
NASA Technical Reports Server (NTRS)
Bis, Rachael; Maul, William A.
2015-01-01
Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
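The following minimal sketch illustrates one way such an automated check could confirm that an expected failure-effect propagation path exists in a directed-graph model; the model, node names, and check are hypothetical, not the NASA tools described above.

```python
# Minimal sketch, assuming a hypothetical FFM encoded as an adjacency list,
# of an automated check that an expected failure-effect propagation path exists.
from collections import deque

def path_exists(graph, source, target):
    """Breadth-first search over the directed failure-propagation graph."""
    seen, queue = {source}, deque([source])
    while queue:
        node = queue.popleft()
        if node == target:
            return True
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# Invented example model: a pump failure should propagate to a low-flow alarm.
ffm = {"pump_fail": ["low_pressure"], "low_pressure": ["low_flow_alarm"]}
assert path_exists(ffm, "pump_fail", "low_flow_alarm")
```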
Test/QA Plan for Verification of Ozone Indicator Cards
This verification test will address ozone indicator cards (OICs) that provide short-term semi-quantitative measures of ozone concentration in ambient air. Testing will be conducted under the auspices of the U.S. Environmental Protection Agency (EPA) through the Environmental Tec...
Stern, Robin L; Heaton, Robert; Fraser, Martin W; Goddu, S Murty; Kirby, Thomas H; Lam, Kwok Leung; Molineu, Andrea; Zhu, Timothy C
2011-01-01
The requirement of an independent verification of the monitor units (MU) or time calculated to deliver the prescribed dose to a patient has been a mainstay of radiation oncology quality assurance. The need for and value of such a verification was obvious when calculations were performed by hand using look-up tables, and the verification was achieved by a second person independently repeating the calculation. However, in a modern clinic using CT/MR/PET simulation, computerized 3D treatment planning, heterogeneity corrections, and complex calculation algorithms such as convolution/superposition and Monte Carlo, the purpose of and methodology for the MU verification have come into question. In addition, since the verification is often performed using a simpler geometrical model and calculation algorithm than the primary calculation, exact or almost exact agreement between the two can no longer be expected. Guidelines are needed to help the physicist set clinically reasonable action levels for agreement. This report addresses the following charges of the task group: (1) To re-evaluate the purpose and methods of the "independent second check" for monitor unit calculations for non-IMRT radiation treatment in light of the complexities of modern-day treatment planning. (2) To present recommendations on how to perform verification of monitor unit calculations in a modern clinic. (3) To provide recommendations on establishing action levels for agreement between primary calculations and verification, and to provide guidance in addressing discrepancies outside the action levels. These recommendations are to be used as guidelines only and shall not be interpreted as requirements.
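As a simple illustration of the second-check comparison and action-level concept, the sketch below flags a calculation whose independent verification differs by more than a chosen tolerance; the 5% action level and the monitor-unit values are placeholders, not task-group recommendations.

```python
# Hedged sketch of the basic independent-check comparison; the action level
# and MU values are placeholders, not recommendations from the report.
def check_mu(primary_mu, verification_mu, action_level_percent=5.0):
    """Flag the calculation if the second check disagrees beyond the action level."""
    diff_percent = 100.0 * abs(primary_mu - verification_mu) / primary_mu
    return diff_percent, diff_percent <= action_level_percent

diff, within_tolerance = check_mu(primary_mu=212.0, verification_mu=205.0)
print(f"{diff:.1f}% difference, within action level: {within_tolerance}")
```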
Doing Our Homework: A Pre-Employment Screening Checklist for Better Head Start Hiring.
ERIC Educational Resources Information Center
McSherry, Jim; Caruso, Karen; Cagnetta, Mary
1999-01-01
Provides a checklist for screening potential Head Start employees. Components include checks for criminal record, credit history, motor vehicle records, social security number tracing, previous address research, education verification, work and personal reference interview, and professional license verification. (KB)
Ada(R) Test and Verification System (ATVS)
NASA Technical Reports Server (NTRS)
Strelich, Tom
1986-01-01
The Ada Test and Verification System (ATVS) functional description and high level design are completed and summarized. The ATVS will provide a comprehensive set of test and verification capabilities specifically addressing the features of the Ada language, support for embedded system development, distributed environments, and advanced user interface capabilities. Its design emphasis was on effective software development environment integration and flexibility to ensure its long-term use in the Ada software development community.
Improving the Effectiveness of Speaker Verification Domain Adaptation With Inadequate In-Domain Data
2017-08-20
This paper addresses speaker verification domain adaptation with in-domain data that contain speakers with low channel diversity. Existing domain adaptation methods are reviewed, and their shortcomings are discussed. We derive an …
Poussin, Carine; Mathis, Carole; Alexopoulos, Leonidas G; Messinis, Dimitris E; Dulize, Rémi H J; Belcastro, Vincenzo; Melas, Ioannis N; Sakellaropoulos, Theodore; Rhrissorrakrai, Kahn; Bilal, Erhan; Meyer, Pablo; Talikka, Marja; Boué, Stéphanie; Norel, Raquel; Rice, John J; Stolovitzky, Gustavo; Ivanov, Nikolai V; Peitsch, Manuel C; Hoeng, Julia
2014-01-01
How organisms respond biologically to external cues such as drugs, chemicals, viruses and hormones is an essential question in biomedicine and in the field of toxicology, and it cannot be easily studied in humans. Thus, biomedical research has continuously relied on animal models for studying the impact of these compounds and attempted to 'translate' the results to humans. In this context, the SBV IMPROVER (Systems Biology Verification for Industrial Methodology for PROcess VErification in Research) collaborative initiative, which uses crowd-sourcing techniques to address fundamental questions in systems biology, invited scientists to deploy their own computational methodologies to make predictions on species translatability. A multi-layer systems biology dataset was generated that comprised phosphoproteomics, transcriptomics and cytokine data derived from normal human (NHBE) and rat (NRBE) bronchial epithelial cells exposed in parallel to more than 50 different stimuli under identical conditions. The present manuscript describes in detail the experimental settings, generation, processing and quality control analysis of the multi-layer omics dataset accessible in public repositories for further intra- and inter-species translation studies.
Statement Verification: A Stochastic Model of Judgment and Response.
ERIC Educational Resources Information Center
Wallsten, Thomas S.; Gonzalez-Vallejo, Claudia
1994-01-01
A stochastic judgment model (SJM) is presented as a framework for addressing issues in statement verification and probability judgment. Results of 5 experiments with 264 undergraduates support the validity of the model and provide new information that is interpreted in terms of the SJM. (SLD)
2016-10-01
Report fragments: the document discusses numerous scores and statistics considered during a preliminary evaluation of the applicability of the fuzzy-verification minimum coverage approach, the selection of thresholds with which to generate categorical-verification scores and statistics from both traditional and fuzzy methods, and the challenge that limited numbers of statistically significant cases pose for assessment of the forecast models' ability …
8 CFR 274a.2 - Verification of identity and employment authorization.
Code of Federal Regulations, 2012 CFR
2012-01-01
... of birth, sex, height, color of eyes, and address; (ii) School identification card with a photograph... REGULATIONS CONTROL OF EMPLOYMENT OF ALIENS Employer Requirements § 274a.2 Verification of identity and... contain a photograph, identifying information shall be included such as: name, date of birth, sex, height...
8 CFR 274a.2 - Verification of identity and employment authorization.
Code of Federal Regulations, 2013 CFR
2013-01-01
... of birth, sex, height, color of eyes, and address; (ii) School identification card with a photograph... REGULATIONS CONTROL OF EMPLOYMENT OF ALIENS Employer Requirements § 274a.2 Verification of identity and... contain a photograph, identifying information shall be included such as: name, date of birth, sex, height...
8 CFR 274a.2 - Verification of identity and employment authorization.
Code of Federal Regulations, 2014 CFR
2014-01-01
... of birth, sex, height, color of eyes, and address; (ii) School identification card with a photograph... REGULATIONS CONTROL OF EMPLOYMENT OF ALIENS Employer Requirements § 274a.2 Verification of identity and... contain a photograph, identifying information shall be included such as: name, date of birth, sex, height...
8 CFR 274a.2 - Verification of identity and employment authorization.
Code of Federal Regulations, 2011 CFR
2011-01-01
... of birth, sex, height, color of eyes, and address; (ii) School identification card with a photograph... REGULATIONS CONTROL OF EMPLOYMENT OF ALIENS Employer Requirements § 274a.2 Verification of identity and... contain a photograph, identifying information shall be included such as: name, date of birth, sex, height...
Domain Specific Language Support for Exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sadayappan, Ponnuswamy
Domain-Specific Languages (DSLs) offer an attractive path to Exascale software since they provide expressive power through appropriate abstractions and enable domain-specific optimizations. But the advantages of a DSL compete with the difficulties of implementing a DSL, even for a narrowly defined domain. The DTEC project addresses how a variety of DSLs can be easily implemented to leverage existing compiler analysis and transformation capabilities within the ROSE open source compiler as part of a research program focusing on Exascale challenges. The OSU contributions to the DTEC project are in the area of code generation from high-level DSL descriptions, as well as verification of the automatically-generated code.
Challenges in verification and validation of autonomous systems for space exploration
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Jonsson, Ari
2005-01-01
Space exploration applications offer a unique opportunity for the development and deployment of autonomous systems, due to limited communications, large distances, and great expense of direct operation. At the same time, the risk and cost of space missions leads to reluctance to taking on new, complex and difficult-to-understand technology. A key issue in addressing these concerns is the validation of autonomous systems. In recent years, higher-level autonomous systems have been applied in space applications. In this presentation, we will highlight those autonomous systems, and discuss issues in validating these systems. We will then look to future demands on validating autonomous systems for space, identify promising technologies and open issues.
Generic Protocol for the Verification of Ballast Water Treatment Technology. Version 5.1
2010-09-01
Document fragments (table of contents and glossary): 1.4 Verification Testing Process; Volumes, Containers and Processing; Table 10, Recommendation for Water …; "… or persistent distortion of a measurement process that causes errors in one direction"; Challenge Water: water supplied to a treatment system under …
Formal methods for dependable real-time systems
NASA Technical Reports Server (NTRS)
Rushby, John
1993-01-01
The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that were proposed and used are sketched. The formal verifications of clock synchronization algorithms show that mechanically supported reasoning about complex real-time behavior is feasible. However, there has been a significant increase in the effectiveness of verification systems since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.
International Cooperative for Aerosol Prediction Workshop on Aerosol Forecast Verification
NASA Technical Reports Server (NTRS)
Benedetti, Angela; Reid, Jeffrey S.; Colarco, Peter R.
2011-01-01
The purpose of this workshop was to reinforce the working partnership between centers who are actively involved in global aerosol forecasting, and to discuss issues related to forecast verification. Participants included representatives from operational centers with global aerosol forecasting requirements, a panel of experts on Numerical Weather Prediction and Air Quality forecast verification, data providers, and several observers from the research community. The presentations centered on a review of current NWP and AQ practices with subsequent discussion focused on the challenges in defining appropriate verification measures for the next generation of aerosol forecast systems.
Quickulum: A Process for Quick Response Curriculum Verification
ERIC Educational Resources Information Center
Lovett, Marvin; Jones, Irma S.; Stingley, Paul
2010-01-01
This paper addresses the need for a method of continual and frequent verification regarding course content taught in some post-secondary courses. With excessive amounts of information generated within the workplace, continual change exists for what is taught in some of our business courses. This is especially true for specific content areas such…
76 FR 14038 - TWIC/MTSA Policy Advisory Council; Voluntary Use of TWIC Readers
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
... and would like to know that they reached the Facility, please enclose a stamped, self-addressed... regulatory requirements for effective (1) identity verification, (2) card validity, and (3) card... access is granted. 33 CFR 101.514. At each entry, the TWIC must be checked for (1) identity verification...
75 FR 43943 - Defense Science Board; Task Force on Nuclear Treaty Monitoring and Verification
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-27
... DEPARTMENT OF DEFENSE Office of the Secretary Defense Science Board; Task Force on Nuclear Treaty... meetings. SUMMARY: The Defense Science Board Task Force on Nuclear Treaty Monitoring and Verification will... held September 13-14, and 25-26, 2010. ADDRESSES: The meetings will be held at Science Applications...
Verification and benchmark testing of the NUFT computer code
NASA Astrophysics Data System (ADS)
Lee, K. H.; Nitao, J. J.; Kulshrestha, A.
1993-10-01
This interim report presents results of work completed in the ongoing verification and benchmark testing of the NUFT (Nonisothermal Unsaturated-saturated Flow and Transport) computer code. NUFT is a suite of multiphase, multicomponent models for numerical solution of thermal and isothermal flow and transport in porous media, with application to subsurface contaminant transport problems. The code simulates the coupled transport of heat, fluids, and chemical components, including volatile organic compounds. Grid systems may be Cartesian or cylindrical, with one-, two-, or fully three-dimensional configurations possible. In this initial phase of testing, the NUFT code was used to solve seven one-dimensional unsaturated flow and heat transfer problems. Three verification and four benchmarking problems were solved. In the verification testing, excellent agreement was observed between NUFT results and the analytical or quasi-analytical solutions. In the benchmark testing, results of code intercomparison were very satisfactory. From these testing results, it is concluded that the NUFT code is ready for application to field and laboratory problems similar to those addressed here. Multidimensional problems, including those dealing with chemical transport, will be addressed in a subsequent report.
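A minimal sketch of the kind of code-to-analytical comparison used in verification testing is shown below; the diffusion profile, the perturbed stand-in for code output, and the error metric are invented for illustration and are unrelated to the NUFT problems themselves.

```python
# Illustrative sketch (problem and solutions invented) of comparing a
# numerical solution against an analytical one for verification testing.
import numpy as np

def max_relative_error(numerical, analytical):
    numerical = np.asarray(numerical, dtype=float)
    analytical = np.asarray(analytical, dtype=float)
    return np.max(np.abs(numerical - analytical) /
                  np.maximum(np.abs(analytical), 1e-12))

# Toy 1-D steady diffusion profile: the analytical solution is linear in x.
x = np.linspace(0.0, 1.0, 11)
analytical = 1.0 - x
numerical = analytical + 1e-4 * np.sin(np.pi * x)   # stand-in for code output
print(f"max relative error: {max_relative_error(numerical, analytical):.2e}")
```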
Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana
2011-01-01
The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties. PMID:21713128
NASA Technical Reports Server (NTRS)
Ryan, R. S.; Salter, L. D.; Young, G. M., III; Munafo, P. M.
1985-01-01
The planned missions for the space shuttle dictated a unique and technology-extending rocket engine. The high specific impulse requirements in conjunction with a 55-mission lifetime, plus volume and weight constraints, produced unique structural design, manufacturing, and verification requirements. Operations from Earth to orbit produce severe dynamic environments, which couple with the extreme pressure and thermal environments associated with the high performance, creating large low-cycle loads and high alternating stresses above the endurance limit, which result in high sensitivity to alternating stresses. Combining all of these effects resulted in the requirements for exotic materials, which are more susceptible to manufacturing problems, and the use of an all-welded structure. The challenge of integrating environments, dynamics, structures, and materials into a verified SSME structure is discussed. The verification program and developmental flight results are included. The first six shuttle flights had engine performance as predicted with no failures. The engine system has met the basic design challenges.
49 CFR 1104.4 - Attestation and verification.
Code of Federal Regulations, 2012 CFR
2012-10-01
... in ink by the practitioner or attorney, whose address should be stated. The signature of a... or attorney must be: (1) Signed in ink; (2) Accompanied by the signer's address; and (3) Verified, if...
49 CFR 1104.4 - Attestation and verification.
Code of Federal Regulations, 2014 CFR
2014-10-01
... in ink by the practitioner or attorney, whose address should be stated. The signature of a... or attorney must be: (1) Signed in ink; (2) Accompanied by the signer's address; and (3) Verified, if...
49 CFR 1104.4 - Attestation and verification.
Code of Federal Regulations, 2011 CFR
2011-10-01
... in ink by the practitioner or attorney, whose address should be stated. The signature of a... or attorney must be: (1) Signed in ink; (2) Accompanied by the signer's address; and (3) Verified, if...
49 CFR 1104.4 - Attestation and verification.
Code of Federal Regulations, 2010 CFR
2010-10-01
... in ink by the practitioner or attorney, whose address should be stated. The signature of a... or attorney must be: (1) Signed in ink; (2) Accompanied by the signer's address; and (3) Verified, if...
49 CFR 1104.4 - Attestation and verification.
Code of Federal Regulations, 2013 CFR
2013-10-01
... in ink by the practitioner or attorney, whose address should be stated. The signature of a... or attorney must be: (1) Signed in ink; (2) Accompanied by the signer's address; and (3) Verified, if...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dinh, Nam; Athe, Paridhi; Jones, Christopher
The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of required functionality for capturing phenomena of interest, while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM) that, for this assessment, evaluates VERA on 8 major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.
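A small sketch of how the eight PCMM attribute scores might be tabulated for one challenge problem follows; the scores and the summary function are invented, and the actual assessment is documented in the report, not in code.

```python
# Minimal sketch, with invented scores, of tabulating PCMM maturity levels
# (0-3) for the eight assessment attributes against one challenge problem.
PCMM_ATTRIBUTES = [
    "Representation and Geometric Fidelity",
    "Physics and Material Model Fidelity",
    "Software Quality Assurance and Engineering",
    "Code Verification",
    "Solution Verification",
    "Separate Effects Model Validation",
    "Integral Effects Model Validation",
    "Uncertainty Quantification",
]

def summarize_pcmm(scores):
    """Report the weakest attribute and the average maturity level."""
    assert set(scores) == set(PCMM_ATTRIBUTES)
    assert all(0 <= s <= 3 for s in scores.values())
    weakest = min(scores, key=scores.get)
    return weakest, sum(scores.values()) / len(scores)

example_scores = {attr: 2 for attr in PCMM_ATTRIBUTES}
example_scores["Uncertainty Quantification"] = 1   # hypothetical gap
print(summarize_pcmm(example_scores))
```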
Preliminary Analysis of the Transient Reactor Test Facility (TREAT) with PROTEUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connaway, H. M.; Lee, C. H.
The neutron transport code PROTEUS has been used to perform preliminary simulations of the Transient Reactor Test Facility (TREAT). TREAT is an experimental reactor designed for the testing of nuclear fuels and other materials under transient conditions. It operated from 1959 to 1994, when it was placed on non-operational standby. The restart of TREAT to support the U.S. Department of Energy’s resumption of transient testing is currently underway. Both single assembly and assembly-homogenized full core models have been evaluated. Simulations were performed using a historic set of WIMS-ANL-generated cross-sections as well as a new set of Serpent-generated cross-sections. To support this work, further analyses were also performed using additional codes in order to investigate particular aspects of TREAT modeling. DIF3D and the Monte-Carlo codes MCNP and Serpent were utilized in these studies. MCNP and Serpent were used to evaluate the effect of geometry homogenization on the simulation results and to support code-to-code comparisons. New meshes for the PROTEUS simulations were created using the CUBIT toolkit, with additional meshes generated via conversion of selected DIF3D models to support code-to-code verifications. All current analyses have focused on code-to-code verifications, with additional verification and validation studies planned. The analysis of TREAT with PROTEUS-SN is an ongoing project. This report documents the studies that have been performed thus far, and highlights key challenges to address in future work.
Formal verification of human-automation interaction
NASA Technical Reports Server (NTRS)
Degani, Asaf; Heymann, Michael
2002-01-01
This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.
Code of Federal Regulations, 2011 CFR
2011-10-01
...), except as otherwise provided in this section. (c) In the verification interview, you must explain the laboratory findings to the employee and address technical questions or issues the employee may raise. (d) You... at the time of the verification interview. As the MRO, you have discretion to extend the time...
Code of Federal Regulations, 2010 CFR
2010-10-01
...), except as otherwise provided in this section. (c) In the verification interview, you must explain the laboratory findings to the employee and address technical questions or issues the employee may raise. (d) You... at the time of the verification interview. As the MRO, you have discretion to extend the time...
Verification and quality control of routine hematology analyzers.
Vis, J Y; Huisman, A
2016-05-01
Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process covers several items, including precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet, which standard should be met or which verification limit should be used is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
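As a hedged illustration of two of the verification items named above, the sketch below computes a carryover percentage (using one commonly used high/low triplicate formulation) and a simple linearity statistic for a dilution series. The acceptance limits are left to the laboratory specialist; all numbers shown are invented.

```python
import numpy as np

def carryover_percent(high_runs, low_runs):
    """One common formulation: high sample run in triplicate, then low sample in triplicate."""
    h1, h2, h3 = high_runs
    l1, l2, l3 = low_runs
    return 100.0 * (l1 - l3) / (h3 - l3)

def linearity_r2(expected, measured):
    """Coefficient of determination of measured vs. expected values for a dilution series."""
    expected = np.asarray(expected, dtype=float)
    measured = np.asarray(measured, dtype=float)
    slope, intercept = np.polyfit(expected, measured, 1)
    residuals = measured - (slope * expected + intercept)
    return 1.0 - np.sum(residuals**2) / np.sum((measured - measured.mean())**2)

# Invented WBC counts (10^9/L): carryover should typically be a small fraction of a percent
print(carryover_percent([85.1, 84.9, 85.3], [0.22, 0.20, 0.19]))
print(linearity_r2([0, 25, 50, 75, 100], [0.1, 24.6, 50.4, 74.8, 99.5]))
```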
Systematic Benchmarking of Diagnostic Technologies for an Electrical Power System
NASA Technical Reports Server (NTRS)
Kurtoglu, Tolga; Jensen, David; Poll, Scott
2009-01-01
Automated health management is a critical functionality for complex aerospace systems. A wide variety of diagnostic algorithms have been developed to address this technical challenge. Unfortunately, the lack of support to perform large-scale V&V (verification and validation) of diagnostic technologies continues to create barriers to effective development and deployment of such algorithms for aerospace vehicles. In this paper, we describe a formal framework developed for benchmarking of diagnostic technologies. The diagnosed system is the Advanced Diagnostics and Prognostics Testbed (ADAPT), a real-world electrical power system (EPS), developed and maintained at the NASA Ames Research Center. The benchmarking approach provides a systematic, empirical basis to the testing of diagnostic software and is used to provide performance assessment for different diagnostic algorithms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsipis, K.
A method for verifying a cruise-missile agreement that would be acceptable to the military, industrial, and intelligence communities in both nations must be as unintrusive as possible, while remaining immune to cheating of any significance. This is the goal of the technical solutions outlined here. The elements of a verification regime described do not require routine, intrusive, on-site visits to naval vessels, aircraft, air bases, or weapons magazines where missiles may be stored. They do not interfere with the operational readiness of the missiles, and they protect legitimate military secrets of the inspected nation. If supported by competent national technical means of verification such as those both sides already employ, with a small number of on-site challenge inspections, a combination of technical solutions and procedures such as these could be effective. They would adequately safeguard the national security and sovereignty of the participating nations while providing policymakers with the option of a treaty that limits the number of long-range nuclear cruise missiles or eliminates them completely. As discussed, there are problems that remain to be addressed, but they should not be allowed to block a U.S.-Soviet agreement significantly reducing strategic nuclear arsenals.
NASA Technical Reports Server (NTRS)
Ortiz, James N.; Scott,Kelly; Smith, Harold
2004-01-01
The assembly and operation of the ISS have generated significant challenges that have ultimately impacted resources available to the program's primary mission: research. To address this, program personnel routinely perform trade-off studies on alternative options to enhance research. The approach, content, level of analysis, and resulting outputs of these studies vary due to many factors, however, complicating the Program Manager's job of selecting the best option. To address this, the program requested a framework be developed to evaluate multiple research-enhancing options in a thorough, disciplined and repeatable manner, and to identify the best option on the basis of cost, benefit and risk. The resulting framework consisted of a systematic methodology and a decision-support toolset. The framework provides quantifiable and repeatable means for ranking research-enhancing options for the complex and multiple-constraint domain of the space research laboratory. This paper describes the development, verification and validation of this framework and provides observations on its operational use.
Verification in Referral-Based Crowdsourcing
Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.
2012-01-01
Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
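For context, the winning MIT strategy in the Red Balloon Challenge (which the abstract says the optimal scheme coincides with) paid the finder a base reward and each ancestor in the referral chain half of what the person they recruited received. The sketch below is a minimal illustration of that recursive split; the base amount and names are placeholders, not values from the paper.

```python
def referral_payments(base_reward, chain):
    """chain[0] is the finder, chain[-1] the root recruiter; returns name -> payment."""
    payments, reward = {}, float(base_reward)
    for person in chain:
        payments[person] = reward
        reward /= 2.0
    return payments

print(referral_payments(2000, ["finder", "recruiter", "recruiter_of_recruiter"]))
# {'finder': 2000.0, 'recruiter': 1000.0, 'recruiter_of_recruiter': 500.0}
```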
Emerging technologies for V&V of ISHM software for space exploration
NASA Technical Reports Server (NTRS)
Feather, Martin S.; Markosian, Lawrence Z.
2006-01-01
Systems required to exhibit high operational reliability often rely on some form of fault protection to recognize and respond to faults, preventing faults' escalation to catastrophic failures. Integrated System Health Management (ISHM) extends the functionality of fault protection to both scale to more complex systems (and systems of systems), and to maintain capability rather than just avert catastrophe. Forms of ISHM have been utilized to good effect in the maintenance phase of systems' total lifecycles (often referred to as 'condition-based maintenance'), but less so in a 'fault protection' role during actual operations. One of the impediments to such use lies in the challenges of verification, validation and certification of ISHM systems themselves. This paper makes the case that state-of-the-practice V&V and certification techniques will not suffice for emerging forms of ISHM systems; however, a number of maturing software engineering assurance technologies show particular promise for addressing these ISHM V&V challenges.
Integrating MBSE into Ongoing Projects: Requirements Validation and Test Planning for the ISS SAFER
NASA Technical Reports Server (NTRS)
Anderson, Herbert A.; Williams, Antony; Pierce, Gregory
2016-01-01
The International Space Station (ISS) Simplified Aid for Extra Vehicular Activity (EVA) Rescue (SAFER) is the spacewalking astronaut's final safety measure against separating from the ISS and being unable to return safely. Since the late 1990s, the SAFER has been a standard element of the spacewalking astronaut's equipment. The ISS SAFER project was chartered to develop a new block of SAFER units using a highly similar design to the legacy SAFER (known as the USA SAFER). An on-orbit test module was also included in the project to enable periodic maintenance/propulsion system checkout on the ISS SAFER. On the ISS SAFER project, model-based systems engineering (MBSE) was not the initial systems engineering (SE) approach, given the volume of heritage systems engineering and integration (SE&I) products. The initial emphasis was ensuring traceability to ISS program standards as well as to legacy USA SAFER requirements. The requirements management capabilities of the Cradle systems engineering tool were to be utilized to that end. During development, however, MBSE approaches were applied selectively to address specific challenges in requirements validation and test and verification (T&V) planning, which provided measurable efficiencies to the project. From an MBSE perspective, ISS SAFER development presented a challenge and an opportunity. Addressing the challenge first, the project was tasked to use the original USA SAFER operational and design requirements baseline, with a number of additional ISS program requirements to address evolving certification expectations for systems operating on the ISS. Additionally, a need to redesign the ISS SAFER avionics architecture resulted in a set of changes to the design requirements baseline. Finally, the project added an entirely new functionality for on-orbit maintenance. After initial requirements integration, the system requirements count was approaching 1000, which represented a growth of 4x over the original USA SAFER system. This presented the challenge: how to confirm that this new requirements set would result in the creation of the desired capability.
Cognitive Bias in the Verification and Validation of Space Flight Systems
NASA Technical Reports Server (NTRS)
Larson, Steve
2012-01-01
Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of future systems.
From Operating-System Correctness to Pervasively Verified Applications
NASA Astrophysics Data System (ADS)
Daum, Matthias; Schirmer, Norbert W.; Schmidt, Mareike
Though program verification is known and has been used for decades, the verification of a complete computer system still remains a grand challenge. Part of this challenge is the interaction of application programs with the operating system, which is usually entrusted with retrieving input data from and transferring output data to peripheral devices. In this scenario, the correct operation of the applications inherently relies on operating-system correctness. Based on the formal correctness of our real-time operating system Olos, this paper describes an approach to pervasively verify applications running on top of the operating system.
Security Verification Techniques Applied to PatchLink COTS Software
NASA Technical Reports Server (NTRS)
Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer
2006-01-01
Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.
ERIC Educational Resources Information Center
Acharya, Sushil; Manohar, Priyadarshan; Wu, Peter; Schilling, Walter
2017-01-01
Imparting real world experiences in a software verification and validation (SV&V) course is often a challenge due to the lack of effective active learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains. Realizing the…
Viability Study for an Unattended UF 6 Cylinder Verification Station: Phase I Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Leon E.; Miller, Karen A.; Garner, James R.
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF 6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of UCVS prototype design; field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field-measured instrument uncertainties, provides an assessment of the partial-defect sensitivity of HEVA and PNEM for both one-time assay and (repeated) NDA Fingerprint verification scenarios. The findings presented in this report represent a significant step forward in the community’s understanding of the strengths and limitations of the PNEM and HEVA NDA methods, and the viability of the UCVS concept in front-end fuel cycle facilities. This experience will inform Phase II of the UCVS viability study, should the IAEA pursue it.
Laboratory Verification of Occulter Contrast Performance and Formation Flight
NASA Astrophysics Data System (ADS)
Sirbu, Dan
2014-01-01
Direct imaging of an exo-Earth is a difficult technical challenge. First, the intensity ratio between the parent star and its dim, rocky planetary companion is expected to be about ten billion to one. Additionally, for a planetary companion in the habitable zone the angular separation to the star is very small, such that only nearby stars are feasible targets. An external occulter is a spacecraft that is flown in formation with the observing space telescope and blocks starlight prior to the entrance pupil. Its shape must be specially designed to control for diffraction and be tolerant of errors such as misalignment, manufacturing, and deformations. In this dissertation, we present laboratory results pertaining to the optical verification of the contrast performance of a scaled occulter and implementation of an algorithm for the alignment of the telescope in the shadow of the occulter. The experimental testbed is scaled from space dimensions to the laboratory by maintaining constant Fresnel numbers while preserving an identical diffraction integral. We present monochromatic results in the image plane showing contrast better than 10 orders of magnitude, consistent with the level required for imaging an exo-Earth, and obtained using an optimized occulter shape. We compare these results to a baseline case using a circular occulter and to the theoretical predictions. Additionally, we address the principal technical challenge in the formation flight problem through demonstration of an alignment algorithm that is based on out-of-band leaked light. Such leaked light can be used as a map to estimate the location of the telescope in the shadow and perform fine alignment during science observations.
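As a hedged sketch of the scaling idea above: holding the Fresnel number N = a^2/(lambda z) fixed preserves the diffraction integral when the occulter radius a, wavelength lambda, and separation z are scaled from space to the laboratory. The numbers below are illustrative, not the testbed's actual parameters.

```python
def fresnel_number(radius_m, wavelength_m, separation_m):
    return radius_m**2 / (wavelength_m * separation_m)

def lab_separation(radius_lab_m, wavelength_lab_m, n_target):
    """Separation that reproduces a target Fresnel number for a scaled-down occulter."""
    return radius_lab_m**2 / (wavelength_lab_m * n_target)

# Illustrative space configuration: 25 m occulter radius, 50,000 km separation, 600 nm light
n_space = fresnel_number(25.0, 600e-9, 5.0e7)
# A 2.5 mm laboratory occulter at the same wavelength then needs roughly a 0.5 m separation
print(n_space, lab_separation(2.5e-3, 600e-9, n_space))
```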
Land Ice Verification and Validation Kit
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-07-15
To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
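The bit-for-bit style of check mentioned above can be illustrated in a few lines of Python: compare a test run's output arrays against benchmark arrays and flag any variable that is not exactly identical. This is a generic sketch, not the LIVV toolkit's API.

```python
import numpy as np

def bit_for_bit(benchmark: dict, test: dict):
    """Return the names of variables that differ from the benchmark in any bit."""
    return [name for name, ref in benchmark.items()
            if name not in test or not np.array_equal(ref, test[name])]

bench = {"thickness": np.array([1.0, 2.0]), "velocity": np.array([0.5, 0.6])}
trial = {"thickness": np.array([1.0, 2.0]), "velocity": np.array([0.5, 0.6000001])}
print(bit_for_bit(bench, trial))   # ['velocity']
```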
NASA Astrophysics Data System (ADS)
Kearns, E. J.
2017-12-01
NOAA's Big Data Project is conducting an experiment in the collaborative distribution of open government data to non-governmental cloud-based systems. Through Cooperative Research and Development Agreements signed in 2015 between NOAA and Amazon Web Services, Google Cloud Platform, IBM, Microsoft Azure, and the Open Commons Consortium, NOAA is distributing open government data to a wide community of potential users. There are a number of significant advantages related to the use of open data on commercial cloud platforms, but through this experiment NOAA is also discovering significant challenges for those stewarding and maintaining NOAA's data resources in support of users in the wider open data ecosystem. Among the challenges that will be discussed are: the need to provide effective interpretation of the data content to enable their use by data scientists from other expert communities; effective maintenance of Collaborators' open data stores through coordinated publication of new data and new versions of older data; the provenance and verification of open data as authentic NOAA-sourced data across multiple management boundaries and analytical tools; and keeping pace with the accelerating expectations of users with regard to improved quality control, data latency, availability, and discoverability. Suggested strategies to address these challenges will also be described.
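One simple way to address the provenance concern noted above is to verify that a file retrieved from a cloud mirror matches a checksum published by the data provider. The sketch below assumes a hypothetical file name and digest; it is a generic illustration, not a NOAA-specified mechanism.

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file and return its SHA-256 hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()

def verify(path, published_digest):
    """True if the local file matches the digest published by the data provider."""
    return sha256_of(path) == published_digest.lower()

# Hypothetical usage (file name and digest are placeholders):
# verify("goes16_scene.nc", "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08")
```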
Tang, G.; Andre, B.; Hoffman, F. M.; Painter, S. L.; Thornton, P. E.; Yuan, F.; Bisht, G.; Hammond, G. E.; Lichtner, P. C.; Kumar, J.; Mills, R. T.; Xu, X.
2016-04-19
This Modeling Archive is in support of an NGEE Arctic discussion paper under review and available at doi:10.5194/gmd-9-927-2016. The purpose is to document the simulations to allow verification, reproducibility, and follow-up studies. This dataset contains shell scripts to create the CLM-PFLOTRAN cases, specific input files for PFLOTRAN and CLM, outputs, and python scripts to make the figures using the outputs in the publication. Through these results, we demonstrate that CLM-PFLOTRAN can approximately reproduce CLM results in selected cases for the Arctic, temperate, and tropical sites. In addition, the new framework facilitates mechanistic representations of soil biogeochemistry processes in the land surface model.
Applying MDA to SDR for Space to Model Real-time Issues
NASA Technical Reports Server (NTRS)
Blaser, Tammy M.
2007-01-01
NASA space communications systems have the challenge of designing SDRs with highly-constrained Size, Weight and Power (SWaP) resources. A study is being conducted to assess the effectiveness of applying the MDA Platform-Independent Model (PIM) and one or more Platform-Specific Models (PSM) specifically to address NASA space domain real-time issues. This paper will summarize our experiences with applying MDA to SDR for Space to model real-time issues. Real-time issues to be examined, measured, and analyzed are: meeting waveform timing requirements and efficiently applying Real-time Operating System (RTOS) scheduling algorithms, applying safety control measures, and SWaP verification. Real-time waveform algorithms benchmarked with the worst case environment conditions under the heaviest workload will drive the SDR for Space real-time PSM design.
Addressing challenges of modulation transfer function measurement with fisheye lens cameras
NASA Astrophysics Data System (ADS)
Deegan, Brian M.; Denny, Patrick E.; Zlokolica, Vladimir; Dever, Barry; Russell, Laura
2015-03-01
Modulation transfer function (MTF) is a well defined and accepted method of measuring image sharpness. The slanted edge test, as defined in ISO12233, is a standard method of calculating MTF, and is widely used for lens alignment and auto-focus algorithm verification. However, there are a number of challenges which should be considered when measuring MTF in cameras with fisheye lenses. Due to trade-offs related to Petzval curvature, planarity of the optical plane is difficult to achieve in fisheye lenses. It is therefore critical to have the ability to accurately measure sharpness throughout the entire image, particularly for lens alignment. One challenge for fisheye lenses is that, because of the radial distortion, the slanted edges will have different angles, depending on the location within the image and on the distortion profile of the lens. Previous work in the literature indicates that MTF measurements are robust for angles between 2 and 10 degrees. Outside of this range, MTF measurements become unreliable. Also, the slanted edge itself will be curved by the lens distortion, causing further measurement problems. This study summarises the difficulties in the use of MTF for sharpness measurement in fisheye lens cameras, and proposes mitigations and alternative methods.
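As a hedged illustration of the angle problem described above, the sketch below applies a toy polynomial radial-distortion model (the coefficient and field positions are assumptions, not taken from the paper) to show how the local angle of a nominally 5 degree slanted edge can drift outside the 2-10 degree band toward the image corners.

```python
import numpy as np

def distort(p, k1=-0.15):
    """Toy polynomial radial distortion about the image center (normalized coordinates)."""
    r2 = p[0] ** 2 + p[1] ** 2
    return p * (1.0 + k1 * r2)

def local_edge_angle_deg(point, nominal_angle_deg, eps=1e-4):
    """Local angle from vertical of a nominally slanted edge after distortion."""
    a = np.radians(nominal_angle_deg)
    direction = np.array([np.sin(a), np.cos(a)])            # near-vertical edge direction
    d = distort(point + eps * direction) - distort(point)   # mapped edge direction
    return np.degrees(np.arctan2(abs(d[0]), abs(d[1])))

for offset in [0.0, 0.4, 0.8]:
    ang = local_edge_angle_deg(np.array([offset, offset]), 5.0)
    ok = 2.0 <= ang <= 10.0
    print(f"field position {offset:.1f}: local edge angle {ang:.1f} deg"
          + ("" if ok else "  <- outside the reliable 2-10 deg band"))
```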
NASA Astrophysics Data System (ADS)
Javed, Kamran; Gouriveau, Rafael; Zerhouni, Noureddine
2017-09-01
Integrating prognostics into a real application requires a certain maturity level, and for this reason there is a lack of success stories about the development of a complete Prognostics and Health Management system. In fact, the maturity of prognostics is closely linked to data and domain-specific entities like modeling. Basically, the prognostics task aims at predicting the degradation of engineering assets. However, in practice it is not possible to precisely predict the impending failure, which requires a thorough understanding of the different sources of uncertainty that affect prognostics. Therefore, different aspects crucial to the prognostics framework, i.e., from monitoring data to the remaining useful life of equipment, need to be addressed. To this aim, the paper contributes to the state of the art and taxonomy of prognostics approaches and their application perspectives. In addition, factors for prognostics approach selection are identified, and new case studies at the component-system level are discussed. Moreover, open challenges toward maturity of prognostics under uncertainty are highlighted and a scheme for an efficient prognostics approach is presented. Finally, the existing challenges for verification and validation of prognostics at different technology readiness levels are discussed.
A Verification System for Distributed Objects with Asynchronous Method Calls
NASA Astrophysics Data System (ADS)
Ahrendt, Wolfgang; Dylla, Maximilian
We present a verification system for Creol, an object-oriented modeling language for concurrent distributed applications. The system is an instance of KeY, a framework for object-oriented software verification, which has so far been applied foremost to sequential Java. Building on KeY characteristic concepts, like dynamic logic, sequent calculus, explicit substitutions, and the taclet rule language, the system presented in this paper addresses functional correctness of Creol models featuring local cooperative thread parallelism and global communication via asynchronous method calls. The calculus heavily operates on communication histories which describe the interfaces of Creol units. Two example scenarios demonstrate the usage of the system.
Towards Verification and Validation for Increased Autonomy
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra
2017-01-01
This presentation goes over the work we have performed over the last few years on verification and validation of the next generation onboard collision avoidance system, ACAS X, for commercial aircraft. It describes our work on probabilistic verification and synthesis of the model that ACAS X is based on, and goes on to the validation of that model with respect to actual simulation and flight data. The presentation then moves on to identify the characteristics of ACAS X that are related to autonomy and to discuss the challenges that autonomy poses for V&V. All work presented has already been published.
EOS-AM precision pointing verification
NASA Technical Reports Server (NTRS)
Throckmorton, A.; Braknis, E.; Bolek, J.
1993-01-01
The Earth Observing System (EOS) AM mission requires tight pointing knowledge to meet scientific objectives, in a spacecraft with low frequency flexible appendage modes. As the spacecraft controller reacts to various disturbance sources and as the inherent appendage modes are excited by this control action, verification of precision pointing knowledge becomes particularly challenging for the EOS-AM mission. As presently conceived, this verification includes a complementary set of multi-disciplinary analyses, hardware tests and real-time computer-in-the-loop simulations, followed by collection and analysis of hardware test and flight data and supported by a comprehensive database repository for validated program values.
Dosimetry for audit and clinical trials: challenges and requirements
NASA Astrophysics Data System (ADS)
Kron, T.; Haworth, A.; Williams, I.
2013-06-01
Many important dosimetry audit networks for radiotherapy have their roots in clinical trial quality assurance (QA). In both scenarios it is essential to test two issues: does the treatment plan conform with the clinical requirements, and is the plan a reasonable representation of what is actually delivered to a patient throughout their course of treatment. Part of a sound quality program would be an external audit of these issues, with verification of the equivalence of plan and treatment typically referred to as a dosimetry audit. The increasing complexity of radiotherapy planning and delivery makes audits challenging. While verification of the absolute dose delivered at a reference point was the standard of external dosimetry audits two decades ago, this is often deemed inadequate for verification of treatment approaches such as Intensity Modulated Radiation Therapy (IMRT) and Volumetric Modulated Arc Therapy (VMAT). As such, most dosimetry audit networks have successfully introduced more complex tests of dose delivery using anthropomorphic phantoms that can be imaged, planned and treated as a patient would. The new challenge is to adapt this approach to ever more diversified radiotherapy procedures, with image guided/adaptive radiotherapy, motion management and brachytherapy being the focus of current research.
WRAP-RIB antenna technology development
NASA Technical Reports Server (NTRS)
Freeland, R. E.; Garcia, N. F.; Iwamoto, H.
1985-01-01
The wrap-rib deployable antenna concept development is based on a combination of hardware development and testing along with extensive supporting analysis. The proof-of-concept hardware models are large in size so they will address the same basic problems associated with design, fabrication, assembly and test as the full-scale systems, which were selected to be 100 meters at the beginning of the program. The hardware evaluation program consists of functional performance tests, design verification tests and analytical model verification tests. Functional testing consists of kinematic deployment, mesh management and verification of mechanical packaging efficiencies. Design verification consists of rib contour precision measurement, rib cross-section variation evaluation, rib materials characterizations and manufacturing imperfections assessment. Analytical model verification and refinement include mesh stiffness measurement, rib static and dynamic testing, mass measurement, and rib cross-section characterization. This concept was considered for a number of potential applications that include mobile communications, VLBI, and aircraft surveillance. In fact, baseline system configurations were developed by JPL, using the appropriate wrap-rib antenna, for all three classes of applications.
Motion Planning of Two Stacker Cranes in a Large-Scale Automated Storage/Retrieval System
NASA Astrophysics Data System (ADS)
Kung, Yiheng; Kobayashi, Yoshimasa; Higashi, Toshimitsu; Ota, Jun
We propose a method for reducing the computational time of motion planning for stacker cranes. Most automated storage/retrieval systems (AS/RSs) are equipped with only one stacker crane. However, this is logistically challenging, and greater work efficiency in warehouses, such as those using two stacker cranes, is required. In this paper, a warehouse with two stacker cranes working simultaneously is proposed. Unlike warehouses with only one crane, trajectory planning in those with two cranes is very difficult. Since there are two cranes working together, a proper trajectory must be considered to avoid collision. However, verifying collisions is complicated and requires a considerable amount of computational time. As transport work in AS/RSs occurs randomly, motion planning cannot be conducted in advance. Planning an appropriate trajectory within a restricted duration would be a difficult task. We thereby address the current problem of motion planning requiring extensive calculation time. As a solution, we propose a “free-step” to simplify the procedure of collision verification and reduce the computational time. We also propose a method to reschedule the order of collision verification in order to find an appropriate trajectory in less time. With the proposed method, we reduce the calculation time to less than 1/300 of that achieved in former research.
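The collision verification being simplified above can be illustrated generically: sample both crane trajectories at common time instants and require a minimum separation at every instant. The clearance and trajectories below are invented; the paper's free-step simplification itself is not reproduced here.

```python
def collides(traj_a, traj_b, min_separation):
    """traj_a, traj_b: positions (m) of each crane on the shared rail at the same sampled instants."""
    return any(abs(a - b) < min_separation for a, b in zip(traj_a, traj_b))

crane1 = [0.0, 2.0, 4.0, 6.0, 8.0]     # moving right
crane2 = [12.0, 10.0, 9.0, 9.0, 9.0]   # moving left, then holding position
print(collides(crane1, crane2, min_separation=2.0))  # True: 8.0 vs 9.0 at the last instant
```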
Communication Architecture in Mixed-Reality Simulations of Unmanned Systems.
Selecký, Martin; Faigl, Jan; Rollo, Milan
2018-03-14
Verification of the correct functionality of multi-vehicle systems in high-fidelity scenarios is required before any deployment of such a complex system, e.g., in missions of remote sensing or in mobile sensor networks. Mixed-reality simulations where both virtual and physical entities can coexist and interact have been shown to be beneficial for development, testing, and verification of such systems. This paper deals with the problems of designing a certain communication subsystem for such highly desirable realistic simulations. Requirements of this communication subsystem, including proper addressing, transparent routing, visibility modeling, or message management, are specified prior to designing an appropriate solution. Then, a suitable architecture of this communication subsystem is proposed together with solutions to the challenges that arise when simultaneous virtual and physical message transmissions occur. The proposed architecture can be utilized as a high-fidelity network simulator for vehicular systems with implicit mobility models that are given by real trajectories of the vehicles. The architecture has been utilized within multiple projects dealing with the development and practical deployment of multi-UAV systems, which support the architecture's viability and advantages. The provided experimental results show the achieved similarity of the communication characteristics of the fully deployed hardware setup to the setup utilizing the proposed mixed-reality architecture.
Verification of a Byzantine-Fault-Tolerant Self-stabilizing Protocol for Clock Synchronization
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2008-01-01
This paper presents the mechanical verification of a simplified model of a rapid Byzantine-fault-tolerant self-stabilizing protocol for distributed clock synchronization systems. This protocol does not rely on any assumptions about the initial state of the system except for the presence of sufficient good nodes, thus making the weakest possible assumptions and producing the strongest results. This protocol tolerates bursts of transient failures, and deterministically converges within a time bound that is a linear function of the self-stabilization period. A simplified model of the protocol is verified using the Symbolic Model Verifier (SMV). The system under study consists of 4 nodes, where at most one of the nodes is assumed to be Byzantine faulty. The model checking effort is focused on verifying correctness of the simplified model of the protocol in the presence of a permanent Byzantine fault as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period. Although model checking results of the simplified model of the protocol confirm the theoretical predictions, these results do not necessarily confirm that the protocol solves the general case of this problem. Modeling challenges of the protocol and the system are addressed. A number of abstractions are utilized in order to reduce the state space.
NASA Technical Reports Server (NTRS)
Bryant, Larry W.; Fragoso, Ruth S.
2007-01-01
In 2003 we proposed an effort to develop a core program of standardized training and verification practices and standards against which the implementation of these practices could be measured. The purpose was to provide another means of risk reduction for deep space missions to preclude the likelihood of a repeat of the tragedies of the 1998 Mars missions. We identified six areas where the application of standards and standardization would benefit the overall readiness process for flight projects at JPL. These are Individual Training, Team Training, Interface and Procedure Development, Personnel Certification, Interface and Procedure Verification, and Operations Readiness Testing. In this paper we will discuss the progress that has been made in the tasks of developing the proposed infrastructure in each of these areas. Specifically we will address the Position Training and Certification Standards that are now available for each operational position found on our Flight Operations Teams (FOT). We will also discuss the MGSS Baseline Flight Operations Team Training Plan, which can be tailored for each new flight project at JPL. As these tasks have been progressing, the climate and emphasis for Training and for V&V at JPL has changed, and we have learned about the expansion, growth, and limitations in the roles of traditional positions at JPL such as the Project's Training Engineer, V&V Engineer, and Operations Engineer. The need to keep a tight rein on budgets has led to a merging and/or reduction in these positions, which poses challenges to individual capacities and capabilities. We examine the evolution of these processes and the roles involved while taking a look at the impact or potential impact of our proposed training related infrastructure tasks. As we conclude our examination of the changes taking place for new flight projects, we see that the importance of proceeding with our proposed tasks and adapting them to the changing climate remains an important element in reducing the risk in the challenging business of space exploration.
Integration and verification testing of the Large Synoptic Survey Telescope camera
NASA Astrophysics Data System (ADS)
Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.
2016-08-01
We present an overview of the Integration and Verification Testing activities of the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Lab (SLAC). The LSST Camera, the sole instrument for LSST and now under construction, comprises a 3.2-gigapixel imager and a three-element corrector with a 3.5 degree diameter field of view. LSST Camera Integration and Test will be taking place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for Integration and Test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of integration and verification hardware systems will be discussed, highlighting some of the technical challenges anticipated.
Patel, Ravi G.; Desjardins, Olivier; Kong, Bo; ...
2017-09-01
Here, we present a verification study of three simulation techniques for fluid–particle flows, including an Euler–Lagrange approach (EL) inspired by Jackson's seminal work on fluidized particles, a quadrature-based moment method based on the anisotropic Gaussian closure (AG), and the traditional two-fluid model (TFM). We perform simulations of two problems: particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT). For verification, we evaluate various techniques for extracting statistics from EL and study the convergence properties of the three methods under grid refinement. The convergence is found to depend on the simulation method and on the problem, with CIT simulations posing fewer difficulties than HIT. Specifically, EL converges under refinement for both HIT and CIT, but statistics exhibit dependence on the postprocessing parameters. For CIT, AG produces similar results to EL. For HIT, converging both TFM and AG poses challenges. Overall, extracting converged, parameter-independent Eulerian statistics remains a challenge for all methods.
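A standard metric in grid-refinement studies of this kind is the observed order of accuracy computed from three systematically refined grids; the sketch below shows the textbook formula, which is not necessarily the exact statistic the authors report.

```python
import math

def observed_order(f_fine, f_medium, f_coarse, refinement_ratio):
    """Observed order p from solutions on fine, medium, and coarse grids (constant ratio r)."""
    return (math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine))
            / math.log(refinement_ratio))

# Invented values of an integrated statistic on grids refined by a factor of 2:
print(observed_order(1.000, 1.010, 1.050, refinement_ratio=2.0))  # ~2.0, i.e., second-order behavior
```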
Leveraging pattern matching to solve SRAM verification challenges at advanced nodes
NASA Astrophysics Data System (ADS)
Kan, Huan; Huang, Lucas; Yang, Legender; Zou, Elaine; Wan, Qijian; Du, Chunshan; Hu, Xinyi; Liu, Zhengfang; Zhu, Yu; Zhang, Recoo; Huang, Elven; Muirhead, Jonathan
2018-03-01
Memory is a critical component in today's system-on-chip (SoC) designs. Static random-access memory (SRAM) blocks are assembled by combining intellectual property (IP) blocks that come from SRAM libraries developed and certified by the foundries for both functionality and a specific process node. Customers place these SRAM IP in their designs, adjusting as necessary to achieve DRC-clean results. However, any changes a customer makes to these SRAM IP during implementation, whether intentionally or in error, can impact yield and functionality. Physical verification of SRAM has always been a challenge, because these blocks usually contain smaller feature sizes and spacing constraints compared to traditional logic or other layout structures. At advanced nodes, critical dimension becomes smaller and smaller, until there is almost no opportunity to use optical proximity correction (OPC) and lithography to adjust the manufacturing process to mitigate the effects of any changes. The smaller process geometries, reduced supply voltages, increasing process variation, and manufacturing uncertainty mean accurate SRAM physical verification results are not only reaching new levels of difficulty, but also new levels of criticality for design success. In this paper, we explore the use of pattern matching to create an SRAM verification flow that provides both accurate, comprehensive coverage of the required checks and visual output to enable faster, more accurate error debugging. Our results indicate that pattern matching can enable foundries to improve SRAM manufacturing yield, while allowing designers to benefit from SRAM verification kits that can shorten the time to market.
Tessera: Open source software for accelerated data science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sego, Landon H.; Hafen, Ryan P.; Director, Hannah M.
2014-06-30
Extracting useful, actionable information from data can be a formidable challenge for the safeguards, nonproliferation, and arms control verification communities. Data scientists are often on the “front-lines” of making sense of complex and large datasets. They require flexible tools that make it easy to rapidly reformat large datasets, interactively explore and visualize data, develop statistical algorithms, and validate their approaches, and they need to perform these activities with minimal lines of code. Existing commercial software solutions often lack extensibility and the flexibility required to address the nuances of the demanding and dynamic environments where data scientists work. To address this need, Pacific Northwest National Laboratory developed Tessera, an open source software suite designed to enable data scientists to interactively perform their craft at the terabyte scale. Tessera automatically manages the complicated tasks of distributed storage and computation, empowering data scientists to do what they do best: tackling critical research and mission objectives by deriving insight from data. We illustrate the use of Tessera with an example analysis of computer network data.
Development of NASA's Models and Simulations Standard
NASA Technical Reports Server (NTRS)
Bertch, William J.; Zang, Thomas A.; Steele, Martin J.
2008-01-01
Several NASA-wide actions were initiated as a result of the Space Shuttle Columbia Accident Investigation. One of these actions was to develop a standard for the development, documentation, and operation of Models and Simulations. Over the course of two-and-a-half years, a team of NASA engineers, representing nine of the ten NASA Centers, developed a Models and Simulations Standard to address this action. The standard consists of two parts. The first is the traditional requirements section addressing programmatics, development, documentation, verification, validation, and the reporting of results from both the M&S analysis and the examination of compliance with this standard. The second part is a scale for evaluating the credibility of model and simulation results using levels of merit associated with 8 key factors. This paper provides an historical account of the challenges faced by and the processes used in this committee-based development effort. This account provides insights into how other agencies might approach similar developments. Furthermore, we discuss some specific applications of models and simulations used to assess the impact of this standard on future model and simulation activities.
The purpose of this verification was a cut fiber challenge study for the Dow Chemical Company SFD-2880 UF membrane module. MS2 coliphage virus was the surrogate challenge organism. The challenge tests followed the requirements of the Department of Health Victoria (Australia) Dr...
Simulation based mask defect repair verification and disposition
NASA Astrophysics Data System (ADS)
Guo, Eric; Zhao, Shirley; Zhang, Skin; Qian, Sandy; Cheng, Guojie; Vikram, Abhishek; Li, Ling; Chen, Ye; Hsiang, Chingyun; Zhang, Gary; Su, Bo
2009-10-01
As the industry moves towards sub-65nm technology nodes, mask inspection, with increased sensitivity and shrinking critical defect size, catches more and more nuisance and false defects. Increased defect counts pose great challenges in post-inspection defect classification and disposition: which defects are real, and among the real defects, which should be repaired and how should the post-repair defects be verified. In this paper, we address the challenges in mask defect verification and disposition, in particular in post-repair defect verification, by an efficient methodology using SEM mask defect images and optical inspection mask defect images (the latter only for verification of phase and transmission related defects). We demonstrate the flow using programmed mask defects in a sub-65nm technology node design. In total, 20 types of defects were designed, including defects found in typical real circuit environments, with 30 different sizes designed for each type. The SEM image was taken for each programmed defect after the test mask was made. Selected defects were repaired and SEM images from the test mask were taken again. Wafers were printed with the test mask before and after repair as defect printability references. A software tool, SMDD (Simulation based Mask Defect Disposition), has been used in this study. The software is used to extract edges from the mask SEM images and convert them into polygons to save in GDSII format. Then, the converted polygons from the SEM images were filled with the correct tone to form mask patterns and were merged back into the original GDSII design file. This merge is needed for contour simulation, since the SEM images normally cover only a small area (~1 μm) and accurate simulation requires including a larger area to capture optical proximity effects. With a lithography process model, the resist contour in the area of interest (AOI, the area surrounding a mask defect) can be simulated. If such a complicated model is not available, a simple optical model can be used to get the simulated aerial image intensity in the AOI. With built-in contour analysis functions, the SMDD software can easily compare the contour (or intensity) differences between the defect pattern and the normal pattern. With user-provided judging criteria, the software can easily disposition the defect based on contour comparison. In addition, process sensitivity properties, like MEEF and NILS, can be readily obtained in the AOI with a lithography model, which will make mask defect disposition criteria more intelligent.
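The disposition step described above can be reduced, in the simplest case, to comparing a simulated critical dimension at the defect site against the defect-free reference and applying a user-provided criterion. The 10% threshold and CD values below are illustrative only.

```python
def disposition(cd_reference_nm, cd_with_defect_nm, max_relative_error=0.10):
    """Compare defect-site CD against the reference CD and apply a simple pass/repair criterion."""
    rel_err = abs(cd_with_defect_nm - cd_reference_nm) / cd_reference_nm
    verdict = "repair required" if rel_err > max_relative_error else "acceptable"
    return verdict, rel_err

print(disposition(65.0, 60.5))   # ('acceptable', ~0.069)
print(disposition(65.0, 55.0))   # ('repair required', ~0.154)
```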
The Challenge for Arms Control Verification in the Post-New START World
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wuest, C R
Nuclear weapon arms control treaty verification is a key aspect of any agreement between signatories to establish that the terms and conditions spelled out in the treaty are being met. Historically, arms control negotiations have focused more on the rules and protocols for reducing the numbers of warheads and delivery systems, sometimes resorting to complex and arcane procedures for counting forces, in an attempt to address perceived or real imbalances in a nation's strategic posture that could lead to instability. Verification procedures are generally defined in arms control treaties and supporting documents and tend to focus on technical means and measures designed to ensure that a country is following the terms of the treaty and that it is not liable to engage in deception or outright cheating in an attempt to circumvent the spirit and the letter of the agreement. As the Obama Administration implements the articles, terms, and conditions of the recently ratified and entered-into-force New START treaty, there are already efforts within and outside of government to move well below the specified New START levels of 1550 warheads, 700 deployed strategic delivery vehicles, and 800 deployed and nondeployed strategic launchers (Inter-Continental Ballistic Missile (ICBM) silos, Submarine-Launched Ballistic Missile (SLBM) tubes on submarines, and bombers). A number of articles and opinion pieces have appeared that advocate for significantly deeper cuts in the U.S. nuclear stockpile, with some suggesting that unilateral reductions on the part of the U.S. would help coax Russia and others to follow our lead. Papers and studies prepared for the U.S. Department of Defense and at the U.S. Air War College have also been published, suggesting that nuclear forces totaling no more than about 300 warheads would be sufficient to meet U.S. national security and deterrence needs. (Davis 2011, Schaub and Forsyth 2010) Recent articles by James M. Acton and others suggest that maintaining U.S. security and minimizing the chances of nuclear war, while deliberately reducing stockpiles to a few hundred weapons, is possible but not without risk. While the question of the appropriate level of cuts to U.S. nuclear forces is being actively debated, a key issue continues to be whether verification procedures are strong enough to ensure that both the U.S. and Russia are fulfilling their obligations under the current New START treaty and any future arms reduction treaties. A recent opinion piece by Henry Kissinger and Brent Scowcroft (2012) raised a number of issues with respect to governing a policy to enhance strategic stability, including: in deciding on force levels and lower numbers, verification is crucial. Particularly important is a determination of what level of uncertainty threatens the calculation of stability. At present, that level is well within the capabilities of the existing verification systems. We must be certain that projected levels maintain, and when possible reinforce, that confidence. The strengths and weaknesses of the New START verification regime should inform and give rise to stronger regimes for future arms control agreements. These future arms control agreements will likely need to include other nuclear weapons states and so any verification regime will need to be acceptable to all parties.
Currently, China is considered the most challenging party to include in any future arms control agreement, and China's willingness to enter into verification regimes such as those implemented in New START may only be possible when it feels it has reached nuclear parity with the U.S. and Russia. Similarly, in keeping with its goals of reaching peer status with the U.S. and Russia, Frieman (2004) suggests that China would be more willing to accept internationally accepted and applied verification regimes rather than bilateral ones. The current verification protocols specified in the New START treaty are considered as the baseline case and are contrasted with possible alternative verification protocols that could be effective in a post-New START era of significant reductions in U.S. and other countries' nuclear stockpiles. Of particular concern is the possibility of deception and breakout when declared and observed numbers of weapons are below the level considered to pose an existential threat to the U.S. In a regime of very low stockpile numbers, 'traditional' verification protocols as currently embodied in the New START treaty might prove less than adequate. I introduce and discuss a number of issues that need to be considered in future verification protocols, many of which do not have immediate solutions and so require further study. I also discuss alternatives and enhancements to traditional verification protocols, for example, confidence building measures such as burden sharing against the common threat of weapon of mass destruction (WMD) terrorism, and joint research and development.
Crowd-Sourced Help with Emergent Knowledge for Optimized Formal Verification (CHEKOFV)
2016-03-01
up game Binary Fission, which was deployed during Phase Two of CHEKOFV. Xylem: The Code of Plants is a casual game for players using mobile ...there are the design and engineering challenges of building a game infrastructure that integrates verification technology with crowd participation...the backend processes that annotate the originating software. Allowing players to construct their own equations opened up the flexibility to receive
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nekoogar, F; Dowla, F
An IAEA Technical Meeting on Techniques for IAEA Verification of Enrichment Activities identified 'smart tags' as a technology that should be assessed for tracking and locating UF6 cylinders. Although there is a vast commercial industry working on RFID systems, the vulnerabilities of commercial products are only beginning to emerge. Most commercial off-the-shelf (COTS) RFID systems operate in very narrow frequency bands, making them vulnerable to detection, jamming and tampering, and also presenting difficulties when used around metals (such as UF6 cylinders). Commercial passive RFID tags have short range, while active RFID tags that provide long range have limited lifetimes. There are also some concerns with the introduction of strong (narrowband) radio frequency signals around radioactive and nuclear materials. Given these shortcomings, commercial RFID systems in their current form do not offer a promising solution for continuous monitoring and tracking of UF6 cylinders. In this paper, we identify the key challenges faced by commercial RFID systems for monitoring UF6 cylinders, and introduce an ultra-wideband approach for tag/reader communications that addresses most of the identified challenges for IAEA safeguards applications.
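As a rough, hedged illustration of one point above (not taken from the paper), the sketch below compares the power spectral density of a narrowband link with that of an ultra-wideband link carrying the same average power; the bandwidths and power level are assumed values chosen only to show why spreading energy over a wide band lowers detectability and jamming susceptibility.

```python
import numpy as np

# Hypothetical numbers, not from the paper: same average transmit power
# spread over a narrow band versus an ultra-wide band.
tx_power_mw = 1.0                     # assumed average transmit power (mW)
narrowband_bw_hz = 500e3              # assumed narrowband channel (500 kHz)
uwb_bw_hz = 1.5e9                     # assumed UWB occupied bandwidth (1.5 GHz)

psd_narrow = tx_power_mw / narrowband_bw_hz   # mW per Hz
psd_uwb = tx_power_mw / uwb_bw_hz

print(f"Narrowband PSD: {10 * np.log10(psd_narrow):.1f} dBm/Hz")
print(f"UWB PSD:        {10 * np.log10(psd_uwb):.1f} dBm/Hz")
print(f"Reduction:      {10 * np.log10(psd_narrow / psd_uwb):.1f} dB")
```

With these assumed figures the UWB signal sits roughly 35 dB lower in spectral density, which is the intuition behind its lower visibility to narrowband receivers.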
Summary of the 2014 Sandia V&V Challenge Workshop
Schroeder, Benjamin B.; Hu, Kenneth T.; Mullins, Joshua Grady; ...
2016-02-19
A discussion of the five responses to the 2014 Sandia Verification and Validation (V&V) Challenge Problem, presented within this special issue, is provided hereafter. Overviews of the challenge problem workshop, workshop participants, and the problem statement are also included. Brief summations of the teams' responses to the challenge problem are provided. Issues that arose throughout the responses that are deemed applicable to the general verification, validation, and uncertainty quantification (VVUQ) community are the main focal point of this paper. The discussion is organized around a big-picture comparison of data and model usage, VVUQ activities, and the differentiating conceptual themes behind the teams' VVUQ strategies. Significant differences are noted in the teams' approaches toward all VVUQ activities, and those deemed most relevant are discussed. Beyond the specific details of VVUQ implementations, thematic concepts are found to create differences among the approaches; some of the major themes are discussed. Lastly, an encapsulation of the key contributions, the lessons learned, and advice for the future is presented.
Application of an ADS-B Sense and Avoid Algorithm
NASA Technical Reports Server (NTRS)
Arteaga, Ricardo; Kotcher, Robert; Cavalin, Moshe; Dandachy, Mohammed
2016-01-01
The National Aeronautics and Space Administration Armstrong Flight Research Center in Edwards, California is leading a program aimed at integrating unmanned aircraft systems into the national airspace system (UAS in the NAS). The overarching goal of the program is to reduce technical barriers associated with related safety issues, as well as to address challenges that will allow UAS routine access to the national airspace. This research paper focuses on three novel ideas: (1) the design of an integrated UAS equipped with Automatic Dependent Surveillance-Broadcast that constructs a more accurate state-based airspace model; (2) the use of the Stratway algorithm in a real-time environment; and (3) the verification and validation of sense-and-avoid performance and usability test results, which provide a pilot's perspective on how our system will benefit the UAS in the NAS program for both piloted and unmanned aircraft.
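To make the flavor of a state-based sense-and-avoid check concrete, here is a hedged sketch of a simple closest-point-of-approach test driven by ADS-B-style position and velocity vectors. It is not the Stratway algorithm; the separation threshold and look-ahead window are illustrative assumptions, not regulatory values.

```python
import numpy as np

def cpa_conflict(own_pos, own_vel, intruder_pos, intruder_vel,
                 horiz_sep_m=9260.0, lookahead_s=120.0):
    """Simplified 2-D closest-point-of-approach check from ADS-B-style
    state vectors (positions in metres, velocities in m/s).
    Thresholds are illustrative, not regulatory values."""
    rel_p = np.asarray(intruder_pos, float) - np.asarray(own_pos, float)
    rel_v = np.asarray(intruder_vel, float) - np.asarray(own_vel, float)
    v2 = rel_v @ rel_v
    # Time of closest approach, clipped to the look-ahead window.
    t_cpa = 0.0 if v2 == 0 else float(np.clip(-(rel_p @ rel_v) / v2,
                                              0.0, lookahead_s))
    miss_distance = float(np.linalg.norm(rel_p + rel_v * t_cpa))
    return miss_distance < horiz_sep_m, t_cpa, miss_distance

# Head-on traffic example: a conflict is flagged well before loss of separation.
print(cpa_conflict(own_pos=(0, 0), own_vel=(120, 0),
                   intruder_pos=(30000, 500), intruder_vel=(-110, 0)))
```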
Some remarks relating to Short Notice Random Inspection (SNRI) and verification of flow strata
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphey, W.; Emeigh, C.; Lessler, L.
1991-01-01
Short Notice Random Inspection (SNRI) is a concept intended to enable the International Atomic Energy Agency (Agency) to make technically valid statements of verification of shipment or receipt strata when the Agency cannot have a resident inspector. Gordon and Sanborn addressed this problem for a centrifuge enrichment plant. In this paper, other operating conditions of interest are examined and modifications of the necessary conditions for application of SNRI are discussed.
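As an illustration of the kind of technically valid statement random inspection supports (a worked example, not taken from the paper), the sketch below computes the hypergeometric probability that a short-notice random sample catches at least one falsified item in a declared stratum; all counts are assumed.

```python
from math import comb

def detection_probability(population, falsified, sampled):
    """Probability that random verification of `sampled` items out of
    `population` catches at least one of `falsified` defective items
    (hypergeometric 'at least one' probability)."""
    p_miss = comb(population - falsified, sampled) / comb(population, sampled)
    return 1.0 - p_miss

# Illustrative numbers only: 200 declared items in a shipment stratum,
# 10 hypothetically falsified, 30 verified during a short-notice visit.
print(f"P(detect at least one) = {detection_probability(200, 10, 30):.3f}")
```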
A Research Program in Computer Technology
1976-07-01
PROGRAM VERIFICATION [Shaw76b] Shaw, M., W. A. Wulf, and R. L. London, Abstraction and Verification in Alphard: Iteration and Generators...millisecond frame of speech: pitch, gain, and 10 k-parameters (often called reflection coefficients). The 12 parameters from each frame are encoded into...del Rey, CA 90291; Program Code 3D30 & 3P10; Controlling Office Name and Address: Defense Advanced Research Projects Agency; Report Date: July 1976; 1400
Consolidated Site (CS) 022 Verification Survey at Former McClellan AFB, Sacramento, California
2015-03-31
Report type: Consultative Letter. Dates covered: July 2014 – December 2014. ...the U.S. Air Force Radioisotope Committee Secretariat (RICS), the U.S. Air Force School of Aerospace Medicine, Consultative Services Division
Consolidated Site (CS) 024 Verification Survey at Former McClellan AFB, Sacramento, California
2015-03-31
Report type: Consultative Letter. Dates covered: July 2014 – December 2014. ...the U.S. Air Force Radioisotope Committee Secretariat (RICS), the U.S. Air Force School of Aerospace Medicine, Consultative Services Division
NASA Technical Reports Server (NTRS)
Luck, Rogelio; Ray, Asok
1990-01-01
The implementation and verification of the delay-compensation algorithm are addressed. The delay compensator has been experimentally verified at an IEEE 802.4 network testbed for velocity control of a DC servomotor. The performance of the delay-compensation algorithm was also examined by combined discrete-event and continuous-time simulation of the flight control system of an advanced aircraft that uses the SAE (Society of Automotive Engineers) linear token passing bus for data communications.
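For readers unfamiliar with delay compensation in networked control, the following is a hedged sketch of one common structure (a Smith-predictor-style compensator) applied to an assumed first-order plant with an assumed fixed network delay; it is not the authors' algorithm or their IEEE 802.4 testbed.

```python
from collections import deque

# Minimal sketch, assuming a first-order plant and a fixed 5-sample network
# delay (not the authors' algorithm): the controller acts on an undelayed
# plant model, corrected by the mismatch between the delayed measurement and
# the equally delayed model output.
a, b = 0.9, 0.1          # assumed plant: y[k+1] = a*y[k] + b*u[k]
delay = 5                # assumed network delay in samples
kp = 2.0                 # assumed proportional gain
setpoint = 1.0

y_plant, y_model = 0.0, 0.0
meas_buf = deque([0.0] * delay, maxlen=delay)    # delayed plant measurements
model_buf = deque([0.0] * delay, maxlen=delay)   # equally delayed model output

for _ in range(60):
    y_meas_delayed = meas_buf[0]
    y_model_delayed = model_buf[0]
    # With a perfect model the correction term vanishes and the controller
    # effectively sees an undelayed plant.
    y_predicted = y_model + (y_meas_delayed - y_model_delayed)
    u = kp * (setpoint - y_predicted)
    y_model = a * y_model + b * u
    y_plant = a * y_plant + b * u
    meas_buf.append(y_plant)
    model_buf.append(y_model)

# Proportional-only control leaves a steady-state offset; the point here is
# that the loop remains well behaved despite the measurement delay.
print(f"plant output after 60 steps: {y_plant:.3f}")
```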
The SAIL Databank: building a national architecture for e-health research and evaluation.
Ford, David V; Jones, Kerina H; Verplancke, Jean-Philippe; Lyons, Ronan A; John, Gareth; Brown, Ginevra; Brooks, Caroline J; Thompson, Simon; Bodger, Owen; Couch, Tony; Leake, Ken
2009-09-04
Vast quantities of electronic data are collected about patients and service users as they pass through health service and other public sector organisations, and these data present enormous potential for research and policy evaluation. The Health Information Research Unit (HIRU) aims to realise the potential of electronically-held, person-based, routinely-collected data to conduct and support health-related studies. However, there are considerable challenges that must be addressed before such data can be used for these purposes, to ensure compliance with the legislation and guidelines generally known as Information Governance. A set of objectives was identified to address the challenges and establish the Secure Anonymised Information Linkage (SAIL) system in accordance with Information Governance. These were to: 1) ensure data transportation is secure; 2) operate a reliable record matching technique to enable accurate record linkage across datasets; 3) anonymise and encrypt the data to prevent re-identification of individuals; 4) apply measures to address disclosure risk in data views created for researchers; 5) ensure data access is controlled and authorised; 6) establish methods for scrutinising proposals for data utilisation and approving output; and 7) gain external verification of compliance with Information Governance. The SAIL databank has been established and it operates on a DB2 platform (Data Warehouse Edition on AIX) running on an IBM 'P' series Supercomputer: Blue-C. The findings of an independent internal audit were favourable and concluded that the systems in place provide adequate assurance of compliance with Information Governance. This expanding databank already holds over 500 million anonymised and encrypted individual-level records from a range of sources relevant to health and well-being. This includes national datasets covering the whole of Wales (approximately 3 million population) and local provider-level datasets, with further growth in progress. The utility of the databank is demonstrated by increasing engagement in high quality research studies. Through the pragmatic approach that has been adopted, we have been able to address the key challenges in establishing a national databank of anonymised person-based records, so that the data are available for research and evaluation whilst meeting the requirements of Information Governance.
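Two of the objectives listed above, reliable record matching and anonymisation, can be illustrated with a hedged sketch of keyed-hash pseudonymisation: records about the same person receive the same stable pseudonym and can be linked after identifiers are removed, while the raw identifiers cannot be recovered from the stored value. The identifiers and field choices below are hypothetical and are not SAIL's actual linkage fields or procedure.

```python
import hashlib
import hmac

SECRET_KEY = b"held-by-trusted-third-party"   # hypothetical linkage key

def anonymised_linkage_id(nhs_number: str, date_of_birth: str) -> str:
    """Derive a stable pseudonym from identifiers using a keyed hash, so
    records about the same person link across datasets while the raw
    identifiers cannot be recovered from the stored value."""
    material = f"{nhs_number.strip()}|{date_of_birth.strip()}".encode()
    return hmac.new(SECRET_KEY, material, hashlib.sha256).hexdigest()

# Two source records for the same (fictitious) person receive the same
# pseudonym and can therefore be joined after identifiers are removed.
gp_record = anonymised_linkage_id("9434765919", "1970-01-31")
hospital_record = anonymised_linkage_id("9434765919 ", "1970-01-31")
print(gp_record == hospital_record)   # True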
Flood Forecasting in Wales: Challenges and Solutions
NASA Astrophysics Data System (ADS)
How, Andrew; Williams, Christopher
2015-04-01
With steep, fast-responding river catchments, exposed coastal reaches with large tidal ranges, and large population densities in some of the most at-risk areas, flood forecasting in Wales presents many varied challenges. Utilising advances in computing power and learning from best practice within the United Kingdom and abroad has brought significant improvements in recent years; however, many challenges still remain. Developments in computing and increased processing power come with a significant price tag; greater numbers of data sources and ensemble feeds bring a better understanding of uncertainty, but the wealth of data needs careful management to ensure a clear message of risk is disseminated; new modelling techniques utilise better and faster computation, but lack the history of record and experience gained from the continued use of more established forecasting models. As a flood forecasting team, we work to develop coastal and fluvial forecasting models, set them up for operational use and manage the duty role that runs the models in real time. An overview of our current operational flood forecasting system will be presented, along with a discussion of some of the solutions we have in place to address the challenges we face. These include: • real-time updating of fluvial models • rainfall forecasting verification • ensemble forecast data • longer range forecast data • contingency models • offshore to nearshore wave transformation • calculation of wave overtopping
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matloch, L.; Vaccaro, S.; Couland, M.
The back end of the nuclear fuel cycle continues to develop. The European Commission, particularly the Nuclear Safeguards Directorate of the Directorate General for Energy, implements Euratom safeguards and needs to adapt to this situation. The verification methods for spent nuclear fuel, which EURATOM inspectors can use, require continuous improvement. Whereas the Euratom on-site laboratories provide accurate verification results for fuel undergoing reprocessing, the situation is different for spent fuel which is destined for final storage. In particular, new needs arise from the increasing number of cask loadings for interim dry storage and the advanced plans for the construction of encapsulation plants and geological repositories. Various scenarios present verification challenges. In this context, EURATOM Safeguards, often in cooperation with other stakeholders, is committed to further improvement of NDA methods for spent fuel verification. In this effort EURATOM plays various roles, ranging from definition of inspection needs to direct participation in development of measurement systems, including support of research in the framework of international agreements and via the EC Support Program to the IAEA. This paper presents recent progress in selected NDA methods. These methods have been conceived to satisfy different spent fuel verification needs, ranging from attribute testing to pin-level partial defect verification. (authors)
NASA Astrophysics Data System (ADS)
Cavalcante, S. F. A.; de Paula, R. L.; Kitagawa, D. A. S.; Barcellos, M. C.; Simas, A. B. C.; Granjeiro, J. M.
2018-03-01
This paper deals with the challenges that the Brazilian Army Organic Synthesis Laboratory has faced in accessing reference compounds related to the Chemical Weapons Convention, in order to support verification analyses and research into novel antidotes. Some synthetic procedures to produce the chemicals, as well as Quality Assurance issues and a brief introduction to the international agreements banning chemical weapons, are also presented.
Component Verification and Certification in NASA Missions
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)
2001-01-01
Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.
Verification of Faulty Message Passing Systems with Continuous State Space in PVS
NASA Technical Reports Server (NTRS)
Pilotto, Concetta; White, Jerome
2010-01-01
We present a library of Prototype Verification System (PVS) meta-theories that verifies a class of distributed systems in which agent communication is through message passing. The theoretical work consists of iterative schemes for solving systems of linear equations, such as message-passing extensions of the Gauss and Gauss-Seidel methods. We briefly review that work and discuss the challenges in formally verifying it.
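To make the object of verification concrete, here is a hedged Python sketch of the kind of iterative scheme described: a message-passing Jacobi iteration in which each agent updates its own unknown from the other agents' most recent messages. The PVS theories themselves are not reproduced; this is only an executable analogue.

```python
import numpy as np

def message_passing_jacobi(A, b, sweeps=100):
    """Each agent i owns unknown x[i]; at every round it 'receives' the
    other agents' latest values and updates its own from row i of A x = b.
    This is plain Jacobi iteration written in message-passing style."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(sweeps):
        inbox = x.copy()                      # messages from the last round
        for i in range(n):
            others = sum(A[i, j] * inbox[j] for j in range(n) if j != i)
            x[i] = (b[i] - others) / A[i, i]
    return x

# Diagonally dominant system, for which Jacobi is known to converge.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 5.0, 2.0],
              [0.0, 2.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])
print(message_passing_jacobi(A, b), np.linalg.solve(A, b))
```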
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giurgiutiu, Victor; Mendez Torres, Adrian E.
2013-07-01
Radioactive waste systems and structures (RWSS) are safety-critical facilities in need of monitoring over prolonged periods of time. Structural health monitoring (SHM) is an emerging technology that aims at monitoring the state of a structure through the use of networks of permanently mounted sensors. SHM technologies have been developed primarily within the aerospace and civil engineering communities. This paper addresses the issue of transitioning the SHM concept to the monitoring of RWSS and evaluates the opportunities and challenges associated with this process. Guided wave SHM technologies utilizing structurally-mounted piezoelectric wafer active sensors (PWAS) have a wide range of applications based on both propagating-wave and standing-wave methodologies. Hence, opportunities exist for transitioning these SHM technologies into RWSS monitoring. However, there exist certain special operational conditions specific to RWSS such as: radiation field, caustic environments, marine environments, and chemical, mechanical and thermal stressors. In order to address the high discharge of used nuclear fuel (UNF) and the limited space in U.S. storage pools, the Department of Energy (DOE) has adopted a 'Strategy for the Management and Disposal of Used Nuclear Fuel and High-Level Radioactive Waste' (January 2013). This strategy endorses the key principles that underpin the recommendations of the Blue Ribbon Commission on America's Nuclear Future to develop a sustainable program for deploying an integrated system capable of transporting, storing, and disposing of UNF and high-level radioactive waste from civilian nuclear power generation, defense, national security, and other activities. This will require research to develop monitoring, diagnosis, and prognosis tools that can help establish a strong technical basis for extended storage and transportation of UNF. Monitoring of such structures is critical for assuring the safety and security of the nation's spent nuclear fuel until a national policy for closure of the nuclear fuel cycle is defined and implemented. In addition, such tools can provide invaluable and timely information for verification of the predicted mechanical performance of RWSS (e.g. concrete or steel barriers) during off-normal occurrences and accident events such as the tsunami and earthquake that affected the Fukushima Daiichi nuclear power plant. The ability to verify the condition, health, and degradation behavior of RWSS over time by applying nondestructive testing (NDT), as well as the development of nondestructive evaluation (NDE) tools for new degradation processes, will become challenging. The paper discusses some of the challenges associated with verification and diagnosis for RWSS and identifies SHM technologies which are more readily available for transitioning into RWSS applications. Fundamental research objectives that should be considered for the transition of SHM technologies (e.g., radiation-hardened piezoelectric materials) for RWSS applications are discussed. The paper ends with a summary, conclusions, and suggestions for further work. (authors)
Model based verification of the Secure Socket Layer (SSL) Protocol for NASA systems
NASA Technical Reports Server (NTRS)
Powell, John D.; Gilliam, David
2004-01-01
The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers formal verification of information technology (IT), through the creation of a Software Security Assessment Instrument (SSAI), to address software security risks.
Structural Margins Assessment Approach
NASA Technical Reports Server (NTRS)
Ryan, Robert S.
1988-01-01
A general approach to the structural design and verification used to determine the structural margins of the space vehicle elements under Marshall Space Flight Center (MSFC) management is described. The Space Shuttle results and organization will be used as illustrations for the techniques discussed. Also given are (1) the system analyses performed or to be performed and (2) the element analyses performed by MSFC and its contractors. Analysis approaches and their verification will be addressed. The Shuttle procedures are general in nature and apply to space vehicles other than the Shuttle.
Technology Transfer Challenges for High-Assurance Software Engineering Tools
NASA Technical Reports Server (NTRS)
Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.
2003-01-01
In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.
NASA Technical Reports Server (NTRS)
Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio
1992-01-01
This paper covers verification and protocol validation for distributed computer and communication systems using a computer-aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications which pass conformance testing are then checked to see whether they can operate together; this is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling for inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement to current testing methods by using the proposed model, including the formulation of new qualitative and quantitative measures and time-dependent behavior; and (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.
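The basic idea of conformance testing can be illustrated with a hedged sketch that is much simpler than the inter-layer model proposed in the paper: drive a specification finite-state machine and an implementation with the same input sequences and report any divergence in outputs. The toy stop-and-wait protocol below is an assumption made for illustration only.

```python
# Minimal conformance-testing sketch (not the paper's inter-layer model):
# compare an implementation's outputs against a specification FSM.
SPEC = {  # (state, input) -> (next_state, output) for a toy stop-and-wait sender
    ("idle", "send"):        ("wait_ack", "DATA0"),
    ("wait_ack", "ack"):     ("idle", None),
    ("wait_ack", "timeout"): ("wait_ack", "DATA0"),   # retransmit
}

def run(fsm, inputs, start="idle"):
    state, outputs = start, []
    for symbol in inputs:
        state, out = fsm[(state, symbol)]
        outputs.append(out)
    return outputs

def conforms(implementation_fsm, test_sequences):
    return all(run(SPEC, seq) == run(implementation_fsm, seq)
               for seq in test_sequences)

# A faulty implementation that forgets to retransmit on timeout.
FAULTY = dict(SPEC)
FAULTY[("wait_ack", "timeout")] = ("wait_ack", None)

tests = [["send", "ack"], ["send", "timeout", "ack"]]
print(conforms(SPEC, tests), conforms(FAULTY, tests))   # True False
```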
sbv IMPROVER: Modern Approach to Systems Biology.
Guryanova, Svetlana; Guryanova, Anna
2017-01-01
The increasing amount and variety of data in biosciences call for innovative methods of visualization, scientific verification, and pathway analysis. Novel approaches to biological networks and research quality control are important because of their role in development of new products, improvement, and acceleration of existing health policies and research for novel ways of solving scientific challenges. One such approach is sbv IMPROVER. It is a platform that uses crowdsourcing and verification to create biological networks with easy public access. It contains 120 networks built in Biological Expression Language (BEL) to interpret data from PubMed articles with high-quality verification available for free on the CBN database. Computable, human-readable biological networks with a structured syntax are a powerful way of representing biological information generated from high-density data. This article presents sbv IMPROVER, a crowd-verification approach for the visualization and expansion of biological networks.
Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder
NASA Technical Reports Server (NTRS)
Lindsey, A. E.; Pecheur, Charles
2004-01-01
AI software is often used as a means for providing greater autonomy to automated systems, capable of coping with harsh and unpredictable environments. Due in part to the enormous space of possible situations that they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state space exploration algorithms to an instrumented testbed, consisting of the controller embedded in a simulated operating environment. Although LPF has focused on NASA's Livingstone model-based diagnosis system applications, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem for a prototype space vehicle.
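The general idea of exploring the joint state space of a controller embedded in a simulated environment can be shown with a small hedged sketch; it is not LPF itself, and the tank-level controller and environment are invented for illustration.

```python
from collections import deque

# Toy controller embedded in a simulated, nondeterministic environment and
# explored exhaustively, in the spirit of testbed state-space exploration.
def controller(tank_level):
    # Deliberately buggy policy: keeps filling until the tank is nearly full.
    return "open_valve" if tank_level < 4 else "close_valve"

def environment(tank_level, action, disturbance):
    delta = (1 if action == "open_valve" else -1) + disturbance
    return max(0, min(5, tank_level + delta))

def explore(initial_level=3, overflow=5):
    seen, frontier, violations = set(), deque([initial_level]), set()
    while frontier:
        level = frontier.popleft()
        if level in seen:
            continue
        seen.add(level)
        if level >= overflow:
            violations.add(level)       # safety property: tank never overflows
        action = controller(level)
        for disturbance in (-1, 0, 1):  # nondeterministic environment behavior
            frontier.append(environment(level, action, disturbance))
    return sorted(seen), sorted(violations)

# The nominal run (disturbance 0) never overflows, but exhaustive exploration
# of all disturbances finds the violating state.
print(explore())
```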
Biometrics based authentication scheme for session initiation protocol.
Xie, Qi; Tang, Zhixiong
2016-01-01
Many two-factor challenge-response-based session initiation protocol (SIP) authentication schemes have been proposed, but most of them are vulnerable to smart-card-stolen attacks and password-guessing attacks. In this paper, we propose a novel three-factor SIP authentication scheme using biometrics, password and smart card, and utilize the pi-calculus-based formal verification tool ProVerif to prove that the proposed protocol achieves security and authentication. Furthermore, our protocol is highly efficient when compared to other related protocols.
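For readers unfamiliar with the challenge-response idea underlying such schemes, the following hedged sketch shows only the bare mechanism (not the paper's three-factor construction): the server issues a fresh nonce and the client proves knowledge of a shared secret with a keyed hash, so a replayed response fails against a new challenge. The secret shown is a placeholder.

```python
import hashlib
import hmac
import os

# Minimal challenge-response sketch (NOT the paper's three-factor scheme).
shared_secret = b"derived-from-password-and-card"   # placeholder secret

def server_challenge() -> bytes:
    return os.urandom(16)                            # fresh nonce

def client_response(nonce: bytes) -> bytes:
    return hmac.new(shared_secret, nonce, hashlib.sha256).digest()

def server_verify(nonce: bytes, response: bytes) -> bool:
    expected = hmac.new(shared_secret, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

nonce = server_challenge()
resp = client_response(nonce)
print(server_verify(nonce, resp))                  # True
print(server_verify(server_challenge(), resp))     # False: replay rejected
```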
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bunch, Kyle J.; Williams, Laura S.; Jones, Anthony M.
The 2010 ratification of the New START Treaty has been widely regarded as a noteworthy national security achievement for both the Obama administration and the Medvedev-Putin regime, but deeper cuts are envisioned under future arms control regimes. Future verification needs will include monitoring the storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondaries, and other materials. From both the diplomatic and technical perspectives, verification under future arms control regimes will pose new challenges. Since acceptable verification technology must protect sensitive design information and attributes, non-nuclear, non-sensitive signatures may provide a significant verification tool without the use of additional information barriers. The use of electromagnetic signatures to monitor nuclear material storage containers is a promising technology with the potential to fulfill these challenging requirements. Research performed at Pacific Northwest National Laboratory (PNNL) has demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to confirm the presence of specific components on a “yes/no” basis without revealing classified information. Arms control inspectors might use this technique to verify the presence or absence of monitored items, including both nuclear and non-nuclear materials. Although additional research is needed to study signature aspects such as uniqueness and investigate container-specific scenarios, the technique potentially offers a rapid and cost-effective tool to verify reduction and dismantlement of U.S. and Russian nuclear weapons.
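A hedged sketch of the yes/no comparison idea follows: correlate a measured low-frequency response against a stored reference signature and release only a pass/fail result. The "signatures" are synthetic curves invented for illustration; this is not PNNL's measurement method.

```python
import numpy as np

rng = np.random.default_rng(0)

def yes_no_match(measured, reference, threshold=0.95):
    """Return only a pass/fail result from comparing a measured broadband
    response against a stored reference signature (normalized correlation),
    so no detailed signature data need leave the instrument."""
    m = (measured - measured.mean()) / measured.std()
    r = (reference - reference.mean()) / reference.std()
    score = float(np.dot(m, r)) / len(m)
    return score >= threshold

# Synthetic 'signatures': frequency-response magnitudes at 64 test frequencies.
reference = np.sin(np.linspace(0, 3 * np.pi, 64)) + 2.0
same_item = reference + 0.02 * rng.standard_normal(64)
other_item = np.cos(np.linspace(0, 3 * np.pi, 64)) + 2.0

print(yes_no_match(same_item, reference))    # expected: True
print(yes_no_match(other_item, reference))   # expected: False
```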
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawrence, Chris C.; Flaska, Marek; Pozzi, Sara A.
2016-08-14
Verification of future warhead-dismantlement treaties will require detection of certain warhead attributes without the disclosure of sensitive design information, and this presents an unusual measurement challenge. Neutron spectroscopy—commonly eschewed as an ill-posed inverse problem—may hold special advantages for warhead verification by virtue of its insensitivity to certain neutron-source parameters like plutonium isotopics. In this article, we investigate the usefulness of unfolded neutron spectra obtained from organic-scintillator data for verifying a particular treaty-relevant warhead attribute: the presence of high-explosive and neutron-reflecting materials. Toward this end, several improvements on current unfolding capabilities are demonstrated: deuterated detectors are shown to have superior response-matrix condition to that of standard hydrogen-based scintillators; a novel data-discretization scheme is proposed which removes important detector nonlinearities; and a technique is described for re-parameterizing the unfolding problem in order to constrain the parameter space of solutions sought, sidestepping the inverse problem altogether. These improvements are demonstrated with trial measurements and verified using accelerator-based time-of-flight calculation of reference spectra. Then, a demonstration is presented in which the elemental compositions of low-Z neutron-attenuating materials are estimated to within 10%. These techniques could have direct application in verifying the presence of high-explosive materials in a neutron-emitting test item, as well as for other treaty verification challenges.
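The generic unfolding step that the article improves upon can be sketched with a hedged toy example: recover a coarse neutron spectrum from measured channels by non-negative least squares against a response matrix. The response matrix, binning, and data below are synthetic and are not taken from the article.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)

# Synthetic example only: a smooth, well-conditioned 'response matrix' R maps
# a coarse 6-bin neutron spectrum to 12 measured pulse-height channels.
n_channels, n_bins = 12, 6
centers = np.linspace(0.5, 6.0, n_bins)
channels = np.linspace(0.5, 6.0, n_channels)
R = np.exp(-((channels[:, None] - centers[None, :]) ** 2) / 0.8)

true_spectrum = np.array([0.1, 0.8, 1.5, 1.0, 0.4, 0.1])
measured = R @ true_spectrum + 0.01 * rng.standard_normal(n_channels)

# Non-negative least squares enforces the physical constraint phi >= 0,
# one simple way to tame the ill-posedness mentioned above.
unfolded, residual = nnls(R, measured)
print(np.round(unfolded, 2), np.round(true_spectrum, 2))
```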
Credit Card Fraud Detection: A Realistic Modeling and a Novel Learning Strategy.
Dal Pozzolo, Andrea; Boracchi, Giacomo; Caelen, Olivier; Alippi, Cesare; Bontempi, Gianluca
2017-09-14
Detecting frauds in credit card transactions is perhaps one of the best testbeds for computational intelligence algorithms. In fact, this problem involves a number of relevant challenges, namely: concept drift (customers' habits evolve and fraudsters change their strategies over time), class imbalance (genuine transactions far outnumber frauds), and verification latency (only a small set of transactions are timely checked by investigators). However, the vast majority of learning algorithms that have been proposed for fraud detection rely on assumptions that hardly hold in a real-world fraud-detection system (FDS). This lack of realism concerns two main aspects: 1) the way and timing with which supervised information is provided and 2) the measures used to assess fraud-detection performance. This paper has three major contributions. First, we propose, with the help of our industrial partner, a formalization of the fraud-detection problem that realistically describes the operating conditions of FDSs that analyze massive streams of credit card transactions every day. We also illustrate the most appropriate performance measures to be used for fraud-detection purposes. Second, we design and assess a novel learning strategy that effectively addresses class imbalance, concept drift, and verification latency. Third, in our experiments, we demonstrate the impact of class imbalance and concept drift in a real-world data stream containing more than 75 million transactions, authorized over a time window of three years.
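One standard way to handle the class-imbalance aspect highlighted above is cost-sensitive learning. The hedged sketch below contrasts unweighted and class-weighted logistic regression on a synthetic imbalanced stream; it is not the authors' learning strategy, which additionally addresses drift and verification latency, and the data are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic, highly imbalanced 'transactions' (about 0.5% fraud), standing in
# for the massive real stream described in the paper.
n = 20000
X = rng.standard_normal((n, 4))
risk = X[:, 0] + 1.5 * X[:, 1] + 0.5 * rng.standard_normal(n)
y = (risk > np.quantile(risk, 0.995)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

for weighting in (None, "balanced"):
    clf = LogisticRegression(class_weight=weighting, max_iter=1000)
    clf.fit(X_tr, y_tr)
    y_hat = clf.predict(X_te)
    # At the default threshold the unweighted model catches almost no frauds.
    print(f"class_weight={weighting}: "
          f"precision={precision_score(y_te, y_hat, zero_division=0):.2f} "
          f"recall={recall_score(y_te, y_hat, zero_division=0):.2f}")
```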
NASA Technical Reports Server (NTRS)
Prill, Mark E.
2005-01-01
The overall presentation was focused on providing, for the Verification, Validation, and Accreditation (VV&A) session audience, a snapshot review of the Exploration Systems Mission Directorate's (ESMD) investigation into implementation of a modeling and simulation (M&S) VV&A program. The presentation provides some legacy ESMD reference material, including information on the then-current organizational structure and the M&S (Simulation Based Acquisition (SBA)) focus contained therein, to provide context for the proposed M&S VV&A approach. This reference material briefly highlights the SBA goals and objectives, and outlines FY05 M&S development and implementation consistent with the Subjective Assessment, Constructive Assessment, Operator-in-the-Loop Assessment, Hardware-in-the-Loop Assessment, and In-Service Operations Assessment M&S construct, the NASA Exploration Information Ontology Model (NExIOM) data model, and integration with the Windchill-based Integrated Collaborative Environment (ICE). The presentation then addresses the ESMD team's initial conclusions regarding an M&S VV&A program, summarizes the general VV&A implementation approach anticipated, and outlines some of the recognized VV&A program challenges, all within the broader context of the overarching Integrated Modeling and Simulation (IM&S) environment at both the ESMD and Agency (NASA) levels. The presentation concludes with a status report on the current M&S organization's progress to date relative to the recommended IM&S implementation activity.
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2007-01-01
This report presents the mechanical verification of a simplified model of a rapid Byzantine-fault-tolerant self-stabilizing protocol for distributed clock synchronization systems. This protocol does not rely on any assumptions about the initial state of the system. This protocol tolerates bursts of transient failures, and deterministically converges within a time bound that is a linear function of the self-stabilization period. A simplified model of the protocol is verified using the Symbolic Model Verifier (SMV) [SMV]. The system under study consists of 4 nodes, where at most one of the nodes is assumed to be Byzantine faulty. The model checking effort is focused on verifying correctness of the simplified model of the protocol in the presence of a permanent Byzantine fault as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period. Although model checking results of the simplified model of the protocol confirm the theoretical predictions, these results do not necessarily confirm that the protocol solves the general case of this problem. Modeling challenges of the protocol and the system are addressed. A number of abstractions are utilized in order to reduce the state space. Also, additional innovative state space reduction techniques are introduced that can be used in future verification efforts applied to this and other protocols.
Towards a New Architecture for Autonomous Data Collection
NASA Astrophysics Data System (ADS)
Tanzi, T. J.; Roudier, Y.; Apvrille, L.
2015-08-01
A new generation of UAVs is coming that will help improve the situational awareness and assessment necessary to ensure quality data collection, especially in difficult conditions like natural disasters. Operators should be relieved of time-consuming data collection tasks as much as possible; at the same time, UAVs should assist data collection operations through more insightful and automated guidance enabled by advanced sensing capabilities. In order to achieve this vision, however, two challenges must be addressed. The first is to achieve sufficient autonomy, both in terms of navigation and of interpretation of the sensed data. The second relates to the reliability of the UAV with respect to accidental (safety) or even malicious (security) risks. This, however, requires the design and development of new embedded architectures that make drones more autonomous while mitigating the harm they may potentially cause. We claim that the increased complexity and flexibility of such platforms requires resorting to modelling, simulation, or formal verification techniques in order to validate such critical aspects of the platform. This paper first discusses the potential of and challenges faced by autonomous UAVs for data acquisition. The design of a flexible and adaptable embedded UAV architecture is then addressed. Finally, the need for validating the properties of the platform is discussed. Our approach is sketched and illustrated with the example of a lightweight drone performing 3D reconstructions from the combination of 2D image acquisition and a specific motion control.
Advanced Extra-Vehicular Activity Pressure Garment Requirements Development
NASA Technical Reports Server (NTRS)
Ross, Amy; Aitchison, Lindsay; Rhodes, Richard
2015-01-01
The NASA Johnson Space Center advanced pressure garment technology development team is addressing requirements development for exploration missions. Lessons learned from the Z-2 high fidelity prototype development have reiterated that clear low-level requirements and verification methods reduce risk to the government, improve efficiency in pressure garment design efforts, and enable the government to be a smart buyer. The expectation is to provide requirements at the specification level that are validated so that their impact on pressure garment design is understood. Additionally, the team will provide defined verification protocols for the requirements. However, in reviewing exploration space suit high level requirements there are several gaps in the team's ability to define and verify related lower level requirements. This paper addresses the efforts in requirement areas such as mobility/fit/comfort and environmental protection (dust, radiation, plasma, secondary impacts) to determine the method by which the requirements can be defined and use of those methods for verification. Gaps exist at various stages. In some cases component level work is underway, but no system level effort has begun; in other cases no effort has been initiated to close the gap. Status of on-going efforts and potential approaches to open gaps are discussed.
Test and Verification Approach for the NASA Constellation Program
NASA Technical Reports Server (NTRS)
Strong, Edward
2008-01-01
This viewgraph presentation describes a test and verification approach for the NASA Constellation Program. The contents include: 1) The Vision for Space Exploration: Foundations for Exploration; 2) Constellation Program Fleet of Vehicles; 3) Exploration Roadmap; 4) Constellation Vehicle Approximate Size Comparison; 5) Ares I Elements; 6) Orion Elements; 7) Ares V Elements; 8) Lunar Lander; 9) Map of Constellation content across NASA; 10) CxP T&V Implementation; 11) Challenges in CxP T&V Program; 12) T&V Strategic Emphasis and Key Tenets; 13) CxP T&V Mission & Vision; 14) Constellation Program Organization; 15) Test and Evaluation Organization; 16) CxP Requirements Flowdown; 17) CxP Model Based Systems Engineering Approach; 18) CxP Verification Planning Documents; 19) Environmental Testing; 20) Scope of CxP Verification; 21) CxP Verification - General Process Flow; 22) Avionics and Software Integrated Testing Approach; 23) A-3 Test Stand; 24) Space Power Facility; 25) MEIT and FEIT; 26) Flight Element Integrated Test (FEIT); 27) Multi-Element Integrated Testing (MEIT); 28) Flight Test Driving Principles; and 29) Constellation's Integrated Flight Test Strategy and Low Earth Orbit Servicing Capability.
A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing.
Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang
2017-07-24
With the rapid development of big data and the Internet of Things (IoT), the number of networked devices and the volume of data are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network, can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges are also arising in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices and the computation results can be verified by using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method for it. Finally, analysis and simulation results show that our scheme is both secure and highly efficient.
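The verification idea behind outsourced decryption can be illustrated with a heavily simplified, hedged sketch: the data owner binds a keyed digest to the message, and the user checks the value recovered from the untrusted fog/cloud against that digest. This is not the paper's CP-ABE construction; the key handling and message below are assumptions for illustration.

```python
import hashlib
import hmac
import os

# Sketch of the *verification* idea only (not CP-ABE): a keyed digest lets a
# user detect a wrong or tampered result from an outsourced decryption step.
verify_key = os.urandom(32)          # shared with authorised users, not the fog

def owner_tag(message: bytes) -> bytes:
    return hmac.new(verify_key, message, hashlib.sha256).digest()

def user_verify(recovered: bytes, tag: bytes) -> bool:
    candidate = hmac.new(verify_key, recovered, hashlib.sha256).digest()
    return hmac.compare_digest(candidate, tag)

message = b"patient record 42"
tag = owner_tag(message)
print(user_verify(message, tag))            # True: outsourced result accepted
print(user_verify(b"tampered", tag))        # False: result rejected
```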
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Yin; Jiang, Yuanwen; Cherukara, Mathew J.
Large-scale assembly of individual atoms over smooth surfaces is difficult to achieve. A configuration of an atom reservoir, in which individual atoms can be readily extracted, may successfully address this challenge. In this work, we demonstrate that a liquid gold-silicon alloy established in classical vapor-liquid-solid growth can deposit ordered and three-dimensional rings of isolated gold atoms over silicon nanowire sidewalls. Here, we perform ab initio molecular dynamics simulation and unveil a surprising single-atomic-gold-catalyzed chemical etching of silicon. Experimental verification of this catalytic process in silicon nanowires yields dopant-dependent, massive and ordered 3D grooves with spacing down to ~5 nm. Finally, we use these grooves as self-labeled and ex situ markers to resolve several complex silicon growths, including the formation of nodes, kinks, scale-like interfaces, and curved backbones.
Using Adaptive Turnaround Documents to Electronically Acquire Structured Data in Clinical Settings
Biondich, Paul G.; Anand, Vibha; Downs, Stephen M.; McDonald, Clement J.
2003-01-01
We developed adaptive turnaround documents (ATDs) to address longstanding challenges inherent in acquiring structured data at the point of care. These computer-generated paper forms both request and receive patient tailored information specifically for electronic storage. In our pilot, we evaluated the usability, accuracy, and user acceptance of an ATD designed to enrich a pediatric preventative care decision support system. The system had an overall digit recognition rate of 98.6% (95% CI: 98.3 to 98.9) and a marksense accuracy of 99.2% (95% CI: 99.1 to 99.3). More importantly, the system reliably extracted all data from 56.6% (95% CI: 53.3 to 59.9) of our pilot forms without the need for a verification step. These results translate to a minimal workflow burden to end users. This suggests that ATDs can serve as an inexpensive, workflow-sensitive means of structured data acquisition in the clinical setting. PMID:14728139
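The style of interval quoted above can be reproduced with a normal-approximation confidence interval for a proportion. The counts in the hedged sketch below are assumed for illustration only; the paper reports rates and confidence intervals, not the underlying denominators.

```python
from math import sqrt

def wald_ci(successes, trials, z=1.96):
    """Normal-approximation 95% confidence interval for a proportion."""
    p = successes / trials
    half_width = z * sqrt(p * (1 - p) / trials)
    return p, (p - half_width, p + half_width)

# Assumed counts for illustration only.
p, (lo, hi) = wald_ci(successes=4930, trials=5000)
print(f"digit recognition: {p:.1%}  95% CI ({lo:.1%}, {hi:.1%})")
```

With these assumed counts the interval comes out near (98.3%, 98.9%), the same form as the intervals reported in the abstract.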
Negative self-efficacy and goal effects revisited.
Bandura, Albert; Locke, Edwin A
2003-02-01
The authors address the verification of the functional properties of self-efficacy beliefs and document how self-efficacy beliefs operate in concert with goal systems within a sociocognitive theory of self-regulation in contrast to the focus of control theory on discrepancy reduction. Social cognitive theory posits proactive discrepancy production by adoption of goal challenges working in concert with reactive discrepancy reduction in realizing them. Converging evidence from diverse methodological and analytic strategies verifies that perceived self-efficacy and personal goals enhance motivation and performance attainments. The large body of evidence, as evaluated by 9 meta-analyses for the effect sizes of self-efficacy beliefs and by the vast body of research on goal setting, contradicts findings (J. B. Vancouver, C. M. Thompson, & A. A. Williams, 2001; J. B. Vancouver, C. M. Thompson, E. C. Tischner, & D. J. Putka 2002) that belief in one's capabilities and personal goals is self-debilitating.
Malingering dissociative identity disorder: objective and projective assessment.
Labott, Susan M; Wallach, Heather R
2002-04-01
Verification of dissociative identity disorder presents challenges given the complex nature of the illness. This study addressed the concern that this disorder can be successfully malingered on objective and projective psychological tests. Fifty undergraduate women were assigned to a Malingering or a Control condition and then completed the Rorschach Inkblot Test and the Dissociative Experiences Scale II. The Malingering group was asked to simulate dissociative identity disorder; controls received instructions to answer all materials honestly. Analysis indicated that malingerers were significantly more likely to endorse dissociative experiences on the Dissociative Experiences Scale II in the range common to patients with diagnosed dissociative identity disorder. However, on the Rorschach there were no significant differences between the two groups. Results suggest that the assessment of dissociative identity disorder requires a multifaceted approach with both objective and projective assessment tools. Research is needed to assess these issues in clinical populations.
40 CFR 82.40 - Technician training and certification.
Code of Federal Regulations, 2010 CFR
2010-07-01
... address in § 82.38(a) verification that the program meets all of the following standards: (1) Training... training, training through self-study of instructional material, or on-site training involving instructors...
Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) Independent Analysis
NASA Technical Reports Server (NTRS)
Davis, Mitchell L.; Aguilar, Michael L.; Mora, Victor D.; Regenie, Victoria A.; Ritz, William F.
2009-01-01
Two approaches were compared to the Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) approach: the Flat-Sat and Shuttle Avionics Integration Laboratory (SAIL). The Flat-Sat and CAIL/SAIL approaches are two different tools designed to mitigate different risks. Flat-Sat approach is designed to develop a mission concept into a flight avionics system and associated ground controller. The SAIL approach is designed to aid in the flight readiness verification of the flight avionics system. The approaches are complimentary in addressing both the system development risks and mission verification risks. The following NESC team findings were identified: The CAIL assumption is that the flight subsystems will be matured for the system level verification; The Flat-Sat and SAIL approaches are two different tools designed to mitigate different risks. The following NESC team recommendation was provided: Define, document, and manage a detailed interface between the design and development (EDL and other integration labs) to the verification laboratory (CAIL).
2015-10-01
Hawaii; HASP: Health and Safety Plan; IDA: Institute for Defense Analyses; IVS: Instrument Verification Strip; m: meter; mm: millimeter; MPV: Man Portable...the ArcSecond laser ranger was impractical due to the requirement to maintain line-of-sight for three rovers and tedious calibration. The SERDP...within 0.1 m spacing and 99% within 0.15 m; repeatability of Instrument Verification Strip (IVS) survey; amplitude of EM anomaly; amplitude of
The Second NASA Formal Methods Workshop 1992
NASA Technical Reports Server (NTRS)
Johnson, Sally C. (Compiler); Holloway, C. Michael (Compiler); Butler, Ricky W. (Compiler)
1992-01-01
The primary goal of the workshop was to bring together formal methods researchers and aerospace industry engineers to investigate new opportunities for applying formal methods to aerospace problems. The first part of the workshop was tutorial in nature. The second part of the workshop explored the potential of formal methods to address current aerospace design and verification problems. The third part of the workshop involved on-line demonstrations of state-of-the-art formal verification tools. Also, a detailed survey was filled in by the attendees; the results of the survey are compiled.
Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?
Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R
2013-01-01
The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.
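A few of the preprocessing steps named above (variable coding, review and verification, cleaning and editing) can be shown in a short hedged pandas sketch. The toy participant records and column names are hypothetical and are not taken from the case study.

```python
import pandas as pd

# Toy participant records with issues typical of raw study data.
raw = pd.DataFrame({
    "participant_id": [101, 102, 102, 103, 104],
    "sex":            ["F", "f ", "f ", "M", "unknown"],
    "age_years":      [34, 29, 29, -1, 41],          # -1 used as missing code
    "enrolled":       ["2013-02-01", "2013-02-03", "2013-02-03",
                       "2013-13-40", "2013-02-10"],   # one impossible date
})

clean = (
    raw.drop_duplicates(subset="participant_id")          # duplicate entry
       .assign(sex=lambda d: d["sex"].str.strip().str.upper()
                              .replace({"UNKNOWN": pd.NA}),
               age_years=lambda d: d["age_years"].where(d["age_years"] >= 0),
               enrolled=lambda d: pd.to_datetime(d["enrolled"],
                                                 errors="coerce"))
)

# Verification step: flag records still needing review before analysis.
needs_review = clean[clean.isna().any(axis=1)]
print(clean.dtypes, needs_review["participant_id"].tolist(), sep="\n")
```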
Towards Requirements in Systems Engineering for Aerospace IVHM Design
NASA Technical Reports Server (NTRS)
Saxena, Abhinav; Roychoudhury, Indranil; Lin, Wei; Goebel, Kai
2013-01-01
Health management (HM) technologies have been employed in safety-critical systems for decades, but a coherent, systematic process to integrate HM into the system design is not yet clear. Consequently, in most cases, health management ends up as an afterthought or 'band-aid' solution. Moreover, limited guidance exists for carrying out systems engineering (SE) on the subject of writing requirements for designs with integrated vehicle health management (IVHM). It is well accepted that requirements are key to developing a successful IVHM system, right from the concept stage through development, verification, utilization, and support. However, writing requirements for systems with IVHM capability poses unique challenges that require designers to look beyond their own domains and consider the constraints and specifications of other interlinked systems. In this paper we look at various stages in the SE process and identify activities specific to IVHM design and development. More importantly, several relevant questions are posed that system engineers must address at various design and development stages. Addressing these questions should provide some guidance to systems engineers toward writing IVHM-related requirements to ensure that appropriate IVHM functions are built into the system design.
A framework of multitemplate ensemble for fingerprint verification
NASA Astrophysics Data System (ADS)
Yin, Yilong; Ning, Yanbin; Ren, Chunxiao; Liu, Li
2012-12-01
How to improve the performance of an automatic fingerprint verification system (AFVS) is always a big challenge in the biometric verification field. Recently, it has become popular to improve the performance of an AFVS using ensemble learning approaches to fuse related fingerprint information. In this article, we propose a novel framework for fingerprint verification based on the multitemplate ensemble method. This framework consists of three stages. In the first stage, the enrollment stage, we adopt an effective template selection method to select those fingerprints which best represent a finger; a polyhedron is then created from the matching results of the multiple template fingerprints, and a virtual centroid of the polyhedron is computed. In the second stage, the verification stage, we measure the distance between the centroid of the polyhedron and a query image. In the final stage, a fusion rule is used to choose a proper distance from a distance set. The experimental results on the FVC2004 database demonstrate the improved effectiveness of the new framework in fingerprint verification. With a minutiae-based matching method, the average EER of the four databases in FVC2004 drops from 10.85 to 0.88, and with a ridge-based matching method, the average EER of these four databases also decreases from 14.58 to 2.51.
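The centroid-and-distance geometry described can be sketched in a hedged toy form: represent each template by its vector of match scores against the template set, take the mean of those vectors as the virtual centroid, and accept a query whose score vector falls close enough to it. The cosine similarity below is a stand-in for a real minutiae- or ridge-based matcher, and the threshold is an assumed value, not one tuned as in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def match_score(a, b):
    """Stand-in similarity (cosine); a real AFVS would use a minutiae- or
    ridge-based matcher here."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Enrollment: each template is represented by its vector of match scores
# against the whole template set; the 'virtual centroid' is their mean.
templates = [rng.standard_normal(64) + 2.0 for _ in range(4)]   # same finger
centroid = np.mean([[match_score(t, u) for u in templates] for t in templates],
                   axis=0)

def verify(query, threshold=0.5):          # assumed threshold
    query_vec = np.array([match_score(query, t) for t in templates])
    return bool(np.linalg.norm(query_vec - centroid) < threshold)

genuine = templates[0] + 0.05 * rng.standard_normal(64)
impostor = rng.standard_normal(64) - 2.0
print(verify(genuine), verify(impostor))   # expected: True False
```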
QPF verification using different radar-based analyses: a case study
NASA Astrophysics Data System (ADS)
Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.
2009-09-01
Verification of QPF in NWP models has always been challenging, not only in knowing which scores best quantify a particular skill of a model but also in choosing the most appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can provide conclusions that are not in agreement with those obtained by the "eyeball" method; consequently, QPF can provide valuable information to forecasters in spite of having poor scores. On the other hand, there are difficulties in knowing the "truth", so different results can be achieved depending on the procedures used to obtain the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain a better knowledge of the skills of a forecasting system. In particular, a short-range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied in a local convective precipitation event that took place in the NE Iberian Peninsula on 3 October 2008. For this purpose, a variety of verification methods (dichotomous, recalibration and object-oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.
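The dichotomous part of such a comparison can be sketched with a hedged example: build a contingency table for an exceedance threshold at common grid points and compute standard scores such as POD, FAR and CSI. The precipitation fields and threshold below are synthetic and are not taken from the case study.

```python
import numpy as np

rng = np.random.default_rng(7)

def dichotomous_scores(forecast_mm, observed_mm, threshold_mm=20.0):
    """Contingency-table scores for exceeding a rainfall threshold:
    probability of detection (POD), false alarm ratio (FAR) and
    critical success index (CSI)."""
    f = forecast_mm >= threshold_mm
    o = observed_mm >= threshold_mm
    hits = np.sum(f & o)
    false_alarms = np.sum(f & ~o)
    misses = np.sum(~f & o)
    pod = hits / (hits + misses) if hits + misses else np.nan
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else np.nan
    csi = hits / (hits + misses + false_alarms) if (hits + misses
                                                    + false_alarms) else np.nan
    return pod, far, csi

# Synthetic precipitation fields on a common grid (mm in 24 h).
observed = rng.gamma(shape=2.0, scale=8.0, size=(50, 50))
forecast = observed * rng.normal(1.0, 0.3, size=(50, 50))   # imperfect model
print([round(float(s), 2) for s in dichotomous_scores(forecast, observed)])
```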
78 FR 15030 - Introduction of the Revised Employment Eligibility Verification Form
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-08
... several improvements designed to minimize errors in form completion. The key revisions to Form I-9 include... and email addresses. Improving the form's instructions. Revising the layout of the form, expanding the...
32 CFR 855.8 - Application procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
.... To allow time for processing, the application (DD Forms 2400, 2401, and 2402) and a self-addressed... verification required for each purpose of use must be included with the application. The name of the user must...
21 CFR 607.37 - Inspection of establishment registrations and blood product listings.
Code of Federal Regulations, 2010 CFR
2010-04-01
... offices for firms within the geographical area of such district office. Upon request and receipt of a self-addressed stamped envelope, verification of registration number, or location of registered establishment...
Safeguards by Design Challenge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alwin, Jennifer Louise
The International Atomic Energy Agency (IAEA) defines safeguards as a system of inspection and verification of the peaceful uses of nuclear materials as part of the Nuclear Nonproliferation Treaty. The IAEA oversees safeguards worldwide. Safeguards by Design (SBD) involves incorporating safeguards technologies, techniques, and instrumentation during the design phase of a facility, rather than after the fact. The design challenge goals are the following: design a system of safeguards technologies, techniques, and instrumentation for inspection and verification of the peaceful uses of nuclear materials. Cost should be minimized to work within the IAEA's limited budget. Dose to workers should always be as low as reasonably achievable (ALARA). Time is of the essence in operating facilities, and the flow of material should not be interrupted significantly. Proprietary process information in facilities may need to be protected, so the amount of information obtained by inspectors should be the minimum required to achieve the measurement goal. Three design challenges are then detailed: Plutonium Waste Item Measurement System, Marine-based Modular Reactor, and Floating Nuclear Power Plant (FNPP).
A Model-Based Approach to Engineering Behavior of Complex Aerospace Systems
NASA Technical Reports Server (NTRS)
Ingham, Michel; Day, John; Donahue, Kenneth; Kadesch, Alex; Kennedy, Andrew; Khan, Mohammed Omair; Post, Ethan; Standley, Shaun
2012-01-01
One of the most challenging yet poorly defined aspects of engineering a complex aerospace system is behavior engineering, including definition, specification, design, implementation, and verification and validation of the system's behaviors. This is especially true for behaviors of highly autonomous and intelligent systems. Behavior engineering is more of an art than a science. As a process it is generally ad-hoc, poorly specified, and inconsistently applied from one project to the next. It uses largely informal representations, and results in system behavior being documented in a wide variety of disparate documents. To address this problem, JPL has undertaken a pilot project to apply its institutional capabilities in Model-Based Systems Engineering to the challenge of specifying complex spacecraft system behavior. This paper describes the results of the work in progress on this project. In particular, we discuss our approach to modeling spacecraft behavior including 1) requirements and design flowdown from system-level to subsystem-level, 2) patterns for behavior decomposition, 3) allocation of behaviors to physical elements in the system, and 4) patterns for capturing V&V activities associated with behavioral requirements. We provide examples of interesting behavior specification patterns, and discuss findings from the pilot project.
Telepharmacy: a pharmacist’s perspective on the clinical benefits and challenges
Poudel, Arjun; Nissen, Lisa M
2016-01-01
The use of information and telecommunication technologies has expanded at a rapid rate, strongly influencing healthcare delivery in many countries. Rural residents and communities, however, often lack easy access to healthcare services due to geographical and demographical factors. Telepharmacy, a more recent concept of pharmaceutical service provision, enables a qualified pharmacist at a remotely located hospital, pharmacy, or healthcare center to deliver services such as medication review, patient counseling, and prescription verification to patients located at a distance. Telepharmacy has many recognizable benefits, such as easy access to healthcare services in remote and rural locations, economic benefits, patient satisfaction resulting from medication access and information in rural areas, effective patient counseling, and relief of the scarcity of local pharmacists and pharmacy services. Telepharmacy is undoubtedly a great concept, but it is sometimes challenging to put into practice. Inherent to the adoption of these practices are legal challenges and pitfalls that need to be addressed. The start-up of telepharmacy (hardware, software, connectivity, and operational cost) involves considerable time, effort, and money. For rural hospitals with fewer patients, cost appears to be one of the biggest barriers to telepharmacy services. Moreover, the execution and implementation of a comprehensive and uniform telepharmacy law is still a challenge. A well-developed system, however, can change the practice of pharmacy in ways that benefit both the rural communities and the hospitals or retail pharmacies that deliver these services. PMID:29354542
Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo
2012-01-01
Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which can lead to inaccurate conclusions. Results: We present a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named 'Industrial Methodology for Process Verification in Research' or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by 'crowd-sourcing' to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice for verifying systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044
Field Verification of Undercut Criteria and Alternatives for Subgrade Stabilization-Coastal Plain
DOT National Transportation Integrated Search
2012-06-01
The North Carolina Department of Transportation (NCDOT) is progressing toward developing quantitative and systematic criteria that address the implementation of undercutting as a subgrade stabilization measure. As part of this effort, a laborator...
The Evolution of the NASA Commercial Crew Program Mission Assurance Process
NASA Technical Reports Server (NTRS)
Canfield, Amy C.
2016-01-01
In 2010, the National Aeronautics and Space Administration (NASA) established the Commercial Crew Program (CCP) in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge to NASA has been how to determine that the Commercial Provider's transportation system complies with programmatic safety requirements. The process used in this determination is the Safety Technical Review Board which reviews and approves provider submitted hazard reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100% of these safety critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (S&MA) model does not support the nature of the CCP. To that end, NASA S&MA is implementing a Risk Based Assurance process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is also being developed to efficiently use the available resources to execute the verifications.
An analysis of random projection for changeable and privacy-preserving biometric verification.
Wang, Yongjin; Plataniotis, Konstantinos N
2010-10-01
Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
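A minimal sketch of the random-projection transform analyzed above: the projection matrix has i.i.d. Gaussian entries, and the 1/sqrt(k) scaling approximately preserves Euclidean distances (Johnson-Lindenstrauss). Dimensions and vectors are illustrative, and the paper's vector-translation step is not included.

    import numpy as np

    rng = np.random.default_rng(0)

    d, k = 4096, 256                       # original and projected dimensionality (illustrative)
    x = rng.standard_normal(d)             # stand-in for a biometric feature vector
    y = x + 0.05 * rng.standard_normal(d)  # a noisy sample of the same identity

    # Random projection matrix with i.i.d. Gaussian entries, scaled so that
    # Euclidean distances are approximately preserved.
    R = rng.standard_normal((k, d)) / np.sqrt(k)

    print(np.linalg.norm(x - y))           # distance in the original domain
    print(np.linalg.norm(R @ x - R @ y))   # roughly the same distance after projection

    # Changeability: re-enrolling with a fresh matrix yields a new, unlinkable template.
    R2 = rng.standard_normal((k, d)) / np.sqrt(k)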
Abstraction and Assume-Guarantee Reasoning for Automated Software Verification
NASA Technical Reports Server (NTRS)
Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.
2004-01-01
Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT out-performs several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities
NASA Technical Reports Server (NTRS)
Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.
2009-01-01
Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current generation aviation software which has been implicated in anomalous in-flight behavior. This paper describes the research focused on enabling capabilities for verification and validation underway within NASA s Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.
Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B.; Bonab, Ali A.; Alpert, Nathaniel M.; Lohmann, Kevin; Bortfeld, Thomas
2008-01-01
The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants. These produce x-ray CT artifacts and fluence perturbations which may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring. This unconventional application of PET involves count rates that are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of the detector itself. In addition to PET alone, PET/CT imaging provides accurate information on the position of the imaged object and may assess possible anatomical changes during fractionated radiotherapy in clinical applications. PMID:17388158
Two-Level Verification of Data Integrity for Data Storage in Cloud Computing
NASA Astrophysics Data System (ADS)
Xu, Guangwei; Chen, Chunlin; Wang, Hongya; Zang, Zhuping; Pang, Mugen; Jiang, Ping
Data storage in cloud computing can save capital expenditure and relieve the burden of storage management for users. Because stored files may be lost or corrupted, many researchers focus on the verification of data integrity. However, massive numbers of users often bring large numbers of verifying tasks to the auditor. Moreover, users also need to pay extra fees for these verifying tasks beyond the storage fee. Therefore, we propose a two-level verification of data integrity to alleviate these problems. The key idea is for users to routinely verify the data integrity themselves and for the auditor to arbitrate challenges between the user and the cloud provider according to the MACs and ϕ values. Extensive performance simulations show that the proposed scheme clearly decreases the auditor's verifying tasks and the rate of wrong arbitrations.
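The user-side, first-level check described above can be pictured as a routine MAC comparison. The sketch below uses HMAC-SHA256 over a stored block and is only an illustration of the idea; the paper's specific MAC construction, ϕ values, and auditor arbitration protocol are not reproduced here.

    import hmac, hashlib, os

    key = os.urandom(32)  # user-held secret key (illustrative)

    def tag(block: bytes) -> bytes:
        """MAC computed before upload and kept by the user."""
        return hmac.new(key, block, hashlib.sha256).digest()

    def user_level_check(block_from_cloud: bytes, stored_tag: bytes) -> bool:
        """Routine first-level verification: recompute the MAC and compare."""
        return hmac.compare_digest(tag(block_from_cloud), stored_tag)

    original = b"file block stored in the cloud"
    t = tag(original)
    print(user_level_check(original, t))            # True: data intact
    print(user_level_check(b"corrupted block", t))  # False: escalate to the auditor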
Challenges in Regional CTBT Monitoring: The Experience So Far From Vienna
NASA Astrophysics Data System (ADS)
Bratt, S. R.
2001-05-01
The verification system being established to monitor the CTBT will include an International Monitoring System (IMS) network of 321 seismic, hydroacoustic, infrasound and radionuclide stations, transmitting digital data to the International Data Centre (IDC) in Vienna, Austria over a Global Communications Infrastructure (GCI). The IDC started in February 2000 to disseminate a wide range of products based on automatic processing and interactive analysis of data from about 90 stations from the four IMS technologies. The number of events in the seismo-acoustic Reviewed Event Bulletins (REB) was 18,218 for the year 2000, with the daily number ranging from 30 to 360. Over 300 users from almost 50 Member States are now receiving an average of 18,000 data and product deliveries per month from the IDC. As the IMS network expands (40-60 new stations are scheduled to start transmitting data this year) and as GCI communications links bring increasing volumes of new data into Vienna (70 new GCI sites are currently in preparation), the monitoring capability of the IMS and IDC has the potential to improve significantly. To realize this potential, the IDC must continue to improve its capacity to exploit regional seismic data from events defined by few stations with large azimuthal gaps. During 2000, 25% of the events in the REB were defined by five or fewer stations. 48% were defined by at least one regional phase, and 24% were defined by at least three. 34% had gaps in azimuthal coverage of more than 180 degrees. The fraction of regional, sparsely detected events will only increase as new, sensitive stations come on-line, and the detection threshold drops. This will be offset, to some extent, because stations within the denser network that detect near-threshold events will be at closer distances, on average. Thus to address the challenges of regional monitoring, the IDC must integrate "tuned" station and network processing parameters for new stations; enhanced and/or new methods for estimating location, depth and uncertainty bounds; and validated, regionally-calibrated travel times, event characterization parameters and screening criteria. A new IDC program to fund research to calibrate regional seismic travel paths seeks to address, in cooperation with other national efforts, one item on this list. More effective use of the full waveform data and cross-technology synergies must be explored. All of this work must be integrated into modular software systems that can be maintained and improved over time. To motivate these regional monitoring challenges and possible improvements, the experience from the IDC will be presented via a series of illustrative sample events. Challenges in the technical and policy arenas must be addressed as well. IMS data must first be available at the IDC before they can be analyzed. The encouraging experience to date is that the availability of data arriving via the GCI is significantly higher (~95%) than the availability (~70%) from the same stations prior to GCI installation, when they were transmitting data via other routes. Within the IDC, trade-offs must be considered between the desired levels of product quality and timeliness, and the investment in personnel and system development to support the levels sought. Another high-priority objective is to develop a policy for providing data and products to scientific and disaster alert organizations.
It is clear that broader exploitation of these rich and unique assets could be of great, mutual benefit, and is, perhaps, a necessity for the CTBT verification system to achieve its potential.
Certification of lightning protection for a full-authority digital engine control
NASA Technical Reports Server (NTRS)
Dargi, M.; Rupke, E.; Wiles, K.
1991-01-01
FADEC systems present many challenges to the lightning protection engineer, and verification of the protection-design adequacy for certification purposes presents additional challenges. The basic requirement of the certification plan for a FADEC is to demonstrate compliance with Federal Airworthiness Regulations (FAR) 25.1309 and 25.581. These FARs are intended for transport aircraft, but there are equivalent sections for general aviation aircraft and for normal and transport rotorcraft; military aircraft may have additional requirements. The criteria for demonstrating adequate lightning protection for a FADEC system include the procedures outlined in FAA Advisory Circular (AC) 20-136, Protection of aircraft electrical/electronic systems against the indirect effects of lightning. Because FADEC systems, including the interconnecting wiring, are generally not susceptible to direct attachment of lightning currents, the verification of protection against indirect effects is primarily described.
NASA GSFC Mechanical Engineering Latest Inputs for Verification Standards (GEVS) Updates
NASA Technical Reports Server (NTRS)
Kaufman, Daniel
2003-01-01
This viewgraph presentation provides information on quality control standards in mechanical engineering. The presentation addresses safety, structural loads, nonmetallic composite structural elements, bonded structural joints, externally induced shock, random vibration, acoustic tests, and mechanical function.
Why do verification and validation?
Hu, Kenneth T.; Paez, Thomas L.
2016-02-19
In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis, with the understanding that the V&V results are uncertain. Finally, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.
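One way to make the decision-tree view concrete is a toy value-of-information calculation: under the simplifying assumption that V&V would reveal the model's adequacy perfectly, the gap between the expected payoff with and without that information bounds what a decision maker should be willing to pay. All numbers below are invented for illustration and are not from the workshop problem.

    # Toy value-of-information calculation (all numbers invented).
    p_adequate = 0.7          # prior probability that the model is adequate
    payoff = {                # decision payoffs given the true (unknown) model state
        ("deploy", "adequate"): 100, ("deploy", "inadequate"): -300,
        ("hold",   "adequate"):   0, ("hold",   "inadequate"):    0,
    }

    def expected(action, p):
        return p * payoff[(action, "adequate")] + (1 - p) * payoff[(action, "inadequate")]

    # Best action without any V&V evidence.
    value_without = max(expected(a, p_adequate) for a in ("deploy", "hold"))

    # If V&V revealed the true state perfectly, the decision maker would pick the best
    # action in each state; the expectation over states bounds what V&V could be worth.
    value_with = p_adequate * max(payoff[(a, "adequate")] for a in ("deploy", "hold")) \
               + (1 - p_adequate) * max(payoff[(a, "inadequate")] for a in ("deploy", "hold"))

    print("upper bound on willingness to pay for V&V:", value_with - value_without)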
Formal Validation of Aerospace Software
NASA Astrophysics Data System (ADS)
Lesens, David; Moy, Yannick; Kanig, Johannes
2013-08-01
Any single error in critical software can have catastrophic consequences. Even though failures are usually not advertised, some software bugs have become famous, such as the error in the MIM-104 Patriot. For space systems, experience shows that software errors are a serious concern: more than half of all satellite failures from 2000 to 2003 involved software. To address this concern, this paper discusses the use of formal verification for software developed in Ada.
Fatigue Prediction Verification of Fiberglass Hulls
2001-10-01
United States Naval Academy, Department of Naval Architecture & Ocean Engineering, Annapolis, MD 21402
... the relatively low number of cycles-to-failure these specimens were more efficiently tested on the Satec 50UD machine. The wet specimens were ...
Scalable Adaptive Architectures for Maritime Operations Center Command and Control
2011-05-06
the project to investigate the possibility of using earlier work on the validation and verification of rule bases in addressing the dynamically ...support the organization. To address the dynamically changing rules of engagement of a maritime force as it crosses different geographical areas, GMU... dynamic analysis, makes use of an Occurrence Graph that corresponds to the dynamics (or execution) of the Petri Net, to capture properties
Verification of target motion effects on SAR imagery using the Gotcha GMTI challenge dataset
NASA Astrophysics Data System (ADS)
Hack, Dan E.; Saville, Michael A.
2010-04-01
This paper investigates the relationship between a ground moving target's kinematic state and its SAR image. While effects such as cross-range offset, defocus, and smearing appear well understood, their derivations in the literature typically employ simplifications of the radar/target geometry and assume point scattering targets. This study adopts a geometrical model for understanding target motion effects in SAR imagery, termed the target migration path, and focuses on experimental verification of predicted motion effects using both simulated and empirical datasets based on the Gotcha GMTI challenge dataset. Specifically, moving target imagery is generated from three data sources: first, simulated phase history for a moving point target; second, simulated phase history for a moving vehicle derived from a simulated Mazda MPV X-band signature; and third, empirical phase history from the Gotcha GMTI challenge dataset. Both simulated target trajectories match the truth GPS target position history from the Gotcha GMTI challenge dataset, allowing direct comparison between all three imagery sets and the predicted target migration path. This paper concludes with a discussion of the parallels between the target migration path and the measurement model within a Kalman filtering framework, followed by conclusions.
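The cross-range offset referred to above is commonly approximated, to first order and for a point scatterer, by the relation below: the displacement grows with slant range and with the ratio of target radial velocity to platform velocity. This is a textbook approximation stated for context, not a result derived from the Gotcha dataset.

    \Delta x_{az} \approx R \, v_r / v_p

Here \Delta x_{az} is the azimuth (cross-range) displacement of the target in the SAR image, R the slant range, v_r the target's radial (line-of-sight) velocity, and v_p the platform velocity; along-track target motion instead produces defocus and smearing.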
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bunch, Kyle J.; Jones, Anthony M.; Ramuhalli, Pradeep
The ratification and ongoing implementation of the New START Treaty have been widely regarded as noteworthy global security achievements for both the Obama Administration and the Putin (formerly Medvedev) regime. But deeper cuts that move beyond the United States and Russia to engage the P-5 and other nuclear weapons possessor states are envisioned under future arms control regimes, and are indeed required for the P-5 in accordance with their Article VI disarmament obligations in the Nuclear Non-Proliferation Treaty. Future verification needs will include monitoring the cessation of production of new fissile material for weapons, monitoring storage of warhead components and fissile materials, and verifying dismantlement of warheads, pits, secondary stages, and other materials. A fundamental challenge to implementing a nuclear disarmament regime is the ability to thwart unauthorized material diversion throughout the dismantlement and disposition process through strong chain of custody implementation. Verifying the declared presence, or absence, of nuclear materials and weapons components throughout the dismantlement and disposition lifecycle is a critical aspect of the disarmament process. From both the diplomatic and technical perspectives, verification under these future arms control regimes will require new solutions. Since any acceptable verification technology must protect sensitive design information and attributes to prevent the release of classified or other proliferation-sensitive information, non-nuclear non-sensitive modalities may provide significant new verification tools which do not require the use of additional information barriers. Alternative verification technologies based upon electromagnetics and acoustics could potentially play an important role in fulfilling the challenging requirements of future verification regimes. For example, researchers at the Pacific Northwest National Laboratory (PNNL) have demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to rapidly confirm the presence of specific components on a yes/no basis without revealing classified information. PNNL researchers have also used ultrasonic measurements to obtain images of material microstructures which may be used as templates or unique identifiers of treaty-limited items. Such alternative technologies are suitable for application in various stages of weapons dismantlement and often include the advantage of an inherent information barrier due to the inability to extract classified weapon design information from the collected data. As a result, these types of technologies complement radiation-based verification methods for arms control. This article presents an overview of several alternative verification technologies that are suitable for supporting a future, broader and more intrusive arms control regime that spans the nuclear weapons disarmament lifecycle. The general capabilities and limitations of each verification modality are discussed and example technologies are presented. Potential applications are defined in the context of the nuclear material and weapons lifecycle. Example applications range from authentication (e.g., tracking and signatures within the chain of custody from downloading through weapons storage, unclassified templates and unique identification) to verification of absence and final material disposition.
Expert system verification and validation survey, delivery 4
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.
NASA Technical Reports Server (NTRS)
Stevens, G. H.; Anzic, G.
1979-01-01
NASA is conducting a series of millimeter wave satellite communication systems and market studies to: (1) determine potential domestic 30/20 GHz satellite concepts and market potential, and (2) establish the requirements for a suitable technology verification payload which, although intended to be modest in capacity, would sufficiently demonstrate key technologies and experimentally address key operational issues. Preliminary results and critical issues of the current contracted effort are described. Also included is a description of a NASA-developed multibeam satellite payload configuration which may be representative of concepts utilized in a technology flight verification program.
Performing Verification and Validation in Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1999-01-01
The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.
Expert system verification and validation survey. Delivery 2: Survey results
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and industry applications. This is the first task of the series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.
NASA Technical Reports Server (NTRS)
Melendez, Orlando; Trizzino, Mary; Fedderson, Bryan
1997-01-01
The National Aeronautics and Space Administration (NASA), Kennedy Space Center (KSC) Materials Science Division conducted a study to evaluate alternative solvents for CFC-113 in precision cleaning and verification on typical samples that are used in the KSC environment. The effects of AK-225(R), Vertrel(R), MCA, and HFE A 7100 on selected metal and polymer materials were studied over 1, 7 and 30 day test times. This report addresses a study on the compatibility aspects of replacement solvents for materials in aerospace applications.
Expert system verification and validation survey. Delivery 5: Revised
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.
Expert system verification and validation survey. Delivery 3: Recommendations
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.
2008-09-01
Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model evaluation in situations of high consequence decision-making.
Range Verification Methods in Particle Therapy: Underlying Physics and Monte Carlo Modeling
Kraan, Aafke Christine
2015-01-01
Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as a function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including β+ emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modeling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modeling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then, we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques, we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects. PMID:26217586
NASA Technical Reports Server (NTRS)
1995-01-01
The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.
Assume-Guarantee Verification of Source Code with Design-Level Assumptions
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.
2004-01-01
Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Fischer, Bernd
2009-01-01
Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.
NASA Astrophysics Data System (ADS)
Arndt, J.; Kreimer, J.
2010-09-01
The European Space Laboratory COLUMBUS was launched in February 2008 on NASA's Space Shuttle Atlantis. Since its successful docking and activation, this manned laboratory has formed part of the International Space Station (ISS). Depending on the objectives of the Mission Increments, the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes foreseen by the Mission Increment planning but also configuration changes on short notice, which become necessary due to near real-time requests initiated by crew or Flight Control, and changes due to on-orbit anomalies, which are the most challenging since they are unpredictable. Subject to the safety verification are, on the one hand, the on-orbit configuration itself, including the hardware and software products, and, on the other hand, the related ground facilities needed for commanding and communication with the on-orbit system. The operational products, e.g., the procedures prepared for crew and ground control in accordance with increment planning, are also subject to the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related safety-required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement, including hardware owned by International Partners, is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments. A major challenge is the implementation of a safety process that is rigid enough to provide reliable verification of on-board safety, yet flexible enough for manned space operations with scientific objectives. In the period of COLUMBUS operations since launch, a number of lessons learned have already been implemented, especially in the IEHA, that improve the flexibility of on-board operations without degrading safety.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-13
... Hampshire Ave., Bldg. 51, rm. 2201, Silver Spring, MD 20993-0002. Send one self-addressed adhesive label to... exploration and verification of drug effects under epidemic and pandemic conditions. A draft notice of...
This booklet, ETV Program Case Studies: Demonstrating Program Outcomes, Volume III, contains two case studies addressing verified environmental technologies for decentralized wastewater treatment and converting animal waste to energy. Each case study contains a brief description ...
PHM for Ground Support Systems Case Study: From Requirements to Integration
NASA Technical Reports Server (NTRS)
Teubert, Chris
2015-01-01
This session will detail the experience of members of the NASA Ames Prognostic Center of Excellence (PCoE) producing PHM tools for NASA Advanced Ground Support Systems, including the challenges in applying their research in a production environment. Specifically, we will 1) go over the systems engineering and review process used; 2) Discuss the challenges and pitfalls in this process; 3) discuss software architecting, documentation, verification and validation activities and 4) discuss challenges in communicating the benefits and limitations of PHM Technologies.
NASA Astrophysics Data System (ADS)
Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.
2006-03-01
By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security in today's growing demand towards digital imaging informatics. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patient and staff through a web-based application. Implementation challenges fall into three main areas: 1) Development and Integration, 2) Calibration and Optimization of Wi-Fi Tracking System, and 3) Clinical Implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).
Space station System Engineering and Integration (SE and I). Volume 2: Study results
NASA Technical Reports Server (NTRS)
1987-01-01
Significant study results that are products of the Phase B conceptual design task are summarized. Major elements are addressed, and study results applicable to each major element or area of design are summarized and included where appropriate. Areas addressed include: system engineering and integration; customer accommodations; test and program verification; product assurance; conceptual design; operations and planning; technical and management information system (TMIS); and advanced development.
Evaluation and Research for Technology: Not Just Playing Around.
ERIC Educational Resources Information Center
Baker, Eva L.; O'Neil, Harold F., Jr.
2003-01-01
Discusses some of the challenges of technology-based training and education, the role of quality verification and evaluation, and strategies to integrate evaluation into the everyday design of technology-based systems for education and training. (SLD)
International Space Station Passive Thermal Control System Analysis, Top Ten Lessons-Learned
NASA Technical Reports Server (NTRS)
Iovine, John
2011-01-01
The International Space Station (ISS) has been on-orbit for over 10 years, and there have been numerous technical challenges along the way from design to assembly to on-orbit anomalies and repairs. The Passive Thermal Control System (PTCS) management team has been a key player in successfully dealing with these challenges. The PTCS team performs thermal analysis in support of design and verification, launch and assembly constraints, integration, sustaining engineering, failure response, and model validation. This analysis is a significant body of work and provides a unique opportunity to compile a wealth of real world engineering and analysis knowledge and the corresponding lessons-learned. The analysis lessons encompass the full life cycle of flight hardware from design to on-orbit performance and sustaining engineering. These lessons can provide significant insight for new projects and programs. Key areas to be presented include thermal model fidelity, verification methods, analysis uncertainty, and operations support.
ESTEST: An Open Science Platform for Electronic Structure Research
ERIC Educational Resources Information Center
Yuan, Gary
2012-01-01
Open science platforms in support of data generation, analysis, and dissemination are becoming indispensible tools for conducting research. These platforms use informatics and information technologies to address significant problems in open science data interoperability, verification & validation, comparison, analysis, post-processing,…
Proceedings of the NASA Workshop on Registration and Rectification
NASA Technical Reports Server (NTRS)
Bryant, N. A. (Editor)
1982-01-01
Issues associated with the registration and rectification of remotely sensed data are discussed. Near- and long-range applications research tasks and some medium-range technology augmentation research areas are recommended. Image sharpness, feature extraction, inter-image mapping, error analysis, and verification methods are addressed.
Code of Federal Regulations, 2010 CFR
2010-04-01
... the geographical area of such district office. Upon request and receipt of a self-addressed stamped envelope, verification of a registration number or the location of a registered establishment will be...
78 FR 23835 - Sex Offender Registration Amendments
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-23
... 3225-AA10 Sex Offender Registration Amendments AGENCY: Court Services and Offender Supervision Agency... and requirements relating to periodic verification of registration information for sex offenders. Furthermore, the rule permits CSOSA to verify addresses of sex offenders by conducting home visits on its own...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-26
... from nearly all of the parties in this proceeding. All of these parties raised issues of first impression that were not addressed in the initial phase of this proceeding. The Office is studying these new...
NASA Astrophysics Data System (ADS)
Cohen, K. K.; Klara, S. M.; Srivastava, R. D.
2004-12-01
The U.S. Department of Energy's (U.S. DOE's) Carbon Sequestration Program is developing state-of-the-science technologies for measurement, mitigation, and verification (MM&V) in field operations of geologic sequestration. MM&V of geologic carbon sequestration operations will play an integral role in the pre-injection, injection, and post-injection phases of carbon capture and storage projects to reduce anthropogenic greenhouse gas emissions. Effective MM&V is critical to the success of CO2 storage projects and will be used by operators, regulators, and stakeholders to ensure safe and permanent storage of CO2. In the U.S. DOE's Program, carbon sequestration MM&V has numerous instrumental roles: measurement of a site's characteristics and capability for sequestration; monitoring of the site to ensure storage integrity; verification that the CO2 is safely stored; and protection of ecosystems. Other drivers for MM&V technology development include cost-effectiveness, measurement precision, and the frequency of measurements required. As sequestration operations are implemented in the future, it is anticipated that measurements over long time periods and at different scales will be required; this will present a significant challenge. MM&V sequestration technologies generally utilize one of the following approaches: below-ground measurements; surface/near-surface measurements; aerial and satellite imagery; and modeling/simulations. Advanced subsurface geophysical technologies will play a primary role for MM&V. It is likely that successful MM&V programs will incorporate multiple technologies, including but not limited to: reservoir modeling and simulations; geophysical techniques (a wide variety of seismic methods, microgravity, electrical, and electromagnetic techniques); subsurface fluid movement monitoring methods such as injection of tracers, borehole and wellhead pressure sensors, and tiltmeters; surface/near-surface methods such as soil gas monitoring and infrared sensors; and aerial and satellite imagery. This abstract will describe results, similarities, and contrasts for funded studies from the U.S. DOE's Carbon Sequestration Program, including examples from the Sleipner North Sea Project, the Canadian Weyburn Field/Dakota Gasification Plant Project, the Frio Formation Texas Project, and the Yolo County Bioreactor Landfill Project. The abstract will also address the following: How are the terms "measurement," "mitigation," and "verification" defined in the Program? What is the U.S. DOE's Carbon Sequestration Program Roadmap and what are the Roadmap goals for MM&V? What is the current status of MM&V technologies?
ICSH guidelines for the verification and performance of automated cell counters for body fluids.
Bourner, G; De la Salle, B; George, T; Tabe, Y; Baum, H; Culp, N; Keng, T B
2014-12-01
One of the many challenges facing laboratories is the verification of their automated Complete Blood Count cell counters for the enumeration of cells in body fluids. These analyzers offer improved accuracy, precision, and efficiency in enumerating cells compared with manual methods. A patterns-of-practice survey was distributed to laboratories that participate in proficiency testing in Ontario (Canada), the United States, the United Kingdom, and Japan to determine how many laboratories test body fluids on automated analyzers and which performance verification studies they performed. Based on the results of this questionnaire, an International Working Group for the Verification and Performance of Automated Cell Counters for Body Fluids was formed by the International Council for Standardization in Hematology (ICSH) to prepare a set of guidelines to help laboratories plan and execute the verification of their automated cell counters and provide accurate and reliable results for automated body fluid counts. These guidelines were discussed at the ICSH General Assemblies and reviewed by an international panel of experts to achieve further consensus. © 2014 John Wiley & Sons Ltd.
Fuzzy Logic Controller Stability Analysis Using a Satisfiability Modulo Theories Approach
NASA Technical Reports Server (NTRS)
Arnett, Timothy; Cook, Brandon; Clark, Matthew A.; Rattan, Kuldip
2017-01-01
While many widely accepted methods and techniques exist for the validation and verification of traditional controllers, no solutions have yet been accepted for Fuzzy Logic Controllers (FLCs). Due to the highly nonlinear nature of such systems, and the fact that developing a valid FLC does not require a mathematical model of the system, it is quite difficult to use conventional techniques to prove controller stability. Since safety-critical systems must be tested and verified to work as expected under all possible circumstances, the inability to verify FLCs against such requirements limits the applications of the technology. Therefore, alternative methods for the verification and validation of FLCs need to be explored. In this study, a novel approach using formal verification methods to ensure the stability of an FLC is proposed. The main research challenges include specifying requirements for a complex system, converting a traditional FLC to a piecewise polynomial representation, and using a formal verification tool in a nonlinear solution space. Using the proposed architecture, the Fuzzy Logic Controller was found to always generate negative feedback, but the result for Lyapunov stability was inconclusive.
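As an illustration of how an SMT solver can support such a check, the sketch below asks Z3 whether any point of a polynomial piece of a controller's output surface violates the negative-feedback property; an unsat answer proves the property over that region. The controller expression and region bounds are hypothetical, not taken from the paper.

    # Sketch of an SMT negativity check for one polynomial piece of a fuzzy controller's
    # output surface (requires the z3-solver package; the expression below is invented).
    from z3 import Real, Solver, And, unsat

    e = Real('e')            # normalized tracking error on this piece, e in (0, 1]
    u = -2*e - 3*e*e         # hypothetical piecewise-polynomial FLC output on this region

    s = Solver()
    # Search for a counterexample: a point in the region where the feedback is not negative.
    s.add(And(e > 0, e <= 1), u >= 0)
    print("negative feedback proven on (0,1]" if s.check() == unsat else "counterexample exists")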
A system verification platform for high-density epiretinal prostheses.
Chen, Kuanfu; Lo, Yi-Kai; Yang, Zhi; Weiland, James D; Humayun, Mark S; Liu, Wentai
2013-06-01
Retinal prostheses have restored light perception to people worldwide who have poor or no vision as a consequence of retinal degeneration. To advance the quality of visual stimulation for retinal implant recipients, a higher number of stimulation channels is expected in the next generation retinal prostheses, which poses a great challenge to system design and verification. This paper presents a system verification platform dedicated to the development of retinal prostheses. The system includes primary processing, dual-band power and data telemetry, a high-density stimulator array, and two methods for output verification. End-to-end system validation and individual functional block characterization can be achieved with this platform through visual inspection and software analysis. Custom-built software running on the computers also provides a convenient way to test new features before they are realized by the ICs. Real-time visual feedback through the video displays makes it easy to monitor and debug the system. The characterization of the wireless telemetry and the demonstration of the visual display are reported in this paper using a 256-channel retinal prosthetic IC as an example.
Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop
NASA Technical Reports Server (NTRS)
Rozier, Kristin Yvonne (Editor)
2008-01-01
Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.
An Overview and Empirical Comparison of Distance Metric Learning Methods.
Moutafis, Panagiotis; Leng, Mengjun; Kakadiaris, Ioannis A
2016-02-16
In this paper, we first offer an overview of advances in the field of distance metric learning. Then, we empirically compare selected methods using a common experimental protocol. The number of distance metric learning algorithms proposed keeps growing due to their effectiveness and wide application. However, existing surveys are either outdated or they focus only on a few methods. As a result, there is an increasing need to summarize the obtained knowledge in a concise, yet informative manner. Moreover, existing surveys do not conduct comprehensive experimental comparisons. On the other hand, individual distance metric learning papers compare the performance of the proposed approach with only a few related methods and under different settings. This highlights the need for an experimental evaluation using a common and challenging protocol. To this end, we conduct face verification experiments, as this task poses significant challenges due to varying conditions during data acquisition. In addition, face verification is a natural application for distance metric learning because the encountered challenge is to define a distance function that: 1) accurately expresses the notion of similarity for verification; 2) is robust to noisy data; 3) generalizes well to unseen subjects; and 4) scales well with the dimensionality and number of training samples. In particular, we utilize well-tested features to assess the performance of selected methods following the experimental protocol of the state-of-the-art Labeled Faces in the Wild database. A summary of the results is presented along with a discussion of the insights obtained and lessons learned by employing the corresponding algorithms.
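As a hedged sketch of the verification task described above (not any of the surveyed algorithms), the following compares feature pairs under a simple Mahalanobis-style metric learned from within-class scatter; random vectors stand in for face descriptors, and the threshold is an illustrative placeholder that would be tuned on validation data in practice.

```python
# Minimal sketch of metric-based face verification: whiten features with the
# within-class covariance (a simple Mahalanobis-style metric), then decide
# "same person" by thresholding the distance between a pair of feature vectors.
import numpy as np

rng = np.random.default_rng(0)
n_ids, per_id, dim = 20, 10, 64
centers = rng.normal(size=(n_ids, dim))
X = np.repeat(centers, per_id, axis=0) + 0.3 * rng.normal(size=(n_ids * per_id, dim))
y = np.repeat(np.arange(n_ids), per_id)

# Within-class scatter -> whitening transform (the learned "metric").
Sw = sum(np.cov(X[y == c].T) for c in range(n_ids)) / n_ids
L = np.linalg.cholesky(np.linalg.inv(Sw + 1e-6 * np.eye(dim)))

def distance(a, b):
    d = (a - b) @ L
    return float(np.sqrt(d @ d))

threshold = 20.0  # illustrative; tuned on a validation set in practice
print("genuine pair accepted :", distance(X[0], X[1]) < threshold)
print("impostor pair accepted:", distance(X[0], X[-1]) < threshold)
```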
Implementation of proteomic biomarkers: making it work
Mischak, Harald; Ioannidis, John PA; Argiles, Angel; Attwood, Teresa K; Bongcam-Rudloff, Erik; Broenstrup, Mark; Charonis, Aristidis; Chrousos, George P; Delles, Christian; Dominiczak, Anna; Dylag, Tomasz; Ehrich, Jochen; Egido, Jesus; Findeisen, Peter; Jankowski, Joachim; Johnson, Robert W; Julien, Bruce A; Lankisch, Tim; Leung, Hing Y; Maahs, David; Magni, Fulvio; Manns, Michael P; Manolis, Efthymios; Mayer, Gert; Navis, Gerjan; Novak, Jan; Ortiz, Alberto; Persson, Frederik; Peter, Karlheinz; Riese, Hans H; Rossing, Peter; Sattar, Naveed; Spasovski, Goce; Thongboonkerd, Visith; Vanholder, Raymond; Schanstra, Joost P; Vlahou, Antonia
2012-01-01
While large numbers of proteomic biomarkers have been described, they are generally not implemented in medical practice. We have investigated the reasons for this shortcoming, focusing on hurdles downstream of biomarker verification, and describe major obstacles and possible solutions to ease valid biomarker implementation. Some of the problems lie in suboptimal biomarker discovery and validation, especially lack of validated platforms with well-described performance characteristics to support biomarker qualification. These issues have been acknowledged and are being addressed, raising the hope that valid biomarkers may start accumulating in the foreseeable future. However, successful biomarker discovery and qualification alone does not suffice for successful implementation. Additional challenges include, among others, limited access to appropriate specimens and insufficient funding, the need to validate new biomarker utility in interventional trials, and large communication gaps between the parties involved in implementation. To address this problem, we propose an implementation roadmap. The implementation effort needs to involve a wide variety of stakeholders (clinicians, statisticians, health economists, and representatives of patient groups, health insurance, pharmaceutical companies, biobanks, and regulatory agencies). Knowledgeable panels with adequate representation of all these stakeholders may facilitate biomarker evaluation and guide implementation for the specific context of use. This approach may avoid unwarranted delays or failure to implement potentially useful biomarkers, and may expedite meaningful contributions of the biomarker community to healthcare. PMID:22519700
Formal Verification of Large Software Systems
NASA Technical Reports Server (NTRS)
Yin, Xiang; Knight, John
2010-01-01
We introduce a scalable proof structure to facilitate formal verification of large software systems. In our approach, we mechanically synthesize an abstract specification from the software implementation, match its static operational structure to that of the original specification, and organize the proof as the conjunction of a series of lemmas about the specification structure. By setting up a different lemma for each distinct element and proving each lemma independently, we obtain the important benefit that the proof scales easily for large systems. We present details of the approach and an illustration of its application on a challenge problem from the security domain.
Verification of Space Weather Forecasts using Terrestrial Weather Approaches
NASA Astrophysics Data System (ADS)
Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.
2015-12-01
The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help MOSWOC forecasters view verification results in near real-time; plans to objectively assess flare forecasts under the EU Horizon 2020 FLARECAST project; and summarise ISES efforts to achieve consensus on verification.
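For the 2x2 contingency-table assessment mentioned above, the common scores reduce to a few lines of arithmetic. The sketch below uses made-up event counts (not MOSWOC data) and standard definitions of the probability of detection, false alarm ratio, Hanssen-Kuipers (true skill) statistic, and Heidke skill score.

```python
# Minimal sketch: 2x2 contingency-table verification of yes/no event forecasts
# (e.g., "CME arrives within the forecast window"). Counts are illustrative.
hits, misses, false_alarms, correct_negatives = 18, 7, 9, 46

n = hits + misses + false_alarms + correct_negatives
pod = hits / (hits + misses)                        # probability of detection
far = false_alarms / (hits + false_alarms)          # false alarm ratio
pofd = false_alarms / (false_alarms + correct_negatives)
tss = pod - pofd                                    # Hanssen-Kuipers / true skill statistic

# Heidke skill score: accuracy relative to that expected by chance.
expected_correct = ((hits + misses) * (hits + false_alarms)
                    + (correct_negatives + misses) * (correct_negatives + false_alarms)) / n
hss = (hits + correct_negatives - expected_correct) / (n - expected_correct)

print(f"POD={pod:.2f}  FAR={far:.2f}  TSS={tss:.2f}  HSS={hss:.2f}")
```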
Boost OCR accuracy using iVector based system combination approach
NASA Astrophysics Data System (ADS)
Peng, Xujun; Cao, Huaigu; Natarajan, Prem
2015-01-01
Optical character recognition (OCR) is a challenging task because most existing preprocessing approaches are sensitive to writing style, writing material, noise, and image resolution. Thus, a single recognition system cannot address all factors of real document images. In this paper, we describe an approach to combine diverse recognition systems by using iVector based features, which is a newly developed method in the field of speaker verification. Prior to system combination, document images are preprocessed and text line images are extracted with different approaches for each system, where an iVector is derived from the high-dimensional supervector of each text line and is used to predict OCR accuracy. We merge hypotheses from multiple recognition systems according to the overlap ratio and the predicted OCR score of text line images. We present evaluation results on an Arabic document database where the proposed method is compared against the single best OCR system using the word error rate (WER) metric.
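The final selection step of such a combination can be sketched as follows. This simplification keeps only the predicted-score comparison and omits the paper's overlap-ratio handling and iVector-based prediction; the hypotheses for the single text line below are made up.

```python
# Minimal sketch of score-guided hypothesis selection: each OCR system produces
# a transcription and a predicted accuracy for the same text line; keep the
# hypothesis with the best predicted score.
from typing import List, Tuple

def combine(hypotheses: List[Tuple[str, str, float]]) -> str:
    """hypotheses: (system_name, transcription, predicted_accuracy)."""
    best = max(hypotheses, key=lambda h: h[2])
    return best[1]

line_hyps = [
    ("system_A", "the treaty was signed in geneva", 0.82),
    ("system_B", "the treaty was signed in genevo", 0.74),
    ("system_C", "the treat was signed in geneva", 0.69),
]
print(combine(line_hyps))
```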
A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing
Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang
2017-01-01
With the rapid development of big data and the Internet of Things (IoT), the number of networking devices and data volume are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network, can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges are also arising in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices and the computation results can be verified by using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method for it. Finally, analysis and simulation results show that our scheme is both secure and highly efficient. PMID:28737733
NASA Technical Reports Server (NTRS)
Fitz, Rhonda; Whitman, Gerek
2016-01-01
Research into complexities of software systems Fault Management (FM) and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IVV) Program, with Software Assurance Research Program support, extracted FM architectures across the IVV portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IVV projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management.
Piccinini, Filippo; Balassa, Tamas; Szkalisity, Abel; Molnar, Csaba; Paavolainen, Lassi; Kujala, Kaisa; Buzas, Krisztina; Sarazova, Marie; Pietiainen, Vilja; Kutay, Ulrike; Smith, Kevin; Horvath, Peter
2017-06-28
High-content, imaging-based screens now routinely generate data on a scale that precludes manual verification and interrogation. Software applying machine learning has become an essential tool to automate analysis, but these methods require annotated examples to learn from. Efficiently exploring large datasets to find relevant examples remains a challenging bottleneck. Here, we present Advanced Cell Classifier (ACC), a graphical software package for phenotypic analysis that addresses these difficulties. ACC applies machine-learning and image-analysis methods to high-content data generated by large-scale, cell-based experiments. It features methods to mine microscopic image data, discover new phenotypes, and improve recognition performance. We demonstrate that these features substantially expedite the training process, successfully uncover rare phenotypes, and improve the accuracy of the analysis. ACC is extensively documented, designed to be user-friendly for researchers without machine-learning expertise, and distributed as a free open-source tool at www.cellclassifier.org. Copyright © 2017 Elsevier Inc. All rights reserved.
Applying Content Management to Automated Provenance Capture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuchardt, Karen L.; Gibson, Tara D.; Stephan, Eric G.
2008-04-10
Workflows and data pipelines are becoming increasingly valuable in both computational and experimental sciences. These automated systems are capable of generating significantly more data within the same amount of time than their manual counterparts. Automatically capturing and recording data provenance and annotation as part of these workflows is critical for data management, verification, and dissemination. Our goal in addressing the provenance challenge was to develop an end-to-end system that demonstrates real-time capture, persistent content management, and ad-hoc searches of both provenance and metadata using open source software and standard protocols. We describe our prototype, which extends the Kepler workflow tools for the execution environment, the Scientific Annotation Middleware (SAM) content management software for data services, and an existing HTTP-based query protocol. Our implementation offers several unique capabilities, and through the use of standards, is able to provide access to the provenance record to a variety of commonly available client tools.
Entering the New Millennium: Dilemmas in Arms Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
BROWN, JAMES
The end of the Cold War finds the international community no longer divided into two opposing blocs. The concerns that the community now faces are becoming more fluid, less focused, and, in many ways, much less predictable. Issues of religion, ethnicity, and nationalism; the possible proliferation of Weapons of Mass Destruction; and the diffusion of technology and information processing throughout the world community have greatly changed the international security landscape in the last decade. Although our challenges appear formidable, the United Nations, State Parties, nongovernmental organizations, and the arms control community are moving to address and lessen these concerns through both formal and informal efforts. Many of the multilateral agreements (e.g., NPT, BWC, CWC, CTBT, MTCR), as well as the bilateral efforts that are taking place between Washington and Moscow, employ confidence-building and transparency measures. These measures, along with on-site inspection and other verification procedures, lessen suspicion and distrust and reduce uncertainty, thus enhancing stability, confidence, and cooperation.
Chapter 10.3: Reliability and Durability of PV Modules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, Sarah
2017-01-07
Each year the world invests tens of billions of dollars or euros in PV systems with the expectation that these systems will last approximately 25 years. Although the disciplines of reliability, quality, and service life prediction have been well established for numerous products, a full understanding of these is currently challenging for PV modules because the desired service lifetimes are decades, preventing direct verification of lifetime predictions. A number of excellent reviews can be found in the literature summarizing the types of failures that are commonly observed for PV modules. This chapter discusses key failure/degradation mechanisms selected to highlight how the kinetics of failure rates can and cannot be confidently predicted. For EVA-encapsulated modules, corrosion is observed to follow delamination, which then allows water droplets to directly contact the metallization. Extended test protocols such as Qualification Plus were created to address the known problems while standards groups update standard tests through the consensus process.
NASA Astrophysics Data System (ADS)
Eiriksson, D.; Jones, A. S.; Horsburgh, J. S.; Cox, C.; Dastrup, D.
2017-12-01
Over the past few decades, advances in electronic dataloggers and in situ sensor technology have revolutionized our ability to monitor air, soil, and water to address questions in the environmental sciences. The increased spatial and temporal resolution of in situ data is alluring. However, an often overlooked aspect of these advances is the challenge data managers and technicians face in performing quality control on millions of data points collected every year. While there is general agreement that high quantities of data offer little value unless the data are of high quality, it is commonly understood that despite efforts toward quality assurance, environmental data collection occasionally goes wrong. After identifying erroneous data, data managers and technicians must determine whether to flag, delete, leave unaltered, or retroactively correct suspect data. While individual instrumentation networks often develop their own QA/QC procedures, there is a scarcity of consensus and literature regarding specific solutions and methods for correcting data. This may be because back-correction efforts are time-consuming, so suspect data are often simply abandoned. Correction techniques are also rarely reported in the literature, likely because corrections are often performed by technicians rather than the researchers who write the scientific papers. Details of correction procedures are often glossed over as a minor component of data collection and processing. To help address this disconnect, we present case studies of quality control challenges, solutions, and lessons learned from a large-scale, multi-watershed environmental observatory in Northern Utah that monitors Gradients Along Mountain to Urban Transitions (GAMUT). The GAMUT network consists of over 40 individual climate, water quality, and storm drain monitoring stations that have collected more than 200 million unique data points in four years of operation. In all of our examples, we emphasize that scientists should remain skeptical and seek independent verification of sensor data, even for sensors purchased from trusted manufacturers.
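A minimal sketch of the kind of automated quality-control screening discussed above, assuming only NumPy; the bounds and spike threshold are illustrative placeholders, not GAMUT's operating values.

```python
# Minimal sketch of automated QC flagging for an in situ sensor series: a range
# check against plausible physical bounds and a spike check on point-to-point
# change. Flagged values would then be reviewed, corrected, or excluded.
import numpy as np

def qc_flags(values, lower, upper, max_step):
    values = np.asarray(values, dtype=float)
    flags = np.zeros(values.size, dtype=int)          # 0 = pass
    flags[(values < lower) | (values > upper)] = 1    # 1 = out of range
    step = np.abs(np.diff(values, prepend=values[0]))
    flags[(flags == 0) & (step > max_step)] = 2       # 2 = suspected spike
    return flags

water_temp_c = [12.1, 12.3, 12.2, 35.0, 12.4, 12.5, -9999.0, 12.6]
print(qc_flags(water_temp_c, lower=-0.5, upper=30.0, max_step=5.0))
```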
NASA Technical Reports Server (NTRS)
1990-01-01
The purpose is to report the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues is being addressed and to what extent they have impacted the development of Expert Systems.
NASA Formal Methods Workshop, 1990
NASA Technical Reports Server (NTRS)
Butler, Ricky W. (Compiler)
1990-01-01
The workshop brought together researchers involved in the NASA formal methods research effort for detailed technical interchange and provided a mechanism for interaction with representatives from the FAA and the aerospace industry. The workshop also included speakers from industry to debrief the formal methods researchers on the current state of practice in flight critical system design, verification, and certification. The goals were: define and characterize the verification problem for ultra-reliable life critical flight control systems and the current state of practice in industry today; determine the proper role of formal methods in addressing these problems, and assess the state of the art and recent progress toward applying formal methods to this area.
Biometrics, identification and surveillance.
Lyon, David
2008-11-01
Governing by identity describes the emerging regime of a globalizing, mobile world. Governance depends on identification but identification increasingly depends on biometrics. This 'solution' to difficulties of verification is described and some technical weaknesses are discussed. The role of biometrics in classification systems is also considered and is shown to contain possible prejudice in relation to racialized criteria of identity. Lastly, the culture of biometric identification is shown to be limited to abstract data, artificially separated from the lived experience of the body including the orientation to others. It is proposed that creators of national ID systems in particular address these crucial deficiencies in their attempt to provide new modes of verification.
Florida Language, Speech and Hearing Association Journal, 1994.
ERIC Educational Resources Information Center
Langhans, Joseph J., Ed.
1994-01-01
This volume is an annual compilation of articles that address evaluation, treatment, efficacy, and credentialing, and a synopsis of programs that provide speech, language, hearing, and swallowing services. Featured articles include: (1) "Verification of Credentials and Privileging Review" (Kathryn W. Enchelmayer); (2) "The…
EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM: RAISING CONFIDENCE IN INNOVATION
This is a general article on the ETV Program which is being submitted to EM, the Air & Waste Management Association's (A&WMA's) monthly magazine. In addition to background on the program, some of its accomplishments, and organization, the article briefly addresses different veri...
Field verification of geogrid properties for base course reinforcement applications : final report.
DOT National Transportation Integrated Search
2013-11-01
The proposed field study is a continuation of a recently concluded, ODOT-funded project titled: Development of ODOT Guidelines for the Use of Geogrids in Aggregate Bases, which is aimed at addressing the need for improved guidelines for base reinforc...
Remarks to Eighth Annual State of Modeling and Simulation
1999-06-04
Briefing-slide fragments (extraction residue): organization and training as well as materiel; discovery vice verification; tolerance for surprise; free-play red team; iterative process; push to failure; account for responsive and innovative future adversaries via free play and adaptive strategies and tactics by professional red teams; address C2 issues and human … (truncated)
NASA Astrophysics Data System (ADS)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.
2017-06-01
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
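The core numerical-verification comparison can be sketched as follows; the file names, variable, and tolerance are hypothetical, and the code does not use LIVVkit's actual interfaces. It only illustrates a bit-for-bit check followed by a tolerance-based fallback that reports where a model run drifts from its reference.

```python
# Minimal sketch of a regression comparison of the kind a V&V toolkit performs:
# bit-for-bit equality first, then a tolerance-based check with a summary of
# how many cells differ and by how much.
import numpy as np

def compare(model: np.ndarray, reference: np.ndarray, rtol=1e-12, atol=0.0):
    if np.array_equal(model, reference):
        return "bit-for-bit match"
    close = np.isclose(model, reference, rtol=rtol, atol=atol)
    worst = float(np.max(np.abs(model - reference)))
    return f"{(~close).sum()} of {close.size} cells differ; max abs diff = {worst:.3e}"

reference = np.load("thickness_reference.npy")   # hypothetical reference output
model = np.load("thickness_test_run.npy")        # hypothetical new run
print(compare(model, reference))
```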
Verifying the Chemical Weapons Convention: The Case for a United Nations Verification Agency
1991-12-01
Report documentation page (SF 298) extraction residue; recoverable details: performing and monitoring organization Naval Postgraduate School, Monterey, CA 93943-5000; subject terms truncated (begin with "Chemical").
Active neutron and gamma-ray imaging of highly enriched uranium for treaty verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamel, Michael C.; Polack, J. Kyle; Ruch, Marc L.
The detection and characterization of highly enriched uranium (HEU) presents a large challenge in the non-proliferation field. HEU has a low neutron emission rate and most gamma rays are low energy and easily shielded. To address this challenge, an instrument known as the dual-particle imager (DPI) was used with a portable deuterium-tritium (DT) neutron generator to detect neutrons and gamma rays from induced fission in HEU. We evaluated system response using a 13.7-kg HEU sphere in several configurations with no moderation, high-density polyethylene (HDPE) moderation, and tungsten moderation. A hollow tungsten sphere was interrogated to evaluate the response to a possible hoax item. First, localization capabilities were demonstrated by reconstructing neutron and gamma-ray images. Once localized, additional properties such as fast neutron energy spectra and time-dependent neutron count rates were attributed to the items. For the interrogated configurations containing HEU, the reconstructed neutron spectra resembled Watt spectra, which gave confidence that the interrogated items were undergoing induced fission. The time-dependent neutron count rate was also compared for each configuration and shown to be dependent on the neutron multiplication of the item. This result showed that the DPI is a viable tool for localizing and confirming fissile mass and multiplication.
NASA Technical Reports Server (NTRS)
Havelund, Klaus
2014-01-01
The field of runtime verification has during the last decade seen a multitude of systems for monitoring event sequences (traces) emitted by a running system. The objective is to ensure correctness of a system by checking its execution traces against formal specifications representing requirements. A special challenge is data parameterized events, where monitors have to keep track of the combination of control states as well as data constraints, relating events and the data they carry across time points. This poses a challenge wrt. efficiency of monitors, as well as expressiveness of logics. Data automata is a form of automata where states are parameterized with data, supporting monitoring of data parameterized events. We describe the full details of a very simple API in the Scala programming language, an internal DSL (Domain-Specific Language), implementing data automata. The small implementation suggests a design pattern. Data automata allow transition conditions to refer to other states than the source state, and allow target states of transitions to be inlined, offering a temporal logic flavored notation. An embedding of a logic in a high-level language like Scala in addition allows monitors to be programmed using all of Scala's language constructs, offering the full flexibility of a programming language. The framework is demonstrated on an XML processing scenario previously addressed in related work.
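The paper's DSL is embedded in Scala; as a language-neutral illustration only, the following Python sketch shows the underlying idea of a monitor whose states are parameterized by event data, here checking a simple acquire/release discipline over a trace.

```python
# Illustrative data-parameterized runtime monitor (not the paper's Scala DSL).
# Property checked: a lock id must not be acquired twice without an intervening
# release, must not be released when not held, and must be released by trace end.
def monitor(trace):
    held = set()                       # data-parameterized "Acquired(id)" states
    errors = []
    for i, (name, lock_id) in enumerate(trace):
        if name == "acquire":
            if lock_id in held:
                errors.append(f"event {i}: {lock_id} acquired twice")
            held.add(lock_id)
        elif name == "release":
            if lock_id not in held:
                errors.append(f"event {i}: {lock_id} released but not held")
            held.discard(lock_id)
    errors.extend(f"end of trace: {l} never released" for l in sorted(held))
    return errors

trace = [("acquire", "A"), ("acquire", "B"), ("release", "A"), ("acquire", "B")]
print(monitor(trace))
```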
Formal Methods in Air Traffic Management: The Case of Unmanned Aircraft Systems
NASA Technical Reports Server (NTRS)
Munoz, Cesar A.
2015-01-01
As the technological and operational capabilities of unmanned aircraft systems (UAS) continue to grow, so too does the need to introduce these systems into civil airspace. Unmanned Aircraft Systems Integration in the National Airspace System is a NASA research project that addresses the integration of civil UAS into non-segregated airspace operations. One of the major challenges of this integration is the lack of an onboard pilot to comply with the legal requirement that pilots see and avoid other aircraft. The need to provide an equivalent to this requirement for UAS has motivated the development of a detect and avoid (DAA) capability to provide the appropriate situational awareness and maneuver guidance in avoiding and remaining well clear of traffic aircraft. Formal methods has played a fundamental role in the development of this capability. This talk reports on the formal methods work conducted under NASA's Safe Autonomous System Operations project in support of the development of DAA for UAS. This work includes specification of low-level and high-level functional requirements, formal verification of algorithms, and rigorous validation of software implementations. The talk also discusses technical challenges in formal methods research in the context of the development and safety analysis of advanced air traffic management concepts.
Stratway: A Modular Approach to Strategic Conflict Resolution
NASA Technical Reports Server (NTRS)
Hagen, George E.; Butler, Ricky W.; Maddalon, Jeffrey M.
2011-01-01
In this paper we introduce Stratway, a modular approach to finding long-term strategic resolutions to conflicts between aircraft. The modular approach provides both advantages and disadvantages. Our primary concern is to investigate the implications on the verification of safety-critical properties of a strategic resolution algorithm. By partitioning the problem into verifiable modules, much stronger verification claims can be established. Since strategic resolution involves searching for solutions over an enormous state space, Stratway, like most similar algorithms, searches these spaces by applying heuristics, which present especially difficult verification challenges. An advantage of a modular approach is that it makes a clear distinction between the resolution function and the trajectory generation function. This allows the resolution computation to be independent of any particular vehicle. The Stratway algorithm was developed in both Java and C++ and is available through an open source license. Additionally there is a visualization application that is helpful when analyzing and quickly creating conflict scenarios.
Reachability analysis of real-time systems using time Petri nets.
Wang, J; Deng, Y; Xu, G
2000-01-01
Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
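For orientation, the sketch below shows plain reachability analysis (breadth-first exploration of markings) on a tiny untimed Petri net of my own invention; it does not model the firing-interval (clock) information that time Petri nets and the CS-class construction add on top of this exploration.

```python
# Tiny untimed Petri net: a task goes idle -> running -> done. Reachable markings
# (token counts per place) are explored breadth-first.
from collections import deque

# transition name -> (tokens consumed per place, tokens produced per place)
transitions = {
    "start_task": ({"idle": 1}, {"running": 1}),
    "finish_task": ({"running": 1}, {"done": 1}),
}

def fire(marking, consume, produce):
    if any(marking.get(p, 0) < n for p, n in consume.items()):
        return None                                   # transition not enabled
    new = dict(marking)
    for p, n in consume.items():
        new[p] -= n
    for p, n in produce.items():
        new[p] = new.get(p, 0) + n
    return new

def reachable_markings(initial):
    seen, queue = set(), deque([initial])
    while queue:
        m = queue.popleft()
        key = tuple(sorted((p, n) for p, n in m.items() if n > 0))
        if key in seen:
            continue
        seen.add(key)
        for consume, produce in transitions.values():
            nxt = fire(m, consume, produce)
            if nxt is not None:
                queue.append(nxt)
    return seen

print(reachable_markings({"idle": 1}))
```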
Secure voice-based authentication for mobile devices: vaulted voice verification
NASA Astrophysics Data System (ADS)
Johnson, R. C.; Scheirer, Walter J.; Boult, Terrance E.
2013-05-01
As the use of biometrics becomes more wide-spread, the privacy concerns that stem from the use of biometrics are becoming more apparent. As the usage of mobile devices grows, so does the desire to implement biometric identification into such devices. A large majority of mobile devices being used are mobile phones. While work is being done to implement different types of biometrics into mobile phones, such as photo based biometrics, voice is a more natural choice. The idea of voice as a biometric identifier has been around a long time. One of the major concerns with using voice as an identifier is the instability of voice. We have developed a protocol that addresses those instabilities and preserves privacy. This paper describes a novel protocol that allows a user to authenticate using voice on a mobile/remote device without compromising their privacy. We first discuss the Vaulted Verification protocol, which has recently been introduced in research literature, and then describe its limitations. We then introduce a novel adaptation and extension of the Vaulted Verification protocol to voice, dubbed Vaulted Voice Verification (V3). Following that we show a performance evaluation and then conclude with a discussion of security and future work.
Nonlinear 3D MHD verification study: SpeCyl and PIXIE3D codes for RFP and Tokamak plasmas
NASA Astrophysics Data System (ADS)
Bonfiglio, D.; Cappello, S.; Chacon, L.
2010-11-01
A strong emphasis is presently placed in the fusion community on reaching predictive capability of computational models. An essential requirement of such an endeavor is the process of assessing the mathematical correctness of computational tools, termed verification [1]. We present here a successful nonlinear cross-benchmark verification study between the 3D nonlinear MHD codes SpeCyl [2] and PIXIE3D [3]. Excellent quantitative agreement is obtained in both 2D and 3D nonlinear visco-resistive dynamics for reversed-field pinch (RFP) and tokamak configurations [4]. RFP dynamics, in particular, lends itself as an ideal non-trivial test-bed for 3D nonlinear verification. Perspectives for future application of the fully-implicit parallel code PIXIE3D to RFP physics, in particular to address open issues on RFP helical self-organization, will be provided. [1] M. Greenwald, Phys. Plasmas 17, 058101 (2010) [2] S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996) [3] L. Chacón, Phys. Plasmas 15, 056103 (2008) [4] D. Bonfiglio, L. Chacón and S. Cappello, Phys. Plasmas 17 (2010)
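A cross-code benchmark of this kind ultimately reduces to quantitative comparison of matching outputs; the sketch below computes a relative L2 difference between two time traces, with synthetic curves standing in for actual SpeCyl and PIXIE3D results and an arbitrary tolerance.

```python
# Minimal sketch of a quantitative cross-code check: compare the time trace of
# one diagnostic (e.g., a mode's magnetic energy) produced by two codes on the
# same case via a relative L2 difference against an agreed tolerance.
import numpy as np

t = np.linspace(0.0, 1.0, 500)
code_a = np.exp(-3.0 * t) * np.cos(8.0 * t)          # stand-in for code A trace
code_b = code_a * (1.0 + 1e-3 * np.sin(40.0 * t))    # code B, small deviation

rel_l2 = np.linalg.norm(code_a - code_b) / np.linalg.norm(code_a)
print(f"relative L2 difference: {rel_l2:.2e}")
assert rel_l2 < 1e-2, "codes disagree beyond the benchmark tolerance"
```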
Design and performance verification of UHPC piles for deep foundations.
DOT National Transportation Integrated Search
2008-11-01
The strategic plan for bridge engineering issued by AASHTO in 2005 identified extending the service life and optimizing structural systems of bridges in the United States as two grand challenges in bridge engineering, with the objective of producin...
Report #16-P-0086, January 27, 2016. The effectiveness of the CSB’s information security program is challenged by its lack of personal identity verification cards for logical access and of a complete system inventory.
Systems engineering for the Kepler Mission: a search for terrestrial planets
NASA Technical Reports Server (NTRS)
Duren, Riley M.; Dragon, Karen; Gunter, Steve Z.; Gautier, Nick; Koch, Dave; Harvey, Adam; Enos, Alan; Borucki, Bill; Sobeck, Charlie; Mayer, Dave;
2004-01-01
The Kepler mission will launch in 2007 and determine the distribution of earth-size planets (0.5 to 10 earth masses) in the habitable zones (HZs) of solar-like stars. The mission will monitor > 100,000 dwarf stars simultaneously for at least 4 years. Precision differential photometry will be used to detect the periodic signals of transiting planets. Kepler will also support asteroseismology by measuring the pressure-mode (p-mode) oscillations of selected stars. Key mission elements include a spacecraft bus and 0.95 meter, wide-field, CCD-based photometer injected into an earth-trailing heliocentric orbit by a 3-stage Delta II launch vehicle as well as a distributed Ground Segment and Follow-up Observing Program. The project is currently preparing for Preliminary Design Review (October 2004) and is proceeding with detailed design and procurement of long-lead components. In order to meet the unprecedented photometric precision requirement and to ensure a statistically significant result, the Kepler mission involves technical challenges in the areas of photometric noise and systematic error reduction, stability, and false-positive rejection. Programmatic and logistical challenges include the collaborative design, modeling, integration, test, and operation of a geographically and functionally distributed project. A very rigorous systems engineering program has evolved to address these challenges. This paper provides an overview of the Kepler systems engineering program, including some examples of our processes and techniques in areas such as requirements synthesis, validation & verification, system robustness design, and end-to-end performance modeling.
Quality Assurance in the Presence of Variability
NASA Astrophysics Data System (ADS)
Lauenroth, Kim; Metzger, Andreas; Pohl, Klaus
Software Product Line Engineering (SPLE) is a reuse-driven development paradigm that has been applied successfully in information system engineering and other domains. Quality assurance of the reusable artifacts of the product line (e.g. requirements, design, and code artifacts) is essential for successful product line engineering. As those artifacts are reused in several products, a defect in a reusable artifact can affect several products of the product line. A central challenge for quality assurance in product line engineering is how to consider product line variability. Since the reusable artifacts contain variability, quality assurance techniques from single-system engineering cannot directly be applied to those artifacts. Therefore, different strategies and techniques have been developed for quality assurance in the presence of variability. In this chapter, we describe those strategies and discuss one of them, the so-called comprehensive strategy, in more detail. The comprehensive strategy aims at checking the quality of all possible products of the product line and thus offers the highest benefits, since it is able to uncover defects in all possible products of the product line. However, the central challenge for applying the comprehensive strategy is the complexity that results from the product line variability and the large number of potential products of a product line. In this chapter, we present one concrete technique that we have developed to implement the comprehensive strategy and to address this challenge. The technique is based on model checking technology and allows for a comprehensive verification of domain artifacts against temporal logic properties.
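The essence of the comprehensive strategy can be illustrated by brute-force enumeration of a toy product line, as sketched below; real implementations verify the variability-laden domain artifacts directly (e.g., with model checking) rather than building and checking each product, and the features, constraint, and required property here are invented.

```python
# Toy "comprehensive strategy": enumerate every valid product (feature
# combination) of a small product line and check each against a requirement.
from itertools import product as cartesian

optional_features = ["encryption", "logging", "remote_access"]

def is_valid(config):
    # Variability constraint: remote access requires encryption.
    return not (config["remote_access"] and not config["encryption"])

def satisfies_requirement(config):
    # Required property (illustrative): any product that logs must also encrypt.
    return not config["logging"] or config["encryption"]

violations = []
for choices in cartesian([False, True], repeat=len(optional_features)):
    config = dict(zip(optional_features, choices))
    if is_valid(config) and not satisfies_requirement(config):
        violations.append(config)

print("defective products:", violations)
```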
ETV Program Report: Big Fish Septage and High Strength Waste Water Treatment System
Verification testing of the Big Fish Environmental Septage and High Strength Wastewater Processing System for treatment of high-strength wastewater was conducted at the Big Fish facility in Charlevoix, Michigan. Testing was conducted over a 13-month period to address different c...
Design evolution of the orbiter reaction control subsystem
NASA Technical Reports Server (NTRS)
Taeber, R. J.; Karakulko, W.; Belvins, D.; Hohmann, C.; Henderson, J.
1985-01-01
The challenges of space shuttle orbiter reaction control subsystem development began with selection of the propellant for the subsystem. Various concepts were evaluated before the current Earth storable, bipropellant combination was selected. Once that task was accomplished, additional challenges of designing the system to satisfy the wide range of requirements dictated by operating environments, reusability, and long life were met. Verification of system adequacy was achieved by means of a combination of analysis and test. The studies, the design efforts, and the test and analysis techniques employed in meeting the challenges are described.
Miniaturized magnet-less RF electron trap. II. Experimental verification
Deng, Shiyang; Green, Scott R.; Markosyan, Aram H.; ...
2017-06-15
Atomic microsystems have the potential of providing extremely accurate measurements of timing and acceleration. However, atomic microsystems require active maintenance of ultrahigh vacuum in order to have reasonable operating lifetimes and are particularly sensitive to magnetic fields that are used to trap electrons in traditional sputter ion pumps. Our paper presents an approach to trapping electrons without the use of magnetic fields, using radio frequency (RF) fields established between two perforated electrodes. The challenges associated with this magnet-less approach, as well as the miniaturization of the structure, are addressed. These include, for example, the transfer of large voltage (100–200 V) RF power to capacitive loads presented by the structure. The electron trapping module (ETM) described here uses eight electrode elements to confine and measure electrons injected by an electron beam, within an active trap volume of 0.7 cm³. The operating RF frequency is 143.6 MHz, which is the measured series resonant frequency between the two RF electrodes. It was found experimentally that the steady-state electrode potentials on electrodes near the trap became more negative after applying a range of RF power levels (up to 0.15 W through the ETM), indicating electron densities of ≈3 × 10⁵ cm⁻³ near the walls of the trap. The observed results align well with predicted electron densities from analytical and numerical models. The peak electron density within the trap is estimated as ~1000 times the electron density in the electron beam as it exits the electron gun. Finally, this successful demonstration of the RF electron trapping concept addresses critical challenges in the development of miniaturized magnet-less ion pumps.
Fine-grained visual marine vessel classification for coastal surveillance and defense applications
NASA Astrophysics Data System (ADS)
Solmaz, Berkan; Gundogdu, Erhan; Karaman, Kaan; Yücesoy, Veysel; Koç, Aykut
2017-10-01
The need for capabilities of automated visual content analysis has substantially increased due to the presence of the large number of images captured by surveillance cameras. With a focus on development of practical methods for extracting effective visual data representations, deep neural network based representations have received great attention due to their success in visual categorization of generic images. For fine-grained image categorization, a closely related yet more challenging research problem compared to generic image categorization due to high visual similarities within subgroups, diverse applications were developed such as classifying images of vehicles, birds, food and plants. Here, we propose the use of deep neural network based representations for categorizing and identifying marine vessels for defense and security applications. First, we gather a large number of marine vessel images via online sources, grouping them into four coarse categories: naval, civil, commercial, and service vessels. Next, we subgroup naval vessels into fine categories such as corvettes, frigates and submarines. For distinguishing images, we extract state-of-the-art deep visual representations and train support-vector-machines. Furthermore, we fine-tune deep representations for marine vessel images. Experiments address two scenarios: classification and verification of naval marine vessels. The classification experiment targets coarse categorization, as well as learning models of fine categories. The verification experiment involves identification of specific naval vessels by revealing whether a pair of images belongs to the same vessel with the help of learnt deep representations. Obtaining promising performance, we believe these presented capabilities would be essential components of future coastal and on-board surveillance systems.
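A hedged sketch of the deep-features-plus-SVM recipe described above; the image paths and labels are hypothetical placeholders, and the sketch assumes a pretrained torchvision ResNet-50 (the pretrained flag is deprecated in newer releases in favor of weights) together with scikit-learn's linear SVM, not the authors' exact pipeline.

```python
# Minimal sketch of "deep features + SVM" for coarse vessel categories.
import torch
from torchvision import models, transforms
from PIL import Image
from sklearn.svm import LinearSVC

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
backbone = models.resnet50(pretrained=True)
backbone.fc = torch.nn.Identity()          # keep the 2048-d pooled features
backbone.eval()

def features(path):
    with torch.no_grad():
        x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        return backbone(x).squeeze(0).numpy()

train_images = ["naval_01.jpg", "civil_01.jpg", "commercial_01.jpg"]   # hypothetical
train_labels = ["naval", "civil", "commercial"]
clf = LinearSVC().fit([features(p) for p in train_images], train_labels)
print(clf.predict([features("unknown_vessel.jpg")]))                   # hypothetical
```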
Sandia National Laboratories: 100 Resilient Cities: Sandia Challenge:
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-09
... transportation systems to ensure freedom of movement for people and commerce. To achieve this mission, TSA is... security screening and identity verification of individuals, including identification media and identifying... addresses, phone numbers); Social Security Number, Fingerprints or other biometric identifiers; Photographs...
Environmental Horticulture. Project Report Phase I with Research Findings.
ERIC Educational Resources Information Center
Bachler, Mike; Sappe', Hoyt
This report provides results of Phase I of a project that researched the occupational area of environmental horticulture, established appropriate committees, and conducted task verification. These results are intended to guide development of a program designed to address the needs of the horticulture field. Section 1 contains general information:…
Using Replication Projects in Teaching Research Methods
ERIC Educational Resources Information Center
Standing, Lionel G.; Grenier, Manuel; Lane, Erica A.; Roberts, Meigan S.; Sykes, Sarah J.
2014-01-01
It is suggested that replication projects may be valuable in teaching research methods, and also address the current need in psychology for more independent verification of published studies. Their use in an undergraduate methods course is described, involving student teams who performed direct replications of four well-known experiments, yielding…
ERIC Educational Resources Information Center
Nieuwland, Mante S.
2016-01-01
Do negative quantifiers like "few" reduce people's ability to rapidly evaluate incoming language with respect to world knowledge? Previous research has addressed this question by examining whether online measures of quantifier comprehension match the "final" interpretation reflected in verification judgments. However, these…
Information Society: Agenda for Action in the UK.
ERIC Educational Resources Information Center
Phillips of Ellesmere, Lord
1997-01-01
Explains the House of Lords Select Committee on Science and Technology in the UK (United Kingdom) and discusses its report that addresses the need for information technology planning on a national basis. Topics include electronic publishing for access to government publications, universal access, regulatory framework, encryption and verification,…
Spring-Based Helmet System Support Prototype to Address Aircrew Neck Strain
2014-06-01
Report fragments (extraction residue): Helicopter Squadron stationed at CFB Borden; ALSE personnel; Flight Engineers; Pilots; Section 4.6, Discussion of Verification Results; Section 4.6.1, Reduce the mass on the ... the participant in the pilot’s posture; Figure 8, a simulation of Flight Engineers’ postures during landing and low-flying manoeuvres; Figure 9 (caption truncated).
Environmental Technology Verification Program Quality Management Plan, Version 3.0
The ETV QMP is a document that addresses specific policies and procedures that have been established for managing quality-related activities in the ETV program. It is the “blueprint” that defines an organization’s QA policies and procedures; the criteria for and areas of QA appli...
Addressing Behavior Needs by Disability Category
ERIC Educational Resources Information Center
Serfass, Cynthia
2009-01-01
The purpose of this study was to determine whether students with identified behavioral needs were provided a different level of behavioral intervention based on their special education disability category verification. A second purpose of this study was to determine what caused potential differences as interpreted by individuals working in the…
Which Accelerates Faster--A Falling Ball or a Porsche?
ERIC Educational Resources Information Center
Rall, James D.; Abdul-Razzaq, Wathiq
2012-01-01
An introductory physics experiment has been developed to address the issues seen in conventional physics lab classes including assumption verification, technological dependencies, and real world motivation for the experiment. The experiment has little technology dependence and compares the acceleration due to gravity by using position versus time…
NASA Technical Reports Server (NTRS)
Stehura, Aaron; Rozek, Matthew
2013-01-01
The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.
Direct and full-scale experimental verifications towards ground-satellite quantum key distribution
NASA Astrophysics Data System (ADS)
Wang, Jian-Yu; Yang, Bin; Liao, Sheng-Kai; Zhang, Liang; Shen, Qi; Hu, Xiao-Fang; Wu, Jin-Cai; Yang, Shi-Ji; Jiang, Hao; Tang, Yan-Lin; Zhong, Bo; Liang, Hao; Liu, Wei-Yue; Hu, Yi-Hua; Huang, Yong-Mei; Qi, Bo; Ren, Ji-Gang; Pan, Ge-Sheng; Yin, Juan; Jia, Jian-Jun; Chen, Yu-Ao; Chen, Kai; Peng, Cheng-Zhi; Pan, Jian-Wei
2013-05-01
Quantum key distribution (QKD) provides the only intrinsically unconditionally secure method for communication based on the principles of quantum mechanics. Compared with fibre-based demonstrations, free-space links could provide the most appealing solution for communication over much larger distances. Despite significant efforts, all realizations to date rely on stationary sites. Experimental verifications are therefore extremely crucial for applications to a typical low Earth orbit satellite. To achieve direct and full-scale verifications of our set-up, we have carried out three independent experiments with a decoy-state QKD system, and overcome all of these conditions. The system is operated on a moving platform (using a turntable), on a floating platform (using a hot-air balloon), and with a high-loss channel to demonstrate performance under conditions of rapid motion, attitude change, vibration, random movement of satellites, and a high-loss regime. The experiments address wide ranges of all leading parameters relevant to low Earth orbit satellites. Our results pave the way towards ground-satellite QKD and a global quantum communication network.
The FoReVer Methodology: A MBSE Framework for Formal Verification
NASA Astrophysics Data System (ADS)
Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald
2013-08-01
The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the Space industry and has been addressed so far through rigorous System and Software Development Processes and stringent Verification and Validation regimes. The Model Based Space System Engineering process (MBSSE) derived in the System and Software Functional Requirement Techniques study (SSFRT) focused on the application of model based engineering technologies to support the space system and software development processes, from mission level requirements to software implementation through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, where we aim at developing methodological, theoretical and technological support for a systematic approach to space avionics system development, in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Jun Soo; Choi, Yong Joon; Smith, Curtis Lee
2016-09-01
This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment throughout the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. The Requirements Traceability Matrices (RTMs) proposed in the previous document (INL-EXT-15-36684) are then updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment throughout the development process.
NASA Astrophysics Data System (ADS)
Vijayakumar, Ganesh; Sprague, Michael
2017-11-01
Demonstrating expected convergence rates with spatial- and temporal-grid refinement is the ``gold standard'' of code and algorithm verification. However, the lack of analytical solutions and the difficulty of generating manufactured solutions present challenges for verifying codes for complex systems. The application of the method of manufactured solutions (MMS) to verification of coupled multi-physics phenomena like fluid-structure interaction (FSI) has only seen recent investigation. While many FSI algorithms for aeroelastic phenomena have focused on boundary-resolved CFD simulations, the actuator-line representation of the structure is widely used for FSI simulations in wind-energy research. In this work, we demonstrate the verification of an FSI algorithm using MMS for actuator-line CFD simulations with a simplified spring-mass-damper (SMD) structural model. We use a manufactured solution for the fluid velocity field and the displacement of the SMD system. We demonstrate the convergence of both the fluid and structural solvers to second-order accuracy with grid and time-step refinement. This work was funded by the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Wind Energy Technologies Office, under Contract No. DE-AC36-08-GO28308 with the National Renewable Energy Laboratory.
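As a minimal illustration of the convergence check described above (this is not the authors' solver or test harness, and the error values are hypothetical), the observed order of accuracy can be computed from the discretization errors measured against the manufactured solution on successively refined grids:

```python
import numpy as np

def observed_order(errors, refinement_ratio=2.0):
    """Observed order of accuracy from errors on successively refined grids.

    errors: discrete L2 norms of (numerical - manufactured) solution,
            ordered from coarsest to finest grid.
    """
    errors = np.asarray(errors, dtype=float)
    return np.log(errors[:-1] / errors[1:]) / np.log(refinement_ratio)

# Example: errors that drop roughly 4x per halving of the grid spacing
# indicate second-order accuracy.
errs = [1.2e-2, 3.1e-3, 7.8e-4]          # hypothetical L2 errors
print(observed_order(errs))               # approx. [1.95, 1.99]
```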
Verification of the Sentinel-4 focal plane subsystem
NASA Astrophysics Data System (ADS)
Williges, Christian; Uhlig, Mathias; Hilbert, Stefan; Rossmann, Hannes; Buchwinkler, Kevin; Babben, Steffen; Sebastian, Ilse; Hohn, Rüdiger; Reulke, Ralf
2017-09-01
The Sentinel-4 payload is a multi-spectral camera system designed to monitor atmospheric conditions over Europe from a geostationary orbit. The German Aerospace Center, DLR Berlin, conducted the verification campaign of the Focal Plane Subsystem (FPS) during the second half of 2016. The FPS consists of two Focal Plane Assemblies (FPAs), two Front End Electronics (FEEs), one Front End Support Electronic (FSE) and one Instrument Control Unit (ICU). The FPAs are designed for two spectral ranges: UV-VIS (305 nm - 500 nm) and NIR (750 nm - 775 nm). In this publication, we present in detail the set-up of the verification campaign of the Sentinel-4 Qualification Model (QM). This set-up will also be used for the upcoming Flight Model (FM) verification, planned for early 2018. The FPAs have to be operated at 215 K +/- 5 K, making it necessary to use a thermal vacuum chamber (TVC) for the tests. The test campaign consists mainly of radiometric tests. This publication focuses on the challenge of remotely illuminating both Sentinel-4 detectors, as well as a reference detector, homogeneously over a distance of approximately 1 m from outside the TVC. Selected test analyses and results will be presented.
Comparison and quantitative verification of mapping algorithms for whole genome bisulfite sequencing
USDA-ARS?s Scientific Manuscript database
Coupling bisulfite conversion with next-generation sequencing (Bisulfite-seq) enables genome-wide measurement of DNA methylation, but poses unique challenges for mapping. However, despite a proliferation of Bisulfite-seq mapping tools, no systematic comparison of their genomic coverage and quantitat...
A lightweight and secure two factor anonymous authentication protocol for Global Mobility Networks.
Baig, Ahmed Fraz; Hassan, Khwaja Mansoor Ul; Ghani, Anwar; Chaudhry, Shehzad Ashraf; Khan, Imran; Ashraf, Muhammad Usman
2018-01-01
Global Mobility Networks (GLOMONETs) in wireless communication permit the global roaming services that enable a user to leverage mobile services in any foreign country. Technological growth in wireless communication is also accompanied by new security threats and challenges. A threat-proof authentication protocol in wireless communication may overcome the security flaws by allowing only legitimate users to access a particular service. Recently, Lee et al. found the Mun et al. scheme vulnerable to different attacks and proposed an advanced secure scheme to overcome the security flaws. However, this article points out that the Lee et al. scheme lacks user anonymity and local password verification, provides inefficient user authentication, and is vulnerable to replay and DoS attacks. Furthermore, this article presents a more robust anonymous authentication scheme to handle the threats and challenges found in Lee et al.'s protocol. The proposed protocol is formally verified with an automated tool (ProVerif). The proposed protocol has superior efficiency in comparison to the existing protocols.
A lightweight and secure two factor anonymous authentication protocol for Global Mobility Networks
2018-01-01
Global Mobility Networks (GLOMONETs) in wireless communication permit the global roaming services that enable a user to leverage mobile services in any foreign country. Technological growth in wireless communication is also accompanied by new security threats and challenges. A threat-proof authentication protocol in wireless communication may overcome the security flaws by allowing only legitimate users to access a particular service. Recently, Lee et al. found the Mun et al. scheme vulnerable to different attacks and proposed an advanced secure scheme to overcome the security flaws. However, this article points out that the Lee et al. scheme lacks user anonymity and local password verification, provides inefficient user authentication, and is vulnerable to replay and DoS attacks. Furthermore, this article presents a more robust anonymous authentication scheme to handle the threats and challenges found in Lee et al.'s protocol. The proposed protocol is formally verified with an automated tool (ProVerif). The proposed protocol has superior efficiency in comparison to the existing protocols. PMID:29702675
GAIA payload module mechanical development
NASA Astrophysics Data System (ADS)
Touzeau, S.; Sein, E.; Lebranchu, C.
2017-11-01
Gaia is the European Space Agency's cornerstone mission for global space astrometry. Its goal is to make the largest, most precise three-dimensional map of our Galaxy by surveying an unprecedented number of stars. This paper gives an overview of the mechanical system engineering and verification of the payload module. This development included several technical challenges. First of all, the very high stability required for the mission is a key design driver. The required stability is achieved through the extensive use of Silicon Carbide (Boostec® SiC) for both structures and mirrors, a high mechanical and thermal decoupling between the payload and service modules, and the use of high-performance engineering tools. Compliance of payload mass and volume with launcher capability is another key challenge, as well as the development and manufacturing of the 3.2-meter diameter toroidal primary structure. The spacecraft mechanical verification follows an innovative approach, with direct testing on the flight model, without any dedicated structural model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...
2017-03-23
To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
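The bit-for-bit evaluation mentioned above can be illustrated with a short, generic sketch (this is not LIVVkit's actual API; the field name and data are hypothetical):

```python
import numpy as np

def bit_for_bit(test, reference, name="variable"):
    """Bit-for-bit regression check of a model output field against a
    reference run; on failure, report how much and how widely they differ."""
    test, reference = np.asarray(test), np.asarray(reference)
    if test.shape != reference.shape:
        return {"name": name, "status": "FAIL", "reason": "shape mismatch"}
    if np.array_equal(test, reference):
        return {"name": name, "status": "PASS"}
    diff = np.abs(test - reference)
    return {"name": name, "status": "FAIL",
            "max_abs_diff": float(diff.max()),
            "n_differing": int(np.count_nonzero(diff))}

# Hypothetical ice-thickness fields from a test run and a reference run.
ref = np.linspace(0.0, 1000.0, 101)
tst = ref.copy(); tst[50] += 1e-9          # a single non-bit-identical value
print(bit_for_bit(tst, ref, "ice_thickness"))
```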
NASA Astrophysics Data System (ADS)
Vielhauer, Claus; Croce Ferri, Lucilla
2003-06-01
Our paper addresses two issues of a biometric authentication algorithm for ID cardholders presented previously, namely the security of the embedded reference data and the aging of the biometric data. We describe a protocol that allows two levels of verification, combining a biometric hash technique based on handwritten signature and hologram watermarks with cryptographic signatures in a verification infrastructure. This infrastructure consists of a Trusted Central Public Authority (TCPA), which serves numerous Enrollment Stations (ES) in a secure environment. Each individual performs an enrollment at an ES, which provides the TCPA with the full biometric reference data and a document hash. The TCPA then calculates the authentication record (AR) with the biometric hash, a validity timestamp, and the document hash provided by the ES. The AR is then signed with a cryptographic signature function, initialized with the TCPA's private key, and embedded in the ID card as a watermark. Authentication is performed at Verification Stations (VS), where the ID card is scanned and the signed AR is retrieved from the watermark. Due to the timestamp mechanism and a two-level biometric verification technique based on offline and online features, the AR can deal with the aging of the biometric feature by forcing a re-enrollment of the user after expiry, making use of the ES infrastructure. We describe some attack scenarios and illustrate the watermark embedding, retrieval and dispute protocols, analyzing their requisites, advantages and disadvantages in relation to security requirements.
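A minimal sketch of the signing and verification flow described above, under stated assumptions: the field layout, key handling, and the choice of Ed25519 are illustrative (the paper does not specify the signature algorithm), and the watermark embedding/retrieval step is abstracted away:

```python
from hashlib import sha256
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# --- Enrollment side (TCPA): names and record layout are illustrative ---
tcpa_key = Ed25519PrivateKey.generate()

def build_signed_ar(biometric_hash: bytes, doc_hash: bytes, expiry: str) -> bytes:
    """AR = biometric hash + validity timestamp + document hash, signed with the
    TCPA private key; the result would be embedded in the ID card as a watermark."""
    record = biometric_hash + expiry.encode() + doc_hash
    return record + tcpa_key.sign(record)

# --- Verification Station side: holds only the TCPA public key ---
def verify_ar(blob: bytes) -> bool:
    record, signature = blob[:-64], blob[-64:]      # Ed25519 signatures are 64 bytes
    try:
        tcpa_key.public_key().verify(signature, record)
        return True
    except InvalidSignature:
        return False

ar = build_signed_ar(sha256(b"online+offline signature features").digest(),
                     sha256(b"ID card document data").digest(),
                     "2027-12-31")
print(verify_ar(ar))          # True unless the retrieved payload was altered
```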
Scatterometer-Calibrated Stability Verification Method
NASA Technical Reports Server (NTRS)
McWatters, Dalia A.; Cheetham, Craig M.; Huang, Shouhua; Fischman, Mark A.; CHu, Anhua J.; Freedman, Adam P.
2011-01-01
The requirement for scatterometer-combined transmit-receive gain variation knowledge is typically addressed by sampling a portion of the transmit signal, attenuating it with a known-stable attenuation, and coupling it into the receiver chain. This way, the gain variations of the transmit and receive chains are represented by this loop-back calibration signal, and can be subtracted from the received remote radar echo. Certain challenges are presented by this process, such as transmit and receive components that are outside of this loop-back path and are not included in this calibration, as well as the impracticality of measuring the stability of the transmit and receive chains separately after fabrication without the measurement errors from the test set-up exceeding the requirement for the flight instrument. To cover the RF stability design challenge, the portions of the scatterometer that are not calibrated by the loop-back (e.g., attenuators, switches, diplexers, couplers, and coaxial cables) are tightly thermally controlled, and have been characterized over temperature to contribute less than 0.05 dB of calibration error over worst-case thermal variation. To address the verification challenge, including the components that are not calibrated by the loop-back, a stable fiber-optic delay line (FODL) was used to delay the transmitted pulse and to route it into the receiver. In this way, the internal loop-back signal amplitude variations can be compared to the full transmit/receive external path, while the flight hardware is in the worst-case thermal environment. The practical delay for implementing the FODL is 100 µs. The scatterometer pulse width is 1 ms, so a test mode was incorporated early in the design phase to scale the 1-ms pulse at 100-Hz pulse repetition frequency, by a factor of 18, to a 55-µs pulse with a 556-µs pulse repetition interval (PRI). This scaling maintains the duty cycle, thus maintaining a representative thermal state for the RF components. The FODL consists of an RF-modulated fiber-optic transmitter, 20 km of SMF-28 standard single-mode fiber, and a photodetector. Thermoelectric cooling and insulating packaging are used to achieve high thermal stability of the FODL components. The chassis was insulated with 1-in. (2.5-cm) thermal isolation foam. Nylon rods support the Micarta plate, onto which are mounted four 5-km fiber spool boxes. A copper plate heat sink was mounted on top of the fiber boxes (with a thermal grease layer) and screwed onto the thermoelectric cooler plate. Another thermal isolation layer in the middle separates the fiber-optics chamber from the RF electronics components, which are also mounted on a copper plate that is screwed onto another thermoelectric cooler. The scatterometer subsystem's overall stability was successfully verified to be calibratable to within 0.1 dB error in thermal vacuum (TVAC) testing with the fiber-optic delay line, while the scatterometer temperature was ramped from 10 to 30 °C, which is a much larger temperature range than the worst-case expected seasonal variations.
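The delay and pulse-scaling figures above follow from simple relations; the group index n ≈ 1.47 assumed for SMF-28 is a typical value, not quoted in the abstract:

```latex
\tau_{\mathrm{FODL}} = \frac{nL}{c}
  \approx \frac{1.47 \times 20\,\mathrm{km}}{3\times 10^{8}\,\mathrm{m/s}}
  \approx 98\ \mu\mathrm{s} \approx 100\ \mu\mathrm{s}

\frac{1\,\mathrm{ms}}{18} \approx 55.6\ \mu\mathrm{s}, \qquad
\frac{1/(100\,\mathrm{Hz})}{18} = \frac{10\,\mathrm{ms}}{18} \approx 556\ \mu\mathrm{s},
\qquad
\text{duty cycle} = \frac{55.6\ \mu\mathrm{s}}{556\ \mu\mathrm{s}} = \frac{1}{10}\ \text{(unchanged)}
```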
The Use of Remote Sensing Satellites for Verification in International Law
NASA Astrophysics Data System (ADS)
Hettling, J. K.
The contribution addresses a very sensitive topic which is currently gaining significance and importance in the international community. It involves questions of international law as well as the contemplation of new developments and decisions in international politics. The paper will begin with the meaning and current status of verification in international law as well as the legal basis of satellite remote sensing in international treaties and resolutions. For the verification part, this implies giving a definition of verification and naming its fields of application and the different means of verification. For the remote sensing part, it involves the identification of relevant provisions in the Outer Space Treaty and the United Nations General Assembly Principles on Remote Sensing. Furthermore, practical examples will be examined: to what extent have remote sensing satellites been used to verify international obligations? Are there treaties which would considerably profit from the use of remote sensing satellites? In this respect, there are various examples which can be contemplated, such as the ABM Treaty (even though now out of force), the SALT and START Agreements, the Chemical Weapons Convention and the Comprehensive Test Ban Treaty. It will also be mentioned that NGOs have started to verify international conventions, e.g. Landmine Monitor is verifying the Mine-Ban Convention. Apart from verifying arms control and disarmament treaties, satellites can also strengthen the negotiation of peace agreements (such as the Dayton Peace Talks) and the prevention of international conflicts. Verification has played an increasingly prominent role in high-profile UN operations. Verification and monitoring can be applied to the whole range of elements that constitute a peace implementation process, ranging from the military aspects through electoral monitoring and human rights monitoring, from negotiating an accord to finally monitoring it. Last but not least, the problem of enforcing international obligations needs to be addressed, especially the dependence of international law on the will of political leaders and their respective national interests.
Autonomy Software: V&V Challenges and Characteristics
NASA Technical Reports Server (NTRS)
Schumann, Johann; Visser, Willem
2006-01-01
The successful operation of unmanned air vehicles requires software with a high degree of autonomy. Only if high-level functions can be carried out without human control and intervention can complex missions in a changing and potentially unknown environment be carried out successfully. Autonomy software is highly mission- and safety-critical: failures caused by flaws in the software can not only jeopardize the mission, but could also endanger human life (e.g., a crash of a UAV in a densely populated area). Due to its large size, high complexity, and use of specialized algorithms (planner, constraint solver, etc.), autonomy software poses specific challenges for its verification, validation, and certification. We have carried out a survey among researchers and scientists at NASA to study these issues. In this paper, we present major results of this study, discussing the broad spectrum of notions and characteristics of autonomy software and its challenges for design and development. A main focus of this survey was to evaluate verification and validation (V&V) issues and challenges, compared to the development of "traditional" safety-critical software. We discuss important issues in V&V of autonomous software and advanced V&V tools which can help to mitigate software risks. Results of this survey will help to identify and understand safety concerns in autonomy software and will lead to improved strategies for mitigation of these risks.
An assessment of space shuttle flight software development processes
NASA Technical Reports Server (NTRS)
1993-01-01
In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.
Maintaining Continuity of Knowledge of Spent Fuel Pools: Tool Survey
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benz, Jacob M.; Smartt, Heidi A.; Tanner, Jennifer E.
This report examines supplemental tools that can be used in addition to optical surveillance cameras to maintain continuity of knowledge (CoK) in low-to-no light conditions, and to increase the efficiency and effectiveness of spent fuel CoK, including item counting and ID verification, in challenging conditions.
NASA Technical Reports Server (NTRS)
Shull, Forrest; Bechtel, Andre; Feldmann, Raimund L.; Regardie, Myrna; Seaman, Carolyn
2008-01-01
This viewgraph presentation addresses the effectiveness of inspection and verification and validation (V&V) in developing computer systems. A specific question is the relation between V&V effectiveness early in the development lifecycle and the later testing of the developed system.
Automated Network Mapping and Topology Verification
2016-06-01
The collection of information includes amplifying data about the networked devices such as hardware details, logical addressing schemes, operating ... The current military reliance on computer networks for operational missions and administrative duties makes network
Private Security Contractors: The Other Force
2011-03-22
improving PSC oversight. This paper will not address private contractors conducting police force training, governmental use of PSCs outside of Iraq... theater entry requirements, conduct mandatory training, conduct weapons training and qualification, and conduct security verification and criminal... an effective oversight program including contractor deployment tracking, limited contract oversight personnel, and untrained Contract Officer
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-09
...; modifies the reporting requirements associated with tracking domestic tuna canning and processing... products. The law addressed a Congressional finding that "consumers would like to know if the tuna they... implement the DPCIA, including specifically the authority to establish a domestic tracking and verification...
On April 22, 2008, EPA issued the final Lead; Renovation, Repair, and Painting (RRP) Program Rule. The rule addresses lead-based paint hazards created by renovation, repair, and painting activities that disturb lead-based paint in target housing and child-occupied facilities. Und...
ERIC Educational Resources Information Center
Harel, Assaf; Bentin, Shlomo
2009-01-01
The type of visual information needed for categorizing faces and nonface objects was investigated by manipulating spatial frequency scales available in the image during a category verification task addressing basic and subordinate levels. Spatial filtering had opposite effects on faces and airplanes that were modulated by categorization level. The…
Moral Development Research Designed to Make a Difference: Some Gaps Waiting to be Filled.
ERIC Educational Resources Information Center
Kuhmerker, Lisa
1995-01-01
Encapsulates five brief reports on cutting edge issues in moral education research. Discusses strengths and weaknesses of different administrative approaches to creating a character education program. Addresses the inherent dichotomy between military service and democratic values. Considers issues of data verification and abuse of power. (MJP)
Selected Examples of LDRD Projects Supporting Test Ban Treaty Verification and Nonproliferation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jackson, K.; Al-Ayat, R.; Walter, W. R.
The Laboratory Directed Research and Development (LDRD) Program at the DOE National Laboratories was established to ensure the scientific and technical vitality of these institutions and to enhance their ability to respond to evolving missions and anticipate national needs. LDRD allows the Laboratory directors to invest a percentage of their total annual budget in cutting-edge research and development projects within their mission areas. We highlight a selected set of LDRD-funded projects, in chronological order, that have helped provide capabilities, people and infrastructure that contributed greatly to our ability to respond to technical challenges in support of test ban treaty verification and nonproliferation.
Abstract for 1999 Rational Software User Conference
NASA Technical Reports Server (NTRS)
Dunphy, Julia; Rouquette, Nicolas; Feather, Martin; Tung, Yu-Wen
1999-01-01
We develop spacecraft fault-protection software at NASA/JPL. Challenges exemplified by our task: 1) high-quality systems - need for extensive validation & verification; 2) multi-disciplinary context - involves experts from diverse areas; 3) embedded systems - must adapt to external practices, notations, etc.; and 4) development pressures - NASA's mandate of "better, faster, cheaper".
While aerosol radiative effects have been recognized as some of the largest sources of uncertainty among the forcers of climate change, the verification of the spatial and temporal variability of aerosol radiative forcing has remained challenging. Anthropogenic emissions of prima...
A significant challenge in environmental studies is to determine the onset and extent of MTBE bioremediation at an affected site, which may involve indirect approaches such as microcosm verification of microbial activities at a given site. Stable isotopic fractionation is cha...
Verification of the CFD simulation system SAUNA for complex aircraft configurations
NASA Astrophysics Data System (ADS)
Shaw, Jonathon A.; Peace, Andrew J.; May, Nicholas E.; Pocock, Mark F.
1994-04-01
This paper is concerned with the verification, for complex aircraft configurations, of an advanced CFD simulation system known by the acronym SAUNA. A brief description of the complete system is given, including its unique use of differing grid generation strategies (structured, unstructured or both) depending on the geometric complexity of the addressed configuration. The majority of the paper focuses on the application of SAUNA to a variety of configurations from the military aircraft, civil aircraft and missile areas. Mesh generation issues are discussed for each geometry, and experimental data are used to assess the accuracy of the inviscid (Euler) model used. It is shown that flexibility and accuracy are combined in an efficient manner, thus demonstrating the value of SAUNA in aerodynamic design.
Special features of the CLUSTER antenna and radial booms design, development and verification
NASA Technical Reports Server (NTRS)
Gianfiglio, G.; Yorck, M.; Luhmann, H. J.
1995-01-01
CLUSTER is a scientific space mission to investigate the Earth's plasma environment in situ by means of four identical spin-stabilized spacecraft. Each spacecraft is provided with a set of four rigid booms: two Antenna Booms and two Radial Booms. This paper presents a summary of the boom development and verification phases, addressing the key aspects of the Radial Boom design. In particular, it concentrates on the difficulties encountered in simultaneously fulfilling the requirements of minimum torque ratio and maximum allowed shock loads at boom latching for this two-degree-of-freedom boom. The paper also provides an overview of the analysis campaign and testing program performed to achieve sufficient confidence in the boom performance and operation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarkar, Avik; Sun, Xin; Sundaresan, Sankaran
2014-04-23
The accuracy of coarse-grid multiphase CFD simulations of fluidized beds may be improved via the inclusion of filtered constitutive models. In our previous study (Sarkar et al., Chem. Eng. Sci., 104, 399-412), we developed such a set of filtered drag relationships for beds with immersed arrays of cooling tubes. Verification of these filtered drag models is addressed in this work. Predictions from coarse-grid simulations with the sub-grid filtered corrections are compared against accurate, highly-resolved simulations of full-scale turbulent and bubbling fluidized beds. The filtered drag models offer a computationally efficient yet accurate alternative for obtaining macroscopic predictions, but the spatial resolution of meso-scale clustering heterogeneities is sacrificed.
Toward a therapy for mitochondrial disease
Viscomi, Carlo
2016-01-01
Mitochondrial disorders are a group of genetic diseases affecting the energy-converting process of oxidative phosphorylation. The extreme variability of symptoms, organ involvement, and clinical course represents a challenge to the development of effective therapeutic interventions. However, new possibilities have recently been emerging from studies in model organisms and are awaiting verification in humans. I will discuss here the most promising experimental approaches and the challenges we face in translating them into the clinic. The current clinical trials will also be briefly reviewed. PMID:27911730
Principles and Benefits of Explicitly Designed Medical Device Safety Architecture.
Larson, Brian R; Jones, Paul; Zhang, Yi; Hatcliff, John
The complexity of medical devices and the processes by which they are developed pose considerable challenges to producing safe designs and regulatory submissions that are amenable to effective reviews. Designing an appropriate and clearly documented architecture can be an important step in addressing this complexity. Best practices in medical device design embrace the notion of a safety architecture organized around distinct operation and safety requirements. By explicitly separating many safety-related monitoring and mitigation functions from operational functionality, the aspects of a device most critical to safety can be localized into a smaller and simpler safety subsystem, thereby enabling easier verification and more effective reviews of claims that causes of hazardous situations are detected and handled properly. This article defines medical device safety architecture, describes its purpose and philosophy, and provides an example. Although many of the presented concepts may be familiar to those with experience in realization of safety-critical systems, this article aims to distill the essence of the approach and provide practical guidance that can potentially improve the quality of device designs and regulatory submissions.
Xu, Xinxing; Li, Wen; Xu, Dong
2015-12-01
In this paper, we propose a new approach to improve face verification and person re-identification in the RGB images by leveraging a set of RGB-D data, in which we have additional depth images in the training data captured using depth cameras such as Kinect. In particular, we extract visual features and depth features from the RGB images and depth images, respectively. As the depth features are available only in the training data, we treat the depth features as privileged information, and we formulate this task as a distance metric learning with privileged information problem. Unlike the traditional face verification and person re-identification tasks that only use visual features, we further employ the extra depth features in the training data to improve the learning of distance metric in the training process. Based on the information-theoretic metric learning (ITML) method, we propose a new formulation called ITML with privileged information (ITML+) for this task. We also present an efficient algorithm based on the cyclic projection method for solving the proposed ITML+ formulation. Extensive experiments on the challenging faces data sets EUROCOM and CurtinFaces for face verification as well as the BIWI RGBD-ID data set for person re-identification demonstrate the effectiveness of our proposed approach.
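To make the verification step concrete, the sketch below shows how a learned Mahalanobis metric is applied at test time to decide whether two RGB feature vectors belong to the same person. It does not reproduce the ITML+ optimization itself; the metric matrix, features, and threshold are hypothetical placeholders.

```python
import numpy as np

def mahalanobis_dist(x, y, M):
    """Distance between feature vectors under a learned metric M (PSD matrix)."""
    d = x - y
    return float(np.sqrt(d @ M @ d))

def same_person(feat_a, feat_b, M, threshold):
    """Verification decision: accept the pair if the learned-metric distance
    falls below a threshold tuned on validation data."""
    return mahalanobis_dist(feat_a, feat_b, M) < threshold

# Toy example with an identity metric and made-up RGB features.
M = np.eye(4)
a = np.array([0.20, 0.50, 0.10, 0.90])
b = np.array([0.25, 0.45, 0.15, 0.85])
print(same_person(a, b, M, threshold=0.5))   # True for this toy pair
```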
Verification of the SENTINEL-4 Focal Plane Subsystem
NASA Astrophysics Data System (ADS)
Williges, C.; Hohn, R.; Rossmann, H.; Hilbert, S.; Uhlig, M.; Buchwinkler, K.; Reulke, R.
2017-05-01
The Sentinel-4 payload is a multi-spectral camera system which is designed to monitor atmospheric conditions over Europe. The German Aerospace Center (DLR) in Berlin, Germany conducted the verification campaign of the Focal Plane Subsystem (FPS) on behalf of Airbus Defense and Space GmbH, Ottobrunn, Germany. The FPS consists, inter alia, of two Focal Plane Assemblies (FPAs), one for the UV-VIS spectral range (305 nm … 500 nm), the second for the NIR (750 nm … 775 nm). In this publication, we present in detail the opto-mechanical laboratory set-up of the verification campaign of the Sentinel-4 Qualification Model (QM), which will also be used for the upcoming Flight Model (FM) verification. The test campaign consists mainly of radiometric tests performed with an integrating sphere as a homogenous light source. The FPAs have to be operated mainly at 215 K ± 5 K, making it necessary to use a thermal vacuum chamber (TVC) for the tests. This publication focuses on the challenge of remotely illuminating both Sentinel-4 detectors, as well as a reference detector, homogeneously over a distance of approximately 1 m from outside the TVC. Furthermore, selected test analyses and results will be presented, showing that the Sentinel-4 FPS meets specifications.
The NASA Commercial Crew Program (CCP) Mission Assurance Process
NASA Technical Reports Server (NTRS)
Canfield, Amy
2016-01-01
In 2010, NASA established the Commercial Crew Program in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge for NASA has been how to determine that a commercial provider's transportation system complies with Programmatic safety requirements. The process used in this determination is the Safety Technical Review Board, which reviews and approves provider-submitted Hazard Reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100 percent of these safety-critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (SMA) model does not support the nature of the Commercial Crew Program. To that end, NASA SMA is implementing a Risk Based Assurance (RBA) process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is also being developed to efficiently use the available resources to execute the verifications. This paper will describe the evolution of the CCP Mission Assurance process from the beginning of the Program to its current incarnation. Topics to be covered include a short history of the CCP; the development of the Programmatic mission assurance requirements; the current safety review process; a description of the RBA process and its products; and a description of the Shared Assurance Model.
Ground based ISS payload microgravity disturbance assessments.
McNelis, Anne M; Heese, John A; Samorezov, Sergey; Moss, Larry A; Just, Marcus L
2005-01-01
In order to verify that the International Space Station (ISS) payload facility racks do not disturb the microgravity environment of neighboring facility racks and that the facility science operations are not compromised, a testing and analytical verification process must be followed. Currently no facility racks have taken this process from start to finish. The authors are participants in implementing this process for the NASA Glenn Research Center (GRC) Fluids and Combustion Facility (FCF). To address the testing part of the verification process, the Microgravity Emissions Laboratory (MEL) was developed at GRC. The MEL is a 6-degree-of-freedom inertial measurement system capable of characterizing inertial response forces (emissions) of components, sub-rack payloads, or rack-level payloads down to 10^-7 g. The inertial force output data, generated from the steady-state or transient operations of the test articles, are utilized in analytical simulations to predict the on-orbit vibratory environment at specific science or rack interface locations. Once the facility payload rack and disturbers are properly modeled, an assessment can be made as to whether required microgravity levels are achieved. The modeling is utilized to develop microgravity predictions, which lead to the development of microgravity-sensitive ISS experiment operations once on-orbit. The on-orbit measurements will be verified by use of the NASA GRC Space Acceleration Measurement System (SAMS). The major topics to be addressed in this paper are: (1) Microgravity Requirements, (2) Microgravity Disturbers, (3) MEL Testing, (4) Disturbance Control, (5) Microgravity Control Process, and (6) On-Orbit Predictions and Verification. Published by Elsevier Ltd.
16 CFR 315.1 - Scope of regulations in this part.
Code of Federal Regulations, 2010 CFR
2010-01-01
... CONTACT LENS RULE § 315.1 Scope of regulations in this part. This part, which shall be called the “Contact Lens Rule,” implements the Fairness to Contact Lens Consumers Act, codified at 15 U.S.C. 7601-7610, which requires that rules be issued to address the release, verification, and sale of contact lens...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-12
... received noise. It has been shown that in most cases, TS occurs at the frequencies approximately one-octave... comments sent to addresses other than the one provided here. Comments sent via email, including all...) were measured by JASCO during a monitoring sound source verification (SSV) study conducted for Statoil...
A drinking water method for 12 chemicals, predominately pesticides, is presented that addresses the occurrence monitoring needs of the U.S. Environmental Protection Agency (EPA) for a future Unregulated Contaminant Monitoring Regulation (UCMR). The method employs solid phase ext...
ERIC Educational Resources Information Center
Herrington, Deborah G.; Yezierski, Ellen J.
2014-01-01
The recent revisions to the advanced placement (AP) chemistry curriculum promote deep conceptual understanding of chemistry content over more rote memorization of facts and algorithmic problem solving. For many teachers, this will mean moving away from traditional worksheets and verification lab activities that they have used to address the vast…
A drinking water method for seven pesticides and pesticide degradates is presented that addresses the occurrence monitoring needs of the US Environmental Protection Agency (EPA) for a future Unregulated Contaminant Monitoring Regulation (UCMR). The method employs online solid pha...
Rule groupings: An approach towards verification of expert systems
NASA Technical Reports Server (NTRS)
Mehrotra, Mala
1991-01-01
Knowledge-based expert systems are playing an increasingly important role in NASA space and aircraft systems. However, many of NASA's software applications are life- or mission-critical and knowledge-based systems do not lend themselves to the traditional verification and validation techniques for highly reliable software. Rule-based systems lack the control abstractions found in procedural languages. Hence, it is difficult to verify or maintain such systems. Our goal is to automatically structure a rule-based system into a set of rule-groups having a well-defined interface to other rule-groups. Once a rule base is decomposed into such 'firewalled' units, studying the interactions between rules would become more tractable. Verification-aid tools can then be developed to test the behavior of each such rule-group. Furthermore, the interactions between rule-groups can be studied in a manner similar to integration testing. Such efforts will go a long way towards increasing our confidence in the expert-system software. Our research efforts address the feasibility of automating the identification of rule groups, in order to decompose the rule base into a number of meaningful units.
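As a rough illustration of the kind of automated grouping described above (the clustering criterion and the tiny rule base are hypothetical simplifications, not the authors' algorithm), rules that read or write the same working-memory facts can be clustered into "firewalled" groups by computing connected components of the sharing graph:

```python
from collections import defaultdict

def group_rules(rules):
    """Cluster rules that share any fact (condition or action) into groups.

    rules: dict mapping rule name -> set of fact/attribute names it reads or writes.
    Returns a list of rule-name sets (connected components of the sharing graph).
    """
    parent = {r: r for r in rules}          # union-find over rule names
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    def union(a, b):
        parent[find(a)] = find(b)

    by_fact = defaultdict(list)
    for rule, facts in rules.items():
        for f in facts:
            by_fact[f].append(rule)
    for members in by_fact.values():
        for other in members[1:]:
            union(members[0], other)

    groups = defaultdict(set)
    for r in rules:
        groups[find(r)].add(r)
    return list(groups.values())

# Hypothetical rule base: two independent rule-groups emerge.
rules = {
    "check_pressure": {"tank_pressure", "valve_state"},
    "open_valve":     {"valve_state", "pump_cmd"},
    "log_telemetry":  {"telemetry_buffer"},
}
print(group_rules(rules))
```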
Using software security analysis to verify the secure socket layer (SSL) protocol
NASA Technical Reports Server (NTRS)
Powell, John D.
2004-01-01
The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers, among its capabilities, formal verification of software security properties, through the use of model-based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper will discuss: the need for formal analysis to assure software systems with respect to software security, and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties over the Secure Socket Layer (SSL) communication protocol as a demonstration.
Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data
NASA Astrophysics Data System (ADS)
Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai
2017-11-01
Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine load model's effectiveness and accuracy. Statistical analysis, instead of visual check, quantifies the load model's accuracy, and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as a guidance to systematically examine load models for utility engineers and researchers. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
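A minimal sketch of the quantitative comparison the framework calls for, assuming field recordings and simulated model output sampled at the same instants (the function name and the example values are hypothetical):

```python
import numpy as np
from scipy import stats

def validate_load_model(measured, simulated, alpha=0.05):
    """Quantify load-model accuracy against field measurements.

    Returns the RMSE and a (1 - alpha) confidence interval on the mean error,
    giving the model user a quantitative accuracy statement rather than a
    visual curve comparison.
    """
    measured = np.asarray(measured, float)
    simulated = np.asarray(simulated, float)
    err = simulated - measured
    rmse = np.sqrt(np.mean(err ** 2))
    n = err.size
    half_width = stats.t.ppf(1 - alpha / 2, n - 1) * err.std(ddof=1) / np.sqrt(n)
    return rmse, (err.mean() - half_width, err.mean() + half_width)

# Hypothetical active-power recordings (MW) during a disturbance event.
p_meas = [50.1, 48.7, 47.9, 48.4, 49.6]
p_sim  = [50.4, 49.1, 48.0, 48.1, 49.9]
print(validate_load_model(p_meas, p_sim))
```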
Barriers, Challenges, and Decision-Making in the Letter Writing Process for Gender Transition.
Budge, Stephanie L; Dickey, Lore M
2017-03-01
This article addresses the challenges that clinicians face in writing letters of support for transgender and gender-diverse clients. It addresses common but challenging clinical representations to help the reader understand the nuances associated with writing letters. Three cases are presented. The first addresses systemic challenges, the second addresses management of care, and the third addresses co-occurring mental health concerns. Recommendations for practice are provided based on the experiences included within the 3 case examples. Copyright © 2016 Elsevier Inc. All rights reserved.
The SPoRT-WRF: Evaluating the Impact of NASA Datasets on Convective Forecasts
NASA Technical Reports Server (NTRS)
Zavodsky, Bradley; Case, Jonathan; Kozlowski, Danielle; Molthan, Andrew
2012-01-01
The Short-term Prediction Research and Transition Center (SPoRT) is a collaborative partnership between NASA and operational forecasting entities, including a number of National Weather Service offices. SPoRT transitions real-time NASA products and capabilities to its partners to address specific operational forecast challenges. One challenge that forecasters face is applying convection-allowing numerical models to predict mesoscale convective weather. In order to address this specific forecast challenge, SPoRT produces real-time mesoscale model forecasts using the Weather Research and Forecasting (WRF) model that includes unique NASA products and capabilities. Currently, the SPoRT configuration of the WRF model (SPoRT-WRF) incorporates the 4-km Land Information System (LIS) land surface data, 1-km SPoRT sea surface temperature analysis and 1-km Moderate resolution Imaging Spectroradiometer (MODIS) greenness vegetation fraction (GVF) analysis, and retrieved thermodynamic profiles from the Atmospheric Infrared Sounder (AIRS). The LIS, SST, and GVF data are all integrated into the SPoRT-WRF through adjustments to the initial and boundary conditions, and the AIRS data are assimilated into a 9-hour SPoRT WRF forecast each day at 0900 UTC. This study dissects the overall impact of the NASA datasets and the individual surface and atmospheric component datasets on daily mesoscale forecasts. A case study covering the super tornado outbreak across the Central and Southeastern United States during 25-27 April 2011 is examined. Three different forecasts are analyzed including the SPoRT-WRF (NASA surface and atmospheric data), the SPoRT WRF without AIRS (NASA surface data only), and the operational National Severe Storms Laboratory (NSSL) WRF (control with no NASA data). The forecasts are compared qualitatively by examining simulated versus observed radar reflectivity. Differences between the simulated reflectivity are further investigated using convective parameters along with model soundings to determine the impacts of the various NASA datasets. Additionally, quantitative evaluation of select meteorological parameters is performed using the Meteorological Evaluation Tools model verification package to compare forecasts to in situ surface and upper air observations.
Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.
2014-01-01
Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step towards building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748
Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S
2013-12-06
Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.
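For illustration, a standard normal-approximation sample-size calculation of the kind such a statistical framework would build on (this is the generic two-sample comparison formula, not the workshop's specific procedure; the effect size and variance values are hypothetical):

```python
from math import ceil
from scipy.stats import norm

def samples_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Biospecimens per group needed to detect a mean biomarker difference
    `delta` (assay + biological SD `sigma`, two-sided test) with the
    requested power, using the normal-approximation two-sample formula."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

# e.g. detect a 1.5-fold change (~0.58 on a log2 scale) with SD 1.0
print(samples_per_group(delta=0.58, sigma=1.0))   # ~47 per group
```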
UNSCOM faces entirely new verification challenges in Iraq
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trevan, T.
1993-04-01
Starting with the very first declarations and inspections, it became evident that Iraq was not acting in good faith, would use every possible pretext to reinterpret UNSCOM's inspection rights, and occasionally would use harassment tactics to make inspections as difficult as possible. Topics considered in detail include; initial assumptions, outstanding issues, and UNSCOM's future attitude.
Top DoD Management Challenges, Fiscal Year 2018
2018-01-01
Afghan Human Resource Information Management System to validate ANDSF personnel numbers and salaries; • Afghan Personnel Pay System to facilitate... unit strength accountability and personnel verification; and • Core Information Management System to improve accountability of equipment inventories... ACQUISITION AND CONTRACT MANAGEMENT: the Federal Acquisition Regulation requires contractor performance information be collected in the Contractor
Verification of Accurate Technical Insight: A Prerequisite for Self-Directed Surgical Training
ERIC Educational Resources Information Center
Hu, Yinin; Kim, Helen; Mahmutovic, Adela; Choi, Joanna; Le, Ivy; Rasmussen, Sara
2015-01-01
Simulation-based surgical skills training during preclinical education is a persistent challenge due to time constraints of trainees and instructors alike. Self-directed practice is resource-efficient and flexible; however, insight into technical proficiency among trainees is often lacking. The purpose of this study is to prospectively assess the…
Attention and Implicit Memory in the Category-Verification and Lexical Decision Tasks
ERIC Educational Resources Information Center
Mulligan, Neil W.; Peterson, Daniel
2008-01-01
Prior research on implicit memory appeared to support 3 generalizations: Conceptual tests are affected by divided attention, perceptual tasks are affected by certain divided-attention manipulations, and all types of priming are affected by selective attention. These generalizations are challenged in experiments using the implicit tests of category…
Forecast Verification: Identification of small changes in weather forecasting skill
NASA Astrophysics Data System (ADS)
Weatherhead, E. C.; Jensen, T. L.
2017-12-01
Global and regional weather forecasts have improved over the past seven decades, most often because of small, incremental improvements. The identification and verification of forecast improvement due to proposed small changes in forecasting can be expensive and, if not carried out efficiently, can slow progress in forecasting development. This presentation will look at the skill of commonly used verification techniques and show how the ability to detect improvements can depend on the magnitude of the improvement, the number of runs used to test the improvement, the location on the Earth, and the statistical techniques used. For continuous variables, such as temperature, wind and humidity, the skill of a forecast can be directly compared using a pair-wise statistical test that accommodates the natural autocorrelation and magnitude of variability. For discrete variables, such as tornado outbreaks or icing events, the challenge is to reduce the false alarm rate while improving the rate of correctly identifying the discrete event. For both continuous and discrete verification results, proper statistical approaches can reduce the number of runs needed to identify a small improvement in forecasting skill. Verification within the Next Generation Global Prediction System is an important component of the many small decisions needed to make state-of-the-art improvements to weather forecasting capabilities. The comparison of multiple skill scores with often conflicting results requires not only appropriate testing, but also scientific judgment to assure that the choices are appropriate not only for improvements in today's forecasting capabilities, but also allow improvements that will come in the future.
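A minimal sketch of the pair-wise comparison described for continuous variables, using an effective sample size to account for lag-1 autocorrelation of the daily error differences (a common convention; the adjustment and the synthetic data are illustrative assumptions, not the presenter's exact method):

```python
import numpy as np
from scipy import stats

def paired_skill_test(err_new, err_ref):
    """Test whether a candidate forecast has lower absolute error than the
    reference, adjusting for temporal autocorrelation of the daily error
    differences via an effective sample size n_eff = n (1 - r1) / (1 + r1)."""
    d = np.abs(np.asarray(err_ref, float)) - np.abs(np.asarray(err_new, float))
    n = d.size
    r1 = np.corrcoef(d[:-1], d[1:])[0, 1]           # lag-1 autocorrelation
    n_eff = max(2.0, n * (1 - r1) / (1 + r1))
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n_eff))
    p = 2 * stats.t.sf(abs(t), n_eff - 1)
    return d.mean(), t, p                            # mean > 0 favors the candidate

# Hypothetical daily temperature errors (K) for two model versions over one year.
rng = np.random.default_rng(0)
ref = rng.normal(0, 1.0, 365)
new = ref * 0.95 + rng.normal(0, 0.1, 365)           # slightly improved forecast
print(paired_skill_test(new, ref))
```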
"Edge-on" MOSkin detector for stereotactic beam measurement and verification.
Jong, Wei Loong; Ung, Ngie Min; Vannyat, Ath; Jamalludin, Zulaikha; Rosenfeld, Anatoly; Wong, Jeannie Hsiu Ding
2017-01-01
Dosimetry in small radiation fields is challenging and complicated because of dose volume averaging and beam perturbations in a detector. We evaluated the suitability of the "Edge-on" MOSkin (MOSFET) detector for small radiation field measurement. We also tested its feasibility for dosimetric verification in stereotactic radiosurgery (SRS) and stereotactic radiotherapy (SRT). The "Edge-on" MOSkin detector was calibrated and its reproducibility and linearity were determined. Lateral dose profiles and output factors were measured using the "Edge-on" MOSkin detector, an ionization chamber, an SRS diode and EBT2 film. Dosimetric verification was carried out on two SRS and five SRT plans. In dose profile measurements, the "Edge-on" MOSkin measurements concurred with the EBT2 film measurements, showing a full width at half maximum of the dose profile with an average difference of 0.11 mm and a penumbral width with a difference of ±0.2 mm for all SRS cones as compared to the EBT2 film measurement. For output factor measurements, a 1.1% difference was observed between the "Edge-on" MOSkin detector and EBT2 film for the 4 mm SRS cone. The "Edge-on" MOSkin detector provided reproducible measurements for dose verification in real time. The measured doses concurred with the calculated doses for SRS (within 1%) and SRT (within 3%). A set of output correction factors for the "Edge-on" MOSkin detector for small radiation fields was derived from EBT2 film measurement and presented. This study showed that the "Edge-on" MOSkin detector is a suitable tool for dose verification in small radiation fields. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe Nellie; Sentz, Kari; Swanson, Meili Claire
Recent advances in information technology have led to an expansion of crowdsourcing activities that utilize the "power of the people" harnessed via online games, communities of interest, and other platforms to collect, analyze, verify, and provide technological solutions for challenges from a multitude of domains. In response to this surge in popularity, the research team developed a taxonomy of crowdsourcing activities as they relate to international nuclear safeguards, evaluated the potential legal and ethical issues surrounding the use of crowdsourcing to support safeguards, and proposed experimental designs to test the capabilities and prospects for the use of crowdsourcing to support nuclear safeguards verification.
NASA Technical Reports Server (NTRS)
Gupta, Pramod; Schumann, Johann
2004-01-01
High reliability of mission- and safety-critical software systems has been identified by NASA as a high-priority technology challenge. We present an approach for the performance analysis of a neural network (NN) in an advanced adaptive control system. This problem is important in the context of safety-critical applications that require certification, such as flight software in aircraft. We have developed a tool to measure the performance of the NN during operation by calculating a confidence interval (error bar) around the NN's output. Our tool can be used during pre-deployment verification as well as for monitoring network performance during operation. The tool has been implemented in Simulink and simulation results on an F-15 aircraft are presented.
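A minimal sketch of the error-bar idea, assuming an ensemble-variance estimator (the abstract does not state which estimator the tool uses); the model stand-ins and data below are hypothetical.

```python
import numpy as np

def nn_output_with_error_bar(models, x, z=1.96):
    """Run an ensemble of trained networks on the same input and report
    the mean prediction with an approximate 95% confidence band."""
    preds = np.array([m(x) for m in models])
    mean = preds.mean(axis=0)
    sem = preds.std(axis=0, ddof=1) / np.sqrt(len(models))
    return mean, mean - z * sem, mean + z * sem

# Hypothetical stand-ins for trained networks (weights differ slightly)
models = [lambda x, w=w: w * np.tanh(x) for w in (0.9, 1.0, 1.1, 1.05)]
mean, lo, hi = nn_output_with_error_bar(models, np.array([0.5, -0.2]))
print(mean, lo, hi)
```

A monitoring layer could flag operating points where the band becomes wide relative to the control tolerances, which is the spirit of the pre-deployment and in-operation checks described above.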
Scope and verification of a Fissile Material (Cutoff) Treaty
von Hippel, Frank N.
2014-01-01
A Fissile Material Cutoff Treaty (FMCT) would ban the production of fissile material – in practice highly-enriched uranium and separated plutonium – for weapons. It has been supported by strong majorities in the United Nations. After it comes into force, fissile material could be produced only under international – most likely International Atomic Energy Agency – monitoring. Many non-weapon states argue that the treaty should also place under safeguards pre-existing stocks of fissile material in civilian use or declared excess for weapons, so as to make nuclear-weapons reductions irreversible. Our paper discusses the scope of the FMCT, the ability to detect clandestine production, and verification challenges in the nuclear-weapons states.
Runtime Verification in Context: Can Optimizing Error Detection Improve Fault Diagnosis?
NASA Technical Reports Server (NTRS)
Dwyer, Matthew B.; Purandare, Rahul; Person, Suzette
2010-01-01
Runtime verification has primarily been developed and evaluated as a means of enriching the software testing process. While many researchers have pointed to its potential applicability in online approaches to software fault tolerance, there has been a dearth of work exploring the details of how that might be accomplished. In this paper, we describe how a component-oriented approach to software health management exposes the connections between program execution, error detection, fault diagnosis, and recovery. We identify both research challenges and opportunities in exploiting those connections. Specifically, we describe how recent approaches to reducing the overhead of runtime monitoring aimed at error detection might be adapted to reduce the overhead and improve the effectiveness of fault diagnosis.
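As a toy illustration of runtime monitoring in the error-detection role discussed above (not the authors' framework), the sketch below wraps a component and checks a simple safety property only when the observed state actually changes, which is one way monitoring overhead can be reduced; all names and values are hypothetical.

```python
class RangeMonitor:
    """Checks a safety property (value stays within bounds) only when
    the observed value changes, keeping the monitoring overhead low."""
    def __init__(self, low, high):
        self.low, self.high = low, high
        self.last = None
        self.violations = []

    def observe(self, step, value):
        if value == self.last:          # skip redundant checks
            return
        self.last = value
        if not (self.low <= value <= self.high):
            # An error was detected; a diagnosis layer could use the
            # step and value to help localize the fault.
            self.violations.append((step, value))

monitor = RangeMonitor(low=0.0, high=100.0)
for step, reading in enumerate([10.0, 10.0, 42.0, 130.0, 55.0]):
    monitor.observe(step, reading)
print(monitor.violations)   # [(3, 130.0)]
```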
NASA Astrophysics Data System (ADS)
Guo, Bing; Zhang, Yu; Documet, Jorge; Liu, Brent; Lee, Jasper; Shrestha, Rasu; Wang, Kevin; Huang, H. K.
2007-03-01
As clinical imaging and informatics systems continue to integrate the healthcare enterprise, the need to prevent patient mis-identification and unauthorized access to clinical data becomes more apparent, especially under the Health Insurance Portability and Accountability Act (HIPAA) mandate. Last year, we presented a system to track and verify patients and staff within a clinical environment. This year, we further address the biometric verification component in order to determine which biometric system is the optimal solution for given applications in the complex clinical environment. We installed two biometric identification systems, fingerprint and facial recognition, at an outpatient imaging facility, Healthcare Consultation Center II (HCCII). We evaluated each solution and documented the advantages and pitfalls of each biometric technology in this clinical environment.
NASA Technical Reports Server (NTRS)
Levine, S. R.
1982-01-01
A first-cut integrated environmental attack life prediction methodology for hot section components is addressed. The HOST program is concerned with oxidation and hot corrosion attack of metallic coatings as well as their degradation by interdiffusion with the substrate. The effects of the environment and coatings on creep/fatigue behavior are being addressed through a joint effort with the Fatigue sub-project. An initial effort will attempt to scope the problem of thermal barrier coating life prediction. Verification of models will be carried out through benchmark rig tests including a 4 atm. replaceable blade turbine and a 50 atm. pressurized burner rig.
Case Studies for Enhancing Student Engagement and Active Learning in Software V&V Education
ERIC Educational Resources Information Center
Manohar, Priyadarshan A.; Acharya, Sushil; Wu, Peter; Hansen, Mary; Ansari, Ali; Schilling, Walter
2015-01-01
Two critical problems facing the software (S/W) industry today are the lack of appreciation of the full benefits that can be derived from Software Verification and Validation (V&V) and an associated problem of shortage of adequately trained V&V practitioners. To address this situation, the software V&V course curriculum at the author's…
Markov Chains For Testing Redundant Software
NASA Technical Reports Server (NTRS)
White, Allan L.; Sjogren, Jon A.
1990-01-01
Preliminary design developed for validation experiment that addresses problems unique to assuring extremely high quality of multiple-version programs in process-control software. Approach takes into account inertia of controlled system in sense it takes more than one failure of control program to cause controlled system to fail. Verification procedure consists of two steps: experimentation (numerical simulation) and computation, with Markov model for each step.
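To make the two-step procedure concrete, here is a minimal sketch (our own construction, not the authors' design) that simulates a small Markov-style model in which the controlled system fails only after the control software has failed on two consecutive cycles, reflecting the system inertia mentioned above; the transition probability and cycle counts are hypothetical.

```python
import random

P_FAIL = 1e-3   # hypothetical per-cycle failure probability of the control program

def run_mission(cycles=10_000):
    """Return True if the controlled system fails during the mission."""
    consecutive_failures = 0
    for _ in range(cycles):
        if random.random() < P_FAIL:
            consecutive_failures += 1
            if consecutive_failures >= 2:   # inertia: two failures in a row needed
                return True                 # controlled system fails
        else:
            consecutive_failures = 0
    return False

trials = 2_000
estimate = sum(run_mission() for _ in range(trials)) / trials
print(f"estimated probability of system failure per mission: {estimate:.4f}")
```

The computation step of the procedure would replace this brute-force simulation with an analytic solution of the corresponding Markov chain, so the two can be cross-checked.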
DOT National Transportation Integrated Search
1984-01-01
The study reported here addresses some of the earlier phases in the development of a pavement management system for the state of Virginia. Among the issues discussed are the development of an adequate data base and the implementation of a condition r...
Notes on modeling and simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Redondo, Antonio
These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.
Verification and Validation of Flight-Critical Systems
NASA Technical Reports Server (NTRS)
Brat, Guillaume
2010-01-01
For the first time in many years, the NASA budget presented to Congress calls for a focused effort on the verification and validation (V&V) of complex systems. This is mostly motivated by the results of the VVFCS (V&V of Flight-Critical Systems) study, which should materialize as a concrete effort under the Aviation Safety program. This talk will present the results of the study, from requirements coming out of discussions with the FAA and the Joint Planning and Development Office (JPDO), to a technical plan addressing the issue, and its proposed current and future V&V research agenda, which will be addressed by NASA Ames, Langley, and Dryden as well as external partners through NASA Research Announcements (NRA) calls. This agenda calls for pushing V&V earlier in the life cycle and taking advantage of formal methods to increase safety and reduce the cost of V&V. I will present the ongoing research work (especially the four main technical areas: Safety Assurance, Distributed Systems, Authority and Autonomy, and Software-Intensive Systems), possible extensions, and how VVFCS plans on grounding the research in realistic examples, including an intended V&V test-bench based on an Integrated Modular Avionics (IMA) architecture and hosted by Dryden.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dodds, K.; Daley, T.; Freifeld, B.
2009-05-01
The Australian Cooperative Research Centre for Greenhouse Gas Technologies (CO2CRC) is currently injecting 100,000 tons of CO₂ in a large-scale test of storage technology in a pilot project in southeastern Australia called the CO2CRC Otway Project. The Otway Basin, with its natural CO₂ accumulations and many depleted gas fields, offers an appropriate site for such a pilot project. An 80% CO₂ stream is produced from a well (Buttress) near the depleted gas reservoir (Naylor) used for storage (Figure 1). The goal of this project is to demonstrate that CO₂ can be safely transported, stored underground, and its behavior tracked and monitored. The monitoring and verification framework has been developed to monitor for the presence and behavior of CO₂ in the subsurface reservoir, near surface, and atmosphere. This monitoring framework addresses areas, identified by a rigorous risk assessment, to verify conformance to clearly identifiable performance criteria. These criteria have been agreed upon with the regulatory authorities to manage the project through all phases, addressing responsibilities and liabilities, and to assure the public of safe storage.
NASA Technical Reports Server (NTRS)
McComas, David C.; Strege, Susanne L.; Carpenter, Paul B.; Hartman, Randy
2015-01-01
The core Flight System (cFS) is a flight software (FSW) product line developed by the Flight Software Systems Branch (FSSB) at NASA's Goddard Space Flight Center (GSFC). The cFS uses compile-time configuration parameters to implement variable requirements to enable portability across embedded computing platforms and to implement different end-user functional needs. The verification and validation of these requirements is proving to be a significant challenge. This paper describes the challenges facing the cFS and the results of a pilot effort to apply EXB Solution's testing approach to the cFS applications.
Feasibility of biochemical verification in a web-based smoking cessation study.
Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L
2017-10-01
Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this fact, studies involving digital interventions (low-demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months and were analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely to report 3-month use of nicotine replacement therapy or e-cigarettes than those with concordant responses (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and use of other nicotine products during the follow-up period. There was a high rate of discordance between self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies. Copyright © 2017 Elsevier Ltd. All rights reserved.
A verification strategy for web services composition using enhanced stacked automata model.
Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali
2015-01-01
Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different service providers, a composite service describing the complete business process of an enterprise can be built. Many new standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides an initial framework, an Extensible Markup Language (XML) specification language, for defining and implementing business process workflows for web services. A problem with most realistic approaches to service composition is the verification of the composed web services; formal verification methods are needed to ensure the correctness of composed services. A few research works on the verification of web services have been carried out for deterministic systems. Moreover, the existing models did not address verification properties such as dead transitions, deadlock, reachability and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system are evaluated based on properties such as dead transitions, deadlock, safety, liveness and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS), converted into ESAM (a combination of Muller Automata (MA) and Push Down Automata (PDA)), and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results revealed better performance in finding dead transitions and deadlocks compared with existing models.
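As a much-simplified illustration of the kinds of properties checked (not the authors' ESAM construction), the sketch below explores a labeled transition system and reports deadlocked states, unreachable states, and dead (never-fired) transitions; the toy workflow is hypothetical.

```python
from collections import deque

# Hypothetical composed-service workflow as a labeled transition system:
# state -> list of (action, next_state)
transitions = {
    "start":     [("receive_order", "ordered")],
    "ordered":   [("charge_card", "paid"), ("cancel", "cancelled")],
    "paid":      [("ship", "shipped")],
    "shipped":   [],                        # terminal state (deadlock vs. intended end)
    "cancelled": [],
    "orphan":    [("noop", "orphan")],      # deliberately unreachable
}

def analyze(lts, initial):
    """Breadth-first exploration returning deadlocks, unreachable states,
    and transitions that can never fire from the initial state."""
    reachable, fired = set(), set()
    queue = deque([initial])
    while queue:
        s = queue.popleft()
        if s in reachable:
            continue
        reachable.add(s)
        for action, t in lts[s]:
            fired.add((s, action, t))
            queue.append(t)
    all_transitions = {(s, a, t) for s, outs in lts.items() for a, t in outs}
    deadlocks = {s for s in reachable if not lts[s]}
    unreachable = set(lts) - reachable
    dead_transitions = all_transitions - fired
    return deadlocks, unreachable, dead_transitions

print(analyze(transitions, "start"))
```

A model checker such as SPIN performs a far more general version of this exploration over the full state space, including liveness checks that this sketch does not attempt.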
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parodi, K; Dauvergne, D; Kruse, J
In this inaugural joint ESTRO-AAPM session we will attempt to provide some answers to the problems encountered in the clinical application of particle therapy. Indeed, the main advantage is that the physical properties of ion beams offer high ballistic accuracy for tightly conformal irradiation of the tumour volume, with excellent sparing of surrounding healthy tissue and critical organs. This is also its Achilles' heel, calling for an increasing role of imaging to ensure safe application of the intended dose to the targeted area during the entire course of fractionated therapy. We have three distinguished speakers addressing possible solutions. Katia Parodi (Ludwig Maximilians University, Munich, Germany): To date, Positron Emission Tomography (PET) is the only technique which has already been clinically investigated for in-vivo visualization of the beam range during or shortly after ion beam delivery. The method exploits the transient amount of β⁺ activity induced in nuclear interactions between the primary beam and the irradiated tissue, depending on the ion beam species, the tissue elemental composition and physiological properties (in terms of biological clearance), as well as the time course of irradiation and imaging. This contribution will review initial results, ongoing methodological developments and remaining challenges related to the clinical usage of viable but often suboptimal instrumentation and workflows of PET-based treatment verification. Moreover, it will present and discuss promising new detector developments towards next-generation dedicated PET scanners relying on full-ring or dual-head designs for in-beam quasi real-time imaging. Denis Dauvergne (Institut de Physique Nucleaire de Lyon, Lyon, France): Prompt gamma radiation monitoring of hadron therapy presents the advantage of real-time capability to measure the ion range. Both simulations and experiments show that millimetric verification of the range can be achieved at the pencil beam scale for active proton beam delivery in homogeneous targets. The development of gamma cameras, which has been studied by several groups worldwide over the last years, now reaches - for some of them - the stage of being applicable in clinical conditions, with real-size prototypes and count rate capability matching therapeutic beam intensities. We will review the different concepts of gamma cameras, the advantages and limitations of this method, and the main challenges that should still be overcome before the widespread adoption of prompt gamma quality assurance for proton and hadron therapy. Jon Kruse (Mayo Clinic, Rochester, MN, USA): Treatment simulation images for proton therapy are used to determine proton stopping power and range in the patient. This talk will discuss the careful control of CT numbers and the conversion of CT number to stopping power required in proton therapy. Imaging for treatment guidance of proton therapy also presents unique challenges which will be addressed. Among them are the enhanced relationship between internal anatomy changes and dosimetry, the need for imaging to support adaptive planning protocols, and high operational efficiency. Learning Objectives: To learn about the possibilities of using activation products to determine the range of particle beams in a patient treatment setting; To be informed about an alternative methodology using prompt gamma detectors; To understand the impact of the accuracy of the knowledge of the patient information with respect to the delivered treatment.
Attention and implicit memory in the category-verification and lexical decision tasks.
Mulligan, Neil W; Peterson, Daniel
2008-05-01
Prior research on implicit memory appeared to support 3 generalizations: Conceptual tests are affected by divided attention, perceptual tasks are affected by certain divided-attention manipulations, and all types of priming are affected by selective attention. These generalizations are challenged in experiments using the implicit tests of category verification and lexical decision. First, both tasks were unaffected by divided-attention tasks known to impact other priming tasks. Second, both tasks were unaffected by a manipulation of selective attention in which colored words were either named or their colors identified. Thus, category verification, unlike other conceptual tasks, appears unaffected by divided attention and some selective-attention tasks, and lexical decision, unlike other perceptual tasks, appears unaffected by a difficult divided-attention task and some selective-attention tasks. Finally, both tasks were affected by a selective-attention task in which attention was manipulated across objects (rather than within objects), indicating some susceptibility to selective attention. The results contradict an analysis on the basis of the conceptual-perceptual distinction and other more specific hypotheses but are consistent with the distinction between production and identification priming.
NASA Astrophysics Data System (ADS)
Pogue, B. W.; Krishnaswamy, V.; Jermyn, M.; Bruza, P.; Miao, T.; Ware, William; Saunders, S. L.; Andreozzi, J. M.; Gladstone, D. J.; Jarvis, L. A.
2017-05-01
Cherenkov imaging has been shown to allow near real-time imaging of the beam entrance and exit on patient tissue, with the appropriate intensified camera and associated image processing. A dedicated system has been developed for research into full torso imaging of whole breast irradiation, where the dual camera system captures the beam shape for all beamlets used in this treatment protocol. Particularly challenging verification measurements exist in dynamic wedge, field-in-field, and boost delivery, and the system was designed to capture these as they are delivered. Two intensified CMOS (ICMOS) cameras were developed and mounted in a breast treatment room, and pilot studies for intensity and stability were completed. Software tools to contour the treatment area have been developed and are being tested prior to initiation of the full trial. At present, it is possible to record delivery of individual beamlets as small as a single MLC thickness, and readout at 20 frames per second is achieved. Statistical analysis of system repeatability and stability is presented, as well as pilot human studies.
Skill of Predicting Heavy Rainfall Over India: Improvement in Recent Years Using UKMO Global Model
NASA Astrophysics Data System (ADS)
Sharma, Kuldeep; Ashrit, Raghavendra; Bhatla, R.; Mitra, A. K.; Iyengar, G. R.; Rajagopal, E. N.
2017-11-01
The quantitative precipitation forecast (QPF) performance for heavy rains is still a challenge, even for the most advanced state-of-the-art high-resolution Numerical Weather Prediction (NWP) modeling systems. This study aims to evaluate the performance of the UK Met Office Unified Model (UKMO) over India for prediction of high rainfall amounts (>2 and >5 cm/day) during the monsoon period (JJAS) from 2007 to 2015 in short-range forecasts up to Day 3. Among the various modeling upgrades and improvements in the parameterizations during this period, the model horizontal resolution has improved from 40 km in 2007 to 17 km in 2015. The skill of short-range rainfall forecasts has improved in the UKMO model in recent years, mainly due to increased horizontal and vertical resolution along with improved physics schemes. Categorical verification carried out using four verification metrics, namely probability of detection (POD), false alarm ratio (FAR), frequency bias (Bias) and critical success index (CSI), indicates that QPF has improved by >29% in POD and >24% in FAR. Additionally, verification scores such as EDS (Extreme Dependency Score), EDI (Extremal Dependence Index) and SEDI (Symmetric EDI) are used with special emphasis on verification of extreme and rare rainfall events. These scores also show an improvement of 60% (EDS) and >34% (EDI and SEDI) during the period of study, suggesting an improved skill in predicting heavy rains.
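For reference, the categorical scores named above can all be computed from a 2x2 contingency table of forecast versus observed events. The sketch below (our own, with hypothetical counts) shows POD, FAR, frequency bias and CSI, plus the Symmetric Extremal Dependence Index (SEDI) in one common formulation.

```python
import math

def categorical_scores(hits, false_alarms, misses, correct_negatives):
    pod = hits / (hits + misses)                       # probability of detection
    far = false_alarms / (hits + false_alarms)         # false alarm ratio
    bias = (hits + false_alarms) / (hits + misses)     # frequency bias
    csi = hits / (hits + false_alarms + misses)        # critical success index
    # Hit rate and false alarm *rate* used by the extremal indices
    h = pod
    f = false_alarms / (false_alarms + correct_negatives)
    sedi = (math.log(f) - math.log(h) - math.log(1 - f) + math.log(1 - h)) / \
           (math.log(f) + math.log(h) + math.log(1 - f) + math.log(1 - h))
    return {"POD": pod, "FAR": far, "Bias": bias, "CSI": csi, "SEDI": sedi}

# Hypothetical counts for rainfall > 5 cm/day over one season
print(categorical_scores(hits=42, false_alarms=18, misses=25, correct_negatives=915))
```

Scores such as SEDI are preferred for rare events because, unlike CSI, they do not degenerate toward zero as the event base rate becomes very small.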
Assessment of test methods for evaluating effectiveness of cleaning flexible endoscopes.
Washburn, Rebecca E; Pietsch, Jennifer J
2018-06-01
Strict adherence to each step of reprocessing is imperative to removing potentially infectious agents. Multiple methods for verifying proper reprocessing exist; however, each presents challenges and limitations, and best practice within the industry has not been established. Our goal was to evaluate endoscope cleaning verification tests, with particular interest in the evaluation of the manual cleaning step. The results of the cleaning verification tests were compared with microbial culturing to see if a positive cleaning verification test would be predictive of microbial growth. This study was conducted at 2 high-volume endoscopy units within a multisite health care system. Each of the 90 endoscopes was tested for adenosine triphosphate, protein, microbial growth via agar plate, and rapid gram-negative culture via assay. The endoscopes were tested in 3 locations: the instrument channel, control knob, and elevator mechanism. This analysis showed a substantial level of agreement between protein detection post-manual cleaning and protein detection post-high-level disinfection at the control head for scopes sampled sequentially. This study suggests that if protein is detected post-manual cleaning, there is a significant likelihood that protein will also be detected post-high-level disinfection. It also suggests that a cleaning verification test is not predictive of microbial growth. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Wijaya, Surya Li; Savvides, Marios; Vijaya Kumar, B. V. K.
2005-02-01
Face recognition on mobile devices, such as personal digital assistants and cell phones, is a big challenge owing to the limited computational resources available to run verifications on the devices themselves. One approach is to transmit the captured face images by use of the cell-phone connection and to run the verification on a remote station. However, owing to limitations in communication bandwidth, it may be necessary to transmit a compressed version of the image. We propose using the image compression standard JPEG2000, which is a wavelet-based compression engine used to compress the face images to low bit rates suitable for transmission over low-bandwidth communication channels. At the receiver end, the face images are reconstructed with a JPEG2000 decoder and are fed into the verification engine. We explore how advanced correlation filters, such as the minimum average correlation energy filter [Appl. Opt. 26, 3633 (1987)] and its variants, perform by using face images captured under different illumination conditions and encoded with different bit rates under the JPEG2000 wavelet-encoding standard. We evaluate the performance of these filters by using illumination variations from the Carnegie Mellon University's Pose, Illumination, and Expression (PIE) face database. We also demonstrate the tolerance of these filters to noisy versions of images with illumination variations.
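A minimal sketch of correlation-based verification in the spirit of the filters discussed above (not the exact MACE formulation): the probe image is correlated with a stored filter in the frequency domain and a peak-sharpness score is thresholded. The images, filter construction and threshold below are hypothetical.

```python
import numpy as np

def correlation_score(probe, filt_freq):
    """Correlate a probe image with a stored frequency-domain filter and
    return a peak-to-sidelobe-like sharpness score."""
    corr = np.fft.ifft2(np.fft.fft2(probe) * np.conj(filt_freq)).real
    peak = corr.max()
    sidelobe = np.delete(corr.ravel(), corr.argmax())
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)

rng = np.random.default_rng(1)
enrolled = rng.standard_normal((64, 64))
filt = np.fft.fft2(enrolled)                 # trivially, a matched filter built from one image
genuine = enrolled + 0.3 * rng.standard_normal((64, 64))
impostor = rng.standard_normal((64, 64))
print(correlation_score(genuine, filt), correlation_score(impostor, filt))
# A decision rule would accept the claim when the score exceeds a chosen threshold.
```

In the setting described above, the probe would be a JPEG2000-decoded image, so the interesting question is how much the score separation between genuine and impostor claims degrades as the bit rate drops.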
The Purefecta™ was tested for removal of bacteria and viruses at NSF International's Drinking Water Treatment Systems Laboratory. Kinetico submitted 10 units for testing, which were split into two groups of five. One group received 25 days of conditioning prior to challeng...
Though aerosol radiative effects have been recognized as some of the largest sources of uncertainty among the forcers of climate change, the verification of the spatial and temporal variability of the magnitude and directionality of aerosol radiative forcing has remained challeng...
The Dow SFD-2880 UF module was tested for removal of microorganisms using live Cryptosporidium parvum oocysts, endospores of the bacteria Bacillus alrophaeus, and the MS2 coliphage virus according to the product-specific challenge testing requirements of the EPA Long-Term 2 Enhan...
Gender identity and sport: is the playing field level?
Reeser, J C
2005-10-01
This review examines gender identity issues in competitive sports, focusing on the evolution of policies relating to female gender verification and transsexual participation in sport. The issues are complex and continue to challenge sport governing bodies, including the International Olympic Committee, as they strive to provide a safe environment in which female athletes may compete fairly and equitably.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.
Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty-relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity. The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will never be able to replace traditional arms control verification measures, it does supply unique signatures that can augment existing analysis.
Knowledge-based system verification and validation
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1990-01-01
The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBS using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worcester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS for the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBSs cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBSs, as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBSs will be developed, and modification of the SSF to support these tools and methods will be addressed.
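As a toy illustration of the consistency and completeness checking mentioned above (not the Lockheed AI Center technique itself), the sketch below scans a small rule base for directly contradictory conclusions and for input conditions that no rule covers; the FDIR-style rules and condition names are hypothetical.

```python
from itertools import product

# Hypothetical FDIR-style rules: (set of required conditions, conclusion)
rules = [
    ({"thruster_hot", "pressure_low"}, "isolate_thruster"),
    ({"thruster_hot", "pressure_low"}, "continue_operation"),   # conflicts with the rule above
    ({"gyro_drift"}, "switch_to_backup_gyro"),
]
conditions = ["thruster_hot", "pressure_low", "gyro_drift"]

def check_rule_base(rules, conditions):
    # Consistency: same antecedent, different conclusions
    conflicts = [(r1, r2) for i, r1 in enumerate(rules) for r2 in rules[i + 1:]
                 if r1[0] == r2[0] and r1[1] != r2[1]]
    # Completeness: enumerate condition combinations on which no rule fires
    uncovered = []
    for bits in product([False, True], repeat=len(conditions)):
        active = {c for c, b in zip(conditions, bits) if b}
        if active and not any(req <= active for req, _ in rules):
            uncovered.append(active)
    return conflicts, uncovered

conflicts, uncovered = check_rule_base(rules, conditions)
print(len(conflicts), "conflicting rule pairs;", len(uncovered), "uncovered condition sets")
```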
Working Group on Virtual Data Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, D. N.; Palanisamy, G.; van Dam, K. K.
2016-02-04
This report is the outcome of a workshop commissioned by the U.S. Department of Energy's (DOE) Climate and Environmental Sciences Division (CESD) to examine current and future data infrastructure requirements foundational for achieving CESD scientific mission goals in advancing a robust, predictive understanding of Earth's climate and environmental systems. Over the past several years, data volumes in CESD disciplines have risen sharply to unprecedented levels (tens of petabytes). Moreover, the complexity and diversity of this research data, including simulations, observations, and reanalysis, have grown significantly, posing new challenges for data capture, storage, verification, analysis, and integration. With the trends of increased data volume (in the hundreds of petabytes), more complex analysis processes, and growing cross-disciplinary collaborations, it is timely to investigate whether the CESD community has the computational and data support needed to fully realize the scientific potential of its data collections. In recognition of the challenges, a partnership is forming across CESD and among national and international agencies to examine the viability of creating an integrated, collaborative data infrastructure: a Virtual Laboratory. The overarching goal of this report is to identify the community's key data technology requirements and high-priority development needs for sustaining and growing its scientific discovery potential. The report also aims to map these requirements to existing solutions and to identify gaps in current services, tools, and infrastructure that will need to be addressed in the short, medium, and long term to advance scientific progress.
VASIR: An Open-Source Research Platform for Advanced Iris Recognition Technologies.
Lee, Yooyoung; Micheals, Ross J; Filliben, James J; Phillips, P Jonathon
2013-01-01
The performance of iris recognition systems is frequently affected by input image quality, which in turn is vulnerable to less-than-optimal conditions due to illuminations, environments, and subject characteristics (e.g., distance, movement, face/body visibility, blinking, etc.). VASIR (Video-based Automatic System for Iris Recognition) is a state-of-the-art NIST-developed iris recognition software platform designed to systematically address these vulnerabilities. We developed VASIR as a research tool that will not only provide a reference (to assess the relative performance of alternative algorithms) for the biometrics community, but will also advance (via this new emerging iris recognition paradigm) NIST's measurement mission. VASIR is designed to accommodate both ideal (e.g., classical still images) and less-than-ideal images (e.g., face-visible videos). VASIR has three primary modules: 1) Image Acquisition 2) Video Processing, and 3) Iris Recognition. Each module consists of several sub-components that have been optimized by use of rigorous orthogonal experiment design and analysis techniques. We evaluated VASIR performance using the MBGC (Multiple Biometric Grand Challenge) NIR (Near-Infrared) face-visible video dataset and the ICE (Iris Challenge Evaluation) 2005 still-based dataset. The results showed that even though VASIR was primarily developed and optimized for the less-constrained video case, it still achieved high verification rates for the traditional still-image case. For this reason, VASIR may be used as an effective baseline for the biometrics community to evaluate their algorithm performance, and thus serves as a valuable research platform.
The clinical impact of recent advances in LC-MS for cancer biomarker discovery and verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Hui; Shi, Tujin; Qian, Wei-Jun
2015-12-04
Mass spectrometry-based proteomics has become an indispensable tool in biomedical research, with broad applications ranging from fundamental biology and systems biology to biomarker discovery. Recent advances in LC-MS have made it a major technology in clinical applications, especially in cancer biomarker discovery and verification. To overcome the challenges associated with the analysis of clinical samples, such as the extremely wide dynamic range of protein concentrations in biofluids and the need to perform high-throughput and accurate quantification, significant efforts have been devoted to improving the overall performance of LC-MS-based clinical proteomics. In this review, we summarize the recent advances in LC-MS for cancer biomarker discovery and quantification, and discuss its potentials, limitations, and future perspectives.
Application of Lightweight Formal Methods to Software Security
NASA Technical Reports Server (NTRS)
Gilliam, David P.; Powell, John D.; Bishop, Matt
2005-01-01
Formal specification and verification of security has proven a challenging task. There is no single method that has proven feasible. Instead, an integrated approach which combines several formal techniques can increase the confidence in the verification of software security properties. Such an approach, which specifies security properties in a library that can be reused by two instruments and their methodologies developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL), is described herein. The Flexible Modeling Framework (FMF) is a model-based verification instrument that uses Promela and the SPIN model checker. The Property Based Tester (PBT) uses TASPEC and a Text Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.
The clinical impact of recent advances in LC-MS for cancer biomarker discovery and verification.
Wang, Hui; Shi, Tujin; Qian, Wei-Jun; Liu, Tao; Kagan, Jacob; Srivastava, Sudhir; Smith, Richard D; Rodland, Karin D; Camp, David G
2016-01-01
Mass spectrometry (MS) -based proteomics has become an indispensable tool with broad applications in systems biology and biomedical research. With recent advances in liquid chromatography (LC) and MS instrumentation, LC-MS is making increasingly significant contributions to clinical applications, especially in the area of cancer biomarker discovery and verification. To overcome challenges associated with analyses of clinical samples (for example, a wide dynamic range of protein concentrations in bodily fluids and the need to perform high throughput and accurate quantification of candidate biomarker proteins), significant efforts have been devoted to improve the overall performance of LC-MS-based clinical proteomics platforms. Reviewed here are the recent advances in LC-MS and its applications in cancer biomarker discovery and quantification, along with the potentials, limitations and future perspectives.
Laurino, Mercy Y; Truitt, Anjali R; Tenney, Lederle; Fisher, Douglass; Lindor, Noralane M; Veenstra, David; Jarvik, Gail P; Newcomb, Polly A; Fullerton, Stephanie M
2017-11-01
The extent to which participants act to clinically verify research results is largely unknown. This study examined whether participants who received Lynch syndrome (LS)-related findings pursued researchers' recommendation to clinically verify results with testing performed by a CLIA-certified laboratory. The Fred Hutchinson Cancer Research Center site of the multinational Colon Cancer Family Registry offered non-CLIA individual genetic research results to select registry participants (cases and their enrolled relatives) from 2011 to 2013. Participants who elected to receive results were counseled on the importance of verifying results at a CLIA-certified laboratory. Twenty-six (76.5%) of the 34 participants who received genetic results completed 2- and 12-month postdisclosure surveys; 42.3% of these (11/26) participated in a semistructured follow-up interview. Within 12 months of result disclosure, only 4 (15.4%) of 26 participants reported having verified their results in a CLIA-certified laboratory; of these four cases, all research and clinical results were concordant. Reasons for pursuing clinical verification included acting on the recommendation of the research team and informing future clinical care. Those who did not verify results cited lack of insurance coverage and limited perceived personal benefit of clinical verification as reasons for inaction. These findings suggest researchers will need to address barriers to seeking clinical verification in order to ensure that the intended benefits of returning genetic research results are realized. © 2017 The Authors. Molecular Genetics & Genomic Medicine published by Wiley Periodicals, Inc.
Risk Mitigation Testing with the BepiColombo MPO SADA
NASA Astrophysics Data System (ADS)
Zemann, J.; Heinrich, B.; Skulicz, A.; Madsen, M.; Weisenstein, W.; Modugno, F.; Althaus, F.; Panhofer, T.; Osterseher, G.
2013-09-01
A Solar Array (SA) Drive Assembly (SADA) for the BepiColombo mission is being developed and qualified at RUAG Space Zürich (RSSZ). The system consists of the Solar Array Drive Mechanism (SADM) and the Solar Array Drive Electronics (SADE), which is subcontracted to RUAG Space Austria (RSA). This paper deals with the risk mitigation activities and the lessons learnt from this development. Specifically, the following topics, substantiated by breadboard (BB) test results, will be addressed in detail: Slipring Bread Board Test: verification of lifetime and electrical performance of carbon brush technology; Potentiometer BB Tests: focus on lifetime verification (>650,000 revolutions) and the accuracy requirement; SADM EM BB Test: subcomponent (front-bearing and gearbox) characterization, with a complete test campaign equivalent to the QM test; EM SADM/SADE Combined Test: verification of combined performance (accuracy, torque margin) and micro-vibration testing of the SADA system; SADE Bread Board Test: parameter optimization, with a test campaign equivalent to the QM test. The main improvements identified in the frame of BB testing and already implemented in the SADM EM/QM and SADE EQM are: • improved preload device for the gearbox • improved motor ball-bearing assembly • position sensor improvements • calibration process for the potentiometer • SADE motor controller optimization to achieve the required running smoothness • overall improvement of test equipment.
Lightweight Towed Howitzer Demonstrator. Phase 1 and Partial Phase 2. Volume A. Overview.
1987-04-01
Reliability: Floyd Manson. Test Plans: Errol Quick. Systems Engineering Coordination: Bob Schmidt. FMC Structural Verification: o Beam stress calculations on the supporting trails allow 70 kpsi in a quasi-isotropic lay-up of graphite epoxy. o [...] addressed utilizing damage-tolerant design criteria. o Strength calculations are questionable because of the dry room-temperature values used.
RIACS Workshop on the Verification and Validation of Autonomous and Adaptive Systems
NASA Technical Reports Server (NTRS)
Pecheur, Charles; Visser, Willem; Simmons, Reid
2001-01-01
The long-term future of space exploration at NASA is dependent on the full exploitation of autonomous and adaptive systems: careful monitoring of missions from earth, as is the norm now, will be infeasible due to the sheer number of proposed missions and the communication lag for deep-space missions. Mission managers are however worried about the reliability of these more intelligent systems. The main focus of the workshop was to address these worries and hence we invited NASA engineers working on autonomous and adaptive systems and researchers interested in the verification and validation (V&V) of software systems. The dual purpose of the meeting was to: (1) make NASA engineers aware of the V&V techniques they could be using; and (2) make the V&V community aware of the complexity of the systems NASA is developing.
Quality dependent fusion of intramodal and multimodal biometric experts
NASA Astrophysics Data System (ADS)
Kittler, J.; Poh, N.; Fatukasi, O.; Messer, K.; Kryszczuk, K.; Richiardi, J.; Drygajlo, A.
2007-04-01
We address the problem of score-level fusion of intramodal and multimodal experts in the context of biometric identity verification. We investigate the merits of confidence-based weighting of component experts. In contrast to the conventional approach, where confidence values are derived from scores, we instead use raw measures of biometric data quality to control the influence of each expert on the final fused score. We show that quality-based fusion gives better performance than quality-free fusion. The use of quality-weighted scores as features in the definition of the fusion functions leads to further improvements. We demonstrate that the achievable performance gain is also affected by the choice of fusion architecture. The evaluation of the proposed methodology involves six face experts and one speech verification expert. It is carried out on the XM2VTS database.
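A minimal sketch of quality-weighted score fusion in the spirit described above (not the authors' trained fusion functions): each expert's match score is weighted by a raw quality measure of its input sample before summation. The scores, quality values and decision threshold are hypothetical.

```python
import numpy as np

def quality_weighted_fusion(scores, qualities):
    """Fuse per-expert match scores using sample-quality weights
    (higher quality -> more influence on the fused score)."""
    scores = np.asarray(scores, dtype=float)
    w = np.asarray(qualities, dtype=float)
    w = w / w.sum()
    return float(np.dot(w, scores))

# Hypothetical claim scored by six face experts and one speech expert
scores    = [0.81, 0.75, 0.68, 0.90, 0.72, 0.66, 0.40]   # match scores in [0, 1]
qualities = [0.90, 0.80, 0.40, 0.95, 0.70, 0.50, 0.20]   # e.g. focus, illumination, SNR
fused = quality_weighted_fusion(scores, qualities)
print("accept" if fused > 0.6 else "reject", fused)       # threshold is hypothetical
```

A trained fusion architecture would instead learn the mapping from (score, quality) pairs to an accept/reject decision, which is where the reported performance gains come from.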
Formal semantics for a subset of VHDL and its use in analysis of the FTPP scoreboard circuit
NASA Technical Reports Server (NTRS)
Bickford, Mark
1994-01-01
In the first part of the report, we give a detailed description of an operational semantics for a large subset of VHDL, the VHSIC Hardware Description Language. The semantics is written in the functional language Caliban, similar to Haskell, used by the theorem prover Clio. We also describe a translator from VHDL into Caliban semantics and give some examples of its use. In the second part of the report, we describe our experience in using the VHDL semantics to try to verify a large VHDL design. We were not able to complete the verification due to certain complexities of VHDL which we discuss. We propose a VHDL verification method that addresses the problems we encountered but which builds on the operational semantics described in the first part of the report.
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.
2016-01-01
Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to represent accurately land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated into the KMD-WRF runs, using the product generated by NOAA/NESDIS. Model verification capabilities are also being transitioned to KMD using NCAR's Model Evaluation Tools (MET; Brown et al. 2009) software in conjunction with a SPoRT-developed scripting package, in order to quantify and compare errors in simulated temperature, moisture and precipitation in the experimental WRF model simulations. This extended abstract and accompanying presentation summarize the efforts and training done to date to support this unique regional modeling initiative at KMD. To honor the memory of Dr. Peter J. Lamb and his extensive efforts in bolstering weather and climate science and capacity-building in Africa, we offer this contribution to the special Peter J. Lamb symposium. The remainder of this extended abstract is organized as follows. The collaborating international organizations involved in the project are presented in Section 2. Background information on the unique land surface input datasets is presented in Section 3. The hands-on training sessions from March 2014 and June 2015 are described in Section 4. Sample experimental WRF output and verification from the June 2015 training are given in Section 5. A summary is given in Section 6, followed by Acknowledgements and References.
The Watts Premier Ultra 5 system was tested for removal of bacteria and viruses at NSF International's Laboratory. Watts Premier submitted ten units, which were split into two groups of five. One group received 25 days of conditioning prior to challenge testing, while the secon...
An overview of the V&V of Flight-Critical Systems effort at NASA
NASA Technical Reports Server (NTRS)
Brat, Guillaume P.
2011-01-01
As the US gets ready for the Next Generation Air Transportation System (NextGen), there is a growing concern that the current techniques for verification and validation will not be adequate for the changes to come. The JPDO (in charge of implementing NextGen) has given NASA a mandate to address the problem, which resulted in the formulation of the V&V of Flight-Critical Systems effort. This research effort is divided into four themes: argument-based safety assurance, distributed systems, authority and autonomy, and software-intensive systems. This paper presents an overview of the technologies that will address the problem.
Visualizing Safeguards: Software for Conceptualizing and Communicating Safeguards Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallucci, N.
2015-07-12
The nuclear programs of states are complex and varied, comprising a wide range of fuel cycles and facilities. Also varied are the types and terms of states' safeguards agreements with the IAEA, each placing different limits on the inspectorate's access to these facilities. Such nuances make it difficult to draw policy significance from the ground-level nuclear activities of states, or to attribute ground-level outcomes to the implementation of specific policies or initiatives. While acquiring a firm understanding of these relationships is critical to evaluating and formulating effective policy, doing so requires collecting and synthesizing large bodies of information. Maintaining a comprehensive working knowledge of the facilities comprising even a single state's nuclear program poses a challenge, yet marrying this information with relevant safeguards and verification information is more challenging still. To facilitate this task, Brookhaven National Laboratory has developed a means of capturing the development, operation, and safeguards history of all the facilities comprising a state's nuclear program in a single graphic. The resulting visualization offers a useful reference tool to policymakers and analysts alike, providing a chronology of states' nuclear development and an easily digestible history of verification activities across their fuel cycles.
NASA Astrophysics Data System (ADS)
Dartevelle, S.
2006-12-01
Large-scale volcanic eruptions are inherently hazardous events and hence cannot be described by detailed and accurate in situ measurements; as a result, explosive volcanic phenomenology is inadequately constrained in terms of initial and inflow conditions. Consequently, little to no real-time data exist to verify and validate computer codes developed to model these geophysical events as a whole. However, code verification and validation remains a necessary step, particularly when volcanologists use numerical data for mitigation of volcanic hazards, as is more often done nowadays. The Verification and Validation (V&V) process formally assesses the level of 'credibility' of numerical results produced within a range of specific applications. The first step, Verification, is 'the process of determining that a model implementation accurately represents the conceptual description of the model', which requires either exact analytical solutions or highly accurate simplified experimental data. The second step, Validation, is 'the process of determining the degree to which a model is an accurate representation of the real world', which requires complex experimental data of the 'real world' physics. The Verification step is rather simple to formally achieve, while, in the 'real world' explosive volcanism context, the second step, Validation, is nearly impossible. Hence, instead of validating computer codes against the whole large-scale unconstrained volcanic phenomenology, we rather suggest focusing on the key physics which control these volcanic clouds, viz., momentum-driven supersonic jets and multiphase turbulence. We propose to compare numerical results against a set of simple but well-constrained analog experiments, which uniquely and unambiguously represent these two key phenomena separately. Herewith, we use GMFIX (Geophysical Multiphase Flow with Interphase eXchange, v1.62), a set of multiphase-CFD FORTRAN codes, which have been recently redeveloped to meet the strict Quality Assurance, verification, and validation requirements of the Office of Civilian Radioactive Waste Management of the US Dept of Energy. GMFIX solves the Navier-Stokes and energy partial differential equations for each phase with appropriate turbulence and interfacial coupling between phases. For momentum-driven single- to multi-phase underexpanded jets, the position of the first Mach disk is known empirically as a function of both the pressure ratio, K, and the particle mass fraction, Phi, at the nozzle. Namely, the higher K, the further downstream the Mach disk, and the higher Phi, the further upstream the first Mach disk. We show that GMFIX captures these two essential features. In addition, GMFIX displays all the properties found in these jets, such as expansion fans, incident and reflected shocks, and subsequent downstream Mach disks, which make this code ideal for further investigations of equivalent volcanological phenomena. One of the other most challenging aspects of volcanic phenomenology is the multiphase nature of turbulence. We also validated GMFIX by comparing the velocity profiles and turbulence quantities against well-constrained analog experiments. The velocity profiles agree with the analog ones, as do those of the production of turbulent quantities. Overall, the verification and validation experiments, although inherently challenging, suggest that GMFIX captures the most essential dynamical properties of multiphase and supersonic flows and jets.
Mapping a Path to Autonomous Flight in the National Airspace
NASA Technical Reports Server (NTRS)
Lodding, Kenneth N.
2011-01-01
The introduction of autonomous flight, whether military, commercial, or civilian, into the National Airspace System (NAS) will present significant challenges. Minimizing the impact and preventing the changes from becoming disruptive, rather than an enhancing technology will not be without difficulty. From obstacle detection and avoidance to real-time verification and validation of system behavior, there are significant problems which must be solved prior to the general acceptance of autonomous systems. This paper examines some of the key challenges and the multi-disciplinary collaboration which must occur for autonomous systems to be accepted as equal partners in the NAS.
Papadakis, G; Friedt, J M; Eck, M; Rabus, D; Jobst, G; Gizeli, E
2017-09-01
The development of integrated platforms incorporating an acoustic device as the detection element requires addressing simultaneously several challenges of technological and scientific nature. The present work was focused on the design of a microfluidic module, which, combined with a dual or array type Love wave acoustic chip could be applied to biomedical applications and molecular diagnostics. Based on a systematic study we optimized the mechanics of the flow cell attachment and the sealing material so that fluidic interfacing/encapsulation would impose minimal losses to the acoustic wave. We have also investigated combinations of operating frequencies with waveguide materials and thicknesses for maximum sensitivity during the detection of protein and DNA biomarkers. Within our investigations neutravidin was used as a model protein biomarker and unpurified PCR amplified Salmonella DNA as the model genetic target. Our results clearly indicate the need for experimental verification of the optimum engineering and analytical parameters, in order to develop commercially viable systems for integrated analysis. The good reproducibility of the signal together with the ability of the array biochip to detect multiple samples hold promise for the future use of the integrated system in a Lab-on-a-Chip platform for application to molecular diagnostics.
Provenance information as a tool for addressing engineered nanoparticle reproducibility challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baer, Donald R.; Munusamy, Prabhakaran; Thrall, Brian D.
Nanoparticles of various types are of increasing research and technological importance in biological and other applications. Difficulties in the production and delivery of nanoparticles with consistent and well defined properties appear in many forms and have a variety of causes. Among several issues are those associated with incomplete information about the history of particles involved in research studies, including the synthesis method, the sample history after synthesis (the time and nature of storage), and the detailed nature of any sample processing or modification. In addition, the tendency of particles to change with time or environmental condition suggests that the time between analysis and application is important and that some type of consistency check or verification process can be important. The essential history of a set of particles can be captured as provenance information, which tells the origin or source of a batch of nano-objects along with information related to handling and any changes that may have taken place since it was originated. A record of sample provenance information for a set of particles can play a useful role in identifying some of the sources of particle variability, decreasing its extent, and addressing the lack of reproducibility reported by many researchers.
Expediting Combinatorial Data Set Analysis by Combining Human and Algorithmic Analysis.
Stein, Helge Sören; Jiao, Sally; Ludwig, Alfred
2017-01-09
A challenge in combinatorial materials science remains the efficient analysis of X-ray diffraction (XRD) data and its correlation to functional properties. Rapid identification of phase-regions and proper assignment of corresponding crystal structures is necessary to keep pace with the improved methods for synthesizing and characterizing materials libraries. Therefore, a new modular software called htAx (high-throughput analysis of X-ray and functional properties data) is presented that couples human intelligence tasks used for "ground-truth" phase-region identification with subsequent unbiased verification by an algorithm to efficiently analyze which phases are present in a materials library. Identified phases and phase-regions may then be correlated to functional properties in an expedited manner. For the functionality of htAx to be proven, two previously published XRD benchmark data sets of the materials systems Al-Cr-Fe-O and Ni-Ti-Cu are analyzed by htAx. The analysis of ∼1000 XRD patterns takes less than 1 day with htAx. The proposed method reliably identifies phase-region boundaries and robustly identifies multiphase structures. The method also addresses the problem of identifying regions with previously unpublished crystal structures using a special daisy ternary plot.
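The abstract does not detail how the algorithmic verification step works; one simple way to check a human-labelled phase region is to test whether its diffraction patterns are mutually similar. A minimal numpy sketch under that assumption (the function, threshold and similarity measure are illustrative, not taken from htAx):

```python
import numpy as np

def flag_inconsistent_regions(patterns, labels, threshold=0.95):
    """patterns: (n_samples, n_2theta) array of XRD intensities;
    labels: phase-region id assigned to each sample by the human task.
    Returns region ids whose mean pairwise cosine similarity is low."""
    X = patterns / np.linalg.norm(patterns, axis=1, keepdims=True)
    suspect = []
    for region in np.unique(labels):
        P = X[labels == region]
        if len(P) < 2:
            continue                       # nothing to cross-check
        sim = P @ P.T                      # pairwise cosine similarities
        mean_off_diag = (sim.sum() - len(P)) / (len(P) ** 2 - len(P))
        if mean_off_diag < threshold:
            suspect.append(region)
    return suspect
```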
Assessment of sensor performance
NASA Astrophysics Data System (ADS)
Waldmann, C.; Tamburri, M.; Prien, R. D.; Fietzek, P.
2010-02-01
There is an international commitment to develop a comprehensive, coordinated and sustained ocean observation system. However, a foundation for any observing, monitoring or research effort is effective and reliable in situ sensor technologies that accurately measure key environmental parameters. Ultimately, the data used for modelling efforts, management decisions and rapid responses to ocean hazards are only as good as the instruments that collect them. There is also a compelling need to develop and incorporate new or novel technologies to improve all aspects of existing observing systems and meet various emerging challenges. Assessment of Sensor Performance was a cross-cutting issues session at the international OceanSensors08 workshop in Warnemünde, Germany, and its themes also permeate some of the papers published as a result of the workshop (Denuault, 2009; Kröger et al., 2009; Zielinski et al., 2009). The discussions focused on how best to classify and validate the instruments required for effective and reliable ocean observations and research. The following is a summary of the discussions and conclusions drawn from this workshop, which specifically addresses the characterisation of sensor systems, technology readiness levels, verification of sensor performance and quality management of sensor systems.
Forest Carbon Monitoring and Reporting for REDD+: What Future for Africa?
Gizachew, Belachew; Duguma, Lalisa A
2016-11-01
A climate change mitigation mechanism for emissions reduction from reduced deforestation and forest degradation, plus forest conservation, sustainable management of forests, and enhancement of carbon stocks (REDD+), has received international political support in the climate change negotiations. The mechanism will require, among other things, an unprecedented technical capacity for monitoring, reporting and verification of carbon emissions from the forest sector. A functional monitoring, reporting and verification system requires inventories of forest area, carbon stocks and their changes, both for constructing the forest reference emissions level and for compiling reports on actual emissions; such inventories are essentially lacking in developing countries, particularly in Africa. The purpose of this essay is to contribute to a better understanding of the state and prospects of forest monitoring and reporting in the context of REDD+ in Africa. We argue that monitoring and reporting capacities in Africa fall short of the stringent requirements of the methodological guidance for monitoring, reporting and verification for REDD+, and this may weaken the prospects for successfully implementing REDD+ on the continent. We present the challenges and prospects for national forest inventory, remote sensing and reporting infrastructures. North-South and South-South collaboration, as well as governments' own investments in monitoring, reporting and verification systems, could help Africa leapfrog in monitoring and reporting. These could be delivered through negotiations for the transfer of technology, technical capacities, and experience that exist among developed countries that traditionally compile forest carbon reports in the context of the Kyoto protocol.
Verification of Space Station Secondary Power System Stability Using Design of Experiment
NASA Technical Reports Server (NTRS)
Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce
1998-01-01
This paper describes analytical methods used in verification of large DC power systems with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative incremental resistance characteristics. The ISS power system presents numerous challenges with respect to system stability, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large, complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been extensively used to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs that are necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system, as well as information about various operating scenarios and identification of the ones with potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples applying DoE to the analysis and verification of the ISS power system are provided.
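The abstract does not give the factor list or the stability metric; the sketch below illustrates the DoE idea with a two-level full factorial over a few assumed bus parameters and a Middlebrook-style source/load impedance-ratio screen. All values and the simple R-L-C source model are illustrative, not ISS specifications.

```python
import itertools
import numpy as np

V_BUS = 120.0                 # assumed nominal bus voltage, volts
R_SERIES = 0.05               # assumed harness resistance, ohms
omega = 2 * np.pi * np.logspace(1, 5, 400)   # frequency sweep, rad/s

factors = {                   # two-level screening design (illustrative)
    "harness_inductance_H": (1e-6, 5e-6),
    "bus_capacitance_F":    (1e-4, 1e-3),
    "load_power_W":         (500.0, 2000.0),
}

def worst_impedance_ratio(L, C, P):
    """Peak |Z_source| / |Z_load| over frequency. The source is an R-L harness
    with a shunt bus capacitor; a regulated converter drawing power P looks
    like a constant-power load of impedance magnitude V^2 / P."""
    z_branch = R_SERIES + 1j * omega * L
    z_source = z_branch / (1.0 + 1j * omega * C * z_branch)
    z_load_mag = V_BUS ** 2 / P
    return float(np.max(np.abs(z_source)) / z_load_mag)

for combo in itertools.product(*factors.values()):
    ratio = worst_impedance_ratio(*combo)
    print(dict(zip(factors, combo)), "-> worst ratio", round(ratio, 3))
```

Small ratios across all factor combinations indicate ample stability margin; combinations that push the ratio toward unity are the ones DoE flags for detailed analysis or lab testing.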
DeepMitosis: Mitosis detection via deep detection, verification and segmentation networks.
Li, Chao; Wang, Xinggang; Liu, Wenyu; Latecki, Longin Jan
2018-04-01
Mitotic count is a critical predictor of tumor aggressiveness in breast cancer diagnosis. Nowadays mitosis counting is mainly performed by pathologists manually, which is extremely arduous and time-consuming. In this paper, we propose an accurate method for detecting mitotic cells from histopathological slides using a novel multi-stage deep learning framework. Our method consists of a deep segmentation network for generating mitosis regions when only a weak label is given (i.e., only the centroid pixel of mitosis is annotated), an elaborately designed deep detection network for localizing mitosis by using contextual region information, and a deep verification network for improving detection accuracy by removing false positives. We validate the proposed deep learning method on two widely used Mitosis Detection in Breast Cancer Histological Images (MITOSIS) datasets. Experimental results show that we can achieve the highest F-score on the MITOSIS dataset from the ICPR 2012 grand challenge merely using the deep detection network. For the ICPR 2014 MITOSIS dataset that only provides the centroid location of mitosis, we employ the segmentation model to estimate the bounding box annotation for training the deep detection network. We also apply the verification model to eliminate some false positives produced by the detection model. By fusing scores of the detection and verification models, we achieve the state-of-the-art results. Moreover, our method is very fast with GPU computing, which makes it feasible for clinical practice. Copyright © 2018 Elsevier B.V. All rights reserved.
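The abstract states that detection and verification scores are fused but does not give the rule; a minimal sketch assuming a simple weighted average (the weight and threshold are hypothetical tuning parameters, not values from the paper):

```python
def fused_score(det_score, ver_score, alpha=0.5):
    """Late fusion of the detection-network and verification-network scores
    for one candidate mitosis; alpha would be tuned on validation data."""
    return alpha * det_score + (1.0 - alpha) * ver_score

def keep_candidates(candidates, alpha=0.5, threshold=0.5):
    """candidates: iterable of (bbox, det_score, ver_score) tuples; returns
    the candidates whose fused score clears the decision threshold."""
    return [(bbox, fused_score(d, v, alpha))
            for bbox, d, v in candidates
            if fused_score(d, v, alpha) >= threshold]
```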
Wiesemann, Claudia
2011-04-01
The paper discusses the current medical practice of 'gender verification' in sports from an ethical point of view. It takes the recent public discussion about 800 m runner Caster Semenya as a starting point. At the World Championships in Athletics 2009 in Berlin, Germany, Semenya was challenged by competitors as being a so-called 'sex impostor'. A medical examination to verify her sex ensued. The author analyses whether athletes like Semenya could claim a right not to know that is generally acknowledged in human genetics and enforced by international and national genetic privacy laws. The relevance of this right for genetic diagnosis in sports is discussed. To this end, the interests of the athlete concerned and of third parties are balanced according to the expected benefits and harms. Harm is documented in a number of cases and includes unjustified disqualification, severe sex and gender identity crisis, demeaning reactions, social isolation, depression and suicide. Benefits are dubious as most cases of intersex are considered irrelevant for sports competition. It has to be concluded that the benefits to be gained from 'gender verification' in sports via genetic testing do not outweigh the grave individual disadvantages. The current practice of athletic associations to largely ignore the right of competitors not to know does not comply with prevailing ethical provisions on the protection of sensitive personal data. Therefore, genetic 'gender verification' in sports should be abolished.
Combustion Fundamentals Research
NASA Technical Reports Server (NTRS)
1983-01-01
Increased emphasis is placed on fundamental and generic research at Lewis Research Center, with fewer systems development efforts. This is especially true in combustion research, where the study of combustion fundamentals has grown significantly in order to better address the perceived long-term technical needs of the aerospace industry. The main thrusts for this combustion fundamentals program area are as follows: analytical models of combustion processes, model verification experiments, fundamental combustion experiments, and advanced numerical techniques.
Biological Weapons and Modern Warfare
1991-04-01
...every preparation for reducing its effectiveness and thereby reduce the likelihood of its use. In order to plan such preparation, it is advantageous to... attack rates could be maximized and the forces using the weapon protected from its effects. In today's climate, BW agents are also attractive weapons... questions about the agreement's true effectiveness. Verification of compliance was not addressed. D. World War II: Events during and following World War...
CIRM Alpha Stem Cell Clinics: Collaboratively Addressing Regenerative Medicine Challenges.
Jamieson, Catriona H M; Millan, Maria T; Creasey, Abla A; Lomax, Geoff; Donohoe, Mary E; Walters, Mark C; Abedi, Mehrdad; Bota, Daniela A; Zaia, John A; Adams, John S
2018-06-01
The California Institute for Regenerative Medicine (CIRM) Alpha Stem Cell Clinic (ASCC) Network was launched in 2015 to address a compelling unmet medical need for rigorous, FDA-regulated, stem cell-related clinical trials for patients with challenging, incurable diseases. Here, we describe our multi-center experiences addressing current and future challenges. Copyright © 2018 Elsevier Inc. All rights reserved.
Firefly: an optical lithographic system for the fabrication of holographic security labels
NASA Astrophysics Data System (ADS)
Calderón, Jorge; Rincón, Oscar; Amézquita, Ricardo; Pulido, Iván.; Amézquita, Sebastián.; Bernal, Andrés.; Romero, Luis; Agudelo, Viviana
2016-03-01
This paper introduces Firefly, an optical lithography origination system that has been developed to produce holographic masters of high quality. This mask-less lithography system has a resolution of 418 nm half-pitch, and generates holographic masters with the optical characteristics required for security applications of level 1 (visual verification), level 2 (pocket reader verification) and level 3 (forensic verification). The holographic master constitutes the main core of the manufacturing process of security holographic labels used for the authentication of products and documents worldwide. Additionally, the Firefly is equipped with a software tool that allows for hologram design from graphic formats stored in bitmaps. The software is capable of generating and configuring basic optical effects such as animation and color, as well as effects of high complexity such as Fresnel lenses, engravings and encrypted images, among others. The Firefly technology gathers together optical lithography, digital image processing and the most advanced control systems, making possible competitive equipment that challenges the best technologies in the industry of holographic generation around the world. In this paper, a general description of the origination system is provided as well as some examples of its capabilities.
Strategies for Validation Testing of Ground Systems
NASA Technical Reports Server (NTRS)
Annis, Tammy; Sowards, Stephanie
2009-01-01
In order to accomplish the full Vision for Space Exploration announced by former President George W. Bush in 2004, NASA will have to develop a new space transportation system and supporting infrastructure. The main portion of this supporting infrastructure will reside at the Kennedy Space Center (KSC) in Florida and will either be newly developed or a modification of existing vehicle processing and launch facilities, including Ground Support Equipment (GSE). This type of large-scale launch site development is unprecedented since the time of the Apollo Program. In order to accomplish this successfully within the limited budget and schedule constraints, a combination of traditional and innovative strategies for Verification and Validation (V&V) has been developed. The core of these strategies consists of a building-block approach to V&V, starting with component V&V and ending with a comprehensive end-to-end validation test of the complete launch site, called a Ground Element Integration Test (GEIT). This paper will outline these strategies and provide the high-level planning for meeting the challenges of implementing V&V on a large-scale development program. KEY WORDS: Systems, Elements, Subsystem, Integration Test, Ground Systems, Ground Support Equipment, Component, End Item, Test and Verification Requirements (TVR), Verification Requirements (VR)
Design and development of the 2m resolution camera for ROCSAT-2
NASA Astrophysics Data System (ADS)
Uguen, Gilbert; Luquet, Philippe; Chassat, François
2017-11-01
EADS-Astrium has recently completed the development of a 2m-resolution camera, the RSI (Remote Sensing Instrument), for the small satellite ROCSAT-2, which is the second component of the long-term space program of the Republic of China. The National Space Program Office of Taïwan selected EADS-Astrium as the Prime Contractor for the development of the spacecraft, including the bus and the main instrument, RSI. The main challenges for the RSI development were to introduce innovative technologies in order to meet the high performance requirements while achieving the design simplicity necessary for the mission (low mass, low power), and to adopt a development and verification approach compatible with the very tight development schedule. This paper describes the instrument design together with the development and verification logic that were implemented to successfully meet these objectives.
Using Model Replication to Improve the Reliability of Agent-Based Models
NASA Astrophysics Data System (ADS)
Zhong, Wei; Kim, Yushim
The basic presupposition of model replication activities for a computational model such as an agent-based model (ABM) is that, as a robust and reliable tool, it must be replicable in other computing settings. This assumption has recently gained attention in the artificial society and simulation community due to the challenges of model verification and validation. Illustrating the replication of an ABM representing fraudulent behavior in a public service delivery system, originally developed in the Java-based MASON toolkit and replicated in NetLogo by a different author, this paper exemplifies how model replication exercises provide unique opportunities for the model verification and validation process. At the same time, it helps accumulate best practices and patterns of model replication and contributes to the agenda of developing a standard methodological protocol for agent-based social simulation.
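The abstract does not specify how outputs of the MASON original and the NetLogo replication were compared; one common replication check is distributional equivalence of a run-level output statistic, sketched here with a two-sample Kolmogorov-Smirnov test (the statistic and significance level are illustrative):

```python
import numpy as np
from scipy.stats import ks_2samp

def replication_check(original_runs, replicated_runs, alpha=0.05):
    """Compare a per-run output (e.g., detected fraud rate) from the original
    and replicated ABM implementations; a high p-value means the two output
    distributions are statistically indistinguishable at level alpha."""
    stat, p_value = ks_2samp(original_runs, replicated_runs)
    return {"ks_statistic": stat, "p_value": p_value, "consistent": p_value > alpha}

# Illustrative use with synthetic run-level outputs:
rng = np.random.default_rng(0)
print(replication_check(rng.normal(0.10, 0.02, 50), rng.normal(0.10, 0.02, 50)))
```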
A Practical Approach to Identity on Digital Ecosystems Using Claim Verification and Trust
NASA Astrophysics Data System (ADS)
McLaughlin, Mark; Malone, Paul
Central to the ethos of digital ecosystems (DEs) is that DEs should be distributed and have no central points of failure or control. This essentially mandates a decentralised system, which poses significant challenges for identity. Identity in decentralised environments must be treated very differently to identity in traditional environments, where centralised naming, authentication and authorisation can be assumed, and where identifiers can be considered global and absolute. In the absence of such guarantees we have expanded on the OPAALS identity model to produce a general implementation for the OPAALS DE that uses a combination of identity claim verification protocols and trust to give assurances in place of centralised servers. We outline how the components of this implementation function and give an illustrated workflow of how identity issues are solved on the OPAALS DE in practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woods, Nathan; Menikoff, Ralph
2017-02-03
Equilibrium thermodynamics underpins many of the technologies used throughout theoretical physics, yet verification of the various theoretical models in the open literature remains challenging. EOSlib provides a single, consistent, verifiable implementation of these models, in a single, easy-to-use software package. It consists of three parts: a software library implementing various published equation-of-state (EOS) models; a database of fitting parameters for various materials for these models; and a number of useful utility functions for simplifying thermodynamic calculations such as computing Hugoniot curves or Riemann problem solutions. Ready availability of this library will enable reliable code-to-code testing of equation-of-state implementations, as well as a starting point for more rigorous verification work. EOSlib also provides a single, consistent API for its analytic and tabular EOS models, which simplifies the process of comparing models for a particular application.
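The EOSlib API itself is not shown in the abstract; as an example of the kind of utility calculation it mentions, the principal Hugoniot of an ideal-gas equation of state has a closed form that can serve as a verification reference (gamma and the reference state below are assumed values):

```python
import numpy as np

GAMMA = 1.4                    # ratio of specific heats (assumed diatomic ideal gas)
P0, RHO0 = 1.0e5, 1.2          # assumed reference state: Pa, kg/m^3

def hugoniot_pressure(rho):
    """Pressure on the principal Hugoniot of an ideal gas centred at (P0, RHO0):
    p/p0 = [(g+1)r - (g-1)] / [(g+1) - (g-1)r], with r = rho/rho0 and g = GAMMA."""
    r = rho / RHO0
    return P0 * ((GAMMA + 1) * r - (GAMMA - 1)) / ((GAMMA + 1) - (GAMMA - 1) * r)

# Compression is bounded by (g+1)/(g-1) = 6 for GAMMA = 1.4:
for r in (1.0, 2.0, 4.0, 5.5):
    print(f"rho/rho0 = {r:4.2f}   p = {hugoniot_pressure(r * RHO0):10.3e} Pa")
```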
Efficient Verification of Holograms Using Mobile Augmented Reality.
Hartl, Andreas Daniel; Arth, Clemens; Grubert, Jens; Schmalstieg, Dieter
2016-07-01
Paper documents such as passports, visas and banknotes are frequently checked by inspection of security elements. In particular, optically variable devices such as holograms are important, but difficult to inspect. Augmented Reality can provide all relevant information on standard mobile devices. However, hologram verification on mobiles still takes long and provides lower accuracy than inspection by human individuals using appropriate reference information. We aim to address these drawbacks by automatic matching combined with a special parametrization of an efficient goal-oriented user interface which supports constrained navigation. We first evaluate a series of similarity measures for matching hologram patches to provide a sound basis for automatic decisions. Then a re-parametrized user interface is proposed based on observations of typical user behavior during document capture. These measures help to reduce capture time to approximately 15 s with better decisions regarding the evaluated samples than what can be achieved by untrained users.
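The paper evaluates a series of similarity measures for matching hologram patches; normalized cross-correlation is one standard candidate, sketched below (the acceptance threshold is illustrative and would be tuned on genuine and counterfeit samples):

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Normalized cross-correlation of two equally sized grayscale patches;
    returns a value in [-1, 1], with 1 meaning identical up to gain and offset."""
    a = patch_a.astype(float).ravel() - patch_a.mean()
    b = patch_b.astype(float).ravel() - patch_b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

def patch_matches_reference(captured, reference, threshold=0.7):
    """Accept the captured view of the hologram if it is sufficiently similar
    to the stored reference appearance for that viewing condition."""
    return ncc(captured, reference) >= threshold
```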
Verification study of an emerging fire suppression system
Cournoyer, Michael E.; Waked, R. Ryan; Granzow, Howard N.; ...
2016-01-01
Self-contained fire extinguishers are a robust, reliable and minimally invasive means of fire suppression for gloveboxes. Moreover, plutonium gloveboxes present harsh environmental conditions for polymer materials; these include radiation damage and chemical exposure, both of which tend to degrade the lifetime of engineered polymer components. Several studies have been conducted to determine the robustness of self-contained fire extinguishers in plutonium gloveboxes; before deployment in a nuclear facility, verification tests must be performed. These tests include activation and mass loss calorimeter tests. In addition, compatibility issues with chemical components of the self-contained fire extinguishers need to be addressed. Our study presents activation and mass loss calorimeter test results. After extensive studies, no critical areas of concern have been identified for the plutonium glovebox application of Fire Foe™, except for glovebox operations that use large quantities of bulk plutonium or uranium metal, such as metal casting and pyro-chemistry operations.
Precise and Scalable Static Program Analysis of NASA Flight Software
NASA Technical Reports Server (NTRS)
Brat, G.; Venet, A.
2005-01-01
Recent NASA mission failures (e.g., Mars Polar Lander and Mars Orbiter) illustrate the importance of having an efficient verification and validation process for such systems. One software error, as simple as it may be, can cause the loss of an expensive mission, or lead to budget overruns and crunched schedules. Unfortunately, traditional verification methods cannot guarantee the absence of errors in software systems. Therefore, we have developed the CGS static program analysis tool, which can exhaustively analyze large C programs. CGS analyzes the source code and identifies statements in which arrays are accessed out of bounds, or, pointers are used outside the memory region they should address. This paper gives a high-level description of CGS and its theoretical foundations. It also reports on the use of CGS on real NASA software systems used in Mars missions (from Mars PathFinder to Mars Exploration Rover) and on the International Space Station.
Verification and Validation of NASA-Supported Enhancements to Decision Support Tools of PECAD
NASA Technical Reports Server (NTRS)
Ross, Kenton W.; McKellip, Rodney; Moore, Roxzana F.; Fendley, Debbie
2005-01-01
This section of the evaluation report summarizes the verification and validation (V&V) of recently implemented, NASA-supported enhancements to the decision support tools of the Production Estimates and Crop Assessment Division (PECAD). The implemented enhancements include operationally tailored Moderate Resolution Imaging Spectroradiometer (MODIS) products and products of the Global Reservoir and Lake Monitor (GRLM). The MODIS products are currently made available through two separate decision support tools: the MODIS Image Gallery and the U.S. Department of Agriculture (USDA) Foreign Agricultural Service (FAS) MODIS Normalized Difference Vegetation Index (NDVI) Database. Both the Global Reservoir and Lake Monitor and MODIS Image Gallery provide near-real-time products through PECAD's CropExplorer. This discussion addresses two areas: 1. Assessments of the standard NASA products on which these enhancements are based. 2. Characterizations of the performance of the new operational products.
Verification testing of the ZENON Environmental Inc. ZeeWeed ZW500 UF Drinking Water System was conducted from 2/6-3/7/99. The treatment system underwent Giardia and Cryptosporidium removal challenge testing on 3/2/99 and demonstrated a 5.3 log10 removal of Giardia cysts and a 6...
Jens T. Stevens; Hugh D. Safford; Malcolm P. North; Jeremy S. Fried; Andrew N. Gray; Peter M. Brown; Christopher R. Dolanc; Solomon Z. Dobrowski; Donald A. Falk; Calvin A. Farris; Jerry F. Franklin; Peter Z. Fulé; R. Keala Hagmann; Eric E. Knapp; Jay D. Miller; Douglas F. Smith; Thomas W. Swetnam; Alan H. Taylor; Julia A. Jones
2016-01-01
Quantifying historical fire regimes provides important information for managing contemporary forests. Historical fire frequency and severity can be estimated using several methods; each method has strengths and weaknesses and presents challenges for interpretation and verification. Recent efforts to quantify the timing of historical high-severity fire events in forests...
Freight transportation : strategies needed to address planning and financing limitations
DOT National Transportation Integrated Search
2003-12-01
The General Accounting Office (GAO) was asked to address (1) the challenges to freight mobility, (2) the limitations key stakeholders have encountered in addressing these challenges, and (3) strategies that may aid decision makers in enhancing freigh...
Performance Limits of Non-Line-of-Sight Optical Communications
2015-05-01
...(LEDs), solar blind filters, and high efficiency solar blind photo detectors. In this project, we address the main challenges towards optimizing the UV communication system...
Verification of Meteorological and Oceanographic Ensemble Forecasts in the U.S. Navy
NASA Astrophysics Data System (ADS)
Klotz, S.; Hansen, J.; Pauley, P.; Sestak, M.; Wittmann, P.; Skupniewicz, C.; Nelson, G.
2013-12-01
The Navy Ensemble Forecast Verification System (NEFVS) has been promoted recently to operational status at the U.S. Navy's Fleet Numerical Meteorology and Oceanography Center (FNMOC). NEFVS processes FNMOC and National Centers for Environmental Prediction (NCEP) meteorological and ocean wave ensemble forecasts, gridded forecast analyses, and innovation (observational) data output by FNMOC's data assimilation system. The NEFVS framework consists of statistical analysis routines, a variety of pre- and post-processing scripts to manage data and plot verification metrics, and a master script to control application workflow. NEFVS computes metrics that include forecast bias, mean-squared error, conditional error, conditional rank probability score, and Brier score. The system also generates reliability and Receiver Operating Characteristic diagrams. In this presentation we describe the operational framework of NEFVS and show examples of verification products computed from ensemble forecasts, meteorological observations, and forecast analyses. The construction and deployment of NEFVS addresses important operational and scientific requirements within Navy Meteorology and Oceanography. These include computational capabilities for assessing the reliability and accuracy of meteorological and ocean wave forecasts in an operational environment, for quantifying effects of changes and potential improvements to the Navy's forecast models, and for comparing the skill of forecasts from different forecast systems. NEFVS also supports the Navy's collaboration with the U.S. Air Force, NCEP, and Environment Canada in the North American Ensemble Forecast System (NAEFS) project and with the Air Force and the National Oceanic and Atmospheric Administration (NOAA) in the National Unified Operational Prediction Capability (NUOPC) program. This program is tasked with eliminating unnecessary duplication within the three agencies, accelerating the transition of new technology, such as multi-model ensemble forecasting, to U.S. Department of Defense use, and creating a superior U.S. global meteorological and oceanographic prediction capability. Forecast verification is an important component of NAEFS and NUOPC. Distribution Statement A: Approved for Public Release; distribution is unlimited
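The metrics listed above are standard; a minimal numpy sketch of two of them, the Brier score and the bias/RMSE of the ensemble mean, applied to synthetic data (this illustrates the definitions only and is not the NEFVS implementation):

```python
import numpy as np

def bias_and_rmse(ens_mean, obs):
    """Bias and root-mean-square error of the ensemble-mean forecast."""
    err = ens_mean - obs
    return float(err.mean()), float(np.sqrt((err ** 2).mean()))

def brier_score(ens_members, obs, threshold):
    """Brier score for the event 'value exceeds threshold'; ens_members has
    shape (n_cases, n_members) and the forecast probability is the fraction
    of members exceeding the threshold."""
    prob = (ens_members > threshold).mean(axis=1)
    outcome = (obs > threshold).astype(float)
    return float(((prob - outcome) ** 2).mean())

# Synthetic 10-member wind-speed forecasts (m/s) for illustration:
rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 4.0, size=200)
ens = obs[:, None] + rng.normal(0.5, 2.0, size=(200, 10))
print(bias_and_rmse(ens.mean(axis=1), obs), brier_score(ens, obs, threshold=10.0))
```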
Verification of Meteorological and Oceanographic Ensemble Forecasts in the U.S. Navy
NASA Astrophysics Data System (ADS)
Klotz, S. P.; Hansen, J.; Pauley, P.; Sestak, M.; Wittmann, P.; Skupniewicz, C.; Nelson, G.
2012-12-01
The Navy Ensemble Forecast Verification System (NEFVS) has been promoted recently to operational status at the U.S. Navy's Fleet Numerical Meteorology and Oceanography Center (FNMOC). NEFVS processes FNMOC and National Centers for Environmental Prediction (NCEP) meteorological and ocean wave ensemble forecasts, gridded forecast analyses, and innovation (observational) data output by FNMOC's data assimilation system. The NEFVS framework consists of statistical analysis routines, a variety of pre- and post-processing scripts to manage data and plot verification metrics, and a master script to control application workflow. NEFVS computes metrics that include forecast bias, mean-squared error, conditional error, conditional rank probability score, and Brier score. The system also generates reliability and Receiver Operating Characteristic diagrams. In this presentation we describe the operational framework of NEFVS and show examples of verification products computed from ensemble forecasts, meteorological observations, and forecast analyses. The construction and deployment of NEFVS addresses important operational and scientific requirements within Navy Meteorology and Oceanography (METOC). These include computational capabilities for assessing the reliability and accuracy of meteorological and ocean wave forecasts in an operational environment, for quantifying effects of changes and potential improvements to the Navy's forecast models, and for comparing the skill of forecasts from different forecast systems. NEFVS also supports the Navy's collaboration with the U.S. Air Force, NCEP, and Environment Canada in the North American Ensemble Forecast System (NAEFS) project and with the Air Force and the National Oceanic and Atmospheric Administration (NOAA) in the National Unified Operational Prediction Capability (NUOPC) program. This program is tasked with eliminating unnecessary duplication within the three agencies, accelerating the transition of new technology, such as multi-model ensemble forecasting, to U.S. Department of Defense use, and creating a superior U.S. global meteorological and oceanographic prediction capability. Forecast verification is an important component of NAEFS and NUOPC.
2010-01-01
The 6th Meeting of the Global Alliance to Eliminate Lymphatic Filariasis (GAELF6) was held 1-3 June, 2010 in Seoul, Korea, with 150 participants from 38 countries. The year 2010 marks the midpoint between the first GAELF meeting, in 2000, and the World Health Organization (WHO) 2020 goal of global elimination of lymphatic filariasis (LF) as a public health problem. The theme of the meeting, "Half-time in LF Elimination: Teaming Up with Neglected Tropical Diseases (NTDs)," reflected significant integration of LF elimination programmes into a comprehensive initiative to control NTDs. Presentations on LF epidemiology, treatment, research, and programmes highlighted both accomplishments and remaining challenges. The WHO strategy to interrupt LF transmission is based on annual mass drug administration (MDA) using two-drug combinations. After mapping the geographic distribution of LF, MDA is implemented for ≥ 5 years, followed by a period of post-MDA surveillance, and, ultimately, verification of LF elimination. Morbidity management further reduces disease burden. Of 81 countries considered LF-endemic in 2000, 52 (64.2%) have begun MDA; 10 (12.3%) others with low-level transmission are unlikely to require MDA. In 2008, ~695 million people were offered treatment (51.7% of the at-risk population); ~496 million participated. Approximately 22 million people have been protected from LF infection and disease, with savings of ~US $24.2 billion. Morbidity management programmes have been implemented in 27 (33.3%) countries. Significant challenges to LF elimination remain. These include: initiating MDA in the remaining 19 countries that require it; achieving full geographic coverage in countries where MDA has started; finding alternative strategies to address the problem of Loa loa co-endemicity in Central Africa; developing strategies to treat urban populations; initiating and sustaining MDA in settings of armed conflict; developing refined guidelines and procedures for stopping MDA, for post-MDA surveillance, and for verifying the elimination of LF; and integrating morbidity management into all LF elimination programmes. Scientific research and enhanced advocacy for NTDs remain critical for addressing these challenges. GAELF6 was characterized by enthusiasm and recognition that "teaming up with NTDs" offers opportunities for new partnerships, fresh perspectives, enhanced advocacy, and greater programmatic integration in a rapidly changing global health environment. PMID:20961435
Quantitative assessment of the physical potential of proton beam range verification with PET/CT.
Knopf, A; Parodi, K; Paganetti, H; Cascio, E; Bonab, A; Bortfeld, T
2008-08-07
A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6 degrees to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.
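The abstract does not spell out how range discrepancies were extracted from the profiles; a common convention is to compare the depth of the distal 50% falloff of the measured and simulated activity distributions, sketched here under that assumption:

```python
import numpy as np

def distal_falloff_depth(depth_mm, activity, fraction=0.5):
    """Depth at which the activity profile drops to 'fraction' of its maximum
    on the distal side of the peak, found by linear interpolation."""
    a = np.asarray(activity, dtype=float)
    level = fraction * a.max()
    for i in range(int(np.argmax(a)), len(a) - 1):
        if a[i] >= level > a[i + 1]:
            t = (a[i] - level) / (a[i] - a[i + 1])
            return float(depth_mm[i] + t * (depth_mm[i + 1] - depth_mm[i]))
    return float(depth_mm[-1])

def range_shift_mm(depth_mm, measured, simulated):
    """Shift between measured and Monte Carlo-predicted PET activity profiles,
    evaluated at the distal 50% falloff position."""
    return distal_falloff_depth(depth_mm, measured) - distal_falloff_depth(depth_mm, simulated)
```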
NASA Astrophysics Data System (ADS)
Bonfiglio, D.; Chacón, L.; Cappello, S.
2010-08-01
With the increasing impact of scientific discovery via advanced computation, there is presently a strong emphasis on ensuring the mathematical correctness of computational simulation tools. Such endeavor, termed verification, is now at the center of most serious code development efforts. In this study, we address a cross-benchmark nonlinear verification study between two three-dimensional magnetohydrodynamics (3D MHD) codes for fluid modeling of fusion plasmas, SPECYL [S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996)] and PIXIE3D [L. Chacón, Phys. Plasmas 15, 056103 (2008)], in their common limit of application: the simple viscoresistive cylindrical approximation. SPECYL is a serial code in cylindrical geometry that features a spectral formulation in space and a semi-implicit temporal advance, and has been used extensively to date for reversed-field pinch studies. PIXIE3D is a massively parallel code in arbitrary curvilinear geometry that features a conservative, solenoidal finite-volume discretization in space, and a fully implicit temporal advance. The present study is, in our view, a first mandatory step in assessing the potential of any numerical 3D MHD code for fluid modeling of fusion plasmas. Excellent agreement is demonstrated over a wide range of parameters for several fusion-relevant cases in both two- and three-dimensional geometries.
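The abstract does not state which quantitative measure underlies the reported agreement; a relative L2 discrepancy between the two codes' fields interpolated to a common grid is one typical cross-benchmark metric, sketched here:

```python
import numpy as np

def relative_l2_discrepancy(field_a, field_b):
    """Relative L2 difference between the same physical field (e.g., a magnetic
    field component on a shared grid) computed by two codes; values much
    smaller than 1 indicate close nonlinear agreement."""
    a, b = np.asarray(field_a, float), np.asarray(field_b, float)
    return float(np.linalg.norm(a - b) / np.linalg.norm(0.5 * (a + b)))
```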
Integrating Formal Methods and Testing 2002
NASA Technical Reports Server (NTRS)
Cukic, Bojan
2002-01-01
Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only for formal verification and program testing, but also for the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or higher). The coming years shall address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The objectives are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a given confidence level; D) quantify and justify the reliability estimate for systems developed using various methods.
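The motivation for crediting other V&V evidence can be made concrete with the standard result for failure-free reliability demonstration testing; a short sketch of that calculation (a textbook formula, not the framework developed in this research):

```python
import math

def failure_free_tests_required(failure_prob_bound, confidence):
    """Smallest number n of independent, failure-free tests such that
    (1 - p)^n <= 1 - C, i.e. the per-demand failure probability can be claimed
    to be below 'failure_prob_bound' with the given confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - failure_prob_bound))

# Why testing alone struggles at ultra-high reliability levels:
for p in (1e-3, 1e-4, 1e-5):
    n = failure_free_tests_required(p, confidence=0.99)
    print(f"p < {p:g} at 99% confidence needs {n:,} failure-free tests")
```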
NASA Technical Reports Server (NTRS)
Toral, Marco; Wesdock, John; Kassa, Abby; Pogorelc, Patsy; Jenkens, Robert (Technical Monitor)
2002-01-01
In June 2000, NASA launched the first of three next generation Tracking and Data Relay Satellites (TDRS-H) equipped with a Ka-band forward and return service capability. This Ka-band service supports forward data rates up to 25 Mb/sec using the 22.55 - 23.55 GHz space-to-space allocation. Return services are supported via channel bandwidths of 225 and 650 MHz for data rates up to 800 Mb/sec (QPSK) using the 25.25 - 27.5 GHz space-to-space allocation. As part of NASA's acceptance of the TDRS-H spacecraft, an extensive on-orbit calibration, verification and characterization effort was performed to ensure that on-orbit spacecraft performance is within specified limits. This process verified the compliance of the Ka-band communications payload with all performance specifications and demonstrated an end-to-end Ka-band service capability. This paper summarizes the results of the TDRS-H Ka-band communications payload on-orbit performance verification and end-to-end service characterization. Performance parameters addressed include Effective Isotropically Radiated Power (EIRP), antenna Gain-to-System Noise Temperature (G/T), antenna gain pattern, frequency tunability and accuracy, channel magnitude response, and Ka-band service Bit-Error-Rate (BER) performance.
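EIRP and G/T enter service performance through the downlink budget; a minimal sketch of that combination (the orbit distance, losses and example values are illustrative, not TDRS-H specification numbers):

```python
import math

BOLTZMANN_DBW_PER_K_HZ = -228.6   # 10*log10(Boltzmann constant)
C_M_PER_S = 299_792_458.0

def free_space_path_loss_db(distance_m, freq_hz):
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / C_M_PER_S)

def cn0_dbhz(eirp_dbw, gt_dbk, distance_m, freq_hz, other_losses_db=0.0):
    """Carrier-to-noise-density ratio:
    C/N0 = EIRP - L_path - L_other + G/T - 10*log10(k)."""
    return (eirp_dbw - free_space_path_loss_db(distance_m, freq_hz)
            - other_losses_db + gt_dbk - BOLTZMANN_DBW_PER_K_HZ)

# Illustrative Ka-band crosslink geometry and values:
print(round(cn0_dbhz(eirp_dbw=63.0, gt_dbk=26.0, distance_m=45_000e3,
                     freq_hz=26.0e9, other_losses_db=2.0), 1), "dB-Hz")
```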
NASA Technical Reports Server (NTRS)
Toral, Marco; Wesdock, John; Kassa, Abby; Pogorelc, Patsy; Jenkens, Robert (Technical Monitor)
2002-01-01
In June 2000, NASA launched the first of three next generation Tracking and Data Relay Satellites (TDRS-H) equipped with a Ka-band forward and return service capability. This Ka-band service supports forward data rates of up to 25 Mb/sec using the 22.55-23.55 GHz space-to-space allocation. Return services are supported via channel bandwidths of 225 and 650 MHz for data rates up to at least 800 Mb/sec using the 25.25 - 27.5 GHz space-to-space allocation. As part of NASA's acceptance of the TDRS-H spacecraft, an extensive on-orbit calibration, verification and characterization effort was performed to ensure that on-orbit spacecraft performance is within specified limits. This process verified the compliance of the Ka-band communications payload with all performance specifications, and demonstrated an end-to-end Ka-band service capability. This paper summarizes the results of the TDRS-H Ka-band communications payload on-orbit performance verification and end-to-end service characterization. Performance parameters addressed include antenna gain pattern, antenna Gain-to-System Noise Temperature (G/T), Effective Isotropically Radiated Power (EIRP), antenna pointing accuracy, frequency tunability, channel magnitude response, and Ka-band service Bit-Error-Rate (BER) performance.
Power System Test and Verification at Satellite Level
NASA Astrophysics Data System (ADS)
Simonelli, Giulio; Mourra, Olivier; Tonicello, Ferdinando
2008-09-01
Most of the articles on Power Systems deal with the architecture and technical solutions related to the functionalities of the power system and their performance. Very few articles, if any, address integration and verification aspects of the Power System at satellite level and the related issues with the Power EGSE (Electrical Ground Support Equipment), which also has to support the AIT/AIV (Assembly Integration Test and Verification) program of the satellite and, eventually, the launch campaign. In recent years a more complex development and testing concept based on MDVE (Model Based Development and Verification Environment) has been introduced. In the MDVE approach the simulation software is used to simulate the satellite environment and, in the early stages, the satellite units. This approach changed the Power EGSE requirements significantly. Power EGSEs or, better, Power SCOEs (Special Check Out Equipment) are now requested to provide the instantaneous power generated by the solar array throughout the orbit. To achieve that, the Power SCOE interfaces to the RTS (Real Time Simulator) of the MDVE. The RTS provides the instantaneous settings belonging to that point along the orbit to the Power SCOE, so that the Power SCOE generates the instantaneous {I,V} curve of the SA (Solar Array). That means a real-time test for the power system, which is even more valuable for EO (Earth Observation) satellites, where the solar array aspect angle to the sun is rarely fixed and the power load profile can be particularly complex (for example, in radar applications). In this article the major issues related to integration and testing of Power Systems will be discussed, taking into account different power system topologies (e.g. regulated bus, unregulated bus, battery bus, based on MPPT or S3R…). Aspects of Power System AIT I/Fs (interfaces), umbilical I/Fs with the launcher, and Power SCOE I/Fs will also be addressed. Last but not least, the protection strategy of the Power System during the AIT/AIV program will also be discussed. The objective of this discussion is also to provide the Power System Engineer with a checklist of key aspects linked to the satellite AIT/AIV program that have to be considered in the early phases of a new power system development.
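One common way for a Power SCOE to synthesize the instantaneous {I,V} curve described above is a diode-based solar array model driven by the photocurrent for the current sun aspect angle; a minimal single-diode sketch with illustrative parameters (not a specific SCOE implementation):

```python
import numpy as np

KT_OVER_Q = 0.02585            # thermal voltage at ~300 K, volts (assumed)

def array_current(v_bus, i_photo, i_sat=1e-9, cells_in_series=80, ideality=1.3):
    """Terminal current of a solar-array string at voltage v_bus using a
    single-diode model without series/shunt resistance; i_photo is the
    photocurrent supplied for the instantaneous orbital point and sun angle."""
    v_cell = v_bus / cells_in_series
    return i_photo - i_sat * (np.exp(v_cell / (ideality * KT_OVER_Q)) - 1.0)

# One {I, V} curve for a given orbital point (illustrative parameters):
v = np.linspace(0.0, 60.0, 7)
print(np.round(array_current(v, i_photo=2.4), 3))
```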
2011-01-01
On February 15, 2008, the National Academy of Engineering unveiled their list of 14 Grand Challenges for Engineering. Building off of tremendous advancements in the past century, these challenges were selected for their role in assuring a sustainable existence for the rapidly increasing global community. It is no accident that the first five Challenges on the list involve the development of sustainable energy sources and management of environmental resources. While the focus of this review is to address the single Grand Challenge of "develop carbon sequestration methods", it will soon be clear that several other Challenges are intrinsically tied to it through the principles of sustainability. How does the realm of biological engineering play a role in addressing these Grand Challenges? PMID:22047501
Workshop on Assurance for Autonomous Systems for Aviation
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Davies, Misty; Giannakopoulou, Dimitra; Neogi, Natasha
2016-01-01
This report describes the workshop on Assurance for Autonomous Systems for Aviation that was held in January 2016 in conjunction with the SciTech 2016 conference held in San Diego, CA. The workshop explored issues related to assurance for autonomous systems and also the idea of trust in these systems. Specifically, we focused on discussing current practices for assurance of autonomy, identifying barriers specific to autonomy as related to assurance as well as operational scenarios demonstrating the need to address the barriers. Furthermore, attention was given to identifying verification techniques that may be applicable to autonomy, as well as discussing new research directions needed to address barriers, thereby involving potential shifts in current practices.
Minimum accommodation for aerobrake assembly, phase 2
NASA Technical Reports Server (NTRS)
Katzberg, Stephen J.; Haynes, Davy A.; Tutterow, Robin D.; Watson, Judith J.; Russell, James W.
1994-01-01
A multi-element study was done to assess the practicality of a Space Station Freedom-based aerobrake system for the Space Exploration Initiative. The study was organized into six parts related to structure, aerodynamics, robotics and assembly, thermal protection system, inspection, and verification, all tied together by an integration study. The integration activity managed the broad issues related to meeting mission requirements. This report is a summary of the issues addressed by the integration team.
A System for Mailpiece ZIP Code Assignment through Contextual Analysis. Phase 2
1991-03-01
Segmentation, Address Block Interpretation, Automatic Feature Generation, Word Recognition, Feature Detection, Word Verification, Optical Character Recognition, Directory... in the Phase III effort. 1.1 Motivation: The United States Postal Service (USPS) deploys large numbers of optical character recognition (OCR) machines...
Riley, Mark R; Gerba, Charles P; Elimelech, Menachem
2011-03-31
The U.S. National Academy of Engineering (NAE) recently published a document presenting "Grand Challenges for Engineering". This list was proposed by leading engineers and scientists from around the world at the request of the U.S. National Science Foundation (NSF). Fourteen topics were selected for these grand challenges, and at least seven can be addressed using the tools and methods of biological engineering. Here we describe how biological engineers can address the challenge of providing access to clean drinking water. This issue must be addressed in part by removing or inactivating microbial and chemical contaminants in order to properly deliver water safe for human consumption. Despite many advances in technologies this challenge is expanding due to increased pressure on fresh water supplies and to new opportunities for growth of potentially pathogenic organisms.
Biological approaches for addressing the grand challenge of providing access to clean drinking water
2011-01-01
The U.S. National Academy of Engineering (NAE) recently published a document presenting "Grand Challenges for Engineering". This list was proposed by leading engineers and scientists from around the world at the request of the U.S. National Science Foundation (NSF). Fourteen topics were selected for these grand challenges, and at least seven can be addressed using the tools and methods of biological engineering. Here we describe how biological engineers can address the challenge of providing access to clean drinking water. This issue must be addressed in part by removing or inactivating microbial and chemical contaminants in order to properly deliver water safe for human consumption. Despite many advances in technologies this challenge is expanding due to increased pressure on fresh water supplies and to new opportunities for growth of potentially pathogenic organisms. PMID:21453515
Tackling the x-ray cargo inspection challenge using machine learning
NASA Astrophysics Data System (ADS)
Jaccard, Nicolas; Rogers, Thomas W.; Morton, Edward J.; Griffin, Lewis D.
2016-05-01
The current infrastructure for non-intrusive inspection of cargo containers cannot accommodate exploding commerce volumes and increasingly stringent regulations. There is a pressing need to develop methods to automate parts of the inspection workflow, enabling expert operators to focus on a manageable number of high-risk images. To tackle this challenge, we developed a modular framework for automated X-ray cargo image inspection. Employing state-of-the-art machine learning approaches, including deep learning, we demonstrate high performance for empty container verification and specific threat detection. This work constitutes a significant step towards the partial automation of X-ray cargo image inspection.
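The abstract does not specify the network architecture; purely as a hedged illustration of the kind of binary image classifier implied by empty-container verification, a minimal PyTorch-style sketch (hypothetical layer widths and input format, not the authors' model) could look like this:

# Hypothetical sketch of a small CNN for empty-vs-non-empty container
# classification; layer sizes and input format are illustrative only and
# do not reproduce the architecture used in the paper.
import torch
from torch import nn

class EmptyContainerVerifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),          # pool to one value per channel
        )
        self.classifier = nn.Linear(32, 1)    # logit for "container is non-empty"

    def forward(self, x):                     # x: (batch, 1, H, W) X-ray image
        h = self.features(x).flatten(1)
        return self.classifier(h)

model = EmptyContainerVerifier()
loss_fn = nn.BCEWithLogitsLoss()              # trained against empty/non-empty labels

Such a classifier would be only one module of the framework described above; routing images to human operators based on predicted risk is a separate step.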
NASA Technical Reports Server (NTRS)
Hebert, Phillip W.
2008-01-01
NASA/SSC's Mission in Rocket Propulsion Testing Is to Acquire Test Performance Data for Verification, Validation and Qualification of Propulsion Systems Hardware: Accurate, Reliable, Comprehensive, and Timely. Data Acquisition in a Rocket Propulsion Test Environment Is Challenging: a) Severe Temporal Transient Dynamic Environments; b) Large Thermal Gradients; c) Vacuum to high pressure regimes. A-3 Test Stand Development is equally challenging with respect to accommodating vacuum environment, operation of a CSG system, and a large quantity of data system and control channels to determine proper engine performance as well as Test Stand operation. SSC is currently in the process of providing modernized DAS, Control Systems, Video, and network systems for the A-3 Test Stand to overcome these challenges.
Nuclear Nonproliferation Ontology Assessment Team Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strasburg, Jana D.; Hohimer, Ryan E.
Final Report for the NA22 Simulations, Algorithm and Modeling (SAM) Ontology Assessment Team's efforts from FY09-FY11. The Ontology Assessment Team began in May 2009 and concluded in September 2011. During this two-year time frame, the Ontology Assessment team had two objectives: (1) Assessing the utility of knowledge representation and semantic technologies for addressing nuclear nonproliferation challenges; and (2) Developing ontological support tools that would provide a framework for integrating across the Simulation, Algorithm and Modeling (SAM) program. The SAM Program was going through a large assessment and strategic planning effort during this time and as a result, the relative importance of these two objectives changed, altering the focus of the Ontology Assessment Team. In the end, the team conducted an assessment of the state of the art, created an annotated bibliography, and developed a series of ontological support tools, demonstrations and presentations. A total of more than 35 individuals from 12 different research institutions participated in the Ontology Assessment Team. These included subject matter experts in several nuclear nonproliferation-related domains as well as experts in semantic technologies. Despite the diverse backgrounds and perspectives, the Ontology Assessment team functioned very well together and aspects could serve as a model for future inter-laboratory collaborations and working groups. While the team encountered several challenges and learned many lessons along the way, the Ontology Assessment effort was ultimately a success that led to several multi-lab research projects and opened up a new area of scientific exploration within the Office of Nuclear Nonproliferation and Verification.
Verification testing of the Aquasource Ultrafiltration Treatment System Model A35 was conducted from 12/1 - 12/31/98. The treatment system underwent microbial challenge testing on 1/22/99 and demonstrated a 5.5 log10 removal of Giardia cysts and a 6.5 log10 removal of Cryptospori...
Hayabusa: Navigation Challenges for Earth Return
NASA Technical Reports Server (NTRS)
Haw, Robert J.; Bhaskaran, S.; Strauss, W.; Sklyanskiy, E.; Graat, E. J.; Smith, J. J.; Menom, P.; Ardalan, S.; Ballard, C.; Williams, P.;
2011-01-01
Hayabusa was a JAXA sample-return mission to Itokawa navigated, in part, by JPL personnel. Hayabusa survived several near mission-ending failures at Itokawa yet returned to Earth with an asteroid regolith sample on June 13, 2010. This paper describes NASA/JPL's participation in the Hayabusa mission during the last 100 days of its mission, wherein JPL provided tracking data and orbit determination, plus verification of maneuver design and entry, descent and landing.
1987-06-01
described the state )f ruaturity of software engineering as being equivalent to the state of maturity of Civil Engineering before Pythagoras invented the...formal verification languages, theorem provers or secure configuration 0 management tools would have to be maintained and used in the PDSS Center to
Pinheiro, Alexandre; Dias Canedo, Edna; de Sousa Junior, Rafael Timoteo; de Oliveira Albuquerque, Robson; García Villalba, Luis Javier; Kim, Tai-Hoon
2018-03-02
Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client point-of-view and to implement this CSS in public clouds since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should trust their data to be in the cloud for a long period of time, without the burden of keeping copies of the original data, nor of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance.
2018-01-01
Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client point-of-view and to implement this CSS in public clouds since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should trust their data to be in the cloud for a long period of time, without the burden of keeping copies of the original data, nor of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance. PMID:29498641
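The paper's own protocol is not reproduced here; the sketch below only illustrates the general idea of spot-checking integrity without retrieving or retaining the whole object, using precomputed challenge-response pairs over randomly chosen blocks. All names and parameters are hypothetical placeholders, not the architecture proposed above.

# Illustrative precomputed challenge-response integrity check (not the paper's
# protocol): before uploading, the data owner precomputes H(nonce || block) for
# a few random (nonce, block) pairs and later challenges the storage service to
# reproduce them, so integrity can be spot-checked without keeping a local copy.
import hashlib
import os
import random

BLOCK_SIZE = 4096  # bytes; illustrative choice

def read_block(path: str, index: int) -> bytes:
    with open(path, "rb") as f:
        f.seek(index * BLOCK_SIZE)
        return f.read(BLOCK_SIZE)

def precompute_challenges(path: str, n_blocks: int, n_challenges: int):
    """Owner-side: build a small table of single-use challenges before upload."""
    table = []
    for _ in range(n_challenges):
        index = random.randrange(n_blocks)
        nonce = os.urandom(16)
        digest = hashlib.sha256(nonce + read_block(path, index)).digest()
        table.append((index, nonce, digest))
    return table

def service_response(path: str, index: int, nonce: bytes) -> bytes:
    """Service-side: recompute the digest over the requested block."""
    return hashlib.sha256(nonce + read_block(path, index)).digest()

def verify_once(table, respond) -> bool:
    """Owner-side: consume one precomputed challenge and check the answer."""
    index, nonce, expected = table.pop()
    return respond(index, nonce) == expected

A periodic-monitoring architecture like the one described above would wrap such checks in scheduling, trust scoring, and encryption of the stored content, which this sketch omits.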
Qiao, Guixiu; Weiss, Brian A.
2016-01-01
Unexpected equipment downtime is a ‘pain point’ for manufacturers, especially in that this event usually translates to financial losses. To minimize this pain point, manufacturers are developing new health monitoring, diagnostic, prognostic, and maintenance (collectively known as prognostics and health management (PHM)) techniques to advance the state-of-the-art in their maintenance strategies. The manufacturing community has a wide-range of needs with respect to the advancement and integration of PHM technologies to enhance manufacturing robotic system capabilities. Numerous researchers, including personnel from the National Institute of Standards and Technology (NIST), have identified a broad landscape of barriers and challenges to advancing PHM technologies. One such challenge is the verification and validation of PHM technology through the development of performance metrics, test methods, reference datasets, and supporting tools. Besides documenting and presenting the research landscape, NIST personnel are actively researching PHM for robotics to promote the development of innovative sensing technology and prognostic decision algorithms and to produce a positional accuracy test method that emphasizes the identification of static and dynamic positional accuracy. The test method development will provide manufacturers with a methodology that will allow them to quickly assess the positional health of their robot systems along with supporting the verification and validation of PHM techniques for the robot system. PMID:28058172
Field Evaluation of the Performance of the RTU Challenge Unit: Daikin Rebel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katipamula, Srinivas; Wang, W.; Ngo, Hung
2017-05-31
Packaged rooftop air-conditioning units (RTUs) are used in 44% (2.5 million) of all commercial buildings, serving over 57% (46 billion square feet) of the commercial building floor space in the United States (EIA 2012). The primary energy consumption associated with RTUs is over 2.2 quads annually. Therefore, even a small improvement in efficiency or part-load operation of these units can lead to significant reductions in energy use and carbon emissions. Starting in 2011, the U.S. Department of Energy’s (DOE’s) Building Technologies Office funded a series of projects related to RTUs. Some projects were intended to improve the operating efficiency of the existing RTUs, while others were focused on improving the operating efficiency of new units. This report documents the field-testing and comparison of the seasonal efficiency of a state-of-art RTU Challenge unit and a standard unit. Section II provides the background for the work. Section III describes the measurement and verification plan for the field tests. Section IV describes the measurement and verification evaluation plan. The results are described in Section V. The lessons learned and recommendations for future work are presented in Section VI. A list of references is provided in Section VII.
Qiao, Guixiu; Weiss, Brian A
2016-01-01
Unexpected equipment downtime is a 'pain point' for manufacturers, especially in that this event usually translates to financial losses. To minimize this pain point, manufacturers are developing new health monitoring, diagnostic, prognostic, and maintenance (collectively known as prognostics and health management (PHM)) techniques to advance the state-of-the-art in their maintenance strategies. The manufacturing community has a wide-range of needs with respect to the advancement and integration of PHM technologies to enhance manufacturing robotic system capabilities. Numerous researchers, including personnel from the National Institute of Standards and Technology (NIST), have identified a broad landscape of barriers and challenges to advancing PHM technologies. One such challenge is the verification and validation of PHM technology through the development of performance metrics, test methods, reference datasets, and supporting tools. Besides documenting and presenting the research landscape, NIST personnel are actively researching PHM for robotics to promote the development of innovative sensing technology and prognostic decision algorithms and to produce a positional accuracy test method that emphasizes the identification of static and dynamic positional accuracy. The test method development will provide manufacturers with a methodology that will allow them to quickly assess the positional health of their robot systems along with supporting the verification and validation of PHM techniques for the robot system.
Experimental verification of layout physical verification of silicon photonics
NASA Astrophysics Data System (ADS)
El Shamy, Raghi S.; Swillam, Mohamed A.
2018-02-01
Silicon photonics has been accepted as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology that supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process will assure reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, the PV process is challenging in the case of PICs because it requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and the waveguide bends based on SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models avoid time-consuming 3D EM simulations and can easily be included in any electronic design automation (EDA) flow, as the model parameters can be extracted directly from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions have been fabricated using electron beam lithography and measured. The measurement results for the fabricated devices have been compared to the derived models and show very good agreement. The matching can reach 100% by calibrating certain parameters in the model.
TRAC-PF1 code verification with data from the OTIS test facility. [Once-Through Integral System]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Childerson, M.T.; Fujita, R.K.
1985-01-01
A computer code (TRAC-PF1/MOD1) developed for predicting transient thermal and hydraulic integral nuclear steam supply system (NSSS) response was benchmarked. Post-small break loss-of-coolant accident (LOCA) data from a scaled, experimental facility, designated the Once-Through Integral System (OTIS), were obtained for the Babcock and Wilcox NSSS and compared to TRAC predictions. The OTIS tests provided a challenging small break LOCA data set for TRAC verification. The major phases of a small break LOCA observed in the OTIS tests included pressurizer draining and loop saturation, intermittent reactor coolant system circulation, boiler-condenser mode, and the initial stages of refill. The TRAC code was successful in predicting OTIS loop conditions (system pressures and temperatures) after modification of the steam generator model. In particular, the code predicted both pool and auxiliary-feedwater initiated boiler-condenser mode heat transfer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scribner, R.A.
Sea-launched cruise missiles (SLCMs) present some particularly striking problems for both national security and arms control. These small, dual-purpose, difficult to detect weapons present some formidable challenges for verification in any scheme that attempts to limit rather than eliminate them. Conventionally armed SLCMs offer to the navies of both superpowers important offensive and defensive capabilities. Nuclear armed, long-range, land-attack SLCMs, on the other hand, seem to pose destabilizing threats and otherwise have questionable value, despite strong US support for extensive deployment of them. If these weapons are not constrained, their deployment could circumvent gains which might be made in agreements directly reducing strategic nuclear weapons. This paper reviews the technology and planned deployments of SLCMs, the verification schemes which have been discussed and are being investigated to try to deal with the problem, and examines the proposed need for and possible uses of SLCMs. It presents an overview of the problem technically, militarily, and politically.
EVA Design, Verification, and On-Orbit Operations Support Using Worksite Analysis
NASA Technical Reports Server (NTRS)
Hagale, Thomas J.; Price, Larry R.
2000-01-01
The International Space Station (ISS) design is a very large and complex orbiting structure with thousands of Extravehicular Activity (EVA) worksites. These worksites are used to assemble and maintain the ISS. The challenge facing EVA designers was how to design, verify, and operationally support such a large number of worksites within cost and schedule. This has been solved through the practical use of computer aided design (CAD) graphical techniques that have been developed and used with a high degree of success over the past decade. The EVA design process allows analysts to work concurrently with hardware designers so that EVA equipment can be incorporated and structures configured to allow for EVA access and manipulation. Compliance with EVA requirements is strictly enforced during the design process. These techniques and procedures, coupled with neutral buoyancy underwater testing, have proven most valuable in the development, verification, and on-orbit support of planned or contingency EVA worksites.
Dual-mode capability for hardware-in-the-loop
NASA Astrophysics Data System (ADS)
Vamivakas, A. N.; Jackson, Ron L.
2000-07-01
This paper details a Hardware-in-the-Loop Facility (HIL) developed for evaluation and verification of a missile system with dual mode capability. The missile has the capability of tracking and intercepting a target using either an RF antenna or an IR sensor. The testing of a dual mode system presents a significant challenge in the development of the HIL facility. An IR and RF target environment must be presented simultaneously to the missile under test. These targets, simulated by IR and RF sources, must be presented to the missile under test without interference from each other. The location of each source is critical in the development of the HIL facility. The requirements for building a HIL facility with dual mode capability and the methodology for testing the dual mode system are defined within this paper. Methods for the verification and validation of the facility are discussed.
On Crowd-verification of Biological Networks
Ansari, Sam; Binder, Jean; Boue, Stephanie; Di Fabio, Anselmo; Hayes, William; Hoeng, Julia; Iskandar, Anita; Kleiman, Robin; Norel, Raquel; O’Neel, Bruce; Peitsch, Manuel C.; Poussin, Carine; Pratt, Dexter; Rhrissorrakrai, Kahn; Schlage, Walter K.; Stolovitzky, Gustavo; Talikka, Marja
2013-01-01
Biological networks with a structured syntax are a powerful way of representing biological information generated from high density data; however, they can become unwieldy to manage as their size and complexity increase. This article presents a crowd-verification approach for the visualization and expansion of biological networks. Web-based graphical interfaces allow visualization of causal and correlative biological relationships represented using Biological Expression Language (BEL). Crowdsourcing principles enable participants to communally annotate these relationships based on literature evidence. Gamification principles are incorporated to further engage domain experts throughout biology to gather robust peer-reviewed information from which relationships can be identified and verified. The resulting network models will represent the current status of biological knowledge within the defined boundaries, here processes related to human lung disease. These models are amenable to computational analysis. For some period following conclusion of the challenge, the published models will remain available for continuous use and expansion by the scientific community. PMID:24151423
L(sub 1) Adaptive Flight Control System: Flight Evaluation and Technology Transition
NASA Technical Reports Server (NTRS)
Xargay, Enric; Hovakimyan, Naira; Dobrokhodov, Vladimir; Kaminer, Isaac; Gregory, Irene M.; Cao, Chengyu
2010-01-01
Certification of adaptive control technologies for both manned and unmanned aircraft represents a major challenge for current Verification and Validation techniques. A (missing) key step towards flight certification of adaptive flight control systems is the definition and development of analysis tools and methods to support Verification and Validation for nonlinear systems, similar to the procedures currently used for linear systems. In this paper, we describe and demonstrate the advantages of L(sub 1) adaptive control architectures for closing some of the gaps in certification of adaptive flight control systems, which may facilitate the transition of adaptive control into military and commercial aerospace applications. As illustrative examples, we present the results of a piloted simulation evaluation on the NASA AirSTAR flight test vehicle, and results of an extensive flight test program conducted by the Naval Postgraduate School to demonstrate the advantages of L(sub 1) adaptive control as a verifiable robust adaptive flight control system.
Schumann, A; Priegnitz, M; Schoene, S; Enghardt, W; Rohling, H; Fiedler, F
2016-10-07
Range verification and dose monitoring in proton therapy are considered highly desirable. Different methods have been developed worldwide, such as particle therapy positron emission tomography (PT-PET) and prompt gamma imaging (PGI). In general, these methods allow for a verification of the proton range. However, quantification of the dose from these measurements remains challenging. For the first time, we present an approach for estimating the dose from prompt γ-ray emission profiles. It combines a filtering procedure based on Gaussian-powerlaw convolution with an evolutionary algorithm. By means of convolving depth dose profiles with an appropriate filter kernel, prompt γ-ray depth profiles are obtained. In order to reverse this step, the evolutionary algorithm is applied. The feasibility of this approach is demonstrated for a spread-out Bragg peak in a water target.
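The authors' Gaussian-powerlaw kernel and the evolutionary inversion are not reproduced here; purely as a hedged sketch of the forward (filtering) step just described, a depth-dose profile can be convolved with a smoothing kernel, with a plain Gaussian standing in for the real kernel and all numbers as placeholders:

# Illustrative forward step only: smooth a depth-dose-like profile into a
# prompt-gamma-like depth profile. A plain Gaussian kernel stands in for the
# Gaussian-powerlaw kernel of the paper; parameters are placeholders.
import numpy as np

def gaussian_kernel(sigma_mm: float, dz_mm: float) -> np.ndarray:
    z = np.arange(-4.0 * sigma_mm, 4.0 * sigma_mm + dz_mm, dz_mm)
    k = np.exp(-0.5 * (z / sigma_mm) ** 2)
    return k / k.sum()

def forward_filter(depth_dose: np.ndarray, sigma_mm: float, dz_mm: float) -> np.ndarray:
    """Forward model: depth-dose profile -> smoothed, gamma-like depth profile."""
    return np.convolve(depth_dose, gaussian_kernel(sigma_mm, dz_mm), mode="same")

Recovering the dose from a measured profile then amounts to inverting this filtering step, which is the role the evolutionary algorithm plays in the paper.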
Verifying detailed fluctuation relations for discrete feedback-controlled quantum dynamics
NASA Astrophysics Data System (ADS)
Camati, Patrice A.; Serra, Roberto M.
2018-04-01
Discrete quantum feedback control consists of a managed dynamics according to the information acquired by a previous measurement. Energy fluctuations along such dynamics satisfy generalized fluctuation relations, which are useful tools to study the thermodynamics of systems far away from equilibrium. Due to the practical challenge to assess energy fluctuations in the quantum scenario, the experimental verification of detailed fluctuation relations in the presence of feedback control remains elusive. We present a feasible method to experimentally verify detailed fluctuation relations for discrete feedback control quantum dynamics. Two detailed fluctuation relations are developed and employed. The method is based on a quantum interferometric strategy that allows the verification of fluctuation relations in the presence of feedback control. An analytical example to illustrate the applicability of the method is discussed. The comprehensive technique introduced here can be experimentally implemented at a microscale with the current technology in a variety of experimental platforms.
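The two detailed relations derived in the paper are not reproduced here; as background, a widely used generalized (integral) fluctuation relation with feedback, due to Sagawa and Ueda, has the form

\[ \left\langle e^{-\beta\,(W - \Delta F) - I} \right\rangle = 1, \]

where $W$ is the work performed on the system, $\Delta F$ the free-energy difference, $\beta$ the inverse temperature, and $I$ the mutual information gained by the measurement that conditions the feedback. Detailed relations of the kind discussed above compare the probabilities of forward and time-reversed trajectories rather than a single average.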
An Investigation into Solution Verification for CFD-DEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fullmer, William D.; Musser, Jordan
This report presents the study of the convergence behavior of the computational fluid dynamics-discrete element method (CFD-DEM), specifically National Energy Technology Laboratory’s (NETL) open source MFiX code (MFiX-DEM) with a diffusion based particle-to-continuum filtering scheme. In particular, this study focused on determining if the numerical method had a solution in the high-resolution limit where the grid size is smaller than the particle size. To address this uncertainty, fixed particle beds of two primary configurations were studied: i) fictitious beds where the particles are seeded with a random particle generator, and ii) instantaneous snapshots from a transient simulation of an experimentally relevant problem. Both problems considered a uniform inlet boundary and a pressure outflow. The CFD grid was refined from a few particle diameters down to 1/6th of a particle diameter. The pressure drop between two vertical elevations, averaged across the bed cross-section, was considered as the system response quantity of interest. A least-squares regression method was used to extrapolate the grid-dependent results to an approximate “grid-free” solution in the limit of infinite resolution. The results show that the diffusion based scheme does yield a converging solution. However, the convergence is more complicated than encountered in simpler, single-phase flow problems, showing strong oscillations and, at times, oscillations superimposed on top of globally non-monotonic behavior. The challenging convergence behavior highlights the importance of using at least four grid resolutions in solution verification problems so that (over-determined) regression-based extrapolation methods may be applied to approximate the grid-free solution. The grid-free solution is very important in solution verification and VVUQ exercises in general, as the difference between it and the reference solution largely determines the numerical uncertainty. By testing different randomized particle configurations of the same general problem (for the fictitious case) or different instances of freezing a transient simulation, the numerical uncertainties appeared to be on the same order of magnitude as ensemble or time averaging uncertainties. By testing different drag laws, almost all cases studied show that model form uncertainty in this one, very important closure relation was larger than the numerical uncertainty, at least with a reasonable CFD grid, roughly five particle diameters. In this study, the diffusion width (filtering length scale) was mostly set at a constant of six particle diameters. A few exploratory tests were performed to show that similar convergence behavior was observed for diffusion widths greater than approximately two particle diameters. However, this subject was not investigated in great detail because determining an appropriate filter size is really a validation question which must be determined by comparison to experimental or highly accurate numerical data. Future studies are being considered targeting solution verification of transient simulations as well as validation of the filter size with direct numerical simulation data.
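As a sketch of the regression-based extrapolation mentioned above (with made-up resolutions and pressure drops, not data from the report), the grid-dependent results can be fit to a power-law convergence model and the intercept taken as the approximate grid-free value:

# Sketch of least-squares extrapolation to an approximate "grid-free" value
# from results at several grid resolutions. The grid spacings and pressure
# drops below are placeholders, not data from the report.
import numpy as np
from scipy.optimize import curve_fit

def convergence_model(h, q0, c, p):
    """Power-law convergence model: system response q(h) = q0 + c * h**p."""
    return q0 + c * h**p

h = np.array([5.0, 2.0, 1.0, 0.5, 0.25])              # grid size in particle diameters
dp = np.array([118.0, 112.0, 109.5, 108.6, 108.2])    # placeholder pressure drops

(q0, c, p), _ = curve_fit(convergence_model, h, dp, p0=[dp.min(), 1.0, 1.0])
print(f"approximate grid-free pressure drop: {q0:.1f}")
print(f"observed order of convergence: {p:.2f}")

Using more grid levels than fit parameters keeps the regression over-determined, which is why the report recommends at least four resolutions.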
Degeling, Koen; Koffijberg, Hendrik; IJzerman, Maarten J
2017-02-01
The ongoing development of genomic medicine and the use of molecular and imaging markers in personalized medicine (PM) has arguably challenged the field of health economic modeling (HEM). This study aims to provide detailed insights into the current status of HEM in PM, in order to identify if and how modeling methods are used to address the challenges described in the literature. Areas covered: A review was performed on studies that simulate health economic outcomes for personalized clinical pathways. Decision tree modeling and Markov modeling were the most frequently observed methods. Not all of the identified challenges were encountered often; challenges regarding companion diagnostics, diagnostic performance, and evidence gaps were found most often. However, the extent to which challenges were addressed varied considerably between studies. Expert commentary: Challenges for HEM in PM are not yet routinely addressed, which may indicate that either (1) their impact is less severe than expected, (2) they are hard to address and therefore not managed appropriately, or (3) HEM in PM is still in an early stage. As evidence on the impact of these challenges is still lacking, we believe that more concrete examples are needed to illustrate the identified challenges and to demonstrate methods to handle them.
2010 Panel on the Biomaterials Grand Challenges
Reichert, William “Monty”; Ratner, Buddy D.; Anderson, James; Coury, Art; Hoffman, Allan S.; Laurencin, Cato T.; Tirrell, David
2014-01-01
In 2009, the National Academy of Engineering issued the Grand Challenges for Engineering in the 21st Century, comprised of 14 technical challenges that must be addressed to build a healthy, profitable, sustainable, and secure global community (http://www.engineeringchallenges.org). Although crucial, none of the NAE Grand Challenges adequately addressed the challenges that face the biomaterials community. In response to the NAE Grand Challenges, Monty Reichert of Duke University organized a panel entitled Grand Challenges in Biomaterials at the 2010 Society for Biomaterials Annual Meeting in Seattle. Six members of the National Academies—Buddy Ratner, James Anderson, Allan Hoffman, Art Coury, Cato Laurencin, and David Tirrell—were asked to propose a grand challenge to the audience that, if met, would significantly impact the future of biomaterials and medical devices. Successfully meeting these challenges will speed the 60-plus year transition from commodity, off-the-shelf biomaterials to bioengineered chemistries, and biomaterial devices that will significantly advance our ability to address patient needs and also to create new market opportunities. PMID:21171147
Orion GN&C Fault Management System Verification: Scope And Methodology
NASA Technical Reports Server (NTRS)
Brown, Denise; Weiler, David; Flanary, Ronald
2016-01-01
In order to ensure long-term ability to meet mission goals and to provide for the safety of the public, ground personnel, and any crew members, nearly all spacecraft include a fault management (FM) system. For a manned vehicle such as Orion, the safety of the crew is of paramount importance. The goal of the Orion Guidance, Navigation and Control (GN&C) fault management system is to detect, isolate, and respond to faults before they can result in harm to the human crew or loss of the spacecraft. Verification of fault management/fault protection capability is challenging due to the large number of possible faults in a complex spacecraft, the inherent unpredictability of faults, the complexity of interactions among the various spacecraft components, and the inability to easily quantify human reactions to failure scenarios. The Orion GN&C Fault Detection, Isolation, and Recovery (FDIR) team has developed a methodology for bounding the scope of FM system verification while ensuring sufficient coverage of the failure space and providing high confidence that the fault management system meets all safety requirements. The methodology utilizes a swarm search algorithm to identify failure cases that can result in catastrophic loss of the crew or the vehicle and rare event sequential Monte Carlo to verify safety and FDIR performance requirements.
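The report's swarm search and fault models are not detailed in the abstract; as a generic sketch of the idea, a simple particle swarm can search a fault-parameter space for scenarios that maximize a severity measure returned by a simulation. All names and parameters below are hypothetical placeholders, not the Orion FDIR tooling.

# Generic particle swarm search over a fault-parameter space for high-severity
# cases; `severity` stands in for a simulation-based measure of how close a
# fault scenario comes to a catastrophic outcome.
import numpy as np

def swarm_search(severity, bounds, n_particles=30, n_iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds)[:, 0], np.asarray(bounds)[:, 1]
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))    # particle positions
    v = np.zeros_like(x)                                     # particle velocities
    pbest, pbest_val = x.copy(), np.array([severity(p) for p in x])
    gbest = pbest[np.argmax(pbest_val)]
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([severity(p) for p in x])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmax(pbest_val)]
    return gbest, pbest_val.max()

The worst cases found this way would then feed the rare-event Monte Carlo step mentioned above, which estimates how likely such sequences are.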
A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables
NASA Astrophysics Data System (ADS)
Huang, Laura X.; Isaac, George A.; Sheng, Grant
2014-01-01
This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as background gridded data for generating the integrated nowcasts. Seven forecast variables of temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models among 15 sites, the integrated weighted model was found to produce more accurate forecasts for the 7 selected forecast variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
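For a dichotomous (yes/no) event, the Heidke skill score used in this kind of categorical verification can be computed from a 2x2 contingency table; the sketch below shows the two-category form (the study uses the multi-category generalization), with made-up counts:

# Two-category (yes/no) Heidke skill score from a 2x2 contingency table;
# it measures skill relative to random chance: 1 = perfect, 0 = no skill.
def heidke_skill_score(hits: int, false_alarms: int,
                       misses: int, correct_negatives: int) -> float:
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    numerator = 2.0 * (a * d - b * c)
    denominator = (a + c) * (c + d) + (a + b) * (b + d)
    return numerator / denominator

# Example with made-up counts:
print(heidke_skill_score(hits=42, false_alarms=8, misses=10, correct_negatives=140))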
Probabilistic Elastic Part Model: A Pose-Invariant Representation for Real-World Face Verification.
Li, Haoxiang; Hua, Gang
2018-04-01
Pose variation remains a major challenge for real-world face recognition. We approach this problem through a probabilistic elastic part model. We extract local descriptors (e.g., LBP or SIFT) from densely sampled multi-scale image patches. By augmenting each descriptor with its location, a Gaussian mixture model (GMM) is trained to capture the spatial-appearance distribution of the face parts of all face images in the training corpus, namely the probabilistic elastic part (PEP) model. Each mixture component of the GMM is confined to be a spherical Gaussian to balance the influence of the appearance and the location terms, which naturally defines a part. Given one or multiple face images of the same subject, the PEP-model builds its PEP representation by sequentially concatenating descriptors identified by each Gaussian component in a maximum likelihood sense. We further propose a joint Bayesian adaptation algorithm to adapt the universally trained GMM to better model the pose variations between the target pair of faces/face tracks, which consistently improves face verification accuracy. Our experiments show that we achieve state-of-the-art face verification accuracy with the proposed representations on the Labeled Face in the Wild (LFW) dataset, the YouTube video face database, and the CMU MultiPIE dataset.
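A rough sketch of the core construction just described (location-augmented descriptors, a spherical GMM, and a per-component selection) is given below; descriptor extraction and the joint Bayesian adaptation step are omitted, and the names and the posterior-based selection are illustrative stand-ins rather than the authors' implementation.

# Rough sketch of the PEP idea: fit a spherical GMM over location-augmented
# local descriptors, then represent an image (or image set) by the descriptor
# assigned most strongly to each mixture component.
import numpy as np
from sklearn.mixture import GaussianMixture

def augment(descriptors: np.ndarray, locations: np.ndarray) -> np.ndarray:
    """Append the (x, y) patch location to each local descriptor."""
    return np.hstack([descriptors, locations])

def train_pep_model(descriptors: np.ndarray, locations: np.ndarray,
                    n_parts: int = 256) -> GaussianMixture:
    gmm = GaussianMixture(n_components=n_parts, covariance_type="spherical")
    gmm.fit(augment(descriptors, locations))
    return gmm

def pep_representation(gmm: GaussianMixture, descriptors: np.ndarray,
                       locations: np.ndarray) -> np.ndarray:
    """For each component, keep the descriptor with the largest posterior
    (a simple proxy for the paper's maximum-likelihood selection), then
    concatenate the selected descriptors into one long vector."""
    resp = gmm.predict_proba(augment(descriptors, locations))  # (n_desc, n_parts)
    best = np.argmax(resp, axis=0)                             # one descriptor per part
    return np.concatenate([descriptors[i] for i in best])

Verification then reduces to comparing the PEP representations of two faces or face tracks, for example with a learned similarity metric.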
4D offline PET-based treatment verification in scanned ion beam therapy: a phantom study
NASA Astrophysics Data System (ADS)
Kurz, Christopher; Bauer, Julia; Unholtz, Daniel; Richter, Daniel; Stützer, Kristin; Bert, Christoph; Parodi, Katia
2015-08-01
At the Heidelberg Ion-Beam Therapy Center, patient irradiation with scanned proton and carbon ion beams is verified by offline positron emission tomography (PET) imaging: the β+-activity measured within the patient is compared to a prediction calculated on the basis of the treatment planning data in order to identify potential delivery errors. Currently, this monitoring technique is limited to the treatment of static target structures. However, intra-fractional organ motion imposes considerable additional challenges to scanned ion beam radiotherapy. In this work, the feasibility and potential of time-resolved (4D) offline PET-based treatment verification with a commercial full-ring PET/CT (x-ray computed tomography) device are investigated for the first time, based on an experimental campaign with moving phantoms. Motion was monitored during the gated beam delivery as well as the subsequent PET acquisition and was taken into account in the corresponding 4D Monte-Carlo simulations and data evaluation. Under the given experimental conditions, millimeter agreement between the prediction and measurement was found. Dosimetric consequences due to the phantom motion could be reliably identified. The agreement between PET measurement and prediction in the presence of motion was found to be similar to that in static reference measurements, thus demonstrating the potential of 4D PET-based treatment verification for future clinical applications.
Advanced Avionics Verification and Validation Phase II (AAV&V-II)
1999-01-01
Algorithm 2-8 2.7 The Weak Control Dependence Algorithm 2-8 2.8 The Indirect Dependence Algorithms 2-9 2.9 Improvements to the Pleiades Object...describes some modifications made to the Pleiades object management system to increase the speed of the analysis. 2.1 THE INTERPROCEDURAL CONTROL FLOW...slow as the edges in the graph increased. The time to insert edges was addressed by enhancements to the Pleiades object management system, which are
NASA Technical Reports Server (NTRS)
Thresher, R. W. (Editor)
1981-01-01
Recent progress in the analysis and prediction of the dynamic behavior of wind turbine generators is discussed. The following areas were addressed: (1) the adequacy of state-of-the-art analysis tools for designing the next generation of wind power systems; (2) the use of state-of-the-art analysis tools by designers; and (3) verification of theory that might be lacking or inadequate. Summaries of these informative discussions as well as the questions and answers which followed each paper are documented in the proceedings.
Advanced orbiting systems test-bedding and protocol verification
NASA Technical Reports Server (NTRS)
Noles, James; De Gree, Melvin
1989-01-01
The Consultative Committee for Space Data Systems (CCSDS) has begun the development of a set of protocol recommendations for Advanced Orbiting Systems (AOS). The AOS validation program and formal definition of AOS protocols are reviewed, and the configuration control of the AOS formal specifications is summarized. Independent implementations of the AOS protocols by NASA and ESA are discussed, and cross-support/interoperability tests which will allow the space agencies of various countries to share AOS communication facilities are addressed.
Shah, Peer Azmat; Hasbullah, Halabi B; Lawal, Ibrahim A; Aminu Mu'azu, Abubakar; Tang Jung, Low
2014-01-01
Due to the proliferation of handheld mobile devices, multimedia applications like Voice over IP (VoIP), video conferencing, network music, and online gaming have gained popularity in recent years. These applications are well known to be delay sensitive and resource demanding. The mobility of the mobile devices running these applications across different networks causes delay and service disruption. Mobile IPv6 was proposed to provide mobility support to IPv6-based mobile nodes for continuous communication when they roam across different networks. However, the Route Optimization procedure in Mobile IPv6 involves the verification of the mobile node's reachability at the home address and at the care-of address (home test and care-of test), which results in higher handover delays and signalling overhead. This paper presents an enhanced procedure, time-based one-time password Route Optimization (TOTP-RO), for Mobile IPv6 Route Optimization that uses the concepts of a shared secret Token and a time-based one-time password (TOTP), along with verification of the mobile node via direct communication and maintenance of the status of the correspondent node's compatibility. TOTP-RO was implemented in the network simulator NS-2 and an analytical evaluation was also performed. The analysis showed that TOTP-RO has lower handover delays, packet loss, and signalling overhead, with an increased level of security, as compared to the standard Mobile IPv6 Return-Routability-based Route Optimization (RR-RO).
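The TOTP tokens referred to above follow the standard RFC 6238 construction (an HMAC over a time-step counter, dynamically truncated to a short code); the sketch below shows generation and verification with a shared secret as a generic illustration, not the TOTP-RO signalling procedure itself.

# Minimal RFC 6238-style TOTP generation and verification over a shared secret,
# as a generic illustration of the tokens used in TOTP-RO.
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, timestep: int = 30, digits: int = 6,
         at_time: float | None = None) -> str:
    counter = int((time.time() if at_time is None else at_time) // timestep)
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

def verify_totp(secret: bytes, token: str, timestep: int = 30, window: int = 1) -> bool:
    """Accept tokens from the current time step and +/- `window` adjacent steps."""
    now = time.time()
    return any(hmac.compare_digest(totp(secret, timestep, at_time=now + k * timestep), token)
               for k in range(-window, window + 1))

In a scheme like the one described above, such a token would accompany the reachability signalling so the correspondent node can check freshness without the full home test/care-of test exchange.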
Diagnostic decision-making and strategies to improve diagnosis.
Thammasitboon, Satid; Cutrer, William B
2013-10-01
A significant portion of diagnostic errors arises through cognitive errors resulting from inadequate knowledge, faulty data gathering, and/or faulty verification. Experts estimate that 75% of diagnostic failures can be attributed to failures in clinicians' diagnostic thinking. The cognitive processes that underlie the diagnostic thinking of clinicians are complex and intriguing, and it is imperative that clinicians acquire an explicit appreciation and application of different cognitive approaches in order to make better decisions. A dual-process model that unifies many theories of decision-making has emerged as a promising template for understanding how clinicians think and judge efficiently in a diagnostic reasoning process. The identification and implementation of strategies for decreasing or preventing such diagnostic errors has become a growing area of interest and research. Suggested strategies to decrease the incidence of diagnostic error include increasing clinicians' clinical expertise and avoiding inherent cognitive errors in order to make better decisions. Implementing interventions focused solely on avoiding errors may work effectively for patient safety issues such as medication errors. Addressing cognitive errors, however, requires equal effort on expanding the individual clinician's expertise. Providing cognitive support to clinicians for robust diagnostic decision-making serves as the final strategic target for decreasing diagnostic errors. Clinical guidelines and algorithms offer another method for streamlining decision-making and decreasing the likelihood of cognitive diagnostic errors. Addressing cognitive processing errors is undeniably the most challenging task in reducing diagnostic errors. While many suggested approaches exist, they are mostly based on theories and sciences in cognitive psychology, decision-making, and education. The proposed interventions are primarily suggestions and very few of them have been tested in actual practice settings. A collaborative research effort is required to effectively address cognitive processing errors. Researchers in various areas, including patient safety/quality improvement, decision-making, and problem solving, must work together to make medical diagnosis more reliable. © 2013 Mosby, Inc. All rights reserved.
ERIC Educational Resources Information Center
Dolezel, Diane M.; Morrison, Eileen E.
2017-01-01
Health information management (HIM) professionals must address ethical challenges in their role as guardians of patients' personal information and organizations' proprietary information. Because of this need, HIM educators strive to prepare their students to address these challenges. Unfortunately, little evidence exists about specific areas of…
Ethical Challenges in the Teaching of Multicultural Course Work
ERIC Educational Resources Information Center
Fier, Elizabeth Boyer; Ramsey, MaryLou
2005-01-01
The authors explore the ethical issues and challenges frequently encountered by counselor educators of multicultural course work. Existing ethics codes are examined, and the need for greater specificity with regard to teaching courses of multicultural content is addressed. Options for revising existing codes to better address the challenges of…
78 FR 68449 - Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-14
... RFA-HS14-002, Addressing Methodological Challenges in Research for Patients With Multiple Chronic... applications for the ``AHRQ RFA-HS14-002, Addressing Methodological Challenges in Research for Patients With...
Quantitative safety assessment of air traffic control systems through system control capacity
NASA Astrophysics Data System (ADS)
Guo, Jingjing
Quantitative Safety Assessments (QSA) are essential to safety benefit verification and regulations of developmental changes in safety critical systems like the Air Traffic Control (ATC) systems. Effectiveness of the assessments is particularly desirable today in the safe implementations of revolutionary ATC overhauls like NextGen and SESAR. QSA of ATC systems are however challenged by system complexity and lack of accident data. Extending from the idea "safety is a control problem" in the literature, this research proposes to assess system safety from the control perspective, through quantifying a system's "control capacity". A system's safety performance correlates to this "control capacity" in the control of "safety critical processes". To examine this idea in QSA of the ATC systems, a Control-capacity Based Safety Assessment Framework (CBSAF) is developed which includes two control capacity metrics and a procedural method. The two metrics are Probabilistic System Control-capacity (PSC) and Temporal System Control-capacity (TSC); each addresses an aspect of a system's control capacity. The procedural method consists of three general stages: I) identification of safety critical processes, II) development of system control models and III) evaluation of system control capacity. The CBSAF was tested in two case studies. The first one assesses an en-route collision avoidance scenario and compares three hypothetical configurations. The CBSAF was able to capture the uncoordinated behavior between two means of control, as was observed in a historic midair collision accident. The second case study compares CBSAF with an existing risk based QSA method in assessing the safety benefits of introducing a runway incursion alert system. Similar conclusions are reached between the two methods, while the CBSAF has the advantage of simplicity and provides a new control-based perspective and interpretation to the assessments. The case studies are intended to investigate the potential and demonstrate the utility of CBSAF and are not intended as thorough studies of collision avoidance and runway incursion safety, which are extremely challenging problems. Further development and thorough validation are required to allow CBSAF to reach implementation phases, e.g. addressing the issues of limited scalability and subjectivity.
Verifying the Comprehensive Nuclear-Test-Ban Treaty by Radioxenon Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ringbom, Anders
2005-05-24
The current status of the ongoing establishment of a verification system for the Comprehensive Nuclear-Test-Ban Treaty using radioxenon detection is discussed. As an example of equipment used in this application the newly developed fully automatic noble gas sampling and detection system SAUNA is described, and data collected with this system are discussed. It is concluded that the most important remaining scientific challenges in the field concern event categorization and meteorological backtracking.
Navy DDG-51 and DDG-1000 Destroyer Programs: Background and Issues for Congress
2013-10-22
two technologies previously identified as the most challenging — digital-beam-forming and transmit-receive modules—have been demonstrated in a...job of coming up with an affordable solution to a leap-ahead capability for the fleet.”31 In his presentation, Vandroff showed a slide comparing the...foreign ballistic missile data in support of international treaty verification. CJR represents an integrated mission solution : ship, radar suite, and
NASA Technical Reports Server (NTRS)
Jung, David S,; Lee, Leonine S.; Manzo, Michelle A.
2010-01-01
This NASA Aerospace Flight Battery Systems Working Group was chartered within the NASA Engineering and Safety Center (NESC). The Battery Working Group was tasked to complete tasks and to propose proactive work to address battery related, agency-wide issues on an annual basis. In its first year of operation, this proactive program addressed various aspects of the validation and verification of aerospace battery systems for NASA missions. Studies were performed, issues were discussed and in many cases, test programs were executed to generate recommendations and guidelines to reduce risk associated with various aspects of implementing battery technology in the aerospace industry. This document contains Part 3 - Volume II Appendices to Part 3 - Volume I.
NASA Technical Reports Server (NTRS)
Jung, David S.; Manzo, Michelle A.
2010-01-01
This NASA Aerospace Flight Battery Systems Working Group was chartered within the NASA Engineering and Safety Center (NESC). The Battery Working Group was tasked to complete tasks and to propose proactive work to address battery related, agency-wide issues on an annual basis. In its first year of operation, this proactive program addressed various aspects of the validation and verification of aerospace battery systems for NASA missions. Studies were performed, issues were discussed and in many cases, test programs were executed to generate recommendations and guidelines to reduce risk associated with various aspects of implementing battery technology in the aerospace industry. This document contains Part 2 - Volume II Appendix A to Part 2 - Volume I.
NASA Astrophysics Data System (ADS)
Fredlund, T.; Linder, C.; Airey, J.
2015-09-01
In this article we characterize transient learning challenges as learning challenges that arise out of teaching situations rather than conflicts with prior knowledge. We propose that these learning challenges can be identified by paying careful attention to the representations that students produce. Once a transient learning challenge has been identified, teachers can create interventions to address it. By illustration, we argue that an appropriate way to design such interventions is to create variation around the disciplinary-relevant aspects associated with the transient learning challenge.
ERIC Educational Resources Information Center
Comptroller General of the U.S., Washington, DC.
This report addresses the major performance and management challenges that have limited the effectiveness of the Department of Education in carrying out its mission. The booklet addresses corrective actions that Education has taken or initiated on these challenges--including a number of management initiatives to improve controls over the…
NREL, SolarCity Addressing Challenges of High Penetrations of Distributed Photovoltaics
NREL is working with SolarCity to address the reliability and stability challenges of interconnecting high penetrations of distributed photovoltaics (PV), demonstrating that distributed solar is not a liability for reliability and can even be an asset.
ERIC Educational Resources Information Center
Alkaher, Iris; Tal, Tali
2016-01-01
This interpretive study identifies challenges of working with Bedouin and Jewish Israeli youth in two multicultural projects: education for sustainability and place-conscious education. It also describes the ways the adult project leaders addressed these challenges and their views on the effectiveness of their decisions. Participants comprised 16…
Quantifying and managing uncertainty in operational modal analysis
NASA Astrophysics Data System (ADS)
Au, Siu-Kui; Brownjohn, James M. W.; Mottershead, John E.
2018-03-01
Operational modal analysis aims at identifying the modal properties (natural frequency, damping, etc.) of a structure using only the (output) vibration response measured under ambient conditions. Highly economical and feasible, it is becoming a common practice in full-scale vibration testing. In the absence of (input) loading information, however, the modal properties have significantly higher uncertainty than their counterparts identified from free or forced vibration (known input) tests. Mastering the relationship between identification uncertainty and test configuration is of great interest to both scientists and engineers, e.g., for achievable precision limits and test planning/budgeting. Addressing this challenge beyond the current state-of-the-art, which is mostly concerned with identification algorithms, this work obtains closed form analytical expressions for the identification uncertainty (variance) of modal parameters that fundamentally explain the effect of test configuration. Collectively referred to as 'uncertainty laws', these expressions are asymptotically correct for well-separated modes, small damping and long data; and are applicable under non-asymptotic situations. They provide a scientific basis for planning and standardization of ambient vibration tests, where factors such as channel noise, sensor number and location can be quantitatively accounted for. The work is reported comprehensively with verification through synthetic and experimental data (laboratory and field), scientific implications and practical guidelines for planning ambient vibration tests.
Validation of a multi-layer Green's function code for ion beam transport
NASA Astrophysics Data System (ADS)
Walker, Steven; Tweed, John; Tripathi, Ram; Badavi, Francis F.; Miller, Jack; Zeitlin, Cary; Heilbronn, Lawrence
To meet the challenge of future deep space programs, an accurate and efficient engineering code for analyzing the shielding requirements against high-energy galactic heavy radiations is needed. In consequence, a new version of the HZETRN code capable of simulating high charge and energy (HZE) ions with either laboratory or space boundary conditions is currently under development. The new code, GRNTRN, is based on a Green's function approach to the solution of Boltzmann's transport equation and like its predecessor is deterministic in nature. The computational model consists of the lowest order asymptotic approximation followed by a Neumann series expansion with non-perturbative corrections. The physical description includes energy loss with straggling, nuclear attenuation, nuclear fragmentation with energy dispersion and down shift. Code validation in the laboratory environment is addressed by showing that GRNTRN accurately predicts energy loss spectra as measured by solid-state detectors in ion beam experiments with multi-layer targets. In order to validate the code with space boundary conditions, measured particle fluences are propagated through several thicknesses of shielding using both GRNTRN and the current version of HZETRN. The excellent agreement obtained indicates that GRNTRN accurately models the propagation of HZE ions in the space environment as well as in laboratory settings and also provides verification of the HZETRN propagator.
The Hydrologic Ensemble Prediction Experiment (HEPEX)
NASA Astrophysics Data System (ADS)
Wood, Andy; Wetterhall, Fredrik; Ramos, Maria-Helena
2015-04-01
The Hydrologic Ensemble Prediction Experiment was established in March, 2004, at a workshop hosted by the European Center for Medium Range Weather Forecasting (ECMWF), and co-sponsored by the US National Weather Service (NWS) and the European Commission (EC). The HEPEX goal was to bring the international hydrological and meteorological communities together to advance the understanding and adoption of hydrological ensemble forecasts for decision support. HEPEX pursues this goal through research efforts and practical implementations involving six core elements of a hydrologic ensemble prediction enterprise: input and pre-processing, ensemble techniques, data assimilation, post-processing, verification, and communication and use in decision making. HEPEX has grown through meetings that connect the user, forecast producer and research communities to exchange ideas, data and methods; the coordination of experiments to address specific challenges; and the formation of testbeds to facilitate shared experimentation. In the last decade, HEPEX has organized over a dozen international workshops, as well as sessions at scientific meetings (including AMS, AGU and EGU) and special issues of scientific journals where workshop results have been published. Through these interactions and an active online blog (www.hepex.org), HEPEX has built a strong and active community of nearly 400 researchers & practitioners around the world. This poster presents an overview of recent and planned HEPEX activities, highlighting case studies that exemplify the focus and objectives of HEPEX.
Model Checking A Self-Stabilizing Synchronization Protocol for Arbitrary Digraphs
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2012-01-01
This report presents the mechanical verification of a self-stabilizing distributed clock synchronization protocol for arbitrary digraphs in the absence of faults. This protocol does not rely on assumptions about the initial state of the system, other than the presence of at least one node, and no central clock or a centrally generated signal, pulse, or message is used. The system under study is an arbitrary, non-partitioned digraph ranging from fully connected to 1-connected networks of nodes while allowing for differences in the network elements. Nodes are anonymous, i.e., they do not have unique identities. There is no theoretical limit on the maximum number of participating nodes. The only constraint on the behavior of the node is that the interactions with other nodes are restricted to defined links and interfaces. This protocol deterministically converges within a time bound that is a linear function of the self-stabilization period. A bounded model of the protocol is verified using the Symbolic Model Verifier (SMV) for a subset of digraphs. Modeling challenges of the protocol and the system are addressed. The model checking effort is focused on verifying correctness of the bounded model of the protocol as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period.
NASA Astrophysics Data System (ADS)
Zhao, Yongli; Hu, Liyazhou; Wang, Wei; Li, Yajie; Zhang, Jie
2017-01-01
With the continuous opening of resource acquisition and application, a large variety of network hardware appliances are deployed as the communication infrastructure. Launching a new network application typically implies replacing obsolete devices and providing the space and power to accommodate the new ones, which increases energy and capital investment. Network function virtualization (NFV) aims to address these problems by consolidating many types of network equipment onto industry-standard elements such as servers, switches and storage. Many types of IT resources have been deployed to run Virtual Network Functions (vNFs), such as virtual switches and routers. How to deploy NFV in optical transport networks is therefore a problem of great importance. This paper focuses on this problem and gives an implementation architecture for NFV-enabled optical transport networks based on Software Defined Optical Networking (SDON), including the procedure for vNF call and return. In particular, an implementation solution for an NFV-enabled optical transport node is designed, and a parallel processing method for NFV-enabled OTN nodes is proposed. To verify the performance of NFV-enabled SDON, the protocol interaction procedures of control function virtualization and node function virtualization are demonstrated on an SDON testbed. Finally, the benefits and challenges of the parallel processing method for NFV-enabled OTN nodes are simulated and analyzed.
Library of Advanced Materials for Engineering (LAME) 4.44.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scherzinger, William M.; Lester, Brian T.
Accurate and efficient constitutive modeling remains a cornerstone issue for solid mechanics analysis. Over the years, the LAME advanced material model library has grown to address this challenge by implementing models capable of describing material systems spanning soft polymers to stiff ceramics, including both isotropic and anisotropic responses. Inelastic behaviors including (visco)plasticity, damage, and fracture have all been incorporated for use in various analyses. This multitude of options and flexibility, however, comes with many capabilities, features, and responses, and the ensuing complexity of the resulting implementation. Therefore, to enhance confidence and enable the utilization of the LAME library in application, this effort seeks to document and verify the various models in the LAME library. Specifically, the broader strategy, organization, and interface of the library itself is first presented. The physical theory, numerical implementation, and user guide for a large set of models is then discussed. Importantly, a number of verification tests are performed with each model to not only build confidence in the model itself but also highlight some important response characteristics and features that may be of interest to end-users. Finally, looking ahead to the future, approaches to add material models to this library and further expand its capabilities are presented.
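As a hedged illustration of what a single entry in such a material library does (this is not the actual LAME interface; the function name and signature are hypothetical), the sketch below implements the simplest isotropic linear elastic constitutive update, mapping a small-strain tensor to a stress tensor via the Lamé parameters.

```python
import numpy as np

def linear_elastic_stress(strain: np.ndarray, youngs_modulus: float, poissons_ratio: float) -> np.ndarray:
    """Return the stress for an isotropic, linear elastic material.

    strain : 3x3 symmetric small-strain tensor.
    Constitutive law: sigma = lambda*tr(eps)*I + 2*mu*eps, with Lame parameters from E and nu.
    """
    lam = youngs_modulus * poissons_ratio / ((1 + poissons_ratio) * (1 - 2 * poissons_ratio))
    mu = youngs_modulus / (2 * (1 + poissons_ratio))
    return lam * np.trace(strain) * np.eye(3) + 2.0 * mu * strain

# Verification-style check: uniaxial strain should produce the expected axial and lateral stresses.
eps = np.zeros((3, 3))
eps[0, 0] = 1e-3
sigma = linear_elastic_stress(eps, youngs_modulus=200e9, poissons_ratio=0.3)
print(sigma[0, 0], sigma[1, 1])
```

Simple closed-form checks of this kind are exactly the sort of per-model verification test the documentation effort describes.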
Induced polarization for characterizing and monitoring soil stabilization processes
NASA Astrophysics Data System (ADS)
Saneiyan, S.; Ntarlagiannis, D.; Werkema, D. D., Jr.
2017-12-01
Soil stabilization is critical in addressing engineering problems related to building foundation support, road construction and soil erosion, among others. To increase soil strength, the stiffness of the soil is enhanced through injection/precipitation of chemical agents or minerals. Methods such as cement injection and microbially induced carbonate precipitation (MICP) are commonly applied. Verification of a successful soil stabilization project is often challenging, as treatment areas are spatially extensive and invasive sampling is expensive, time consuming and limited to sporadic points at discrete times. The geophysical method of complex conductivity (CC) is sensitive to mineral surface properties, and hence a promising method to monitor soil stabilization projects. Previous laboratory work has established the sensitivity of CC to MICP processes. We performed a MICP soil stabilization project and collected CC data for the duration of the treatment (15 days). Subsurface images show small but very clear changes in the area of MICP treatment; the changes observed fully agree with the bio-geochemical monitoring and previous laboratory experiments. Our results strongly suggest that CC is sensitive to field MICP treatments. Finally, our results show that good quality data alone are not adequate for the correct interpretation of field CC data, at least when the signals are low. Informed data processing routines and appropriate inverse modeling parameters are required to produce optimal results.
Mazzoleni, Stefano; Toth, Andras; Munih, Marko; Van Vaerenbergh, Jo; Cavallo, Giuseppe; Micera, Silvestro; Dario, Paolo; Guglielmelli, Eugenio
2009-10-30
One of the main scientific and technological challenges of rehabilitation bioengineering is the development of innovative methodologies, based on the use of appropriate technological devices, for the objective assessment of patients undergoing a rehabilitation treatment. Such tools should be as fast and cheap to use as clinical scales, which are currently the instruments most widely used in routine clinical practice. A human-centered approach was used in the design and development of a mechanical structure equipped with eight force/torque sensors that record quantitative data during the initiation of a predefined set of Activities of Daily Living (ADL) tasks, in isometric conditions. Preliminary results validated the appropriateness, acceptability and functionality of the proposed platform, which has now become a tool used for clinical research in three clinical centres. This paper presented the design and development of an innovative platform for whole-body force and torque measurements on human subjects. The platform has been designed to perform accurate quantitative measurements in isometric conditions, with the specific aim of addressing the need for functional assessment tests of patients undergoing rehabilitation treatment as a consequence of a stroke. The versatility of the system also points to several other interesting possible areas of application in neurorehabilitation therapy, basic neuroscience research, and more.
Modeling the Transfer Function for the Dark Energy Survey
Chang, C.
2015-03-04
We present a forward-modeling simulation framework designed to model the data products from the Dark Energy Survey (DES). This forward-model process can be thought of as a transfer function: a mapping from cosmological/astronomical signals to the final data products used by the scientists. Using output from cosmological simulations (the Blind Cosmology Challenge), we generate simulated images (with the Ultra Fast Image Simulator) and catalogs representative of the DES data. In this work we demonstrate the framework by simulating the 244 deg² coadd images and catalogs in five bands for the DES Science Verification data. The simulation output is compared with the corresponding data to show that the major characteristics of the images and catalogs can be captured. We also point out several directions for future improvements. Two practical examples, star-galaxy classification and proximity effects on object detection, are then used to illustrate how one can use the simulations to address systematics issues in data analysis. With a clear understanding of the simplifications in our model, we show that one can use the simulations side-by-side with data products to interpret the measurements. This forward-modeling approach is generally applicable to other upcoming and future surveys. It provides a powerful tool for systematics studies that is sufficiently realistic and highly controllable.
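A minimal sketch of the forward-modeling idea (a "transfer function" from input signal to observed data product), under assumptions that are purely illustrative and unrelated to the DES pipeline: true point-source fluxes are placed on a pixel grid, blurred by a Gaussian PSF, noise is added, and a thresholded catalog is extracted for comparison with the input.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, label, maximum_position

rng = np.random.default_rng(0)

# "Truth": a handful of point sources on a blank sky.
truth = np.zeros((128, 128))
positions = rng.integers(10, 118, size=(20, 2))
fluxes = rng.uniform(50, 500, size=20)
for (y, x), f in zip(positions, fluxes):
    truth[y, x] = f

# Forward model: PSF blur plus background noise (the "transfer function").
image = gaussian_filter(truth, sigma=2.0) + rng.normal(0.0, 1.0, truth.shape)

# Simple catalog extraction: threshold, then take one peak per connected region.
regions, n = label(image > 5.0)
catalog = maximum_position(image, labels=regions, index=list(range(1, n + 1)))

print(f"input sources: {len(positions)}, recovered detections: {len(catalog)}")
```

Comparing the input list with the recovered catalog (completeness, blending, positional scatter) is the toy analogue of the systematics studies described above.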
Scheirer, Walter J; de Rezende Rocha, Anderson; Sapkota, Archana; Boult, Terrance E
2013-07-01
To date, almost all experimental evaluations of machine learning-based recognition algorithms in computer vision have taken the form of "closed set" recognition, whereby all testing classes are known at training time. A more realistic scenario for vision applications is "open set" recognition, where incomplete knowledge of the world is present at training time, and unknown classes can be submitted to an algorithm during testing. This paper explores the nature of open set recognition and formalizes its definition as a constrained minimization problem. The open set recognition problem is not well addressed by existing algorithms because it requires strong generalization. As a step toward a solution, we introduce a novel "1-vs-set machine," which sculpts a decision space from the marginal distances of a 1-class or binary SVM with a linear kernel. This methodology applies to several different applications in computer vision where open set recognition is a challenging problem, including object recognition and face verification. We consider both in this work, with large scale cross-dataset experiments performed over the Caltech 256 and ImageNet sets, as well as face matching experiments performed over the Labeled Faces in the Wild set. The experiments highlight the effectiveness of machines adapted for open set evaluation compared to existing 1-class and binary SVMs for the same tasks.
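The sketch below is a simplified stand-in for the 1-vs-set idea, not the authors' implementation: a linear SVM is trained for a known class, and the acceptance region is then "sculpted" into a slab by placing two thresholds on the decision score, so that samples far from the training data on either side are rejected as unknown. The data, margins and thresholds are assumptions for illustration.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)

# Known positive class and known negatives available at training time.
pos = rng.normal(loc=[2, 2], scale=0.5, size=(200, 2))
neg = rng.normal(loc=[-2, -2], scale=0.5, size=(200, 2))
X = np.vstack([pos, neg])
y = np.array([1] * 200 + [0] * 200)

svm = LinearSVC(C=1.0).fit(X, y)

# Slab: bound the decision scores of the positive training data on both sides.
# (In the original formulation the two planes are refined to balance open-space
# risk against empirical risk; a fixed margin is used here for simplicity.)
scores_pos = svm.decision_function(pos)
lower, upper = scores_pos.min() - 0.25, scores_pos.max() + 0.25

def predict_open_set(x):
    s = svm.decision_function(np.atleast_2d(x))[0]
    return "known class" if lower <= s <= upper else "unknown / rejected"

print(predict_open_set([2.1, 1.9]))    # near the positive class
print(predict_open_set([40.0, 40.0]))  # far beyond it: open space, rejected
```

A plain binary SVM would accept the second point simply because it lies on the positive side of the hyperplane; bounding the score from above is what gives the open-set behavior.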
A coupled vegetation/sediment transport model for dryland environments
NASA Astrophysics Data System (ADS)
Mayaud, Jerome R.; Bailey, Richard M.; Wiggs, Giles F. S.
2017-04-01
Dryland regions are characterized by patchy vegetation, erodible surfaces, and erosive aeolian processes. Understanding how these constituent factors interact and shape landscape evolution is critical for managing potential environmental and anthropogenic impacts in drylands. However, modeling wind erosion on partially vegetated surfaces is a complex problem that has remained challenging for researchers. We present the new, coupled cellular automaton Vegetation and Sediment TrAnsport (ViSTA) model, which is designed to address fundamental questions about the development of arid and semiarid landscapes in a spatially explicit way. The technical aspects of the ViSTA model are described, including a new method for directly imposing oblique wind and transport directions onto a cell-based domain. Verification tests for the model are reported, including stable state solutions, the impact of drought and fire stress, wake flow dynamics, temporal scaling issues, and the impact of feedbacks between sediment movement and vegetation growth on landscape morphology. The model is then used to simulate an equilibrium nebkha dune field, and the resultant bed forms are shown to have very similar size and spacing characteristics to nebkhas observed in the Skeleton Coast, Namibia. The ViSTA model is a versatile geomorphological tool that could be used to predict threshold-related transitions in a range of dryland ecogeomorphic systems.
NASA Astrophysics Data System (ADS)
Nikolopoulos, Georgios M.
2018-01-01
We consider a recently proposed entity authentication protocol in which a physical unclonable key is interrogated by random coherent states of light, and the quadratures of the scattered light are analyzed by means of a coarse-grained homodyne detection. We derive a sufficient condition for the protocol to be secure against an emulation attack in which an adversary knows the challenge-response properties of the key and moreover, he can access the challenges during the verification. The security analysis relies on Holevo's bound and Fano's inequality, and suggests that the protocol is secure against the emulation attack for a broad range of physical parameters that are within reach of today's technology.
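For reference, the two information-theoretic ingredients named above take their standard forms; they are stated here as generic background, not as a reproduction of the paper's derivation. Holevo's bound limits the information an adversary can extract from the quantum states he observes, and Fano's inequality converts a bound on that information into a bound on his probability of producing the correct response:

$$
I(X\!:\!Y) \;\le\; \chi \;=\; S\!\left(\sum_x p_x \rho_x\right) - \sum_x p_x\, S(\rho_x),
\qquad
H(P_e) + P_e \log\!\big(|\mathcal{X}|-1\big) \;\ge\; H(X\mid Y),
$$

where $S(\cdot)$ is the von Neumann entropy, $\{p_x,\rho_x\}$ is the ensemble of states accessible to the adversary, $P_e$ is his error probability, and $\mathcal{X}$ is the set of possible values. Combining the two yields sufficient conditions of the kind derived in the security analysis described above.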
NASA Technical Reports Server (NTRS)
Hughes, Mark S.; Davis, Dawn M.; Bakker, Henry J.; Jensen, Scott L.
2007-01-01
This viewgraph presentation reviews the design of the electrical systems required for the testing of rockets at the Rocket Propulsion Facility at NASA Stennis Space Center (NASA SSC). NASA SSC's mission in rocket propulsion testing is to acquire test performance data for verification, validation and qualification of propulsion systems hardware; these data must be accurate, reliable, comprehensive and timely. Data acquisition in a rocket propulsion test environment is challenging: severe transient dynamic environments, large thermal gradients, and pressure regimes ranging from vacuum to 15 ksi. SSC has developed and employs data acquisition systems, control systems and robust instrumentation that effectively address these challenges.
Comparison of Traditional and Innovative Techniques to Solve Technical Challenges
NASA Technical Reports Server (NTRS)
Perchonok, Michele
2011-01-01
This slide presentation reviews the use of traditional and innovative techniques to solve technical challenges in food storage technology. Planning for a mission to Mars is underway, and improvements in food storage technology are required: current food storage technology is inadequate, refrigerators or freezers will not be available for food preservation, and a shelf life of 5 years is expected. A 10-year effort to improve food packaging technology has not significantly enhanced food packaging capabilities. Two innovation techniques, InnoCentive and Yet2.com, were attempted; both have provided good results and are still under due diligence for solver verification.
Strategies to Address Common Challenges When Teaching in an Active Learning Classroom
ERIC Educational Resources Information Center
Petersen, Christina I.; Gorman, Kristen S.
2014-01-01
This chapter provides practical strategies for addressing common challenges that arise for teachers in active learning classrooms. Our strategies come from instructors with experience teaching in these environments.
Evaluation of streamflow forecast for the National Water Model of U.S. National Weather Service
NASA Astrophysics Data System (ADS)
Rafieeinasab, A.; McCreight, J. L.; Dugger, A. L.; Gochis, D.; Karsten, L. R.; Zhang, Y.; Cosgrove, B.; Liu, Y.
2016-12-01
The National Water Model (NWM), an implementation of the community WRF-Hydro modeling system, is an operational hydrologic forecasting model for the contiguous United States. The model forecasts distributed hydrologic states and fluxes, including soil moisture, snowpack, ET, and ponded water. In particular, the NWM provides streamflow forecasts at more than 2.7 million river reaches for three forecast ranges: short (15 hr), medium (10 days), and long (30 days). In this study, we verify short and medium range streamflow forecasts in the context of the verification of their respective quantitative precipitation forecasts/forcing (QPF), the High Resolution Rapid Refresh (HRRR) and the Global Forecast System (GFS). The streamflow evaluation is performed for summer of 2016 at more than 6,000 USGS gauges. Both individual forecasts and forecast lead times are examined. Selected case studies of extreme events aim to provide insight into the quality of the NWM streamflow forecasts. A goal of this comparison is to address how much streamflow bias originates from precipitation forcing bias. To this end, precipitation verification is performed over the contributing areas above (and between assimilated) USGS gauge locations. Precipitation verification is based on the aggregated, blended StageIV/StageII data as the "reference truth". We summarize the skill of the streamflow forecasts, their skill relative to the QPF, and make recommendations for improving NWM forecast skill.
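A minimal sketch of the kind of paired forecast-observation verification described here (the metric choices, table layout and variable names are illustrative, not the NWM evaluation code): percent bias and a Nash-Sutcliffe-style skill score computed per gauge and per lead time.

```python
import numpy as np
import pandas as pd

def percent_bias(forecast: np.ndarray, observed: np.ndarray) -> float:
    """Relative bias of forecast volume, in percent."""
    return 100.0 * (forecast - observed).sum() / observed.sum()

def nse(forecast: np.ndarray, observed: np.ndarray) -> float:
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the observed mean."""
    return 1.0 - ((observed - forecast) ** 2).sum() / ((observed - observed.mean()) ** 2).sum()

# Hypothetical long-format table: one row per gauge, valid time and lead time.
df = pd.DataFrame({
    "gauge": ["01646500"] * 6,
    "lead_hr": [1, 1, 1, 6, 6, 6],
    "q_forecast": [110.0, 95.0, 130.0, 140.0, 90.0, 160.0],
    "q_observed": [100.0, 105.0, 120.0, 100.0, 105.0, 120.0],
})

summary = df.groupby(["gauge", "lead_hr"]).apply(
    lambda g: pd.Series({
        "pbias_%": percent_bias(g.q_forecast.values, g.q_observed.values),
        "nse": nse(g.q_forecast.values, g.q_observed.values),
    })
)
print(summary)
```

Running the same bias metric on the forcing precipitation over the contributing areas, as described above, is what allows the streamflow bias to be attributed (or not) to the QPF.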
Singh, Anushikha; Dutta, Malay Kishore
2017-12-01
The authentication and integrity verification of medical images is a critical and growing issue for patients in e-health services. Accurate identification of medical images and patient verification is an essential requirement to prevent error in medical diagnosis. The proposed work presents an imperceptible watermarking system to address the security of medical fundus images for tele-ophthalmology applications and computer-aided automated diagnosis of retinal diseases. In the proposed work, the patient identity is embedded in the fundus image in the singular value decomposition domain with an adaptive quantization parameter, to maintain perceptual transparency for a variety of fundus images, whether healthy or disease affected. In the proposed method, insertion of the watermark in the fundus image does not affect the automatic image processing diagnosis of retinal objects and pathologies, which ensures uncompromised computer-based diagnosis associated with the fundus image. The patient ID is correctly recovered from the watermarked fundus image for integrity verification at the diagnosis centre. The proposed watermarking system is tested on a comprehensive database of fundus images and the results are convincing. Results indicate that the proposed watermarking method is imperceptible and does not affect computer-vision-based automated diagnosis of retinal diseases. Correct recovery of the patient ID from the watermarked fundus image makes the proposed watermarking system applicable for authentication of fundus images for computer-aided diagnosis and tele-ophthalmology applications. Copyright © 2017 Elsevier B.V. All rights reserved.
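A hedged sketch of SVD-domain watermark embedding by quantizing the largest singular value of an image block (a simplified quantization-index-modulation scheme, not the paper's adaptive method; block size, quantization step and bit layout are assumptions):

```python
import numpy as np

def embed_bit(block: np.ndarray, bit: int, delta: float = 24.0) -> np.ndarray:
    """Embed one watermark bit by quantizing the block's largest singular value."""
    u, s, vt = np.linalg.svd(block.astype(float), full_matrices=False)
    q = np.floor(s[0] / delta)
    # Even quantization cells encode 0, odd cells encode 1.
    if int(q) % 2 != bit:
        q += 1
    s[0] = q * delta + delta / 2.0
    return u @ np.diag(s) @ vt

def extract_bit(block: np.ndarray, delta: float = 24.0) -> int:
    s = np.linalg.svd(block.astype(float), compute_uv=False)
    return int(np.floor(s[0] / delta)) % 2

block = np.random.default_rng(0).integers(0, 256, size=(8, 8))
for bit in (0, 1):
    marked = embed_bit(block, bit)
    assert extract_bit(marked) == bit
print("bits recovered correctly; max pixel change:", np.abs(marked - block).max())
```

The adaptive quantization parameter described in the abstract plays the role of delta here, tuned per block so that the pixel change stays imperceptible for both healthy and pathological fundus images.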
Analysis of human scream and its impact on text-independent speaker verification.
Hansen, John H L; Nandwana, Mahesh Kumar; Shokouhi, Navid
2017-04-01
Scream is defined as a sustained, high-energy vocalization that lacks phonological structure. Lack of phonological structure is how scream is distinguished from other forms of loud vocalization, such as "yell." This study investigates the acoustic aspects of screams and addresses those that are known to prevent standard speaker identification systems from recognizing the identity of screaming speakers. It is well established that speaker variability due to changes in vocal effort and the Lombard effect contributes to degraded performance in automatic speech systems (i.e., speech recognition, speaker identification, diarization, etc.). However, previous research in the general area of speaker variability has concentrated on human speech production, whereas less is known about non-speech vocalizations. The UT-NonSpeech corpus is developed here to investigate speaker verification from scream samples. This study considers a detailed analysis in terms of fundamental frequency, spectral peak shift, frame energy distribution, and spectral tilt. It is shown that traditional speaker recognition based on the Gaussian mixture model-universal background model (GMM-UBM) framework is unreliable when evaluated with screams.
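As a hedged, simplified illustration of GMM-UBM style scoring (real systems use MAP adaptation of the UBM and proper acoustic features; the feature matrices and model sizes here are placeholders): a verification score is the average log-likelihood of the test features under the claimed speaker's model minus that under the universal background model.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Placeholder "MFCC-like" features: rows are frames, columns are coefficients.
background = rng.normal(size=(5000, 13))           # pooled data from many speakers
speaker_train = rng.normal(loc=0.3, size=(800, 13))
test_same = rng.normal(loc=0.3, size=(300, 13))    # target trial
test_diff = rng.normal(loc=-0.5, size=(300, 13))   # impostor trial

ubm = GaussianMixture(n_components=16, covariance_type="diag", random_state=0).fit(background)
# Simplification: train the speaker model directly instead of MAP-adapting the UBM.
spk = GaussianMixture(n_components=16, covariance_type="diag", random_state=0).fit(speaker_train)

def llr(features: np.ndarray) -> float:
    """Average per-frame log-likelihood ratio (speaker vs. background)."""
    return float(spk.score(features) - ubm.score(features))

print("target trial LLR:  ", llr(test_same))   # expected to be larger
print("impostor trial LLR:", llr(test_diff))   # expected to be smaller
```

The mismatch documented in the study corresponds to scream features drifting away from both models, which collapses the separation between target and impostor scores.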
Verification of a Remaining Flying Time Prediction System for Small Electric Aircraft
NASA Technical Reports Server (NTRS)
Hogge, Edward F.; Bole, Brian M.; Vazquez, Sixto L.; Celaya, Jose R.; Strom, Thomas H.; Hill, Boyd L.; Smalling, Kyle M.; Quach, Cuong C.
2015-01-01
This paper addresses the problem of building trust in online predictions of a battery powered aircraft's remaining available flying time. A set of ground tests is described that make use of a small unmanned aerial vehicle to verify the performance of remaining flying time predictions. The algorithm verification procedure described here uses a fully functional vehicle that is restrained to a platform for repeated run-to-functional-failure experiments. The vehicle under test is commanded to follow a predefined propeller RPM profile in order to create battery demand profiles similar to those expected in flight. The fully integrated aircraft is repeatedly operated until the charge stored in powertrain batteries falls below a specified lower-limit. The time at which the lower-limit on battery charge is crossed is then used to measure the accuracy of remaining flying time predictions. Accuracy requirements are considered in this paper for an alarm that warns operators when remaining flying time is estimated to fall below a specified threshold.
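A minimal sketch of the accuracy check described above, with entirely hypothetical numbers: for each run-to-functional-failure test, the predicted remaining flying time issued at a reference instant is compared with the time actually remaining until the battery-charge lower limit was crossed.

```python
import numpy as np

# Hypothetical per-test data (seconds): predictions issued at t_ref, and the
# measured time at which battery charge crossed the lower limit.
t_ref = np.array([120.0, 120.0, 120.0, 120.0])
predicted_remaining = np.array([410.0, 395.0, 430.0, 402.0])
t_limit_crossed = np.array([520.0, 515.0, 540.0, 500.0])

actual_remaining = t_limit_crossed - t_ref
error = predicted_remaining - actual_remaining   # positive = optimistic prediction

tolerance = 30.0  # hypothetical accuracy requirement in seconds
print("prediction errors (s):", error)
print("mean absolute error (s):", np.abs(error).mean())
print("all predictions within tolerance:", bool((np.abs(error) <= tolerance).all()))
```

Repeating this comparison across many restrained run-to-failure tests is what builds the statistical case for trusting the in-flight alarm threshold.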
Verification of Numerical Programs: From Real Numbers to Floating Point Numbers
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn E.; Munoz, Cesar; Kirchner, Florent; Correnson, Loïc
2013-01-01
Numerical algorithms lie at the heart of many safety-critical aerospace systems. The complexity and hybrid nature of these systems often requires the use of interactive theorem provers to verify that these algorithms are logically correct. Usually, proofs involving numerical computations are conducted in the infinitely precise realm of the field of real numbers. However, numerical computations in these algorithms are often implemented using floating point numbers. The use of a finite representation of real numbers introduces uncertainties as to whether the properties verified in the theoretical setting hold in practice. This short paper describes work in progress aimed at addressing these concerns. Given a formally proven algorithm, written in the Program Verification System (PVS), the Frama-C suite of tools is used to identify sufficient conditions and verify that under such conditions the rounding errors arising in a C implementation of the algorithm do not affect its correctness. The technique is illustrated using an algorithm for detecting loss of separation among aircraft.
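To make the concern concrete, here is a small sketch (not taken from the paper) comparing a distance-based separation check performed in exact rational arithmetic against the same check performed with IEEE-754 doubles; near the decision boundary the two can disagree, which is exactly the class of discrepancy the analysis is meant to bound. The numbers and the comparison form are assumptions for illustration.

```python
from fractions import Fraction

D_MIN_SQ = "0.5"      # squared minimum separation (units immaterial here)
DX, DY = "0.7", "0.1"  # relative position components, chosen to sit on the boundary

def separated_real(dx: str, dy: str, d_min_sq: str) -> bool:
    """The check as proved over the reals, using exact rational arithmetic."""
    dx, dy, d2 = Fraction(dx), Fraction(dy), Fraction(d_min_sq)
    return dx * dx + dy * dy >= d2

def separated_float(dx: str, dy: str, d_min_sq: str) -> bool:
    """The same check as a double-precision implementation would compute it."""
    dx, dy, d2 = float(dx), float(dy), float(d_min_sq)
    return dx * dx + dy * dy >= d2

print("real-number model :", separated_real(DX, DY, D_MIN_SQ))   # True: exactly on the boundary
print("floating point    :", separated_float(DX, DY, D_MIN_SQ))  # False: rounding pushes the sum below 0.5
```

The verification workflow described above aims to show that, under explicitly identified side conditions, such disagreements cannot invalidate the safety property proved in PVS.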
Learning Assumptions for Compositional Verification
NASA Technical Reports Server (NTRS)
Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)
2002-01-01
Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.
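The assume-guarantee rule underlying this style of reasoning is standard and worth stating (the notation here is generic, not lifted from the paper): writing $\langle A\rangle\, M\, \langle P\rangle$ for "component $M$ satisfies property $P$ in any environment that satisfies assumption $A$", the non-circular rule is

$$
\frac{\langle A \rangle\, M_1\, \langle P \rangle \qquad \langle \mathit{true} \rangle\, M_2\, \langle A \rangle}{\langle \mathit{true} \rangle\, M_1 \,\|\, M_2\, \langle P \rangle}.
$$

The framework described above automates the discovery of a suitable $A$: a learning algorithm proposes candidate assumptions, and counterexamples returned by model checking either refine the assumption or establish that the property is true or false of the composed system.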
Lithium-Ion Cell Charge-Control Unit Developed
NASA Technical Reports Server (NTRS)
Reid, Concha M.; Manzo, Michelle A.; Buton, Robert M.; Gemeiner, Russel
2005-01-01
A lithium-ion (Li-ion) cell charge-control unit was developed as part of a Li-ion cell verification program. This unit manages the complex charging scheme that is required when Li-ion cells are charged in series. It enables researchers to test cells together as a pack, while allowing each cell to charge individually. This allows the inherent cell-to-cell variations to be addressed on a series string of cells and reduces test costs substantially in comparison to individual cell testing.
Missile and Space Systems Reliability versus Cost Trade-Off Study
1983-01-01
Robert C. Schneider; Boeing Aerospace Company. ...reliability problems, which has a real bearing on program effectiveness. A well planned and funded reliability effort can prevent or ferret out... failure analysis, and the incorporation and verification of design corrections to prevent recurrence of failures. 302.2.2 A TMJ test plan shall be...