Sample records for dynamic verification strategy

  1. Independent Verification Survey Report for Zone 1 of the East Tennessee Technology Park in Oak Ridge, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, David A.

    2012-08-16

    Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of the East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and the dynamic verification strategy (DVS) was implemented as designed. IV activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs).

  2. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification (D4V) approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  3. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F.; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described, incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of the 2D Euler equations, the 3D Navier-Stokes equations, and turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results for laminar and turbulent flow past a NACA 0012 airfoil and the ONERA M6 wing are validated against experimental and numerical data.
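
    As context for the MMS verification mentioned in this record, the following is a minimal sketch (Python, not from the LAVA codebase) of manufactured-solution verification on a 1D heat equation: a solution u*(x,t) is chosen, the source term that makes it exact is derived analytically, and the observed order of accuracy under grid refinement is compared with the formal (second) order of the scheme. The diffusivity and grid sizes are illustrative assumptions.

    ```python
    import numpy as np

    nu = 0.1  # assumed diffusivity

    def u_exact(x, t):
        # manufactured solution u*(x,t) = sin(pi x) exp(-t)
        return np.sin(np.pi * x) * np.exp(-t)

    def source(x, t):
        # S = u*_t - nu * u*_xx, derived analytically from u*
        return (nu * np.pi**2 - 1.0) * np.sin(np.pi * x) * np.exp(-t)

    def error(nx, t_end=0.1):
        x = np.linspace(0.0, 1.0, nx)
        dx = x[1] - x[0]
        dt = 0.25 * dx**2 / nu                 # keeps explicit Euler stable
        u, t = u_exact(x, 0.0), 0.0
        while t < t_end:
            step = min(dt, t_end - t)
            uxx = np.zeros_like(u)
            uxx[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
            u = u + step * (nu * uxx + source(x, t))
            t += step
            u[0], u[-1] = u_exact(0.0, t), u_exact(1.0, t)  # exact Dirichlet BCs
        return np.abs(u - u_exact(x, t)).max()

    e_coarse, e_fine = error(41), error(81)
    print("observed order of accuracy:", np.log2(e_coarse / e_fine))  # ~2
    ```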

  4. Typology of Strategies of Personality Meaning-Making during Professional Education

    ERIC Educational Resources Information Center

    Shchipanova, Dina Ye.; Lebedeva, Ekaterina V.; Sukhinin, Valentin P.; Valieva, Elizaveta N.

    2016-01-01

    The importance of the studied issue is conditioned by the fact that the high dynamics of processes in the labour market require constant work by an individual on self-determination and the search for significance in his/her professional activity. The purpose of the research is the theoretical development and empirical verification of the types of strategies of…

  5. Verifying detailed fluctuation relations for discrete feedback-controlled quantum dynamics

    NASA Astrophysics Data System (ADS)

    Camati, Patrice A.; Serra, Roberto M.

    2018-04-01

    Discrete quantum feedback control consists of dynamics managed according to information acquired by a previous measurement. Energy fluctuations along such dynamics satisfy generalized fluctuation relations, which are useful tools to study the thermodynamics of systems far from equilibrium. Due to the practical challenge of assessing energy fluctuations in the quantum scenario, the experimental verification of detailed fluctuation relations in the presence of feedback control remains elusive. We present a feasible method to experimentally verify detailed fluctuation relations for discrete feedback-controlled quantum dynamics. Two detailed fluctuation relations are developed and employed. The method is based on a quantum interferometric strategy that allows the verification of fluctuation relations in the presence of feedback control. An analytical example to illustrate the applicability of the method is discussed. The comprehensive technique introduced here can be experimentally implemented at the microscale with current technology in a variety of experimental platforms.
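
    For orientation, a hedged sketch of the standard relations this line of work generalizes; the paper's own two detailed relations are developed in the reference, and the notation below (work W, free-energy change ΔF, mutual information I, feedback efficacy γ) is the textbook one rather than necessarily the authors':

    ```latex
    % Crooks detailed fluctuation relation (no feedback):
    \frac{P(+W)}{\tilde{P}(-W)} = e^{\beta (W - \Delta F)}
    % Sagawa--Ueda generalizations in the presence of measurement and feedback:
    \left\langle e^{-\beta (W - \Delta F) - I} \right\rangle = 1 ,
    \qquad
    \left\langle e^{-\beta (W - \Delta F)} \right\rangle = \gamma .
    ```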

  6. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    NASA Astrophysics Data System (ADS)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for Internet of Things (IoT) applications has forced a move towards higher-complexity integrated circuits supporting systems-on-chip (SoC). Such increased complexity poses correspondingly complicated validation challenges, and researchers have responded with a variety of methodologies, among them dynamic verification, formal verification, and hybrid techniques. It is important to discover bugs in the infancy of the SoC verification process in order to reduce time consumed and achieve fast time to market. This paper therefore focuses on verification methodology at the Register Transfer Level (RTL) of an SoC based on the AMBA bus design. The Open Verification Methodology (OVM) offers an easier route to RTL validation, not as a replacement for the traditional method but as a path to faster time to market. OVM is thus proposed here as the verification method for larger designs, averting bottlenecks in the validation platform.

  7. Manipulation strategies for massive space payloads

    NASA Technical Reports Server (NTRS)

    Book, Wayne J.

    1989-01-01

    Control for the bracing strategy is being examined. It was concluded earlier that trajectory planning must be improved to best achieve the bracing motion. Results were achieved that enable the inverse dynamics of flexible arms to be calculated for linearized motion more efficiently than previously published. The desired motion of the end point, beginning at t = 0 and ending at t = t_f, is used to calculate the required torque at the joint. The solution is separated into a causal function that is zero for t < 0 and an acausal function that is zero for t > t_f. A number of alternative end-point trajectories were explored in terms of the peak torque required, the amount of anticipatory action, and other issues. The single-link case is the immediate subject, and an experimental verification of that case is being performed. Modeling with experimental verification of closed-chain dynamics continues. The modeling effort has pointed out inaccuracies that result from the choice of numerical techniques used to incorporate the closed-chain constraints when modeling our experimental prototype RALF (Robotic Arm Large and Flexible). Results were compared to TREETOPS, a multibody code. The experimental verification work is suggesting new ways to make comparisons with systems having structural linearity and joint and geometric nonlinearity. The generation of inertial forces was studied with a small arm that will damp the large arm's vibration.

  8. Space shuttle flying qualities and criteria assessment

    NASA Technical Reports Server (NTRS)

    Myers, T. T.; Johnston, D. E.; Mcruer, Duane T.

    1987-01-01

    Work accomplished under a series of study tasks for the Flying Qualities and Flight Control Systems Design Criteria Experiment (OFQ) of the Shuttle Orbiter Experiments Program (OEX) is summarized. The tasks involved: review of the applicability of existing flying quality and flight control system specifications and criteria to the Shuttle; identification of potentially crucial flying quality deficiencies; dynamic modeling of the Shuttle Orbiter pilot/vehicle system in the terminal flight phases; devising a nonintrusive experimental program for extraction and identification of vehicle dynamics, pilot control strategy, and approach and landing performance metrics; and preparation of an OEX approach to produce a data archive and optimize use of the data to develop flying qualities for future space shuttle craft in general. Also covered are analytic modeling of the Orbiter's unconventional closed-loop dynamics in landing, modeling of pilot control strategies, verification of vehicle dynamics and pilot control strategy from flight data, and a review of various existing or proposed aircraft flying quality parameters and criteria in comparison with the unique dynamic characteristics and control aspects of the Shuttle in landing, together with a summary of conclusions and recommendations for developing flying quality criteria and design guides for future Shuttle craft.

  9. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283

  10. Students' Verification Strategies for Combinatorial Problems

    ERIC Educational Resources Information Center

    Mashiach Eizenberg, Michal; Zaslavsky, Orit

    2004-01-01

    We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…

  11. Practical UXO Classification: Enhanced Data Processing Strategies for Technology Transition - Fort Ord: Dynamic and Cued Metalmapper Processing and Classification

    DTIC Science & Technology

    2017-06-06

    Keywords: Geophysical Mapping, Electromagnetic Induction, Instrument Verification Strip, Time Domain Electromagnetic, Unexploded Ordnance.

  12. Independent verification survey report for exposure units Z2-24, Z2-31, Z2-32, and Z2-36 in Zone 2 of the East Tennessee Technology Park, Oak Ridge, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, David A.

    The U.S. Department of Energy (DOE) Oak Ridge Office of Environmental Management selected Oak Ridge Associated Universities (ORAU), through the Oak Ridge Institute for Science and Education (ORISE) contract, to perform independent verification (IV) at Zone 2 of the East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. ORAU has concluded IV surveys, per the project-specific plan (PSP) (ORAU 2013a), covering exposure units (EUs) Z2-24, -31, -32, and -36. The objective of this effort was to verify that the target EUs comply with requirements in the Zone 2 Record of Decision (ROD) (DOE 2005), as implemented by using the dynamic verification strategy presented in the dynamic work plan (DWP) (BJC 2007), and to confirm that commitments in the DWP were adequately implemented, as verified via IV surveys and soil sampling.

  13. NASA's Approach to Software Assurance

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2015-01-01

    NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. As an umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.

  14. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  15. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  16. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  17. Dosimetric verification for primary focal hypermetabolism of nasopharyngeal carcinoma patients treated with dynamic intensity-modulated radiation therapy.

    PubMed

    Xin, Yong; Wang, Jia-Yang; Li, Liang; Tang, Tian-You; Liu, Gui-Hong; Wang, Jian-She; Xu, Yu-Mei; Chen, Yong; Zhang, Long-Zhen

    2012-01-01

    To verify the feasibility of (18F)FDG PET/CT-guided dynamic intensity-modulated radiation therapy (IMRT) for nasopharyngeal carcinoma patients, dosimetric verification was performed before treatment. Eleven patients with stage III~IVA nasopharyngeal carcinoma treated with functional image-guided IMRT underwent absolute and relative dosimetric verification using a Varian 23EX linear accelerator, an ionization chamber, the 2D ion chamber array (2DICA) of I'mRT Matrixx, and an IBA detachable phantom. Outlines were drawn and treatment plans made using different imaging techniques (CT and (18F)FDG PET/CT). The dose distributions of the various regions were realized by SMART. The absolute mean error of the region of interest was 2.39% ± 0.66 using a 0.6 cc ionization chamber. Using the DTA method, the average relative dose agreement within our protocol (3%, 3 mm) was 87.64% at 300 MU/min in all fields. Dosimetric verification before IMRT is obligatory and necessary. The ionization chamber and the 2DICA of I'mRT Matrixx were effective dosimetric verification tools for primary focal hypermetabolism in functional image-guided dynamic IMRT for nasopharyngeal carcinoma. Our preliminary evidence indicates that functional image-guided dynamic IMRT is feasible.

  18. Long-Term Pavement Performance Materials Characterization Program: Verification of Dynamic Test Systems with an Emphasis on Resilient Modulus

    DOT National Transportation Integrated Search

    2005-09-01

    This document describes a procedure for verifying a dynamic testing system (closed-loop servohydraulic). The procedure is divided into three general phases: (1) electronic system performance verification, (2) calibration check and overall system perf...

  19. Inconsistent Investment and Consumption Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kronborg, Morten Tolver, E-mail: mtk@atp.dk; Steffensen, Mogens, E-mail: mogens@math.ku.dk

    In a traditional Black–Scholes market we develop a verification theorem for a general class of investment and consumption problems where the standard dynamic programming principle does not hold. The theorem is an extension of the standard Hamilton–Jacobi–Bellman equation in the form of a system of non-linear differential equations. We derive the optimal investment and consumption strategy for a mean-variance investor without pre-commitment endowed with labor income. In the case of constant risk aversion it turns out that the optimal amount of money to invest in stocks is independent of wealth. The optimal consumption strategy is given as a deterministic bang-bang strategy. In order to have a more realistic model we allow the risk aversion to be time and state dependent. Of special interest is the case where the risk aversion is inversely proportional to present wealth plus the financial value of future labor income net of consumption. Using the verification theorem we give a detailed analysis of this problem. It turns out that the optimal amount of money to invest in stocks is given by a linear function of wealth plus the financial value of future labor income net of consumption. The optimal consumption strategy is again given as a deterministic bang-bang strategy. We also calculate, for a general time and state dependent risk aversion function, the optimal investment and consumption strategy for a mean-standard deviation investor without pre-commitment. In that case, it turns out that it is optimal to take no risk at all.
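
    For orientation, the time-consistent baseline that such verification theorems extend is the Hamilton–Jacobi–Bellman (HJB) equation of the classical Merton problem; a hedged sketch, with interest rate r, stock drift μ and volatility σ, amount π invested in stocks, consumption rate c, and utility u:

    ```latex
    V_t + \sup_{\pi, c} \Big[ \big( r x + \pi (\mu - r) - c \big) V_x
        + \tfrac{1}{2} \pi^2 \sigma^2 V_{xx} + u(c) \Big] = 0 ,
    \qquad V(T, x) = U(x) .
    ```

    For mean-variance preferences without pre-commitment this dynamic programming principle fails, which is why the paper replaces the single HJB equation with a system of non-linear differential equations.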

  20. Multi-mode energy management strategy for fuel cell electric vehicles based on driving pattern identification using learning vector quantization neural network algorithm

    NASA Astrophysics Data System (ADS)

    Song, Ke; Li, Feiqiang; Hu, Xiao; He, Lin; Niu, Wenxu; Lu, Sihao; Zhang, Tong

    2018-06-01

    The development of fuel cell electric vehicles can, to a certain extent, alleviate worldwide energy and environmental issues. Because a single energy management strategy cannot meet the complex road conditions of an actual vehicle, this article proposes a multi-mode energy management strategy for electric vehicles with a fuel cell range extender based on driving-condition recognition technology, which contains a driving-pattern recognizer and a multi-mode energy management controller. A learning vector quantization (LVQ) neural network is introduced to design the driving-pattern recognizer according to the vehicle's driving information. The multi-mode strategy can automatically switch to the genetic-algorithm-optimized thermostat strategy under specific driving conditions, in light of differences in the recognition results. Simulation experiments were carried out after the model's validity was verified using a dynamometer test bench. Simulation results show that the proposed strategy obtains better economic performance than the single-mode thermostat strategy under dynamic driving conditions.
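
    A minimal sketch of the LVQ1 idea behind such a driving-pattern recognizer (illustrative Python, not the authors' implementation; the two features, the prototypes, and the learning rate are hypothetical):

    ```python
    import numpy as np

    class LVQ1:
        """Nearest-prototype classifier with the LVQ1 update rule."""
        def __init__(self, prototypes, labels, lr=0.05):
            self.w = np.array(prototypes, dtype=float)
            self.y = np.array(labels)
            self.lr = lr

        def predict(self, x):
            return self.y[np.argmin(np.linalg.norm(self.w - x, axis=1))]

        def fit_step(self, x, label):
            i = np.argmin(np.linalg.norm(self.w - x, axis=1))
            sign = 1.0 if self.y[i] == label else -1.0  # attract or repel
            self.w[i] += sign * self.lr * (x - self.w[i])

    # hypothetical patterns: 0 = urban, 1 = suburban, 2 = highway;
    # features: [mean speed (km/h), speed standard deviation]
    clf = LVQ1(prototypes=[[15, 8], [45, 12], [90, 6]], labels=[0, 1, 2])
    mode = clf.predict(np.array([22.0, 9.5]))  # controller switches on this
    ```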

  21. Study of Measurement Strategies of Geometric Deviation of the Position of the Threaded Holes

    NASA Astrophysics Data System (ADS)

    Drbul, Mário; Martikan, Pavol; Sajgalik, Michal; Czan, Andrej; Broncek, Jozef; Babik, Ondrej

    2017-12-01

    Verification of products and quality control is an integral part of the current production process. In terms of functional requirements and product interoperability, it is necessary to analyze their dimensional and also geometric specifications. Threaded holes, a substantial part of detachable screw connections with a broad presence in engineering products, are among the verified elements. This paper deals with the analysis of measurement strategies for verifying the geometric deviation of the position of threaded holes by the indirect method of measuring threaded pins, where the application of different measurement strategies can affect the result of product verification.

  22. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

    A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of use of the test environment, software verification tools can be applied. The tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus on increasing the number of software test tools and on cost-effectiveness assessment.

  23. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most industry-operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure the coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamic loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  24. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  25. Optimal Verification of Entangled States with Local Measurements

    NASA Astrophysics Data System (ADS)

    Pallister, Sam; Linden, Noah; Montanaro, Ashley

    2018-04-01

    Consider the task of verifying that a given quantum device, designed to produce a particular entangled state, does indeed produce that state. One natural approach would be to characterize the output state by quantum state tomography, or alternatively, to perform some kind of Bell test, tailored to the state of interest. We show here that neither approach is optimal among local verification strategies for 2-qubit states. We find the optimal strategy in this case and show that quadratically fewer total measurements are needed to verify to within a given fidelity than in published results for quantum state tomography, Bell test, or fidelity estimation protocols. We also give efficient verification protocols for any stabilizer state. Additionally, we show that requiring that the strategy be constructed from local, nonadaptive, and noncollective measurements only incurs a constant-factor penalty over a strategy without these restrictions.
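
    For context, the arithmetic behind such measurement-count claims (hedged; the gap Δ below is strategy dependent and the notation is generic, not the paper's): if every state with infidelity at least ε passes a single test round with probability at most 1 − Δε, then after n rounds the residual confidence δ obeys

    ```latex
    (1 - \Delta \epsilon)^n \le \delta
    \quad \Longrightarrow \quad
    n \gtrsim \frac{\ln (1/\delta)}{\Delta \, \epsilon} ,
    ```

    i.e. the cost scales as 1/ε rather than the 1/ε² typical of tomography or fidelity estimation, which is the quadratic saving the abstract refers to.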

  26. Cooperative Networked Control of Dynamical Peer-to-Peer Vehicle Systems

    DTIC Science & Technology

    2007-12-28

    Research areas included dynamic deployment and task allocation; verification and hybrid systems; and information management for cooperative control. The program produced significant advances in the theory of hybrid input-output automata (HIOA), along with decidability results for discrete, switched, and hybrid systems.

  27. Exploration of Uncertainty in Glacier Modelling

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    1999-01-01

    There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.

  28. Jitter Test Program and On-Orbit Mitigation Strategies for Solar Dynamics Observatory

    NASA Technical Reports Server (NTRS)

    Liu, Kuo-Chia; Kenney, Thomas; Maghami, Peiman; Mule, Pete; Blaurock, Carl; Haile, William B.

    2007-01-01

    The Solar Dynamics Observatory (SDO) aims to study the Sun's influence on the Earth; the source, storage, and release of the solar energy; and the interior structure of the Sun. During science observations, the jitter stability at the instrument focal plane must be maintained to within a fraction of an arcsecond for two of the SDO instruments. To meet these stringent requirements, a significant amount of analysis and test effort has been devoted to predicting the jitter induced by various disturbance sources. This paper presents an overview of the SDO jitter analysis approach and the test effort performed to date. It emphasizes the disturbance modeling, verification, calibration, and validation of the high gain antenna stepping mechanism and the reaction wheels, which are the two largest jitter contributors. This paper also describes on-orbit mitigation strategies to protect the system from analysis model uncertainties. Lessons learned from the SDO jitter analyses and test programs are included in the paper to share the knowledge gained with the community.

  29. Function and dynamics of aptamers: A case study on the malachite green aptamer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Tianjiao

    Aptamers are short single-stranded nucleic acids that can bind to their targets with high specificity and high affinity. To study aptamer function and dynamics, the malachite green aptamer was chosen as a model. Malachite green (MG) bleaching, in which an OH- attacks the central carbon (C1) of MG, was inhibited in the presence of the malachite green aptamer (MGA). The inhibition of MG bleaching by MGA could be reversed by an antisense oligonucleotide (AS) complementary to the MGA binding pocket. Computational cavity analysis of the NMR structure of the MGA-MG complex predicted that the OH- is sterically excluded from the C1 of MG. The prediction was confirmed experimentally using variants of the MGA with changes in the MG binding pocket. This work shows that molecular reactivity can be reversibly regulated by an aptamer-AS pair based on steric hindrance. In addition to demonstrating that aptamers can control molecular reactivity, aptamer dynamics was studied with a strategy combining molecular dynamics (MD) simulation and experimental verification. MD simulation predicted that the MG binding pocket of the MGA is largely pre-organized and that binding of MG involves reorganization of the pocket and a simultaneous twisting of the MGA terminal stems around the pocket. MD simulation also provided a 3D-structure model of unoccupied MGA that has not yet been obtained by biophysical measurements. These predictions were consistent with biochemical and biophysical measurements of the MGA-MG interaction, including RNase I footprinting, melting curves, and thermodynamic and kinetic constant measurements. This work shows that MD simulation can be used to extend our understanding of the dynamics of aptamer-target interaction that is not evident from static 3D-structures. To conclude, I have developed a novel concept to control molecular reactivity by an aptamer based on steric protection and a strategy to study the dynamics of aptamer-target interaction by combining MD simulation and experimental verification. The former has potential application in controlling metabolic reactions and protein modifications by small reactants, and the latter may serve as a general approach to study the dynamics of aptamer-target interaction for new insights into mechanisms of aptamer-target recognition.

  30. Integrity Verification for Multiple Data Copies in Cloud Storage Based on Spatiotemporal Chaos

    NASA Astrophysics Data System (ADS)

    Long, Min; Li, You; Peng, Fei

    Aiming to strike a balance between the security, efficiency and availability of data verification in cloud storage, a novel integrity verification scheme based on spatiotemporal chaos is proposed for multiple data copies. Spatiotemporal chaos is implemented for node calculation of the binary tree, and the location of the data in the cloud is verified. Meanwhile, dynamic operations can be performed on the data. Furthermore, blind information is used to prevent a third-party auditor (TPA) from leaking the users’ data privacy in a public auditing process. Performance analysis and discussion indicate that the scheme is secure and efficient, and that it supports dynamic operations and the integrity verification of multiple copies of data. It has great potential to be implemented in cloud storage services.
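
    A hedged sketch of the binary-hash-tree mechanics underlying such verification schemes (Python; SHA-256 stands in for the paper's spatiotemporal-chaos node function, which is not reproduced here):

    ```python
    import hashlib

    def h(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def root(blocks):
        """Fold leaf hashes pairwise up to a single root value."""
        level = [h(b) for b in blocks]
        while len(level) > 1:
            if len(level) % 2:
                level.append(level[-1])       # duplicate last node if odd
            level = [h(level[i] + level[i + 1])
                     for i in range(0, len(level), 2)]
        return level[0]

    copy_a = [b"block0", b"block1", b"block2"]  # stored copy
    copy_b = [b"block0", b"blockX", b"block2"]  # tampered copy
    assert root(copy_a) != root(copy_b)         # any change shifts the root
    ```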

  31. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING AND CODING VERIFICATION (HAND ENTRY) (UA-D-14.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for coding and coding verification of hand-entered data. It applies to the coding of all physical forms, especially those coded by hand. The strategy was developed for use in the Arizona NHEXAS project and the "Border" st...

  32. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties which require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  33. Seismic design verification of LMFBR structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1977-07-01

    The report provides an assessment of the seismic design verification procedures currently used for nuclear power plant structures, a comparison of dynamic test methods available, and conclusions and recommendations for future LMFBR structures.

  34. A dynamic human water and electrolyte balance model for verification and optimization of life support systems in space flight applications

    NASA Astrophysics Data System (ADS)

    Hager, P.; Czupalla, M.; Walter, U.

    2010-11-01

    In this paper we report on the development of a dynamic MATLAB SIMULINK® model for the water and electrolyte balance inside the human body. This model is part of an environmentally sensitive dynamic human model for the optimization and verification of environmental control and life support systems (ECLSS) in space flight applications. An ECLSS provides all vital supplies for supporting human life on board a spacecraft. As human space flight today focuses on medium- to long-term missions, the strategy in ECLSS is shifting to closed-loop systems, for which dynamic stability and function over long durations are essential. However, the only evaluation and rating methods for ECLSS up to now have been either expensive trial-and-error breadboarding strategies or static and semi-dynamic simulations. In order to overcome this mismatch, the Exploration Group at Technische Universität München (TUM) is developing a dynamic environmental simulation, the "Virtual Habitat" (V-HAB). The central element of this simulation is the dynamic and environmentally sensitive human model. The water subsystem simulation of the human model discussed in this paper is of vital importance for the efficiency of possible ECLSS optimizations, as an over- or under-scaled water subsystem would have an adverse effect on the overall mass budget. On the other hand, water has a pivotal role in the human organism: it accounts for about 60% of the total body mass, is a reactant and a product of numerous metabolic reactions, serves as a transport medium for solutes and, due to its high evaporation enthalpy, provides the most potent medium for heat load dissipation. In a system engineering approach, the human water balance was worked out by simulating the human body's subsystems and their interactions. The body fluids were assumed to reside in three compartments: blood plasma, interstitial fluid and intracellular fluid. In addition, the active and passive transport of water and solutes between those compartments was modeled dynamically. A kidney model regulates the electrolyte concentration in body fluids (osmolality) within narrow confines, and a thirst mechanism models the urge to ingest water. A controlled exchange of water and electrolytes with other human subsystems, as well as with the environment, is implemented. Finally, the changes in body composition due to muscle growth are accounted for. The outcome is a dynamic water and electrolyte balance that is capable of representing body reactions such as thirst and headaches, as well as heat stroke and collapse, as responses to work load and environment.
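
    A minimal sketch of the compartment bookkeeping such a model performs (Python; the volumes, exchange coefficients, and flow rates are illustrative assumptions, not V-HAB values):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    K_PI, K_IC = 0.8, 0.5  # assumed exchange coefficients, 1/h

    def rhs(t, v, intake=0.06, urine=0.05, evap=0.03):  # assumed flows, L/h
        plasma, interstitial, intracellular = v
        # exchange driven by deviation from nominal volume fractions
        q_pi = K_PI * (plasma / 3.0 - interstitial / 12.0)
        q_ic = K_IC * (interstitial / 12.0 - intracellular / 27.0)
        return [intake - urine - q_pi,   # blood plasma (~3 L nominal)
                q_pi - q_ic - evap,      # interstitial fluid (~12 L nominal)
                q_ic]                    # intracellular fluid (~27 L nominal)

    sol = solve_ivp(rhs, (0.0, 24.0), [3.0, 12.0, 27.0], max_step=0.1)
    print(sol.y[:, -1])  # compartment volumes (L) after 24 h
    ```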

  35. INF and IAEA: A comparative analysis of verification strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scheinman, L.; Kratzer, M.

    1992-07-01

    This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities.

  36. Dynamic Calibration and Verification Device of Measurement System for Dynamic Characteristic Coefficients of Sliding Bearing

    PubMed Central

    Chen, Runlin; Wei, Yangyang; Shi, Zhaoyang; Yuan, Xiaoyang

    2016-01-01

    The identification accuracy of dynamic characteristic coefficients is difficult to guarantee because of the errors of the measurement system itself. A novel dynamic calibration method for measurement systems for dynamic characteristic coefficients is proposed in this paper to eliminate the errors of the measurement system itself. Compared with the suspended-mass calibration method, this calibration method is different in that the verification device is a spring-mass system, which can simulate the dynamic characteristics of a sliding bearing. The verification device was built, and the calibration experiment was implemented over a wide frequency range, in which the bearing stiffness was simulated by disc springs. The experimental results show that the amplitude errors of this measurement system are small in the frequency range of 10 Hz–100 Hz, and the phase errors increase with increasing frequency. A simulated experiment on the identification of dynamic characteristic coefficients in the frequency range of 10 Hz–30 Hz preliminarily verifies that the calibration data in this range can support dynamic characteristics tests of sliding bearings well. Bearing experiments over greater frequency ranges require higher manufacturing and installation precision of the calibration device. Besides, the processes of the calibration experiments should be improved. PMID:27483283

  37. Digital data storage systems, computers, and data verification methods

    DOEpatents

    Groeneveld, Bennett J.; Austad, Wayne E.; Walsh, Stuart C.; Herring, Catherine A.

    2005-12-27

    Digital data storage systems, computers, and data verification methods are provided. According to a first aspect of the invention, a computer includes an interface adapted to couple with a dynamic database; and processing circuitry configured to provide a first hash from digital data stored within a portion of the dynamic database at an initial moment in time, to provide a second hash from digital data stored within the portion of the dynamic database at a subsequent moment in time, and to compare the first hash and the second hash.
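
    The hash-compare idea described in this patent abstract is straightforward to illustrate (Python sketch; the table and query are hypothetical):

    ```python
    import hashlib
    import sqlite3

    def portion_hash(conn, query="SELECT * FROM records ORDER BY id"):
        digest = hashlib.sha256()
        for row in conn.execute(query):     # hash a defined portion
            digest.update(repr(row).encode())
        return digest.hexdigest()

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, val TEXT)")
    conn.execute("INSERT INTO records (val) VALUES ('a')")
    first = portion_hash(conn)                    # initial moment in time
    conn.execute("UPDATE records SET val = 'b'")  # dynamic database changes
    second = portion_hash(conn)                   # subsequent moment in time
    print("unchanged" if first == second else "portion was modified")
    ```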

  38. Tethered satellite system dynamics and control review panel and related activities, phase 3

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Two major tests of the Tethered Satellite System (TSS) engineering and flight units were conducted to demonstrate the functionality of the hardware and software. Deficiencies in the hardware/software integration tests (HSIT) led to a recommendation for more testing to be performed. Selected problem areas of tether dynamics were analyzed, including verification of the severity of skip rope oscillations, verification or comparison runs to explore dynamic phenomena observed in other simulations, and data generation runs to explore the performance of the time domain and frequency domain skip rope observers.

  39. Gaia challenging performances verification: combination of spacecraft models and test results

    NASA Astrophysics Data System (ADS)

    Ecale, Eric; Faye, Frédéric; Chassat, François

    2016-08-01

    To achieve the ambitious scientific objectives of the Gaia mission, extremely stringent performance requirements have been given to the spacecraft contractor (Airbus Defence and Space). For a set of those key performance requirements (e.g. end-of-mission parallax, maximum detectable magnitude, maximum sky density or attitude control system stability), this paper describes how they are engineered during the whole spacecraft development process, with a focus on the end-to-end performance verification. As far as possible, performances are usually verified by end-to-end tests on ground (i.e. before launch). However, the challenging Gaia requirements are not verifiable by such a strategy, principally because no test facility exists to reproduce the expected flight conditions. The Gaia performance verification strategy is therefore based on a mix of analyses (based on spacecraft models) and tests (used to directly feed the models or to correlate them). Emphasis is placed on how to maximize the test contribution to performance verification while keeping the tests feasible within an affordable effort. In particular, the paper highlights the contribution of the Gaia Payload Module thermal vacuum test to the performance verification before launch. Finally, an overview of the in-flight payload calibration and in-flight performance verification is provided.

  40. Cleaning and Cleanliness Verification Techniques for Mars Returned Sample Handling

    NASA Technical Reports Server (NTRS)

    Mickelson, E. T.; Lindstrom, D. J.; Allton, J. H.; Hittle, J. D.

    2002-01-01

    Precision cleaning and cleanliness verification techniques are examined as a subset of a comprehensive contamination control strategy for a Mars sample return mission. Additional information is contained in the original extended abstract.

  41. Common aero vehicle autonomous reentry trajectory optimization satisfying waypoint and no-fly zone constraints

    NASA Astrophysics Data System (ADS)

    Jorris, Timothy R.

    2007-12-01

    To support the Air Force's Global Reach concept, a Common Aero Vehicle is being designed to support the Global Strike mission. "Waypoints" are specified for reconnaissance or multiple payload deployments, and "no-fly zones" are specified for geopolitical restrictions or threat avoidance. Due to time-critical targets and multiple-scenario analysis, an autonomous solution is preferred over a time-intensive, manually iterative one. Thus, a real-time or near real-time autonomous trajectory optimization technique is presented to minimize the flight time, satisfy terminal and intermediate constraints, and remain within the specified vehicle heating and control limitations. This research uses the Hypersonic Cruise Vehicle (HCV) as a simplified two-dimensional platform to compare multiple solution techniques. The solution techniques include a unique geometric approach developed herein, a derived analytical dynamic optimization technique, and a rapidly emerging collocation numerical approach. The latter numerical technique is a direct solution method involving discretization then dualization, with pseudospectral methods and nonlinear programming used to converge to the optimal solution. This numerical approach is applied to the Common Aero Vehicle (CAV) as the test platform for the full three-dimensional reentry trajectory optimization problem. The culmination of this research is the verification of the optimality of the proposed numerical technique, as shown for both the two-dimensional and three-dimensional models. Additionally, user implementation strategies are presented to improve accuracy and enhance solution convergence. Thus, the contributions of this research are the geometric approach, the user implementation strategies, and the determination and verification of a numerical solution technique for the optimal reentry trajectory problem that minimizes time to target while satisfying vehicle dynamics and control limitations, and heating, waypoint, and no-fly zone constraints.

  42. Dynamic analysis for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.

    1972-01-01

    Two approaches used for determining the modes and frequencies of space shuttle structures are discussed. The first method, direct numerical analysis, involves finite element mathematical modeling of the space shuttle structure so that computer programs for dynamic structural analysis can be used. The second method utilizes modal-coupling techniques, with experimental verification performed by vibrating only spacecraft components and deducing the modes and frequencies of the complete vehicle from the results obtained in the component tests.

  43. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    NASA Astrophysics Data System (ADS)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP's views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP's graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java-based GUI, a Microsoft Visual Studio GUI, and an Eclipse-based GUI whose development is in progress.

  44. Dynamic Modeling of the SMAP Rotating Flexible Antenna

    NASA Technical Reports Server (NTRS)

    Nayeri, Reza D.

    2012-01-01

    Dynamic model development in ADAMS for the SMAP project is explained. The main objectives of the dynamic models are pointing-error assessment and verification of the control/stability margin requirements.

  45. Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System

    NASA Astrophysics Data System (ADS)

    Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li

    An ultrasonic flaw detector with a remote-control interface is investigated and an automatic verification system for it is developed. By building the protocol instruction set and the data-analysis method database in the system software using Extensible Markup Language (XML), the design becomes controllable and the diversity of unreleased device interfaces and protocols is handled. By cascading a signal generator with a fixed attenuator, a dynamic error-compensation method is proposed that performs the role of the fixed attenuator in traditional verification and improves the accuracy of the verification results. Operating results of the automatic verification system confirm the feasibility of the hardware and software architecture design and the correctness of the analysis method, while removing the cumbersome operations of the traditional verification process and reducing the labor intensity for test personnel.

  46. FIR signature verification system characterizing dynamics of handwriting features

    NASA Astrophysics Data System (ADS)

    Thumwarin, Pitak; Pernwong, Jitawat; Matsuura, Takenobu

    2013-12-01

    This paper proposes an online signature verification method based on a finite impulse response (FIR) system characterizing the time-frequency characteristics of dynamic handwriting features. First, the barycenter determined from both the center point of the signature and two adjacent pen-point positions in the signing process, instead of a single pen-point position, is used to reduce the fluctuation of handwriting motion. In this paper, among the available dynamic handwriting features, motion pressure and area pressure are employed to investigate handwriting behavior. The stable dynamic handwriting features can thus be described by the relation between the time-frequency characteristics of the dynamic handwriting features. In this study, that relation is represented by an FIR system with the wavelet coefficients of the dynamic handwriting features as both input and output of the system. The impulse response of the FIR system is used as the individual feature for a particular signature. In short, a signature can be verified by evaluating the difference between the impulse responses of the FIR systems for a reference signature and the signature to be verified. The signature verification experiments in this paper were conducted using the SUBCORPUS MCYT-100 signature database, consisting of 5,000 signatures from 100 signers. The proposed method yielded an equal error rate (EER) of 3.21% on skilled forgeries.
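
    A hedged sketch of the FIR-identification step (Python; the wavelet preprocessing and the actual motion-pressure/area-pressure features are omitted, and the sequences below are synthetic):

    ```python
    import numpy as np

    def fir_feature(x, y, taps=16):
        """Least-squares estimate of an impulse response h with x * h ~ y."""
        X = np.column_stack([np.concatenate([np.zeros(k), x[:len(x) - k]])
                             for k in range(taps)])
        h, *_ = np.linalg.lstsq(X, y, rcond=None)
        return h

    rng = np.random.default_rng(0)
    x = rng.standard_normal(256)                # stand-in input sequence
    y = np.convolve(x, [0.5, 0.3, 0.2])[:256]   # stand-in output sequence
    h_ref = fir_feature(x, y)                   # reference signature feature
    h_qst = fir_feature(x, y + 0.01 * rng.standard_normal(256))
    score = np.linalg.norm(h_ref - h_qst)       # small: accept, large: reject
    ```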

  47. Loads and Structural Dynamics Requirements for Spaceflight Hardware

    NASA Technical Reports Server (NTRS)

    Schultz, Kenneth P.

    2011-01-01

    The purpose of this document is to establish requirements relating to the loads and structural dynamics technical discipline for NASA and commercial spaceflight launch vehicle and spacecraft hardware. Requirements are defined for the development of structural design loads, and recommendations regarding methodologies and practices for the conduct of load analyses are provided. As such, this document represents an implementation of NASA STD-5002. Requirements are also defined for structural mathematical model development and verification to ensure sufficient accuracy of predicted responses. Finally, requirements for model/data delivery and exchange are specified to facilitate interactions between Launch Vehicle Providers (LVPs), Spacecraft Providers (SCPs), and the NASA Technical Authority (TA) providing insight/oversight and serving in the Independent Verification and Validation role. In addition to the analysis-related requirements described above, a set of requirements is established concerning coupling phenomena or other interaction between structural dynamics and aerodynamic environments or control or propulsion system elements. Such requirements may reasonably be considered structure or control system design criteria, since good engineering practice dictates consideration of and/or elimination of the identified conditions in the development of those subsystems. The requirements are included here, however, to ensure that such considerations are captured in the design space for launch vehicles (LV), spacecraft (SC) and the Launch Abort Vehicle (LAV). The requirements in this document are focused on analyses to be performed to develop data needed to support structural verification. As described in JSC 65828, Structural Design Requirements and Factors of Safety for Spaceflight Hardware, implementation of the structural verification requirements is expected to be described in a Structural Verification Plan (SVP), which should describe the verification of each structural item for the applicable requirements. The requirement for and expected contents of the SVP are defined in JSC 65828. The SVP may also document unique verifications that meet or exceed these requirements with Technical Authority approval.

  48. Developing a NASA strategy for the verification of large space telescope observatories

    NASA Astrophysics Data System (ADS)

    Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie

    2006-06-01

    In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.

  49. A tuberculosis biomarker database: the key to novel TB diagnostics.

    PubMed

    Yerlikaya, Seda; Broger, Tobias; MacLean, Emily; Pai, Madhukar; Denkinger, Claudia M

    2017-03-01

    New diagnostic innovations for tuberculosis (TB), including point-of-care solutions, are critical to reach the goals of the End TB Strategy. However, despite decades of research, numerous reports on new biomarker candidates, and significant investment, no well-performing, simple and rapid TB diagnostic test is yet available on the market, and the search for accurate, non-DNA biomarkers remains a priority. To help overcome this ‘biomarker pipeline problem’, FIND and partners are working on the development of a well-curated and user-friendly TB biomarker database. The web-based database will enable the dynamic tracking of evidence surrounding biomarker candidates in relation to target product profiles (TPPs) for needed TB diagnostics. It will be able to accommodate raw datasets and facilitate the verification of promising biomarker candidates and the identification of novel biomarker combinations. As such, the database will simplify data and knowledge sharing, empower collaboration, help in the coordination of efforts and allocation of resources, streamline the verification and validation of biomarker candidates, and ultimately lead to an accelerated translation into clinically useful tools. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  11. Inverse dynamics of underactuated mechanical systems: A simple case study and experimental verification

    NASA Astrophysics Data System (ADS)

    Blajer, W.; Dziewiecki, K.; Kołodziejczyk, K.; Mazur, Z.

    2011-05-01

    Underactuated systems have fewer control inputs than degrees of freedom, m < n. Determining an input control strategy that forces such a system to complete a set of m specified motion tasks is challenging, and the existence of an explicit solution is conditioned on the differential flatness of the problem. The flatness-based solution denotes that all the 2n states and m control inputs can be algebraically expressed in terms of the m specified outputs and their time derivatives up to a certain order, which is in practice attainable only for simple systems. In this contribution the problem is posed in a more practical way as a set of index-three differential-algebraic equations, and the solution is obtained numerically. The formulation is then illustrated by a two-degree-of-freedom underactuated system composed of two rotating discs connected by a torsional spring, in which the pre-specified motion of one of the discs is actuated by the torque applied to the other disc, n = 2 and m = 1. Experimental verification of the inverse simulation control methodology is reported.
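
    For this two-disc example the flatness-based solution can in fact be written down explicitly: with inertias J1, J2 and torsional stiffness k, the equations of motion J1*q1'' = u - k*(q1 - q2) and J2*q2'' = k*(q1 - q2) give q1 = q2 + (J2/k)*q2'', so the input torque u follows from the specified output q2 and its derivatives up to fourth order. A minimal sketch with assumed parameter values (not those of the paper's test rig):

        import sympy as sp

        t = sp.symbols('t')
        J1, J2, k = 0.1, 0.2, 5.0             # assumed inertias [kg m^2], stiffness [N m/rad]

        q2 = sp.sin(t)                        # specified motion of the unactuated disc
        q1 = q2 + J2 * sp.diff(q2, t, 2) / k  # second equation solved for q1
        u = J1 * sp.diff(q1, t, 2) + k * (q1 - q2)   # first equation gives the torque

        print(sp.simplify(u))                 # u(t) involves q2 derivatives up to order 4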

  12. Status on the Verification of Combustion Stability for the J-2X Engine Thrust Chamber Assembly

    NASA Technical Reports Server (NTRS)

    Casiano, Matthew; Hinerman, Tim; Kenny, R. Jeremy; Hulka, Jim; Barnett, Greg; Dodd, Fred; Martin, Tom

    2013-01-01

    Development is underway of the J-2X engine, a liquid oxygen/liquid hydrogen rocket engine for use on the Space Launch System. Engine E10001 began hot-fire testing in June 2011, and testing will continue with subsequent engines. The J-2X engine main combustion chamber contains both acoustic cavities and baffles. These stability aids are intended to dampen the acoustics in the main combustion chamber. Verification of the engine thrust chamber stability is determined primarily by examining experimental data using a dynamic stability rating technique; however, additional requirements were included to guard against any spontaneous instability or rough combustion. Startup and shutdown chug oscillations are also characterized for this engine. This paper details the stability requirements and verification, including low- and high-frequency dynamics, a discussion of sensor selection and sensor port dynamics, and the process developed to assess combustion stability. A status on the stability results is also provided and discussed.

  13. Self-Verification of Ability through Biased Performance Memory.

    ERIC Educational Resources Information Center

    Karabenick, Stuart A.; LeBlanc, Daniel

    Evidence points to a pervasive tendency for persons to behave in ways that maintain their existing cognitive structures. One strategy by which this self-verification is made more probable involves information processing. Through attention, encoding and retrieval, and the interpretation of events, persons process information so that self-confirmatory…

  14. Managing complexity in simulations of land surface and near-surface processes

    DOE PAGES

    Coon, Ethan T.; Moulton, J. David; Painter, Scott L.

    2016-01-12

    Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
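
    A toy illustration of the tree-plus-dependency-graph idea (this is a sketch of the design pattern, not the Arcos API): leaf processes evaluate individual equations from named inputs, and a dependency check ensures a variable is computed only after everything it depends on.

        class Leaf:
            """One process: evaluates a single equation from named input variables."""
            def __init__(self, name, deps, fn):
                self.name, self.deps, self.fn = name, deps, fn

            def evaluate(self, state):
                return self.fn(*(state[d] for d in self.deps))

        def run(leaves, state):
            """Weakly coupled evaluation: visit leaves in dependency order."""
            done, pending = set(state), list(leaves)
            while pending:
                ready = [p for p in pending if all(d in done for d in p.deps)]
                if not ready:
                    raise ValueError("cycle detected: needs a strong-coupling node")
                for p in ready:
                    state[p.name] = p.evaluate(state)
                    done.add(p.name)
                    pending.remove(p)
            return state

        # hypothetical land-surface fragment: snowmelt feeds infiltration
        leaves = [
            Leaf("infiltration", ["rainfall", "snowmelt"], lambda r, s: 0.6 * (r + s)),
            Leaf("snowmelt", ["air_temp"], lambda T: max(0.0, 0.2 * (T - 273.15))),
        ]
        print(run(leaves, {"rainfall": 2.0, "air_temp": 275.0}))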

  15. Definition of ground test for Large Space Structure (LSS) control verification

    NASA Technical Reports Server (NTRS)

    Waites, H. B.; Doane, G. B., III; Tollison, D. K.

    1984-01-01

    An overview for the definition of a ground test for the verification of Large Space Structure (LSS) control is given. The definition contains information on the description of the LSS ground verification experiment, the project management scheme, the design, development, fabrication and checkout of the subsystems, the systems engineering and integration, the hardware subsystems, the software, and a summary which includes future LSS ground test plans. Upon completion of these items, NASA/Marshall Space Flight Center will have an LSS ground test facility which will provide sufficient data on dynamics and control verification of LSS so that LSS flight system operations can be reasonably ensured.

  16. High-entropy alloys in hexagonal close-packed structure

    DOE PAGES

    Gao, Michael C.; Zhang, B.; Guo, S. M.; ...

    2015-08-28

    The microstructures and properties of high-entropy alloys (HEAs) based on the face-centered cubic and body-centered cubic structures have been studied extensively in the literature, but reports on HEAs in the hexagonal close-packed (HCP) structure are very limited. Using an efficient strategy combining phase diagram inspection, CALPHAD modeling, and ab initio molecular dynamics simulations, a variety of new compositions are suggested that may hold great potential for forming single-phase HCP HEAs comprising rare-earth elements and transition metals, respectively. Lastly, experimental verification was carried out on CoFeReRu and CoReRuV using X-ray diffraction, scanning electron microscopy, and energy-dispersive spectroscopy.

  17. Strategies for Validation Testing of Ground Systems

    NASA Technical Reports Server (NTRS)

    Annis, Tammy; Sowards, Stephanie

    2009-01-01

    In order to accomplish the full Vision for Space Exploration announced by former President George W. Bush in 2004, NASA will have to develop a new space transportation system and supporting infrastructure. The main portion of this supporting infrastructure will reside at the Kennedy Space Center (KSC) in Florida and will either be newly developed or a modification of existing vehicle processing and launch facilities, including Ground Support Equipment (GSE). This type of large-scale launch site development is unprecedented since the time of the Apollo Program. In order to accomplish this successfully within the limited budget and schedule constraints, a combination of traditional and innovative strategies for Verification and Validation (V&V) has been developed. The core of these strategies consists of a building-block approach to V&V, starting with component V&V and ending with a comprehensive end-to-end validation test of the complete launch site, called a Ground Element Integration Test (GEIT). This paper will outline these strategies and provide the high-level planning for meeting the challenges of implementing V&V on a large-scale development program. KEY WORDS: Systems, Elements, Subsystem, Integration Test, Ground Systems, Ground Support Equipment, Component, End Item, Test and Verification Requirements (TVR), Verification Requirements (VR)

  18. Integrating Model-Based Verification into Software Design Education

    ERIC Educational Resources Information Center

    Yilmaz, Levent; Wang, Shuo

    2005-01-01

    Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…

  19. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  20. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment.

    PubMed

    Fuangrod, Todsaporn; Woodruff, Henry C; van Uytven, Eric; McCurdy, Boyd M C; Kuncic, Zdenka; O'Connor, Daryl J; Greer, Peter B

    2013-09-01

    To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
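
    The cumulative-signal check lends itself to a compact schematic (a simplification of the published system; the frame data, error model, and 5% tolerance below are invented for illustration):

        import numpy as np

        rng = np.random.default_rng(0)
        pred = rng.random((100, 64, 64))      # stand-in for model-predicted EPID frames
        meas = pred * 1.10 + rng.normal(0.0, 0.01, pred.shape)   # simulated 10% MU error

        cum_pred = cum_meas = 0.0
        for k, (p, m) in enumerate(zip(pred, meas), start=1):
            cum_pred += p.sum()               # running total of predicted signal
            cum_meas += m.sum()               # running total of measured signal
            ratio = cum_meas / cum_pred
            if abs(ratio - 1.0) > 0.05:       # invented gross-error tolerance
                print(f"gross delivery error flagged at frame {k}: ratio {ratio:.3f}")
                break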

  1. SU-E-I-56: Scan Angle Reduction for a Limited-Angle Intrafraction Verification (LIVE) System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, L; Zhang, Y; Yin, F

    Purpose: To develop a novel adaptive reconstruction strategy to further reduce the scanning angle required by the limited-angle intrafraction verification (LIVE) system for intrafraction verification. Methods: LIVE acquires limited-angle MV projections from the exit fluence of the arc treatment beam or during gantry rotation between static beams. Orthogonal limited-angle kV projections are also acquired simultaneously to provide additional information. LIVE considers the on-board 4D-CBCT images as a deformation of the prior 4D-CT images, and solves the deformation field based on deformation models and a data fidelity constraint. LIVE reaches a checkpoint after a limited-angle scan, and reconstructs 4D-CBCT for intrafraction verification at the checkpoint. In the adaptive reconstruction strategy, a larger scanning angle of 30° is used for the first checkpoint, and smaller scanning angles of 15° are used for subsequent checkpoints. The onboard images reconstructed at the previous adjacent checkpoint are used as the prior images for reconstruction at the current checkpoint. As the algorithm only needs to reconstruct the small deformation occurring between adjacent checkpoints, projections from a smaller scan angle provide enough information for the reconstruction. XCAT was used to simulate tumor motion baseline drift of 2 mm along the sup-inf direction at every subsequent checkpoint, which are 15° apart. The adaptive reconstruction strategy was used to reconstruct the images at each checkpoint using orthogonal 15° kV and MV projections. Results: Results showed that LIVE reconstructed the tumor volumes accurately using orthogonal 15° kV-MV projections. Volume percentage differences (VPDs) were within 5% and center of mass shifts (COMS) were within 1 mm for reconstruction at all checkpoints. Conclusion: It is feasible to use an adaptive reconstruction strategy to further reduce the scan angle needed by LIVE to allow faster and more frequent intrafraction verification to minimize treatment errors in lung cancer treatments. Grant from Varian Medical Systems.
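
    The chaining of priors across checkpoints can be summarized in a few lines (a schematic only; the reconstruct() stub stands in for LIVE's deformation-based limited-angle solve):

        def reconstruct(prior, data, scan_angle):
            # Stub for the deformation-model solve: only the small change since the
            # previous checkpoint must be estimated, so a small angle suffices.
            return prior + 0.5 * (data - prior)          # placeholder update rule

        def adaptive_live(planning_image, checkpoint_data):
            prior = planning_image                       # prior for the first checkpoint
            for k, data in enumerate(checkpoint_data):
                angle = 30 if k == 0 else 15             # degrees scanned per checkpoint
                volume = reconstruct(prior, data, angle)
                print(f"checkpoint {k}: {angle} deg scan, estimate {volume:.2f}")
                prior = volume                           # result seeds the next solve

        adaptive_live(0.0, [1.0, 1.2, 1.4])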

  2. The SCEC/USGS dynamic earthquake rupture code verification exercise

    USGS Publications Warehouse

    Harris, R.A.; Barall, M.; Archuleta, R.; Dunham, E.; Aagaard, Brad T.; Ampuero, J.-P.; Bhat, H.; Cruz-Atienza, Victor M.; Dalguer, L.; Dawson, P.; Day, S.; Duan, B.; Ely, G.; Kaneko, Y.; Kase, Y.; Lapusta, N.; Liu, Yajing; Ma, S.; Oglesby, D.; Olsen, K.; Pitarka, A.; Song, S.; Templeton, E.

    2009-01-01

    Numerical simulations of earthquake rupture dynamics are now common, yet it has been difficult to test the validity of these simulations because there have been few field observations and no analytic solutions with which to compare the results. This paper describes the Southern California Earthquake Center/U.S. Geological Survey (SCEC/USGS) Dynamic Earthquake Rupture Code Verification Exercise, where codes that simulate spontaneous rupture dynamics in three dimensions are evaluated and the results produced by these codes are compared using Web-based tools. This is the first time that a broad and rigorous examination of numerous spontaneous rupture codes has been performed—a significant advance in this science. The automated process developed to attain this achievement provides for a future where testing of codes is easily accomplished. Scientists who use computer simulations to understand earthquakes utilize a range of techniques. Most of these assume that earthquakes are caused by slip at depth on faults in the Earth, but hereafter the strategies vary. Among the methods used in earthquake mechanics studies are kinematic approaches and dynamic approaches. The kinematic approach uses a computer code that prescribes the spatial and temporal evolution of slip on the causative fault (or faults). These types of simulations are very helpful, especially since they can be used in seismic data inversions to relate the ground motions recorded in the field to slip on the fault(s) at depth. However, these kinematic solutions generally provide no insight into the physics driving the fault slip or information about why the involved fault(s) slipped that much (or that little). In other words, these kinematic solutions may lack information about the physical dynamics of earthquake rupture that will be most helpful in forecasting future events. To help address this issue, some researchers use computer codes to numerically simulate earthquakes and construct dynamic, spontaneous rupture (hereafter called “spontaneous rupture”) solutions. For these types of numerical simulations, rather than prescribing the slip function at each location on the fault(s), just the friction constitutive properties and initial stress conditions are prescribed. The subsequent stresses and fault slip spontaneously evolve over time as part of the elasto-dynamic solution. Therefore, spontaneous rupture computer simulations of earthquakes allow us to include everything that we know, or think that we know, about earthquake dynamics and to test these ideas against earthquake observations.

  3. Verification of Java Programs using Symbolic Execution and Invariant Generation

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g., boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.
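
    The flavor of the invariant-checking step can be conveyed with an inductive-invariant check in the z3 SMT solver (my own toy loop and invariants, standing in for the paper's Java PathFinder machinery). The second call also illustrates strengthening: the extra conjunct i <= n is what would make an exit-time property provable, at the price of assuming the precondition n >= 0 at initiation.

        from z3 import And, Implies, Int, Not, Solver, sat, substitute

        i, s, n = Int('i'), Int('s'), Int('n')

        def inductive(inv):
            """Check a candidate invariant for:  i = s = 0; while i < n: s += i; i += 1"""
            stepped = substitute(inv, (s, s + i), (i, i + 1))    # invariant after one step
            checks = [("initiation", Implies(And(i == 0, s == 0, n >= 0), inv)),
                      ("consecution", Implies(And(inv, i < n), stepped))]
            for name, claim in checks:
                slv = Solver()
                slv.add(Not(claim))                  # search for a counterexample
                if slv.check() == sat:
                    return f"{name} fails: {slv.model()}"   # strengthen and retry
            return "inductive"

        print(inductive(2 * s == i * (i - 1)))                  # inductive
        print(inductive(And(2 * s == i * (i - 1), i <= n)))     # strengthened variant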

  4. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

    Considerations regarding the Space Transportation System (STS) payload environmental verification are reviewed. It is noted that emphasis is placed on testing at the subassembly level and that the basic objective of structural dynamic payload verification is to ensure reliability in a cost-effective manner. Structural analyses consist of: (1) stress analysis for critical loading conditions, (2) modal analysis for launch and orbital configurations, (3) flight loads analysis, (4) test simulation analysis to verify models, (5) kinematic analysis of deployment/retraction sequences, and (6) structural-thermal-optical program analysis. In addition to these approaches, payload verification programs are being developed in the thermal-vacuum area. These include exposure to extreme temperatures, temperature cycling, thermal-balance testing and thermal-vacuum testing.

  5. General Dynamics (GD) Launch Waveform On-Orbit Performance Report

    NASA Technical Reports Server (NTRS)

    Briones, Janette C.; Shalkhauser, Mary Jo

    2014-01-01

    The purpose of this report is to present the results from the GD SDR on-orbit performance testing using the launch waveform over TDRSS. The tests include the evaluation of well-tested waveform modes, the operation of RF links that are expected to have high margins, the verification of forward and return link operation (including full duplex), the verification of non-coherent operational modes, and the verification of the radio's at-launch operational frequencies. This report also outlines the launch waveform tests conducted and comparisons to the results obtained from ground testing.

  6. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied; task 3 is associated with the formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.
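
    The generic-interpreter idea, stripped to a sketch (this mirrors the spirit of the theory, not the report's formalization): the fetch-dispatch loop is fixed once, while the meaning of each instruction is supplied as a parameter.

        def interpret(program, state, semantics):
            """Generic interpreter: stepping is fixed; instruction meaning is a parameter."""
            pc = 0
            while pc < len(program):
                op, args = program[pc]
                state, pc = semantics[op](state, pc, *args)
            return state

        # one concrete instruction set, supplied as data rather than baked in
        semantics = {
            "LOAD": lambda st, pc, r, v: ({**st, r: v}, pc + 1),
            "ADD":  lambda st, pc, r, a, b: ({**st, r: st[a] + st[b]}, pc + 1),
            "JNZ":  lambda st, pc, r, tgt: (st, tgt if st[r] != 0 else pc + 1),
        }

        prog = [("LOAD", ("x", 2)), ("LOAD", ("y", 3)), ("ADD", ("z", "x", "y"))]
        print(interpret(prog, {}, semantics))    # {'x': 2, 'y': 3, 'z': 5}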

  7. Handling performance control for hybrid 8-wheel-drive vehicle and simulation verification

    NASA Astrophysics Data System (ADS)

    Ni, Jun; Hu, Jibin

    2016-08-01

    In order to improve the handling performance of a hybrid 8-wheel-drive vehicle, a handling performance control strategy was proposed. For an armoured vehicle, besides handling stability at high speed, the minimum steer radius at low speed is also a key tactical and technical index. Based on that, the proposed handling performance control strategy includes 'Handling Stability' and 'Radius Minimization' control modes. In the 'Handling Stability' control mode, a 'Neutral-steer Ratio' is defined to adjust the steering characteristics to satisfy different demands in different speed ranges. In the 'Radius Minimization' control mode, the independent motors are controlled to provide an additional yaw moment to decrease the minimum steer radius. In order to verify the strategy, a simulation platform was built including engine and continuously variable transmission systems, generator and battery systems, independent motor and controller systems, and vehicle dynamics and tyre mechanics systems. The simulation results show that the handling performance of the vehicle can be enhanced significantly, and the minimum steer radius can be decreased by 20%, a significant improvement over the common level of main battle armoured vehicles around the world.
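
    The 'Radius Minimization' mode amounts to commanding a left/right torque difference whose net effect is the demanded yaw moment. A back-of-envelope allocation with invented vehicle parameters (not the paper's):

        # Differential-torque yaw moment for an 8-wheel vehicle (illustrative numbers).
        TRACK, WHEEL_R, N_AXLES = 2.5, 0.5, 4      # track [m], wheel radius [m], axles

        def wheel_torques(yaw_moment):
            """Split a demanded yaw moment evenly over the four left/right wheel pairs."""
            # each axle contributes dF * TRACK / 2, with dF the left/right force difference
            dF = 2.0 * yaw_moment / (N_AXLES * TRACK)    # per-axle force difference [N]
            dT = dF * WHEEL_R                            # per-axle torque difference [N m]
            return [(-dT / 2.0, +dT / 2.0)] * N_AXLES    # (left, right) offsets per axle

        for left, right in wheel_torques(8000.0):        # demand 8 kN m of yaw moment
            print(f"left {left:+.0f} N m, right {right:+.0f} N m")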

  8. Memory for Verbally Presented Routes: A Comparison of Strategies Used by Blind and Sighted People.

    ERIC Educational Resources Information Center

    Easton, R. D.; Bentzen, B. L.

    1987-01-01

    Congenitally blind (N=16) and sighted (N=16) young adults listened to descriptions of routes and then finger-traced routes through a raised-line matrix. Route tracing speed and accuracy revealed that spatial sentence verification interfered with route memory more than abstract/verbal sentence verification for all subjects. (Author/CB)

  9. Content analysis of age verification, purchase and delivery methods of internet e-cigarette vendors, 2013 and 2014.

    PubMed

    Williams, Rebecca S; Derrick, Jason; Liebman, Aliza Kate; LaFleur, Kevin; Ribisl, Kurt M

    2018-05-01

    Identify the population of internet e-cigarette vendors (IEVs) and conduct content analyses of their age verification, purchase and delivery methods in 2013 and 2014. We used multiple sources to identify IEV websites, primarily complex search algorithms scanning more than 180 million websites. In 2013, we manually screened 32 446 websites, identifying 980 IEVs, selecting the 281 most popular for content analysis. This methodology yielded 31 239 websites for screening in 2014, identifying 3096 IEVs, with 283 selected for content analysis. The proportion of vendors that sold online-only, with no retail store, dropped significantly from 2013 (74.7%) to 2014 (64.3%) (p<0.01), with a corresponding significant decrease in US-based vendors (71.9% in 2013 and 65% in 2014). Most vendors did little to prevent youth access in either year, with 67.6% in 2013 and 63.2% in 2014 employing no age verification or relying exclusively on strategies that cannot effectively verify age. Effective age verification strategies such as online age verification services (7.1% in 2013 and 8.5% in 2014), driving licences (1.8% in 2013 and 7.4% in 2014, p<0.01) or age verification at delivery (6.4% in 2013 and 8.1% in 2014) were rarely advertised on IEV websites. Nearly all vendors advertised accepting credit cards, and about three-quarters advertised shipping via the United States Postal Service, similar to the internet cigarette industry prior to federal bans. The number of IEVs grew sharply from 2013 to 2014, with poor age verification practices. New and expanded regulations for online e-cigarette sales are needed, including strict age and identity verification requirements.

  10. Approaching the investigation of plasma turbulence through a rigorous verification and validation procedure: A practical example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.

    In the present work, a Verification and Validation procedure is presented and applied showing, through a practical example, how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular, the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology assesses that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition, due to the competition between the resistive and ideal interchange instabilities.
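
    The method of manufactured solutions can be demonstrated on a far simpler PDE than a plasma turbulence model; the pattern is the same: choose an exact solution, derive the source term that forces it, and confirm the observed order of accuracy under grid refinement. A sketch for 1-D diffusion:

        import numpy as np

        # Manufactured solution for u_t = nu * u_xx + S:  u_m = sin(pi x) exp(-t),
        # hence S = u_m_t - nu * u_m_xx = (nu * pi**2 - 1) * u_m.
        nu = 0.5
        u_m = lambda x, t: np.sin(np.pi * x) * np.exp(-t)
        S = lambda x, t: (nu * np.pi**2 - 1.0) * u_m(x, t)

        def error(nx, t_end=0.1):
            x = np.linspace(0.0, 1.0, nx)
            dx = x[1] - x[0]
            dt = 0.2 * dx**2 / nu                 # explicit-stability bound
            u, t = u_m(x, 0.0), 0.0
            while t < t_end:
                step = min(dt, t_end - t)
                u[1:-1] += step * (nu * (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
                                   + S(x[1:-1], t))
                t += step
                u[0], u[-1] = u_m(0.0, t), u_m(1.0, t)   # exact boundary values
            return np.abs(u - u_m(x, t_end)).max()

        e_coarse, e_fine = error(41), error(81)
        print("observed order ~", np.log2(e_coarse / e_fine))   # ~2 for this scheme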

  11. Physico-Chemical Dynamics of Nanoparticle Formation during Laser Decontamination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, M.D.

    2005-06-01

    Laser-ablation based decontamination is a new and effective approach for simultaneous removal and characterization of contaminants from surfaces (e.g., building interior and exterior walls, ground floors, etc.). The scientific objectives of this research are to: (1) characterize particulate matter generated during the laser-ablation based decontamination, (2) develop a technique for simultaneous cleaning and spectroscopic verification, and (3) develop an empirical model for predicting particle generation for the size range from 10 nm to tens of micrometers. This research project provides fundamental data obtained through a systematic study on the particle generation mechanism, and also provides a working model for prediction of particle generation such that an effective operational strategy can be devised to facilitate worker protection.

  12. Dynamic CFD Simulations of the Supersonic Inflatable Aerodynamic Decelerator (SIAD) Ballistic Range Tests

    NASA Technical Reports Server (NTRS)

    Brock, Joseph M; Stern, Eric

    2016-01-01

    Dynamic CFD simulations of the SIAD ballistic test model were performed using the US3D flow solver. The motivation for performing these simulations is the validation and verification of the US3D flow solver as a viable computational tool for predicting dynamic coefficients.

  13. Nonlinear 3D MHD verification study: SpeCyl and PIXIE3D codes for RFP and Tokamak plasmas

    NASA Astrophysics Data System (ADS)

    Bonfiglio, D.; Cappello, S.; Chacon, L.

    2010-11-01

    A strong emphasis is presently placed in the fusion community on reaching predictive capability of computational models. An essential requirement of such an endeavor is the process of assessing the mathematical correctness of computational tools, termed verification [1]. We present here a successful nonlinear cross-benchmark verification study between the 3D nonlinear MHD codes SpeCyl [2] and PIXIE3D [3]. Excellent quantitative agreement is obtained in both 2D and 3D nonlinear visco-resistive dynamics for reversed-field pinch (RFP) and tokamak configurations [4]. RFP dynamics, in particular, lends itself as an ideal nontrivial test-bed for 3D nonlinear verification. Perspectives for future application of the fully implicit parallel code PIXIE3D to RFP physics, in particular to address open issues on RFP helical self-organization, will be provided. [1] M. Greenwald, Phys. Plasmas 17, 058101 (2010) [2] S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996) [3] L. Chacón, Phys. Plasmas 15, 056103 (2008) [4] D. Bonfiglio, L. Chacón and S. Cappello, Phys. Plasmas 17 (2010)

  14. Development and verification of an agent-based model of opinion leadership.

    PubMed

    Anderson, Christine A; Titler, Marita G

    2014-09-27

    The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time series experiment. Statistical analysis of model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model was performing in a manner consistent with the posited relationships in the underlying model. Nurse opinion leaders act on the strength of their beliefs and, as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists. The development and testing of agent-based models is an iterative process. The opinion leader model presented here provides a basic structure for continued model development, ongoing verification, and the establishment of validation procedures, including empirical data collection.
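
    The flavor of such a model is easy to sketch (this toy is not the authors' model; the attributes and update rule are invented): uncertain agents consult the most credible peer they happen to sample, and the agents consulted most often emerge as opinion leaders.

        import random

        random.seed(1)

        class Nurse:
            def __init__(self, ident):
                self.ident = ident
                self.credibility = random.random()   # perceived credibility, 0..1
                self.uncertainty = random.random()   # probability of seeking an opinion
                self.consulted = 0                   # times chosen as an opinion resource

        unit = [Nurse(i) for i in range(30)]         # a fictitious patient-care unit

        for _ in range(1000):
            seeker = random.choice(unit)
            if random.random() < seeker.uncertainty:
                peers = [p for p in unit if p is not seeker]
                # the seeker weights sampled peers by perceived credibility
                resource = max(random.sample(peers, 5), key=lambda p: p.credibility)
                resource.consulted += 1

        for p in sorted(unit, key=lambda p: p.consulted, reverse=True)[:3]:
            print(f"nurse {p.ident}: credibility {p.credibility:.2f}, consulted {p.consulted}x")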

  15. Application of Dynamic Analysis in Semi-Analytical Finite Element Method.

    PubMed

    Liu, Pengfei; Xing, Qinyan; Wang, Dawei; Oeser, Markus

    2017-08-30

    Analyses of dynamic responses are significantly important for the design, maintenance and rehabilitation of asphalt pavement. In order to evaluate the dynamic responses of asphalt pavement under moving loads, a specific computational program, SAFEM, was developed based on a semi-analytical finite element method. This method is three-dimensional and only requires a two-dimensional FE discretization by incorporating Fourier series in the third dimension. In this paper, the algorithm to apply the dynamic analysis to SAFEM was introduced in detail. Asphalt pavement models under moving loads were built in the SAFEM and commercial finite element software ABAQUS to verify the accuracy and efficiency of the SAFEM. The verification shows that the computational accuracy of SAFEM is high enough and its computational time is much shorter than ABAQUS. Moreover, experimental verification was carried out and the prediction derived from SAFEM is consistent with the measurement. Therefore, the SAFEM is feasible to reliably predict the dynamic response of asphalt pavement under moving loads, thus proving beneficial to road administration in assessing the pavement's state.
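
    The key idea, expanding the third dimension in a Fourier series so each harmonic reduces to an independent two-dimensional problem, can be caricatured with a scalar stand-in for the finite element system (the per-harmonic 'stiffness' below is hypothetical, not SAFEM's):

        import numpy as np

        L, nz, nterms = 10.0, 256, 20
        z = np.linspace(0.0, L, nz)
        load = np.exp(-((z - 0.5 * L) / 0.3) ** 2)     # localized wheel load along z

        k = np.arange(1, nterms + 1)                   # harmonic numbers
        basis = np.sin(np.pi * k[:, None] * z / L)     # sine series in the third dimension
        coeffs = (2.0 / L) * (basis @ load) * (z[1] - z[0])   # series coefficients

        # each harmonic is an independent (here scalar) 2-D solve; superpose the results
        stiffness = 1.0 + (np.pi * k / L) ** 2         # hypothetical per-harmonic stiffness
        response = (coeffs / stiffness) @ basis

        print(f"peak response {response.max():.4f} at z = {z[response.argmax()]:.2f} m")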

  16. 4D ML reconstruction as a tool for volumetric PET-based treatment verification in ion beam radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Bernardi, E., E-mail: elisabetta.debernardi@unimib.it; Ricotti, R.; Riboldi, M.

    2016-02-15

    Purpose: An innovative strategy to improve the sensitivity of positron emission tomography (PET)-based treatment verification in ion beam radiotherapy is proposed. Methods: Low counting statistics PET images acquired during or shortly after the treatment (Measured PET) and a Monte Carlo estimate of the same PET images derived from the treatment plan (Expected PET) are considered as two frames of a 4D dataset. A 4D maximum likelihood reconstruction strategy was adapted to iteratively estimate the annihilation events distribution in a reference frame and the deformation motion fields that map it in the Expected PET and Measured PET frames. The outputs generated by the proposed strategy are as follows: (1) an estimate of the Measured PET with an image quality comparable to the Expected PET and (2) an estimate of the motion field mapping Expected PET to Measured PET. The details of the algorithm are presented and the strategy is preliminarily tested on analytically simulated datasets. Results: The algorithm demonstrates (1) robustness against noise, even in the worst conditions where 1.5 × 10^4 true coincidences and a random fraction of 73% are simulated; (2) a proper sensitivity to different kinds and grades of mismatches ranging between 1 and 10 mm; (3) robustness against bias due to incorrect washout modeling in the Monte Carlo simulation up to 1/3 of the original signal amplitude; and (4) an ability to describe the mismatch even in the presence of complex annihilation distributions such as those induced by two perpendicular superimposed ion fields. Conclusions: The promising results obtained in this work suggest the applicability of the method as a quantification tool for PET-based treatment verification in ion beam radiotherapy. An extensive assessment of the proposed strategy on real treatment verification data is planned.

  17. Static and Dynamic Verification of Critical Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error-free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. Traditionally, however, these have been applied in isolation. One of the main reasons is the immaturity of this field as regards its application to the growing number of software products within space systems. This paper presents an innovative way of combining both static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA, and the Xception tool for fault injection. Keywords: Verification & Validation, RAMS, Onboard software, SFMEA, SFTA, Fault-injection. 1 This work is being performed under the project STADY Applied Static And Dynamic Verification Of Critical Software, ESA/ESTEC Contract Nr. 15751/02/NL/LvH.

  18. Identifying Rhodamine Dye Plume Sources in Near-Shore Oceanic Environments by Integration of Chemical and Visual Sensors

    PubMed Central

    Tian, Yu; Kang, Xiaodong; Li, Yunyi; Li, Wei; Zhang, Aiqun; Yu, Jiangchen; Li, Yiping

    2013-01-01

    This article presents a strategy for identifying the source location of a chemical plume in near-shore oceanic environments where the plume develops under the influence of turbulence, tides and waves. This strategy includes two modules, source declaration (or identification) and source verification, embedded in a subsumption architecture. Algorithms for source identification are derived from moth-inspired plume tracing strategies based on a chemical sensor. The in-water test missions, conducted in November 2002 at San Clemente Island (California, USA), in June 2003 at Duck (North Carolina, USA), and in October 2010 at Dalian Bay (China), successfully identified the source locations after autonomous underwater vehicles tracked the rhodamine dye plumes with a significant meander over 100 meters. The objective of the verification module is to verify the declared plume source using a visual sensor. Because images taken in near-shore oceanic environments are very vague and colors in the images are not well-defined, we adopt a fuzzy color extractor to segment the color components and recognize the chemical plume and its source by measuring color similarity. The source verification module is tested with images taken during the CPT missions. PMID:23507823

  19. Piezoelectric sensor pen for dynamic signature verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    EerNisse, E.P.; Land, C.E.; Snelling, J.B.

    The concept of using handwriting dynamics for electronic identification is discussed. A piezoelectric sensor pen for obtaining the pen point dynamics during writing is described. Design equations are derived and details of an operating device are presented. Typical output waveforms are shown to demonstrate the operation of the pen and to show the dissimilarities between dynamics of a genuine signature and an attempted forgery.

  20. Dynamic (Vibration) Testing: Design-Certification of Aerospace System

    NASA Technical Reports Server (NTRS)

    Aggarwal, Pravin K.

    2010-01-01

    Various types of dynamic testing of structures for certification purposes are described, including vibration, shock and acoustic testing. Modal testing is discussed as it frequently complements dynamic testing and is part of the structural verification/validation process leading up to design certification. Examples of dynamic and modal testing are presented as well as the common practices, procedures and standards employed.

  1. Verification Challenges of Dynamic Testing of Space Flight Hardware

    NASA Technical Reports Server (NTRS)

    Winnitoy, Susan

    2010-01-01

    The Six Degree-of-Freedom Dynamic Test System (SDTS) is a test facility at the National Aeronautics and Space Administration (NASA) Johnson Space Center in Houston, Texas for performing dynamic verification of space structures and hardware. Some examples of past and current tests include the verification of on-orbit robotic inspection systems, space vehicle assembly procedures and docking/berthing systems. The facility is able to integrate a dynamic simulation of on-orbit spacecraft mating or demating using flight-like mechanical interface hardware. A force moment sensor is utilized for input to the simulation during the contact phase, thus simulating the contact dynamics. While the verification of flight hardware presents many unique challenges, one particular area of interest is with respect to the use of external measurement systems to ensure accurate feedback of dynamic contact. There are many commercial off-the-shelf (COTS) measurement systems available on the market, and the test facility measurement systems have evolved over time to include two separate COTS systems. The first system incorporates infra-red sensing cameras, while the second system employs a laser interferometer to determine position and orientation data. The specific technical challenges with the measurement systems in a large dynamic environment include changing thermal and humidity levels, operational area and measurement volume, dynamic tracking, and data synchronization. The facility is located in an expansive high-bay area that is occasionally exposed to outside temperature when large retractable doors at each end of the building are opened. The laser interferometer system, in particular, is vulnerable to the environmental changes in the building. The operational area of the test facility itself is sizeable, ranging from seven meters wide and five meters deep to as much as seven meters high. Both facility measurement systems have desirable measurement volumes and the accuracies vary within the respective volumes. In addition, because this is a dynamic facility with a moving test bed, direct line-of-sight may not be available at all times between the measurement sensors and the tracking targets. Finally, the feedback data from the active test bed along with the two external measurement systems must be synchronized to allow for data correlation. To ensure the desired accuracy and resolution of these systems, calibration of the systems must be performed regularly. New innovations in sensor technology itself are periodically incorporated into the facility's overall measurement scheme. In addressing the challenges of the measurement systems, the facility is able to provide essential position and orientation data to verify the dynamic performance of space flight hardware.

  2. Evaluation of verification and testing tools for FORTRAN programs

    NASA Technical Reports Server (NTRS)

    Smith, K. A.

    1980-01-01

    Two automated software verification and testing systems were developed for use in the analysis of computer programs. An evaluation of the static analyzer DAVE and the dynamic analyzer PET, which are used in the analysis of FORTRAN programs on Control Data (CDC) computers, is described. Both systems were found to be effective and complementary, and are recommended for use in testing FORTRAN programs.

  3. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    NASA Astrophysics Data System (ADS)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has improved, a smart card employing 32-bit processors has been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies for implementing fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions in each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.
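
    The trade-off can be illustrated with a back-of-envelope timing model (all processor speeds, instruction counts, and crypto costs below are invented, not the paper's measurements): keeping the matcher on the card avoids exposing the stored template, but every module run on the slow card processor lengthens the response time.

        # Hypothetical workload split between smart card and card reader.
        CARD_MIPS, READER_MIPS = 5.0, 500.0    # assumed processor throughputs
        STEPS = {                              # assumed instruction counts [millions]
            "preprocess": 300.0,
            "extract_minutiae": 150.0,
            "match": 5.0,
        }
        CRYPTO_MS = 40.0                       # assumed cost per encrypted transfer

        def response_ms(steps_on_card, secure_transfers):
            compute = sum(mi / (CARD_MIPS if s in steps_on_card else READER_MIPS) * 1e3
                          for s, mi in STEPS.items())
            return compute + secure_transfers * CRYPTO_MS

        print("match-on-card:", response_ms({"match"}, secure_transfers=1), "ms")
        print("all-on-reader:", response_ms(set(), secure_transfers=2), "ms")
        print("all-on-card:  ", response_ms(set(STEPS), secure_transfers=0), "ms")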

  4. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-4-bit evaluation, and plots of tests where differences occur.

  5. A system for automatic evaluation of simulation software

    NASA Technical Reports Server (NTRS)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

    Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From the software requirement methodology, each component of the verification system has some element of simulation to it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential of being effectively used in a simulation environment.

  6. Verification of Gyrokinetic codes: Theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia; Bottino, Alberto; Görler, Tobias; Sonnendrücker, Eric; Told, Daniel; Villard, Laurent

    2017-05-01

    In fusion plasmas, the strong magnetic field allows the fast gyro-motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the subsequent transport. Naturally, these codes require thorough verification and validation. Here, we present a new and generic theoretical framework and specific numerical applications to test the faithfulness of the implemented models to theory and to verify the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which has rarely been done and therefore makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as variational formulation of dynamics for systematization of basic GK code's equations to access the limits of their applicability. The verification of the numerical scheme is proposed via the benchmark effort. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations implemented in the ORB5 and GENE codes using the Lagrangian variational formulation. At the computational level, detailed verifications of global electromagnetic test cases developed from the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  7. SSME lifetime prediction and verification, integrating environments, structures, materials: The challenge

    NASA Technical Reports Server (NTRS)

    Ryan, R. S.; Salter, L. D.; Young, G. M., III; Munafo, P. M.

    1985-01-01

    The planned missions for the space shuttle dictated a unique and technology-extending rocket engine. The high specific impulse requirements in conjunction with a 55-mission lifetime, plus volume and weight constraints, produced unique structural design, manufacturing, and verification requirements. Operations from Earth to orbit produce severe dynamic environments, which couple with the extreme pressure and thermal environments associated with the high performance, creating large low-cycle loads and high alternating stresses above the endurance limit, which result in high sensitivity to alternating stresses. Combining all of these effects resulted in the requirements for exotic materials, which are more susceptible to manufacturing problems, and the use of an all-welded structure. The challenge of integrating environments, dynamics, structures, and materials into a verified SSME structure is discussed. The verification program and developmental flight results are included. The first six shuttle flights had engine performance as predicted, with no failures. The engine system has met the basic design challenges.

  8. Constrained structural dynamic model verification using free vehicle suspension testing methods

    NASA Technical Reports Server (NTRS)

    Blair, Mark A.; Vadlamudi, Nagarjuna

    1988-01-01

    Verification of the validity of a spacecraft's structural dynamic math model used in computing ascent (or, in the case of the STS, ascent and landing) loads is mandatory. This verification process requires that tests be carried out on both the payload and the math model such that the ensuing correlation may validate the flight loads calculations. To properly achieve this goal, the tests should be performed with the payload in the launch constraint (i.e., held fixed at only the payload-booster interface DOFs). The practical achievement of this set of boundary conditions is quite difficult, especially with larger payloads, such as the 12-ton Hubble Space Telescope. The equations developed in the paper show that by exciting the payload at its booster interface while it is suspended in the 'free-free' state, a set of transfer functions can be produced whose minima are directly related to the fundamental modes of the payload when it is constrained in its launch configuration.
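
    The underlying result can be demonstrated on the smallest possible example: for a free-free two-mass chain driven at the interface mass, the antiresonance (minimum) of the drive-point transfer function falls at the natural frequency of the payload mass with the interface held fixed. A sketch with arbitrary parameters:

        import numpy as np

        # free-free chain: m1 (booster interface) -- k -- m2 (payload); arbitrary values
        m1, m2, k = 4.0, 1.0, 1000.0
        M = np.diag([m1, m2])
        K = np.array([[k, -k], [-k, k]])

        w = np.linspace(1.0, 80.0, 20000)
        H11 = np.array([np.linalg.inv(K - wi**2 * M)[0, 0] for wi in w])   # drive-point FRF

        antires = w[np.argmin(np.abs(H11))]   # minimum of the free-free transfer function
        fixed_if = np.sqrt(k / m2)            # payload mode with the interface held fixed
        print(f"antiresonance {antires:.2f} rad/s vs constrained mode {fixed_if:.2f} rad/s")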

  9. Collapse of Experimental Colloidal Aging using Record Dynamics

    NASA Astrophysics Data System (ADS)

    Robe, Dominic; Boettcher, Stefan; Sibani, Paolo; Yunker, Peter

    The theoretical framework of record dynamics (RD) posits that aging behavior in jammed systems is controlled by short, rare events involving activation of only a few degrees of freedom. RD predicts dynamics in an aging system to progress with the logarithm of t/tw. This prediction has been verified through new analysis of experimental data on an aging 2D colloidal system. MSD and persistence curves spanning three orders of magnitude in waiting time are collapsed. These predictions have also been found consistent with a number of experiments and simulations, but verification of the specific assumptions that RD makes about the underlying statistics of these rare events has been elusive. Here the observation of individual particles allows for the first time the direct verification of the assumptions about event rates and sizes. This work is supported by NSF Grant DMR-1207431.

  10. A Verification System for Distributed Objects with Asynchronous Method Calls

    NASA Astrophysics Data System (ADS)

    Ahrendt, Wolfgang; Dylla, Maximilian

    We present a verification system for Creol, an object-oriented modeling language for concurrent distributed applications. The system is an instance of KeY, a framework for object-oriented software verification, which has so far been applied foremost to sequential Java. Building on KeY characteristic concepts, like dynamic logic, sequent calculus, explicit substitutions, and the taclet rule language, the system presented in this paper addresses functional correctness of Creol models featuring local cooperative thread parallelism and global communication via asynchronous method calls. The calculus heavily operates on communication histories which describe the interfaces of Creol units. Two example scenarios demonstrate the usage of the system.

  11. Verification and Validation of Multisegmented Mooring Capabilities in FAST v8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, Morten T.; Wendt, Fabian F.; Robertson, Amy N.

    2016-07-01

    The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.

  12. Recent literature on structural modeling, identification, and analysis

    NASA Technical Reports Server (NTRS)

    Craig, Roy R., Jr.

    1990-01-01

    The literature on the mathematical modeling of large space structures is first reviewed, with attention given to continuum models, model order reduction, substructuring, and computational techniques. System identification and mode verification are then discussed with reference to the verification of mathematical models of large space structures. In connection with analysis, the paper surveys recent research on eigensolvers and dynamic response solvers for large-order finite-element-based models.

  13. WRAP-RIB antenna technology development

    NASA Technical Reports Server (NTRS)

    Freeland, R. E.; Garcia, N. F.; Iwamoto, H.

    1985-01-01

    The wrap-rib deployable antenna concept development is based on a combination of hardware development and testing along with extensive supporting analysis. The proof-of-concept hardware models are large in size so that they address the same basic problems associated with the design, fabrication, assembly, and test of the full-scale systems, which were selected to be 100 meters at the beginning of the program. The hardware evaluation program consists of functional performance tests, design verification tests and analytical model verification tests. Functional testing consists of kinematic deployment, mesh management and verification of mechanical packaging efficiencies. Design verification consists of rib contour precision measurement, rib cross-section variation evaluation, rib materials characterizations and manufacturing imperfections assessment. Analytical model verification and refinement include mesh stiffness measurement, rib static and dynamic testing, mass measurement, and rib cross-section characterization. This concept was considered for a number of potential applications that include mobile communications, VLBI, and aircraft surveillance. In fact, baseline system configurations were developed by JPL, using the appropriate wrap-rib antenna, for all three classes of applications.

  14. RELAP-7 Software Verification and Validation Plan - Requirements Traceability Matrix (RTM) Part 2: Code Assessment Strategy, Procedure, and RTM Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Jun Soo; Choi, Yong Joon; Smith, Curtis Lee

    2016-09-01

    This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. The Requirements Traceability Matrices (RTMs) proposed in a previous document (INL-EXT-15-36684) are then updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.

  15. Application of Dynamic Analysis in Semi-Analytical Finite Element Method

    PubMed Central

    Oeser, Markus

    2017-01-01

    Analyses of dynamic responses are significantly important for the design, maintenance and rehabilitation of asphalt pavement. In order to evaluate the dynamic responses of asphalt pavement under moving loads, a specific computational program, SAFEM, was developed based on a semi-analytical finite element method. This method is three-dimensional and only requires a two-dimensional FE discretization by incorporating Fourier series in the third dimension. In this paper, the algorithm for applying dynamic analysis in SAFEM is introduced in detail. Asphalt pavement models under moving loads were built in SAFEM and in the commercial finite element software ABAQUS to verify the accuracy and efficiency of SAFEM. The verification shows that the computational accuracy of SAFEM is sufficiently high and that its computational time is much shorter than that of ABAQUS. Moreover, experimental verification was carried out, and the prediction derived from SAFEM is consistent with the measurement. The SAFEM is therefore feasible for reliably predicting the dynamic response of asphalt pavement under moving loads, thus proving beneficial to road administrations in assessing the pavement’s state. PMID:28867813
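
    The semi-analytical reduction at the heart of such a method can be sketched in one line; the notation below (harmonic count N, domain length L, per-harmonic stiffness K_n) is ours for illustration, not taken from the paper:

        \[ u(x, y, z) \;=\; \sum_{n=1}^{N} \bar{u}_n(x, y)\,\sin\frac{n\pi z}{L}, \qquad K_n\,\bar{u}_n = \bar{f}_n \quad (n = 1, \dots, N) \]

    Expanding the response in a Fourier series along the third dimension decouples the 3-D problem into N independent 2-D finite element systems, which is what keeps the computational time far below that of a full 3-D discretization.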

  16. Advancing Measurement Science to Assess Monitoring, Diagnostics, and Prognostics for Manufacturing Robotics

    PubMed Central

    Qiao, Guixiu; Weiss, Brian A.

    2016-01-01

    Unexpected equipment downtime is a ‘pain point’ for manufacturers, especially in that this event usually translates to financial losses. To minimize this pain point, manufacturers are developing new health monitoring, diagnostic, prognostic, and maintenance (collectively known as prognostics and health management (PHM)) techniques to advance the state of the art in their maintenance strategies. The manufacturing community has a wide range of needs with respect to the advancement and integration of PHM technologies to enhance manufacturing robotic system capabilities. Numerous researchers, including personnel from the National Institute of Standards and Technology (NIST), have identified a broad landscape of barriers and challenges to advancing PHM technologies. One such challenge is the verification and validation of PHM technology through the development of performance metrics, test methods, reference datasets, and supporting tools. Besides documenting and presenting the research landscape, NIST personnel are actively researching PHM for robotics to promote the development of innovative sensing technology and prognostic decision algorithms and to produce a positional accuracy test method that emphasizes the identification of static and dynamic positional accuracy. The test method development will provide manufacturers with a methodology that will allow them to quickly assess the positional health of their robot systems along with supporting the verification and validation of PHM techniques for the robot system. PMID:28058172

  17. Advancing Measurement Science to Assess Monitoring, Diagnostics, and Prognostics for Manufacturing Robotics.

    PubMed

    Qiao, Guixiu; Weiss, Brian A

    2016-01-01

    Unexpected equipment downtime is a 'pain point' for manufacturers, especially in that this event usually translates to financial losses. To minimize this pain point, manufacturers are developing new health monitoring, diagnostic, prognostic, and maintenance (collectively known as prognostics and health management (PHM)) techniques to advance the state of the art in their maintenance strategies. The manufacturing community has a wide range of needs with respect to the advancement and integration of PHM technologies to enhance manufacturing robotic system capabilities. Numerous researchers, including personnel from the National Institute of Standards and Technology (NIST), have identified a broad landscape of barriers and challenges to advancing PHM technologies. One such challenge is the verification and validation of PHM technology through the development of performance metrics, test methods, reference datasets, and supporting tools. Besides documenting and presenting the research landscape, NIST personnel are actively researching PHM for robotics to promote the development of innovative sensing technology and prognostic decision algorithms and to produce a positional accuracy test method that emphasizes the identification of static and dynamic positional accuracy. The test method development will provide manufacturers with a methodology that will allow them to quickly assess the positional health of their robot systems along with supporting the verification and validation of PHM techniques for the robot system.

  18. P-8A Poseidon strategy for modeling & simulation verification validation & accreditation (VV&A)

    NASA Astrophysics Data System (ADS)

    Kropp, Derek L.

    2009-05-01

    One of the first challenges in addressing the need for Modeling & Simulation (M&S) Verification, Validation, & Accreditation (VV&A) is to develop an approach for applying structured and formalized VV&A processes. The P-8A Poseidon Multi-Mission Maritime Aircraft (MMA) Program Modeling and Simulation Accreditation Strategy documents the P-8A program's approach to VV&A. The P-8A strategy tailors a risk-based approach and leverages existing bodies of knowledge, such as the Defense Modeling and Simulation Office Recommended Practice Guide (DMSO RPG), to make the process practical and efficient. As the program progresses, the M&S team must continue to look for ways to streamline the process, add supplemental steps to enhance the process, and identify and overcome procedural, organizational, and cultural challenges. This paper includes some of the basics of the overall strategy, examples of specific approaches that have worked well, and examples of challenges that the M&S team has faced.

  19. Scalable Adaptive Architectures for Maritime Operations Center Command and Control

    DTIC Science & Technology

    2011-05-06

    … the project to investigate the possibility of using earlier work on the validation and verification of rule bases in addressing the dynamically … support the organization. To address the dynamically changing rules of engagement of a maritime force as it crosses different geographical areas, GMU … dynamic analysis, makes use of an Occurrence Graph that corresponds to the dynamics (or execution) of the Petri Net, to capture properties …

  20. A Verification-Driven Approach to Control Analysis and Tuning

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2008-01-01

    This paper proposes a methodology for the analysis and tuning of controllers using control verification metrics. These metrics, which are introduced in a companion paper, measure the size of the largest uncertainty set of a given class for which the closed-loop specifications are satisfied. This framework integrates deterministic and probabilistic uncertainty models into a setting that enables the deformation of sets in the parameter space, the control design space, and in the union of these two spaces. In regard to control analysis, we propose strategies that enable bounding regions of the design space where the specifications are satisfied by all the closed-loop systems associated with a prescribed uncertainty set. When this is infeasible, we bound regions where the probability of satisfying the requirements exceeds a prescribed value. In regard to control tuning, we propose strategies for the improvement of the robust characteristics of a baseline controller. Some of these strategies use multi-point approximations to the control verification metrics in order to alleviate the numerical burden of solving a min-max problem. Since this methodology targets non-linear systems having an arbitrary, possibly implicit, functional dependency on the uncertain parameters and for which high-fidelity simulations are available, they are applicable to realistic engineering problems.
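
    The outer/inner structure of such a metric computation can be illustrated with a toy sketch: bisection on the size of the uncertainty set, with a Monte Carlo stand-in for the worst-case inner (min-max) problem. The requirement function and every name below are assumptions for illustration, not the authors' metrics:

        import numpy as np

        def specs_satisfied(p):
            # Stand-in closed-loop requirement for parameter vector p
            # (a hypothetical stability-margin condition).
            return p[0]**2 + 0.5 * p[1]**2 < 1.0

        def worst_case_ok(radius, n_samples=2000, seed=0):
            # Monte Carlo surrogate for the inner maximization: sample the
            # boundary of the uncertainty ball and require every sample to
            # satisfy the specifications.
            rng = np.random.default_rng(seed)
            d = rng.normal(size=(n_samples, 2))
            d *= radius / np.linalg.norm(d, axis=1, keepdims=True)
            return all(specs_satisfied(p) for p in d)

        def verification_metric(r_hi=10.0, tol=1e-3):
            # Bisect on the size of the largest uncertainty set for which
            # the closed-loop specifications hold everywhere.
            r_lo = 0.0
            while r_hi - r_lo > tol:
                mid = 0.5 * (r_lo + r_hi)
                r_lo, r_hi = (mid, r_hi) if worst_case_ok(mid) else (r_lo, mid)
            return r_lo

        print(verification_metric())  # approaches 1.0 for this toy requirement

    In the paper's setting the inner check would invoke high-fidelity closed-loop simulations rather than an analytic inequality; the bisection shell stays the same.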

  1. Simulation verification techniques study. Task report 4: Simulation module performance parameters and performance standards

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Shuttle simulation software modules in the environment, crew station, vehicle configuration and vehicle dynamics categories are discussed. For each software module covered, a description of the module functions and operational modes, its interfaces with other modules, its stored data, inputs, performance parameters and critical performance parameters is given. Reference data sources which provide standards of performance are identified for each module. Performance verification methods are also discussed briefly.

  2. Verification of Gyrokinetic codes: theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia

    2016-10-01

    In fusion plasmas the strong magnetic field allows the fast gyro motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the consequent transport. We present a new and generic theoretical framework and specific numerical applications to test the validity and the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools, such as the variational formulation of dynamics, to systematize the basic equations of GK codes and assess the limits of their applicability. Indirect verification of the numerical scheme is proposed via a benchmark process. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC), and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations using the generic variational formulation. Then, we derive and place the models implemented in ORB5 and GENE inside this hierarchy. At the computational level, detailed verification of global electromagnetic test cases based on the CYCLONE benchmark is considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  3. Personal Identification by Keystroke Dynamics in Japanese Free Text Typing

    NASA Astrophysics Data System (ADS)

    Samura, Toshiharu; Nishimura, Haruhiko

    Biometrics is classified into verification and identification. Much research on keystroke dynamics has treated the verification of a fixed short password, as used for user login. In this research, we focus on identification and investigate several characteristics of keystroke dynamics in Japanese free text typing. We developed Web-based typing software in order to collect keystroke data over a local area network and performed experiments on a total of 112 subjects, from which three groups by typing level (beginner and above, normal and above, and middle and above) were constructed. Based on identification methods using the weighted Euclidean distance and a neural network over the feature indexes extracted from Japanese texts, we evaluated identification performance for the three groups. As a result, high accuracy of personal identification was confirmed for both methods, in proportion to the typing level of the group.
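
    The weighted-Euclidean identification step admits a compact illustration. A minimal sketch with a hypothetical feature layout and weights; the paper's actual feature indexes for Japanese text are not reproduced here:

        import numpy as np

        def identify(sample, profiles, weights):
            # profiles: {user_id: mean feature vector from enrollment};
            # weights: per-feature weights (e.g., inverse variances).
            # Returns the enrolled user whose profile is nearest to the
            # sample under the weighted Euclidean distance.
            def dist(mu):
                return np.sqrt(np.sum(weights * (sample - mu) ** 2))
            return min(profiles, key=lambda uid: dist(profiles[uid]))

        # Hypothetical usage: 3 enrolled typists, 4 keystroke-timing features.
        rng = np.random.default_rng(1)
        profiles = {f"user{i}": rng.uniform(80, 200, size=4) for i in range(3)}
        weights = 1.0 / rng.uniform(5, 20, size=4) ** 2
        sample = profiles["user1"] + rng.normal(0, 3, size=4)
        print(identify(sample, profiles, weights))  # most likely "user1"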

  4. Verification and validation of a rapid heat transfer calculation methodology for transient melt pool solidification conditions in powder bed metal additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plotkowski, A.; Kirka, M. M.; Babu, S. S.

    A fundamental understanding of spatial and temporal thermal distributions is crucial for predicting solidification and solid-state microstructural development in parts made by additive manufacturing. While sophisticated numerical techniques based on finite element or finite volume methods are useful for gaining insight into these phenomena at the length scale of the melt pool (100 - 500 µm), they are ill-suited for predicting engineering trends over full part cross-sections (> 10 x 10 cm) or many layers over long process times (> many days) due to the necessity of fully resolving the heat source characteristics. On the other hand, it is extremely difficult to resolve the highly dynamic nature of the process using purely in-situ characterization techniques. This article proposes a pragmatic alternative based on a semi-analytical approach to predicting the transient heat conduction during the powder bed metal additive manufacturing process. The model calculations were theoretically verified for selective laser melting of AlSi10Mg and electron beam melting of IN718 powders for simple cross-sectional geometries, and the transient results are compared to steady-state predictions from the Rosenthal equation. It is shown that the transient effects of the scan strategy create significant variations in the melt pool geometry and solid-liquid interface velocity, especially as the thermal diffusivity of the material decreases and the pre-heat of the process increases. With positive verification of the strategy, the model was then experimentally validated by simulating two point-melt scan strategies during electron beam melting of IN718, one intended to produce a columnar and one an equiaxed grain structure. Lastly, through comparison of the solidification conditions (i.e., transient and spatial variations of thermal gradient and liquid-solid interface velocity) predicted by the model to phenomenological CET theory, the model accurately predicted the experimental grain structures.

  5. Verification and validation of a rapid heat transfer calculation methodology for transient melt pool solidification conditions in powder bed metal additive manufacturing

    DOE PAGES

    Plotkowski, A.; Kirka, M. M.; Babu, S. S.

    2017-10-16

    A fundamental understanding of spatial and temporal thermal distributions is crucial for predicting solidification and solid-state microstructural development in parts made by additive manufacturing. While sophisticated numerical techniques based on finite element or finite volume methods are useful for gaining insight into these phenomena at the length scale of the melt pool (100 - 500 µm), they are ill-suited for predicting engineering trends over full part cross-sections (> 10 x 10 cm) or many layers over long process times (> many days) due to the necessity of fully resolving the heat source characteristics. On the other hand, it is extremely difficult to resolve the highly dynamic nature of the process using purely in-situ characterization techniques. This article proposes a pragmatic alternative based on a semi-analytical approach to predicting the transient heat conduction during the powder bed metal additive manufacturing process. The model calculations were theoretically verified for selective laser melting of AlSi10Mg and electron beam melting of IN718 powders for simple cross-sectional geometries, and the transient results are compared to steady-state predictions from the Rosenthal equation. It is shown that the transient effects of the scan strategy create significant variations in the melt pool geometry and solid-liquid interface velocity, especially as the thermal diffusivity of the material decreases and the pre-heat of the process increases. With positive verification of the strategy, the model was then experimentally validated by simulating two point-melt scan strategies during electron beam melting of IN718, one intended to produce a columnar and one an equiaxed grain structure. Lastly, through comparison of the solidification conditions (i.e., transient and spatial variations of thermal gradient and liquid-solid interface velocity) predicted by the model to phenomenological CET theory, the model accurately predicted the experimental grain structures.

  6. Structural Dynamic Analyses And Test Predictions For Spacecraft Structures With Non-Linearities

    NASA Astrophysics Data System (ADS)

    Vergniaud, Jean-Baptiste; Soula, Laurent; Newerla, Alfred

    2012-07-01

    The overall objective of the mechanical development and verification process is to ensure that the spacecraft structure is able to sustain the mechanical environments encountered during launch. In general, spacecraft structures are a priori assumed to behave linearly, i.e., the responses to a static load or dynamic excitation will increase or decrease proportionally to the amplitude of the load or excitation induced. However, past experience has shown that various non-linearities can exist in spacecraft structures, and the consequences of their dynamic effects can significantly affect the development and verification process. Current processes are mainly adapted to linear spacecraft structural behaviour. No clear rules exist for dealing with major structural non-linearities; they are handled outside the process by individual analysis and margin policy, and by analyses after tests to justify the CLA coverage. Non-linearities primarily affect the current spacecraft development and verification process in two respects. First, prediction of flight loads by launcher/satellite coupled loads analyses (CLA): only linear satellite models are delivered for performing CLA, and no well-established rules exist for properly linearizing a model when non-linearities are present. The potential impact of the linearization on the results of the CLA has not yet been properly analyzed, so it is difficult to assess whether CLA results will cover actual flight levels. Second, management of satellite verification tests: the CLA results generated with a linear satellite FEM are assumed flight representative. If internal non-linearities are present in the tested satellite, it may be difficult to determine which input level must be passed to cover satellite internal loads. The non-linear behaviour can also disturb the shaker control, putting the satellite at risk by potentially imposing excessive levels. This paper presents the results of a test campaign performed in the frame of an ESA TRP study [1]. A breadboard including typical non-linearities was designed, manufactured and tested through a typical spacecraft dynamic test campaign. The study demonstrated the capability to perform non-linear dynamic test predictions on a flight-representative spacecraft, the good correlation of test results with Finite Element Model (FEM) predictions, and the possibility of identifying modal behaviour and characterizing non-linearities from test results. As a synthesis of this study, overall guidelines on the mechanical verification process have been derived to improve the level of expertise for tests involving spacecraft with non-linearities.

  7. Cleared for Launch - Lessons Learned from the OSIRIS-REx System Requirements Verification Program

    NASA Technical Reports Server (NTRS)

    Stevens, Craig; Adams, Angela; Williams, Bradley; Goodloe, Colby

    2017-01-01

    Requirements verification of a large flight system is a challenge. It is especially challenging for engineers taking on their first role in space systems engineering. This paper describes our approach to verification of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) system requirements. It also captures lessons learned along the way from developing systems engineers embroiled in this process. We begin with an overview of the mission and science objectives as well as the project requirements verification program strategy. A description of the requirements flow down is presented including our implementation for managing the thousands of program and element level requirements and associated verification data. We discuss both successes and methods to improve the managing of this data across multiple organizational interfaces. Our approach to verifying system requirements at multiple levels of assembly is presented using examples from our work at instrument, spacecraft, and ground segment levels. We include a discussion of system end-to-end testing limitations and their impacts to the verification program. Finally, we describe lessons learned that are applicable to all emerging space systems engineers using our unique perspectives across multiple organizations of a large NASA program.

  8. Verification and Validation of Multisegmented Mooring Capabilities in FAST v8: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, Morten T.; Wendt, Fabian; Robertson, Amy

    2016-08-01

    The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.

  9. Reconstruction of dynamic structures of experimental setups based on measurable experimental data only

    NASA Astrophysics Data System (ADS)

    Chen, Tian-Yu; Chen, Yang; Yang, Hu-Jiang; Xiao, Jing-Hua; Hu, Gang

    2018-03-01

    Nowadays, massive amounts of data have accumulated across a wide range of fields, and one of the central issues in interdisciplinary research is to analyze existing data and extract as much useful information from them as possible. Often the output data of systems are measurable while the dynamic structures producing these data are hidden; thus, revealing system structures by analyzing available data, i.e., system reconstruction, has become one of the most important tasks of information extraction. In the past, most work in this respect was based on theoretical analyses and numerical verifications; direct analyses of experimental data are very rare. In physical science, most analyses of experimental setups have been based on the first principles of physics laws, i.e., so-called top-down analyses. In this paper, we conducted an experiment with the “Boer resonant instrument for forced vibration” (BRIFV) and inferred the dynamic structure of the experimental setup purely from analysis of the measurable experimental data, i.e., by applying the bottom-up strategy. The dynamics of the experimental setup are strongly nonlinear and chaotic, and subject to inevitable noise. We propose high-order correlation computations to treat the nonlinear dynamics, and two-time correlations to treat noise effects. By applying these approaches, we have successfully reconstructed the structure of the experimental setup, and the dynamic system reconstructed from the measured data reproduces the experimental results well over a wide range of parameters.
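
    The two-time correlation idea can be illustrated on a linear toy problem, where the reconstruction is exact in expectation; this sketch is ours and sidesteps the paper's high-order-correlation treatment of nonlinearity. For x(t+dt) = e^{A dt} x(t) + noise, the correlations obey C(dt) = e^{A dt} C(0), so A can be read off from measured data alone:

        import numpy as np
        from scipy.linalg import expm, logm

        # Simulate a toy linear stochastic system; we then "measure" only x
        # and recover A from the equal-time and two-time correlations.
        rng = np.random.default_rng(0)
        A_true = np.array([[-0.5, 1.0], [-1.0, -0.3]])
        dt, n = 0.01, 200_000
        F = expm(A_true * dt)
        x, X = np.zeros(2), np.empty((n, 2))
        for t in range(n):
            x = F @ x + rng.normal(0.0, 0.05, size=2)
            X[t] = x

        C0 = X[:-1].T @ X[:-1] / (n - 1)   # equal-time correlation C(0)
        C1 = X[1:].T @ X[:-1] / (n - 1)    # two-time (one-step) correlation C(dt)
        A_est = logm(C1 @ np.linalg.inv(C0)).real / dt
        print(np.round(A_est, 2))          # close to A_true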

  10. Benchmarking desktop and mobile handwriting across COTS devices: The e-BioSign biometric database

    PubMed Central

    Tolosana, Ruben; Vera-Rodriguez, Ruben; Fierrez, Julian; Morales, Aythami; Ortega-Garcia, Javier

    2017-01-01

    This paper describes the design, acquisition process and baseline evaluation of the new e-BioSign database, which includes dynamic signature and handwriting information. Data is acquired from 5 different COTS devices: three Wacom devices (STU-500, STU-530 and DTU-1031) specifically designed to capture dynamic signatures and handwriting, and two general-purpose tablets (Samsung Galaxy Note 10.1 and Samsung ATIV 7). For the two Samsung tablets, data is collected using both a pen stylus and the finger in order to study the performance of signature verification in a mobile scenario. Data was collected in two sessions for 65 subjects, and includes dynamic information of the signature, the full name and alphanumeric sequences. Skilled forgeries were also performed for signatures and full names. We also report a benchmark evaluation based on e-BioSign for person verification under three different real scenarios: 1) intra-device, 2) inter-device, and 3) mixed writing-tool. We have exercised the proposed benchmark using the main existing approaches for signature verification: feature-based and time-functions-based. As a result, new insights into the problem of signature biometrics in sensor-interoperable scenarios have been obtained, namely: the importance of specific methods for dealing with device interoperability, and the necessity of a deeper analysis of signatures acquired using the finger as the writing tool. This e-BioSign public database allows the research community to: 1) further analyse and develop signature verification systems in realistic scenarios, and 2) investigate towards a better understanding of the nature of human handwriting when captured using electronic COTS devices in realistic conditions. PMID:28475590

  11. Benchmarking desktop and mobile handwriting across COTS devices: The e-BioSign biometric database.

    PubMed

    Tolosana, Ruben; Vera-Rodriguez, Ruben; Fierrez, Julian; Morales, Aythami; Ortega-Garcia, Javier

    2017-01-01

    This paper describes the design, acquisition process and baseline evaluation of the new e-BioSign database, which includes dynamic signature and handwriting information. Data is acquired from 5 different COTS devices: three Wacom devices (STU-500, STU-530 and DTU-1031) specifically designed to capture dynamic signatures and handwriting, and two general-purpose tablets (Samsung Galaxy Note 10.1 and Samsung ATIV 7). For the two Samsung tablets, data is collected using both a pen stylus and the finger in order to study the performance of signature verification in a mobile scenario. Data was collected in two sessions for 65 subjects, and includes dynamic information of the signature, the full name and alphanumeric sequences. Skilled forgeries were also performed for signatures and full names. We also report a benchmark evaluation based on e-BioSign for person verification under three different real scenarios: 1) intra-device, 2) inter-device, and 3) mixed writing-tool. We have exercised the proposed benchmark using the main existing approaches for signature verification: feature-based and time-functions-based. As a result, new insights into the problem of signature biometrics in sensor-interoperable scenarios have been obtained, namely: the importance of specific methods for dealing with device interoperability, and the necessity of a deeper analysis of signatures acquired using the finger as the writing tool. This e-BioSign public database allows the research community to: 1) further analyse and develop signature verification systems in realistic scenarios, and 2) investigate towards a better understanding of the nature of human handwriting when captured using electronic COTS devices in realistic conditions.
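
    Of the two baseline families mentioned, time-functions-based matching is commonly realized with dynamic time warping. A minimal sketch under that assumption; the threshold, normalization and signal layout are hypothetical, not taken from the paper:

        import numpy as np

        def dtw_distance(a, b):
            # Dynamic time warping between two multivariate time functions
            # (e.g., pen x, y and pressure sampled over time).
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = np.linalg.norm(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m] / (n + m)   # length-normalized warping cost

        def verify(probe, enrolled, threshold=0.25):
            # Accept when the best match against the user's enrolment
            # signatures falls below the (assumed) decision threshold.
            return min(dtw_distance(probe, e) for e in enrolled) < threshold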

  12. Spacecraft attitude calibration/verification baseline study

    NASA Technical Reports Server (NTRS)

    Chen, L. C.

    1981-01-01

    A baseline study for a generalized spacecraft attitude calibration/verification system is presented. It can be used to define software specifications for three major functions required by a mission: the pre-launch parameter observability and data collection strategy study; the in-flight sensor calibration; and the post-calibration attitude accuracy verification. Analytical considerations are given for both single-axis and three-axis spacecraft. The three-axis attitudes considered include the inertial-pointing attitudes, the reference-pointing attitudes, and attitudes undergoing specific maneuvers. The attitude sensors and hardware considered include the Earth horizon sensors, the plane-field Sun sensors, the coarse and fine two-axis digital Sun sensors, the three-axis magnetometers, the fixed-head star trackers, and the inertial reference gyros.

  13. Toward Automatic Verification of Goal-Oriented Flow Simulations

    NASA Technical Reports Server (NTRS)

    Nemec, Marian; Aftosmis, Michael J.

    2014-01-01

    We demonstrate the power of adaptive mesh refinement with adjoint-based error estimates in verification of simulations governed by the steady Euler equations. The flow equations are discretized using a finite volume scheme on a Cartesian mesh with cut cells at the wall boundaries. The discretization error in selected simulation outputs is estimated using the method of adjoint-weighted residuals. Practical aspects of the implementation are emphasized, particularly in the formulation of the refinement criterion and the mesh adaptation strategy. Following a thorough code verification example, we demonstrate simulation verification of two- and three-dimensional problems. These involve an airfoil performance database, a pressure signature of a body in supersonic flow and a launch abort with strong jet interactions. The results show reliable estimates and automatic control of discretization error in all simulations at an affordable computational cost. Moreover, the approach remains effective even when theoretical assumptions, e.g., steady-state and solution smoothness, are relaxed.
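
    The adjoint-weighted residual estimate itself is compact; a sketch in standard notation, with symbols assumed rather than copied from the paper: for an output functional J(Q) and discrete residual R(Q) = 0 on the current mesh, the discretization error in J measured on a finer space h is estimated as

        \[ \delta J \;\approx\; -\,\psi_h^{T} R_h\!\left(Q_h^{H}\right), \qquad \left(\frac{\partial R_h}{\partial Q}\right)^{T}\!\psi_h = \left(\frac{\partial J}{\partial Q}\right)^{T}, \]

    where Q_h^H is the coarse solution injected into the fine space and ψ_h solves the discrete adjoint equation. Cells contributing most to this sum are the natural targets of the refinement criterion.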

  14. Mesh and Time-Step Independent Computational Fluid Dynamics (CFD) Solutions

    ERIC Educational Resources Information Center

    Nijdam, Justin J.

    2013-01-01

    A homework assignment is outlined in which students learn Computational Fluid Dynamics (CFD) concepts of discretization, numerical stability and accuracy, and verification in a hands-on manner by solving physically realistic problems of practical interest to engineers. The students solve a transient-diffusion problem numerically using the common…
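
    A minimal sketch of the kind of exercise described: an explicit FTCS solve of the transient-diffusion equation, whose stability limit students can probe by varying the time step. All parameters below are assumed for illustration:

        import numpy as np

        # FTCS (forward-time, centred-space) scheme for u_t = alpha * u_xx;
        # stability requires r = alpha*dt/dx**2 <= 0.5, which students can
        # probe by varying dt and watching the solution blow up.
        alpha, L, nx, t_end = 1e-4, 1.0, 51, 100.0
        dx = L / (nx - 1)
        dt = 0.4 * dx**2 / alpha       # r = 0.4 < 0.5, so stable
        u = np.zeros(nx)
        u[nx // 2] = 1.0               # initial unit spike, u = 0 at both ends
        for _ in range(int(t_end / dt)):
            u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
        print(f"r = {alpha * dt / dx**2:.2f}, peak = {u.max():.4f}")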

  15. Development tests for the 2.5 megawatt Mod-2 wind turbine generator

    NASA Technical Reports Server (NTRS)

    Andrews, J. S.; Baskin, J. M.

    1982-01-01

    The 2.5 megawatt MOD-2 wind turbine generator test program is discussed. The development of the 2.5 megawatt MOD-2 wind turbine generator included an extensive program of testing which encompassed verification of analytical procedures, component development, and integrated system verification. The test program was to assure achievement of the thirty year design operational life of the wind turbine system as well as to minimize costly design modifications which would otherwise have been required during on site system testing. Computer codes were modified, fatigue life of structure and dynamic components were verified, mechanical and electrical component and subsystems were functionally checked and modified where necessary to meet system specifications, and measured dynamic responses of coupled systems confirmed analytical predictions.

  16. Research on HDR image fusion algorithm based on Laplace pyramid weight transform with extreme low-light CMOS

    NASA Astrophysics Data System (ADS)

    Guan, Wen; Li, Li; Jin, Weiqi; Qiu, Su; Zou, Yan

    2015-10-01

    Extreme-low-light CMOS sensors have been widely applied in the field of night vision as a new type of solid-state image sensor. However, when the illumination in a scene changes drastically or is too strong, an extreme-low-light CMOS sensor cannot clearly present both the high-light and the low-light regions of the scene. To address this partial saturation problem in night vision, an HDR image fusion algorithm based on the Laplacian pyramid was investigated. The overall gray value and contrast of a low-light image are very low. For the top layer of the long-exposure and short-exposure images, which carries rich brightness and textural features, we choose a fusion strategy based on the regional average gradient; the remaining layers, which represent the edge feature information of the target, use a fusion strategy based on regional energy. In reconstructing the source image from the Laplacian pyramid, we compare the fusion results for four kinds of basal images. The algorithm is tested in Matlab and compared with different fusion strategies, using the three objective evaluation parameters of information entropy, average gradient and standard deviation for further analysis of the fusion results. Experiments in different low-illumination environments show that the algorithm can rapidly achieve a wide dynamic range while keeping high entropy. The verification of the algorithm's features indicates a further application prospect for the optimized algorithm. Keywords: high dynamic range imaging, image fusion, multi-exposure image, weight coefficient, information fusion, Laplacian pyramid transform.
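
    A compact sketch of pyramid fusion by regional energy follows. It substitutes a block-average reduce/expand for the usual Gaussian kernels and fuses the top layer by plain averaging rather than the paper's regional average gradient, so it illustrates the idea rather than reproducing the authors' algorithm:

        import numpy as np
        from scipy.ndimage import uniform_filter

        def down(img):  # 2x2 block-average REDUCE (Gaussian kernel omitted)
            h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
            return img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

        def up(img, shape):  # nearest-neighbour EXPAND back to `shape`
            big = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
            return big[:shape[0], :shape[1]]

        def lap_pyr(img, levels=4):
            pyr = []
            for _ in range(levels):
                small = down(img)
                pyr.append(img - up(small, img.shape))
                img = small
            return pyr + [img]  # last entry is the low-pass top layer

        def fuse(long_exp, short_exp, levels=4, win=5):
            pa, pb = lap_pyr(long_exp, levels), lap_pyr(short_exp, levels)
            bands = []
            for a, b in zip(pa[:-1], pb[:-1]):
                # Detail layers: keep the coefficient with larger regional energy.
                ea, eb = uniform_filter(a**2, win), uniform_filter(b**2, win)
                bands.append(np.where(ea >= eb, a, b))
            bands.append(0.5 * (pa[-1] + pb[-1]))  # top layer: plain average here
            out = bands[-1]
            for band in reversed(bands[:-1]):
                out = up(out, band.shape) + band    # pyramid reconstruction
            return out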

  17. Verification of the New FAST v8 Capabilities for the Modeling of Fixed-Bottom Offshore Wind Turbines: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barahona, B.; Jonkman, J.; Damiani, R.

    2014-12-01

    Coupled dynamic analysis has an important role in the design of offshore wind turbines because the systems are subject to complex operating conditions from the combined action of waves and wind. The aero-hydro-servo-elastic tool FAST v8 is framed in a novel modularization scheme that facilitates such analysis. Here, we present the verification of new capabilities of FAST v8 to model fixed-bottom offshore wind turbines. We analyze a series of load cases with both wind and wave loads and compare the results against those from the previous international code comparison projects: the International Energy Agency (IEA) Wind Task 23 Subtask 2 Offshore Code Comparison Collaboration (OC3) and the IEA Wind Task 30 OC3 Continued (OC4) projects. The verification is performed using the NREL 5-MW reference turbine supported by monopile, tripod, and jacket substructures. The substructure structural-dynamics models are built within the new SubDyn module of FAST v8, which uses a linear finite-element beam model with Craig-Bampton dynamic system reduction. This allows the modal properties of the substructure to be synthesized and coupled to hydrodynamic loads and tower dynamics. The hydrodynamic loads are calculated using a new strip-theory approach for multimember substructures in the updated HydroDyn module of FAST v8. These modules are linked to the rest of FAST through the new coupling scheme involving mapping between module-independent spatial discretizations and a numerically rigorous implicit solver. The results show that the new structural dynamics, hydrodynamics, and coupled solutions compare well to the results from the previous code comparison projects.

  18. Air Force is Developing Risk-Mitigation Strategies to Manage Potential Loss of the RD-180 Engine (REDACTED)

    DTIC Science & Technology

    2015-03-05

    … launched on its rocket; estimated completion date of May 2015. Air Force will require verification that SpaceX can meet payload integration … design and accelerate integration capability at Space Exploration Technologies Corporation (SpaceX) launch sites. The Air Force does not intend to … accelerate integration capabilities at SpaceX launch sites because of the studies it directed, but will require verification that SpaceX can meet …

  19. Test and Verification Approach for the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Strong, Edward

    2008-01-01

    This viewgraph presentation describes a test and verification approach for the NASA Constellation Program. The contents include: 1) The Vision for Space Exploration: Foundations for Exploration; 2) Constellation Program Fleet of Vehicles; 3) Exploration Roadmap; 4) Constellation Vehicle Approximate Size Comparison; 5) Ares I Elements; 6) Orion Elements; 7) Ares V Elements; 8) Lunar Lander; 9) Map of Constellation content across NASA; 10) CxP T&V Implementation; 11) Challenges in CxP T&V Program; 12) T&V Strategic Emphasis and Key Tenets; 13) CxP T&V Mission & Vision; 14) Constellation Program Organization; 15) Test and Evaluation Organization; 16) CxP Requirements Flowdown; 17) CxP Model Based Systems Engineering Approach; 18) CxP Verification Planning Documents; 19) Environmental Testing; 20) Scope of CxP Verification; 21) CxP Verification - General Process Flow; 22) Avionics and Software Integrated Testing Approach; 23) A-3 Test Stand; 24) Space Power Facility; 25) MEIT and FEIT; 26) Flight Element Integrated Test (FEIT); 27) Multi-Element Integrated Testing (MEIT); 28) Flight Test Driving Principles; and 29) Constellation's Integrated Flight Test Strategy Low Earth Orbit Servicing Capability.

  20. Combustion Stability Verification for the Thrust Chamber Assembly of J-2X Developmental Engines 10001, 10002, and 10003

    NASA Technical Reports Server (NTRS)

    Morgan, C. J.; Hulka, J. R.; Casiano, M. J.; Kenny, R. J.; Hinerman, T. D.; Scholten, N.

    2015-01-01

    The J-2X engine, a liquid oxygen/liquid hydrogen propellant rocket engine available for future use on the upper stage of the Space Launch System vehicle, has completed testing of three developmental engines at NASA Stennis Space Center. Twenty-one tests of engine E10001 were conducted from June 2011 through September 2012, thirteen tests of engine E10002 were conducted from February 2013 through September 2013, and twelve tests of engine E10003 were conducted from November 2013 to April 2014. Verification of combustion stability of the thrust chamber assembly (TCA) was conducted by perturbing each of the three developmental engines. The primary mechanism for combustion stability verification was examining the response caused by an artificial perturbation (bomb) in the main combustion chamber, i.e., dynamic combustion stability rating. No dynamic instabilities were observed in the TCA, although a few conditions were not bombed. Additional requirements, included to guard against spontaneous instability or rough combustion, were also investigated. Under certain conditions, discrete responses were observed in the dynamic pressure data. The discrete responses were of low amplitude and posed minimal risk to safe engine operability. Rough combustion analyses showed that all three engines met requirements for broad-banded frequency oscillations. Start and shutdown transient chug oscillations were also examined to assess the overall stability characteristics, with no major issues observed.

  1. Verification of operational solar flare forecast: Case of Regional Warning Center Japan

    NASA Astrophysics Data System (ADS)

    Kubo, Yûki; Den, Mitsue; Ishii, Mamoru

    2017-08-01

    In this article, we discuss a verification study of an operational solar flare forecast in the Regional Warning Center (RWC) Japan. The RWC Japan has been issuing four-categorical deterministic solar flare forecasts for a long time. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated some conventional scalar verification measures with 95% confidence intervals. We also estimated a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and the recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitely that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose to use a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecasts, the proposed set of verification measures is the frequency bias for bias; proportion correct and critical success index for accuracy; probability of detection for discrimination; false alarm ratio for reliability; Peirce skill score for forecast skill; and symmetric extremal dependence index for association. For multi-categorical forecasts, we propose as verification measures the marginal distributions of forecast and observation for bias; proportion correct for accuracy; correlation coefficient and joint probability distribution for association; the likelihood distribution for discrimination; the calibration distribution for reliability and resolution; and the Gandin-Murphy-Gerrity score and judgment skill score for skill.
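
    The dichotomous measures listed above all derive from the 2x2 contingency table. A minimal sketch with an invented flare/no-flare tally; the formulas are standard, the numbers are not from the study:

        def dichotomous_scores(hits, false_alarms, misses, correct_negs):
            # Measures from the 2x2 contingency table (a: hits, b: false
            # alarms, c: misses, d: correct negatives).
            a, b, c, d = hits, false_alarms, misses, correct_negs
            n = a + b + c + d
            return {
                "frequency bias": (a + b) / (a + c),
                "proportion correct": (a + d) / n,
                "critical success index": a / (a + b + c),
                "probability of detection": a / (a + c),
                "false alarm ratio": b / (a + b),
                # Peirce skill score = POD minus probability of false detection.
                "Peirce skill score": a / (a + c) - b / (b + d),
            }

        # Hypothetical multi-year tally of flare forecasts:
        print(dichotomous_scores(hits=30, false_alarms=20, misses=10,
                                 correct_negs=940))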

  2. Coupled numerical simulation of fire in tunnel

    NASA Astrophysics Data System (ADS)

    Pesavento, F.; Pachera, M.; Schrefler, B. A.; Gawin, D.; Witek, A.

    2018-01-01

    In this work, a coupling strategy for the analysis of a tunnel under fire is presented. This strategy consists of a "one-way" coupling between a tool considering computational fluid dynamics and radiation and a model treating concrete as a multiphase porous material exposed to high temperature. This global approach allows the behavior of the "tunnel system", composed of the fluid and the solid domain (i.e., the concrete structures), to be taken into account in a realistic manner, from the fire onset through its development and propagation to the response of the structure. The thermal loads as well as the moisture exchange between the structure surface and the environment are calculated by means of computational fluid dynamics. This set of data is passed automatically to the numerical tool implementing a model based on Multiphase Porous Media Mechanics. Thanks to this strategy, the structural verification is no longer based on the standard fire curves commonly used in engineering practice but is directly related to a realistic fire scenario. To show the capability of this strategy, numerical simulations of a fire in the Brenner Base Tunnel, under construction between Italy and Austria, are presented. The numerical simulations show the effects of a more realistic distribution of the thermal loads with respect to those obtained using the standard fire curves. Moreover, it is possible to highlight how the localized thermal load generates a non-uniform pressure rise in the material, which results in an increase of the structural stress state and of the spalling risk. Spalling is likely the most dangerous collapse mechanism for a concrete structure. This coupling approach still represents a "one-way" strategy, i.e., realized without explicitly considering the mass and energy exchange from the structure to the fluid through the interface. This is an approximation, but from a physical point of view the current form of the solid-fluid coupling is considered sufficiently accurate in this first phase of the research.

  3. Amy Robertson | NREL

    Science.gov Websites

    Amy's expertise is in structural dynamics modeling, verification and validation, and data analysis. At NREL, Amy specializes in the modeling of offshore wind system dynamics. She … of offshore wind modeling tools. Prior to joining NREL, Amy worked as an independent consultant for … Contact: Amy.Robertson@nrel.gov | 303-384-7157

  4. Dynamic characterization and microprocessor control of the NASA/UVA proof mass actuator

    NASA Technical Reports Server (NTRS)

    Zimmerman, D. C.; Inman, D. J.; Horner, G. C.

    1984-01-01

    The self-contained electromagnetic-reaction-type force-actuator system developed by NASA/UVA for the verification of spacecraft-structure vibration-control laws is characterized and demonstrated. The device is controlled by a dedicated microprocessor and has dynamic characteristics determined by Fourier analysis. Test data on a cantilevered beam are shown.

  5. Analyzing Personalized Policies for Online Biometric Verification

    PubMed Central

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M.

    2014-01-01

    Motivated by India’s nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident’s biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India’s program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India’s biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32–41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident. PMID:24787752

  6. Analyzing personalized policies for online biometric verification.

    PubMed

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
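
    The likelihood-ratio decision rule at the core of these policies is easy to illustrate. A toy sketch assuming independent Gaussian score models; the paper fits a richer joint distribution over all 12 scores, and every number here is invented:

        import numpy as np
        from scipy.stats import norm

        # Hypothetical similarity-score models for the two populations.
        GEN = norm(loc=0.8, scale=0.10)   # genuine-score model (assumed)
        IMP = norm(loc=0.3, scale=0.15)   # imposter-score model (assumed)

        def accept(scores, threshold=1.0):
            # Accept when the joint likelihood ratio (genuine vs. imposter)
            # of the acquired scores exceeds the threshold; independence
            # across scores is assumed for this sketch only.
            llr = np.sum(GEN.logpdf(scores) - IMP.logpdf(scores))
            return llr > np.log(threshold)

        print(accept(np.array([0.75, 0.82, 0.70])))  # genuine-looking -> True
        print(accept(np.array([0.35, 0.25, 0.40])))  # imposter-looking -> False

    Two-stage policies follow the same pattern: if the first-stage log-likelihood ratio lands between two thresholds rather than above or below both, more images are acquired and the ratio is updated.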

  7. A Practical Approach to Implementing Real-Time Semantics

    NASA Technical Reports Server (NTRS)

    Luettgen, Gerald; Bhat, Girish; Cleaveland, Rance

    1999-01-01

    This paper investigates implementations of process algebras which are suitable for modeling concurrent real-time systems. It suggests an approach for efficiently implementing real-time semantics using dynamic priorities. For this purpose a process algebra with dynamic priority is defined, whose semantics corresponds one-to-one to traditional real-time semantics. The advantage of the dynamic-priority approach is that it drastically reduces the state-space sizes of the systems in question while preserving all properties of their functional and real-time behavior. The utility of the technique is demonstrated by a case study which deals with the formal modeling and verification of the SCSI-2 bus protocol. The case study is carried out in the Concurrency Workbench of North Carolina, an automated verification tool in which the process algebra with dynamic priority is implemented. It turns out that the state space of the bus-protocol model is about an order of magnitude smaller than the one resulting from real-time semantics. The accuracy of the model is proved by applying model checking to verify several mandatory properties of the bus protocol.

  8. Definition of ground test for verification of large space structure control

    NASA Technical Reports Server (NTRS)

    Glaese, John R.

    1994-01-01

    Under this contract, the Large Space Structure Ground Test Verification (LSSGTV) Facility at the George C. Marshall Space Flight Center (MSFC) was developed. Planning in coordination with NASA was finalized and implemented. The contract was modified and extended with several increments of funding to procure additional hardware and to continue support for the LSSGTV facility. Additional tasks were defined for the performance of studies in the dynamics, control and simulation of tethered satellites. When the LSSGTV facility development task was completed, support and enhancement activities were funded through a new competitive contract won by LCD. All work related to LSSGTV performed under NAS8-35835 has been completed and documented. No further discussion of these activities will appear in this report. This report summarizes the tether dynamics and control studies performed.

  9. A "Kane's Dynamics" Model for the Active Rack Isolation System Part Two: Nonlinear Model Development, Verification, and Simplification

    NASA Technical Reports Server (NTRS)

    Beech, G. S.; Hampton, R. D.; Rupert, J. K.

    2004-01-01

    Many microgravity space-science experiments require vibratory acceleration levels that are unachievable without active isolation. The Boeing Corporation's active rack isolation system (ARIS) employs a novel combination of magnetic actuation and mechanical linkages to address these isolation requirements on the International Space Station. Effective model-based vibration isolation requires: (1) an isolation device, (2) an adequate dynamic (i.e., mathematical) model of that isolator, and (3) a suitable, corresponding controller. This Technical Memorandum documents the validation of that high-fidelity dynamic model of ARIS. The verification of this dynamics model was achieved by utilizing two commercial off-the-shelf (COTS) software tools: Deneb's ENVISION(registered trademark), and Online Dynamics' Autolev(trademark). ENVISION is a robotics software package developed for the automotive industry that employs three-dimensional computer-aided design models to facilitate both forward and inverse kinematics analyses. Autolev is a DOS-based interpreter designed, in general, to solve vector-based mathematical problems and specifically to solve dynamics problems using Kane's method. The simplification of this model was achieved using the small-angle theorem for the joint angle of the ARIS actuators. This simplification has a profound effect on the overall complexity of the closed-form solution while still yielding a closed-form solution easily employed using COTS control hardware.

  10. WE-G-213CD-03: A Dual Complementary Verification Method for Dynamic Tumor Tracking on Vero SBRT.

    PubMed

    Poels, K; Depuydt, T; Verellen, D; De Ridder, M

    2012-06-01

    Purpose: To use complementary cine EPID and gimbals log file analysis for in-vivo tracking accuracy monitoring. A clinical prototype of dynamic tracking (DT) was installed on the Vero SBRT system. This prototype version allowed tumor tracking by gimballed linac rotations using an internal-external correspondence model. The DT prototype software allowed the detailed logging of all gimbals rotations applied during tracking. The integration of an EPID on the Vero system allowed the acquisition of cine EPID images during DT. We quantified the tracking error on cine EPID (E-EPID) by taking the difference between the target center (fiducial marker detection) and the field centroid. Dynamic gimbals log file information was combined with orthogonal x-ray verification images to calculate the in-vivo tracking error (E-kVLog). The correlation between E-kVLog and E-EPID was calculated for validation of the gimbals log file. Further, we investigated the sensitivity of the log file tracking error by introducing predefined systematic tracking errors. As an application, we calculated the gimbals log file tracking error for dynamic hidden-target tests to investigate gravity effects and the decoupling of gimbals rotation from gantry rotation. Finally, the clinical accuracy of dynamic tracking was evaluated by calculating complementary cine EPID and log file tracking errors. A strong correlation was found between the log file and cine EPID tracking error distributions during concurrent measurements (R=0.98). The gimbals log files were sensitive enough to detect systematic tracking errors down to 0.5 mm. Dynamic hidden-target tests showed no gravity influence on tracking performance and a high degree of decoupling of gimbals rotation from gantry rotation during dynamic arc dynamic tracking. Submillimetric agreement between the clinical complementary tracking error measurements was found. Redundancy of the internal gimbals log file with x-ray verification images and complementary independent cine EPID images was implemented to monitor the accuracy of gimballed tumor tracking on the Vero SBRT system. Research was financially supported by the Flemish government (FWO), Hercules Foundation and BrainLAB AG. © 2012 American Association of Physicists in Medicine.

  11. Theoretical calculations and experimental verification for the pumping effect caused by the dynamic micro-tapered angle

    NASA Astrophysics Data System (ADS)

    Cai, Yufei; Zhang, Jianhui; Zhu, Chunling; Huang, Jun; Jiang, Feng

    2016-05-01

    Atomizers with micro cone apertures have the advantages of ultra-fine atomized droplets, low power consumption and low temperature rise. Current research on this kind of atomizer mainly focuses on its performance and applications, with less attention to the principle of atomization. By analyzing the deformation of the dispenser and its micro-tapered apertures, the volume changes during the deformation and vibration of a micro-tapered aperture on the dispenser are calculated by coordinate transformation. Based on the characteristics of flow resistance in a cone aperture, it is found that a dynamic cone angle results from the periodic changes in the volume of the micro-tapered aperture of the atomizer, and that this change drives one-way flows. In addition, an experimental atomization platform was established to measure the atomization rates at different resonance frequencies of the cone aperture atomizer, and the atomization performances of cone aperture and straight aperture atomizers were also measured. The experimental results confirm the existence of the pumping effect of the dynamic tapered angle. This effect is relevant to industries that require low dispersion and micro- and nanoscale grain sizes, such as the production of high-pressure nozzles and inhalation therapy. Strategies to minimize the pumping effect of the dynamic cone angle or to improve future designs are important concerns. This research proposes that the dynamic micro-tapered angle is an important cause of atomization in atomizers with micro cone apertures.

  12. Solar Dynamics Observatory (SDO) HGAS Induced Jitter

    NASA Technical Reports Server (NTRS)

    Liu, Alice; Blaurock, Carl; Liu, Kuo-Chia; Mule, Peter

    2008-01-01

    This paper presents the results of a comprehensive assessment of High Gain Antenna System induced jitter on the Solar Dynamics Observatory. The jitter prediction is created using a coupled model of the structural dynamics, optical response, control systems, and stepper motor actuator electromechanical dynamics. The paper gives an overview of the model components, presents the verification processes used to evaluate the models, describes validation and calibration tests and model-to-measurement comparison results, and presents the jitter analysis methodology and results.

  13. Modeling in the quality by design environment: Regulatory requirements and recommendations for design space and control strategy appointment.

    PubMed

    Djuris, Jelena; Djuric, Zorica

    2017-11-30

    Mathematical models can be used as an integral part of the quality by design (QbD) concept throughout the product lifecycle for a variety of purposes, including appointment of the design space and control strategy, continual improvement and risk assessment. Examples of different mathematical modeling techniques (mechanistic, empirical and hybrid) in pharmaceutical development and process monitoring or control are provided in the presented review. In the QbD context, mathematical models are predominantly used to support design spaces and/or control strategies. Considering their impact on final product quality, models can be divided into the following categories: high-, medium- and low-impact models. Although there are regulatory guidelines on the topic of modeling applications, review of QbD-based submissions containing modeling elements revealed concerns regarding the scale-dependency of design spaces and the verification of model predictions at the commercial scale of manufacturing, especially regarding real-time release (RTR) models. The authors provide a critical overview of good modeling practices and introduce the concepts of multiple-unit, adaptive and dynamic design spaces, multivariate specifications and methods for process uncertainty analysis. RTR specifications based on mathematical models and different approaches to multivariate statistical process control supporting process analytical technologies are also presented. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Microscopy as a statistical, Rényi-Ulam, half-lie game: a new heuristic search strategy to accelerate imaging.

    PubMed

    Drumm, Daniel W; Greentree, Andrew D

    2017-11-07

    Finding a fluorescent target in a biological environment is a common and pressing microscopy problem. This task is formally analogous to the canonical search problem. In ideal (noise-free, truthful) search problems, the well-known binary search is optimal. The case of half-lies, where one of two responses to a search query may be deceptive, introduces a richer, Rényi-Ulam problem and is particularly relevant to practical microscopy. We analyse microscopy in the contexts of Rényi-Ulam games and half-lies, developing a new family of heuristics. We show the cost of insisting on verification by positive result in search algorithms; for the zero-half-lie case bisectioning with verification incurs a 50% penalty in the average number of queries required. The optimal partitioning of search spaces directly following verification in the presence of random half-lies is determined. Trisectioning with verification is shown to be the most efficient heuristic of the family in a majority of cases.
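
    One way to see where a penalty of this size can come from (our reading for illustration; the paper's protocol may differ) is to charge one confirmation query for every positive response during bisection. For a uniformly random target roughly half the responses are positive, giving a ratio near 1.5:

      # Toy query-count comparison, under the assumption (ours) that "verification
      # by positive result" means confirming each positive response with one extra
      # query in the noise-free (zero-half-lie) case.
      def bisect_queries(n, target, verify=False):
          lo, hi, queries = 0, n, 0
          while hi - lo > 1:
              mid = (lo + hi) // 2
              queries += 1                      # query: "is the target below mid?"
              if target < mid:                  # positive response
                  hi = mid
                  if verify:
                      queries += 1              # one extra query to confirm it
              else:
                  lo = mid
          return queries

      n = 1024
      plain = sum(bisect_queries(n, t) for t in range(n)) / n
      verified = sum(bisect_queries(n, t, verify=True) for t in range(n)) / n
      print(plain, verified, verified / plain)  # ratio close to 1.5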

  15. Discrete Abstractions of Hybrid Systems: Verification of Safety and Application to User-Interface Design

    NASA Technical Reports Server (NTRS)

    Oishi, Meeko; Tomlin, Claire; Degani, Asaf

    2003-01-01

    Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user-interface of such systems displays a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs which do not have adequate information, or which may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user-interface contains the information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems (not fully automated systems) that have operational constraints we can pose in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.

  16. Analytical Verifications in Cryogenic Testing of NGST Advanced Mirror System Demonstrators

    NASA Technical Reports Server (NTRS)

    Cummings, Ramona; Levine, Marie; VanBuren, Dave; Kegley, Jeff; Green, Joseph; Hadaway, James; Presson, Joan; Cline, Todd; Stahl, H. Philip (Technical Monitor)

    2002-01-01

    Ground-based testing is a critical and costly part of component, assembly, and system verifications of large space telescopes. At such tests, however, with integral teamwork by planners, analysts, and test personnel, segments can be included to validate specific analytical parameters and algorithms at relatively low additional cost. This paper opens with the strategy of analytical verification segments added to vacuum cryogenic testing of Advanced Mirror System Demonstrator (AMSD) assemblies. These AMSD assemblies incorporate material and architecture concepts being considered in the Next Generation Space Telescope (NGST) design. The test segments for workmanship testing, cold survivability, and cold operation optical throughput are supplemented by segments for analytical verification of specific structural, thermal, and optical parameters. Drawing on integrated modeling and separate materials testing, the paper continues with the support plan for analyses, data, and observation requirements during the AMSD testing, currently slated for late calendar year 2002 to mid calendar year 2003. The paper includes anomaly resolution as gleaned by the authors from similar analytical verification support of a previous large space telescope, then closes with a draft of plans for parameter extrapolations, to form a well-verified portion of the integrated modeling being done for NGST performance predictions.

  17. A strategy for automatically generating programs in the lucid programming language

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1987-01-01

    A strategy for automatically generating and verifying simple computer programs is described. The programs are specified by a precondition and a postcondition in predicate calculus. The generated programs are in the Lucid programming language, a high-level, data-flow language known for its attractive mathematical properties and ease of program verification. The Lucid programming language is described, and the automatic program generation strategy is presented and applied to several example problems.
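
    As a generic illustration of such a specification (not an example taken from the paper), an integer square-root program P can be specified by the Hoare-style precondition/postcondition pair

      \[
        \{\, x \ge 0 \,\}\;\; P \;\;\{\, y^2 \le x \,\wedge\, x < (y+1)^2 \,\}
      \]

    where the generator's task is to produce a program P that, for every input satisfying the precondition, terminates in a state satisfying the postcondition.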

  18. The Hawaiian Electric Companies | Energy Systems Integration Facility |

    Science.gov Websites

    Verification of Voltage Regulation Operating Strategies: NREL has studied how the Hawaiian Electric Companies can best manage voltage regulation functions from distributed technologies.

  19. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  20. Model verification of mixed dynamic systems. [POGO problem in liquid propellant rockets

    NASA Technical Reports Server (NTRS)

    Chrostowski, J. D.; Evensen, D. A.; Hasselman, T. K.

    1978-01-01

    A parameter-estimation method is described for verifying the mathematical model of mixed dynamic systems (combining interactive components from various engineering fields) against pertinent experimental data. The model verification problem is divided into two separate parts: defining a proper model and evaluating the parameters of that model. The main idea is to use differences between measured and predicted behavior (response) to automatically adjust the key parameters of a model so as to minimize response differences. To achieve the goal of modeling flexibility, the method combines the convenience of automated matrix generation with the generality of direct matrix input. The equations of motion are treated in first-order form, allowing for nonsymmetric matrices, modeling of general networks, and complex-mode analysis. The effectiveness of the method is demonstrated for an example problem involving a complex hydraulic-mechanical system.
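
    A minimal sketch of this idea, with a scalar first-order system standing in for the mixed hydraulic-mechanical model (names and data are hypothetical, and SciPy's generic least-squares routine replaces the paper's estimator):

      # Adjust model parameters so that differences between measured and
      # predicted responses are minimized (illustration, not the paper's code).
      import numpy as np
      from scipy.integrate import odeint
      from scipy.optimize import least_squares

      t = np.linspace(0.0, 5.0, 100)
      u = 1.0                                    # constant input

      def response(params):
          a, b = params
          return odeint(lambda x, t: -a * x + b * u, 0.0, t).ravel()

      true = np.array([1.5, 2.0])
      measured = response(true) + np.random.default_rng(1).normal(0, 0.01, t.size)

      fit = least_squares(lambda p: response(p) - measured, x0=[1.0, 1.0])
      print(fit.x)                               # recovers values near [1.5, 2.0]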

  1. European Train Control System: A Case Study in Formal Verification

    NASA Astrophysics Data System (ADS)

    Platzer, André; Quesel, Jan-David

    Complex physical systems have several degrees of freedom. They only work correctly when their control parameters obey corresponding constraints. Based on the informal specification of the European Train Control System (ETCS), we design a controller for its cooperation protocol. For its free parameters, we successively identify constraints that are required to ensure collision freedom. We formally prove the parameter constraints to be sharp by characterizing them equivalently in terms of reachability properties of the hybrid system dynamics. Using our deductive verification tool KeYmaera, we formally verify controllability, safety, liveness, and reactivity properties of the ETCS protocol that entail collision freedom. We prove that the ETCS protocol remains correct even in the presence of perturbation by disturbances in the dynamics. We verify that safety is preserved when a PI controlled speed supervision is used.
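
    The flavor of such a parameter constraint can be sketched with elementary kinematics (a simplification, not the paper's exact formulation): a train at position z with speed v and maximal braking deceleration b > 0 can stop before its movement authority m only if

      \[
        v^2 \le 2b\,(m - z)
      \]

    The constraints derived in the paper are sharper, additionally accounting for controller reaction and the disturbances mentioned above.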

  2. Specific 13C labeling of leucine, valine and isoleucine methyl groups for unambiguous detection of long-range restraints in protein solid-state NMR studies

    NASA Astrophysics Data System (ADS)

    Fasshuber, Hannes Klaus; Demers, Jean-Philippe; Chevelkov, Veniamin; Giller, Karin; Becker, Stefan; Lange, Adam

    2015-03-01

    Here we present an isotopic labeling strategy to easily obtain unambiguous long-range distance restraints in protein solid-state NMR studies. The method is based on the inclusion of two biosynthetic precursors in the bacterial growth medium, α-ketoisovalerate and α-ketobutyrate, leading to the production of leucine, valine and isoleucine residues that are exclusively 13C labeled on methyl groups. The resulting spectral simplification facilitates the collection of distance restraints, the verification of carbon chemical shift assignments and the measurement of methyl group dynamics. This approach is demonstrated on the type-three secretion system needle of Shigella flexneri, where 49 methyl-methyl and methyl-nitrogen distance restraints including 10 unambiguous long-range distance restraints could be collected. By combining this labeling scheme with ultra-fast MAS and proton detection, the assignment of methyl proton chemical shifts was achieved.

  3. Definition of ground test for verification of large space structure control

    NASA Technical Reports Server (NTRS)

    Doane, G. B., III; Glaese, J. R.; Tollison, D. K.; Howsman, T. G.; Curtis, S. (Editor); Banks, B.

    1984-01-01

    Control theory and design, dynamic system modelling, and simulation of test scenarios are the main ideas discussed. The overall effort is the achievement at Marshall Space Flight Center of a successful ground test experiment of a large space structure. A simplified planar model for ground test verification was developed. The elimination from that model of the uncontrollable rigid-body modes was also examined. The hardware and software aspects of computation speed were also studied.

  4. Enhanced Verification Test Suite for Physics Simulation Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, J R; Brock, J S; Brandon, S T

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary, but not sufficient, step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of greater sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.

  5. Genome-Scale Screen for DNA Methylation-Based Detection Markers for Ovarian Cancer

    PubMed Central

    Houshdaran, Sahar; Shen, Hui; Widschwendter, Martin; Daxenbichler, Günter; Long, Tiffany; Marth, Christian; Laird-Offringa, Ite A.; Press, Michael F.; Dubeau, Louis; Siegmund, Kimberly D.; Wu, Anna H.; Groshen, Susan; Chandavarkar, Uma; Roman, Lynda D.; Berchuck, Andrew; Pearce, Celeste L.; Laird, Peter W.

    2011-01-01

    Background The identification of sensitive biomarkers for the detection of ovarian cancer is of high clinical relevance for early detection and/or monitoring of disease recurrence. We developed a systematic multi-step biomarker discovery and verification strategy to identify candidate DNA methylation markers for the blood-based detection of ovarian cancer. Methodology/Principal Findings We used the Illumina Infinium platform to analyze the DNA methylation status of 27,578 CpG sites in 41 ovarian tumors. We employed a marker selection strategy that emphasized sensitivity by requiring consistency of methylation across tumors, while achieving specificity by excluding markers with methylation in control leukocyte or serum DNA. Our verification strategy involved testing the ability of identified markers to monitor disease burden in serially collected serum samples from ovarian cancer patients who had undergone surgical tumor resection compared to CA-125 levels. We identified one marker, IFFO1 promoter methylation (IFFO1-M), that is frequently methylated in ovarian tumors and that is rarely detected in the blood of normal controls. When tested in 127 serially collected sera from ovarian cancer patients, IFFO1-M showed post-resection kinetics significantly correlated with serum CA-125 measurements in six out of 16 patients. Conclusions/Significance We implemented an effective marker screening and verification strategy, leading to the identification of IFFO1-M as a blood-based candidate marker for sensitive detection of ovarian cancer. Serum levels of IFFO1-M displayed post-resection kinetics consistent with a reflection of disease burden. We anticipate that IFFO1-M and other candidate markers emerging from this marker development pipeline may provide disease detection capabilities that complement existing biomarkers. PMID:22163280

  6. Biometric verification in dynamic writing

    NASA Astrophysics Data System (ADS)

    George, Susan E.

    2002-03-01

    Pen-tablet devices capable of capturing the dynamics of writing record temporal and pressure information as well as the spatial pattern. This paper explores biometric verification based upon the dynamics of writing, where writers are distinguished not on the basis of what they write (i.e., the signature), but how they write. We have collected samples of dynamic writing from 38 Chinese writers. Each writer was asked to provide 10 copies of a paragraph of text and the same number of signature samples. From the sentence data we extracted stroke-based primitives, utilizing pen-up/down information and heuristic rules about the shape of the characters. The x, y and pressure values of each primitive were interpolated into an even temporal range based upon a 20 msec sampling rate. We applied the Daubechies 1 wavelet transform to the x signal, y signal and pressure signal, using the coefficients as inputs to a multi-layer perceptron trained with back-propagation on the sentence data. We found a sensitivity of 0.977 and specificity of 0.990 recognizing writers based on test primitives extracted from sentence data, and measures of 0.916 and 0.961, respectively, for test primitives extracted from signature data.
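
    A minimal sketch of this pipeline, assuming the PyWavelets and scikit-learn packages and replacing primitive extraction with synthetic data ('db1' is the Daubechies 1 wavelet named above):

      import numpy as np
      import pywt
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(0)

      def features(x, y, p):
          """Concatenate db1 wavelet coefficients of the x, y and pressure signals."""
          coeffs = [c for sig in (x, y, p) for c in pywt.dwt(sig, "db1")]
          return np.concatenate(coeffs)

      # Hypothetical training set: 200 primitives of 32 samples each, two writers.
      X = np.array([features(*rng.normal(size=(3, 32))) for _ in range(200)])
      labels = rng.integers(0, 2, size=200)

      clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000)
      clf.fit(X, labels)              # back-propagation training, as in the paper
      print(clf.predict(X[:5]))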

  7. Determination and Control of Optical and X-Ray Wave Fronts

    NASA Technical Reports Server (NTRS)

    Kim, Young K.

    1997-01-01

    A successful design of a space-based or ground optical system requires an iterative procedure which includes the kinematics and dynamics of the system in operating environment, control synthesis and verification. To facilitate the task of designing optical wave front control systems being developed at NASA/MSFC, a multi-discipline dynamics and control tool has been developed by utilizing TREETOPS, a multi-body dynamics and control simulation, NASTRAN and MATLAB. Dynamics and control models of STABLE and ARIS were developed for TREETOPS simulation, and their simulation results are documented in this report.

  8. Clinical evaluation of 4D PET motion compensation strategies for treatment verification in ion beam therapy

    NASA Astrophysics Data System (ADS)

    Gianoli, Chiara; Kurz, Christopher; Riboldi, Marco; Bauer, Julia; Fontana, Giulia; Baroni, Guido; Debus, Jürgen; Parodi, Katia

    2016-06-01

    A clinical trial named PROMETHEUS is currently ongoing for inoperable hepatocellular carcinoma (HCC) at the Heidelberg Ion Beam Therapy Center (HIT, Germany). In this framework, 4D PET-CT datasets are acquired shortly after the therapeutic treatment to compare the irradiation induced PET image with a Monte Carlo PET prediction resulting from the simulation of treatment delivery. The extremely low count statistics of this measured PET image represents a major limitation of this technique, especially in presence of target motion. The purpose of the study is to investigate two different 4D PET motion compensation strategies towards the recovery of the whole count statistics for improved image quality of the 4D PET-CT datasets for PET-based treatment verification. The well-known 4D-MLEM reconstruction algorithm, embedding the motion compensation in the reconstruction process of 4D PET sinograms, was compared to a recently proposed pre-reconstruction motion compensation strategy, which operates in sinogram domain by applying the motion compensation to the 4D PET sinograms. With reference to phantom and patient datasets, advantages and drawbacks of the two 4D PET motion compensation strategies were identified. The 4D-MLEM algorithm was strongly affected by inverse inconsistency of the motion model but demonstrated the capability to mitigate the noise-break-up effects. Conversely, the pre-reconstruction warping showed less sensitivity to inverse inconsistency but also more noise in the reconstructed images. The comparison was performed by relying on quantification of PET activity and ion range difference, typically yielding similar results. The study demonstrated that treatment verification of moving targets could be accomplished by relying on the whole count statistics image quality, as obtained from the application of 4D PET motion compensation strategies. In particular, the pre-reconstruction warping was shown to represent a promising choice when combined with intra-reconstruction smoothing.
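
    Schematically (our notation, not the paper's), the 4D-MLEM update folds gate-to-reference warp operators W_g into the usual MLEM iteration with system matrix A and gated sinograms y_g:

      \[
        \lambda^{(k+1)} \;=\;
        \frac{\lambda^{(k)}}{\sum_g W_g^{T} A^{T} \mathbf{1}}
        \sum_g W_g^{T} A^{T} \frac{y_g}{A\, W_g\, \lambda^{(k)}}
      \]

    whereas the pre-reconstruction strategy instead warps the gated data to the reference gate first and then runs a single conventional reconstruction, which is where its different sensitivity to motion-model inconsistency and noise arises.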

  9. Closed-Loop Acoustic Control of Reverberant Room for Satellite Environmental Testing

    NASA Astrophysics Data System (ADS)

    Janssens, Karl; Bianciardi, Fabio; Sabbatini, Danilo; Debille, Jan; Carrella, Alex

    2012-07-01

    The full satellite acoustic test is an important milestone in a satellite launch survivability verification campaign. This test is required to verify the satellite’s mechanical design against the high-level acoustic loads induced by the launch vehicle during the atmospheric flight. During the test, the satellite is subjected to a broadband diffuse acoustic field, reproducing the pressure levels observed during launch. The excitation is in most cases provided by a combination of horns for the low frequencies and noise generators for the higher frequencies. Acoustic control tests are commonly performed in reverberant rooms, controlling the sound pressure levels in third octave bands over the specified target spectrum. This paper discusses an automatic feedback control system for acoustic control of large reverberation rooms for satellite environmental testing. The acoustic control system consists of parallel third octave PI (Proportional Integral) feedback controllers that take the reverberation characteristics of the room into consideration. The drive output of the control system is shaped at every control step based on the comparison of the average third octave noise spectrum, measured from a number of microphones in the test room, with the target spectrum. Cross-over filters split the output drive into band-limited signals to feed each of the horns. The control system is realized in several steps. In the first phase, a dynamic process model is developed, including the non-linear characteristics of the horns and the reverberant properties of the room. The model is identified from dynamic experiments using system identification techniques. In the next phase, an adequate control strategy is designed which is capable of reaching the target spectrum in the required time period without overshoots. This control strategy is obtained from model-in-the-loop (MIL) simulations, evaluating the performance of various potential strategies. Finally, the proposed strategy is implemented in real-time and its control performance tested and validated.
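
    As a minimal sketch of one control step of the parallel third-octave PI scheme (gains, band count and levels are hypothetical assumptions, not values from the paper):

      import numpy as np

      KP, KI, DT = 0.5, 0.2, 0.5          # hypothetical PI gains and step (s)

      def pi_step(target_db, measured_db, drive_db, integ):
          """One PI update per third-octave band; returns new drives and integrators."""
          err = target_db - measured_db          # dB error in each band
          integ = integ + err * DT
          return drive_db + KP * err + KI * integ, integ

      bands = 18                                 # e.g. third octaves 31.5 Hz .. 2 kHz
      target = np.full(bands, 140.0)             # hypothetical target SPL (dB)
      measured = target - np.linspace(6, 1, bands)
      drive, integ = pi_step(target, measured, np.zeros(bands), np.zeros(bands))
      print(drive)

    Cross-over filtering of the resulting drive into band-limited horn signals, as described above, would follow this update.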

  10. Description of a Website Resource for Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.

    2010-01-01

    The activities of the Turbulence Model Benchmarking Working Group - which is a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee - are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of future plans of the working group is also provided.

  11. Conceptual design of a Moving Belt Radiator (MBR) shuttle-attached experiment

    NASA Technical Reports Server (NTRS)

    Aguilar, Jerry L.

    1990-01-01

    The conceptual design of a shuttle-attached Moving Belt Radiator (MBR) experiment is presented. The MBR is an advanced radiator concept in which a rotating belt is used to radiate thermal energy to space. The experiment is developed with the primary focus being the verification of the dynamic characteristics of a rotating belt, with a secondary objective of proving the thermal and sealing aspects in a reduced gravity, vacuum environment. The mechanical design, selection of the belt material and working fluid, a preliminary test plan, and program plan are presented. The strategy used for selecting the basic sizes and materials of the components is discussed. Shuttle and crew member requirements are presented with some options for increasing or decreasing the demands on the STS. An STS carrier and the criteria used in the selection process are presented. The proposed carrier for the Moving Belt Radiator experiment is the Hitchhiker-M. Safety issues are also listed with possible results. This experiment is designed so that a belt can be deployed, run at steady state conditions, run with dynamic perturbations imposed, verify the operation of the interface heat exchanger and seals, and finally be retracted into a stowed position for transport back to Earth.

  12. ECCD-induced tearing mode stabilization via active control in coupled NIMROD/GENRAY HPC simulations

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas; Kruger, S. E.; Held, E. D.; Harvey, R. W.

    2012-10-01

    Actively controlled electron cyclotron current drive (ECCD) applied within magnetic islands formed by neoclassical tearing modes (NTMs) has been shown to control or suppress these modes. In conjunction with ongoing experimental efforts, the development and verification of integrated numerical models of this mode stabilization process are of paramount importance in determining optimal NTM stabilization strategies for ITER. In the advanced model developed by the SWIM Project, the equations/closures of extended (not reduced) MHD contain new terms arising from 3D (not toroidal or bounce-averaged) RF-induced quasilinear diffusion. The quasilinear operator formulation models the equilibration of driven current within the island using the same extended MHD dynamics which govern the physics of island formation, yielding a more accurate and self-consistent picture of 3D island response to RF drive. Results of computations which model ECRF deposition using ray tracing, assemble the 3D quasilinear operator from ray/profile data, and calculate the resultant forces within the extended MHD code will be presented. We also discuss the efficacy of various numerical active feedback control systems, which gather data from synthetic diagnostics to dynamically trigger and spatially align RF fields.

  13. Intelligent voltage control strategy for three-phase UPS inverters with output LC filter

    NASA Astrophysics Data System (ADS)

    Jung, J. W.; Leu, V. Q.; Dang, D. Q.; Do, T. D.; Mwasilu, F.; Choi, H. H.

    2015-08-01

    This paper presents a supervisory fuzzy neural network control (SFNNC) method for a three-phase inverter of uninterruptible power supplies (UPSs). The proposed voltage controller is comprised of a fuzzy neural network control (FNNC) term and a supervisory control term. The FNNC term is deliberately employed to estimate the uncertain terms, and the supervisory control term is designed based on the sliding mode technique to stabilise the system dynamic errors. To improve the learning capability, the FNNC term incorporates an online parameter training methodology, using the gradient descent method and Lyapunov stability theory. In addition, a linear load current observer that estimates the load currents is used to eliminate the need for load current sensors. The proposed SFNN controller and the observer are robust to the filter inductance variations, and their stability analyses are described in detail. The experimental results obtained on a prototype UPS test bed with a TMS320F28335 DSP are presented to validate the feasibility of the proposed scheme. Verification results demonstrate that the proposed control strategy can achieve smaller steady-state error and lower total harmonic distortion when subjected to nonlinear or unbalanced loads compared to the conventional sliding mode control method.

  14. A Novel Patient Recruitment Strategy: Patient Selection Directly from the Community through Linkage to Clinical Data.

    PubMed

    Zimmerman, Lindsay P; Goel, Satyender; Sathar, Shazia; Gladfelter, Charon E; Onate, Alejandra; Kane, Lindsey L; Sital, Shelly; Phua, Jasmin; Davis, Paris; Margellos-Anast, Helen; Meltzer, David O; Polonsky, Tamar S; Shah, Raj C; Trick, William E; Ahmad, Faraz S; Kho, Abel N

    2018-01-01

    This article presents and describes our methods in developing a novel strategy for recruitment of underrepresented, community-based participants, for pragmatic research studies leveraging routinely collected electronic health record (EHR) data. We designed a new approach for recruiting eligible patients from the community, while also leveraging affiliated health systems to extract clinical data for community participants. The strategy involves methods for data collection, linkage, and tracking. In this workflow, potential participants are identified in the community and surveyed regarding eligibility. These data are then encrypted and deidentified via a hashing algorithm for linkage of the community participant back to a record at a clinical site. The linkage allows for eligibility verification and automated follow-up. Longitudinal data are collected by querying the EHR data and surveying the community participant directly. We discuss this strategy within the context of two national research projects, a clinical trial and an observational cohort study. The community-based recruitment strategy is a novel, low-touch, clinical trial enrollment method to engage a diverse set of participants. Direct outreach to community participants, while utilizing EHR data for clinical information and follow-up, allows for efficient recruitment and follow-up strategies. This new strategy for recruitment links data reported from community participants to clinical data in the EHR and allows for eligibility verification and automated follow-up. The workflow has the potential to improve recruitment efficiency and engage traditionally underrepresented individuals in research. Schattauer GmbH Stuttgart.
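
    A minimal sketch of hash-based deidentified linkage (the normalization rules, salt handling and fields here are illustrative assumptions, not the project's actual algorithm):

      import hashlib

      SITE_SALT = "shared-secret-salt"       # hypothetical salt shared across sites

      def link_key(first, last, dob):
          """Deterministic, deidentified key from normalized identifiers."""
          normalized = "|".join(s.strip().lower() for s in (first, last, dob))
          return hashlib.sha256((SITE_SALT + normalized).encode()).hexdigest()

      # The same person surveyed in the community and seen at a clinical site
      # yields identical keys, enabling linkage without exchanging identifiers.
      print(link_key("Maria", "Lopez", "1970-01-31") ==
            link_key(" maria ", "LOPEZ", "1970-01-31"))   # True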

  15. SU-E-T-490: Independent Three-Dimensional (3D) Dose Verification of VMAT/SBRT Using EPID and Cloud Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, A; Han, B; Bush, K

    Purpose: Dosimetric verification of VMAT/SBRT is currently performed on one or two planes in a phantom with either film or array detectors. A robust and easy-to-use 3D dosimetric tool has been sought since the advent of conformal radiation therapy. Here we present such a strategy: an independent 3D VMAT/SBRT plan verification system based on the combined use of EPID and cloud-based Monte Carlo (MC) dose calculation. Methods: The 3D dosimetric verification proceeds in two steps. First, the plan was delivered with a high resolution portable EPID mounted on the gantry, and the EPID-captured gantry-angle-resolved VMAT/SBRT field images were converted into fluence by using the EPID pixel response function derived from MC simulations. The fluence was resampled and used as the input for an in-house developed Amazon cloud-based MC software to reconstruct the 3D dose distribution. The accuracy of the developed 3D dosimetric tool was assessed using a Delta4 phantom with various field sizes (square, circular, rectangular, and irregular MLC fields) and different patient cases. The method was applied to validate VMAT/SBRT plans using WFF and FFF photon beams (Varian TrueBeam STX). Results: It was found that the proposed method yielded results consistent with the Delta4 measurements. For points on the two detector planes, a good agreement within 1.5% was found for all the testing fields. Patient VMAT/SBRT plan studies revealed a similar level of accuracy: an average γ-index passing rate of 99.2 ± 0.6% (3mm/3%), 97.4 ± 2.4% (2mm/2%), and 72.6 ± 8.4% (1mm/1%). Conclusion: A valuable 3D dosimetric verification strategy has been developed for VMAT/SBRT plan validation. The technique provides a viable solution for a number of intractable dosimetry problems, such as small fields and plans with high dose gradient.
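
    For reference, the γ-index quoted above follows the standard definition: with dose-difference criterion ΔD (e.g. 3%) and distance-to-agreement criterion Δd (e.g. 3 mm), a reference point r_r passes when

      \[
        \gamma(\mathbf{r}_r) \;=\; \min_{\mathbf{r}_e}
        \sqrt{\frac{\lVert \mathbf{r}_e - \mathbf{r}_r \rVert^2}{\Delta d^2}
            + \frac{\left[D_e(\mathbf{r}_e) - D_r(\mathbf{r}_r)\right]^2}{\Delta D^2}}
        \;\le\; 1
      \]

    where the minimum is taken over evaluated points r_e, and the passing rate is the fraction of reference points with γ ≤ 1.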

  16. Evaluation of the 29-km Eta Model for Weather Support to the United States Space Program

    NASA Technical Reports Server (NTRS)

    Manobianco, John; Nutter, Paul

    1997-01-01

    The Applied Meteorology Unit (AMU) conducted a year-long evaluation of NCEP's 29-km mesoscale Eta (meso-eta) weather prediction model in order to identify added value to forecast operations in support of the United States space program. The evaluation was stratified over warm and cool seasons and considered both objective and subjective verification methodologies. Objective verification results generally indicate that meso-eta model point forecasts at selected stations exhibit minimal error growth in terms of RMS errors and are reasonably unbiased. Conversely, results from the subjective verification demonstrate that model forecasts of developing weather events such as thunderstorms, sea breezes, and cold fronts are not always as accurate as implied by the seasonal error statistics. Sea-breeze case studies reveal that the model generates a dynamically-consistent thermally direct circulation over the Florida peninsula, although at a larger scale than observed. Thunderstorm verification reveals that the meso-eta model is capable of predicting areas of organized convection, particularly during the late afternoon hours, but is not capable of forecasting individual thunderstorms. Verification of cold fronts during the cool season reveals that the model is capable of forecasting a majority of cold frontal passages through east central Florida to within ±1 h of observed frontal passage.

  17. WEC3: Wave Energy Converter Code Comparison Project: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien

    This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases. Phase I consists of a code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency domain modelling tools were not included in the WEC3 project.

  18. Integrated verification and testing system (IVTS) for HAL/S programs

    NASA Technical Reports Server (NTRS)

    Senn, E. H.; Ames, K. R.; Smith, K. A.

    1983-01-01

    The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.

  19. A program for the investigation of the Multibody Modeling, Verification, and Control Laboratory

    NASA Technical Reports Server (NTRS)

    Tobbe, Patrick A.; Christian, Paul M.; Rakoczy, John M.; Bulter, Marlon L.

    1993-01-01

    The Multibody Modeling, Verification, and Control (MMVC) Laboratory is under development at NASA MSFC in Huntsville, Alabama. The laboratory will provide a facility in which dynamic tests and analyses of multibody flexible structures representative of future space systems can be conducted. The purpose of the tests is to acquire dynamic measurements of the flexible structures undergoing large angle motions and to use the data to validate the multibody modeling code, TREETOPS, developed under sponsorship of NASA. Advanced control systems design and system identification methodologies will also be implemented in the MMVC laboratory. This paper describes the ground test facility, the real-time control system, and the experiments. A top-level description of the TREETOPS code is also included along with the validation plan for the MMVC program. Dynamic test results from component testing are also presented and discussed. A detailed discussion of the test articles, which manifest the properties of large flexible space structures, is included along with a discussion of the various candidate control methodologies to be applied in the laboratory.

  20. Shuttle structural dynamics characteristics: The analysis and verification

    NASA Technical Reports Server (NTRS)

    Modlin, C. T., Jr.; Zupp, G. A., Jr.

    1985-01-01

    The space shuttle introduced a new dimension in the complexity of the structural dynamics of a space vehicle. The four-body configuration exhibited structural frequencies as low as 2 hertz with a modal density on the order of 10 modes per hertz. In the verification process, certain mode shapes and frequencies were identified by the users as more important than others and, as such, the test objectives were oriented toward experimentally extracting those modes and frequencies for analysis and test correlation purposes. To provide the necessary experimental data, a series of ground vibration tests (GVT's) was conducted using test articles ranging from the 1/4-scale structural replica of the space shuttle to the full-scale vehicle. The vibration test and analysis program revealed that the mode shapes and frequency correlations below 10 hertz were good. The quality of correlation of modes between 10 and 20 hertz ranged from good to fair and that of modes above 20 hertz ranged from poor to good. Since the most important modes, based on user preference, were below 10 hertz, it was judged that the shuttle structural dynamic models were adequate for flight certifications.

  1. Fast regional readout CMOS Image Sensor for dynamic MLC tracking

    NASA Astrophysics Data System (ADS)

    Zin, H.; Harris, E.; Osmond, J.; Evans, P.

    2014-03-01

    Advanced radiotherapy techniques such as volumetric modulated arc therapy (VMAT) require verification of the complex beam delivery, including tracking of multileaf collimators (MLC) and monitoring of the dose rate. This work explores the feasibility of a prototype Complementary Metal-Oxide-Semiconductor Image Sensor (CIS) for tracking these complex treatments by utilising fast, region of interest (ROI) readout functionality. An automatic edge tracking algorithm was used to locate the MLC leaf edges moving at various speeds (from a moving triangle field shape) and imaged at various sensor frame rates. The CIS demonstrates successful edge detection of the dynamic MLC motion to within an accuracy of 1.0 mm. This demonstrates the feasibility of the sensor to verify treatment delivery involving dynamic MLC at up to ~400 frames per second (equivalent to the linac pulse rate), which is superior to current techniques such as electronic portal imaging devices (EPID). CIS provides the basis for an essential real-time verification tool, useful in assessing accurate delivery of complex high energy radiation to the tumour and ultimately in achieving better cure rates for cancer patients.
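
    A minimal sketch of one way to locate a leaf edge on a single ROI profile (our illustration, not the authors' algorithm): take the point of steepest intensity gradient across the penumbra and refine it to sub-pixel accuracy.

      import numpy as np

      def leaf_edge_px(profile):
          """Sub-pixel edge position from the gradient peak of a 1D profile."""
          g = np.abs(np.gradient(profile.astype(float)))
          i = int(np.argmax(g))
          # parabolic sub-pixel refinement around the gradient peak
          if 0 < i < len(g) - 1:
              denom = g[i - 1] - 2 * g[i] + g[i + 1]
              if denom != 0:
                  return i + 0.5 * (g[i - 1] - g[i + 1]) / denom
          return float(i)

      x = np.arange(200)
      profile = 1.0 / (1.0 + np.exp(-(x - 87.3)))   # synthetic penumbra at 87.3 px
      print(leaf_edge_px(profile))                  # close to 87.3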

  2. Verification of road databases using multiple road models

    NASA Astrophysics Data System (ADS)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in form of two probability distributions, the first one for the state of a database object (correct or incorrect), and a second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with larger completeness. Additional experiments reveal that based on the proposed method a highly reliable semi-automatic approach for road data base verification can be designed.
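
    A minimal sketch of this combination step (our simplification of the Dempster-Shafer machinery): each module contributes mass on {correct, incorrect} plus an "unknown" mass equal to the probability that its road model is not applicable, and modules are fused with Dempster's rule.

      def module_mass(p_correct, p_applicable):
          return {"C": p_applicable * p_correct,
                  "I": p_applicable * (1.0 - p_correct),
                  "U": 1.0 - p_applicable}       # model not applicable -> unknown

      def dempster(m1, m2):
          """Dempster's rule on the frame {C, I}, with U = {C, I}."""
          k = m1["C"] * m2["I"] + m1["I"] * m2["C"]        # conflicting mass
          s = 1.0 / (1.0 - k)
          return {"C": s * (m1["C"] * m2["C"] + m1["C"] * m2["U"] + m1["U"] * m2["C"]),
                  "I": s * (m1["I"] * m2["I"] + m1["I"] * m2["U"] + m1["U"] * m2["I"]),
                  "U": s * (m1["U"] * m2["U"])}

      # Two modules: one confident and applicable, one barely applicable.
      print(dempster(module_mass(0.9, 0.8), module_mass(0.4, 0.3)))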

  3. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    NASA Astrophysics Data System (ADS)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models and measurements, and propagate the uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two of the aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined by the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impacts on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of nuclear reactor models. We employ this simple heat model to illustrate verification techniques for model calibration. For Bayesian model calibration, we employ adaptive Metropolis algorithms to construct densities for input parameters in the heat model and the HIV model. To quantify the uncertainty in the parameters, we employ two MCMC algorithms: Delayed Rejection Adaptive Metropolis (DRAM) [33] and Differential Evolution Adaptive Metropolis (DREAM) [66, 68]. The densities obtained using these methods are compared to those obtained through the direct numerical evaluation of Bayes' formula. We also combine uncertainties in input parameters and measurement errors to construct predictive estimates for a model response. A significant emphasis is on the development and illustration of techniques to verify the accuracy of sampling-based Metropolis algorithms. We verify the accuracy of DRAM and DREAM by comparing chains, densities and correlations obtained using DRAM, DREAM and the direct evaluation of Bayes' formula. We also perform similar analysis for credible and prediction intervals for responses. Once the parameters are estimated, we employ the energy statistics test [63, 64] to compare the densities obtained by different methods for the HIV model. The energy statistics are used to test the equality of distributions. We also consider parameter selection and verification techniques for models having one or more parameters that are noninfluential in the sense that they minimally impact model outputs.
We illustrate these techniques for a dynamic HIV model but note that the parameter selection and verification framework is applicable to a wide range of biological and physical models. To accommodate the nonlinear input to output relations, which are typical for such models, we focus on global sensitivity analysis techniques, including those based on partial correlations, Sobol indices based on second-order model representations, and Morris indices, as well as a parameter selection technique based on standard errors. A significant objective is to provide verification strategies to assess the accuracy of those techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection techniques. The objective of active subspace methods is to determine the subspace of inputs that most strongly affect the model response, and to reduce the dimension of the input space. The major difference between active subspace methods and parameter selection techniques is that parameter selection identifies influential parameters whereas subspace selection identifies a linear combination of parameters that impacts the model responses significantly. We employ active subspace methods discussed in [22] for the HIV model and present a verification that the active subspace successfully reduces the input dimensions.
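
    The sampling machinery being verified builds on the Metropolis algorithm; a minimal random-walk sketch follows (DRAM and DREAM add delayed rejection and adaptive or population-based proposals on top of this skeleton; the target posterior here is a hypothetical stand-in):

      import numpy as np

      def metropolis(log_post, x0, n_iter=5000, step=0.1, seed=0):
          rng = np.random.default_rng(seed)
          x = np.asarray(x0, dtype=float)
          lp = log_post(x)
          chain = []
          for _ in range(n_iter):
              prop = x + step * rng.normal(size=x.size)     # random-walk proposal
              lp_prop = log_post(prop)
              if np.log(rng.uniform()) < lp_prop - lp:      # accept/reject
                  x, lp = prop, lp_prop
              chain.append(x.copy())
          return np.array(chain)

      # Hypothetical 2-parameter posterior (standard normal) for demonstration.
      chain = metropolis(lambda th: -0.5 * np.sum(th**2), x0=[2.0, -2.0])
      print(chain.mean(axis=0), chain.std(axis=0))          # ~[0, 0], ~[1, 1]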

  4. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
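
    For intuition, a toy illustration of the recursive-halving referral scheme used by the Red Balloon winners (an illustration only, not the paper's derivation of the optimal compensation):

      def referral_payments(budget, chain_length):
          """Payments from finder up the referral chain, halving at each step."""
          payments, amount = [], budget / 2.0
          for _ in range(chain_length):
              payments.append(amount)
              amount /= 2.0
          return payments            # total paid never exceeds the budget

      print(referral_payments(4000.0, 4))   # [2000.0, 1000.0, 500.0, 250.0]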

  5. Bridge Health Monitoring Using a Machine Learning Strategy

    DOT National Transportation Integrated Search

    2017-01-01

    The goal of this project was to cast the SHM problem within a statistical pattern recognition framework. Techniques borrowed from speaker recognition, particularly speaker verification, were used as this discipline deals with problems very similar to...

  6. Evaluation and Research for Technology: Not Just Playing Around.

    ERIC Educational Resources Information Center

    Baker, Eva L.; O'Neil, Harold F., Jr.

    2003-01-01

    Discusses some of the challenges of technology-based training and education, the role of quality verification and evaluation, and strategies to integrate evaluation into the everyday design of technology-based systems for education and training. (SLD)

  7. Damage Detection and Verification System (DDVS) for In-Situ Health Monitoring

    NASA Technical Reports Server (NTRS)

    Williams, Martha K.; Lewis, Mark; Szafran, J.; Shelton, C.; Ludwig, L.; Gibson, T.; Lane, J.; Trautwein, T.

    2015-01-01

    Project presentation for Game Changing Program Smart Book Release. The Damage Detection and Verification System (DDVS) expands the Flat Surface Damage Detection System (FSDDS) sensory panels' damage detection capabilities and includes an autonomous inspection capability utilizing cameras and dynamic computer vision algorithms to verify system health. Objectives of this formulation task are to establish the concept of operations, formulate the system requirements for a potential ISS flight experiment, and develop a preliminary design of an autonomous inspection capability system that will be demonstrated as a proof-of-concept ground-based damage detection and inspection system.

  8. Expert system verification and validation survey, delivery 4

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.

  9. Expert system verification and validation survey. Delivery 2: Survey results

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and industry applications. This is the first task of the series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.

  10. The experimental verification of wall movement influence coefficients for an adaptive walled test section

    NASA Technical Reports Server (NTRS)

    Neal, G.

    1988-01-01

    Flexible walled wind tunnels have for some time been used to reduce wall interference effects at the model. A necessary part of the 3-D wall adjustment strategy being developed for the Transonic Self-Streamlining Wind Tunnel (TSWT) of Southampton University is the use of influence coefficients. The influence of a wall bump on the centerline flow in TSWT has been calculated theoretically using a streamline curvature program. This report details the experimental verification of these influence coefficients and concludes that it is valid to use the theoretically determined values in 3-D model testing.

  11. Expert system verification and validation survey. Delivery 5: Revised

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.

  12. Expert system verification and validation survey. Delivery 3: Recommendations

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.

  13. Authoring and verification of clinical guidelines: a model driven approach.

    PubMed

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked against them, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically process them into the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements, and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined based on a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. Copyright 2010 Elsevier Inc. All rights reserved.
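
    A minimal sketch, in plain Python rather than a real model checker, of the kind of exhaustive state exploration the framework delegates to model checking: a toy guideline statechart is searched for a path violating a hypothetical safety property, and a counterexample trace is returned. The states, transitions and property below are invented for illustration.

    # Toy "model checking" by breadth-first search over a guideline statechart.
    from collections import deque

    transitions = {
        "assess":    ["treat_a", "treat_b"],
        "treat_a":   ["review"],
        "treat_b":   ["review", "treat_a"],  # deliberate inconsistency
        "review":    ["discharge"],
        "discharge": [],
    }

    def violates(path):
        # Hypothetical safety property: treatment A must never follow treatment B.
        if "treat_b" not in path:
            return False
        return "treat_a" in path[path.index("treat_b") + 1:]

    def check(initial="assess"):
        queue = deque([[initial]])
        while queue:
            path = queue.popleft()
            if violates(path):
                return path                  # counterexample trace
            for nxt in transitions[path[-1]]:
                if nxt not in path:          # cut loops in this toy model
                    queue.append(path + [nxt])
        return None                          # property holds

    print(check())  # -> ['assess', 'treat_b', 'treat_a'] counterexample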

  14. Impact of an antiretroviral stewardship strategy on medication error rates.

    PubMed

    Shea, Katherine M; Hobbs, Athena Lv; Shumake, Jason D; Templet, Derek J; Padilla-Tolentino, Eimeira; Mondy, Kristin E

    2018-05-02

    The impact of an antiretroviral stewardship strategy on medication error rates was evaluated. This single-center, retrospective, comparative cohort study included patients at least 18 years of age infected with human immunodeficiency virus (HIV) who were receiving antiretrovirals and admitted to the hospital. A multicomponent approach was developed and implemented and included modifications to the order-entry and verification system, pharmacist education, and a pharmacist-led antiretroviral therapy checklist. Pharmacists performed prospective audits using the checklist at the time of order verification. To assess the impact of the intervention, a retrospective review was performed before and after implementation to assess antiretroviral errors. Totals of 208 and 24 errors were identified before and after the intervention, respectively, resulting in a significant reduction in the overall error rate (p < 0.001). In the postintervention group, significantly lower medication error rates were found in both patient admissions containing at least 1 medication error (p < 0.001) and those with 2 or more errors (p < 0.001). Significant reductions were also identified in each error type, including incorrect/incomplete medication regimen, incorrect dosing regimen, incorrect renal dose adjustment, incorrect administration, and the presence of a major drug-drug interaction. A regression tree selected ritonavir as the only specific medication that best predicted more errors preintervention (p < 0.001); however, no antiretrovirals reliably predicted errors postintervention. An antiretroviral stewardship strategy for hospitalized HIV patients including prospective audit by staff pharmacists through use of an antiretroviral medication therapy checklist at the time of order verification decreased error rates. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
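
    For illustration, a hedged sketch of the pre/post rate comparison reported above: the error counts are taken from the abstract, but the admission denominators are hypothetical since the abstract does not report them, and the Wald test for two Poisson rates is a generic choice, not necessarily the authors' analysis.

    # Compare pre/post error rates per admission (hypothetical denominators).
    import math
    from scipy.stats import norm

    errors_pre, errors_post = 208, 24        # counts from the abstract
    n_pre, n_post = 300, 300                 # assumed admission totals

    rate_pre, rate_post = errors_pre / n_pre, errors_post / n_post
    # Wald test for the difference of two Poisson rates (normal approximation)
    se = math.sqrt(errors_pre / n_pre**2 + errors_post / n_post**2)
    z = (rate_pre - rate_post) / se
    p = 2 * norm.sf(abs(z))
    print(f"pre={rate_pre:.2f}, post={rate_post:.2f} errors/admission, p={p:.2e}")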

  15. Feasibility study on dosimetry verification of volumetric-modulated arc therapy-based total marrow irradiation.

    PubMed

    Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K

    2013-03-04

    The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A plan strategy similar to published studies was adopted. The PTV was divided into head and neck, chest, and pelvic regions, with separate plans, each composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study was to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verifications, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and the absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junctions between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D was developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.
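
    A compact sketch of the gamma evaluation criterion quoted above (5% dose difference, 3 mm distance to agreement, gamma <= 1 counting as a pass), reduced to one dimension for clarity; the dose profiles are synthetic.

    # 1D gamma index: minimum over evaluated points of the combined
    # dose-difference / distance-to-agreement metric.
    import numpy as np

    def gamma_1d(dose_ref, dose_eval, x, dose_tol=0.05, dta_mm=3.0):
        d_norm = dose_tol * dose_ref.max()
        gammas = np.empty_like(dose_ref)
        for i, (xi, di) in enumerate(zip(x, dose_ref)):
            dd = (dose_eval - di) / d_norm            # dose difference term
            dx = (x - xi) / dta_mm                    # distance term
            gammas[i] = np.sqrt(dd**2 + dx**2).min()  # best match over eval points
        return gammas

    x = np.linspace(0, 100, 201)                      # positions in mm
    ref = np.exp(-((x - 50) / 20) ** 2)               # toy reference profile
    ev = 1.02 * np.exp(-((x - 51) / 20) ** 2)         # shifted, rescaled profile
    g = gamma_1d(ref, ev, x)
    print(f"gamma passing rate: {100 * (g <= 1).mean():.1f}%")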

  16. Detection and characterization of foodborne pathogenic bacteria with hyperspectral microscope imaging

    USDA-ARS?s Scientific Manuscript database

    Rapid detection and identification of pathogenic microorganisms naturally occurring during food processing are important in developing intervention and verification strategies. In the poultry industry, contamination of poultry meat with foodborne pathogens (especially, Salmonella and Campylobacter) ...

  17. Monitoring tobacco brand websites to understand marketing strategies aimed at tobacco product users and potential users.

    PubMed

    Escobedo, Patricia; Cruz, Tess Boley; Tsai, Kai-Ya; Allem, Jon-Patrick; Soto, Daniel W; Kirkpatrick, Matthew G; Pattarroyo, Monica; Unger, Jennifer B

    2017-09-11

    Limited information exists about strategies and methods used on brand marketing websites to transmit pro-tobacco messages to tobacco users and potential users. This study compared age verification methods, themes, interactive activities and links to social media across tobacco brand websites. This study examined 12 tobacco brand websites representing four tobacco product categories: cigarettes, cigar/cigarillos, smokeless tobacco, and e-cigarettes. Website content was analyzed by tobacco product category and data from all website visits (n = 699) were analyzed. Adult smokers (n = 32) coded websites during a one-year period, indicating whether or not they observed any of 53 marketing themes, seven interactive activities, or five external links to social media sites. Most (58%) websites required online registration before entering; however, e-cigarette websites used click-through age verification. Compared to cigarette sites, cigar/cigarillo sites were more likely to feature themes related to "party" lifestyle, and e-cigarette websites were much more likely to feature themes related to harm reduction. Cigarette sites featured greater levels of interactive content compared to other tobacco products. Compared to cigarette sites, cigar/cigarillo sites were more likely to feature activities related to events and music. Compared to cigarette sites, both cigar and e-cigarette sites were more likely to direct visitors to external social media sites. Marketing methods and strategies normalize tobacco use by providing website visitors with positive themes combined with interactive content; this is an area for future research. Moreover, all tobacco products under federal regulatory authority should be required to use more stringent age verification gates. Findings indicate the Food and Drug Administration (FDA) should require that brand websites of all tobacco products under its regulatory authority use more stringent age verification gates, requiring all visitors to be at least 18 years of age and to register online prior to entry. This is important given that marketing strategies may encourage experimentation with tobacco or deter quit attempts among website visitors. Future research should examine the use of interactive activities and social media on a wide variety of tobacco brand websites, as interactive content is associated with more active information processing. © The Author 2017. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. Code Verification of the HIGRAD Computational Fluid Dynamics Solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Buren, Kendra L.; Canfield, Jesse M.; Hemez, Francois M.

    2012-05-04

    The purpose of this report is to outline code and solution verification activities applied to HIGRAD, a Computational Fluid Dynamics (CFD) solver of the compressible Navier-Stokes equations developed at the Los Alamos National Laboratory, and used to simulate various phenomena such as the propagation of wildfires and atmospheric hydrodynamics. Code verification efforts, as described in this report, are an important first step to establish the credibility of numerical simulations. They provide evidence that the mathematical formulation is properly implemented without significant mistakes that would adversely impact the application of interest. Highly accurate analytical solutions are derived for four code verification test problems that exercise different aspects of the code. These test problems are referred to as: (i) the quiet start, (ii) the passive advection, (iii) the passive diffusion, and (iv) the piston-like problem. These problems are simulated using HIGRAD with different levels of mesh discretization and the numerical solutions are compared to their analytical counterparts. In addition, the rates of convergence are estimated to verify the numerical performance of the solver. The first three test problems produce numerical approximations as expected. The fourth test problem (piston-like) indicates the extent to which the code is able to simulate a 'mild' discontinuity, which is a condition that would typically be better handled by a Lagrangian formulation. The current investigation concludes that the numerical implementation of the solver performs as expected. The quality of solutions is sufficient to provide credible simulations of fluid flows around wind turbines. The main caveat associated with these findings is the low coverage provided by these four problems and the somewhat limited verification activities. A more comprehensive evaluation of HIGRAD may be beneficial for future studies.
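
    The rate-of-convergence estimate mentioned above reduces to a one-line formula: if the discretization error behaves as e ~ C*h^p, errors from two mesh levels give the observed order p. A minimal sketch with hypothetical error values:

    # Observed order of accuracy from successively refined meshes.
    import math

    def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
        """p such that e ~ C*h^p, estimated from two mesh levels."""
        return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

    # Hypothetical discretization errors on meshes h, h/2, h/4
    errors = [4.0e-3, 1.1e-3, 2.9e-4]
    for ec, ef in zip(errors, errors[1:]):
        print(f"observed order ~ {observed_order(ec, ef):.2f}")  # close to 2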

  19. Verification and implementation of set-up empirical models in pile design : research project capsule.

    DOT National Transportation Integrated Search

    2016-08-01

    The primary objectives of this research include: performing static and dynamic load tests on : newly instrumented test piles to better understand the set-up mechanism for individual soil : layers, verifying or recalibrating previously developed empir...

  20. An evaluation of reading comprehension of expository text in adults with traumatic brain injury.

    PubMed

    Sohlberg, McKay Moore; Griffiths, Gina G; Fickas, Stephen

    2014-05-01

    This project was conducted to obtain information about reading problems of adults with traumatic brain injury (TBI) with mild-to-moderate cognitive impairments and to investigate how these readers respond to reading comprehension strategy prompts integrated into digital versions of text. Participants from 2 groups, adults with TBI (n = 15) and matched controls (n = 15), read 4 different 500-word expository science passages linked to either a strategy prompt condition or a no-strategy prompt condition. The participants' reading comprehension was evaluated using sentence verification and free recall tasks. The TBI and control groups exhibited significant differences on 2 of the 5 reading comprehension measures: paraphrase statements on a sentence verification task and communication units on a free recall task. Unexpected group differences were noted on the participants' prerequisite reading skills. For the within-group comparison, participants showed significantly higher reading comprehension scores on 2 free recall measures: words per communication unit and type-token ratio. There were no significant interactions. The results help to elucidate the nature of reading comprehension in adults with TBI with mild-to-moderate cognitive impairments and endorse further evaluation of reading comprehension strategies as a potential intervention option for these individuals. Future research is needed to better understand how individual differences influence a person's reading and response to intervention.

  1. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
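
    A minimal sketch, not LIVVkit's actual API, of the core of such a regression comparison: a new run is checked against reference output field by field, reporting bit-for-bit agreement or, failing that, the largest deviation. Field names and data are invented, and same-shaped arrays are assumed.

    # Field-by-field comparison of a test run against reference output.
    import numpy as np

    def compare_fields(test, ref, rtol=1e-9):
        report = {}
        for name in ref:
            t, r = np.asarray(test[name]), np.asarray(ref[name])
            if np.array_equal(t, r):
                report[name] = "bit-for-bit"
            else:
                status = "within tolerance" if np.allclose(t, r, rtol=rtol) else "FAIL"
                report[name] = f"{status} (max abs diff {np.abs(t - r).max():.3e})"
        return report

    ref = {"thickness": np.ones((4, 4)), "velocity": np.zeros((4, 4))}
    test = {"thickness": np.ones((4, 4)), "velocity": np.zeros((4, 4)) + 1e-12}
    for field, verdict in compare_fields(test, ref).items():
        print(field, "->", verdict)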

  2. Experimental device for measuring the dynamic properties of diaphragm motors

    NASA Astrophysics Data System (ADS)

    Fojtášek, Kamil; Dvořák, Lukáš; Mejzlík, Jan

    The subject of this paper is the design and description of an experimental device for determining the dynamic properties of diaphragm pneumatic motors. These motors are structurally quite different from conventional pneumatic linear cylinders. The working fluid is typically compressed air, the piston of the motor is replaced by an elastic part, and during the working cycle there is contact between two elastic environments. The manufacturers' catalogs of these motors do not give any working characteristics. The description of the dynamic behavior of the diaphragm motor will be used for verification of mathematical models.

  3. Current Results and Proposed Activities in Microgravity Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Polezhaev, V. I.

    1996-01-01

    The Institute for Problems in Mechanics' Laboratory work in mathematical and physical modelling of fluid mechanics develops models, methods, and software for analysis of fluid flow, instability analysis, direct numerical modelling and semi-empirical models of turbulence, as well as experimental research and verification of these models and their applications in technological fluid dynamics, microgravity fluid mechanics, geophysics, and a number of engineering problems. This paper presents an overview of the results in microgravity fluid dynamics research during the last two years. Nonlinear problems of weakly compressible and compressible fluid flows are discussed.

  4. The role of the real-time simulation facility, SIMFAC, in the design, development and performance verification of the Shuttle Remote Manipulator System (SRMS) with man-in-the-loop

    NASA Technical Reports Server (NTRS)

    Mccllough, J. R.; Sharpe, A.; Doetsch, K. H.

    1980-01-01

    The SIMFAC has played a vital role in the design, development, and performance verification of the shuttle remote manipulator system (SRMS) to be installed in the space shuttle orbiter. The facility provides for realistic man-in-the-loop operation of the SRMS by an operator in the operator complex, a flightlike crew station patterned after the orbiter aft flight deck with all necessary man machine interface elements, including SRMS displays and controls and simulated out-of-the-window and CCTV scenes. The characteristics of the manipulator system, including arm and joint servo dynamics and control algorithms, are simulated by a comprehensive mathematical model within the simulation subsystem of the facility. Major studies carried out using SIMFAC include: SRMS parameter sensitivity evaluations; the development, evaluation, and verification of operating procedures; and malfunction simulation and analysis of malfunction performance. Among the most important and comprehensive man-in-the-loop simulations carried out to date on SIMFAC are those which support SRMS performance verification and certification when the SRMS is part of the integrated orbiter-manipulator system.

  5. Assessment of Galileo modal test results for mathematical model verification

    NASA Technical Reports Server (NTRS)

    Trubert, M.

    1984-01-01

    The modal test program for the Galileo Spacecraft was completed at the Jet Propulsion Laboratory in the summer of 1983. The multiple sine dwell method was used for the baseline test. The Galileo Spacecraft is a rather complex 2433 kg structure made of a central core on which seven major appendages representing 30 percent of the total mass are attached, resulting in a high modal density structure. The test revealed a strong nonlinearity in several major modes. This nonlinearity, discovered in the course of the test, necessitated running additional tests at the unusually high response levels of up to about 21 g. The high levels of response were required to obtain a model verification valid at the level of loads for which the spacecraft was designed. Because of the high modal density and the nonlinearity, correlation between the dynamic mathematical model and the test results becomes a difficult task. Significant changes in the pre-test analytical model are necessary to establish confidence in the upgraded analytical model used for the final load verification. This verification, using a test-verified model, is required by NASA to fly the Galileo Spacecraft on the Shuttle/Centaur launch vehicle in 1986.

  6. Comparison of OpenFOAM and EllipSys3D actuator line methods with (NEW) MEXICO results

    NASA Astrophysics Data System (ADS)

    Nathan, J.; Meyer Forsting, A. R.; Troldborg, N.; Masson, C.

    2017-05-01

    The Actuator Line Method has existed for more than a decade and has become a well-established choice for simulating wind rotors in computational fluid dynamics. Numerous implementations exist and are used in the wind energy research community. These codes were validated against experimental data such as the MEXICO experiment, while verification against other codes was often made only on a very broad scale. Therefore this study attempts first a code-to-code verification by comparing two different implementations, namely an adapted version of SOWFA/OpenFOAM and EllipSys3D, and also a validation by comparing against experimental results from the MEXICO and NEW MEXICO experiments.

  7. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    NASA Technical Reports Server (NTRS)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics model of the ARED and associated biomechanics models for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV and C) assessment of the model and its analyses in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  8. Enhanced dynamic wedge and independent monitor unit verification.

    PubMed

    Howlett, S J

    2005-03-01

    Some serious radiation accidents have occurred around the world during the delivery of radiotherapy treatment. The regrettable incident in Panama clearly indicated the need for independent monitor unit (MU) verification. Indeed, the International Atomic Energy Agency (IAEA), after investigating the incident, made specific recommendations for radiotherapy centres which included an independent monitor unit check for all treatments. Independent monitor unit verification is practiced in many radiotherapy centres in developed countries around the world. It is mandatory in the USA but not yet in Australia. This paper describes the development of an independent MU program, concentrating on the implementation of the Enhanced Dynamic Wedge (EDW) component. The difficult case of non-centre-of-field (COF) calculation points under the EDW was studied in some detail. Results of a survey of Australasian centres regarding the use of independent MU check systems are also presented. The system was developed with reference to MU calculations made by the Pinnacle 3D Radiotherapy Treatment Planning (RTP) system (ADAC - Philips) for 4 MV, 6 MV and 18 MV X-ray beams used at the Newcastle Mater Misericordiae Hospital (NMMH) in the clinical environment. A small systematic error was detected in the equation used for the EDW calculations. Results indicate that COF equations may be used in the non-COF situation with similar accuracy to that achieved with profile-corrected methods. Further collaborative work with other centres is planned to extend these findings.
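
    For illustration, a minimal sketch of an independent MU check of the kind described: the monitor units are recomputed at a point from beam output and tabulated dosimetric factors, then compared with the planning system's value. The factor names follow common point-dose practice, and every number below is invented, not clinical data.

    # Independent point-dose MU check with generic dosimetric factors.
    def independent_mu(dose_cgy, output_cgy_per_mu, tpr, wedge_factor,
                       field_scatter, off_axis_ratio=1.0):
        """MU = D / (output * TPR * WF * Scp * OAR)."""
        return dose_cgy / (output_cgy_per_mu * tpr * wedge_factor
                           * field_scatter * off_axis_ratio)

    mu_tps = 340.0                                  # hypothetical TPS value
    mu_check = independent_mu(dose_cgy=200.0, output_cgy_per_mu=1.0,
                              tpr=0.85, wedge_factor=0.72,
                              field_scatter=0.98, off_axis_ratio=0.97)
    deviation = 100 * (mu_check - mu_tps) / mu_tps
    print(f"independent MU {mu_check:.1f}, deviation {deviation:+.1f}%")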

  9. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    NASA Astrophysics Data System (ADS)

    Nomaguch, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, which records a design snapshot over design tools. The argumentation level captures the process of setting problems and alternatives. The linkage of the three levels enables iterative hypothesis and verification processes to be captured and managed automatically and efficiently through design operations over design tools. In DRIFT, such a linkage is extracted through templates of design operations, which are extracted from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. It concludes with a discussion of some future issues.

  10. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
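
    A short numerical sketch of Classical Guyan Reduction, the baseline that MGR and HR improve upon: stiffness and mass are condensed to a retained ("analysis") DOF set through the static transformation, shown here on a toy 3-DOF spring-mass chain.

    # Classical Guyan Reduction: condense K and M to retained DOFs a,
    # eliminating omitted DOFs o via static condensation.
    import numpy as np

    def guyan_reduce(K, M, a, o):
        """Return reduced (K_r, M_r) on DOF set a."""
        Kaa, Kao = K[np.ix_(a, a)], K[np.ix_(a, o)]
        Koa, Koo = K[np.ix_(o, a)], K[np.ix_(o, o)]
        Goa = -np.linalg.solve(Koo, Koa)          # static condensation
        T = np.vstack([np.eye(len(a)), Goa])      # [I; G] in (a, o) ordering
        Kr = Kaa + Kao @ Goa
        order = a + o                             # consistent mass: T^T M T
        Mr = T.T @ M[np.ix_(order, order)] @ T
        return Kr, Mr

    # 3-DOF spring-mass chain; retain DOFs 0 and 2, omit DOF 1
    K = np.array([[2., -1., 0.], [-1., 2., -1.], [0., -1., 2.]])
    M = np.eye(3)
    Kr, Mr = guyan_reduce(K, M, a=[0, 2], o=[1])
    print(Kr, Mr, sep="\n")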

  11. Development and Use of Engineering Standards for Computational Fluid Dynamics for Complex Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Lee, Hyung B.; Ghia, Urmila; Bayyuk, Sami; Oberkampf, William L.; Roy, Christopher J.; Benek, John A.; Rumsey, Christopher L.; Powers, Joseph M.; Bush, Robert H.; Mani, Mortaza

    2016-01-01

    Computational fluid dynamics (CFD) and other advanced modeling and simulation (M&S) methods are increasingly relied on for predictive performance, reliability and safety of engineering systems. Analysts, designers, decision makers, and project managers, who must depend on simulation, need practical techniques and methods for assessing simulation credibility. The AIAA Guide for Verification and Validation of Computational Fluid Dynamics Simulations (AIAA G-077-1998 (2002)), originally published in 1998, was the first engineering standards document available to the engineering community for verification and validation (V&V) of simulations. Much progress has been made in these areas since 1998. The AIAA Committee on Standards for CFD is currently updating this Guide to incorporate in it the important developments that have taken place in V&V concepts, methods, and practices, particularly with regard to the broader context of predictive capability and uncertainty quantification (UQ) methods and approaches. This paper will provide an overview of the changes and extensions currently underway to update the AIAA Guide. Specifically, a framework for predictive capability will be described for incorporating a wide range of error and uncertainty sources identified during the modeling, verification, and validation processes, with the goal of estimating the total prediction uncertainty of the simulation. The Guide's goal is to provide a foundation for understanding and addressing major issues and concepts in predictive CFD. However, this Guide will not recommend specific approaches in these areas as the field is rapidly evolving. It is hoped that the guidelines provided in this paper, and explained in more detail in the Guide, will aid in the research, development, and use of CFD in engineering decision-making.

  12. Expert system verification and validation study. Phase 2: Requirements identification. Delivery 1: Updated survey report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to report the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of Expert Systems.

  13. Homolytic Cleavage of a B-B Bond by the Cooperative Catalysis of Two Lewis Bases: Computational Design and Experimental Verification.

    PubMed

    Wang, Guoqiang; Zhang, Honglin; Zhao, Jiyang; Li, Wei; Cao, Jia; Zhu, Chengjian; Li, Shuhua

    2016-05-10

    Density functional theory (DFT) investigations revealed that 4-cyanopyridine was capable of homolytically cleaving the B-B σ bond of diborane via cooperative coordination to the two boron atoms of the diborane to generate pyridine boryl radicals. Our experimental verification provides supportive evidence for this new B-B activation mode. With this novel activation strategy, we have experimentally realized the catalytic reduction of azo-compounds to hydrazine derivatives, deoxygenation of sulfoxides to sulfides, and reduction of quinones with B2(pin)2 under mild conditions. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Modelling and analysis of the sugar cataract development process using stochastic hybrid systems.

    PubMed

    Riley, D; Koutsoukos, X; Riley, K

    2009-05-01

    Modelling and analysis of biochemical systems such as sugar cataract development (SCD) are critical because they can provide new insights into systems, which cannot be easily tested with experiments; however, they are challenging problems due to the highly coupled chemical reactions that are involved. The authors present a stochastic hybrid system (SHS) framework for modelling biochemical systems and demonstrate the approach for the SCD process. A novel feature of the framework is that it allows modelling the effect of drug treatment on the system dynamics. The authors validate the three sugar cataract models by comparing trajectories computed by two simulation algorithms. Further, the authors present a probabilistic verification method for computing the probability of sugar cataract formation for different chemical concentrations using safety and reachability analysis methods for SHSs. The verification method employs dynamic programming based on a discretisation of the state space and therefore suffers from the curse of dimensionality. To analyse the SCD process, a parallel dynamic programming implementation that can handle large, realistic systems was developed. Although scalability is a limiting factor, this work demonstrates that the proposed method is feasible for realistic biochemical systems.
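
    A generic sketch of the dynamic-programming step described: value iteration over a discretized state space computes the probability of reaching an unsafe region. The birth-death transition structure below is a toy stand-in, not the SCD chemistry.

    # Value iteration for reachability probability on a discretized chain.
    import numpy as np

    n = 50                       # discretized concentration levels
    p_up, p_down = 0.45, 0.55    # hypothetical transition probabilities
    unsafe = n - 1               # absorbing "cataract-forming" state

    V = np.zeros(n)              # V[s] = P(reach unsafe from s); V[0] stays 0
    V[unsafe] = 1.0
    for _ in range(10000):
        V_new = V.copy()
        for s in range(1, n - 1):
            V_new[s] = p_up * V[s + 1] + p_down * V[s - 1]
        if np.max(np.abs(V_new - V)) < 1e-12:
            break
        V = V_new
    print(f"P(reach unsafe | start mid-range) = {V[n // 2]:.4f}")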

  15. Exploring the e-cigarette e-commerce marketplace: Identifying Internet e-cigarette marketing characteristics and regulatory gaps.

    PubMed

    Mackey, Tim K; Miner, Angela; Cuomo, Raphael E

    2015-11-01

    The electronic cigarette (e-cigarette) market is maturing into a billion-dollar industry. Expansion includes new channels of access not sufficiently assessed, including Internet sales of e-cigarettes. This study identifies unique e-cigarette Internet vendor characteristics, including geographic location, promotional strategies, use of social networking, presence/absence of age verification, and consumer warning representation. We performed structured Internet search engine queries and used inclusion/exclusion criteria to identify e-cigarette vendors. We then conducted content analysis of characteristics of interest. Our examination yielded 57 e-cigarette Internet vendors including 54.4% (n=31) that sold exclusively online. The vast majority of websites (96.5%, n=55) were located in the U.S. Vendors used a variety of sales promotion strategies to market e-cigarettes including 70.2% (n=40) that used more than one social network service (SNS) and 42.1% (n=24) that used more than one promotional sales strategies. Most vendors (68.4%, n=39) displayed one or more health warnings on their website, but often displayed them in smaller font or in their terms and conditions. Additionally, 35.1% (n=20) of vendors did not have any detectable age verification process. E-cigarette Internet vendors are actively engaged in various promotional activities to increase the appeal and presence of their products online. In the absence of FDA regulations specific to the Internet, the e-cigarette e-commerce marketplace is likely to grow. This digital environment poses unique challenges requiring targeted policy-making including robust online age verification, monitoring of SNS marketing, and greater scrutiny of certain forms of marketing promotional practices. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  16. Robust and Accurate Discrimination of Self/Non-Self Antigen Presentations by Regulatory T Cell Suppression.

    PubMed

    Furusawa, Chikara; Yamaguchi, Tomoyuki

    The immune response by T cells usually discriminates self and non-self antigens, even though the negative selection of self-reactive T cells is imperfect and a certain fraction of T cells can respond to self-antigens. In this study, we construct a simple mathematical model of T cell populations to analyze how such self/non-self discrimination is possible. The results demonstrate that the control of the immune response by regulatory T cells enables a robust and accurate discrimination of self and non-self antigens, even when there is a significant overlap between the affinity distribution of T cells to self and non-self antigens. Here, the number of regulatory T cells in the system acts as a global variable controlling the T cell population dynamics. The present study provides a basis for the development of a quantitative theory for self and non-self discrimination in the immune system and a possible strategy for its experimental verification.

  17. Robust and Accurate Discrimination of Self/Non-Self Antigen Presentations by Regulatory T Cell Suppression

    PubMed Central

    Furusawa, Chikara; Yamaguchi, Tomoyuki

    2016-01-01

    The immune response by T cells usually discriminates self and non-self antigens, even though the negative selection of self-reactive T cells is imperfect and a certain fraction of T cells can respond to self-antigens. In this study, we construct a simple mathematical model of T cell populations to analyze how such self/non-self discrimination is possible. The results demonstrate that the control of the immune response by regulatory T cells enables a robust and accurate discrimination of self and non-self antigens, even when there is a significant overlap between the affinity distribution of T cells to self and non-self antigens. Here, the number of regulatory T cells in the system acts as a global variable controlling the T cell population dynamics. The present study provides a basis for the development of a quantitative theory for self and non-self discrimination in the immune system and a possible strategy for its experimental verification. PMID:27668873
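
    Purely as a hypothetical illustration of the mechanism described (the regulatory T cell count acting as a global suppression variable), a toy ODE with two effector clones whose growth scales with antigen affinity and is suppressed in proportion to the Treg population; the equations and parameters are invented, not the paper's model.

    # Toy Treg-suppression model: the low-affinity (self-reactive) clone is
    # suppressed while the high-affinity (foreign-reactive) clone expands.
    from scipy.integrate import solve_ivp

    def rhs(t, y, aff_self=0.3, aff_foreign=0.9, k_sup=1.0, d=0.1):
        e_self, e_foreign, treg = y
        de_self = (aff_self - k_sup * treg - d) * e_self
        de_foreign = (aff_foreign - k_sup * treg - d) * e_foreign
        dtreg = 0.05 * (e_self + e_foreign) - d * treg  # Tregs track total response
        return [de_self, de_foreign, dtreg]

    sol = solve_ivp(rhs, (0, 100), [0.1, 0.1, 0.5])
    e_self, e_foreign, treg = sol.y[:, -1]
    print(f"self-reactive: {e_self:.3g}, foreign-reactive: {e_foreign:.3g}")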

  18. Manual and automation testing and verification of TEQ [ECI PROPRIETARY]

    NASA Astrophysics Data System (ADS)

    Abhichandra, Ravi; Jasmine Pemeena Priyadarsini, M.

    2017-11-01

    The telecommunication industry has progressed from 1G to 4G, and now 5G is gaining prominence. Given the pace of this abrupt transformation, technological obsolescence is becoming a serious issue to deal with. Adding to this is the fact that the execution of each technology requires ample investment in networks, infrastructure, development, etc. As a result, the industry is becoming more dynamic and strategy oriented. It requires professionals who not only can understand technology but also can evaluate it from a business perspective. The “Information Revolution” and the dramatic advances in telecommunications technology that have made it possible currently drive the global economy in large part. As wireless networks become more advanced and far-reaching, we are redefining the notion of connectivity and the possibilities of communications technology. In this paper we test and verify the optical cards and automate the test procedure using a new in-house technology, “TEQ”, developed by ECI TELECOM, which uses one of the optical cards itself to pump traffic at 100 Gbps.

  19. Specific 13C labeling of leucine, valine and isoleucine methyl groups for unambiguous detection of long-range restraints in protein solid-state NMR studies.

    PubMed

    Fasshuber, Hannes Klaus; Demers, Jean-Philippe; Chevelkov, Veniamin; Giller, Karin; Becker, Stefan; Lange, Adam

    2015-03-01

    Here we present an isotopic labeling strategy to easily obtain unambiguous long-range distance restraints in protein solid-state NMR studies. The method is based on the inclusion of two biosynthetic precursors in the bacterial growth medium, α-ketoisovalerate and α-ketobutyrate, leading to the production of leucine, valine and isoleucine residues that are exclusively (13)C labeled on methyl groups. The resulting spectral simplification facilitates the collection of distance restraints, the verification of carbon chemical shift assignments and the measurement of methyl group dynamics. This approach is demonstrated on the type-three secretion system needle of Shigella flexneri, where 49 methyl-methyl and methyl-nitrogen distance restraints including 10 unambiguous long-range distance restraints could be collected. By combining this labeling scheme with ultra-fast MAS and proton detection, the assignment of methyl proton chemical shifts was achieved. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Software engineering and automatic continuous verification of scientific software

    NASA Astrophysics Data System (ADS)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical solutions via the method of manufactured solutions. By developing and verifying code in tandem we avoid a number of pitfalls in scientific software development and advocate similar procedures for other scientific code applications.
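
    A condensed sketch of the "method of manufactured solutions" workflow mentioned above: pick a solution, derive the matching source term symbolically, and confirm that a discrete operator reproduces it at the expected order of accuracy. The 1D heat-equation setup is illustrative, not Fluidity's test suite.

    # MMS for the 1D heat equation u_t = kappa*u_xx + f: manufacture u,
    # derive f symbolically, then measure the truncation error order of a
    # second-order finite-difference Laplacian against the exact solution.
    import sympy as sp
    import numpy as np

    x, t = sp.symbols("x t")
    u_m = sp.sin(sp.pi * x) * sp.exp(-t)                   # manufactured solution
    kappa = 0.7
    source = sp.diff(u_m, t) - kappa * sp.diff(u_m, x, 2)  # f = u_t - kappa*u_xx

    f = sp.lambdify((x, t), source, "numpy")
    u = sp.lambdify((x, t), u_m, "numpy")

    def residual_norm(n):
        """Discrete residual of the exact solution at t=0 (truncation error)."""
        xs = np.linspace(0, 1, n)
        h = xs[1] - xs[0]
        ui = u(xs, 0.0)
        lap = (ui[:-2] - 2 * ui[1:-1] + ui[2:]) / h**2
        # u_t at t=0 equals -u_m for this choice of manufactured solution
        res = -ui[1:-1] - kappa * lap - f(xs[1:-1], 0.0)
        return np.max(np.abs(res))

    e1, e2 = residual_norm(65), residual_norm(129)
    print(f"observed order ~ {np.log(e1 / e2) / np.log(2):.2f}")  # close to 2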

  1. METHANOGENESIS AND SULFATE REDUCTION IN CHEMOSTATS: II. MODEL DEVELOPMENT AND VERIFICATION

    EPA Science Inventory

    A comprehensive dynamic model is presented that simulates methanogenesis and sulfate reduction in a continuously stirred tank reactor (CSTR). This model incorporates the complex chemistry of anaerobic systems. A salient feature of the model is its ability to predict the effluent ...

  2. Side impact test and analyses of a DOT-111 tank car : final report.

    DOT National Transportation Integrated Search

    2015-10-01

    Transportation Technology Center, Inc. conducted a side impact test on a DOT-111 tank car to evaluate the performance of the : tank car under dynamic impact conditions and to provide data for the verification and refinement of a computational model. ...

  3. Frame synchronization for the Galileo code

    NASA Technical Reports Server (NTRS)

    Arnold, S.; Swanson, L.

    1991-01-01

    Results are reported on the performance of the Deep Space Network's frame synchronizer for the (15,1/4) convolutional code after Viterbi decoding. The threshold is found that optimizes the probability of acquiring true sync within four frames using a strategy that requires next frame verification.
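
    A toy Monte Carlo sketch of the acquisition strategy described (sync declared only after the candidate marker position is confirmed in the next frame), estimating the probability of acquisition within four frames. The detection probability is illustrative and false alarms are ignored; this is not the code's actual post-Viterbi statistics.

    # Estimate P(marker detected in two consecutive frames within the window).
    import random

    def acquires_within(frames=4, p_detect=0.9, trials=100_000):
        wins = 0
        for _ in range(trials):
            prev = False
            for _ in range(frames):
                hit = random.random() < p_detect
                if prev and hit:       # candidate confirmed in the next frame
                    wins += 1
                    break
                prev = hit
            # falling through means sync was not acquired in the window
        return wins / trials

    print(f"P(acquire within 4 frames) ~ {acquires_within():.3f}")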

  4. Remarks to Eighth Annual State of Modeling and Simulation

    DTIC Science & Technology

    1999-06-04

    Excerpt (fragments from presentation slides): organization, training, as well as materiel; discovery vice verification; tolerance for surprise; free play; red team; iterative process; push to failure... Account for responsive and innovative future adversaries - free play, adaptive strategies and tactics by professional red teams; address C2 issues and human...

  5. 76 FR 40753 - NASA Advisory Council; Aeronautics Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-11

    ...strategy; Verification and Validation of Flight Critical Systems planning update; NASA Aeronautics systems analysis and strategic planning. It is imperative that this meeting be held on this date to accommodate the... aeronautics community and other persons, research and technical information relevant to program planning...

  6. Microgravity Investigation of Crew Reactions in 0-G (MICRO-G)

    NASA Technical Reports Server (NTRS)

    Newman, Dava; Coleman, Charles; Metaxas, Dimitri

    2004-01-01

    There is a need for a human factors, technology-based bioastronautics research effort to develop an integrated system that reduces risk and provides scientific knowledge of astronaut-induced loads and motions during long-duration missions on the International Space Station (ISS), which will lead to appropriate countermeasures. The primary objectives of the Microgravity Investigation of Crew Reactions in 0-G (MICRO-G) research effort are to quantify astronaut adaptation and movement as well as to model motor strategies for differing gravity environments. The overall goal of this research program is to improve astronaut performance and efficiency through the use of rigorous quantitative dynamic analysis, simulation and experimentation. The MICRO-G research effort provides a modular, kinetic and kinematic capability for the ISS. The collection and evaluation of kinematics (whole-body motion) and dynamics (reacting forces and torques) of astronauts within the ISS will allow for quantification of human motion and performance in weightlessness, gathering fundamental human factors information for design, scientific investigation in the field of dynamics and motor control, technological assessment of microgravity disturbances, and the design of miniaturized, real-time space systems. The proposed research effort builds on a strong foundation of successful microgravity experiments, namely the EDLS (Enhanced Dynamics Load Sensors) flown aboard the Russian Mir space station (1996-1998) and the DLS (Dynamic Load Sensors) flown on Space Shuttle Mission STS-62. In addition, previously funded NASA ground-based research into sensor technology development and the development of algorithms to produce three-dimensional (3-D) kinematics from video images has come to fruition, and these efforts culminate in the proposed collaborative MICRO-G flight experiment. The required technology and hardware capitalize on previous sensor design, fabrication, and testing and can be flight qualified for a fraction of the cost of an initial spaceflight experiment. Four dynamic load sensors/restraints are envisioned for measurement of astronaut forces and torques. Two standard ISS video cameras record typical astronaut operations and prescribed IVA motions for 3-D kinematics. Forces and kinematics are combined for dynamic analysis of astronaut motion, exploiting the results of the detailed dynamic modeling effort for the quantitative verification of astronaut IVA performance, induced loads, and adaptive control strategies for crewmember whole-body motion in microgravity. This comprehensive effort provides an enhanced human factors approach based on physics-based modeling to identify adaptive performance during long-duration spaceflight, which is critically important for astronaut training as well as providing a spaceflight database to drive countermeasure design.

  7. Spatio-temporal dynamic climate model for Neoleucinodes elegantalis using CLIMEX

    NASA Astrophysics Data System (ADS)

    da Silva, Ricardo Siqueira; Kumar, Lalit; Shabani, Farzin; da Silva, Ezio Marques; da Silva Galdino, Tarcisio Visintin; Picanço, Marcelo Coutinho

    2017-05-01

    Seasonal variations are important components in understanding the ecology of insect populations on crops. Ecological studies through modeling may be a useful tool for enhancing knowledge of seasonal patterns of insects on field crops as well as seasonal patterns of favorable climatic conditions for species. Recently CLIMEX, a semi-mechanistic niche model, was upgraded and enhanced to consider spatio-temporal dynamics of climate suitability through time. In this study, attempts were made to determine monthly variations of climate suitability for Neoleucinodes elegantalis (Guenée) (Lepidoptera: Crambidae) in five commercial tomato crop localities through the latest version of CLIMEX. We observed that N. elegantalis displays seasonality, with increased abundance in tomato crops during summer and autumn, corresponding to the first 6 months of the year in the areas monitored in this study. Our model demonstrated strong agreement between the CLIMEX weekly growth index (GIw) and the density of N. elegantalis for this period, indicating greater confidence in our model results. Our model shows the seasonal variability of climatic suitability for N. elegantalis and provides useful information for timely management, such as sampling strategies and control, during periods of high climatic suitability for N. elegantalis. We verified the validity of the simulation results using field data.

  8. A Novel Strategy for Selection and Validation of Reference Genes in Dynamic Multidimensional Experimental Design in Yeast

    PubMed Central

    Cankorur-Cetinkaya, Ayca; Dereli, Elif; Eraslan, Serpil; Karabekmez, Erkan; Dikicioglu, Duygu; Kirdar, Betul

    2012-01-01

    Background: Understanding the dynamic mechanism behind the transcriptional organization of genes in response to varying environmental conditions requires time-dependent data. The dynamic transcriptional response obtained by real-time RT-qPCR experiments could only be correctly interpreted if suitable reference genes are used in the analysis. The lack of available studies on the identification of candidate reference genes in dynamic gene expression studies necessitates the identification and the verification of a suitable gene set for the analysis of transient gene expression response. Principal Findings: In this study, a candidate reference gene set for RT-qPCR analysis of dynamic transcriptional changes in Saccharomyces cerevisiae was determined using 31 different publicly available time series transcriptome datasets. Ten of the twelve candidates (TPI1, FBA1, CCW12, CDC19, ADH1, PGK1, GCN4, PDC1, RPS26A and ARF1) we identified were not previously reported as potential reference genes. Our method also identified the commonly used reference genes ACT1 and TDH3. The most stable reference genes from this pool were determined as TPI1, FBA1, CDC19 and ACT1 in response to a perturbation in the amount of available glucose and as FBA1, TDH3, CCW12 and ACT1 in response to a perturbation in the amount of available ammonium. The use of these newly proposed gene sets outperformed the use of common reference genes in the determination of dynamic transcriptional response of the target genes, HAP4 and MEP2, in response to relaxation from glucose and ammonium limitations, respectively. Conclusions: A candidate reference gene set to be used in dynamic real-time RT-qPCR expression profiling in yeast was proposed for the first time in the present study. Suitable pools of stable reference genes to be used under different experimental conditions could be selected from this candidate set in order to successfully determine the expression profiles for the genes of interest. PMID:22675547
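
    As a hedged illustration of one simple stability screen for such candidate sets (not the paper's exact selection procedure): rank genes by the spread of their log2 expression across time points, lower spread meaning more stable. The profiles below are synthetic.

    # Rank candidate reference genes by sd of log2 expression over time.
    import numpy as np

    rng = np.random.default_rng(1)
    time_points = 12
    candidates = {                  # hypothetical log2 expression profiles
        "TPI1": 10 + rng.normal(0, 0.05, time_points),
        "ACT1": 11 + rng.normal(0, 0.08, time_points),
        "HAP4":  8 + np.linspace(0, 2, time_points),  # responsive, unstable
    }

    stability = {gene: np.std(profile) for gene, profile in candidates.items()}
    for gene, s in sorted(stability.items(), key=lambda kv: kv[1]):
        print(f"{gene}: sd(log2 expr) = {s:.3f}")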

  9. Dose calculation of dynamic trajectory radiotherapy using Monte Carlo.

    PubMed

    Manser, P; Frauchiger, D; Frei, D; Volken, W; Terribilini, D; Fix, M K

    2018-04-06

    Using the volumetric modulated arc therapy (VMAT) delivery technique, the gantry position, the multi-leaf collimator (MLC) and the dose rate change dynamically during the application. However, additional components can be dynamically altered throughout the dose delivery, such as the collimator or the couch. Thus, the degrees of freedom increase, allowing almost arbitrary dynamic trajectories for the beam. While the dose delivery of such dynamic trajectories on linear accelerators is technically possible, there is currently no dose calculation and validation tool available. Thus, the aim of this work is to develop a dose calculation and verification tool for dynamic trajectories using Monte Carlo (MC) methods. The dose calculation for dynamic trajectories is implemented in the previously developed Swiss Monte Carlo Plan (SMCP). SMCP interfaces the treatment planning system Eclipse with a MC dose calculation algorithm and is already able to handle dynamic MLC and gantry rotations. Hence, the additional dynamic components, namely the collimator and the couch, are described similarly to the dynamic MLC by defining data pairs of positions of the dynamic component and the corresponding MU-fractions. For validation purposes, measurements are performed with the Delta4 phantom and film measurements using the developer mode on a TrueBeam linear accelerator. These measured dose distributions are then compared with the corresponding calculations using SMCP. First, simple academic cases applying one-dimensional movements are investigated and second, more complex dynamic trajectories with several simultaneously moving components are compared, considering academic cases as well as a clinically motivated prostate case. The dose calculation for dynamic trajectories was successfully implemented into SMCP. The comparisons between the measured and calculated dose distributions for the simple as well as for the more complex situations show an agreement which is generally within 3% of the maximum dose or 3 mm. The required computation time for the dose calculation remains the same when the additional dynamic moving components are included. The results obtained for the dose comparisons for simple and complex situations suggest that the extended SMCP is an accurate dose calculation and efficient verification tool for dynamic trajectory radiotherapy. This work was supported by Varian Medical Systems. Copyright © 2018. Published by Elsevier GmbH.
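
    A small sketch of the "data pairs" representation described above: each dynamic axis is specified as (MU-fraction, position) control points, and the machine state at any cumulative MU follows by interpolation. Axis names and values are illustrative, not SMCP's format.

    # Interpolate per-axis (MU fraction, position) control points to get the
    # delivery state at any cumulative MU fraction.
    import numpy as np

    axes = {
        "gantry":     [(0.0, 180.0), (0.5, 0.0), (1.0, 180.0)],
        "collimator": [(0.0, 45.0),  (1.0, 90.0)],
        "couch":      [(0.0, 0.0),   (0.5, 0.0), (1.0, 20.0)],
    }

    def state_at(mu_fraction):
        out = {}
        for axis, pts in axes.items():
            f, p = zip(*pts)
            out[axis] = float(np.interp(mu_fraction, f, p))
        return out

    for mu in (0.0, 0.25, 0.75):
        print(mu, state_at(mu))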

  10. Wind Turbine Dynamics

    NASA Technical Reports Server (NTRS)

    Thresher, R. W. (Editor)

    1981-01-01

    Recent progress in the analysis and prediction of the dynamic behavior of wind turbine generators is discussed. The following areas were addressed: (1) the adequacy of state of the art analysis tools for designing the next generation of wind power systems; (2) the use of state of the art analysis tools by designers; and (3) verification of theory which might be lacking or inadequate. Summaries of these informative discussions as well as the questions and answers which followed each paper are documented in the proceedings.

  11. Lageos assembly operation plan

    NASA Technical Reports Server (NTRS)

    Brueger, J.

    1975-01-01

    Guidelines, constraints, and procedures for LAGEOS assembly, operation, and design performance are given. Special attention was given to thermal, optical, and dynamic analysis and testing. The operation procedures illustrate the interrelation and sequence of tasks in a flow diagram. The diagram also includes quality assurance functions for verification of operation tasks.

  12. A calibration method for patient specific IMRT QA using a single therapy verification film

    PubMed Central

    Shukla, Arvind Kumar; Oinam, Arun S.; Kumar, Sanjeev; Sandhu, I.S.; Sharma, S.C.

    2013-01-01

    Aim: The aim of the present study is to develop and verify the single-film calibration procedure used in intensity-modulated radiation therapy (IMRT) quality assurance. Background: Radiographic films have been regularly used in routine commissioning of treatment modalities and verification of treatment planning systems (TPS). Radiation dosimetry based on radiographic films has the ability to give absolute two-dimensional dose distributions and is preferred for IMRT quality assurance. A single therapy verification film gives a quick and reliable method for IMRT verification. Materials and methods: A single extended dose rate (EDR 2) film was used to generate the sensitometric curve of film optical density and radiation dose. The EDR 2 film was exposed with nine 6 cm × 6 cm fields of a 6 MV photon beam obtained from a medical linear accelerator at 5-cm depth in a solid water phantom. The nine regions of the single film were exposed with radiation doses ranging from 10 to 362 cGy. The actual dose measurements inside the field regions were performed using a 0.6 cm3 ionization chamber. The exposed film was processed after irradiation and scanned using a VIDAR film scanner, and the value of optical density was noted for each region. Ten IMRT plans of head and neck carcinoma were used for verification using a dynamic IMRT technique, and evaluated using the gamma index method against the TPS-calculated dose distribution. Results: A sensitometric curve was generated using a single film exposed at nine field regions to allow quantitative dose verification of IMRT treatments. The radiation scatter factor was observed to decrease exponentially with the increase in the distance from the centre of each field region. The IMRT plans based on the calibration curve were verified using the gamma index method and found to be within acceptance criteria. Conclusion: The single-film method proved to be superior to the traditional calibration method and produces fast daily film calibration for highly accurate IMRT verification. PMID:24416558
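
    For illustration, a minimal sketch of the calibration step described: fit a smooth sensitometric curve (optical density versus dose) to the nine film regions and invert it numerically to convert a measured OD to dose. The OD values below are synthetic stand-ins for measured data.

    # Fit and numerically invert a sensitometric (OD vs dose) curve.
    import numpy as np

    dose = np.array([10, 35, 70, 110, 150, 200, 250, 300, 362], float)  # cGy
    od = 3.2 * (1 - np.exp(-dose / 260)) + 0.05        # synthetic OD readings

    fit = np.poly1d(np.polyfit(dose, od, 3))           # low-order polynomial fit

    def od_to_dose(od_meas, lo=0.0, hi=400.0, steps=4001):
        """Invert the fitted curve numerically over the calibrated range."""
        d = np.linspace(lo, hi, steps)
        return d[np.argmin(np.abs(fit(d) - od_meas))]

    print(f"OD 1.50 -> {od_to_dose(1.50):.1f} cGy")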

  13. In vivo dose verification of IMRT treated head and neck cancer patients.

    PubMed

    Engström, Per E; Haraldsson, Pia; Landberg, Torsten; Sand Hansen, Hanne; Aage Engelholm, Svend; Nyström, Håkan

    2005-01-01

    An independent in vivo dose verification procedure for IMRT treatments of head and neck cancers was developed. Results of 177 intracavitary TLD measurements from 10 patients are presented. The study includes data from 10 patients with cancer of the rhinopharynx or the thyroid treated with dynamic IMRT. Dose verification was performed by insertion of a flexible naso-oesophageal tube containing TLD rods and markers for EPID and simulator image detection. Part of the study focussed on investigating the accuracy of the TPS calculations in the presence of inhomogeneities. Phantom measurements and Monte Carlo simulations were performed for a number of geometries involving lateral electronic disequilibrium and steep density shifts. The in vivo TLD measurements correlated well with the predictions of the treatment planning system with a measured/calculated dose ratio of 1.002+/-0.051 (1 SD, N=177). The measurements were easily performed and well tolerated by the patients. We conclude that in vivo intracavitary dosimetry with TLD is suitable and accurate for dose determination in intensity-modulated beams.

  14. JPL control/structure interaction test bed real-time control computer architecture

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1989-01-01

    The Control/Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development of new design concepts - such as active structure - and new tools - such as a combined structure and control optimization algorithm - and their verification in ground and possibly flight tests. A focus mission spacecraft was designed based upon a space interferometer and is the basis for the design of the ground test article. The ground test bed objectives include verification of the spacecraft design concepts, the active structure elements, and certain design tools such as the new combined structures and controls optimization tool. In anticipation of CSI technology flight experiments, the test bed control electronics must emulate the computation capacity and control architectures of space-qualifiable systems as well as the command and control networks that will be used to connect investigators with the flight experiment hardware. The Test Bed facility electronics were functionally partitioned into three units: a laboratory data acquisition system for structural parameter identification and performance verification; an experiment supervisory computer to oversee the experiment, monitor the environmental parameters and perform data logging; and a multilevel real-time control computing system. The design of the Test Bed electronics is presented along with hardware and software component descriptions. The system should break new ground in experimental control electronics and is of interest to anyone working in the verification of control concepts for large structures.

  15. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  16. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE PAGES

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...

    2017-03-23

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
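
    The following sketch illustrates the kind of bit-for-bit regression check that LIVVkit automates, under the assumption of simple array-valued model output; it is not LIVVkit's actual API.

        import numpy as np

        def bit_for_bit(test, reference):
            """Return (passed, message) for an exact comparison of two fields."""
            if test.shape != reference.shape:
                return False, "shape mismatch"
            if np.array_equal(test, reference):
                return True, "bit-for-bit"
            max_diff = np.max(np.abs(test - reference))
            return False, "max abs difference %.3e" % max_diff

        # Illustrative stand-ins for a model run under test and a trusted reference.
        reference = np.linspace(0.0, 1000.0, 101)   # e.g. an ice thickness field
        test = reference.copy()
        test[50] += 1e-07                           # a tiny non-reproducible change
        passed, message = bit_for_bit(test, reference)
        print("PASS" if passed else "FAIL", "-", message)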

  17. NATIONAL PREPAREDNESS: Integrating New and Existing Technology and Information Sharing into an Effective Homeland Security Strategy

    DTIC Science & Technology

    2002-06-07

    Continue to Develop and Refine Emerging Technology • Some of the emerging biometric devices, such as iris scans, facial recognition systems, and speaker verification systems. (976301)

  18. COST EVALUATION STRATEGIES FOR TECHNOLOGIES TESTED UNDER THE ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This document provides a general set of guidelines that may be consistently applied for collecting, evaluating, and reporting the costs of technologies tested under the ETV Program. Because of the diverse nature of the technologies and industries covered in this program, each ETV...

  19. A new approach to handling incoming verifications.

    PubMed

    Luizzo, Anthony; Roy, Bill; Luizzo, Philip

    2016-10-01

    Outside requests for data on current or former employees are handled in different ways by healthcare organizations and present considerable liability risks if a corporate policy for handling such risks is not in place. In this article, the authors present a strategy for responsible handling of sensitive information.

  20. Providing an empirical basis for optimizing the verification and testing phases of software development

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1992-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault density components so that the testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents an alternative approach for constructing such models that is intended to fulfill specific software engineering needs (i.e. dealing with partial/incomplete information and creating models that are easy to interpret). Our approach to classification is as follows: (1) to measure the software system to be considered; and (2) to build multivariate stochastic models for prediction. We present experimental results obtained by classifying FORTRAN components developed at the NASA/GSFC into two fault density classes: low and high. Also we evaluate the accuracy of the model and the insights it provides into the software process.
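
    A minimal sketch of the classification idea above: predict low/high fault density from per-component metrics so that verification effort can be concentrated. The metrics, data, and choice of logistic regression are illustrative assumptions, not the paper's actual multivariate stochastic models.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Per-component metrics: [lines of code, cyclomatic complexity, fan-out].
        X = np.array([[120,  5,  3], [850, 32, 14], [240, 11,  6],
                      [990, 41, 18], [310,  9,  4], [670, 25, 12]])
        y = np.array([0, 1, 0, 1, 0, 1])  # 0 = low, 1 = high fault density

        model = LogisticRegression().fit(X, y)

        # Rank a new component so testing effort goes where faults are likely.
        new_component = np.array([[540, 22, 9]])
        print("high-fault-density probability:", model.predict_proba(new_component)[0, 1])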

  1. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. It is worth considering at key points in the process.

  2. Coupled dynamics analysis of wind energy systems

    NASA Technical Reports Server (NTRS)

    Hoffman, J. A.

    1977-01-01

    A qualitative description of all key elements of a complete wind energy system computer analysis code is presented. The analysis system addresses the coupled dynamics characteristics of wind energy systems, including the interactions of the rotor, tower, nacelle, power train, control system, and electrical network. The coupled dynamics are analyzed in both the frequency and time domain to provide the basic motions and loads data required for design, performance verification and operations analysis activities. Elements of the coupled analysis code were used to design and analyze candidate rotor articulation concepts. Fundamental results and conclusions derived from these studies are presented.

  3. Model of ballistic targets' dynamics used for trajectory tracking algorithms

    NASA Astrophysics Data System (ADS)

    Okoń-Fąfara, Marta; Kawalec, Adam; Witczak, Andrzej

    2017-04-01

    Only a few ballistic object tracking algorithms are known. To develop such algorithms and to test them further, it is necessary to implement a reasonably simple and reliable model of the objects' dynamics. The article presents the dynamics model of a tactical ballistic missile (TBM), including the three stages of flight: the boost stage and two passive stages - the ascending one and the descending one. Additionally, the procedure of transformation from the local coordinate system to the polar radar-oriented and the global coordinate systems is presented. The prepared theoretical data may be used to determine the tracking algorithm parameters and for its further verification.

  4. Evaluation of the 29-km Eta Model. Part 1: Objective Verification at Three Selected Stations

    NASA Technical Reports Server (NTRS)

    Nutter, Paul A.; Manobianco, John; Merceret, Francis J. (Technical Monitor)

    1998-01-01

    This paper describes an objective verification of the National Centers for Environmental Prediction (NCEP) 29-km eta model from May 1996 through January 1998. The evaluation was designed to assess the model's surface and upper-air point forecast accuracy at three selected locations during separate warm (May - August) and cool (October - January) season periods. In order to enhance sample sizes available for statistical calculations, the objective verification includes two consecutive warm and cool season periods. Systematic model deficiencies comprise the larger portion of the total error in most of the surface forecast variables that were evaluated. The error characteristics for both surface and upper-air forecasts vary widely by parameter, season, and station location. At upper levels, a few characteristic biases are identified. Overall however, the upper-level errors are more nonsystematic in nature and could be explained partly by observational measurement uncertainty. With a few exceptions, the upper-air results also indicate that 24-h model error growth is not statistically significant. In February and August 1997, NCEP implemented upgrades to the eta model's physical parameterizations that were designed to change some of the model's error characteristics near the surface. The results shown in this paper indicate that these upgrades led to identifiable and statistically significant changes in forecast accuracy for selected surface parameters. While some of the changes were expected, others were not consistent with the intent of the model updates and further emphasize the need for ongoing sensitivity studies and localized statistical verification efforts. Objective verification of point forecasts is a stringent measure of model performance, but when used alone, is not enough to quantify the overall value that model guidance may add to the forecast process. Therefore, results from a subjective verification of the meso-eta model over the Florida peninsula are discussed in the companion paper by Manobianco and Nutter. Overall verification results presented here and in part two should establish a reasonable benchmark from which model users and developers may pursue the ongoing eta model verification strategies in the future.
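
    A minimal sketch of the point-verification statistics discussed above, separating the systematic (bias) and nonsystematic components of forecast error at a station; the numbers are invented for illustration.

        import numpy as np

        forecast = np.array([21.3, 19.8, 25.1, 23.0, 18.2])  # model 2-m temperature
        observed = np.array([20.1, 19.5, 23.8, 22.9, 17.0])  # station observations

        error = forecast - observed
        bias = error.mean()                             # systematic model deficiency
        rmse = np.sqrt((error ** 2).mean())             # total error
        nonsystematic = np.sqrt(rmse ** 2 - bias ** 2)  # random/residual part

        print("bias %+.2f  rmse %.2f  nonsystematic %.2f" % (bias, rmse, nonsystematic))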

  5. Research of improved banker algorithm

    NASA Astrophysics Data System (ADS)

    Yuan, Xingde; Xu, Hong; Qiao, Shijiao

    2013-03-01

    In a multi-process operating system, the resource management strategy is a critical global issue, especially when many processes compete for limited resources, since unreasonable scheduling can cause deadlock. The classical solution to the deadlock problem is the banker's algorithm; however, it has its own deficiencies and can only avoid deadlock to a certain extent. This article aims at reducing unnecessary safety checking and then uses a new allocation strategy to improve the banker's algorithm. Through full analysis and example verification of the new allocation strategy, the results show the improved banker's algorithm obtains a substantial increase in performance.
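
    For reference, a minimal sketch of the classical banker's algorithm safety check that the article sets out to improve: a state is safe if every process can, in some order, be granted its remaining claim and then release its resources. The allocation data are the standard textbook example, not the article's.

        import numpy as np

        available = np.array([3, 3, 2])
        allocation = np.array([[0, 1, 0], [2, 0, 0], [3, 0, 2], [2, 1, 1], [0, 0, 2]])
        maximum = np.array([[7, 5, 3], [3, 2, 2], [9, 0, 2], [2, 2, 2], [4, 3, 3]])
        need = maximum - allocation    # remaining claim of each process

        def is_safe(available, allocation, need):
            work = available.copy()
            finished = [False] * len(allocation)
            order = []
            while len(order) < len(allocation):
                progressed = False
                for i, done in enumerate(finished):
                    if not done and np.all(need[i] <= work):
                        work += allocation[i]   # process i runs, then releases
                        finished[i] = True
                        order.append(i)
                        progressed = True
                if not progressed:
                    return False, order         # no safe sequence: deadlock risk
            return True, order

        print(is_safe(available, allocation, need))  # (True, [1, 3, 4, 0, 2])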

  6. A Note on Verification of Computer Simulation Models

    ERIC Educational Resources Information Center

    Aigner, Dennis J.

    1972-01-01

    Establishes an argument that questions the validity of one "test" of goodness-of-fit (the extent to which a series of obtained measures agrees with a series of theoretical measures) for the simulated time path of a simple endogenous (internally developed) variable in a simultaneous, perhaps dynamic, econometric model. (Author)

  7. Field verification of KDOT's Superpave mixture properties to be used as inputs in the NCHRP mechanistic-empirical pavement design guide.

    DOT National Transportation Integrated Search

    2009-01-01

    In the Mechanistic-Empirical Pavement Design Guide (M-EPDG), prediction of flexible pavement response and performance needs an input of the dynamic modulus of hot-mix asphalt (HMA) at all three levels of hierarchical inputs. This study was intended to ...

  8. The verification of LANDSAT data in the geographical analysis of wetlands in west Tennessee

    NASA Technical Reports Server (NTRS)

    Rehder, J.; Quattrochi, D. A.

    1978-01-01

    The reliability of LANDSAT imagery as a medium for identifying, delimiting, monitoring, measuring, and mapping wetlands in west Tennessee was assessed to verify LANDSAT as an accurate, efficient cartographic tool that could be employed by a wide range of users to study wetland dynamics. The verification procedure was based on the visual interpretation and measurement of multispectral imagery. The accuracy testing procedure was predicated on surrogate ground truth data gleaned from medium altitude imagery of the wetlands. Fourteen sites or case study areas were selected from individual 9 x 9 inch photo frames on the aerial photography. These sites were then used as data control calibration parameters for assessing the cartographic accuracy of the LANDSAT imagery. An analysis of results obtained from the verification tests indicated that 1:250,000 scale LANDSAT data were the most reliable scale of imagery for visually mapping and measuring wetlands using the area grid technique. The mean areal percentage of accuracy was 93.54 percent (real) and 96.93 percent (absolute). As a test of accuracy, the LANDSAT 1:250,000 scale overall wetland measurements were compared with an area cell mensuration of the swamplands from 1:130,000 scale color infrared U-2 aircraft imagery. The comparative totals substantiated the results from the LANDSAT verification procedure.

  9. Testing a Model of Participant Retention in Longitudinal Substance Abuse Research

    ERIC Educational Resources Information Center

    Gilmore, Devin; Kuperminc, Gabriel P.

    2014-01-01

    Longitudinal substance abuse research has often been compromised by high rates of attrition, thought to be the result of the lifestyle that often accompanies addiction. Several studies have used strategies including collection of locator information at the baseline assessment, verification of the information, and interim contacts prior to…

  10. Earth Science Activities: A Guide to Effective Elementary School Science Teaching.

    ERIC Educational Resources Information Center

    Kanis, Ira B.; Yasso, Warren E.

    The primary emphasis of this book is on new or revised earth science activities that promote concept development rather than mere verification of concepts learned by passive means. Chapter 2 describes philosophies, strategies, methods, and techniques to guide preservice and inservice teachers, school building administrators, and curriculum…

  11. [A review of progress of real-time tumor tracking radiotherapy technology based on dynamic multi-leaf collimator].

    PubMed

    Liu, Fubo; Li, Guangjun; Shen, Jiuling; Li, Ligin; Bai, Sen

    2017-02-01

    During radiation treatment of patients with tumors in the thorax and abdomen, further improvement of radiation accuracy is restricted by intra-fractional tumor motion due to respiration. Real-time tumor tracking radiotherapy is an optimal solution to intra-fractional tumor motion. A review of the progress of real-time dynamic multi-leaf collimator (DMLC) tracking is provided here, including the DMLC tracking method, the time lag of the DMLC tracking system, and dosimetric verification.

  12. Volumetric Verification of Multiaxis Machine Tool Using Laser Tracker

    PubMed Central

    Aguilar, Juan José

    2014-01-01

    This paper aims to present a method of volumetric verification for machine tools with linear and rotary axes using a laser tracker. Beyond a method for a particular machine, it presents a methodology that can be used for any machine type. The paper presents the schema and kinematic model of a machine with three axes of movement (two linear and one rotary), including the measurement system and the nominal rotation matrix of the rotary axis. Using this, the machine tool volumetric error is obtained, and nonlinear optimization techniques are employed to improve the accuracy of the machine tool. The verification provides a mathematical, not physical, compensation, in less time than other verification methods, by means of the indirect measurement of the geometric errors of the machine from the linear and rotary axes. This paper presents an extensive study of the appropriateness and drawbacks of the regression function employed depending on the types of movement of the axes of any machine. In the same way, strengths and weaknesses of measurement methods and optimization techniques depending on the space available to place the measurement system are presented. These studies provide the most appropriate strategies to verify each machine tool, taking into consideration its configuration and its available work space. PMID:25202744
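
    A minimal sketch of the fitting step described above, assuming a toy one-axis kinematic model whose scale error and offset are identified from laser tracker measurements by nonlinear least squares; a real machine model has many more error terms.

        import numpy as np
        from scipy.optimize import least_squares

        commanded = np.linspace(0.0, 500.0, 20)        # commanded axis positions (mm)
        rng = np.random.default_rng(0)
        measured = commanded * (1 + 3e-05) + 0.01 + rng.normal(0.0, 0.002, 20)

        def residuals(params):
            """Mismatch between the kinematic model prediction and tracker data."""
            scale_error, offset = params
            return commanded * (1 + scale_error) + offset - measured

        fit = least_squares(residuals, x0=[0.0, 0.0])
        print("identified scale error and offset:", fit.x)  # mathematical compensation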

  13. Second order gyrokinetic theory for particle-in-cell codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tronko, Natalia; Bottino, Alberto; Sonnendrücker, Eric

    2016-08-15

    The main idea of the gyrokinetic dynamical reduction consists in a systematic removal of the fast-scale motion (the gyromotion) from the dynamics of the plasma, resulting in a considerable simplification and a significant gain of computational time. The gyrokinetic Maxwell–Vlasov equations are nowadays implemented in numerical codes for modeling (both laboratory and astrophysical) strongly magnetized plasmas. Different versions of the reduced set of equations exist, depending on the construction of the gyrokinetic reduction procedure and the approximations performed in the derivation. The purpose of this article is to explicitly show the connection between the general second order gyrokinetic Maxwell–Vlasov system issued from the modern gyrokinetic theory and the model currently implemented in the global electromagnetic Particle-in-Cell code ORB5. Necessary information about the modern gyrokinetic formalism is given together with the consistent derivation of the gyrokinetic Maxwell–Vlasov equations from first principles. The variational formulation of the dynamics is used to obtain the corresponding energy conservation law, which in turn is used for the verification of energy conservation diagnostics currently implemented in ORB5. This work fits within the context of the code verification project VeriGyro currently run at the IPP Max-Planck-Institut in collaboration with other European institutions.

  14. Development of procedures for calculating stiffness and damping properties of elastomers in engineering applications. Part 1: Verification of basic methods

    NASA Technical Reports Server (NTRS)

    Chiang, T.; Tessarzik, J. M.; Badgley, R. H.

    1972-01-01

    The primary aim of this investigation was verification of basic methods which are to be used in cataloging elastomer dynamic properties (stiffness and damping) in terms of viscoelastic model constants. These constants may then be used to predict dynamic properties for general elastomer shapes and operating conditions, thereby permitting optimum application of elastomers as energy absorption and/or energy storage devices in the control of vibrations in a broad variety of applications. The efforts reported involved: (1) literature search; (2) the design, fabrication and use of a test rig for obtaining elastomer dynamic test data over a wide range of frequencies, amplitudes, and preloads; and (3) the reduction of the test data, by means of a selected three-element elastomer model and specialized curve fitting techniques, to material properties. Material constants thus obtained have been used to calculate stiffness and damping for comparison with measured test data. These comparisons are excellent for a number of test conditions and only fair to poor for others. The results confirm the validity of the basic approach of the overall program and the mechanics of the cataloging procedure, and at the same time suggest areas in which refinements should be made.

  15. Multimodal fusion of polynomial classifiers for automatic person recognition

    NASA Astrophysics Data System (ADS)

    Broun, Charles C.; Zhang, Xiaozheng

    2001-03-01

    With the prevalence of the information age, privacy and personalization are at the forefront of today's society. As such, biometrics are viewed as essential components of current evolving technological systems. Consumers demand unobtrusive and non-invasive approaches. In our previous work, we have demonstrated a speaker verification system that meets these criteria. However, there are additional constraints for fielded systems. The required recognition transactions are often performed in adverse environments and across diverse populations, necessitating robust solutions. There are two significant problem areas in current generation speaker verification systems. The first is the difficulty in acquiring clean audio signals in all environments without encumbering the user with a head-mounted close-talking microphone. Second, unimodal biometric systems do not work with a significant percentage of the population. To combat these issues, multimodal techniques are being investigated to improve system robustness to environmental conditions, as well as improve overall accuracy across the population. We propose a multimodal approach that builds on our current state-of-the-art speaker verification technology. In order to maintain the transparent nature of the speech interface, we focus on optical sensing technology to provide the additional modality, giving us an audio-visual person recognition system. For the audio domain, we use our existing speaker verification system. For the visual domain, we focus on lip motion. This is chosen, rather than static face or iris recognition, because it provides dynamic information about the individual. In addition, the lip dynamics can aid speech recognition to provide liveness testing. The visual processing method makes use of both color and edge information, combined within a Markov random field (MRF) framework, to localize the lips. Geometric features are extracted and input to a polynomial classifier for the person recognition process. A late integration approach, based on a probabilistic model, is employed to combine the two modalities. The system is tested on the XM2VTS database combined with additive white Gaussian noise (AWGN) in the audio domain over a range of signal-to-noise ratios.

  16. Optimal sensitometric curves of Kodak EDR2 film for dynamic intensity modulated radiation therapy verification

    PubMed Central

    Suriyapee, S; Pitaxtarnin, N; Oonsiri, S; Jumpangern, C; Israngkul Na Ayuthaya, I

    2008-01-01

    Purpose: To investigate the optimal sensitometric curves of extended dose range (EDR2) radiographic film in terms of depth, field size, dose range and processing conditions for dynamic intensity modulated radiation therapy (IMRT) dosimetry verification with 6 MV X-ray beams. Materials and methods: A Varian Clinac 23 EX linear accelerator with 6 MV X-ray beam was used to study the response of Kodak EDR2 film. Measurements were performed at depths of 5, 10 and 15 cm in MedTec virtual water phantom and with field sizes of 2x2, 3x3, 10x10 and 15x15 cm2. Doses ranging from 20 to 450 cGy were used. The film was developed with the Kodak RP X-OMAT Model M6B automatic film processor. Film response was measured with the Vidar model VXR-16 scanner. Sensitometric curves were applied to the dose profiles measured with film at 5 cm in the virtual water phantom with field sizes of 2x2 and 10x10 cm2 and compared with ion chamber data. Scanditronix/Wellhofer OmniProTM IMRT software was used for the evaluation of the IMRT plan calculated by Eclipse treatment planning. Results: Investigation of the reproducibility and accuracy of the film responses, which depend mainly on the film processor, was carried out by irradiating one film nine times with doses of 20 to 450 cGy. A maximum standard deviation of 4.9% was found which decreased to 1.9% for doses between 20 and 200 cGy. The sensitometric curves for various field sizes at fixed depth showed a maximum difference of 4.2% between 2x2 and 15x15 cm2 at 5 cm depth with a dose of 450 cGy. The shallow depth tended to show a greater effect of field size responses than the deeper depths. The sensitometric curves for various depths at fixed field size showed slightly different film responses; the difference due to depth was within 1.8% for all field sizes studied. Both field size and depth effect were reduced when the doses were lower than 450 cGy. The difference was within 2.5% in the dose range from 20 to 300 cGy for all field sizes and depths studied. Dose profiles measured with EDR2 film were consistent with those measured with an ion chamber. The optimal sensitometric curve was acquired by irradiating film at a depth of 5 cm with doses ranging from 20 to 450 cGy with a 3×3 cm2 multileaf collimator. The optimal sensitometric curve allowed accurate determination of the absolute dose distribution. In almost 200 cases of dynamic IMRT plan verification with EDR2 film, the difference between measured and calculated dose was generally less than 3% and with 3 mm distance to agreement when using gamma value verification. Conclusion: EDR2 film can be used for accurate verification of composite isodose distributions of dynamic IMRT when the optimal sensitometric curve has been established. PMID:21614315

  17. Optimal sensitometric curves of Kodak EDR2 film for dynamic intensity modulated radiation therapy verification.

    PubMed

    Suriyapee, S; Pitaxtarnin, N; Oonsiri, S; Jumpangern, C; Israngkul Na Ayuthaya, I

    2008-01-01

    To investigate the optimal sensitometric curves of extended dose range (EDR2) radiographic film in terms of depth, field size, dose range and processing conditions for dynamic intensity modulated radiation therapy (IMRT) dosimetry verification with 6 MV X-ray beams. A Varian Clinac 23 EX linear accelerator with 6 MV X-ray beam was used to study the response of Kodak EDR2 film. Measurements were performed at depths of 5, 10 and 15 cm in MedTec virtual water phantom and with field sizes of 2x2, 3x3, 10x10 and 15x15 cm(2). Doses ranging from 20 to 450 cGy were used. The film was developed with the Kodak RP X-OMAT Model M6B automatic film processor. Film response was measured with the Vidar model VXR-16 scanner. Sensitometric curves were applied to the dose profiles measured with film at 5 cm in the virtual water phantom with field sizes of 2x2 and 10x10 cm(2) and compared with ion chamber data. Scanditronix/Wellhofer OmniPro(TM) IMRT software was used for the evaluation of the IMRT plan calculated by Eclipse treatment planning. Investigation of the reproducibility and accuracy of the film responses, which depend mainly on the film processor, was carried out by irradiating one film nine times with doses of 20 to 450 cGy. A maximum standard deviation of 4.9% was found which decreased to 1.9% for doses between 20 and 200 cGy. The sensitometric curves for various field sizes at fixed depth showed a maximum difference of 4.2% between 2x2 and 15x15 cm(2) at 5 cm depth with a dose of 450 cGy. The shallow depth tended to show a greater effect of field size responses than the deeper depths. The sensitometric curves for various depths at fixed field size showed slightly different film responses; the difference due to depth was within 1.8% for all field sizes studied. Both field size and depth effect were reduced when the doses were lower than 450 cGy. The difference was within 2.5% in the dose range from 20 to 300 cGy for all field sizes and depths studied. Dose profiles measured with EDR2 film were consistent with those measured with an ion chamber. The optimal sensitometric curve was acquired by irradiating film at a depth of 5 cm with doses ranging from 20 to 450 cGy with a 3×3 cm(2) multileaf collimator. The optimal sensitometric curve allowed accurate determination of the absolute dose distribution. In almost 200 cases of dynamic IMRT plan verification with EDR2 film, the difference between measured and calculated dose was generally less than 3% and with 3 mm distance to agreement when using gamma value verification. EDR2 film can be used for accurate verification of composite isodose distributions of dynamic IMRT when the optimal sensitometric curve has been established.

  18. Off-fault plasticity in three-dimensional dynamic rupture simulations using a modal Discontinuous Galerkin method on unstructured meshes: Implementation, verification, and application

    NASA Astrophysics Data System (ADS)

    Wollherr, Stephanie; Gabriel, Alice-Agnes; Uphoff, Carsten

    2018-05-01

    The dynamics and potential size of earthquakes depend crucially on rupture transfers between adjacent fault segments. To accurately describe earthquake source dynamics, numerical models can account for realistic fault geometries and rheologies such as nonlinear inelastic processes off the slip interface. We present implementation, verification, and application of off-fault Drucker-Prager plasticity in the open source software SeisSol (www.seissol.org). SeisSol is based on an arbitrary high-order derivative modal Discontinuous Galerkin (ADER-DG) method using unstructured, tetrahedral meshes specifically suited for complex geometries. Two implementation approaches are detailed, modelling plastic failure either employing sub-elemental quadrature points or switching to nodal basis coefficients. At fine fault discretizations the nodal basis approach is up to 6 times more efficient in terms of computational costs while yielding comparable accuracy. Both methods are verified in community benchmark problems and by three dimensional numerical h- and p-refinement studies with heterogeneous initial stresses. We observe no spectral convergence for on-fault quantities with respect to a given reference solution, but rather discuss a limitation to low-order convergence for heterogeneous 3D dynamic rupture problems. For simulations including plasticity, a high fault resolution may be less crucial than commonly assumed, due to the regularization of peak slip rate and an increase of the minimum cohesive zone width. In large-scale dynamic rupture simulations based on the 1992 Landers earthquake, we observe high rupture complexity including reverse slip, direct branching, and dynamic triggering. The spatio-temporal distribution of rupture transfers are altered distinctively by plastic energy absorption, correlated with locations of geometrical fault complexity. Computational cost increases by 7% when accounting for off-fault plasticity in the demonstrating application. Our results imply that the combination of fully 3D dynamic modelling, complex fault geometries, and off-fault plastic yielding is important to realistically capture dynamic rupture transfers in natural fault systems.

  19. Using Dynamic Geometry to Expand Mathematics Teachers' Understanding of Proof

    ERIC Educational Resources Information Center

    de Villiers, Michael

    2004-01-01

    This paper gives a broad descriptive account of some activities that the author has designed using Sketchpad to develop teachers' understanding of other functions of proof than just the traditional function of 'verification'. These other functions of proof illustrated here are those of explanation, discovery and systematization (in the context of…

  20. Space shuttle propulsion estimation development verification, volume 1

    NASA Technical Reports Server (NTRS)

    Rogers, Robert M.

    1989-01-01

    The results of the Propulsion Estimation Development Verification are summarized. A computer program developed under a previous contract (NAS8-35324) was modified to include improved models for the Solid Rocket Booster (SRB) internal ballistics, the Space Shuttle Main Engine (SSME) power coefficient model, the vehicle dynamics using quaternions, and an improved Kalman filter algorithm based on the U-D factorized algorithm. As additional output, the estimated propulsion performances for each device are computed with the associated 1-sigma bounds. The outputs of the estimation program are provided in graphical plots. An additional effort was expended to examine the use of the estimation approach to evaluate single engine test data. In addition to the propulsion estimation program PFILTER, a program was developed to produce a best estimate of trajectory (BET). This program, LFILTER, also uses the U-D factorized form of the Kalman filter, as in the propulsion estimation program PFILTER. The necessary definitions and equations explaining the Kalman filtering approach for the PFILTER program, the models used for this application for dynamics and measurements, the program description, and the program operation are presented.
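
    For orientation, a toy scalar Kalman filter predict/update cycle is sketched below; the U-D factorized algorithm used in PFILTER and LFILTER propagates the covariance as P = U D U^T for numerical robustness, which this simplified version does not attempt to reproduce.

        # Scalar Kalman filter: constant state observed through noisy measurements.
        x, P = 0.0, 1.0      # state estimate and its variance
        F, Q = 1.0, 0.01     # dynamics model and process noise
        H, R = 1.0, 0.25     # measurement model and measurement noise

        for z in [0.9, 1.1, 1.0, 0.8, 1.2]:
            x, P = F * x, F * P * F + Q      # predict
            K = P * H / (H * P * H + R)      # Kalman gain
            x = x + K * (z - H * x)          # update with measurement z
            P = (1.0 - K * H) * P

        print("estimate %.3f with variance %.3f" % (x, P))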

  1. A comparative verification of high resolution precipitation forecasts using model output statistics

    NASA Astrophysics Data System (ADS)

    van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees

    2017-04-01

    Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution meso-scale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both double penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, like a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the 3 post-processed models, but larger differences for individual lead times. In addition, the Fractions Skill Score is computed using the 3 deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they only perform similarly or somewhat better than precipitation forecasts from the 2 lower resolution models, at least in the Netherlands.
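
    A minimal sketch of the extended logistic regression idea above: one equation yields the exceedance probability for any precipitation threshold q by including a transform of q itself as a predictor. The coefficients and the sqrt(q) transform are illustrative assumptions.

        import numpy as np

        def elr_probability(predictor, q, a0=-1.0, a1=0.8, a2=-1.5):
            """P(precip >= q) from a spatial-pattern predictor (e.g. mean forecast
            precipitation around the station) and threshold q (mm)."""
            z = a0 + a1 * predictor + a2 * np.sqrt(q)
            return 1.0 / (1.0 + np.exp(-z))

        # A full probability forecast distribution from a single fitted equation.
        for q in [0.1, 1.0, 5.0]:
            print(q, elr_probability(predictor=2.0, q=q))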

  2. A review of dynamic inflow and its effect on experimental correlations

    NASA Technical Reports Server (NTRS)

    Gaonkar, G. H.; Peters, D. A.

    1985-01-01

    A review is given of the relationship between experimental data and the development of modern dynamic-inflow theory. Some of the most interesting data, first presented 10 years ago at the Dynamic Specialist's Meeting, is now reviewed in light of the newer theories. These pure blade-flapping data correlate very well with analyses that include the new dynamic inflow theory, thus verifying the theory. Experimental data are also presented for damping with coupled inplane and body motions. Although inclusion of dynamic inflow is often required to correlate this coupled data, the data cannot be used to verify any particular dynamic inflow theory due to the uncertainties in modeling the inplane degree of freedom. For verification, pure flapping is required. However, the coupled data do show that inflow is often important in such computations.

  3. Identification of the numerical model of FEM in reference to measurements in situ

    NASA Astrophysics Data System (ADS)

    Jukowski, Michał; Bec, Jarosław; Błazik-Borowa, Ewa

    2018-01-01

    The paper deals with the verification of various numerical models against pilot-phase measurements of a rail bridge subjected to dynamic loading. Three types of FEM models were elaborated for this purpose. Static, modal and dynamic analyses were performed. The study consisted of measuring the accelerations of the structural components of the bridge as a train passed. Based on this, FFT analysis was performed, the main natural frequencies of the bridge were determined, and the structural damping ratio and the dynamic amplification factor (DAF) were calculated and compared with the standard values. Calculations were made using Autodesk Simulation Multiphysics (Algor).
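
    A minimal sketch of the measurement-side analysis described above: estimating a dominant natural frequency from an acceleration record with the FFT; the signal here is synthetic (a decaying 4 Hz mode plus noise).

        import numpy as np

        fs = 200.0                                   # sampling rate (Hz)
        t = np.arange(0.0, 10.0, 1.0 / fs)
        accel = np.sin(2 * np.pi * 4.0 * t) * np.exp(-0.05 * t) \
                + 0.1 * np.random.default_rng(1).normal(size=t.size)

        spectrum = np.abs(np.fft.rfft(accel))
        freqs = np.fft.rfftfreq(accel.size, 1.0 / fs)
        dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
        print("dominant natural frequency ~ %.2f Hz" % dominant)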

  4. Gait analysis--precise, rapid, automatic, 3-D position and orientation kinematics and dynamics.

    PubMed

    Mann, R W; Antonsson, E K

    1983-01-01

    A fully automatic optoelectronic photogrammetric technique is presented for measuring the spatial kinematics of human motion (both position and orientation) and estimating the inertial (net) dynamics. Calibration and verification showed that in a two-meter cube viewing volume, the system achieves one millimeter of accuracy and resolution in translation and 20 milliradians in rotation. Since double differentiation of generalized position data to determine accelerations amplifies noise, the frequency domain characteristics of the system were investigated. It was found that the noise and all other errors in the kinematic data contribute less than five percent error to the resulting dynamics.
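
    The sketch below illustrates why double differentiation of position data amplifies noise, and the common remedy of low-pass smoothing the kinematics first (a Savitzky-Golay filter is assumed here; the paper's actual processing chain is not reproduced).

        import numpy as np
        from scipy.signal import savgol_filter

        fs = 100.0
        t = np.arange(0.0, 2.0, 1.0 / fs)
        rng = np.random.default_rng(2)
        position = np.sin(2 * np.pi * t) + 0.001 * rng.normal(size=t.size)  # ~1 mm noise

        # Naive double differentiation amplifies the millimetre-level noise hugely.
        raw_accel = np.gradient(np.gradient(position, 1.0 / fs), 1.0 / fs)

        # Smoothing the positions first keeps the acceleration estimate usable.
        smoothed = savgol_filter(position, window_length=31, polyorder=3)
        smooth_accel = np.gradient(np.gradient(smoothed, 1.0 / fs), 1.0 / fs)

        print("raw accel std:", raw_accel.std(), "smoothed accel std:", smooth_accel.std())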

  5. 76 FR 46834 - Notice of Submission of Proposed Information Collection to OMB; Notice of Funding Availability...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-03

    ... management, retrofit strategies, home performance verification, and sustainable construction fundamentals... submitted to the Office of Management and Budget (OMB) for review, as required by the Paperwork Reduction... (2528--Pending) and should be sent to: HUD Desk Officer, Office of Management and Budget, New Executive...

  6. Automatic Verification of Serializers.

    DTIC Science & Technology

    1980-03-01

    2.5 Using semaphores to implement serializers ... 2.6 A comparison of ... of concurrency control, while Hewitt has concentrated on more primitive control of concurrency in a context where programs communicate by passing ... A translation of serializers into clusters and semaphores is given as a possible implementation strategy. Chapter 3 presents a simple semantic model that sup...

  7. 77 FR 40895 - Culebra National Wildlife Refuge, PR; Draft Comprehensive Conservation Plan and Environmental...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-11

    ... Post at Punta Flamenco; (7) developing hiking trails; (8) completing boundary verification; and (9... of sea turtles and their nests/eggs. To benefit resident and migratory birds, annual surveys would be... management strategies to benefit target species of birds and cooperate with Puerto Rico DNER to conduct...

  8. 77 FR 17353 - Migratory Bird Subsistence Harvest in Alaska; Harvest Regulations for Migratory Birds in Alaska...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-26

    ... reduce the potential for shooting mortality or injury of closed species. These conservation measures..., meetings, radio shows, signs, school visits, and one-on-one contacts. We also recognize that no listed..., and in-season verification of the harvest. Our primary strategy to reduce the threat of shooting...

  9. An Analysis of Heavy-Ion Single Event Effects for a Variety of Finite State-Machine Mitigation Strategies

    NASA Technical Reports Server (NTRS)

    Berg, Melanie D.; Label, Kenneth A.; Kim, Hak; Phan, Anthony; Seidleck, Christina

    2014-01-01

    Finite state-machines (FSMs) are used to control operational flow in application specific integrated circuits (ASICs) and field programmable gate array (FPGA) devices. Because of their ease of interpretation, FSMs simplify the design and verification process and consequently are significant components in a synchronous design.

  10. Update: Advancement of Contact Dynamics Modeling for Human Spaceflight Simulation Applications

    NASA Technical Reports Server (NTRS)

    Brain, Thomas A.; Kovel, Erik B.; MacLean, John R.; Quiocho, Leslie J.

    2017-01-01

    Pong is a new software tool developed at the NASA Johnson Space Center that advances interference-based geometric contact dynamics based on 3D graphics models. The Pong software consists of three parts: a set of scripts to extract geometric data from 3D graphics models, a contact dynamics engine that provides collision detection and force calculations based on the extracted geometric data, and a set of scripts for visualizing the dynamics response with the 3D graphics models. The contact dynamics engine can be linked with an external multibody dynamics engine to provide an integrated multibody contact dynamics simulation. This paper provides a detailed overview of Pong including the overall approach and modeling capabilities, which encompasses force generation from contact primitives and friction to computational performance. Two specific Pong-based examples of International Space Station applications are discussed, and the related verification and validation using this new tool are also addressed.
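
    A minimal sketch of interference-based contact force generation of the kind described above, assuming two spheres and a simple spring-damper (penalty) law; Pong's actual contact engine, primitives, and parameters are not shown.

        import numpy as np

        def sphere_contact_force(p1, r1, p2, r2, v_rel, k=1.0e4, c=50.0):
            """Penalty contact force on body 1 from body 2 (zero when separated)."""
            d = p1 - p2
            dist = np.linalg.norm(d)
            penetration = (r1 + r2) - dist
            if penetration <= 0.0:
                return np.zeros(3)                          # no interference detected
            normal = d / dist                               # contact normal toward body 1
            f_spring = k * penetration * normal             # push the bodies apart
            f_damper = -c * np.dot(v_rel, normal) * normal  # dissipate energy
            return f_spring + f_damper

        print(sphere_contact_force(np.array([0.0, 0.0, 0.0]), 0.5,
                                   np.array([0.9, 0.0, 0.0]), 0.5,
                                   v_rel=np.array([-0.1, 0.0, 0.0])))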

  11. Verification of the CFD simulation system SAUNA for complex aircraft configurations

    NASA Astrophysics Data System (ADS)

    Shaw, Jonathon A.; Peace, Andrew J.; May, Nicholas E.; Pocock, Mark F.

    1994-04-01

    This paper is concerned with the verification for complex aircraft configurations of an advanced CFD simulation system known by the acronym SAUNA. A brief description of the complete system is given, including its unique use of differing grid generation strategies (structured, unstructured or both) depending on the geometric complexity of the addressed configuration. The majority of the paper focuses on the application of SAUNA to a variety of configurations from the military aircraft, civil aircraft and missile areas. Mesh generation issues are discussed for each geometry and experimental data are used to assess the accuracy of the inviscid (Euler) model used. It is shown that flexibility and accuracy are combined in an efficient manner, thus demonstrating the value of SAUNA in aerodynamic design.

  12. Thermal/structural design verification strategies for large space structures

    NASA Technical Reports Server (NTRS)

    Benton, David

    1988-01-01

    Requirements for space structures of increasing size, complexity, and precision have engendered a search for thermal design verification methods that do not impose unreasonable costs, that fit within the capabilities of existing facilities, and that still adequately reduce technical risk. This requires a combination of analytical and testing methods, using two complementary approaches. The first is to limit thermal testing to sub-elements of the total system only, in a compact configuration (i.e., not fully deployed). The second approach is to use a simplified environment to correlate analytical models with test results. These models can then be used to predict flight performance. In practice, a combination of these approaches is needed to verify the thermal/structural design of future very large space systems.

  13. Methods of increasing efficiency and maintainability of pipeline systems

    NASA Astrophysics Data System (ADS)

    Ivanov, V. A.; Sokolov, S. M.; Ogudova, E. V.

    2018-05-01

    This study is dedicated to the issue of pipeline transportation system maintenance. The article identifies two classes of technical-and-economic indices, which are used to select an optimal pipeline transportation system structure. Further, the article describes various system maintenance strategies and strategy selection criteria. These maintenance strategies, however, turn out to be insufficiently effective when maintenance intervals are not optimal. This problem can be solved by running an adaptive maintenance system, which includes a pipeline transportation system reliability improvement algorithm, in particular a computer model of equipment degradation. In conclusion, three model-building approaches for determining the optimal duration between verification inspections of technical systems are considered.

  14. Problems experienced and envisioned for dynamical physical systems

    NASA Technical Reports Server (NTRS)

    Ryan, R. S.

    1985-01-01

    The use of high performance systems, which is the trend of future space systems, naturally leads to lower margins and a higher sensitivity to parameter variations and, therefore, more problems of dynamical physical systems. To circumvent dynamic problems of these systems, appropriate design, verification analysis, and tests must be planned and conducted. The basic design goal is to define the problem before it occurs. The primary approach for meeting this goal is a good understanding and reviewing of the problems experienced in the past in terms of the system under design. This paper reviews many of the dynamic problems experienced in space systems design and operation, categorizes them as to causes, and envisions future program implications, developing recommendations for analysis and test approaches.

  15. Structural damage detection based on stochastic subspace identification and statistical pattern recognition: II. Experimental validation under varying temperature

    NASA Astrophysics Data System (ADS)

    Lin, Y. Q.; Ren, W. X.; Fang, S. E.

    2011-11-01

    Although most vibration-based damage detection methods can acquire satisfactory verification on analytical or numerical structures, most of them may encounter problems when applied to real-world structures under varying environments. The damage detection methods that directly extract damage features from the periodically sampled dynamic time history response measurements are desirable but relevant research and field application verification are still lacking. In this second part of a two-part paper, the robustness and performance of the statistics-based damage index using the forward innovation model by stochastic subspace identification of a vibrating structure proposed in the first part have been investigated against two prestressed reinforced concrete (RC) beams tested in the laboratory and a full-scale RC arch bridge tested in the field under varying environments. Experimental verification is focused on temperature effects. It is demonstrated that the proposed statistics-based damage index is insensitive to temperature variations but sensitive to the structural deterioration or state alteration. This makes it possible to detect the structural damage for the real-scale structures experiencing ambient excitations and varying environmental conditions.

  16. Control/structure interaction design methodology

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.; Layman, William E.

    1989-01-01

    The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight test. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme and analysts will be closely integrated to the CSI Test Bed laboratory. Components, concepts, tools and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.

  17. NEW DEVELOPMENTS AND APPLICATIONS OF SUPERHEATED EMULSIONS: WARHEAD VERIFICATION AND SPECIAL NUCLEAR MATERIAL INTERDICTION.

    PubMed

    d'Errico, F; Chierici, A; Gattas-Sethi, M; Philippe, S; Goldston, R; Glaser, A

    2018-04-25

    In recent years, neutron detection with superheated emulsions has received renewed attention thanks to improved detector manufacturing and read-out techniques, and thanks to successful applications in warhead verification and special nuclear material (SNM) interdiction. Detectors are currently manufactured with methods allowing high uniformity of the drop sizes, which in turn allows the use of optical read-out techniques based on dynamic light scattering. Small detector cartridges arranged in 2D matrices are developed for the verification of a declared warhead without revealing its design. For this application, the enabling features of the emulsions are that bubbles formed at different times cannot be distinguished from each other, while the passive nature of the detectors avoids the susceptibility to electronic snooping and tampering. Large modules of emulsions are developed to detect the presence of shielded special nuclear materials hidden in cargo containers 'interrogated' with high energy X-rays. In this case, the enabling features of the emulsions are photon discrimination, a neutron detection threshold close to 3 MeV and a rate-insensitive read-out.

  18. Real time radiotherapy verification with Cherenkov imaging: development of a system for beamlet verification

    NASA Astrophysics Data System (ADS)

    Pogue, B. W.; Krishnaswamy, V.; Jermyn, M.; Bruza, P.; Miao, T.; Ware, William; Saunders, S. L.; Andreozzi, J. M.; Gladstone, D. J.; Jarvis, L. A.

    2017-05-01

    Cherenkov imaging has been shown to allow near real time imaging of the beam entrance and exit on patient tissue, with the appropriate intensified camera and associated image processing. A dedicated system has been developed for research into full torso imaging of whole breast irradiation, where the dual camera system captures the beam shape for all beamlets used in this treatment protocol. Particularly challenging verification measurements exist in dynamic wedge, field-in-field, and boost delivery, and the system was designed to capture these as they are delivered. Two intensified CMOS (ICMOS) cameras were developed and mounted in a breast treatment room, and pilot studies for intensity and stability were completed. Software tools to contour the treatment area have been developed and are being tested prior to initiation of the full trial. At present, it is possible to record delivery of individual beamlets as small as a single MLC leaf thickness, and readout at 20 frames per second is achieved. Statistical analysis of system repeatability and stability is presented, as well as pilot human studies.

  19. SU-F-J-32: Do We Need KV Imaging During CBCT Based Patient Set-Up for Lung Radiation Therapy?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopal, A; Zhou, J; Prado, K

    Purpose: To evaluate the role of 2D kilovoltage (kV) imaging to complement cone beam CT (CBCT) imaging in a shift threshold based image guided radiation therapy (IGRT) strategy for conventional lung radiotherapy. Methods: A retrospective study was conducted by analyzing IGRT couch shift trends for 15 patients that received lung radiation therapy to evaluate the benefit of performing orthogonal kV imaging prior to CBCT imaging. Herein, a shift threshold based IGRT protocol was applied, which would mandate additional CBCT verification if the applied patient shifts exceeded 3 mm, to avoid intraobserver variability in CBCT registration and to confirm table shifts. For each patient, two IGRT strategies: kV + CBCT and CBCT alone, were compared and the recorded patient shifts were categorized into whether additional CBCT acquisition would have been mandated or not. The effectiveness of either strategy was gauged by the likelihood of needing additional CBCT imaging for accurate patient set-up. Results: The use of CBCT alone was 6 times more likely to require an additional CBCT than kV + CBCT, for a 3 mm shift threshold (88% vs 14%). The likelihood of additional CBCT verification generally increased with lower shift thresholds, and was significantly lower when kV + CBCT was used (7% with 5 mm shift threshold, 36% with 2 mm threshold), than with CBCT alone (61% with 5 mm shift threshold, 97% with 2 mm threshold). With CBCT alone, treatment time increased by 2.2 min and dose increased by 1.9 cGy per fraction on average due to additional CBCT with a 3 mm shift threshold. Conclusion: The benefit of kV imaging to screen for gross misalignments led to more accurate CBCT based patient localization compared with using CBCT alone. The subsequently reduced need for additional CBCT verification will minimize treatment time and result in less overall patient imaging dose.

  20. Two years experience with quality assurance protocol for patient related Rapid Arc treatment plan verification using a two dimensional ionization chamber array

    PubMed Central

    2011-01-01

    Purpose: To verify the dose distribution and number of monitor units (MU) for dynamic treatment techniques like volumetric modulated single arc radiation therapy (Rapid Arc), each patient treatment plan has to be verified prior to the first treatment. The purpose of this study was to develop a patient-related treatment plan verification protocol using a two-dimensional ionization chamber array (MatriXX, IBA, Schwarzenbruck, Germany). Method: Measurements were performed to determine the dependence of the 2D ionization chamber array's response on beam direction and field size, and the reproducibility of the measurements was checked. For the patient-related verifications, the original patient Rapid Arc treatment plan was projected onto a CT dataset of the MatriXX and the dose distribution was calculated. After irradiation of the Rapid Arc verification plans, measured and calculated 2D dose distributions were compared using the gamma evaluation method implemented in the measuring software OmniPro (version 1.5, IBA, Schwarzenbruck, Germany). Results: Measurements of single arcs showed a passing rate of 99% for field sizes between 7 cm × 7 cm and 24 cm × 24 cm; for smaller and larger field sizes, the passing rate was below 99%. Reproducibility was within a passing rate of 99% to 100%. The accuracy of the whole process, including the uncertainties of the measuring system, treatment planning system, linear accelerator, and isocentric laser system in the treatment room, was acceptable for treatment plan verification using gamma criteria of 3% and 3 mm (2D global gamma index). Conclusion: It was possible to verify the 2D dose distribution and MU of Rapid Arc treatment plans using the MatriXX, and the use of the MatriXX for Rapid Arc treatment plan verification in clinical routine is reasonable. With the passing-rate threshold set at 99%, the verification protocol is able to detect clinically significant errors. PMID:21342509
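
    The gamma evaluation used here is a standard algorithm and compact enough to sketch. The following is a hedged, brute-force illustration of a 2D global gamma index with 3%/3 mm criteria, not the OmniPro implementation; the 7.62 mm grid spacing is only an assumed detector pitch for the example.

```python
import numpy as np

def gamma_2d(dose_ref, dose_eval, spacing_mm, dd=0.03, dta_mm=3.0):
    """Brute-force 2D global gamma index; dd is relative to the global max."""
    ny, nx = dose_ref.shape
    dose_norm = dd * dose_ref.max()                  # global dose criterion
    r = int(np.ceil(2 * dta_mm / spacing_mm)) + 1    # search radius (pixels)
    gamma = np.full(dose_ref.shape, np.inf)
    for j in range(ny):
        for i in range(nx):
            for dj in range(-r, r + 1):
                for di in range(-r, r + 1):
                    jj, ii = j + dj, i + di
                    if 0 <= jj < ny and 0 <= ii < nx:
                        dist2 = ((dj**2 + di**2) * spacing_mm**2) / dta_mm**2
                        dose2 = ((dose_eval[jj, ii] - dose_ref[j, i]) / dose_norm)**2
                        gamma[j, i] = min(gamma[j, i], np.sqrt(dist2 + dose2))
    return gamma

# Toy dose planes on a coarse grid (assumed 7.62 mm chamber pitch):
rng = np.random.default_rng(0)
ref = rng.random((20, 20)) + 1.0
ev = ref * (1 + 0.01 * rng.standard_normal(ref.shape))
g = gamma_2d(ref, ev, spacing_mm=7.62)
print(f"gamma passing rate (3%/3 mm): {(g <= 1).mean():.1%}")
```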

  1. Comprehensive predictions of target proteins based on protein-chemical interaction using virtual screening and experimental verifications.

    PubMed

    Kobayashi, Hiroki; Harada, Hiroko; Nakamura, Masaomi; Futamura, Yushi; Ito, Akihiro; Yoshida, Minoru; Iemura, Shun-Ichiro; Shin-Ya, Kazuo; Doi, Takayuki; Takahashi, Takashi; Natsume, Tohru; Imoto, Masaya; Sakakibara, Yasubumi

    2012-04-05

    Identification of the target proteins of bioactive compounds is critical for elucidating the mode of action; however, target identification has generally been difficult, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. We applied our protocol of predicting target proteins, combining in silico screening and experimental verification, to incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins whose expression could be confirmed in our cell system. As a result, 40% accuracy of the computational predictions was achieved, and three previously unknown incednine-binding proteins were identified. This study revealed that our proposed protocol of predicting target proteins, combining in silico screening and experimental verification, is useful, and it provides new insight into a strategy for identifying the target proteins of small molecules.

  2. Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification

    PubMed Central

    Ilies, Maria; Iuga, Cristina Adela; Loghin, Felicia; Dhople, Vishnu Mukund; Hammer, Elke

    2017-01-01

    Background and aims: Proteome-based biomarker studies target proteins that could serve as diagnostic, prognostic, and predictive molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for clinical practice. Methods: Six EDTA plasma samples were analyzed after tryptic digestion using a high-throughput data-independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked into each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results: Our method ensured absolute quantification of 242 non-redundant plasma proteins in a single-run analysis. The dynamic range covered was 10^5, and 86% of the quantified proteins were classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reviewed in the literature. Conclusions: Nano-LC Q-TOF UDMSE proteomic analysis can be used for a simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with a low coefficient of variation at the protein level. The method proved to be a reliable tool for the quantification of protein panels for biomarker verification in clinical practice. PMID:29151793

  3. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We build upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.
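
    As a concrete (though entirely illustrative) picture of the first approach, the sketch below shows a run-time monitor that randomly queries an implementation for trace fragments and checks them against a design-level safety specification; the transition set and trace format are invented for the example.

```python
import random

# Safety specification extracted from the design: the set of allowed
# state transitions. Anything outside this set is a deviation.
SAFE_TRANSITIONS = {("idle", "busy"), ("busy", "idle"),
                    ("busy", "error"), ("error", "idle")}

def violates_safety(trace):
    """A trace deviates if it contains a transition outside the spec."""
    return any(step not in SAFE_TRANSITIONS
               for step in zip(trace, trace[1:]))

def monitor(query_trace, fail_safe, samples=1000):
    """Randomly sample run-time traces; trigger the fail-safe on deviation."""
    for _ in range(samples):
        trace = query_trace()
        if violates_safety(trace):
            fail_safe(trace)
            return False
    return True

# Toy implementation that occasionally takes an unspecified shortcut:
def query_trace():
    return ["idle", "busy", "error", "busy"] if random.random() < 0.01 \
        else ["idle", "busy", "idle"]

ok = monitor(query_trace, fail_safe=lambda t: print("deviation:", t))
print("no deviation observed" if ok else "fail-safe triggered")
```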

  4. Analysis and test for space shuttle propellant dynamics

    NASA Technical Reports Server (NTRS)

    Berry, R. L.; Demchak, L. J.; Tegart, J. R.

    1983-01-01

    This report presents the results of a study to develop an analytical model capable of predicting the dynamic interaction forces on the Shuttle External Tank due to large-amplitude propellant slosh during RTLS separation. The report details low-g drop tower and KC-135 test programs that were conducted to investigate propellant reorientation during RTLS. In addition, the development of nonlinear finite element slosh models (the two-dimensional LAMPS2 and the three-dimensional LAMPS3) is presented. Correlation between the models and test data is presented as a verification of the modeling approach.

  5. Numerical computation of orbits and rigorous verification of existence of snapback repellers.

    PubMed

    Peng, Chen-Chang

    2007-03-01

    In this paper we show how analysis from numerical computation of orbits can be applied to prove the existence of snapback repellers in discrete dynamical systems. That is, we present a computer-assisted method to prove the existence of a snapback repeller of a specific map. The existence of a snapback repeller of a dynamical system implies that it has chaotic behavior [F. R. Marotto, J. Math. Anal. Appl. 63, 199 (1978)]. The method is applied to the logistic map and the discrete predator-prey system.
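
    To make the idea tangible, here is a small numerical illustration (not the paper's rigorous, computer-assisted proof): for the logistic map f(x) = 4x(1-x), the fixed point x* = 3/4 is repelling (|f'(x*)| = 2), and following a chain of preimages back into a small neighborhood of x* exhibits a candidate snapback point x0 whose forward orbit lands on x*. A rigorous argument would additionally verify Marotto's derivative condition along the orbit and control rounding errors.

```python
import math
import random

def preimages(y):
    """The two inverse branches of the logistic map f(x) = 4x(1-x)."""
    s = math.sqrt(1.0 - y)
    return (1.0 - s) / 2.0, (1.0 + s) / 2.0

x_star = 0.75                      # repelling fixed point of f
rng = random.Random(0)
x, steps = 0.25, 1                 # 0.25 is the non-fixed preimage of x*
while steps < 100_000:
    if abs(x - x_star) < 1e-3:     # re-entered the repelling neighborhood
        print(f"x0 = {x:.8f} maps onto x* after {steps} iterations of f")
        break
    x = preimages(x)[rng.randrange(2)]   # follow a random inverse branch
    steps += 1
```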

  6. Optimal placement of excitations and sensors for verification of large dynamical systems

    NASA Technical Reports Server (NTRS)

    Salama, M.; Rose, T.; Garba, J.

    1987-01-01

    The computationally difficult problem of the optimal placement of excitations and sensors to maximize the observed measurements is studied within the framework of combinatorial optimization, and is solved numerically using a variation of the simulated annealing heuristic algorithm. Results of numerical experiments, including a square plate and a 960-degree-of-freedom Control of Flexible Structures (COFS) truss structure, are presented. Though the algorithm produces suboptimal solutions, its generality and simplicity allow the treatment of complex dynamical systems which would otherwise be difficult to handle.
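
    A toy version of this combinatorial search is easy to sketch. The code below (illustrative only; the observability score is a crude stand-in for the paper's objective) uses simulated annealing to choose k sensor locations out of n candidate degrees of freedom so as to maximize the summed squared modal amplitudes at the chosen locations.

```python
import math
import random

def anneal(score, n, k, iters=20_000, t0=1.0, cooling=0.9995, seed=0):
    """Simulated annealing over k-subsets of n candidate locations."""
    rng = random.Random(seed)
    placement = rng.sample(range(n), k)
    current = score(placement)
    best, best_score = list(placement), current
    t = t0
    for _ in range(iters):
        candidate = list(placement)
        candidate[rng.randrange(k)] = rng.choice(
            [i for i in range(n) if i not in placement])  # swap one location
        delta = score(candidate) - current
        if delta >= 0 or rng.random() < math.exp(delta / t):
            placement, current = candidate, current + delta
            if current > best_score:
                best, best_score = list(placement), current
        t *= cooling                                       # cool the schedule
    return sorted(best), best_score

# Crude observability proxy: sum of squared mode-shape entries at sensors.
rng = random.Random(1)
phi = [[rng.gauss(0, 1) for _ in range(5)] for _ in range(40)]  # 40 DOFs, 5 modes
obs = lambda idx: sum(phi[i][m] ** 2 for i in idx for m in range(5))
print(anneal(obs, n=40, k=6))
```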

  7. A study of the dynamics of rotating space stations with elastically connected counterweight and attached flexible appendages. Volume 1: Theory

    NASA Technical Reports Server (NTRS)

    Austin, F.; Markowitz, J.; Goldenberg, S.; Zetkov, G. A.

    1973-01-01

    A mathematical model for predicting the dynamic behavior of rotating flexible space station configurations was formulated. The overall objectives of the study were: (1) to develop the theoretical techniques for determining the behavior of a realistically modeled rotating space station, (2) to provide a versatile computer program for the numerical analysis, and (3) to present practical concepts for experimental verification of the analytical results. The mathematical model and its associated computer program are described.

  8. Impact of the difference in the plantar flexor strength of the ankle joint in the affected side among hemiplegic patients on the plantar pressure and walking asymmetry.

    PubMed

    You, Young Youl; Chung, Sin Ho; Lee, Hyung Jin

    2016-11-01

    [Purpose] This study examined the changes in the gait lines and plantar pressures under static and dynamic conditions, according to the differences in the strengths of the plantar flexors in the ankle joints on the affected sides of hemiplegic patients, and determined their impacts on walking symmetry. [Subjects and Methods] A total of thirty hospitalized stroke patients suffering from hemiplegia were selected for this study. The subjects had ankylosing patterns in the ankle joints of the affected sides. Fifteen of the patients had plantar flexor manual muscle testing scores between poor and fair, while the other fifteen had scores between zero and trace. [Results] The contact pattern of the plantar surface with the ground is a reliable method for walking analysis, and an important index for understanding the ankle mechanism and the relationship between the plantar surface and the ground. [Conclusion] The functional improvement of patients with stroke could be supported through verification of the analysis methods for the therapy strategy and walking pattern.

  9. First results of the wind evaluation breadboard for ELT primary mirror design

    NASA Astrophysics Data System (ADS)

    Reyes García-Talavera, Marcos; Viera, Teodora; Núñez, Miguel

    2010-07-01

    The Wind Evaluation Breadboard (WEB) is a primary mirror and telescope simulator formed by seven aluminium segments, including position sensors, electromechanical support systems, and support structures. WEB has been developed to evaluate technologies for primary mirror wavefront control and to evaluate the performance of the control of wind buffeting disturbance on ELT segmented mirrors. For this purpose, the WEB electro-mechanical set-up simulates the real operational constraints applied to large segmented mirrors. This paper describes the WEB assembly, integration, and verification; the instrument characterisation; and the closed-loop control design, including the dynamical characterization of the instrument and the control architecture. The performance of the new technologies developed for position sensing, actuation, and control is evaluated. The integration of the instrument in the observatory and the results of the first experiments are summarised, for different wind conditions and elevation and azimuth angles of incidence. Conclusions are drawn with respect to the wind rejection performance and the control strategy for an ELT. WEB has been designed and developed by IAC, ESO, ALTRAN and JUPASA, with the integration of subsystems from FOGALE and TNO.

  10. Framework for Evaluating Loop Invariant Detection Games in Relation to Automated Dynamic Invariant Detectors

    DTIC Science & Technology

    2015-09-01

    [Abstract garbled in extraction; only fragments are recoverable.] Recoverable front-matter fragments list a figure titled "Excel VBA Codes for Checker" and abbreviations including OS (Operating System), SQL (Structured Query Language), VC (Verification Condition), and VBA (Visual Basic for Applications), along with a reference to the National Vulnerability Database. The recoverable abstract text indicates that the framework checks assertions for detectability by the Daikon dynamic invariant detector, using a checker implemented as an Excel Visual Basic for Applications (VBA) script.

  11. META 2f: Probabilistic, Compositional, Multi-dimension Model-Based Verification (PROMISE)

    DTIC Science & Technology

    2011-10-01

    [Abstract garbled in extraction; only fragments are recoverable.] Recoverable fragments reference a section on equational logic, rewriting logic, and Maude, followed by results and discussion. The recoverable abstract text describes an abstraction in which a hybrid system's discrete transitions are left unchanged while the differential equations describing the continuous dynamics in each mode are replaced by discrete transitions, making hard-to-analyze differential equations tractable; in principle, predicate and qualitative abstraction can be used in this way.

  12. Experimental verification of dynamic simulation

    NASA Technical Reports Server (NTRS)

    Yae, K. Harold; Hwang, Howyoung; Chern, Su-Tai

    1989-01-01

    The dynamics model here is a backhoe, which from the dynamics standpoint is a four-degree-of-freedom manipulator. Two types of experiment were chosen that can also be simulated by a multibody dynamics simulation program. In the experiment, the configuration and force histories were recorded in the time domain; that is, velocity and position, and force output and differential pressure change from the hydraulic cylinder. When the experimental force history is used as the driving force in the simulation model, the forward dynamics simulation produces a corresponding configuration history. Then, the experimental configuration history is used in the inverse dynamics analysis to generate a corresponding force history. Therefore, two sets of configuration and force histories--one set from experiment, and the other from the simulation that is driven forward and backward with the experimental data--are compared in the time domain. Further comparisons are made with regard to the effects of initial conditions, friction, and viscous damping.
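
    The two-way comparison described here can be illustrated on a much simpler plant. The sketch below (a 1-DOF pendulum standing in for the 4-DOF backhoe; all parameters invented) computes a force history from a prescribed motion by inverse dynamics, then drives a forward simulation with that force history and checks that the original motion is reproduced.

```python
import numpy as np

# Pendulum parameters (illustrative)
m, l, g = 10.0, 1.0, 9.81
I = m * l ** 2
dt, n = 0.001, 5000
t = np.arange(n) * dt

# "Experimental" configuration history and its derivatives
q_exp = 0.5 * np.sin(2 * np.pi * 0.5 * t)
qd_exp = np.gradient(q_exp, dt)
qdd_exp = np.gradient(qd_exp, dt)

# Inverse dynamics: torque required to produce the recorded motion
tau = I * qdd_exp + m * g * l * np.sin(q_exp)

# Forward dynamics: integrate the same model driven by the torque history
q, qd, q_sim = q_exp[0], qd_exp[0], np.empty(n)
for k in range(n):                      # semi-implicit Euler integration
    qdd = (tau[k] - m * g * l * np.sin(q)) / I
    qd += qdd * dt
    q += qd * dt
    q_sim[k] = q

print("max |q_sim - q_exp| =", np.abs(q_sim - q_exp).max())
```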

  13. Space station dynamics, attitude control and momentum management

    NASA Technical Reports Server (NTRS)

    Sunkel, John W.; Singh, Ramen P.; Vengopal, Ravi

    1989-01-01

    The Space Station Attitude Control System software test-bed provides a rigorous environment for the design, development, and functional verification of GN&C algorithms and software. The approach taken for the simulation of the vehicle dynamics and environmental models using a computationally efficient algorithm is discussed. The simulation includes capabilities for docking/berthing dynamics, prescribed motion dynamics associated with the Mobile Remote Manipulator System (MRMS), and microgravity disturbances. The vehicle dynamics module interfaces with the test-bed through the central Communicator facility, which is in turn driven by the Station Control Simulator (SCS) Executive. The Communicator addresses issues such as the interface between the discrete flight software and the continuous vehicle dynamics, and multi-programming aspects such as the complex flow of control in real-time programs. Combined with the flight software and redundancy management modules, the facility provides a flexible, user-oriented simulation platform.

  14. Eye-Tracking Verification of the Strategy Used to Analyse Algorithms Expressed in a Flowchart and Pseudocode

    ERIC Educational Resources Information Center

    Andrzejewska, Magdalena; Stolinska, Anna; Blasiak, Wladyslaw; Peczkowski, Pawel; Rosiek, Roman; Rozek, Bozena; Sajka, Miroslawa; Wcislo, Dariusz

    2016-01-01

    The results of qualitative and quantitative investigations conducted with individuals who learned algorithms in school are presented in this article. In these investigations, eye-tracking technology was used to follow the process of solving algorithmic problems. The algorithmic problems were presented in two comparable variants: in a pseudocode…

  15. Verification and Validation Strategy for Implementation of Hybrid Potts-Phase Field Hydride Modeling Capability in MBM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason D. Hales; Veena Tikare

    2014-04-01

    The Used Fuel Disposition (UFD) program has initiated a project to develop a hydride formation modeling tool using a hybrid Potts-phase field approach. The Potts model is incorporated in the SPPARKS code from Sandia National Laboratories. The phase field model is provided through MARMOT from Idaho National Laboratory.

  16. Impact Damage and Strain Rate Effects for Toughened Epoxy Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2006-01-01

    Structural integrity of composite systems under dynamic impact loading is investigated herein. The GENOA virtual testing software environment is used to implement the effects of dynamic loading on fracture progression and damage tolerance. Combinations of graphite and glass fibers with a toughened epoxy matrix are investigated. The effect of a ceramic coating for the absorption of impact energy is also included. Impact and post-impact simulations include verification and prediction of (1) load and impact energy, (2) impact damage size, (3) maximum impact peak load, (4) residual strength, (5) maximum displacement, (6) contribution of failure modes to failure mechanisms, (7) impact load versus time, and (8) damage and fracture pattern. A computer model is utilized for the assessment of structural response, progressive fracture, and defect/damage tolerance characteristics. Results show the damage progression sequence and the changes in the structural response characteristics due to dynamic impact. The fundamental premise of computational simulation is that the complete evaluation of composite fracture requires an assessment of ply and subply level damage/fracture processes as the structure is subjected to loads. Simulation results for the graphite/epoxy composite were compared with impact and tension failure test data; correlation and verification were obtained for (1) impact energy, (2) damage size, (3) maximum impact peak load, (4) residual strength, (5) maximum displacement, and (6) failure mechanisms of the composite structure.

  17. An open source solution for an in-house built dynamic platform for the validation of stereotactic ablative body radiotherapy for VMAT and IMRT.

    PubMed

    Munoz, Luis; Ziebell, Amy; Morton, Jason; Bhat, Madhava

    2016-12-01

    An in-house solution for the verification of dose delivered to a moving phantom, as required for the clinical implementation of lung stereotactic ablative body radiation therapy, was developed. The superior-inferior movement required to simulate tumour motion during a normal breathing cycle was achieved via the novel use of an Arduino Uno™, a low-cost open-source microcontroller board, connected to a high-torque servo motor. Slow CT imaging was used to acquire the image set, and a 4D cone beam CT (4D-CBCT) verified the efficacy of contoured margins before treatment on the moving phantom. Treatment fields were delivered to a section of a CIRS™ anthropomorphic phantom. Dose verification on the dynamic phantom with Gafchromic EBT3 film using 3%/1 mm gamma analysis acceptance criteria registered absolute dose pass rates for IMRT and VMAT of 98% and 96.6%, respectively. It was verified that 100% of the PTV received the prescribed dose of 12 Gy per fraction using the dynamic phantom, and no major discrepancy between planned and measured results due to interplay between multileaf collimator sequences and target motion was observed. This study confirmed that the use of an in-house solution using open-source hardware and software with existing quality assurance equipment is appropriate for validating a new treatment technique.
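
    The motion waveform such a platform reproduces is easy to sketch. The following is purely illustrative (the paper's Arduino firmware is not reproduced here): a Lujan-type cos^4 breathing trace, sampled and mapped to servo lever-arm angles, with the period, amplitude, and arm length all assumed values.

```python
import math

T_s = 4.0       # breathing period (s), assumed
b_mm = 10.0     # peak-to-trough superior-inferior amplitude (mm), assumed
arm_mm = 50.0   # servo lever-arm length (mm), assumed

def servo_angle_deg(t):
    """Servo angle reproducing z(t) = b * cos^4(pi t / T) on a lever arm."""
    z = b_mm * math.cos(math.pi * t / T_s) ** 4
    return math.degrees(math.asin(z / arm_mm))

# One breathing cycle sampled at eight points:
for k in range(9):
    t = k * T_s / 8
    print(f"t = {t:4.2f} s  servo angle = {servo_angle_deg(t):5.2f} deg")
```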

  18. Model Mismatch Paradigm for Probe based Nanoscale Imaging

    NASA Astrophysics Data System (ADS)

    Agarwal, Pranav

    Scanning Probe Microscopes (SPMs) are widely used for investigation of material properties and manipulation of matter at the nanoscale. These instruments are considered critical enablers of nanotechnology, providing the only technique for direct observation of dynamics at the nanoscale and for affecting it with sub-Angstrom resolution. Current SPMs are limited by low throughput and a lack of quantitative measurements of material properties. Various applications demand high-throughput operation, such as high-density data storage, sub-20 nm lithography, fault detection and functional probing of semiconductor circuits, direct observation of the dynamical processes of biological samples such as motor proteins, and transport phenomena in various materials. Researchers involved in material characterization at the nanoscale are interested in obtaining quantitative measurements of the stiffness and dissipative properties of various materials in a minimally invasive manner. In this thesis, system-theoretic concepts are used to address these limitations. The central tenet of the thesis is to model the known information about the system and then focus on perturbations of these known dynamics, to sense the effects of changes in the environment such as changes in material properties or surface topography. Thus a model mismatch paradigm for probe-based nanoscale imaging is developed. The topic is developed by presenting physics-based modeling of a particular mode of operation of SPMs called the dynamic mode. This mode is modeled as a forced Lur'e system, where a linear time-invariant system is in feedback with an unknown static memoryless nonlinearity. Tools from averaging theory are used to tame this complex nonlinear system by approximating it as a linear system with time-varying parameters. Material properties are thus transformed from being parameters of unknown nonlinear functions to being unknown coefficients of a linear plant. The first contribution of this thesis deals with real-time detection and reduction of spurious areas in the image, also known as probe-loss areas. These areas become severely detrimental during high-speed operation. The detection strategy is based on thresholding of a distance measure, which captures the difference between sensor models in the absence and presence of probe-loss. A switching gain control strategy based on the output of a Kalman filter is used to reduce probe-loss areas in real time. The efficacy of this technique is demonstrated through experimental results showing increased image fidelity at scan rates that are 10 times faster than conventional scan rates. The second contribution of this thesis deals with developing a multi-frequency input excitation strategy and deriving a bias-compensated adaptive parameter estimation strategy to determine the instantaneous equivalent cantilever model. This is used to address the challenge of quantitative imaging at high bandwidth by relating the estimated plant coefficients to the conservative and dissipative components of the tip-sample interaction. The efficacy of the technique is demonstrated for quantitative material characterization of a polymer sample, resulting in material information not previously obtainable during dynamic mode operation. This information is obtained at speeds two orders of magnitude faster than existing techniques. Quantitative verification strategies for the accuracy of the estimated parameters are presented.
The third contribution of this thesis deals with developing real-time tractable models and a characterization methodology for an electrostatically actuated MEMS cantilever with an integrated solid-state thermal sensor. Appropriate modeling assumptions are made to delineate the various nonlinear forces on the cantilever, namely the electrostatic force, the tip-sample interaction force, and capacitive coupling. An experimental strategy is presented to measure the thermal sensing transfer function from DC to 100 kHz. A quantitative match between experimental and simulated data is obtained for the large-range nonlinearities and small-signal dynamics.
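
    A highly simplified version of the probe-loss detection idea can be sketched as follows (illustrative only, not the thesis implementation): a scalar Kalman filter tracks the cantilever amplitude, and a normalized-innovation test flags samples where the measurement deviates from the tracked model, as happens when the tip loses the surface and the amplitude recovers toward its free-air value.

```python
import numpy as np

def flag_probe_loss(y, q=1e-6, r=1e-4, thresh=3.0):
    """Flag samples whose normalized Kalman innovation exceeds thresh."""
    x, p = y[0], 1.0
    flags = np.zeros(y.size, dtype=bool)
    for k, yk in enumerate(y):
        p += q                       # predict (random-walk amplitude model)
        s = p + r                    # innovation variance
        nu = yk - x                  # innovation
        flags[k] = abs(nu) / np.sqrt(s) > thresh
        gain = p / s                 # update
        x += gain * nu
        p *= 1.0 - gain
    return flags

# Synthetic amplitude trace: probe-loss pushes amplitude toward free air.
amp = np.concatenate([np.full(200, 0.8), np.full(50, 1.0), np.full(200, 0.8)])
amp += 0.01 * np.random.default_rng(2).standard_normal(amp.size)
print("samples flagged as probe-loss:", int(flag_probe_loss(amp).sum()))
```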

  19. A verification strategy for web services composition using enhanced stacked automata model.

    PubMed

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

    Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different service providers, a composite service describing the intact business process of an enterprise can be made. Many standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides an Extensible Markup Language (XML) specification language for defining and implementing business process workflows for web services. The problem with most realistic approaches to service composition is the verification of the composed web services, which has to rely on formal verification methods to ensure the correctness of composed services. A few research works in the literature have addressed verification of web services for deterministic systems; moreover, the existing models did not address verification properties like dead transitions, deadlock, reachability, and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system are evaluated based on properties like dead transitions, deadlock, safety, liveness, and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS), converted into ESAM (a combination of Muller Automata (MA) and Push Down Automata (PDA)), and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results revealed better performance in finding dead transitions and deadlocks in contrast to the existing models.
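
    The flavor of the properties being checked can be conveyed with a tiny explicit-state sketch (illustrative only; the paper works with ESAM and SPIN rather than this ad hoc search): breadth-first exploration of a composed service's transition system, reporting reachable states and deadlocks, i.e., non-final states with no outgoing transition.

```python
from collections import deque

def explore(initial, transitions, final_states):
    """BFS over an explicit transition system; report deadlock states."""
    seen, queue, deadlocks = {initial}, deque([initial]), []
    while queue:
        state = queue.popleft()
        successors = transitions.get(state, [])
        if not successors and state not in final_states:
            deadlocks.append(state)          # stuck in a non-final state
        for nxt in successors:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen, deadlocks

# Toy composed service: order -> pay -> ship, plus a faulty cancel branch.
transitions = {"order": ["pay", "cancel"], "pay": ["ship"], "ship": []}
reachable, deadlocks = explore("order", transitions, final_states={"ship"})
print("reachable:", sorted(reachable))   # every state is reachable, but...
print("deadlocks:", deadlocks)           # ...'cancel' has no way forward
```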

  20. Analysis of features of hydrodynamics and heat transfer in the fuel assembly of prospective sodium reactor with a high rate of reproduction in the uranium-plutonium fuel cycle

    NASA Astrophysics Data System (ADS)

    Lubina, A. S.; Subbotin, A. S.; Sedov, A. A.; Frolov, A. A.

    2016-12-01

    The fast sodium reactor fuel assembly (FA) with U-Pu-Zr metallic fuel is described. In comparison with a "classical" fast reactor, this FA contains thin fuel rods and a wider fuel rod grid. Studies of the fluid dynamics and heat transfer were carried out for this new FA design. The ANSYS CFX code was verified for the determination of the velocity, pressure, and temperature fields in the different channels. The calculations in the cells and in the FA were carried out using the shear stress transport (SST) model selected at the verification stage. The results of the hydrodynamics and heat transfer calculations are analyzed.

  1. APPLICATION OF STEEL PIPE PILE LOADING TESTS TO DESIGN VERIFICATION OF FOUNDATION OF THE TOKYO GATE BRIDGE

    NASA Astrophysics Data System (ADS)

    Saitou, Yutaka; Kikuchi, Yoshiaki; Kusakabe, Osamu; Kiyomiya, Osamu; Yoneyama, Haruo; Kawakami, Taiji

    Steel pipe sheet pile foundations with large-diameter steel pipe sheet piles were used for the foundation of the main pier of the Tokyo Gate Bridge. However, for large-diameter steel pipe piles, the bearing mechanism, including the pile tip plugging effect, is still unclear due to a lack of practical examinations, even though loading tests were performed for the Trans-Tokyo Bay Highway. In light of the foregoing problems, static pile loading tests in both the vertical and horizontal directions, a dynamic loading test, and cone penetration tests were conducted to determine proper design parameters of the ground for the foundations. Design parameters were determined rationally based on the test results, and rational design verification was obtained from this research.

  2. The clinical impact of recent advances in LC-MS for cancer biomarker discovery and verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hui; Shi, Tujin; Qian, Wei-Jun

    2015-12-04

    Mass spectrometry-based proteomics has become an indispensable tool in biomedical research, with broad applications ranging from fundamental biology and systems biology to biomarker discovery. Recent advances in LC-MS have made it a major technology in clinical applications, especially in cancer biomarker discovery and verification. To overcome the challenges associated with the analysis of clinical samples, such as the extremely wide dynamic range of protein concentrations in biofluids and the need to perform high-throughput and accurate quantification, significant efforts have been devoted to improving the overall performance of LC-MS based clinical proteomics. In this review, we summarize the recent advances in LC-MS in the context of cancer biomarker discovery and quantification, and discuss its potential, limitations, and future perspectives.

  3. The clinical impact of recent advances in LC-MS for cancer biomarker discovery and verification.

    PubMed

    Wang, Hui; Shi, Tujin; Qian, Wei-Jun; Liu, Tao; Kagan, Jacob; Srivastava, Sudhir; Smith, Richard D; Rodland, Karin D; Camp, David G

    2016-01-01

    Mass spectrometry (MS) -based proteomics has become an indispensable tool with broad applications in systems biology and biomedical research. With recent advances in liquid chromatography (LC) and MS instrumentation, LC-MS is making increasingly significant contributions to clinical applications, especially in the area of cancer biomarker discovery and verification. To overcome challenges associated with analyses of clinical samples (for example, a wide dynamic range of protein concentrations in bodily fluids and the need to perform high throughput and accurate quantification of candidate biomarker proteins), significant efforts have been devoted to improve the overall performance of LC-MS-based clinical proteomics platforms. Reviewed here are the recent advances in LC-MS and its applications in cancer biomarker discovery and quantification, along with the potentials, limitations and future perspectives.

  4. Credit Card Fraud Detection: A Realistic Modeling and a Novel Learning Strategy.

    PubMed

    Dal Pozzolo, Andrea; Boracchi, Giacomo; Caelen, Olivier; Alippi, Cesare; Bontempi, Gianluca

    2017-09-14

    Detecting frauds in credit card transactions is perhaps one of the best testbeds for computational intelligence algorithms. In fact, this problem involves a number of relevant challenges, namely: concept drift (customers' habits evolve and fraudsters change their strategies over time), class imbalance (genuine transactions far outnumber frauds), and verification latency (only a small set of transactions are checked by investigators in a timely manner). However, the vast majority of learning algorithms that have been proposed for fraud detection rely on assumptions that hardly hold in a real-world fraud-detection system (FDS). This lack of realism concerns two main aspects: 1) the way and timing with which supervised information is provided and 2) the measures used to assess fraud-detection performance. This paper has three major contributions. First, we propose, with the help of our industrial partner, a formalization of the fraud-detection problem that realistically describes the operating conditions of FDSs that analyze massive streams of credit card transactions every day. We also illustrate the most appropriate performance measures to be used for fraud-detection purposes. Second, we design and assess a novel learning strategy that effectively addresses class imbalance, concept drift, and verification latency. Third, in our experiments, we demonstrate the impact of class imbalance and concept drift in a real-world data stream containing more than 75 million transactions, authorized over a time window of three years.
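
    A skeletal picture of training under verification latency may help (a hedged sketch, not the authors' actual learning strategy): labels for day t only become usable D days later, so each daily update trains on a sliding window of verified days plus the few immediately investigated alerts.

```python
from collections import deque

class PriorModel:
    """Toy 'model': tracks the fraud rate of its training data."""
    def fit(self, samples):
        labels = [label for _, label in samples]
        self.fraud_rate = sum(labels) / max(len(labels), 1)
        return self

def daily_update(model, history, window, latency_days, today, alerts_today):
    """Retrain on verified days (released with delay) plus today's alerts."""
    released = today - latency_days
    if released >= 0:
        window.append(history[released])     # labels just verified
    training = [tx for day in window for tx in day] + alerts_today
    return model.fit(training)

history = [[("tx", 0), ("tx", 0), ("tx", 1)] for _ in range(30)]  # 30 days
window = deque(maxlen=7)            # sliding window handles concept drift
model = PriorModel()
for today in range(30):
    alerts = [("tx", 1)]            # a few investigator-checked alerts
    model = daily_update(model, history, window, 7, today, alerts)
print(f"estimated fraud rate: {model.fraud_rate:.3f}")
```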

  5. Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration

    NASA Technical Reports Server (NTRS)

    Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.

    1993-01-01

    Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions are examined.

  6. Application of 3D Laser Scanning Technology in Inspection and Dynamic Reserves Detection of Open-Pit Mine

    NASA Astrophysics Data System (ADS)

    Hu, Zhumin; Wei, Shiyu; Jiang, Jun

    2017-10-01

    Traditional means of open-pit mine mining rights verification and dynamic reserve detection rely on total stations and RTK to collect the turning-point coordinates of mining surface contours. Limited by the accuracy of traditional measurement equipment and measurement methods, these means yield results of low precision and large error. Three-dimensional laser scanning technology can obtain three-dimensional coordinate data of the surface of the measured object over a large area at high resolution. This paper expounds the application of 3D scanning technology in the inspection and dynamic reserve detection of open-pit mine mining rights.
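
    As a concrete illustration of dynamic reserve detection from scan data (a minimal sketch with invented numbers, not the paper's workflow): grid two epochs of point clouds into digital elevation models on the same cells and integrate the height loss over the cell area.

```python
import numpy as np

def extracted_volume_m3(dem_before, dem_after, cell_m=0.5):
    """Volume removed between two gridded scans of the same pit surface."""
    dz = dem_before - dem_after          # positive where material was removed
    return float(np.clip(dz, 0.0, None).sum() * cell_m ** 2)

rng = np.random.default_rng(3)
before = 100.0 + rng.random((200, 200))  # elevations (m) on a 0.5 m grid
after = before.copy()
after[50:100, 50:100] -= 2.0             # a mined-out bench, 2 m deep
print(f"extracted volume ~ {extracted_volume_m3(before, after):.0f} m^3")
```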

  7. Verification of nonlinear dynamic structural test results by combined image processing and acoustic analysis

    NASA Astrophysics Data System (ADS)

    Tene, Yair; Tene, Noam; Tene, G.

    1993-08-01

    An interactive data fusion methodology of video, audio, and nonlinear structural dynamic analysis for potential application in forensic engineering is presented. The methodology was developed and successfully demonstrated in the analysis of a heavy transportable bridge collapse during preparation for testing. Multiple bridge element failures were identified after the collapse, including fracture, cracks, and rupture of high-performance structural materials. A videotape recording by a hand-held camcorder was the only source of information about the collapse sequence. The interactive data fusion methodology resulted in extracting relevant information from the videotape and from dynamic nonlinear structural analysis, leading to a full account of the sequence of events during the bridge collapse.

  8. Research on Equivalent Tests of Dynamics of On-orbit Soft Contact Technology Based on On-Orbit Experiment Data

    NASA Astrophysics Data System (ADS)

    Yang, F.; Dong, Z. H.; Ye, X.

    2018-05-01

    Currently, space robots have become a very important means of space on-orbit maintenance and support, and many countries are conducting extensive research and experiments on them. Because space operation attitudes are very complicated, it is difficult to model them in a research laboratory. This paper builds a complete equivalent experiment framework according to the requirements of the proposed space soft-contact technology. It also carries out verification of the flexible multi-body dynamics parameters of the on-orbit soft-contact mechanism, combining on-orbit experiment data, the soft-contact mechanism equivalent model, and a flexible multi-body dynamics equivalent model based on the Kane equation. The experiment results confirm the correctness of the on-orbit soft-contact flexible multi-body dynamics model.

  9. Kinematic measures for assessing gait stability in elderly individuals: a systematic review

    PubMed Central

    Hamacher, D.; Singh, N.B.; Van Dieën, J.H.; Heller, M.O.; Taylor, W.R.

    2011-01-01

    Falls not only present a considerable health threat, but the resulting treatment and loss of working days also place a heavy economic burden on society. Gait instability is a major fall risk factor, particularly in geriatric patients, and walking is one of the most frequent dynamic activities of daily living. To allow preventive strategies to become effective, it is therefore imperative to identify individuals with an unstable gait. Assessment of dynamic stability and gait variability via biomechanical measures of foot kinematics provides a viable option for quantitative evaluation of gait stability, but the ability of these methods to predict falls has generally not been assessed. Although various methods for assessing gait stability exist, their sensitivity and applicability in a clinical setting, as well as their cost-effectiveness, need verification. The objective of this systematic review was therefore to evaluate the sensitivity of biomechanical measures that quantify gait stability among elderly individuals and to evaluate the cost of measurement instrumentation required for application in a clinical setting. To assess gait stability, a comparative effect size (Cohen's d) analysis of variability and dynamic stability of foot trajectories during level walking was performed on 29 of an initial yield of 9889 articles from four electronic databases. The results of this survey demonstrate that linear variability of temporal measures of swing and stance was most capable of distinguishing between fallers and non-fallers, whereas step width and stride velocity prove more capable of discriminating between old versus young (OY) adults. In addition, while orbital stability measures (Floquet multipliers) applied to gait have been shown to distinguish between both elderly fallers and non-fallers as well as between young and old adults, local stability measures (λs) have been able to distinguish between young and old adults. Both linear and nonlinear measures of foot time series during gait seem to hold predictive ability in distinguishing healthy from fall-prone elderly adults. In conclusion, biomechanical measurements offer promise for identifying individuals at risk of falling and can be obtained with relatively low-cost tools. Incorporation of the most promising measures in combined retrospective and prospective studies for understanding fall risk and designing preventive strategies is warranted. PMID:21880615
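
    For the "linear variability of temporal measures" family highlighted above, the computation is short enough to sketch (illustrative only): stride times are the differences between consecutive heel-strike times, summarized by their standard deviation and coefficient of variation.

```python
import numpy as np

def stride_time_variability(heel_strike_times_s):
    """SD and coefficient of variation of stride times (s)."""
    stride_times = np.diff(np.asarray(heel_strike_times_s))
    sd = stride_times.std(ddof=1)
    return sd, sd / stride_times.mean()

# Synthetic heel-strike times: ~1.05 s strides with 20 ms jitter.
events = np.cumsum(1.05 + 0.02 * np.random.default_rng(4).standard_normal(60))
sd, cv = stride_time_variability(events)
print(f"stride-time SD = {sd * 1000:.1f} ms, CV = {cv:.2%}")
```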

  10. Kinematic measures for assessing gait stability in elderly individuals: a systematic review.

    PubMed

    Hamacher, D; Singh, N B; Van Dieën, J H; Heller, M O; Taylor, W R

    2011-12-07

    Falls not only present a considerable health threat, but the resulting treatment and loss of working days also place a heavy economic burden on society. Gait instability is a major fall risk factor, particularly in geriatric patients, and walking is one of the most frequent dynamic activities of daily living. To allow preventive strategies to become effective, it is therefore imperative to identify individuals with an unstable gait. Assessment of dynamic stability and gait variability via biomechanical measures of foot kinematics provides a viable option for quantitative evaluation of gait stability, but the ability of these methods to predict falls has generally not been assessed. Although various methods for assessing gait stability exist, their sensitivity and applicability in a clinical setting, as well as their cost-effectiveness, need verification. The objective of this systematic review was therefore to evaluate the sensitivity of biomechanical measures that quantify gait stability among elderly individuals and to evaluate the cost of measurement instrumentation required for application in a clinical setting. To assess gait stability, a comparative effect size (Cohen's d) analysis of variability and dynamic stability of foot trajectories during level walking was performed on 29 of an initial yield of 9889 articles from four electronic databases. The results of this survey demonstrate that linear variability of temporal measures of swing and stance was most capable of distinguishing between fallers and non-fallers, whereas step width and stride velocity prove more capable of discriminating between old versus young (OY) adults. In addition, while orbital stability measures (Floquet multipliers) applied to gait have been shown to distinguish between both elderly fallers and non-fallers as well as between young and old adults, local stability measures (λs) have been able to distinguish between young and old adults. Both linear and nonlinear measures of foot time series during gait seem to hold predictive ability in distinguishing healthy from fall-prone elderly adults. In conclusion, biomechanical measurements offer promise for identifying individuals at risk of falling and can be obtained with relatively low-cost tools. Incorporation of the most promising measures in combined retrospective and prospective studies for understanding fall risk and designing preventive strategies is warranted.

  11. Motion and Stability of Saturated Soil Systems under Dynamic Loading.

    DTIC Science & Technology

    1985-04-04

    [Report text garbled in extraction; only fragments are recoverable.] Recoverable table-of-contents fragments list sections on "Experimental Verification of Theories" and "Additional Comments and Other Work at the Ohio...". The recoverable abstract text indicates that the continuing research effort will extend and refine the theoretical/computational models and allow for compressibility of the soil; a correct theory of liquefaction should not assume incompressible motion of soil and water, and finite element methodologies have been developed accordingly.

  12. Geometric Verification of Dynamic Wave Arc Delivery With the Vero System Using Orthogonal X-ray Fluoroscopic Imaging.

    PubMed

    Burghelea, Manuela; Verellen, Dirk; Poels, Kenneth; Gevaert, Thierry; Depuydt, Tom; Tournel, Koen; Hung, Cecilia; Simon, Viorica; Hiraoka, Masahiro; de Ridder, Mark

    2015-07-15

    The purpose of this study was to define an independent verification method based on on-board orthogonal fluoroscopy to determine the geometric accuracy of synchronized gantry-ring (G/R) rotations during dynamic wave arc (DWA) delivery available on the Vero system. A verification method for DWA was developed to calculate O-ring-gantry (G/R) positional information from ball-bearing positions retrieved from fluoroscopic images of a cubic phantom acquired during DWA delivery. Different noncoplanar trajectories were generated in order to investigate the influence of path complexity on delivery accuracy. The G/R positions detected from the fluoroscopy images (DetPositions) were benchmarked against the G/R angulations retrieved from the control points (CP) of the DWA RT plan and the DWA log files recorded by the treatment console during DWA delivery (LogActed). The G/R rotational accuracy was quantified as the mean absolute deviation ± standard deviation. The maximum G/R absolute deviation was calculated as the maximum 3-dimensional distance between the CP and the closest DetPositions. In the CP versus DetPositions comparison, an overall mean G/R deviation of 0.13°/0.16° ± 0.16°/0.16° was obtained, with a maximum G/R deviation of 0.6°/0.2°. For the LogActed versus DetPositions evaluation, the overall mean deviation was 0.08°/0.15° ± 0.10°/0.10° with a maximum G/R of 0.3°/0.4°. The largest decoupled deviations registered for gantry and ring were 0.6° and 0.4° respectively. No directional dependence was observed between clockwise and counterclockwise rotations. Doubling the dose resulted in a double number of detected points around each CP, and an angular deviation reduction in all cases. An independent geometric quality assurance approach was developed for DWA delivery verification and was successfully applied on diverse trajectories. Results showed that the Vero system is capable of following complex G/R trajectories with maximum deviations during DWA below 0.6°. Copyright © 2015 Elsevier Inc. All rights reserved.
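
    The deviation statistics reported here reduce to a short computation. The sketch below (illustrative data; not the study's analysis code) compares each planned control point (gantry, ring) with the nearest detected position and reports the mean, standard deviation, and maximum 2D angular deviation.

```python
import numpy as np

def gr_deviations(planned_deg, detected_deg):
    """Per-control-point distance to the nearest detected (G, R) position."""
    planned = np.asarray(planned_deg, dtype=float)
    detected = np.asarray(detected_deg, dtype=float)
    d = np.linalg.norm(planned[:, None, :] - detected[None, :, :], axis=2)
    nearest = d.min(axis=1)
    return nearest.mean(), nearest.std(), nearest.max()

planned = [(0, 0), (30, 10), (60, 20), (90, 30)]                 # plan CPs
detected = [(0.1, -0.1), (29.8, 10.2), (60.3, 19.9), (90.2, 30.4)]
mean, std, mx = gr_deviations(planned, detected)
print(f"mean = {mean:.2f} deg, std = {std:.2f} deg, max = {mx:.2f} deg")
```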

  13. Verification and Validation Testing of the Parachute Decelerator System Prior to the First Supersonic Flight Dynamics Test for the Low Density Supersonic Decelerator Program

    NASA Technical Reports Server (NTRS)

    Gallon, John C.; Witkowski, Allen

    2015-01-01

    The Parachute Decelerator System (PDS) is comprised of all components associated with the supersonic parachute and its associated deployment. During the Supersonic Flight Dynamics Test (SFDT), for the Low Density Supersonic Decelerators Program, the PDS was required to deploy the supersonic parachute in a defined fashion. The PDS hardware includes three major subsystems that must function together. The first subsystem is the Parachute Deployment Device (PDD), which acts as a modified pilot deployment system. It is comprised of a pyrotechnic mortar, a Kevlar ballute, a lanyard actuated pyrotechnic inflation aid, and rigging with its associated thermal protection material (TPS). The second subsystem is the supersonic parachute deployment hardware. This includes all of the parachute specific rigging that includes the parachute stowage can and the rigging including TPS and bridle stiffeners for bridle management during deployment. The third subsystem is the Supersonic Parachute itself, which includes the main parachute and deployment bags. This paper summarizes the verification and validation of the deployment process, from the initialization of the PDS system through parachute bag strip that was done prior to the first SFDT.

  14. Dynamic Rupture Benchmarking of the ADER-DG Method

    NASA Astrophysics Data System (ADS)

    Gabriel, Alice; Pelties, Christian

    2013-04-01

    We will verify the arbitrary high-order derivative Discontinuous Galerkin (ADER-DG) method in various test cases of the 'SCEC/USGS Dynamic Earthquake Rupture Code Verification Exercise' benchmark suite (Harris et al. 2009). The ADER-DG scheme is able to solve the spontaneous rupture problem with high-order accuracy in space and time on three-dimensional unstructured tetrahedral meshes. Strong mesh coarsening or refinement at areas of interest can be applied to keep the computational costs feasible. Moreover, the method does not generate spurious high-frequency contributions in the slip rate spectra and therefore does not require any artificial damping as demonstrated in previous presentations and publications (Pelties et al. 2010 and 2012). We will show that the mentioned features hold also for more advanced setups as e.g. a branching fault system, heterogeneous background stresses and bimaterial faults. The advanced geometrical flexibility combined with an enhanced accuracy will make the ADER-DG method a useful tool to study earthquake dynamics on complex fault systems in realistic rheologies. References: Harris, R.A., M. Barall, R. Archuleta, B. Aagaard, J.-P. Ampuero, H. Bhat, V. Cruz-Atienza, L. Dalguer, P. Dawson, S. Day, B. Duan, E. Dunham, G. Ely, Y. Kaneko, Y. Kase, N. Lapusta, Y. Liu, S. Ma, D. Oglesby, K. Olsen, A. Pitarka, S. Song, and E. Templeton, The SCEC/USGS Dynamic Earthquake Rupture Code Verification Exercise, Seismological Research Letters, vol. 80, no. 1, pages 119-126, 2009 Pelties, C., J. de la Puente, and M. Kaeser, Dynamic Rupture Modeling in Three Dimensions on Unstructured Meshes Using a Discontinuous Galerkin Method, AGU 2010 Fall Meeting, abstract #S21C-2068 Pelties, C., J. de la Puente, J.-P. Ampuero, G. Brietzke, and M. Kaeser, Three-Dimensional Dynamic Rupture Simulation with a High-order Discontinuous Galerkin Method on Unstructured Tetrahedral Meshes, JGR. - Solid Earth, VOL. 117, B02309, 2012

  15. Dynamic Rupture Benchmarking of the ADER-DG Method

    NASA Astrophysics Data System (ADS)

    Pelties, C.; Gabriel, A.

    2012-12-01

    We will verify the arbitrary high-order derivative Discontinuous Galerkin (ADER-DG) method in various test cases of the 'SCEC/USGS Dynamic Earthquake Rupture Code Verification Exercise' benchmark suite (Harris et al. 2009). The ADER-DG scheme is able to solve the spontaneous rupture problem with high-order accuracy in space and time on three-dimensional unstructured tetrahedral meshes. Strong mesh coarsening or refinement at areas of interest can be applied to keep the computational costs feasible. Moreover, the method does not generate spurious high-frequency contributions in the slip rate spectra and therefore does not require any artificial damping as demonstrated in previous presentations and publications (Pelties et al. 2010 and 2012). We will show that the mentioned features hold also for more advanced setups as e.g. a branching fault system, heterogeneous background stresses and bimaterial faults. The advanced geometrical flexibility combined with an enhanced accuracy will make the ADER-DG method a useful tool to study earthquake dynamics on complex fault systems in realistic rheologies. References: Harris, R.A., M. Barall, R. Archuleta, B. Aagaard, J.-P. Ampuero, H. Bhat, V. Cruz-Atienza, L. Dalguer, P. Dawson, S. Day, B. Duan, E. Dunham, G. Ely, Y. Kaneko, Y. Kase, N. Lapusta, Y. Liu, S. Ma, D. Oglesby, K. Olsen, A. Pitarka, S. Song, and E. Templeton, The SCEC/USGS Dynamic Earthquake Rupture Code Verification Exercise, Seismological Research Letters, vol. 80, no. 1, pages 119-126, 2009 Pelties, C., J. de la Puente, and M. Kaeser, Dynamic Rupture Modeling in Three Dimensions on Unstructured Meshes Using a Discontinuous Galerkin Method, AGU 2010 Fall Meeting, abstract #S21C-2068 Pelties, C., J. de la Puente, J.-P. Ampuero, G. Brietzke, and M. Kaeser, Three-Dimensional Dynamic Rupture Simulation with a High-order Discontinuous Galerkin Method on Unstructured Tetrahedral Meshes, JGR. - Solid Earth, VOL. 117, B02309, 2012

  16. Exploring Formalized Elite Coach Mentoring Programmes in the UK: 'We've Had to Play the Game'

    ERIC Educational Resources Information Center

    Sawiuk, Rebecca; Taylor, William G.; Groom, Ryan

    2018-01-01

    Formalized mentoring programmes have been implemented increasingly by UK sporting institutions as a central coach development tool, yet claims supporting formal mentoring as an effective learning strategy are often speculative, scarce, ill-defined and accepted without verification. The aim of this study, therefore, was to explore some of the…

  17. Frame synchronization methods based on channel symbol measurements

    NASA Technical Reports Server (NTRS)

    Dolinar, S.; Cheung, K.-M.

    1989-01-01

    The current DSN frame synchronization procedure is based on monitoring the decoded bit stream for the appearance of a sync marker sequence that is transmitted once every data frame. The possibility of obtaining frame synchronization by processing the raw received channel symbols rather than the decoded bits is explored. Performance results are derived for three channel symbol sync methods, and these are compared with results for decoded bit sync methods reported elsewhere. It is shown that each class of methods has advantages or disadvantages under different assumptions on the frame length, the global acquisition strategy, and the desired measure of acquisition timeliness. It is shown that the sync statistics based on decoded bits are superior to the statistics based on channel symbols, if the desired operating region utilizes a probability of miss many orders of magnitude higher than the probability of false alarm. This operating point is applicable for very large frame lengths and minimal frame-to-frame verification strategy. On the other hand, the statistics based on channel symbols are superior if the desired operating point has a miss probability only a few orders of magnitude greater than the false alarm probability. This happens for small frames or when frame-to-frame verifications are required.
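
    The channel-symbol/decoded-bit trade-off turns on a simple detector. The sketch below (hard-decision bits and an invented frame size; purely illustrative, not the DSN implementation) slides the sync marker along the stream and declares sync wherever the number of mismatches stays within a threshold; raising the threshold trades false alarms against misses.

```python
import numpy as np

def sync_offsets(bits, marker, max_mismatches=2):
    """Offsets where the marker matches to within max_mismatches bits."""
    m = len(marker)
    return [i for i in range(len(bits) - m + 1)
            if np.count_nonzero(bits[i:i + m] != marker) <= max_mismatches]

rng = np.random.default_rng(5)
marker = rng.integers(0, 2, 32)                       # 32-bit sync marker
frame = np.concatenate([marker, rng.integers(0, 2, 480)])
stream = np.concatenate([frame, frame])               # two frames
stream[3] ^= 1                                        # one channel error
print("declared sync at bit offsets:", sync_offsets(stream, marker))
```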

  18. Recruitment Strategies of Methamphetamine-Using Men Who Have Sex with Men into an Online Survey

    PubMed Central

    Wilkerson, J. Michael; Shenk, Jared E.; Grey, Jeremy A.; Simon Rosser, B. R.; Noor, Syed W.

    2014-01-01

    Recruiting hidden populations into online research remains challenging. In this manuscript, we report lessons learned from our efforts to recruit methamphetamine-using men who have sex with men. Between July and October 2012, we implemented a four-phase recruitment strategy to enroll a total of 343 methamphetamine-using MSM into an online survey about recent substance use, sexual behavior, and various psychosocial measures. The four phases were implemented sequentially. During phase one, we placed advertisements on mobile applications, and during phase two, we placed advertisements on traditional websites formatted for browsers. During phase three, we used e-mail to initiate snowball recruitment, and during phase four, we used social media for snowball recruitment. Advertisements on mobile devices and websites formatted for browsers proved to be expensive options and resulted in few eligible participants. Our attempts to initiate a snowball through e-mail also proved unsuccessful. The majority (n=320) of observations in our final dataset came from our use of social media. However, participant fraud was a concern, requiring us to implement a strong participant verification protocol. For maximum recruitment and cost-effectiveness, researchers should use social media for recruitment provided they employ strong participant verification protocols. PMID:25642143

  19. Recruitment Strategies of Methamphetamine-Using Men Who Have Sex with Men into an Online Survey.

    PubMed

    Wilkerson, J Michael; Shenk, Jared E; Grey, Jeremy A; Simon Rosser, B R; Noor, Syed W

    Recruiting hidden populations into online research remains challenging. In this manuscript, we report lessons learned from our efforts to recruit methamphetamine-using men who have sex with men. Between July and October 2012, we implemented a four-phase recruitment strategy to enroll a total of 343 methamphetamine-using MSM into an online survey about recent substance use, sexual behavior, and various psychosocial measures. The four phases were implemented sequentially. During phase one, we placed advertisements on mobile applications, and during phase two, we placed advertisements on traditional websites formatted for browsers. During phase three, we used e-mail to initiate snowball recruitment, and during phase four, we used social media for snowball recruitment. Advertisements on mobile devices and websites formatted for browsers proved to be expensive options and resulted in few eligible participants. Our attempts to initiate a snowball through e-mail also proved unsuccessful. The majority (n=320) of observations in our final dataset came from our use of social media. However, participant fraud was a concern, requiring us to implement a strong participant verification protocol. For maximum recruitment and cost-effectiveness, researchers should use social media for recruitment provided they employ strong participant verification protocols.

  20. Control structural interaction testbed: A model for multiple flexible body verification

    NASA Technical Reports Server (NTRS)

    Chory, M. A.; Cohen, A. L.; Manning, R. A.; Narigon, M. L.; Spector, V. A.

    1993-01-01

    Conventional end-to-end ground tests for verification of control system performance become increasingly complicated with the development of large, multiple flexible body spacecraft structures. The expense of accurately reproducing the on-orbit dynamic environment and the attendant difficulties in reducing and accounting for ground test effects limits the value of these tests. TRW has developed a building block approach whereby a combination of analysis, simulation, and test has replaced end-to-end performance verification by ground test. Tests are performed at the component, subsystem, and system level on engineering testbeds. These tests are aimed at authenticating models to be used in end-to-end performance verification simulations: component and subassembly engineering tests and analyses establish models and critical parameters, unit level engineering and acceptance tests refine models, and subsystem level tests confirm the models' overall behavior. The Precision Control of Agile Spacecraft (PCAS) project has developed a control structural interaction testbed with a multibody flexible structure to investigate new methods of precision control. This testbed is a model for TRW's approach to verifying control system performance. This approach has several advantages: (1) no allocation for test measurement errors is required, increasing flight hardware design allocations; (2) the approach permits greater latitude in investigating off-nominal conditions and parametric sensitivities; and (3) the simulation approach is cost effective, because the investment is in understanding the root behavior of the flight hardware and not in the ground test equipment and environment.

  1. Solar array flight dynamic experiment

    NASA Technical Reports Server (NTRS)

    Schock, R. W.

    1986-01-01

    The purpose of the Solar Array Flight Dynamic Experiment (SAFDE) is to demonstrate the feasibility of on-orbit measurement and ground processing of large space structures' dynamic characteristics. Test definition or verification provides the dynamic characteristic accuracy required for control systems use. An illumination/measurement system was developed to fly on space shuttle flight STS-41D. The system was designed to dynamically evaluate a large solar array called the Solar Array Flight Experiment (SAFE) that had been scheduled for this flight. The SAFDE system consisted of a set of laser diode illuminators, retroreflective targets, an intelligent star tracker receiver and the associated equipment to power, condition, and record the results. In six tests on STS-41D, data was successfully acquired from 18 retroreflector targets and ground processed, post flight, to define the solar array's dynamic characteristic. The flight experiment proved the viability of on-orbit test definition of large space structures' dynamic characteristics. Future large space structures controllability should be greatly enhanced by this capability.

  2. Solar array flight dynamic experiment

    NASA Technical Reports Server (NTRS)

    Schock, Richard W.

    1986-01-01

    The purpose of the Solar Array Flight Dynamic Experiment (SAFDE) is to demonstrate the feasibility of on-orbit measurement and ground processing of large space structures' dynamic characteristics. Test definition or verification provides the dynamic characteristic accuracy required for control systems use. An illumination/measurement system was developed to fly on Space Shuttle flight STS-41D. The system was designed to dynamically evaluate a large solar array called the Solar Array Flight Experiment (SAFE) that had been scheduled for this flight. The SAFDE system consisted of a set of laser diode illuminators, retroreflective targets, an intelligent star tracker receiver and the associated equipment to power, condition, and record the results. In six tests on STS-41D, data was successfully acquired from 18 retroreflector targets and ground processed, post flight, to define the solar array's dynamic characteristic. The flight experiment proved the viability of on-orbit test definition of large space structures' dynamic characteristics. Future large space structures controllability should be greatly enhanced by this capability.

  3. Solar array flight dynamic experiment

    NASA Technical Reports Server (NTRS)

    Schock, Richard W.

    1987-01-01

    The purpose of the Solar Array Flight Dynamic Experiment (SAFDE) is to demonstrate the feasibility of on-orbit measurement and ground processing of large space structures' dynamic characteristics. Test definition or verification provides the dynamic characteristic accuracy required for control systems use. An illumination/measurement system was developed to fly on space shuttle flight STS-41D. The system was designed to dynamically evaluate a large solar array called the Solar Array Flight Experiment (SAFE) that had been scheduled for this flight. The SAFDE system consisted of a set of laser diode illuminators, retroreflective targets, an intelligent star tracker receiver and the associated equipment to power, condition, and record the results. In six tests on STS-41D, data was successfully acquired from 18 retroreflector targets and ground processed, post flight, to define the solar array's dynamic characteristic. The flight experiment proved the viability of on-orbit test definition of large space structures dynamic characteristics. Future large space structures controllability should be greatly enhanced by this capability.

  4. Air traffic surveillance and control using hybrid estimation and protocol-based conflict resolution

    NASA Astrophysics Data System (ADS)

    Hwang, Inseok

    The continued growth of air travel and recent advances in new technologies for navigation, surveillance, and communication have led to proposals by the Federal Aviation Administration (FAA) to provide reliable and efficient tools to aid Air Traffic Control (ATC) in performing their tasks. In this dissertation, we address four problems frequently encountered in air traffic surveillance and control: multiple target tracking and identity management, conflict detection, conflict resolution, and safety verification. We develop a set of algorithms and tools to aid ATC; these algorithms have the provable properties of safety, computational efficiency, and convergence. Firstly, we develop a multiple-maneuvering-target tracking and identity management algorithm which can keep track of maneuvering aircraft in noisy environments and of their identities. Secondly, we propose a hybrid probabilistic conflict detection algorithm between multiple aircraft which uses flight mode estimates as well as aircraft current state estimates. Our algorithm is based on hybrid models of aircraft, which incorporate both continuous dynamics and discrete mode switching. Thirdly, we develop an algorithm for multiple (greater than two) aircraft conflict avoidance that is based on a closed-form analytic solution and thus provides guarantees of safety. Finally, we consider the problem of safety verification of control laws for safety critical systems, with application to air traffic control systems. We approach safety verification through reachability analysis, which is a computationally expensive problem. We develop an over-approximate method for reachable set computation using polytopic approximation methods and dynamic optimization. These algorithms may be used either in a fully autonomous way, or as supporting tools to increase controllers' situational awareness and to reduce their work load.
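
    The probabilistic conflict detection idea can be sketched with a Monte Carlo check on Gaussian state estimates; the dynamics, covariances, and the 5 nmi separation standard below are illustrative assumptions, not the dissertation's hybrid algorithm.

        import numpy as np

        rng = np.random.default_rng(1)

        def conflict_probability(x1, P1, v1, x2, P2, v2, horizon=300.0, dt=10.0,
                                 sep=9260.0, n=5000):
            # Monte Carlo estimate of the probability that the horizontal
            # separation drops below `sep` (5 nmi in metres) within `horizon` s.
            s1 = rng.multivariate_normal(x1, P1, n)
            s2 = rng.multivariate_normal(x2, P2, n)
            conflict = np.zeros(n, dtype=bool)
            for t in np.arange(0.0, horizon + dt, dt):
                d = np.linalg.norm((s1 + v1 * t) - (s2 + v2 * t), axis=1)
                conflict |= d < sep
            return conflict.mean()

        P = np.diag([2000.0**2, 2000.0**2])          # assumed position covariance
        p = conflict_probability(np.array([0.0, 0.0]), P, np.array([220.0, 0.0]),
                                 np.array([60000.0, 12000.0]), P,
                                 np.array([-220.0, 0.0]))
        print(f"estimated conflict probability: {p:.3f}")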

  5. Mapping {sup 15}O Production Rate for Proton Therapy Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grogg, Kira; Alpert, Nathaniel M.; Zhu, Xuping

    Purpose: This work was a proof-of-principle study for the evaluation of oxygen-15 ({sup 15}O) production as an imaging target through the use of positron emission tomography (PET), to improve verification of proton treatment plans and to study the effects of perfusion. Methods and Materials: Dynamic PET measurements of irradiation-produced isotopes were made for a phantom and rabbit thigh muscles. The rabbit muscle was irradiated and imaged under both live and dead conditions. A differential equation was fitted to phantom and in vivo data, yielding estimates of {sup 15}O production and clearance rates, which were compared to live versus dead rates for the rabbit and to Monte Carlo predictions. Results: PET clearance rates agreed with decay constants of the dominant radionuclide species in 3 different phantom materials. In 2 oxygen-rich materials, the ratio of {sup 15}O production rates agreed with the expected ratio. In the dead rabbit thighs, the dynamic PET concentration histories were accurately described using the {sup 15}O decay constant, whereas the live thigh activity decayed faster. Most importantly, the {sup 15}O production rates agreed within 2% (P>.5) between conditions. Conclusions: We developed a new method for quantitative measurement of {sup 15}O production and clearance rates in the period immediately following proton therapy. Measurements in the phantom and rabbits were well described in terms of {sup 15}O production and clearance rates, plus a correction for other isotopes. These proof-of-principle results support the feasibility of detailed verification of proton therapy treatment delivery. In addition, {sup 15}O clearance rates may be useful in monitoring permeability changes due to therapy.
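
    A hedged sketch of the model-fitting step (not the authors' code): during the decay phase the activity follows C(t) = C0*exp(-(lambda + k)t), where lambda is the 15O decay constant and k a biological clearance rate, so k can be fitted from dynamic PET samples; all numbers below are invented.

        import numpy as np
        from scipy.optimize import curve_fit

        LAMBDA_O15 = np.log(2.0) / 122.24          # 15O decay constant, 1/s

        def beam_off_model(t, c0, k_bio):
            # Activity after irradiation: radioactive decay plus clearance.
            return c0 * np.exp(-(LAMBDA_O15 + k_bio) * t)

        # Hypothetical decay-phase PET samples (time in s, activity in a.u.).
        t = np.linspace(0.0, 600.0, 31)
        rng = np.random.default_rng(2)
        y = beam_off_model(t, 100.0, 2.0e-3) + rng.normal(0.0, 1.0, t.size)

        (c0, k_bio), _ = curve_fit(beam_off_model, t, y, p0=(80.0, 1.0e-3))
        print(f"fitted C0 = {c0:.1f}, biological clearance = {k_bio:.2e} 1/s")
        # Dead tissue should give k_bio ~ 0 (pure 15O decay); perfused live
        # tissue clears faster, which is the contrast the study exploits.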

  6. NASTRAN analysis of the 1/8-scale space shuttle dynamic model

    NASA Technical Reports Server (NTRS)

    Bernstein, M.; Mason, P. W.; Zalesak, J.; Gregory, D. J.; Levy, A.

    1973-01-01

    The space shuttle configuration has more complex structural dynamic characteristics than previous launch vehicles, primarily because of the high modal density at low frequencies and the high degree of coupling between the lateral and longitudinal motions. An accurate analytical representation of these characteristics is a primary means for treating structural dynamics problems during the design phase of the shuttle program. The 1/8-scale model program was developed to explore the adequacy of available analytical modeling technology and to provide the means for investigating problems which are more readily treated experimentally. The basic objectives of the 1/8-scale model program are: (1) to provide early verification of analytical modeling procedures on a shuttle-like structure, (2) to demonstrate important vehicle dynamic characteristics of a typical shuttle design, (3) to disclose any previously unanticipated structural dynamic characteristics, and (4) to provide for development and demonstration of cost effective prototype testing procedures.

  7. Space Shuttle Tail Service Mast Concept Verification

    NASA Technical Reports Server (NTRS)

    Uda, R. T.

    1976-01-01

    Design studies and analyses were performed to describe the loads and dynamics of the space shuttle tail service masts (TSMs). Of particular interest are the motion and interaction of the umbilical carrier plate, lanyard system, vacuum jacketed hoses, latches, links, and masthead. A development test rig was designed and fabricated to obtain experimental data. The test program is designed to (1) verify the theoretical dynamics calculations, (2) prove the soundness of design concepts, and (3) elucidate problem areas (if any) in the design of mechanisms and structural components. Design, fabrication, and initiation of TSM development testing at Kennedy Space Center are described.

  8. Simulations of space charge neutralization in a magnetized electron cooler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerity, James; McIntyre, Peter M.; Bruhwiler, David Leslie

    Magnetized electron cooling at relativistic energies and ampere-scale current is essential to achieve the proposed ion luminosities in a future electron-ion collider (EIC). Neutralization of the space charge in such a cooler can significantly increase the magnetized dynamic friction and, hence, the cooling rate. The Warp framework is being used to simulate magnetized electron beam dynamics during and after the build-up of neutralizing ions, via ionization of residual gas in the cooler. The design follows previous experiments at Fermilab as a verification case. We also discuss the relevance to EIC designs.

  9. Hyperplex-MRM: a hybrid multiple reaction monitoring method using mTRAQ/iTRAQ labeling for multiplex absolute quantification of human colorectal cancer biomarker.

    PubMed

    Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie

    2013-09-06

    Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefiting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in general MRM analysis, only one sample can be quantified in a single experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, which combined mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8, respectively, as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and MS/MS spectrum, total amounts and relative ratios of target proteins/peptides of four samples could be acquired simultaneously. Accordingly, absolute amounts of target proteins/peptides in four different samples could be achieved in a single run. In addition, double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. To summarize, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.
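
    As a back-of-the-envelope illustration of how the double mTRAQ references and the 4-plex iTRAQ reporters combine, the sketch below derives per-sample absolute amounts from a total anchored by the internal standards; all numbers are invented for illustration and are not from the study.

        std_amount_fmol = 50.0           # assumed amount of each mTRAQ spike
        ratio_total_to_std = 2.4         # assumed MRM ratio: pooled samples
                                         # vs. mean of the D0/D8 references
        itraq_reporters = {"114": 1.0, "115": 1.6, "116": 0.7, "117": 0.5}

        # Total peptide amount anchored by the double internal standards ...
        total_fmol = ratio_total_to_std * std_amount_fmol
        # ... split across the four samples by their iTRAQ reporter ratios.
        denom = sum(itraq_reporters.values())
        for channel, r in itraq_reporters.items():
            print(f"sample {channel}: {total_fmol * r / denom:.1f} fmol")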

  10. Development of a database for the verification of trans-ionospheric remote sensing systems

    NASA Astrophysics Data System (ADS)

    Leitinger, R.

    2005-08-01

    Remote sensing systems need verification by means of in-situ data or by means of model data. In the case of ionospheric occultation inversion, ionosphere tomography and other imaging methods on the basis of satellite-to-ground or satellite-to-satellite electron content, the availability of in-situ data with adequate spatial and temporal co-location is a very rare case, indeed. Therefore the method of choice for verification is to produce artificial electron content data with realistic properties, subject these data to the inversion/retrieval method, compare the results with model data and apply a suitable type of “goodness of fit” classification. Inter-comparison of inversion/retrieval methods should be done with sets of artificial electron contents in a “blind” (or even “double blind”) way. The setup of a relevant database for the COST 271 Action is described. One part of the database will be made available to everyone interested in testing of inversion/retrieval methods. The artificial electron content data are calculated by means of large-scale models that are “modulated” in a realistic way to include smaller scale and dynamic structures, like troughs and traveling ionospheric disturbances.
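
    A minimal sketch of the database-generation idea, assuming a Chapman-like background profile modulated by a simple sinusoidal traveling ionospheric disturbance (TID); the scales and amplitudes are placeholders, not the COST 271 models.

        import numpy as np

        def electron_density(h_km, lat_deg, t_s):
            # Chapman-like background profile modulated by a traveling wave.
            hmax, scale = 300.0, 60.0
            z = (h_km - hmax) / scale
            background = 1.0e12 * np.exp(1.0 - z - np.exp(-z))      # el/m^3
            tid = 1.0 + 0.05 * np.sin(2 * np.pi * (lat_deg / 5.0 - t_s / 1800.0))
            return background * tid

        # Vertical electron content at one latitude and epoch, by integration.
        heights = np.linspace(100.0, 1000.0, 500)
        tec = np.trapz(electron_density(heights, 45.0, 0.0), heights * 1.0e3)
        print(f"modelled vertical TEC: {tec / 1e16:.1f} TECU")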

  11. Nontargeted metabolomic analysis and "commercial-homophyletic" comparison-induced biomarkers verification for the systematic chemical differentiation of five different parts of Panax ginseng.

    PubMed

    Qiu, Shi; Yang, Wen-Zhi; Yao, Chang-Liang; Qiu, Zhi-Dong; Shi, Xiao-Jian; Zhang, Jing-Xian; Hou, Jin-Jun; Wang, Qiu-Rong; Wu, Wan-Ying; Guo, De-An

    2016-07-01

    A key segment in the authentication of herbal medicines is the establishment of robust biomarkers that embody the intrinsic metabolite differences independent of the growing environment or processing techniques. We present a strategy based on nontargeted metabolomics and "commercial-homophyletic" comparison-induced biomarker verification with new bioinformatic vehicles, to improve the efficiency and reliability of herbal medicine authentication. The chemical differentiation of five different parts (root, leaf, flower bud, berry, and seed) of Panax ginseng was illustrated as a case study. First, an optimized ultra-performance liquid chromatography/quadrupole time-of-flight-MS(E) (UPLC/QTOF-MS(E)) approach was established for global metabolites profiling. Second, UNIFI™ combined with search of an in-house library was employed to automatically characterize the metabolites. Third, pattern recognition multivariate statistical analyses of the MS(E) data of different parts of commercial and homophyletic samples were separately performed to explore potential biomarkers. Fourth, potential biomarkers deduced from commercial and homophyletic root and leaf samples were cross-compared to infer robust biomarkers. Fifth, discriminating models by artificial neural network (ANN) were established to identify different parts of P. ginseng. Consequently, 164 compounds were characterized, and 11 robust biomarkers enabling the differentiation among root, leaf, flower bud, and berry were discovered by removing those structurally unstable and possibly processing-related ones. The ANN models using the robust biomarkers managed to exactly discriminate the four different parts and root adulterated with leaf as well. Conclusively, biomarker verification using homophyletic samples is conducive to the discovery of robust biomarkers. The integrated strategy facilitates the authentication of herbal medicines in a more efficient and more intelligent manner. Copyright © 2016 Elsevier B.V. All rights reserved.
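
    A hedged sketch of the final discrimination step only: a small neural-network classifier trained on synthetic 11-marker intensity profiles standing in for the robust biomarkers; it is not the authors' model or data.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(5)
        parts = ["root", "leaf", "flower bud", "berry"]
        centers = rng.uniform(0.0, 10.0, size=(4, 11))   # one profile per part

        # 30 noisy training samples per plant part.
        X = np.vstack([c + rng.normal(0.0, 0.8, (30, 11)) for c in centers])
        y = np.repeat(parts, 30)

        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=3000,
                            random_state=0)
        clf.fit(X, y)
        sample = centers[1] + rng.normal(0.0, 0.8, 11)   # an unknown leaf sample
        print("predicted part:", clf.predict([sample])[0])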

  12. Knowledge-based verification of clinical guidelines by detection of anomalies.

    PubMed

    Duftschmid, G; Miksch, S

    2001-04-01

    As shown in numerous studies, a significant part of published clinical guidelines is tainted with different types of semantic errors that interfere with their practical application. The adaptation of generic guidelines, necessitated by circumstances such as resource limitations within the applying organization or unexpected events arising in the course of patient care, further promotes the introduction of defects. Still, most current approaches for the automation of clinical guidelines lack mechanisms that check the overall correctness of their output. In the domain of software engineering in general, and in the domain of knowledge-based systems (KBS) in particular, a common strategy for examining a system for potential defects is verification. The focus of this work is to present an approach that helps ensure the semantic correctness of clinical guidelines in a three-step process. We use a particular guideline specification language called Asbru to demonstrate our verification mechanism. A scenario-based evaluation of our method is provided based on a guideline for the artificial ventilation of newborn infants. The described approach is kept sufficiently general to allow its application to several other guideline representation formats.

  13. Fluctuation-driven price dynamics and investment strategies

    PubMed Central

    Li, Yan; Zheng, Bo; Chen, Ting-Ting; Jiang, Xiong-Fei

    2017-01-01

    Investigation of the driving mechanism of price dynamics in complex financial systems is important and challenging. In this paper, we propose an investment strategy to study how dynamic fluctuations drive price movements. The strategy is successfully applied to different stock markets in the world, and the result indicates that the driving effect of the dynamic fluctuations is rather robust. We investigate how the strategy performance is influenced by the market states and optimize the strategy performance by introducing two parameters. The strategy is also compared with several typical technical trading rules. Our findings not only provide an investment strategy which improves investors’ profits, but also offer a useful method to look into the dynamic properties of complex financial systems. PMID:29240783
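
    A hedged sketch of a fluctuation-driven rule in the spirit of the abstract (not the authors' exact strategy): a position is opened when the latest return is large relative to the local volatility, with the window and trigger playing the role of the two tuning parameters.

        import numpy as np

        def fluctuation_strategy(prices, w=20, q=1.5):
            # Daily positions in {-1, 0, +1} from a volatility trigger.
            r = np.diff(np.log(prices))
            pos = np.zeros(r.size)
            for t in range(w, r.size):
                sigma = r[t - w:t].std()
                if abs(r[t]) > q * sigma:      # large fluctuation observed
                    pos[t] = np.sign(r[t])     # follow the move the next day
            return pos

        rng = np.random.default_rng(3)
        prices = 100.0 * np.exp(np.cumsum(rng.normal(0.0, 0.01, 1000)))
        pos = fluctuation_strategy(prices)
        # pos[t] is decided from returns up to r[t] and applied to r[t+1].
        pnl = (pos[:-1] * np.diff(np.log(prices))[1:]).sum()
        print(f"cumulative log-return of strategy: {pnl:.4f}")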

  14. Fluctuation-driven price dynamics and investment strategies.

    PubMed

    Li, Yan; Zheng, Bo; Chen, Ting-Ting; Jiang, Xiong-Fei

    2017-01-01

    Investigation of the driving mechanism of price dynamics in complex financial systems is important and challenging. In this paper, we propose an investment strategy to study how dynamic fluctuations drive price movements. The strategy is successfully applied to different stock markets in the world, and the result indicates that the driving effect of the dynamic fluctuations is rather robust. We investigate how the strategy performance is influenced by the market states and optimize the strategy performance by introducing two parameters. The strategy is also compared with several typical technical trading rules. Our findings not only provide an investment strategy which improves investors' profits, but also offer a useful method to look into the dynamic properties of complex financial systems.

  15. Strategy for 90% autoverification of clinical chemistry and immunoassay test results using six sigma process improvement.

    PubMed

    Randell, Edward W; Short, Garry; Lee, Natasha; Beresford, Allison; Spencer, Margaret; Kennell, Marina; Moores, Zoë; Parry, David

    2018-06-01

    Six Sigma involves a structured process improvement strategy that places processes on a pathway to continued improvement. The data presented here summarize a project that took three clinical laboratories from autoverification processes that allowed about 40% to 60% of tests to be auto-verified to more than 90% of tests and samples auto-verified. The project schedule, metrics and targets, a description of the previous system, and detailed information on the changes made to achieve greater than 90% auto-verification are presented for this Six Sigma DMAIC (Define, Measure, Analyze, Improve, Control) process improvement project.
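
    A minimal sketch of the kind of rule set such autoverification projects layer (reference ranges, critical limits, delta checks); the analytes and limits below are illustrative assumptions, not the laboratories' actual rules.

        def autoverify(test, value, previous=None):
            # Return True if the result can be released without human review.
            limits = {                # (low, high, critical_low, critical_high)
                "sodium":    (135.0, 145.0, 120.0, 160.0),
                "potassium": (3.5, 5.1, 2.8, 6.2),
            }
            lo, hi, crit_lo, crit_hi = limits[test]
            if not (crit_lo <= value <= crit_hi):
                return False                   # critical value: manual review
            if previous is not None and abs(value - previous) > 0.25 * previous:
                return False                   # delta check failed
            return True                        # in range or mildly abnormal

        print(autoverify("potassium", 4.2, previous=4.0))   # True: release
        print(autoverify("potassium", 6.8))                 # False: critical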

  16. Verification of intensity modulated radiation therapy beams using a tissue equivalent plastic scintillator dosimetry system

    NASA Astrophysics Data System (ADS)

    Petric, Martin Peter

    This thesis describes the development and implementation of a novel method for the dosimetric verification of intensity modulated radiation therapy (IMRT) fields with several advantages over current techniques. Through the use of a tissue equivalent plastic scintillator sheet viewed by a charge-coupled device (CCD) camera, this method provides a truly tissue equivalent dosimetry system capable of efficiently and accurately performing field-by-field verification of IMRT plans. This work was motivated by an initial study comparing two IMRT treatment planning systems. The clinical functionality of BrainLAB's BrainSCAN and Varian's Helios IMRT treatment planning systems was compared in terms of implementation and commissioning, dose optimization, and plan assessment. Implementation and commissioning revealed differences in the beam data required to characterize the beam prior to use, with the BrainSCAN system requiring higher resolution data compared to Helios. This difference was found to impact on the ability of the systems to accurately calculate dose for highly modulated fields, with BrainSCAN being more successful than Helios. The dose optimization and plan assessment comparisons revealed that while both systems use considerably different optimization algorithms and user-control interfaces, they are both capable of producing substantially equivalent dose plans. The extensive use of dosimetric verification techniques in the IMRT treatment planning comparison study motivated the development and implementation of a novel IMRT dosimetric verification system. The system consists of a water-filled phantom with a tissue equivalent plastic scintillator sheet built into the top surface. Scintillation light is reflected by a plastic mirror within the phantom towards a viewing window where it is captured using a CCD camera. Optical photon spread is removed using a micro-louvre optical collimator and by deconvolving a glare kernel from the raw images. Characterization of this new dosimetric verification system indicates excellent dose response and spatial linearity, high spatial resolution, and good signal uniformity and reproducibility. Dosimetric results from square fields, dynamic wedged fields, and a 7-field head and neck IMRT treatment plan indicate good agreement with film dosimetry distributions. Efficiency analysis of the system reveals a 50% reduction in time requirements for field-by-field verification of a 7-field IMRT treatment plan compared to film dosimetry.
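
    The glare-removal step lends itself to a short illustration: a regularized frequency-domain deconvolution of a measured kernel from the raw CCD image. The Gaussian kernel and the regularization constant below are assumptions, not the thesis's measured glare kernel.

        import numpy as np

        def deconvolve_glare(image, kernel, eps=1e-3):
            # Frequency-domain deconvolution with Tikhonov regularization.
            K = np.fft.rfft2(kernel, s=image.shape)
            I = np.fft.rfft2(image)
            return np.fft.irfft2(I * np.conj(K) / (np.abs(K)**2 + eps),
                                 s=image.shape)

        # Synthetic test: blur a point "dose spot" with the kernel, recover it.
        n = 128
        y, x = np.mgrid[:n, :n]
        kernel = np.exp(-((x - 2.0)**2 + (y - 2.0)**2) / (2 * 3.0**2))
        kernel /= kernel.sum()
        truth = np.zeros((n, n))
        truth[64, 64] = 1.0
        blurred = np.fft.irfft2(np.fft.rfft2(truth)
                                * np.fft.rfft2(kernel, s=truth.shape),
                                s=truth.shape)
        restored = deconvolve_glare(blurred, kernel)
        print("restored peak near (64, 64):",
              np.unravel_index(restored.argmax(), restored.shape))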

  17. Analysis of features of hydrodynamics and heat transfer in the fuel assembly of prospective sodium reactor with a high rate of reproduction in the uranium-plutonium fuel cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lubina, A. S., E-mail: lubina-as@nrcki.ru; Subbotin, A. S.; Sedov, A. A.

    2016-12-15

    The fast sodium reactor fuel assembly (FA) with U–Pu–Zr metallic fuel is described. In comparison with a “classical” fast reactor, this FA contains thin fuel rods and a wider fuel rod grid. Studies of the fluid dynamics and the heat transfer were carried out for this new FA design. The ANSYS CFX code was verified for determining the velocity, pressure, and temperature fields in the different channels. The calculations in the cells and in the FA were carried out using the shear stress transport (SST) model selected at the verification stage. The results of the hydrodynamics and heat transfer calculations have been analyzed.

  18. Open-Source Software in Computational Research: A Case Study

    DOE PAGES

    Syamlal, Madhava; O'Brien, Thomas J.; Benyahia, Sofiane; ...

    2008-01-01

    A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; the facilitation of peer review of the results of computational research.

  19. Hypersonic CFD applications for the National Aero-Space Plane

    NASA Technical Reports Server (NTRS)

    Richardson, Pamela F.; Mcclinton, Charles R.; Bittner, Robert D.; Dilley, A. Douglas; Edwards, Kelvin W.

    1989-01-01

    Design and analysis of the NASP depend heavily upon developing the critical technology areas that cover the entire engineering design of the vehicle. These areas include materials, structures, propulsion systems, propellants, integration of airframe and propulsion systems, controls, subsystems, and aerodynamics. Currently, verification of many of the classical engineering tools relies heavily on computational fluid dynamics. Advances are being made in the development of CFD codes to accomplish nose-to-tail analyses for hypersonic aircraft. Additional details involving the development, analysis, verification, and application of the CFL3D code and the SPARK combustor code are discussed. A nonequilibrium version of CFL3D that is presently being developed and tested is also described. Examples are given of calculations for research hypersonic aircraft geometries, and comparisons with experimental data show good agreement.

  20. Mining the human plasma proteome with three-dimensional strategies by high-resolution Quadrupole Orbitrap Mass Spectrometry.

    PubMed

    Zhao, Yan; Chang, Cheng; Qin, Peibin; Cao, Qichen; Tian, Fang; Jiang, Jing; Li, Xianyu; Yu, Wenfeng; Zhu, Yunping; He, Fuchu; Ying, Wantao; Qian, Xiaohong

    2016-01-21

    Human plasma is a readily available clinical sample that reflects the status of the body in normal physiological and disease states. Although the wide dynamic range and immense complexity of plasma proteins are obstacles, comprehensive proteomic analysis of human plasma is necessary for biomarker discovery and further verification. Various methods such as immunodepletion, protein equalization and hyperfractionation have been applied to reduce the influence of high-abundance proteins (HAPs) and to reduce the high level of complexity. However, the depth at which the human plasma proteome has been explored in a relatively short time frame has been limited, which impedes the transfer of proteomic techniques to clinical research. Development of an optimal strategy is expected to improve the efficiency of human plasma proteome profiling. Here, five three-dimensional strategies combining HAP depletion (the 1st dimension) and protein fractionation (the 2nd dimension), followed by LC-MS/MS analysis (the 3rd dimension), were developed and compared for human plasma proteome profiling. Pros and cons of the five strategies are discussed for two issues: HAP depletion and complexity reduction. Strategies A and B used proteome equalization and tandem Seppro IgY14 immunodepletion, respectively, as the first dimension. Proteome equalization (strategy A) was biased toward the enrichment of basic and low-molecular weight proteins and had limited ability to enrich low-abundance proteins. By tandem removal of HAPs (strategy B), the efficiency of HAP depletion was significantly increased, whereas more off-target proteins were subtracted simultaneously. In the comparison of complexity reduction, strategy D involved a deglycosylation step before high-pH RPLC separation. However, the increase in sequence coverage did not increase the protein number as expected. Strategy E introduced SDS-PAGE separation of proteins, and the results showed oversampling of HAPs and identification of fewer proteins. Strategy C combined single Seppro IgY14 immunodepletion, high-pH RPLC fractionation and LC-MS/MS analysis. It generated the largest dataset, containing 1544 plasma protein groups and 258 newly identified proteins in a 30-h-machine-time analysis, making it the optimum three-dimensional strategy in our study. Further analysis of the integrated data from the five strategies showed identical distribution patterns in terms of sequence features and GO functional analysis with the 1929-plasma-protein dataset, further supporting the reliability of our plasma protein identifications. The characterization of 20 cytokines in the concentration range from sub-nanograms/milliliter to micrograms/milliliter demonstrated the sensitivity of the current strategies. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Dynamics of mechanical feedback-type hydraulic servomotors under inertia loads

    NASA Technical Reports Server (NTRS)

    Gold, Harold; Otto, Edward W; Ransom, Victor L

    1953-01-01

    An analysis of the dynamics of mechanical feedback-type hydraulic servomotors under inertia loads is developed and experimental verification is presented. The analysis, which is developed in terms of two physical parameters, yields direct expressions for the following dynamic responses: (1) the transient response to a step input and the maximum cylinder pressure during the transient and (2) the variation of amplitude attenuation and phase shift with the frequency of a sinusoidally varying input. The validity of the analysis is demonstrated by means of recorded transient and frequency responses obtained on two servomotors. The calculated responses are in close agreement with the measured responses. The relations presented are readily applicable to the design as well as to the analysis of hydraulic servomotors.
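
    As a hedged illustration of the two response types the analysis yields, the sketch below reduces the servomotor-plus-inertia-load system to an assumed second-order transfer function and computes its step and frequency responses; the natural frequency and damping ratio are placeholders standing in for the report's two physical parameters.

        import numpy as np
        from scipy import signal

        wn, zeta = 40.0, 0.4                  # assumed rad/s and damping ratio
        sys = signal.TransferFunction([wn**2], [1.0, 2 * zeta * wn, wn**2])

        # Transient response to a step input ...
        t, y = signal.step(sys)
        print(f"peak overshoot: {y.max() - 1.0:.2f}")

        # ... and amplitude attenuation / phase shift vs. frequency.
        w, mag_db, phase_deg = signal.bode(sys, w=np.logspace(0, 3, 200))
        print(f"phase lag at wn: {np.interp(wn, w, phase_deg):.0f} deg")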

  2. Solution of the neutronics code dynamic benchmark by finite element method

    NASA Astrophysics Data System (ADS)

    Avvakumov, A. V.; Vabishchevich, P. N.; Vasilev, A. O.; Strizhov, V. F.

    2016-10-01

    The objective is to analyze the dynamic benchmark developed by Atomic Energy Research for the verification of best-estimate neutronics codes. The benchmark scenario includes asymmetrical ejection of a control rod in a water-type hexagonal reactor at hot zero power. A simple Doppler feedback mechanism assuming adiabatic fuel temperature heating is proposed. The finite element method on triangular calculation grids is used to solve the three-dimensional neutron kinetics problem. The software has been developed using the engineering and scientific calculation library FEniCS. The matrix spectral problem is solved using the scalable and flexible toolkit SLEPc. The solution accuracy of the dynamic benchmark is analyzed by refining the calculation grid and varying the degree of the finite elements.
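
    A hedged sketch of the eigenvalue step in plain NumPy rather than the FEniCS/SLEPc stack the authors use: a one-group, one-dimensional diffusion operator is discretized and the multiplication factor is extracted as the dominant eigenvalue. The cross sections and geometry are invented, not benchmark data.

        import numpy as np

        n, L = 200, 100.0                        # interior mesh points, slab width (cm)
        h = L / (n + 1)
        D, sig_a, nu_sig_f = 1.3, 0.032, 0.035   # illustrative cross sections

        # Loss operator: -D * d2/dx2 + sigma_a with zero-flux boundaries.
        main = 2.0 * D / h**2 + sig_a
        off = -D / h**2
        A = (np.diag(np.full(n, main))
             + np.diag(np.full(n - 1, off), 1)
             + np.diag(np.full(n - 1, off), -1))

        # k-effective is the largest eigenvalue of A^{-1} F, F = nu_sig_f * I.
        k = np.linalg.eigvals(np.linalg.solve(A, nu_sig_f * np.eye(n))).real.max()
        print(f"k-effective of the slab: {k:.4f}")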

  3. Efficient model checking of network authentication protocol based on SPIN

    NASA Astrophysics Data System (ADS)

    Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan

    2013-03-01

    Model checking is a very useful technique for verifying network authentication protocols. To improve the efficiency of modeling and verifying such protocols with model checking technology, this paper first proposes a universal formal description method for the protocols. Combined with the model checker SPIN, the method can conveniently verify the properties of a protocol. Through several model simplification strategies, the protocols can be modeled efficiently and the state space of the model reduced. Compared with the previous literature, this paper achieves a higher degree of automation and better verification efficiency. Finally, based on the method described in the paper, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the model checking method is effective and useful for other authentication protocols.

  4. Verification of Modelica-Based Models with Analytical Solutions for Tritium Diffusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rader, Jordan D.; Greenwood, Michael Scott; Humrickhouse, Paul W.

    Here, tritium transport in metal and molten salt fluids combined with diffusion through high-temperature structural materials is an important phenomenon in both magnetic confinement fusion (MCF) and molten salt reactor (MSR) applications. For MCF, tritium is desirable to capture for fusion fuel. For MSRs, uncaptured tritium potentially can be released to the environment. In either application, quantifying the time- and space-dependent tritium concentration in the working fluid(s) and structural components is necessary. Whereas capability exists specifically for calculating tritium transport in such systems (e.g., using TMAP for fusion reactors), it is desirable to unify the calculation of tritium transport with other system variables such as dynamic fluid and structure temperature combined with control systems such as those that might be found in a system code. Some capability for radioactive trace substance transport exists in thermal-hydraulic systems codes (e.g., RELAP5-3D); however, this capability is not coupled to species diffusion through solids. Combined calculations of tritium transport and thermal-hydraulic solution have been demonstrated with TRIDENT but only for a specific type of MSR. Researchers at Oak Ridge National Laboratory have developed a set of Modelica-based dynamic system modeling tools called TRANsient Simulation Framework Of Reconfigurable Models (TRANSFORM) that were used previously to model advanced fission reactors and associated systems. In this system, the augmented TRANSFORM library includes dynamically coupled fluid and solid trace substance transport and diffusion. Results from simulations are compared against analytical solutions for verification.
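
    The verification pattern itself is easy to reproduce in miniature: march a one-dimensional diffusion equation numerically and compare against the analytical semi-infinite-slab solution c(x,t) = c_s * erfc(x / (2*sqrt(D*t))). The diffusivity and boundary data below are assumptions, not TRANSFORM inputs.

        import numpy as np
        from scipy.special import erfc

        D, c_s = 1.0e-9, 1.0         # assumed diffusivity (m^2/s), surface conc.
        L, nx = 5.0e-3, 201
        x = np.linspace(0.0, L, nx)
        dx = x[1] - x[0]
        dt = 0.4 * dx**2 / D         # explicit (FTCS) stability limit

        c = np.zeros(nx)
        c[0] = c_s                   # fixed surface concentration
        t = 0.0
        while t < 600.0:             # ten minutes of diffusion
            c[1:-1] += D * dt / dx**2 * (c[2:] - 2.0 * c[1:-1] + c[:-2])
            t += dt

        exact = c_s * erfc(x / (2.0 * np.sqrt(D * t)))
        print(f"max abs error vs analytical solution: {np.abs(c - exact).max():.2e}")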

  5. Verification of Modelica-Based Models with Analytical Solutions for Tritium Diffusion

    DOE PAGES

    Rader, Jordan D.; Greenwood, Michael Scott; Humrickhouse, Paul W.

    2018-03-20

    Here, tritium transport in metal and molten salt fluids combined with diffusion through high-temperature structural materials is an important phenomenon in both magnetic confinement fusion (MCF) and molten salt reactor (MSR) applications. For MCF, tritium is desirable to capture for fusion fuel. For MSRs, uncaptured tritium potentially can be released to the environment. In either application, quantifying the time- and space-dependent tritium concentration in the working fluid(s) and structural components is necessary. Whereas capability exists specifically for calculating tritium transport in such systems (e.g., using TMAP for fusion reactors), it is desirable to unify the calculation of tritium transport with other system variables such as dynamic fluid and structure temperature combined with control systems such as those that might be found in a system code. Some capability for radioactive trace substance transport exists in thermal-hydraulic systems codes (e.g., RELAP5-3D); however, this capability is not coupled to species diffusion through solids. Combined calculations of tritium transport and thermal-hydraulic solution have been demonstrated with TRIDENT but only for a specific type of MSR. Researchers at Oak Ridge National Laboratory have developed a set of Modelica-based dynamic system modeling tools called TRANsient Simulation Framework Of Reconfigurable Models (TRANSFORM) that were used previously to model advanced fission reactors and associated systems. In this system, the augmented TRANSFORM library includes dynamically coupled fluid and solid trace substance transport and diffusion. Results from simulations are compared against analytical solutions for verification.

  6. STAR-CCM+ Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pointer, William David

    2016-09-30

    The commercial Computational Fluid Dynamics (CFD) code STAR-CCM+ provides general purpose finite volume method solutions for fluid dynamics and energy transport. This document defines plans for verification and validation (V&V) of the base code and models implemented within the code by the Consortium for Advanced Simulation of Light water reactors (CASL). The software quality assurance activities described herein are part of the overall software life cycle defined in the CASL Software Quality Assurance (SQA) Plan [Sieger, 2015]. STAR-CCM+ serves as the principal foundation for development of an advanced predictive multi-phase boiling simulation capability within CASL. The CASL Thermal Hydraulics Methods (THM) team develops advanced closure models required to describe the subgrid-resolution behavior of secondary fluids or fluid phases in multiphase boiling flows within the Eulerian-Eulerian framework of the code. These include wall heat partitioning models that describe the formation of vapor on the surface and the forces that define bubble/droplet dynamic motion. The CASL models are implemented as user coding or field functions within the general framework of the code. This report defines procedures and requirements for V&V of the multi-phase CFD capability developed by CASL THM. Results of V&V evaluations will be documented in a separate STAR-CCM+ V&V assessment report. This report is expected to be a living document and will be updated as additional validation cases are identified and adopted as part of the CASL THM V&V suite.

  7. Specification and Verification of Secure Concurrent and Distributed Software Systems

    DTIC Science & Technology

    1992-02-01

    Primitive search strategies work for operating systems that contain relatively few operations. As the number of operations increases, so does the... others have granted him access to, etc. The burden of security falls on the operating system, although appropriate hardware support can minimize the... Guttag, J. Horning, and R. Levin. Synchronization primitives for a multiprocessor: a formal specification. Symposium on Operating System Principles.

  8. SU-E-T-100: Designing a QA Tool for Enhance Dynamic Wedges Based On Dynalog Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yousuf, A; Hussain, A

    2014-06-01

    Purpose: A robust quality assurance (QA) program for computer controlled enhanced dynamic wedge (EDW) has been designed and tested. Calculations to perform such a QA test are based upon the EDW dynamic log files generated during dose delivery. Methods: The Varian record and verify system generates dynamic log (dynalog) files during dynamic dose delivery. The system-generated dynalog files contain information such as date and time of treatment, energy, monitor units, wedge orientation, and type of treatment. They also contain the expected calculated segmented treatment tables (STT) and the actual delivered STT for the treatment delivery as a verification record. These files can be used to assess the integrity and precision of the treatment plan delivery. The plans were delivered with a 6 MV beam from a Varian linear accelerator. For the available EDW angles (10°, 15°, 20°, 25°, 30°, 45°, and 60°), Varian STT values were used to manually calculate monitor units for each segment. They can also be used to calculate the EDW factors. Independent verification of fractional MUs per segment was performed against those generated from dynalog files. The EDW factors used to calculate MUs in the TPS were dosimetrically verified in a solid water phantom with a semiflex chamber on the central axis. Results: EDW factors were generated from the STT provided by Varian and verified against practical measurements. The measurements were in agreement with the calculated EDW data to within 1%. Variation between the MUs per segment obtained from dynalog files and those manually calculated was found to be less than 2%. Conclusion: An efficient and easy tool to perform a routine QA procedure of EDW is suggested. The method can be easily implemented in any institution without a need for expensive QA equipment. An error of the order of ≥2% can be easily detected.
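
    The MU-per-segment cross-check reduces to simple arithmetic on the cumulative STT, as the hedged sketch below shows; the STT values are invented for illustration and are not Varian's published tables.

        planned_stt = [0.0, 12.5, 31.0, 55.0, 80.5, 100.0]   # cumulative % MU
        dynalog_stt = [0.0, 12.6, 30.8, 55.3, 80.4, 100.0]   # recorded at delivery

        def segment_fractions(stt):
            # Fractional MU delivered in each segment of the wedge sequence.
            return [(b - a) / 100.0 for a, b in zip(stt, stt[1:])]

        tolerance = 0.02   # the 2% acceptance criterion quoted in the abstract
        for i, (p, d) in enumerate(zip(segment_fractions(planned_stt),
                                       segment_fractions(dynalog_stt))):
            status = "OK" if abs(p - d) <= tolerance else "FAIL"
            print(f"segment {i}: planned {p:.3f}, delivered {d:.3f} -> {status}")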

  9. Monte Carlo verification of radiotherapy treatments with CloudMC.

    PubMed

    Miras, Hector; Jiménez, Rubén; Perales, Álvaro; Terrón, José Antonio; Bertolet, Alejandro; Ortiz, Antonio; Macías, José

    2018-06-27

    A new implementation has been made on CloudMC, a cloud-based platform presented in a previous work, in order to provide services for radiotherapy treatment verification by means of Monte Carlo in a fast, easy, and economical way. A description of the architecture of the application and the new developments implemented is presented, together with the results of the tests carried out to validate its performance. CloudMC has been developed on the Microsoft Azure cloud. It is based on a map/reduce implementation for distributing Monte Carlo calculations over a dynamic cluster of virtual machines in order to reduce calculation time. CloudMC has been updated with new methods to read and process the information related to radiotherapy treatment verification: CT image set, treatment plan, structures, and dose distribution files in DICOM format. Tests were designed to determine, for the different tasks, the most suitable type of virtual machine from those available in Azure. Finally, the performance of Monte Carlo verification in CloudMC is studied through three real cases that involve different treatment techniques, linac models, and Monte Carlo codes. Considering computational and economic factors, D1_v2 and G1 virtual machines were selected as the default types for the Worker Roles and the Reducer Role, respectively. Calculation times up to 33 min and costs of 16 € were achieved for the verification cases presented when a statistical uncertainty below 2% (2σ) was required. The costs were reduced to 3-6 € when uncertainty requirements were relaxed to 4%. Advantages such as high computational power, scalability, easy access, and a pay-per-usage model make Monte Carlo cloud-based solutions, like the one presented in this work, an important step toward solving the long-standing problem of truly introducing Monte Carlo algorithms into the daily routine of the radiotherapy planning process.
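
    A minimal sketch of the map/reduce idea (not CloudMC's actual API): histories are split across independent workers with separate seeds, and a reducer pools the tallies and their statistical uncertainty.

        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        def worker(args):
            # Map step: toy Monte Carlo "dose" tally for a chunk of histories.
            seed, n_histories = args
            rng = np.random.default_rng(seed)
            deposits = rng.exponential(1.0, n_histories)   # stand-in for transport
            return deposits.sum(), (deposits**2).sum(), n_histories

        def reduce_tallies(results):
            # Reduce step: pooled mean dose and relative standard uncertainty.
            s1 = sum(r[0] for r in results)
            s2 = sum(r[1] for r in results)
            n = sum(r[2] for r in results)
            mean = s1 / n
            var = (s2 / n - mean**2) / n
            return mean, np.sqrt(var) / mean

        if __name__ == "__main__":
            chunks = [(seed, 250_000) for seed in range(8)]   # 8 virtual workers
            with ProcessPoolExecutor() as pool:
                results = list(pool.map(worker, chunks))
            mean, rel_u = reduce_tallies(results)
            print(f"dose tally: {mean:.4f} +/- {rel_u:.2%} (1 sigma)")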

  10. Quality assurance of a gimbaled head swing verification using feature point tracking.

    PubMed

    Miura, Hideharu; Ozawa, Shuichi; Enosaki, Tsubasa; Kawakubo, Atsushi; Hosono, Fumika; Yamada, Kiyoshi; Nagata, Yasushi

    2017-01-01

    To perform dynamic tumor tracking (DTT) safely and accurately in clinical applications, gimbaled head swing verification is important. We propose a quantitative gimbaled head swing verification method for daily quality assurance (QA), which uses feature point tracking and a web camera. The web camera was placed on a couch at the same position for every gimbaled head swing verification, and the gimbaled head could move based on a determined input function (sinusoidal patterns; amplitude: ± 20 mm; cycle: 3 s) in the pan and tilt directions at the isocenter plane. Two continuous images were then analyzed for each feature point using the pyramidal Lucas-Kanade (LK) method, which is an optical flow estimation algorithm. We used a tapped hole as a feature point of the gimbaled head. The period and amplitude were analyzed to acquire a quantitative gimbaled head swing value for daily QA. The mean ± SD of the period was 3.00 ± 0.03 (range: 3.00-3.07) s and 3.00 ± 0.02 (range: 3.00-3.07) s in the pan and tilt directions, respectively. The mean ± SD of the relative displacement was 19.7 ± 0.08 (range: 19.6-19.8) mm and 18.9 ± 0.2 (range: 18.4-19.5) mm in the pan and tilt directions, respectively. The gimbaled head swing was reliable for DTT. We propose a quantitative gimbaled head swing verification method for daily QA using the feature point tracking method and a web camera. Our method can quantitatively assess the gimbaled head swing for daily QA against baseline values measured at the time of acceptance and commissioning. © 2016 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
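
    A hedged sketch of the tracking step using OpenCV's pyramidal Lucas-Kanade implementation; the video path, initial feature location, and tracker settings are placeholders rather than the published setup.

        import cv2
        import numpy as np

        cap = cv2.VideoCapture("gimbal_swing.avi")          # hypothetical recording
        ok, frame = cap.read()
        prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        p0 = np.array([[[320.0, 240.0]]], dtype=np.float32) # assumed feature point

        lk_params = dict(winSize=(21, 21), maxLevel=3,
                         criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT,
                                   30, 0.01))
        track = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            p1, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None,
                                                       **lk_params)
            if status[0, 0] == 1:
                track.append(p1[0, 0].copy())
                prev_gray, p0 = gray, p1

        track = np.array(track)
        # Pixel amplitude in the pan direction; a mm/px calibration would map
        # this onto the +/- 20 mm input function described in the abstract.
        amplitude = 0.5 * (track[:, 0].max() - track[:, 0].min())
        print(f"pan swing amplitude: {amplitude:.1f} px over {len(track)} frames")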

  11. Experimental verification of Space Platform battery discharger design optimization

    NASA Astrophysics Data System (ADS)

    Sable, Dan M.; Deuty, Scott; Lee, Fred C.; Cho, Bo H.

    The detailed design of two candidate topologies for the Space Platform battery discharger, a four module boost converter (FMBC) and a voltage-fed push-pull autotransformer (VFPPAT), is presented. Each has unique problems. The FMBC requires careful design and analysis in order to obtain good dynamic performance. This is due to the presence of a right-half-plane (RHP) zero in the control-to-output transfer function. The VFPPAT presents a challenging power stage design in order to yield high efficiency and light component weight. The authors describe the design of each of these converters and compare their efficiency, weight, and dynamic characteristics.
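
    The difficulty attributed to the right-half-plane zero can be illustrated with an assumed control-to-output transfer function: the step response initially moves opposite to its final value, which is what limits achievable control bandwidth. The pole and zero locations below are invented, not the FMBC's.

        import numpy as np
        from scipy import signal

        wz, wn, zeta = 500.0, 200.0, 0.3     # assumed RHP zero, pole pair (rad/s)
        # G(s) = (1 - s/wz) * wn^2 / (s^2 + 2*zeta*wn*s + wn^2)
        G = signal.TransferFunction([-wn**2 / wz, wn**2],
                                    [1.0, 2 * zeta * wn, wn**2])

        t, y = signal.step(G, T=np.linspace(0.0, 0.1, 1000))
        print(f"initial undershoot: {y.min():.3f} (response starts the wrong way)")
        print(f"final value: {y[-1]:.3f}")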

  12. Assessing Spontaneous Combustion Instability with Recurrence Quantification Analysis

    NASA Technical Reports Server (NTRS)

    Eberhart, Chad J.; Casiano, Matthew J.

    2016-01-01

    Spontaneous instabilities can pose a significant challenge to verification of combustion stability, and characterizing their onset is an important avenue of improvement for stability assessments of liquid propellant rocket engines. Recurrence Quantification Analysis (RQA) is used here to explore nonlinear combustion dynamics that might give insight into instability. Multiple types of patterns representative of different dynamical states are identified within fluctuating chamber pressure data, and markers for impending instability are found. A class of metrics which describe these patterns is also calculated. RQA metrics are compared with and interpreted against another metric from nonlinear time series analysis, the Hurst exponent, to help better distinguish between stable and unstable operation.
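
    A minimal sketch of recurrence quantification on a scalar trace: delay-embed the series, threshold the distance matrix, and compute recurrence rate and determinism, two metrics of the kind the abstract describes. The embedding parameters and threshold are illustrative choices, not those of the assessment.

        import numpy as np

        def run_lengths(line):
            # Lengths of consecutive True runs in a boolean array.
            runs, run = [], 0
            for v in line:
                if v:
                    run += 1
                elif run:
                    runs.append(run)
                    run = 0
            if run:
                runs.append(run)
            return runs

        def rqa_metrics(x, dim=3, tau=5, frac=0.2, lmin=2):
            # Delay embedding, thresholded distance matrix, then two metrics.
            n = len(x) - (dim - 1) * tau
            emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
            dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
            rec = dists < frac * dists.std()
            rr = rec.mean()          # recurrence rate (includes main diagonal)
            runs = [r for k in range(1, n) for r in run_lengths(np.diag(rec, k))]
            det = sum(r for r in runs if r >= lmin) / sum(runs) if runs else 0.0
            return rr, det

        # Periodic (limit-cycle-like) signals score high determinism; noise low.
        t = np.linspace(0.0, 20.0 * np.pi, 400)
        rng = np.random.default_rng(4)
        print("sine :", rqa_metrics(np.sin(t)))
        print("noise:", rqa_metrics(rng.normal(size=400)))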

  13. Flexive and Propulsive Dynamics of Elastica at Low Reynolds Numbers

    NASA Astrophysics Data System (ADS)

    Wiggins, Chris; Goldstein, Raymond

    1997-11-01

    A stiff one-armed swimmer in glycerine goes nowhere. However, if its arm is elastic, exerting a restorative torque proportional to local curvature, the swimmer can go on its way. Considering this happy consequence, we study a hyperdiffusion equation for the shape of the elastica in viscous flow, find solutions for impulsive or oscillatory forcing, and elucidate relevant aspects of propulsion. We illustrate an experiment which, coupled with this analysis, provides verification of the hyperdiffusive nature of elastohydrodynamics as well as a novel technique for measuring biopolymer bending moduli. Extensions necessary to study the viscous dynamics of twist and writhe are elucidated.
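
    For reference, a hedged statement of the governing equation the abstract alludes to: balancing local viscous drag (coefficient ζ) against bending elasticity (modulus A) for small transverse displacements h(x, t) of the filament gives a hyperdiffusion equation, whose oscillatory response decays over a quarter-power penetration length:

        \zeta\,\frac{\partial h}{\partial t} = -A\,\frac{\partial^{4} h}{\partial x^{4}},
        \qquad
        \ell(\omega) \sim \left(\frac{A}{\zeta\,\omega}\right)^{1/4}

    The quarter-power scaling of the penetration length with driving frequency is the experimentally testable signature of elastohydrodynamics, and fitting it yields the bending modulus A.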

  14. Experimental verification of Space Platform battery discharger design optimization

    NASA Technical Reports Server (NTRS)

    Sable, Dan M.; Deuty, Scott; Lee, Fred C.; Cho, Bo H.

    1991-01-01

    The detailed design of two candidate topologies for the Space Platform battery discharger, a four module boost converter (FMBC) and a voltage-fed push-pull autotransformer (VFPPAT), is presented. Each has unique problems. The FMBC requires careful design and analysis in order to obtain good dynamic performance. This is due to the presence of a right-half-plane (RHP) zero in the control-to-output transfer function. The VFPPAT presents a challenging power stage design in order to yield high efficiency and light component weight. The authors describe the design of each of these converters and compare their efficiency, weight, and dynamic characteristics.

  15. Review of hardware-in-the-loop simulation and its prospects in the automotive area

    NASA Astrophysics Data System (ADS)

    Fathy, Hosam K.; Filipi, Zoran S.; Hagena, Jonathan; Stein, Jeffrey L.

    2006-05-01

    Hardware-in-the-loop (HIL) simulation is rapidly evolving from a control prototyping tool to a system modeling, simulation, and synthesis paradigm synergistically combining many advantages of both physical and virtual prototyping. This paper provides a brief overview of the key enablers and numerous applications of HIL simulation, focusing on its metamorphosis from a control validation tool into a system development paradigm. It then describes a state-of-the art engine-in-the-loop (EIL) simulation facility that highlights the use of HIL simulation for the system-level experimental evaluation of powertrain interactions and development of strategies for clean and efficient propulsion. The facility comprises a real diesel engine coupled to accurate real-time driver, driveline, and vehicle models through a highly responsive dynamometer. This enables the verification of both performance and fuel economy predictions of different conventional and hybrid powertrains. Furthermore, the facility can both replicate the highly dynamic interactions occurring within a real powertrain and measure their influence on transient emissions and visual signature through state-of-the-art instruments. The viability of this facility for integrated powertrain system development is demonstrated through a case study exploring the development of advanced High Mobility Multipurpose Wheeled Vehicle (HMMWV) powertrains.

  16. A State-of-the-Art Experimental Laboratory for Cloud and Cloud-Aerosol Interaction Research

    NASA Technical Reports Server (NTRS)

    Fremaux, Charles M.; Bushnell, Dennis M.

    2011-01-01

    The state of the art for predicting climate changes due to increasing greenhouse gases in the atmosphere with high accuracy is problematic. Confidence intervals on current long-term predictions (on the order of 100 years) are so large that the ability to make informed decisions with regard to optimum strategies for mitigating both the causes of climate change and its effects is in doubt. There is ample evidence in the literature that large sources of uncertainty in current climate models are various aerosol effects. One approach to furthering discovery as well as modeling, and verification and validation (V&V) for cloud-aerosol interactions is use of a large "cloud chamber" in a complementary role to in-situ and remote sensing measurement approaches. Reproducing all of the complex interactions is not feasible, but it is suggested that the physics of certain key processes can be established in a laboratory setting so that relevant fluid-dynamic and cloud-aerosol phenomena can be experimentally simulated and studied in a controlled environment. This report presents a high-level argument for significantly improved laboratory capability, and is meant to serve as a starting point for stimulating discussion within the climate science and other interested communities.

  17. Flow visualization methods for field test verification of CFD analysis of an open gloveport

    DOE PAGES

    Strons, Philip; Bailey, James L.

    2017-01-01

    Anemometer readings alone cannot provide a complete picture of air flow patterns at an open gloveport. Having a means to visualize air flow for field tests in general provides greater insight by indicating direction in addition to the magnitude of the air flow velocities in the region of interest. Furthermore, flow visualization is essential for Computational Fluid Dynamics (CFD) verification, where important modeling assumptions play a significant role in analyzing the chaotic nature of low-velocity air flow. A good example is shown in Figure 1, where an unexpected vortex pattern occurred during a field test that could not have been measured relying only on anemometer readings. Here, observing and measuring the patterns of the smoke flowing into the gloveport allowed the CFD model to be appropriately updated to match the actual flow velocities in both magnitude and direction.

  18. Verification of Geosat sea surface topography in the Gulf Stream extension with surface drifting buoys and hydrographic measurements

    NASA Astrophysics Data System (ADS)

    Willebrand, J.; KäSe, R. H.; Stammer, D.; Hinrichsen, H.-H.; Krauss, W.

    1990-03-01

    Altimeter data from Geosat have been analyzed in the Gulf Stream extension area. Horizontal maps of the sea surface height anomaly relative to an annual mean for various 17-day intervals were constructed using an objective mapping procedure. The mean sea level was approximated by the dynamic topography from climatological hydrographic data. Geostrophic surface velocities derived from the composite maps (mean plus anomaly) are significantly correlated with surface drifter velocities observed during an oceanographic experiment in the spring of 1987. The drifter velocities contain much energy on scales less than 100 km which are not resolved in the altimetric maps. It is shown that the composite sea surface height also agrees well with ground verification from hydrographic data along sections in a triangle between the Azores, Newfoundland, and Bermuda, except in regions of high mean gradients.
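
    The velocity step is compact enough to sketch: geostrophic surface currents follow from the gradient of the mapped sea surface height via u = -(g/f) dEta/dy, v = (g/f) dEta/dx. The Gaussian eddy below is a synthetic stand-in for the composite Geosat maps.

        import numpy as np

        g, f = 9.81, 1.0e-4                  # gravity, Coriolis parameter (~40N)
        x = y = np.linspace(-200e3, 200e3, 101)          # metres
        X, Y = np.meshgrid(x, y)
        eta = 0.5 * np.exp(-(X**2 + Y**2) / (2 * (50e3)**2))   # 0.5 m SSH anomaly

        deta_dy, deta_dx = np.gradient(eta, y, x)        # axis 0 is y, axis 1 is x
        u = -(g / f) * deta_dy
        v = (g / f) * deta_dx
        print(f"peak geostrophic speed: {np.hypot(u, v).max():.2f} m/s")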

  19. Integrated heat pipe-thermal storage system performance evaluation

    NASA Technical Reports Server (NTRS)

    Keddy, E.; Sena, J. T.; Merrigan, M.; Heidenreich, Gary

    1987-01-01

    An integrated thermal energy storage (TES) system, developed as a part of an organic Rankine cycle solar dynamic power system is described, and the results of the performance verification tests of this TES system are presented. The integrated system consists of potassium heat-pipe elements that incorporate TES canisters within the vapor space, along with an organic fluid heater tube used as the condenser region of the heat pipe. The heat pipe assembly was operated through the range of design conditions from the nominal design input of 4.8 kW to a maximum of 5.7 kW. The performance verification tests show that the system meets the functional requirements of absorbing the solar energy reflected by the concentrator, transporting the energy to the organic Rankine heater, providing thermal storage for the eclipse phase, and allowing uniform discharge from the thermal storage to the heater.

  20. LIHE Spectral Dynamics and Jaguar Data Acquisition System Measurement Assurance Results 2014.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Covert, Timothy T.; Willis, Michael David; Radtke, Gregg Arthur

    2015-06-01

    The Light Initiated High Explosive (LIHE) facility performs high rigor, high consequence impulse testing for the nuclear weapons (NW) community. To support the facility mission, LIHE's extensive data acquisition system (DAS) is comprised of several discrete components as well as a fully integrated system. Due to the high consequence and high rigor of the testing performed at LIHE, a measurement assurance plan (MAP) was developed in collaboration with NW system customers to meet their data quality needs and to provide assurance of the robustness of the LIHE DAS. While individual components of the DAS have been calibrated by the SNL Primary Standards Laboratory (PSL), the integrated nature of this complex system requires verification of the complete system, from end-to-end. This measurement assurance plan (MAP) report documents the results of verification and validation procedures used to ensure that the data quality meets customer requirements.

  1. Nonlinear earthquake analysis of reinforced concrete frames with fiber and Bernoulli-Euler beam-column element.

    PubMed

    Karaton, Muhammet

    2014-01-01

    A beam-column element based on the Euler-Bernoulli beam theory is investigated for nonlinear dynamic analysis of reinforced concrete (RC) structural elements. The stiffness matrix of this element is obtained using the rigidity method. A solution technique that includes a nonlinear dynamic substructure procedure is developed for dynamic analyses of RC frames. A predicted-corrected form of the Bossak-α method is applied for the dynamic integration scheme. To verify the numerical solutions, experimental data for an RC column element are compared with numerical results obtained from the proposed solution technique. Furthermore, nonlinear cyclic analysis results of a reinforced concrete portal frame are obtained to compare the proposed solution technique with a fiber element based on the flexibility method. In addition, seismic damage analyses of an 8-story RC frame structure with a soft story are investigated for cases of lumped/distributed mass and load. Damage regions, propagation, and intensities according to both approaches are examined.
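    For readers unfamiliar with the Bossak-α scheme mentioned above, the sketch below shows one predictor-corrector step of the method on a small linear system M a + C v + K d = f; the matrices, load, and parameter values are illustrative assumptions, and the paper's actual implementation couples this scheme with the nonlinear substructure procedure.

```python
import numpy as np

alpha_b = -0.1                        # Bossak parameter (adds numerical damping)
gamma = 0.5 - alpha_b                 # Newmark parameters chosen for
beta = 0.25 * (1.0 - alpha_b) ** 2    # unconditional stability

M = np.diag([2.0, 1.0])                               # mass
K = np.array([[400.0, -200.0], [-200.0, 200.0]])      # stiffness
C = 0.02 * K                                          # proportional damping
f = np.array([0.0, 10.0])                             # constant load

dt, n_steps = 1e-3, 5000
d = np.zeros(2)
v = np.zeros(2)
a = np.linalg.solve(M, f - C @ v - K @ d)   # consistent initial acceleration

# effective matrix from M[(1-a_b)a_{n+1} + a_b a_n] + C v_{n+1} + K d_{n+1} = f
LHS = (1 - alpha_b) * M + gamma * dt * C + beta * dt ** 2 * K
for _ in range(n_steps):
    d_pred = d + dt * v + dt ** 2 * (0.5 - beta) * a   # predictor
    v_pred = v + dt * (1 - gamma) * a
    rhs = f - alpha_b * (M @ a) - C @ v_pred - K @ d_pred
    a = np.linalg.solve(LHS, rhs)                      # corrector
    d = d_pred + beta * dt ** 2 * a
    v = v_pred + gamma * dt * a
```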

  2. Dynamic tire pressure sensor for measuring ground vibration.

    PubMed

    Wang, Qi; McDaniel, James Gregory; Wang, Ming L

    2012-11-07

    This work presents a convenient and non-contact acoustic sensing approach for measuring ground vibration. This approach, which uses an instantaneous dynamic tire pressure sensor (DTPS), possesses the capability to replace the accelerometer or directional microphone currently being used for inspecting pavement conditions. By measuring dynamic pressure changes inside the tire, ground vibration can be amplified and isolated from environmental noise. In this work, verifications of the DTPS concept of sensing inside the tire have been carried out. In addition, comparisons between a DTPS, ground-mounted accelerometer, and directional microphone are made. A data analysis algorithm has been developed and optimized to reconstruct ground acceleration from DTPS data. Numerical and experimental studies of this DTPS reveal a strong potential for measuring ground vibration caused by a moving vehicle. A calibration of the transfer function between dynamic tire pressure change and ground acceleration may be needed for different tire systems or for more accurate applications.
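    The calibration mentioned in the last sentence amounts to estimating a frequency response between the two recorded channels. A minimal sketch of an H1-type estimate is below; the sampling rate and the stand-in signals are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np
from scipy.signal import csd, welch

fs = 2000.0                                  # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)
ground_accel = np.random.randn(t.size)       # stand-in simultaneous recordings
tire_pressure = np.convolve(ground_accel, np.ones(8) / 8, mode="same")

# H1 estimator: cross-spectrum over the auto-spectrum of the input channel
f, Pxy = csd(tire_pressure, ground_accel, fs=fs, nperseg=1024)
_, Pxx = welch(tire_pressure, fs=fs, nperseg=1024)
H = Pxy / Pxx    # applied to pressure spectra to reconstruct acceleration
```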

  3. Dynamic Tire Pressure Sensor for Measuring Ground Vibration

    PubMed Central

    Wang, Qi; McDaniel, James Gregory; Wang, Ming L.

    2012-01-01

    This work presents a convenient and non-contact acoustic sensing approach for measuring ground vibration. This approach, which uses an instantaneous dynamic tire pressure sensor (DTPS), possesses the capability to replace the accelerometer or directional microphone currently being used for inspecting pavement conditions. By measuring dynamic pressure changes inside the tire, ground vibration can be amplified and isolated from environmental noise. In this work, verifications of the DTPS concept of sensing inside the tire have been carried out. In addition, comparisons between a DTPS, ground-mounted accelerometer, and directional microphone are made. A data analysis algorithm has been developed and optimized to reconstruct ground acceleration from DTPS data. Numerical and experimental studies of this DTPS reveal a strong potential for measuring ground vibration caused by a moving vehicle. A calibration of the transfer function between dynamic tire pressure change and ground acceleration may be needed for different tire systems or for more accurate applications. PMID:23202206

  4. Loads and low frequency dynamics data base: Version 1.1 November 8, 1985. [Space Shuttles

    NASA Technical Reports Server (NTRS)

    Garba, J. A. (Editor)

    1985-01-01

    Structural design data for the Shuttle are presented in the form of a data base. The data can be used by designers of Shuttle experiments to assure compliance with Shuttle safety and structural verification requirements. A glossary of Shuttle design terminology is given, and the principal safety requirements of Shuttle are summarized. The Shuttle design data are given in the form of load factors.

  5. Interfering with the neutron spin

    NASA Astrophysics Data System (ADS)

    Wagh, Apoorva G.; Rakhecha, Veer Chand

    2004-07-01

    Charge neutrality, a spin of 1/2 and an associated magnetic moment of the neutron make it an ideal probe of quantal spinor evolutions. Polarized neutron interferometry in magnetic field Hamiltonians has thus scored several firsts, such as direct verification of Pauli anticommutation, experimental separation of geometric and dynamical phases and observation of non-cyclic amplitudes and phases. This paper provides a flavour of the physics learnt from such experiments.

  6. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  7. Specifying and Verifying the Correctness of Dynamic Software Updates

    DTIC Science & Technology

    2011-11-15

    additional branching introduced by update points and the need to analyze the state transformer code. As tools become faster and more effective, our... It shows the effectiveness of merging-based verification on practical examples, including Redis [20], a widely deployed server program. Defining... Gupta's reachability while side-stepping the problem that reachability can leave behavior under-constrained. For example, for the vsftpd update

  8. The study of thermal processes in control systems of heat consumption of buildings

    NASA Astrophysics Data System (ADS)

    Tsynaeva, E.; A, Tsynaeva

    2017-11-01

    The article discusses the main thermal processes in automated control systems for heat consumption (ACSHC) of buildings, schematic diagrams of these systems, and the mathematical models used to describe thermal processes in ACSHC. Verification of the mathematical models was conducted. It was found that the efficiency of ACSHC operation depends on both external and internal factors. A numerical study of the dynamic operating modes of ACSHC was carried out.

  9. Wiltech Component Cleaning and Refurbishment Facility CFC Elimination Plan at NASA Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Williamson, Steve; Aman, Bob; Aurigema, Andrew; Melendez, Orlando

    1999-01-01

    The Wiltech Component Cleaning & Refurbishment Facility (WT-CCRF) at NASA Kennedy Space Center performs precision cleaning on approximately 200,000 metallic and nonmetallic components every year. WT-CCRF has developed a CFC elimination plan consisting of aqueous cleaning and verification and an economical dual-solvent strategy as an alternative solvent solution. Aqueous verification methodologies were implemented two years ago on a variety of Ground Support Equipment (GSE) components and sampling equipment. Today, 50% of the current workload is verified using aqueous methods and 90% of the total workload is degreased aqueously using Zonyl and Brulin surfactants in ultrasonic baths. An additional estimated 20% solvent savings could be achieved if the proposed expanded use of aqueous methods is approved. Aqueous cleaning has been shown to be effective, environmentally friendly, and economical (i.e., in the cost of materials, equipment, facilities, and labor).

  10. Developing interpretable models with optimized set reduction for identifying high risk software components

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1993-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault frequency components so that testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents the Optimized Set Reduction approach for constructing such models, intended to fulfill specific software engineering needs. Our approach to classification is to measure the software system and build multivariate stochastic models for predicting high risk system components. We present experimental results obtained by classifying Ada components into two classes: is or is not likely to generate faults during system and acceptance test. Also, we evaluate the accuracy of the model and the insights it provides into the error making process.
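    Optimized Set Reduction itself is a pattern-based partitioning technique; as a rough stand-in for the two-class screening idea (concentrating verification effort on likely-faulty components), the sketch below fits a generic multivariate classifier to invented component metrics. The metric names, data, and threshold are all hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
# columns: cyclomatic complexity, fan-out, statement count (invented metrics)
X = rng.normal(size=(200, 3))
y = (X @ np.array([1.2, 0.8, 0.5]) + rng.normal(scale=0.5, size=200) > 0).astype(int)

model = LogisticRegression().fit(X, y)       # stand-in classifier, not OSR
risk = model.predict_proba(X)[:, 1]
high_risk = np.flatnonzero(risk > 0.7)       # components to test/verify first
```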

  11. Accessing defect dynamics using intense, nanosecond pulsed ion beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Persaud, A.; Barnard, J. J.; Guo, H.

    2015-06-18

    Gaining in-situ access to relaxation dynamics of radiation induced defects will lead to a better understanding of materials and is important for the verification of theoretical models and simulations. We show preliminary results from experiments at the new Neutralized Drift Compression Experiment (NDCX-II) at Lawrence Berkeley National Laboratory that will enable in-situ access to defect dynamics through pump-probe experiments. Here, the unique capabilities of the NDCX-II accelerator to generate intense, nanosecond pulsed ion beams are utilized. Preliminary data of channeling experiments using lithium and potassium ions and silicon membranes are shown. We compare these data to simulation results using Crystal Trim. Furthermore, we discuss the improvements that will bring the accelerator to higher performance levels and the new diagnostics tools that are being incorporated.

  12. Equilibrium stochastic dynamics of a Brownian particle in inhomogeneous space: Derivation of an alternative model

    NASA Astrophysics Data System (ADS)

    Bhattacharyay, A.

    2018-03-01

    An alternative equilibrium stochastic dynamics for a Brownian particle in inhomogeneous space is derived. Such a dynamics can model the motion of a complex molecule in its conformation space when in equilibrium with a uniform heat bath. The derivation is a simple generalization of Zwanzig's formulation for a Brownian particle in a homogeneous heat bath. We show that, if the system couples to a different number of bath degrees of freedom at different conformations, then the alternative model is obtained. We discuss the results of an experiment by Faucheux and Libchaber, which may indicate a limitation of the Boltzmann distribution as the equilibrium distribution of a Brownian particle in inhomogeneous space, and propose experimental verification of the present theory using similar methods.
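    As context for the experimental question raised at the end, the sketch below integrates the conventional overdamped Langevin dynamics with a position-dependent damping γ(x) (Itô convention, no added drift term) so the sampled histogram can be compared against the Boltzmann weight. The potential, damping profile, and parameters are assumptions; this is the baseline that the paper's alternative model departs from, not the paper's model itself.

```python
import numpy as np

rng = np.random.default_rng(3)
kT, dt, n_steps = 1.0, 1e-3, 200_000

U_prime = lambda x: x                        # harmonic potential U(x) = x^2/2
gamma = lambda x: 1.0 + 0.5 * np.tanh(x)     # inhomogeneous damping (assumed)

x, samples = 0.0, []
for step in range(n_steps):
    g = gamma(x)
    # Euler-Maruyama step, conventional interpretation (no spurious-drift term)
    x += -U_prime(x) / g * dt + np.sqrt(2 * kT * dt / g) * rng.standard_normal()
    if step % 10 == 0:
        samples.append(x)

# compare with the Boltzmann weight exp(-U/kT) to probe the paper's question
hist, edges = np.histogram(samples, bins=60, density=True)
```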

  13. Mechanical properties of multifunctional structure with viscoelastic components based on FVE model

    NASA Astrophysics Data System (ADS)

    Hao, Dong; Zhang, Lin; Yu, Jing; Mao, Daiyong

    2018-02-01

    Based on the models of Lion and Kardelky (2004) and Hofer and Lion (2009), a finite viscoelastic (FVE) constitutive model that accounts for predeformation-, frequency- and amplitude-dependent properties was proposed in our earlier paper [1]. The FVE model is applied here to investigate the dynamic characteristics of a multifunctional structure with viscoelastic components. Combining the FVE model with finite element theory, the dynamic model of the multifunctional structure can be obtained. Additionally, parametric identification and experimental verification are carried out via frequency-sweep tests. The results show that the computational data agree well with the experimental data: the FVE model successfully captures the dynamic characteristics of the viscoelastic materials used in the multifunctional structure. The multifunctional structure technology has also been verified by in-orbit experiments.

  14. Development of Nonlinear Flight Mechanical Model of High Aspect Ratio Light Utility Aircraft

    NASA Astrophysics Data System (ADS)

    Bahri, S.; Sasongko, R. A.

    2018-04-01

    The implementation of a Flight Control Law (FCL) for an Aircraft Electronic Flight Control System (EFCS) aims to reduce pilot workload, while also enhancing control performance during missions that require long-endurance flight and high-accuracy maneuvers. In the development of an FCL, a quantitative representation of the aircraft dynamics is needed to describe the aircraft's dynamic characteristics and to serve as the basis of the FCL design. Hence, a six-degree-of-freedom nonlinear model of the light utility aircraft dynamics, also called the nonlinear Flight Mechanical Model (FMM), is constructed. This paper presents the construction of the FMM from its mathematical formulation, the architecture design of the FMM, the trimming process, and simulations. The FMM is verified by analyzing aircraft behaviour in selected trimmed conditions.

  15. Effective Dynamics of Microorganisms That Interact with Their Own Trail

    NASA Astrophysics Data System (ADS)

    Kranz, W. Till; Gelimson, Anatolij; Zhao, Kun; Wong, Gerard C. L.; Golestanian, Ramin

    2016-07-01

    Like ants, some microorganisms are known to leave trails on surfaces to communicate. We explore how trail-mediated self-interaction could affect the behavior of individual microorganisms when diffusive spreading of the trail is negligible on the time scale of the microorganism using a simple phenomenological model for an actively moving particle and a finite-width trail. The effective dynamics of each microorganism takes on the form of a stochastic integral equation with the trail interaction appearing in the form of short-term memory. For a moderate coupling strength below an emergent critical value, the dynamics exhibits effective diffusion in both orientation and position after a phase of superdiffusive reorientation. We report experimental verification of a seemingly counterintuitive perpendicular alignment mechanism that emerges from the model.

  16. Parallel Software Model Checking

    DTIC Science & Technology

    2015-01-08

    checker. This project will explore this strategy to parallelize the generalized PDR algorithm for software model checking. It belongs to TF1 due to its... focus on formal verification. Generalized PDR: Generalized Property Driven Reachability (GPDR) is an algorithm for solving HORN-SMT reachability...

  17. Final Report for the ZERT Project: Basic Science of Retention Issues, Risk Assessment & Measurement, Monitoring and Verification for Geologic Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spangler, Lee; Cunningham, Alfred; Lageson, David

    2011-03-31

    ZERT has made major contributions to five main areas of sequestration science: improvement of computational tools; measurement and monitoring techniques to verify storage and track migration of CO{sub 2}; development of a comprehensive performance and risk assessment framework; fundamental geophysical, geochemical and hydrological investigations of CO{sub 2} storage; and investigation of innovative, bio-based mitigation strategies.

  18. Anatomy-corresponding method of IMRT verification.

    PubMed

    Winiecki, Janusz; Zurawski, Zbigniew; Drzewiecka, Barbara; Slosarek, Krzysztof

    2010-01-01

    Even when dMLC plans are executed properly, an undesired but frequent effect occurs: the dose locally accumulated by tissue differs significantly from the expected dose. Conventional dosimetric QA procedures give only a partial picture of the quality of IMRT treatment, because their solely quantitative outcomes usually correspond more to the total area of the detector than to the actually irradiated volume. The aim of this investigation was to develop a procedure for dynamic plan verification that can visualize potential anomalies in the dose distribution and specify exactly which tissue they refer to. The paper presents a method developed and clinically examined in our department. It is based on the Gamma Evaluation concept and allows accurate localization of deviations between predicted and acquired dose distributions, which were registered by portal as well as film dosimetry. All calculations were performed with the in-house software GammaEval; γ-images (2-dimensional distributions of γ-values) and γ-histograms were created as quantitative outcomes of the verification. Over 150 maps of dose distribution have been analyzed, and a cross-examination of the gamma images with DRRs was performed. Complex monitoring of treatment appears feasible owing to the images obtained by cross-examining γ-images with the corresponding DRRs.
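    A brute-force version of the underlying Gamma Evaluation is sketched below for two 2-D dose grids; the 3%/3 mm criteria, pixel spacing, and global normalization are illustrative assumptions and not necessarily GammaEval's choices.

```python
import numpy as np

def gamma_2d(ref, meas, spacing_mm, dd=0.03, dta_mm=3.0):
    """Global gamma map: for each reference pixel, minimize the combined
    dose-difference / distance-to-agreement metric over all measured pixels."""
    ny, nx = ref.shape
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    dose_crit = dd * ref.max()              # global dose-difference criterion
    gamma = np.empty_like(ref)
    for j in range(ny):                     # brute force; fine for small grids
        for i in range(nx):
            r2 = ((yy - j) ** 2 + (xx - i) ** 2) * spacing_mm ** 2
            d2 = (meas - ref[j, i]) ** 2
            gamma[j, i] = np.sqrt(r2 / dta_mm ** 2 + d2 / dose_crit ** 2).min()
    return gamma                            # points pass where gamma <= 1

# illustrative use on synthetic predicted/acquired dose grids
ref = np.fromfunction(lambda j, i: np.exp(-((j - 20) ** 2 + (i - 20) ** 2) / 150),
                      (40, 40))
meas = 1.02 * ref                           # a uniform 2% overdose
print((gamma_2d(ref, meas, spacing_mm=2.0) <= 1.0).mean())
```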

  19. Assuring NASA's Safety and Mission Critical Software

    NASA Technical Reports Server (NTRS)

    Deadrick, Wesley

    2015-01-01

    What is IV&V? Independent Verification and Validation (IV&V) is an objective examination of safety and mission critical software processes and products. Independence: 3 Key parameters: Technical Independence; Managerial Independence; Financial Independence. NASA IV&V perspectives: Will the system's software: Do what it is supposed to do?; Not do what it is not supposed to do?; Respond as expected under adverse conditions?. Systems Engineering: Determines if the right system has been built and that it has been built correctly. IV&V Technical Approaches: Aligned with IEEE 1012; Captured in a Catalog of Methods; Spans the full project lifecycle. IV&V Assurance Strategy: The IV&V Project's strategy for providing mission assurance; Assurance Strategy is driven by the specific needs of an individual project; Implemented via an Assurance Design; Communicated via Assurance Statements.

  20. Dissertation Defense Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions. The method accounts for all uncertainty terms from both numerical and input variables. Objective three is to compile a table of uncertainty parameters that could be used to estimate the error in a Computational Fluid Dynamics model of the Environmental Control System/spacecraft system. Previous studies have looked at the uncertainty in a Computational Fluid Dynamics model for a single output variable at a single point, for example the re-attachment length of a backward facing step. For the flow regime being analyzed (turbulent, three-dimensional, incompressible), the error at a single point can propagate into the solution both via flow physics and numerical methods. Calculating the uncertainty in using Computational Fluid Dynamics to accurately predict airflow speeds around encapsulated spacecraft is imperative to the success of future missions.
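    The three-grid procedure referenced above boils down to estimating an observed order of accuracy by Richardson extrapolation and reporting a grid convergence index (GCI) as the error band. A minimal sketch follows; the refinement ratio, safety factor, and airflow-speed values are illustrative assumptions.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p for a constant grid refinement ratio r."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

def gci_fine(f_medium, f_fine, r, p, Fs=1.25):
    """Fine-grid grid convergence index (fractional error band)."""
    return Fs * abs((f_medium - f_fine) / f_fine) / (r ** p - 1.0)

f3, f2, f1 = 12.8, 12.2, 12.0   # airflow speed (m/s) on coarse/medium/fine grids
r = 2.0                          # refinement ratio between successive grids
p = observed_order(f3, f2, f1, r)
print(f"observed order p = {p:.2f}, fine-grid GCI = {gci_fine(f2, f1, r, p):.2%}")
```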

  1. Dissertation Defense: Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions. The method accounts for all uncertainty terms from both numerical and input variables. Objective three is to compile a table of uncertainty parameters that could be used to estimate the error in a Computational Fluid Dynamics model of the Environmental Control System/spacecraft system. Previous studies have looked at the uncertainty in a Computational Fluid Dynamics model for a single output variable at a single point, for example the re-attachment length of a backward facing step. For the flow regime being analyzed (turbulent, three-dimensional, incompressible), the error at a single point can propagate into the solution both via flow physics and numerical methods. Calculating the uncertainty in using Computational Fluid Dynamics to accurately predict airflow speeds around encapsulated spacecraft is imperative to the success of future missions.

  2. Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.

    2013-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This proposal describes an approach to validate the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft. The research described here is absolutely cutting edge. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. The proposed research project includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions. The method accounts for all uncertainty terms from both numerical and input variables. Objective three is to compile a table of uncertainty parameters that could be used to estimate the error in a Computational Fluid Dynamics model of the Environmental Control System/spacecraft system. Previous studies have looked at the uncertainty in a Computational Fluid Dynamics model for a single output variable at a single point, for example the re-attachment length of a backward facing step. To date, the author is the only person to look at the uncertainty in the entire computational domain. For the flow regime being analyzed (turbulent, three-dimensional, incompressible), the error at a single point can propagate into the solution both via flow physics and numerical methods. Calculating the uncertainty in using Computational Fluid Dynamics to accurately predict airflow speeds around encapsulated spacecraft is imperative to the success of future missions.

  3. Strand-specific Recognition of DNA Damages by XPD Provides Insights into Nucleotide Excision Repair Substrate Versatility*

    PubMed Central

    Buechner, Claudia N.; Heil, Korbinian; Michels, Gudrun; Carell, Thomas; Kisker, Caroline; Tessmer, Ingrid

    2014-01-01

    Recognition and removal of DNA damages is essential for cellular and organismal viability. Nucleotide excision repair (NER) is the sole mechanism in humans for the repair of carcinogenic UV irradiation-induced photoproducts in the DNA, such as cyclobutane pyrimidine dimers. The broad substrate versatility of NER further includes, among others, various bulky DNA adducts. It has been proposed that the 5′-3′ helicase XPD (xeroderma pigmentosum group D) protein plays a decisive role in damage verification. However, despite recent advances such as the identification of a DNA-binding channel and central pore in the protein, through which the DNA is threaded, as well as a dedicated lesion recognition pocket near the pore, the exact process of target site recognition and verification in eukaryotic NER still remained elusive. Our single molecule analysis by atomic force microscopy reveals for the first time that XPD utilizes different recognition strategies to verify structurally diverse lesions. Bulky fluorescein damage is preferentially detected on the translocated strand, whereas the opposite strand preference is observed for a cyclobutane pyrimidine dimer lesion. Both states, however, lead to similar conformational changes in the resulting specific complexes, indicating a merge to a “final” verification state, which may then trigger the recruitment of further NER proteins. PMID:24338567

  4. A robust preference for cheap-and-easy strategies over reliable strategies when verifying personal memories.

    PubMed

    Nash, Robert A; Wade, Kimberley A; Garry, Maryanne; Adelman, James S

    2017-08-01

    People depend on various sources of information when trying to verify their autobiographical memories. Yet recent research shows that people prefer to use cheap-and-easy verification strategies, even when these strategies are not reliable. We examined the robustness of this cheap strategy bias, with scenarios designed to encourage greater emphasis on source reliability. In three experiments, subjects described real (Experiments 1 and 2) or hypothetical (Experiment 3) autobiographical events, and proposed strategies they might use to verify their memories of those events. Subjects also rated the reliability, cost, and the likelihood that they would use each strategy. In line with previous work, we found that the preference for cheap information held when people described how they would verify childhood or recent memories (Experiment 1), personally important or trivial memories (Experiment 2), and even when the consequences of relying on incorrect information could be significant (Experiment 3). Taken together, our findings fit with an account of source monitoring in which the tendency to trust one's own autobiographical memories can discourage people from systematically testing or accepting strong disconfirmatory evidence.

  5. Verification and Validation of the New Dynamic Mooring Modules Available in FAST v8: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian; Robertson, Amy; Jonkman, Jason

    2016-08-01

    The open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, was recently coupled to two newly developed mooring dynamics modules: MoorDyn and FEAMooring. MoorDyn is a lumped-mass-based mooring dynamics module developed by the University of Maine, and FEAMooring is a finite-element-based mooring dynamics module developed by Texas A&M University. This paper summarizes the work performed to verify and validate these modules against other mooring models and measured test data to assess their reliability and accuracy. The quality of the fairlead load predictions by the open-source mooring modules MoorDyn and FEAMooring appear to be largely equivalent to what is predicted by the commercial tool OrcaFlex. Both mooring dynamic model predictions agree well with the experimental data, considering the given limitations in the accuracy of the platform hydrodynamic load calculation and the quality of the measurement data.
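    To illustrate the lumped-mass idea behind a MoorDyn-style module, the sketch below discretises a single 2-D mooring line into point masses joined by axial spring-dampers with net weight and simple seabed contact, then relaxes it to a static shape. All properties are invented for illustration; this is not MoorDyn's implementation (it omits hydrodynamic drag and added mass, among other things).

```python
import numpy as np

n_seg = 20                     # segments between anchor and fairlead
L0 = 100.0 / n_seg             # unstretched segment length, m
EA = 5.0e6                     # axial stiffness, N
c_ax = 1.0e4                   # axial damping, N s/m
m_node = 50.0                  # lumped mass per node, kg
w_net = m_node * 9.81          # net weight in water, N (buoyancy folded in)
k_bed = 1.0e5                  # seabed contact stiffness, N/m

anchor = np.array([0.0, -50.0])
fairlead = np.array([80.0, 0.0])
x = np.linspace(anchor, fairlead, n_seg + 1)   # initial straight-line guess
v = np.zeros_like(x)

def accelerations(x, v):
    a = np.zeros_like(x)
    for i in range(n_seg):                     # axial forces, segment by segment
        d = x[i + 1] - x[i]
        length = np.linalg.norm(d)
        t_hat = d / length
        rate = np.dot(v[i + 1] - v[i], t_hat)
        T = max(EA * (length - L0) / L0, 0.0) + c_ax * rate  # slack: no spring force
        a[i] += T * t_hat / m_node
        a[i + 1] -= T * t_hat / m_node
    a[:, 1] -= w_net / m_node                  # net weight
    pen = np.minimum(x[:, 1] - anchor[1], 0.0)
    a[:, 1] -= k_bed * pen / m_node            # seabed pushes nodes back up
    a[0] = a[-1] = 0.0                         # anchor and fairlead held fixed
    return a

dt = 1e-3
for _ in range(20000):                         # damped relaxation to statics
    v += dt * accelerations(x, v)
    v *= 0.999
    x += dt * v
print("fairlead tension estimate (N):",
      max(EA * (np.linalg.norm(x[-1] - x[-2]) - L0) / L0, 0.0))
```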

  6. Verification and Validation of the New Dynamic Mooring Modules Available in FAST v8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian F.; Andersen, Morten T.; Robertson, Amy N.

    2016-07-01

    The open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, was recently coupled to two newly developed mooring dynamics modules: MoorDyn and FEAMooring. MoorDyn is a lumped-mass-based mooring dynamics module developed by the University of Maine, and FEAMooring is a finite-element-based mooring dynamics module developed by Texas A&M University. This paper summarizes the work performed to verify and validate these modules against other mooring models and measured test data to assess their reliability and accuracy. The quality of the fairlead load predictions by the open-source mooring modules MoorDyn and FEAMooring appear to be largely equivalent to what is predicted by the commercial tool OrcaFlex. Both mooring dynamic model predictions agree well with the experimental data, considering the given limitations in the accuracy of the platform hydrodynamic load calculation and the quality of the measurement data.

  7. Methodologies for launcher-payload coupled dynamic analysis

    NASA Astrophysics Data System (ADS)

    Fransen, S. H. J. A.

    2012-06-01

    An important step in the design and verification process of spacecraft structures is the coupled dynamic analysis with the launch vehicle in the low-frequency domain, also referred to as coupled loads analysis (CLA). The objective of such analyses is the computation of the dynamic environment of the spacecraft (payload) in terms of interface accelerations, interface forces, center of gravity (CoG) accelerations as well as the internal state of stress. In order to perform an efficient, fast and accurate launcher-payload coupled dynamic analysis, various methodologies have been applied and developed. The methods are related to substructuring techniques, data recovery techniques, the effects of prestress and fluids and time integration problems. The aim of this paper was to give an overview of these methodologies and to show why, how and where these techniques can be used in the process of launcher-payload coupled dynamic analysis. In addition, it will be shown how these methodologies fit together in a library of procedures which can be used with the MSC.Nastran™ solution sequences.

  8. Drive-train dynamics technology - State-of-the-art and design of a test facility for advanced development

    NASA Technical Reports Server (NTRS)

    Badgley, R. H.; Fleming, D. P.; Smalley, A. J.

    1975-01-01

    A program for the development and verification of drive-train dynamic technology is described along with its basis and the results expected from it. A central feature of this program is a drive-train test facility designed for the testing and development of advanced drive-train components, including shaft systems, dampers, and couplings. Previous efforts in designing flexible dynamic drive-train systems are reviewed, and the present state of the art is briefly summarized. The design of the test facility is discussed with major attention given to the formulation of the test-rig concept, dynamic scaling of model shafts, and the specification of design parameters. Specific efforts envisioned for the test facility are briefly noted, including evaluations of supercritical test shafts, stability thresholds for various sources and types of instabilities that can exist in shaft systems, effects of structural flexibility on the dynamic performance of dampers, and methods for vibration control in two-level and three-level flexible shaft systems.

  9. Aspiration dynamics of multi-player games in finite populations

    PubMed Central

    Du, Jinming; Wu, Bin; Altrock, Philipp M.; Wang, Long

    2014-01-01

    On studying strategy update rules in the framework of evolutionary game theory, one can differentiate between imitation processes and aspiration-driven dynamics. In the former case, individuals imitate the strategy of a more successful peer. In the latter case, individuals adjust their strategies based on a comparison of their pay-offs from the evolutionary game to a value they aspire, called the level of aspiration. Unlike imitation processes of pairwise comparison, aspiration-driven updates do not require additional information about the strategic environment and can thus be interpreted as being more spontaneous. Recent work has mainly focused on understanding how aspiration dynamics alter the evolutionary outcome in structured populations. However, the baseline case for understanding strategy selection is the well-mixed population case, which is still lacking sufficient understanding. We explore how aspiration-driven strategy-update dynamics under imperfect rationality influence the average abundance of a strategy in multi-player evolutionary games with two strategies. We analytically derive a condition under which a strategy is more abundant than the other in the weak selection limiting case. This approach has a long-standing history in evolutionary games and is mostly applied for its mathematical approachability. Hence, we also explore strong selection numerically, which shows that our weak selection condition is a robust predictor of the average abundance of a strategy. The condition turns out to differ from that of a wide class of imitation dynamics, as long as the game is not dyadic. Therefore, a strategy favoured under imitation dynamics can be disfavoured under aspiration dynamics. This does not require any population structure, and thus highlights the intrinsic difference between imitation and aspiration dynamics. PMID:24598208

  10. Aspiration dynamics of multi-player games in finite populations.

    PubMed

    Du, Jinming; Wu, Bin; Altrock, Philipp M; Wang, Long

    2014-05-06

    On studying strategy update rules in the framework of evolutionary game theory, one can differentiate between imitation processes and aspiration-driven dynamics. In the former case, individuals imitate the strategy of a more successful peer. In the latter case, individuals adjust their strategies based on a comparison of their pay-offs from the evolutionary game to a value they aspire, called the level of aspiration. Unlike imitation processes of pairwise comparison, aspiration-driven updates do not require additional information about the strategic environment and can thus be interpreted as being more spontaneous. Recent work has mainly focused on understanding how aspiration dynamics alter the evolutionary outcome in structured populations. However, the baseline case for understanding strategy selection is the well-mixed population case, which is still lacking sufficient understanding. We explore how aspiration-driven strategy-update dynamics under imperfect rationality influence the average abundance of a strategy in multi-player evolutionary games with two strategies. We analytically derive a condition under which a strategy is more abundant than the other in the weak selection limiting case. This approach has a long-standing history in evolutionary games and is mostly applied for its mathematical approachability. Hence, we also explore strong selection numerically, which shows that our weak selection condition is a robust predictor of the average abundance of a strategy. The condition turns out to differ from that of a wide class of imitation dynamics, as long as the game is not dyadic. Therefore, a strategy favoured under imitation dynamics can be disfavoured under aspiration dynamics. This does not require any population structure, and thus highlights the intrinsic difference between imitation and aspiration dynamics.
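    A minimal simulation of such aspiration-driven updating in a well-mixed population is sketched below, with a Fermi-type switching probability that rises when the realized pay-off falls short of the aspiration level; the pay-off functions, aspiration level, and selection intensity are illustrative assumptions rather than the paper's parameterization.

```python
import numpy as np

rng = np.random.default_rng(1)
N, d = 100, 5                  # population size; players per d-player game
beta, aspiration = 1.0, 2.0    # selection intensity; aspired pay-off

payoff_A = lambda k: 1.0 + 0.5 * k   # pay-offs given k co-players using A
payoff_B = lambda k: 2.0 - 0.2 * k

strategies = rng.integers(0, 2, N)   # 1 = A, 0 = B

for _ in range(100_000):
    focal = rng.integers(N)
    others = rng.choice(np.delete(np.arange(N), focal), d - 1, replace=False)
    k = strategies[others].sum()
    pi = payoff_A(k) if strategies[focal] else payoff_B(k)
    # switch more readily when the pay-off disappoints relative to aspiration
    p_switch = 1.0 / (1.0 + np.exp(beta * (pi - aspiration)))
    if rng.random() < p_switch:
        strategies[focal] = 1 - strategies[focal]

print("average abundance of strategy A:", strategies.mean())
```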

  11. Mechanics

    NASA Astrophysics Data System (ADS)

    Cox, John

    2014-05-01

    Part 1. The Winning of the Principles: 1. Introduction; 2. The beginnings of statics. Archimedes. Problem of the lever and of the centre of gravity; 2. Experimental verification and applications of the principle of the lever; 3. The centre of gravity; 4. The balance; 5. Stevinus of Bruges. The principle of the inclined plane; 6. The parallelogram of forces; 7. The principle of virtual work; 8. Review of the principles of statics; 9. The beginnings of dynamics. Galileo. The problem of falling bodies; 10. Huyghens. The problem of uniform motion in a circle. 'Centrifugal force'; 11. Final statement of the principles of dynamics. Extension to the motions of the heavenly bodies. The law of universal gravitation. Newton; Part II. Mathematical Statement of the Principles: Introduction; 12. Kinematics; 13. Kinetics of a particle moving in a straight line. The laws of motion; 14. Experimental verification of the laws of motion. Atwood's machine; 15. Work and energy; 16. The parallelogram law; 17. The composition and resolution of forces. Resultant. Component. Equilibrium; 18. Forces in one plane; 19. Friction; Part III. Application to Various Problems: 20. Motion on an inclined plane. Brachistochrones; 21. Projectiles; 22. Simple harmonic motion; 23. The simple pendulum; 24. Central forces. The law of gravitation; 25. Impact and impulsive forces; Part IV. The Elements of Rigid Dynamics: 26. The compound pendulum. Huyghens' solution; 27. D'Alembert's principle; 28. Moment of inertia; 29. Experimental determination of moments of inertia; 30. Determination of the value of gravity by Kater's pendulum; 31. The constant of gravitation, or weighing the Earth. The Cavendish experiment; Answers to the examples; Index.

  12. Age differences in strategy shift: retrieval avoidance or general shift reluctance?

    PubMed

    Frank, David J; Touron, Dayna R; Hertzog, Christopher

    2013-09-01

    Previous studies of metacognitive age differences in skill acquisition strategies have relied exclusively on tasks with a processing shift from an algorithm to retrieval strategy. Older adults' demonstrated reluctance to shift strategies in such tasks could reflect either a specific aversion to a memory retrieval strategy or a general, inertial resistance to strategy change. Haider and Frensch's (1999) alphabet verification task (AVT) affords a non-retrieval-based strategy shift. Participants verify the continuation of alphabet strings such as D E F G [4] L, with the bracketed digit indicating a number of letters to be skipped. When all deviations are restricted to the letter-digit-letter portion, participants can speed their responses by selectively attending to only that part of the stimulus. We adapted the AVT to include conditions that promoted shift to a retrieval strategy, a selective attention strategy, or both strategies. Item-level strategy reports were validated by eye movement data. Older adults shifted more slowly to the retrieval strategy but more quickly to the selective attention strategy than young adults, indicating a retrieval-strategy avoidance. Strategy confidence and perceived strategy difficulty correlated with shift to the two strategies in both age groups. Perceived speed of responses with each strategy specifically correlated with older adults' strategy choices, suggesting that some older adults avoid retrieval because they do not appreciate its efficiency benefits.
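    The task logic itself is easy to pin down in code. The sketch below verifies a string such as "D E F G [4] L", where the bracketed digit gives the number of skipped letters; the parsing details are assumptions for illustration.

```python
import re
import string

ALPHA = string.ascii_uppercase

def verify_alphabet_string(item: str) -> bool:
    """True if the run is consecutive and the final letter follows the skip."""
    letters = re.findall(r"[A-Z]", item)
    skip = int(re.search(r"\[(\d+)\]", item).group(1))
    pos = ALPHA.index(letters[0])
    for ch in letters[1:-1]:            # letters before the bracket: consecutive
        pos += 1
        if ALPHA[pos] != ch:
            return False
    return ALPHA.index(letters[-1]) == pos + skip + 1

print(verify_alphabet_string("D E F G [4] L"))   # True: H I J K are skipped
print(verify_alphabet_string("D E F G [4] M"))   # False: wrong continuation
```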

  13. Age Differences in Strategy Shift: Retrieval Avoidance or General Shift Reluctance?

    PubMed Central

    Frank, David J.; Touron, Dayna R.; Hertzog, Christopher

    2013-01-01

    Previous studies of metacognitive age differences in skill acquisition strategies have relied exclusively on tasks with a processing shift from an algorithm to retrieval strategy. Older adults’ demonstrated reluctance to shift strategies in such tasks could reflect either a specific aversion to a memory retrieval strategy or a general, inertial resistance to strategy change. Haider and Frensch’s (1999) alphabet verification task (AVT) affords a non-retrieval-based strategy shift. Participants verify the continuation of alphabet strings such as D E F G [4] L, with the bracketed digit indicating a number of letters to be skipped. When all deviations are restricted to the letter-digit-letter portion, participants can speed their responses by selectively attending only to that part of the stimulus. We adapted the AVT to include conditions that promoted shift to a retrieval strategy, a selective attention strategy, or both strategies. Item-level strategy reports were validated by eye movement data. Older adults shifted more slowly to the retrieval strategy but more quickly to the selective attention strategy than young adults, indicating a retrieval-strategy avoidance. Strategy confidence and perceived strategy difficulty correlated with shift to the two strategies in both age groups. Perceived speed of responses with each strategy specifically correlated with older adults’ strategy choices, suggesting that some older adults avoid retrieval because they do not appreciate its efficiency benefits. PMID:23088195

  14. Two Different Maintenance Strategies in the Hospital Environment: Preventive Maintenance for Older Technology Devices and Predictive Maintenance for Newer High-Tech Devices.

    PubMed

    Sezdi, Mana

    2016-01-01

    A maintenance program generated through the consideration of characteristics and failures of medical equipment is an important component of technology management. However, older technology devices and newer high-tech devices cannot be efficiently managed using the same strategies because of their different characteristics. This study aimed to generate a maintenance program comprising two different strategies to increase the efficiency of device management: preventive maintenance for older technology devices and predictive maintenance for newer high-tech devices. For preventive maintenance development, 589 older technology devices were subjected to performance verification and safety testing (PVST). For predictive maintenance development, the manufacturers' recommendations were used for 134 high-tech devices. These strategies were evaluated in terms of device reliability. This study recommends the use of two different maintenance strategies for old and new devices at hospitals in developing countries. Thus, older technology devices, which previously received only corrective maintenance, will be included in scheduled maintenance just as high-tech devices are.

  15. Two Different Maintenance Strategies in the Hospital Environment: Preventive Maintenance for Older Technology Devices and Predictive Maintenance for Newer High-Tech Devices

    PubMed Central

    Sezdi, Mana

    2016-01-01

    A maintenance program generated through the consideration of characteristics and failures of medical equipment is an important component of technology management. However, older technology devices and newer high-tech devices cannot be efficiently managed using the same strategies because of their different characteristics. This study aimed to generate a maintenance program comprising two different strategies to increase the efficiency of device management: preventive maintenance for older technology devices and predictive maintenance for newer high-tech devices. For preventive maintenance development, 589 older technology devices were subjected to performance verification and safety testing (PVST). For predictive maintenance development, the manufacturers' recommendations were used for 134 high-tech devices. These strategies were evaluated in terms of device reliability. This study recommends the use of two different maintenance strategies for old and new devices at hospitals in developing countries. Thus, older technology devices, which previously received only corrective maintenance, will be included in scheduled maintenance just as high-tech devices are. PMID:27195666

  16. [Development of Markov models for economics evaluation of strategies on hepatitis B vaccination and population-based antiviral treatment in China].

    PubMed

    Yang, P C; Zhang, S X; Sun, P P; Cai, Y L; Lin, Y; Zou, Y H

    2017-07-10

    Objective: To construct Markov models that reflect the reality of prevention and treatment interventions against hepatitis B virus (HBV) infection, simulate the natural history of HBV infection in different age groups, and provide evidence for economic evaluations of hepatitis B vaccination and population-based antiviral treatment in China. Methods: According to the theory and techniques of Markov chains, Markov models of the Chinese HBV epidemic were developed based on national data and related literature both at home and abroad, including the settings of Markov model states, allowable transitions, and initial and transition probabilities. Model construction, operation, and verification were conducted using the software TreeAge Pro 2015. Results: Several types of Markov models were constructed to describe the disease progression of HBV infection in the neonatal period, perinatal period, or adulthood; the progression of chronic hepatitis B after antiviral therapy; hepatitis B prevention and control in adults; chronic hepatitis B antiviral treatment; and the natural progression of chronic hepatitis B in the general population. The model for newborns was fundamental and included ten states: susceptibility to HBV, HBsAg clearance, immune tolerance, immune clearance, low replication, HBeAg-negative CHB, compensated cirrhosis, decompensated cirrhosis, hepatocellular carcinoma (HCC), and death. The HBV-susceptible state was excluded in the perinatal-period model, and the immune tolerance state was excluded in the adulthood model. The model for the general population included only two states, survival and death. Among the five types of models, 9 initial states were assigned initial probabilities, and 27 states were assigned transition probabilities. The results of model verification showed that the probability curves were basically consistent with the HBV epidemic situation in China. Conclusion: The Markov models developed can be used in economic evaluations of hepatitis B vaccination and treatment for the elimination of HBV infection in China, although the model structures and parameters involve uncertainty and dynamic change.
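    The cohort logic of such a model is compact: a row-stochastic transition matrix advanced year by year. The sketch below uses a reduced state set and invented annual probabilities purely for illustration; the paper's ten-state models and TreeAge parameterization are not reproduced here.

```python
import numpy as np

states = ["immune_tolerant", "immune_clearance", "low_replication",
          "cirrhosis", "HCC", "death"]
P = np.array([   # row = from-state, column = to-state, per year (invented)
    [0.90, 0.09, 0.00, 0.00, 0.00, 0.01],
    [0.00, 0.80, 0.15, 0.03, 0.01, 0.01],
    [0.00, 0.05, 0.90, 0.02, 0.01, 0.02],
    [0.00, 0.00, 0.00, 0.90, 0.05, 0.05],
    [0.00, 0.00, 0.00, 0.00, 0.70, 0.30],
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00],
])
assert np.allclose(P.sum(axis=1), 1.0)    # rows must be probability vectors

cohort = np.array([1.0, 0, 0, 0, 0, 0])   # everyone starts immune tolerant
for year in range(40):
    cohort = cohort @ P                   # advance the cohort one year
print(dict(zip(states, cohort.round(3))))
```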

  17. Dynamics modeling and loads analysis of an offshore floating wind turbine

    NASA Astrophysics Data System (ADS)

    Jonkman, Jason Mark

    The vast deepwater wind resource represents a potential to use offshore floating wind turbines to power much of the world with renewable energy. Many floating wind turbine concepts have been proposed, but dynamics models, which account for the wind inflow, aerodynamics, elasticity, and controls of the wind turbine, along with the incident waves, sea current, hydrodynamics, and platform and mooring dynamics of the floater, were needed to determine their technical and economic feasibility. This work presents the development of a comprehensive simulation tool for modeling the coupled dynamic response of offshore floating wind turbines, the verification of the simulation tool through model-to-model comparisons, and the application of the simulation tool to an integrated loads analysis for one of the promising system concepts. A fully coupled aero-hydro-servo-elastic simulation tool was developed with enough sophistication to address the limitations of previous frequency- and time-domain studies and to have the features required to perform loads analyses for a variety of wind turbine, support platform, and mooring system configurations. The simulation capability was tested using model-to-model comparisons. The favorable results of all of the verification exercises provided confidence to perform more thorough analyses. The simulation tool was then applied in a preliminary loads analysis of a wind turbine supported by a barge with catenary moorings. A barge platform was chosen because of its simplicity in design, fabrication, and installation. The loads analysis aimed to characterize the dynamic response and to identify potential loads and instabilities resulting from the dynamic couplings between the turbine and the floating barge in the presence of combined wind and wave excitation. The coupling between the wind turbine response and the barge-pitch motion, in particular, produced larger extreme loads in the floating turbine than experienced by an equivalent land-based turbine. Instabilities were also found in the system. The influence of conventional wind turbine blade-pitch control actions on the pitch damping of the floating turbine was also assessed. Design modifications for reducing the platform motions, improving the turbine response, and eliminating the instabilities are suggested. These suggestions are aimed at obtaining cost-effective designs that achieve favorable performance while maintaining structural integrity.

  18. Verification and rectification of the physical analogy of simulated annealing for the solution of the traveling salesman problem.

    PubMed

    Hasegawa, M

    2011-03-01

    The aim of the present study is to elucidate how simulated annealing (SA) works in its finite-time implementation, starting from the verification of its conventional optimization scenario based on equilibrium statistical mechanics. Two main experiments and one supplementary experiment, the design of which is inspired by concepts and methods developed for studies on liquids and glasses, are performed on two types of random traveling salesman problems. In the first experiment, a newly parameterized temperature schedule is introduced to simulate a quasistatic process along the scenario, and a parametric study is conducted to investigate the optimization characteristics of this adaptive cooling. In the second experiment, the search trajectory of the Metropolis algorithm (constant-temperature SA) is analyzed in the landscape paradigm in the hope of drawing a precise physical analogy by comparison with the corresponding dynamics of glass-forming molecular systems. These two experiments indicate that the effectiveness of finite-time SA comes not from equilibrium sampling at low temperature but from downward interbasin dynamics occurring before equilibrium. These dynamics work most effectively at an intermediate temperature varying with the total search time, and thus this effective temperature is identified using the Deborah number. To test directly the role of these relaxation dynamics in the process of cooling, a supplementary experiment is performed using another parameterized temperature schedule with a piecewise variable cooling rate, and the effect of this biased cooling is examined systematically. The results show that the optimization performance is not only dependent on but also sensitive to cooling in the vicinity of the above effective temperature and that this feature is interpreted as a consequence of the presence or absence of the workable interbasin dynamics. It is confirmed for the present instances that the effectiveness of finite-time SA derives from the glassy relaxation dynamics occurring in the "landscape-influenced" temperature regime and that its naive optimization scenario should be rectified by considering the analogy with vitrification phenomena. A comprehensive guideline for the design of finite-time SA and SA-related algorithms is discussed on the basis of this rectified analogy.
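    As a reference point for the schedules discussed above, the sketch below is the plain finite-time SA baseline on a random Euclidean TSP: 2-opt moves under geometric cooling. The instance size, cooling rate, and move budget are illustrative assumptions; the paper's parameterized schedules would replace the single `T *= alpha` line.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60
city = rng.random((n, 2))                     # random Euclidean TSP instance

def tour_length(order):
    return np.linalg.norm(city[order] - city[np.roll(order, 1)], axis=1).sum()

order = np.arange(n)
cur = tour_length(order)
best = cur
T, alpha = 1.0, 0.9999                        # geometric cooling schedule
for _ in range(100_000):
    i, j = sorted(rng.integers(0, n, 2))
    if i == j:
        continue
    cand = order.copy()
    cand[i:j + 1] = cand[i:j + 1][::-1]       # 2-opt segment reversal
    delta = tour_length(cand) - cur
    if delta < 0 or rng.random() < np.exp(-delta / T):   # Metropolis rule
        order, cur = cand, cur + delta
    best = min(best, cur)
    T *= alpha
print(f"best tour length found: {best:.3f}")
```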

  19. Identification of multiple novel protein biomarkers shed by human serous ovarian tumors into the blood of immunocompromised mice and verified in patient sera.

    PubMed

    Beer, Lynn A; Wang, Huan; Tang, Hsin-Yao; Cao, Zhijun; Chang-Wong, Tony; Tanyi, Janos L; Zhang, Rugang; Liu, Qin; Speicher, David W

    2013-01-01

    The most cancer-specific biomarkers in blood are likely to be proteins shed directly by the tumor rather than less specific inflammatory or other host responses. The use of xenograft mouse models together with in-depth proteome analysis for identification of human proteins in the mouse blood is an under-utilized strategy that can clearly identify proteins shed by the tumor. In the current study, 268 human proteins shed into mouse blood from human OVCAR-3 serous tumors were identified based upon human vs. mouse species differences using a four-dimensional plasma proteome fractionation strategy. A multi-step prioritization and verification strategy was subsequently developed to efficiently select some of the most promising biomarkers from this large number of candidates. A key step was parallel analysis of human proteins detected in the tumor supernatant, because substantially greater sequence coverage for many of the human proteins initially detected in the xenograft mouse plasma confirmed assignments as tumor-derived human proteins. Verification of candidate biomarkers in patient sera was facilitated by in-depth, label-free quantitative comparisons of serum pools from patients with ovarian cancer and benign ovarian tumors. The only proteins that advanced to multiple reaction monitoring (MRM) assay development were those that exhibited increases in ovarian cancer patients compared with benign tumor controls. MRM assays were facilely developed for all 11 novel biomarker candidates selected by this process and analysis of larger pools of patient sera suggested that all 11 proteins are promising candidate biomarkers that should be further evaluated on individual patient blood samples.

  20. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification...

  1. Applicability of SREM to the Verification of Management Information System Software Requirements. Volume I.

    DTIC Science & Technology

    1981-04-30

    However, SREM was not designed to harmonize these kinds of problems. Rather, it is a tool to investigate the logic of the processing specified in the... design. Supporting programs were also conducted to perform basic research into such areas as software reliability, static and dynamic validation techniques... development. - Maintain requirements development independent of the target machine and the eventual software design. - Allow for easy response to

  2. SWI 1.10 Testing Process

    NASA Technical Reports Server (NTRS)

    Stokes, LeBarian

    2009-01-01

    This procedure establishes a system for performing testing in the Six-Degree-Of-Freedom Dynamic Test System (SDTS). Testing includes development and verification testing of customer supplied Test Articles (TAs) and other testing requirements, as requested. This procedure applies to all SDTS testing operations and equipment. The procedure provides an overview of testing performed in the SDTS including test identification requirements, test planning and procedure development, test and performance inspection, test data analysis, and test report generation.

  3. Integrated Computer-Aided Manufacturing (ICAM) Architecture Part 2. Volume 6. Dynamics Modeling Manual (IDEF2)

    DTIC Science & Technology

    1981-06-01

    design of manufacturing systems, validation and verification of ICAM modules, integration of ICAM modules and the orderly transition of ICAM modules into... Function Model of "Manufacture Product" (MFGO) VIII - Composite Function Model of "Design Product" (DESIGNO) IX - Composite Information Model of... User Interface Requirements; and the Architecture of Design. This work was performed during the period of 29 September 1978 through 10

  4. Autonomy Community of Interest (COI) Test and Evaluation, Verification and Validation (TEVV) Working Group: Technology Investment Strategy 2015-2018

    DTIC Science & Technology

    2015-05-01

    In the past decade, unmanned systems have significantly impacted warfare... environments at a speed and scale beyond manned capability. However, current unmanned systems operate with minimal autonomy. To meet warfighter needs and

  5. Design Strategies to Mitigate Unsteady Forcing (Preprint)

    DTIC Science & Technology

    2008-04-01

    "Verification and Validation of CFD Simulation of Pulsating Laminar Flow in a Straight Pipe," AIAA Paper No. 2005-4863. [48] Guide for the... reduce the heat load to downstream components [41-44]. Although there is no effect on the potential field inside the vane row [45], there is... effect of design changes on the time-mean characteristics of the machine (e.g. aero-performance or heat load) or to estimate resonant stresses on

  6. Analysis of dynamics and fit of diving suits

    NASA Astrophysics Data System (ADS)

    Mahnic Naglic, M.; Petrak, S.; Gersak, J.; Rolich, T.

    2017-10-01

    This paper presents research on the dynamic behaviour and fit of customised diving suits. The diving suit models are developed using the 3D flattening method, which enables the construction of a garment model directly on the 3D computer body model, the separation of discrete 3D surfaces, and their transformation into 2D cutting parts. 3D body scanning of male and female test subjects was performed for the purpose of body measurement analysis in static and dynamic postures, and the processed body models were used for the construction and simulation of diving suit prototypes. All parameters necessary for 3D simulation, including the mechanical properties of the neoprene material, were applied to the obtained cutting parts. The developed computer diving suit prototypes were used for stretch analysis on areas relevant to body dimensional changes according to dynamic anthropometrics. Garment pressure against the body in static and dynamic conditions was also analysed. The garment patterns for which the computer prototype verification was conducted were used for real prototype production. The real prototypes were also used for stretch and pressure analysis in static and dynamic conditions. Based on the obtained results, a correlation analysis between body changes in dynamic positions and the dynamic stress determined on the computer and real prototypes was performed.

  7. Verification of experimental dynamic strength methods with atomistic ramp-release simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, Alexander P.; Brown, Justin L.; Lim, Hojun

    Material strength and moduli can be determined from dynamic high-pressure ramp-release experiments using an indirect method of Lagrangian wave profile analysis of surface velocities. This method, termed self-consistent Lagrangian analysis (SCLA), has been difficult to calibrate and corroborate with other experimental methods. Using nonequilibrium molecular dynamics, we validate the SCLA technique by demonstrating that it accurately predicts the same bulk modulus, shear modulus, and strength as those calculated from the full stress tensor data, especially where strain rate induced relaxation effects and wave attenuation are small. We show here that introducing a hold in the loading profile at peak pressure gives improved accuracy in the shear moduli and relaxation-adjusted strength by reducing the effect of wave attenuation. When rate-dependent effects coupled with wave attenuation are large, we find that Lagrangian analysis overpredicts the maximum unload wavespeed, leading to increased error in the measured dynamic shear modulus. Furthermore, these simulations provide insight into the definition of dynamic strength, as well as a plausible explanation for experimental disagreement in reported dynamic strength values.

  8. Verification of experimental dynamic strength methods with atomistic ramp-release simulations

    NASA Astrophysics Data System (ADS)

    Moore, Alexander P.; Brown, Justin L.; Lim, Hojun; Lane, J. Matthew D.

    2018-05-01

    Material strength and moduli can be determined from dynamic high-pressure ramp-release experiments using an indirect method of Lagrangian wave profile analysis of surface velocities. This method, termed self-consistent Lagrangian analysis (SCLA), has been difficult to calibrate and corroborate with other experimental methods. Using nonequilibrium molecular dynamics, we validate the SCLA technique by demonstrating that it accurately predicts the same bulk modulus, shear modulus, and strength as those calculated from the full stress tensor data, especially where strain rate induced relaxation effects and wave attenuation are small. We show here that introducing a hold in the loading profile at peak pressure gives improved accuracy in the shear moduli and relaxation-adjusted strength by reducing the effect of wave attenuation. When rate-dependent effects coupled with wave attenuation are large, we find that Lagrangian analysis overpredicts the maximum unload wavespeed, leading to increased error in the measured dynamic shear modulus. These simulations provide insight into the definition of dynamic strength, as well as a plausible explanation for experimental disagreement in reported dynamic strength values.

  9. Verification of experimental dynamic strength methods with atomistic ramp-release simulations

    DOE PAGES

    Moore, Alexander P.; Brown, Justin L.; Lim, Hojun; ...

    2018-05-04

    Material strength and moduli can be determined from dynamic high-pressure ramp-release experiments using an indirect method of Lagrangian wave profile analysis of surface velocities. This method, termed self-consistent Lagrangian analysis (SCLA), has been difficult to calibrate and corroborate with other experimental methods. Using nonequilibrium molecular dynamics, we validate the SCLA technique by demonstrating that it accurately predicts the same bulk modulus, shear modulus, and strength as those calculated from the full stress tensor data, especially where strain rate induced relaxation effects and wave attenuation are small. We show here that introducing a hold in the loading profile at peak pressure gives improved accuracy in the shear moduli and relaxation-adjusted strength by reducing the effect of wave attenuation. When rate-dependent effects coupled with wave attenuation are large, we find that Lagrangian analysis overpredicts the maximum unload wavespeed, leading to increased error in the measured dynamic shear modulus. Furthermore, these simulations provide insight into the definition of dynamic strength, as well as a plausible explanation for experimental disagreement in reported dynamic strength values.
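
    The core Lagrangian wave-profile step on which SCLA builds can be sketched as follows. This is a generic, simplified illustration with synthetic data, not the authors' analysis: the Lagrangian wavespeed at each particle-velocity level is estimated from arrival times at two sample thicknesses, then integrated to give stress and strain along the loading path.

        import numpy as np

        rho0 = 2700.0                      # initial density, kg/m^3 (assumed)
        h1, h2 = 0.5e-3, 1.0e-3            # sample thicknesses, m (assumed)
        u = np.linspace(0.0, 400.0, 200)   # particle-velocity levels, m/s

        # Synthetic arrival times t(u): a stiffening ramp wave whose wavespeed
        # increases with amplitude. Real inputs would be measured wave profiles.
        c_true = 5000.0 + 2.0 * u
        t1, t2 = h1 / c_true, h2 / c_true

        c_lagr = (h2 - h1) / (t2 - t1)                       # Lagrangian wavespeed c_L(u)
        sigma = rho0 * np.cumsum(c_lagr[:-1] * np.diff(u))   # sigma = rho0 * integral(c_L du)
        strain = np.cumsum(np.diff(u) / c_lagr[:-1])         # eps = integral(du / c_L)
        print(f"peak stress ~ {sigma[-1] / 1e9:.2f} GPa at strain {strain[-1]:.4f}")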

  10. Perspectives of human verification via binary QRS template matching of single-lead and 12-lead electrocardiogram.

    PubMed

    Krasteva, Vessela; Jekova, Irena; Schmid, Ramun

    2018-01-01

    This study aims to validate the 12-lead electrocardiogram (ECG) as a biometric modality based on two straightforward binary QRS template matching characteristics. Different perspectives of the human verification problem are considered, regarding the optimal lead selection and stability over sample size, gender, age, and heart rate (HR). A clinical 12-lead resting ECG database, including a population of 460 subjects with two-session recordings (>1 year apart), is used. Cost-effective strategies for extraction of personalized QRS patterns (100 ms) and binary template matching estimate similarity in the time scale (matching time) and dissimilarity in the amplitude scale (mismatch area). The two-class person verification task, taking the decision to validate or to reject the subject identity, is managed by linear discriminant analysis (LDA). Non-redundant LDA models for different lead configurations (I, II, III, aVR, aVL, aVF, V1-V6) are trained on the first half of 230 subjects by stepwise feature selection until maximization of the area under the receiver operating characteristic curve (ROC AUC). The operating point on the training ROC at equal error rate (EER) is tested on the independent dataset (second half of 230 subjects) to report unbiased validation of test-ROC AUC and true verification rate (TVR = 100-EER). The test results are further evaluated in groups by sample size, gender, age, and HR. The optimal QRS pattern projection for the single-lead ECG biometric modality is found in the frontal plane sector (60° to 0°), with the best (Test-AUC/TVR) for lead II (0.941/86.8%) and a slight accuracy drop for -aVR (-0.017/-1.4%) and I (-0.01/-1.5%). Chest ECG leads have degrading accuracy from V1 (0.885/80.6%) to V6 (0.799/71.8%). The multi-lead ECG improves verification: 6-chest (0.97/90.9%), 6-limb (0.986/94.3%), 12-lead (0.995/97.5%). The QRS pattern matching model shows stable performance for verification of 10 to 230 individuals, with insignificant degradation of TVR in women (1.2-3.6%), adults ≥70 years (3.7%), younger subjects <40 years (1.9%), HR <60 bpm (1.2%), and HR >90 bpm (3.9%), and no degradation for HR change (0 to >20 bpm).
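
    A rough sketch of what two such template-matching features could look like is given below. The feature definitions here (fraction of agreeing samples after binarization, and an amplitude mismatch integral) are guesses at the flavor of "matching time" and "mismatch area"; the paper's exact definitions may differ, and the synthetic QRS pulses stand in for real recordings.

        import numpy as np

        fs = 500                                    # sampling rate, Hz (assumed)
        t = np.arange(int(0.1 * fs)) / fs           # 100 ms QRS window

        def qrs(amp, width):                        # synthetic QRS-like pulse
            return amp * np.exp(-((t - 0.05) / width) ** 2)

        template = qrs(1.0, 0.008)                  # enrolled pattern
        probe = qrs(0.9, 0.009)                     # same subject: similar morphology

        def features(a, b):
            a_bin = a > 0.5 * a.max()               # binarize at half maximum
            b_bin = b > 0.5 * b.max()
            matching_time = np.mean(a_bin == b_bin)        # time-scale similarity
            mismatch_area = np.sum(np.abs(a - b)) / fs     # amplitude-scale dissimilarity
            return matching_time, mismatch_area

        mt, ma = features(template, probe)
        print(f"matching time: {mt:.3f}, mismatch area: {ma:.5f}")
        # An LDA classifier over such feature pairs then decides whether to
        # validate or reject the claimed identity.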

  11. Very fast road database verification using textured 3D city models obtained from airborne imagery

    NASA Astrophysics Data System (ADS)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV images. This algorithm contains two processes, which exchange input and output but basically run independently from each other. These processes are textured urban terrain reconstruction and road verification. The first process contains a dense photogrammetric reconstruction of the 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of road. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect) and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input, together with initial road database entries, for the road verification process. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map, followed by road map extraction by means of vectorization and filtering of the geometrically and topologically inconsistent objects. Depending on the time available and the availability of a geo-database for buildings, the urban terrain reconstruction procedure outputs semantic models of buildings, trees, and ground. Buildings and ground are textured by means of the available images. This facilitates orientation in the model and the interactive verification of the road objects that were initially classified as unknown. The three main modules of the texturing algorithm are pose estimation (if the videos are not geo-referenced), occlusion analysis, and texture synthesis.
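
    The fusion step described above can be sketched as follows; this is one plausible reading of the mapping, with the two method-level distributions combined so that the mass assigned to "unknown" equals the probability that the road model is not applicable. The actual Dempster-Shafer formulation in the paper may differ.

        # p_correct: method's belief that the road object is correct;
        # p_applicable: belief that the method's road model applies here.
        def fuse(p_correct, p_applicable):
            return {
                "correct": p_applicable * p_correct,
                "incorrect": p_applicable * (1.0 - p_correct),
                "unknown": 1.0 - p_applicable,     # inapplicable model -> ignorance
            }

        print(fuse(p_correct=0.9, p_applicable=0.8))
        # {'correct': 0.72, 'incorrect': 0.08, 'unknown': 0.2}
        # Objects landing mostly in "unknown" are passed to interactive
        # verification against the textured 3D model.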

  12. Development and Implementation of Dynamic Scripts to Support Local Model Verification at National Weather Service Weather Forecast Offices

    NASA Technical Reports Server (NTRS)

    Zavordsky, Bradley; Case, Jonathan L.; Gotway, John H.; White, Kristopher; Medlin, Jeffrey; Wood, Lance; Radell, Dave

    2014-01-01

    Local modeling with a customized configuration is conducted at National Weather Service (NWS) Weather Forecast Offices (WFOs) to produce high-resolution numerical forecasts that can better simulate local weather phenomena and complement larger scale global and regional models. The advent of the Environmental Modeling System (EMS), which provides a pre-compiled version of the Weather Research and Forecasting (WRF) model and wrapper Perl scripts, has enabled forecasters to easily configure and execute the WRF model on local workstations. NWS WFOs often use EMS output to help in forecasting highly localized, mesoscale features such as convective initiation, the timing and inland extent of lake effect snow bands, lake and sea breezes, and topographically-modified winds. However, quantitatively evaluating model performance to determine errors and biases still proves to be one of the challenges in running a local model. Developed at the National Center for Atmospheric Research (NCAR), the Model Evaluation Tools (MET) verification software makes performing these types of quantitative analyses easier, but operational forecasters do not generally have time to familiarize themselves with navigating the sometimes complex configurations associated with the MET tools. To assist forecasters in running a subset of MET programs and capabilities, the Short-term Prediction Research and Transition (SPoRT) Center has developed and transitioned a set of dynamic, easily configurable Perl scripts to collaborating NWS WFOs. The objective of these scripts is to provide SPoRT collaborating partners in the NWS with the ability to evaluate the skill of their local EMS model runs in near real time with little prior knowledge of the MET package. The ultimate goal is to make these verification scripts available to the broader NWS community in a future version of the EMS software. This paper provides an overview of the SPoRT MET scripts, instructions for how the scripts are run, and example use cases.
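
    For readers unfamiliar with the MET package, the snippet below illustrates the kind of point verification statistics it computes (bias, RMSE, and a contingency-table skill score). This is a self-contained toy on synthetic data; the SPoRT scripts wrap MET itself rather than reimplementing such statistics.

        import numpy as np

        rng = np.random.default_rng(1)
        obs = rng.gamma(2.0, 2.0, size=500)             # observed precipitation, mm (synthetic)
        fcst = obs + rng.normal(0.5, 1.5, size=500)     # forecast with bias and noise

        bias = np.mean(fcst - obs)
        rmse = np.sqrt(np.mean((fcst - obs) ** 2))

        thr = 5.0                                       # event threshold, mm
        hits = np.sum((fcst >= thr) & (obs >= thr))
        misses = np.sum((fcst < thr) & (obs >= thr))
        false_alarms = np.sum((fcst >= thr) & (obs < thr))
        csi = hits / (hits + misses + false_alarms)     # critical success index
        print(f"bias = {bias:.2f} mm, RMSE = {rmse:.2f} mm, CSI = {csi:.2f}")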

  13. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  14. Aerospace Nickel-cadmium Cell Verification

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.; Strawn, D. Michael; Hall, Stephen W.

    2001-01-01

    During the early years of satellites, NASA successfully flew "NASA-Standard" nickel-cadmium (Ni-Cd) cells manufactured by GE/Gates/SAFT on a variety of spacecraft. In 1992 a NASA Battery Review Board determined that the strategy of a NASA Standard Cell and Battery Specification and the accompanying NASA control of a standard manufacturing control document (MCD) for Ni-Cd cells and batteries was unwarranted. As a result of that determination, standards were abandoned and the use of cells other than the NASA Standard was required. In order to gain insight into the performance and characteristics of the various aerospace Ni-Cd products available, tasks were initiated within the NASA Aerospace Flight Battery Systems Program that involved the procurement and testing of representative aerospace Ni-Cd cell designs. A standard set of test conditions was established in order to provide similar information about the products from various vendors. The objective of this testing was to provide independent verification of representative commercial flight cells available in the marketplace today. This paper will provide a summary of the verification tests run on cells from various manufacturers: Sanyo 35 Ampere-hour (Ah) standard and 35 Ah advanced Ni-Cd cells, SAFT 50 Ah Ni-Cd cells, and Eagle-Picher 21 Ah Magnum and 21 Ah Super Ni-Cd cells were put through a full evaluation. A limited number of 18 and 55 Ah cells from Acme Electric were also tested to provide an initial evaluation of the Acme aerospace cell designs. Additionally, 35 Ah aerospace design Ni-MH cells from Sanyo were evaluated under the standard conditions established for this program. The test program is essentially complete. The cell design parameters, the verification test plan, and the details of the test results will be discussed.

  15. Development of Biomarkers for Screening Hepatocellular Carcinoma Using Global Data Mining and Multiple Reaction Monitoring

    PubMed Central

    Yu, Su Jong; Jang, Eun Sun; Yu, Jiyoung; Cho, Geunhee; Yoon, Jung-Hwan; Kim, Youngsoo

    2013-01-01

    Hepatocellular carcinoma (HCC) is one of the most common and aggressive cancers and is associated with a poor survival rate. Clinically, the level of alpha-fetoprotein (AFP) has been used as a biomarker for the diagnosis of HCC. The discovery of useful biomarkers for HCC, focused solely on the proteome, has been difficult; thus, wide-ranging global data mining of genomic and proteomic databases from previous reports would be valuable in screening biomarker candidates. Further, multiple reaction monitoring (MRM), based on triple quadrupole mass spectrometry, has been effective with regard to high-throughput verification, complementing antibody-based verification pipelines. In this study, global data mining was performed using 5 types of HCC data to screen for candidate biomarker proteins: cDNA microarray, copy number variation, somatic mutation, epigenetic, and quantitative proteomics data. Next, we applied MRM to verify HCC candidate biomarkers in individual serum samples from 3 groups: a healthy control group, patients who have been diagnosed with HCC (Before HCC treatment group), and HCC patients who underwent locoregional therapy (After HCC treatment group). After determining the relative quantities of the candidate proteins by MRM, we compared their expression levels between the 3 groups, identifying 4 potential biomarkers: the actin-binding protein anillin (ANLN), filamin-B (FLNB), complement C4-A (C4A), and AFP. The combination of 2 markers (ANLN, FLNB) improved the discrimination of the before HCC treatment group from the healthy control group compared with AFP. We conclude that the combination of global data mining and MRM verification enhances the screening and verification of potential HCC biomarkers. This efficacious integrative strategy is applicable to the development of markers for cancer and other diseases. PMID:23717429

  16. Development of biomarkers for screening hepatocellular carcinoma using global data mining and multiple reaction monitoring.

    PubMed

    Kim, Hyunsoo; Kim, Kyunggon; Yu, Su Jong; Jang, Eun Sun; Yu, Jiyoung; Cho, Geunhee; Yoon, Jung-Hwan; Kim, Youngsoo

    2013-01-01

    Hepatocellular carcinoma (HCC) is one of the most common and aggressive cancers and is associated with a poor survival rate. Clinically, the level of alpha-fetoprotein (AFP) has been used as a biomarker for the diagnosis of HCC. The discovery of useful biomarkers for HCC, focused solely on the proteome, has been difficult; thus, wide-ranging global data mining of genomic and proteomic databases from previous reports would be valuable in screening biomarker candidates. Further, multiple reaction monitoring (MRM), based on triple quadrupole mass spectrometry, has been effective with regard to high-throughput verification, complementing antibody-based verification pipelines. In this study, global data mining was performed using 5 types of HCC data to screen for candidate biomarker proteins: cDNA microarray, copy number variation, somatic mutation, epigenetic, and quantitative proteomics data. Next, we applied MRM to verify HCC candidate biomarkers in individual serum samples from 3 groups: a healthy control group, patients who have been diagnosed with HCC (Before HCC treatment group), and HCC patients who underwent locoregional therapy (After HCC treatment group). After determining the relative quantities of the candidate proteins by MRM, we compared their expression levels between the 3 groups, identifying 4 potential biomarkers: the actin-binding protein anillin (ANLN), filamin-B (FLNB), complement C4-A (C4A), and AFP. The combination of 2 markers (ANLN, FLNB) improved the discrimination of the before HCC treatment group from the healthy control group compared with AFP. We conclude that the combination of global data mining and MRM verification enhances the screening and verification of potential HCC biomarkers. This efficacious integrative strategy is applicable to the development of markers for cancer and other diseases.

  17. Requirements Formulation and Dynamic Jitter Analysis for Fourier-Kelvin Stellar Interferometer

    NASA Technical Reports Server (NTRS)

    Liu, Kuo-Chia; Hyde, Tristram; Blaurock, Carl; Bolognese, Jeff; Howard, Joseph; Danchi, William

    2004-01-01

    The Fourier-Kelvin Stellar Interferometer (FKSI) has been proposed to detect and characterize extrasolar giant planets. The baseline configuration for FKSI is a two-aperture, structurally connected nulling interferometer, capable of providing null depths less than 10^-4 in the infrared. The objective of this paper is to summarize the process for setting the top-level requirements and the jitter analysis performed on FKSI to date. The first part of the paper discusses the derivation of dynamic stability requirements necessary for meeting the FKSI nulling demands. An integrated model including structures, optics, and control systems has been developed to support dynamic jitter analysis and requirements verification. The second part of the paper describes how the integrated model is used to investigate the effects of reaction wheel disturbances on pointing and optical path difference stabilities.

  18. Prediction of turning stability using receptance coupling

    NASA Astrophysics Data System (ADS)

    Jasiewicz, Marcin; Powałka, Bartosz

    2018-01-01

    This paper addresses the prediction of machining stability for the dynamic "lathe - workpiece" system, evaluated using the receptance coupling method. The dynamic properties of the lathe components (the spindle and the tailstock) are assumed to be constant and can be determined experimentally from the results of an impact test. Hence, the variable element of the "machine tool - holder - workpiece" system is the machined part, which can be easily modelled analytically. The receptance coupling method enables a synthesis of the experimental (spindle, tailstock) and analytical (machined part) models, so impact testing of the entire system becomes unnecessary. The paper presents the methodology for synthesizing the analytical and experimental models, the evaluation of the stability lobes, and an experimental validation procedure involving both the determination of the dynamic properties of the system and cutting tests. In the summary, the experimental verification results are presented and discussed.
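
    The synthesis step can be illustrated with the standard rigid receptance-coupling formula for a single coupling coordinate, H_coupled = H_a - H_a (H_a + H_b)^-1 H_a. In the sketch below both substructures are single-degree-of-freedom oscillators with invented parameters; in the paper, the machine-side receptance would come from impact tests and the workpiece receptance from an analytical model.

        import numpy as np

        def sdof_receptance(w, m, c, k):
            return 1.0 / (-m * w**2 + 1j * c * w + k)

        w = np.linspace(1.0, 2000.0, 4000) * 2 * np.pi              # rad/s
        H_machine = sdof_receptance(w, m=50.0, c=4000.0, k=4.0e8)   # "measured" (assumed)
        H_part = sdof_receptance(w, m=2.0, c=60.0, k=8.0e7)         # analytical (assumed)

        # Rigid coupling at a point; for scalar receptances this reduces to the
        # series form H_a*H_b / (H_a + H_b).
        H_coupled = H_machine - H_machine * H_machine / (H_machine + H_part)

        peak_hz = w[np.argmax(np.abs(H_coupled))] / (2 * np.pi)
        print(f"dominant coupled resonance near {peak_hz:.0f} Hz")
        # The coupled receptance would then feed a standard stability-lobe calculation.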

  19. Coupled Facility-Payload Vibration Modeling Improvements

    NASA Technical Reports Server (NTRS)

    Carnahan, Timothy M.; Kaiser, Michael A.

    2015-01-01

    A major phase of aerospace hardware verification is vibration testing. The standard approach for such testing is to use a shaker to induce loads into the payload. In preparation for vibration testing at National Aeronautics and Space Administration/Goddard Space Flight Center, an analysis is performed to assess the responses of the payload. A new method of modeling the test is presented that takes into account dynamic interactions between the facility and the payload. This dynamic interaction has affected testing in the past, but has been ignored or adjusted for during testing. By modeling the combined dynamics of the facility and test article (payload), it is possible to improve the prediction of hardware responses. Many aerospace test facilities work in a similar way to those at NASA/Goddard Space Flight Center, so lessons learned here should be applicable to other test facilities with similar setups.

  20. A Flight Control Approach for Small Reentry Vehicles

    NASA Technical Reports Server (NTRS)

    Bevacqoa, Tim; Adams, Tony; Zhu, J. Jim; Rao, P. Prabhakara

    2004-01-01

    Flight control of small crew return vehicles during atmospheric reentry will be an important technology in any human space flight mission undertaken in the future. The control system presented in this paper is applicable to small crew return vehicles in which reaction control system (RCS) thrusters are the only actuators available for attitude control. The control system consists of two modules: (i) the attitude controller using the trajectory linearization control (TLC) technique, and (ii) the RCS control allocation module using a dynamic table-lookup technique. This paper describes the design and implementation of the TLC attitude control and the dynamic table-lookup RCS control allocation for nominal flight, along with design verification test results.

  1. ISS Microgravity Environment

    NASA Technical Reports Server (NTRS)

    Laible, Michael R.

    2011-01-01

    The microgravity performance assessment of the International Space Station (ISS) comprises quasi-steady, structural dynamic, and vibro-acoustic analyses of the ISS assembly-complete vehicle configuration. The Boeing Houston (BHOU) Loads and Dynamics Team is responsible for verifying compliance with the ISS System Specification (SSP 41000) and USOS Segment (SSP 41162) microgravity requirements. To verify the ISS environment, a series of accelerometers is on board to monitor the current environment. This paper summarizes the results of the analysis performed for the Verification Analysis Cycle (VAC)-Assembly Complete (AC) and compares it to on-orbit acceleration values currently being reported. The analysis includes the predicted maximum and average environment on board ISS during multiple activity scenarios.

  2. First Scientific Working Group Meeting of Airborne Doppler Lidar Wind Velocity Measurement Program

    NASA Technical Reports Server (NTRS)

    Kaufman, J. W. (Editor)

    1980-01-01

    The purpose of the first scientific working group meeting was fourfold: (1) to identify flight test options for engineering verification of the MSFC Doppler Lidar; (2) to identify flight test options for gathering data for scientific/technology applications; (3) to identify additional support equipment needed on the CV 990 aircraft for the flight tests; and (4) to identify postflight data processing and data sets requirements. The working group identified approximately ten flight options for gathering data on atmospheric dynamics processes, including turbulence, valley breezes, and thunderstorm cloud anvil and cold air outflow dynamics. These test options will be used as a basis for planning the fiscal year 1981 tests of the Doppler Lidar system.

  3. Flexible rotor dynamics analysis

    NASA Technical Reports Server (NTRS)

    Shen, F. A.

    1973-01-01

    A digital computer program was developed to analyze the general nonaxisymmetric and nonsynchronous transient and steady-state rotor dynamic performance of a bending- and shear-wise flexible rotor-bearing system under various operating conditions. The effects of rotor material mechanical hysteresis, rotor torsion flexibility, transverse effects of rotor axial and torsional loading, and the anisotropic, in-phase and out-of-phase bearing stiffness and damping force and moment coefficients were included in the program to broaden its capability. An optimum solution method was found and incorporated in the computer program. Computer simulation of experimental data was made and qualitative agreement observed. The mathematical formulations, computer program verification, test data simulation, and user instructions are presented and discussed.

  4. Equifinality in empirical studies of cultural transmission.

    PubMed

    Barrett, Brendan J

    2018-01-31

    Cultural systems exhibit equifinal behavior: a single final state may be arrived at via different mechanisms and/or from different initial states. Potential for equifinality exists in all empirical studies of cultural transmission, including controlled experiments, observational field research, and computational simulations. Acknowledging and anticipating the existence of equifinality is important in empirical studies of social learning and cultural evolution; it helps us understand the limitations of analytical approaches and can improve our ability to predict the dynamics of cultural transmission. Here, I illustrate and discuss examples of equifinality in studies of social learning, and how certain experimental designs might be prone to it. I then review examples of equifinality discussed in the social learning literature, namely the use of s-shaped diffusion curves to discern individual from social learning, and the operational definitions and analytical approaches used in studies of conformist transmission. While equifinality exists to some extent in all studies of social learning, I make suggestions for how to address instances of it, with an emphasis on using data simulation and methodological verification alongside modern statistical approaches that emphasize prediction and model comparison. In cases where evaluated learning mechanisms are equifinal due to non-methodological factors, I suggest that this is not always a problem if it helps us predict cultural change. In some cases, equifinal learning mechanisms might offer insight into how individual learning, social learning strategies, and other endogenous social factors might be important in structuring cultural dynamics and within- and between-group heterogeneity. Copyright © 2018 Elsevier B.V. All rights reserved.
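
    The s-shaped diffusion-curve example of equifinality lends itself to a quick simulation, sketched below with invented rates: a purely asocial process whose per-capita learning rate improves over time produces an adoption curve with the same slow-start, acceleration, and saturation shape as frequency-dependent social learning.

        import numpy as np

        rng = np.random.default_rng(0)
        N, T = 200, 120

        def diffusion(social):
            informed = np.zeros(N, dtype=bool)
            informed[:2] = True                       # two initial demonstrators
            traj = []
            for step in range(T):
                if social:
                    rate = 0.15 * informed.mean()     # frequency-dependent copying
                else:
                    rate = 0.0015 * step              # asocial learning improving over time
                informed |= rng.random(N) < rate
                traj.append(informed.mean())
            return np.array(traj)

        for label, curve in [("social", diffusion(True)), ("asocial", diffusion(False))]:
            steps = [int(np.argmax(curve >= q)) for q in (0.1, 0.5, 0.9)]
            print(f"{label}: reaches 10/50/90% adoption at steps {steps}")
        # Both curves are s-shaped, so curve shape alone cannot identify the mechanism.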

  5. Dynamics and Control of Newtonian and Viscoelastic Fluids

    NASA Astrophysics Data System (ADS)

    Lieu, Binh K.

    Transition to turbulence represents one of the most intriguing natural phenomena. Flows that are smooth and ordered may become complex and disordered as the flow strength increases. This process is known as transition to turbulence. In this dissertation, we develop theoretical and computational tools for analysis and control of transition and turbulence in shear flows of Newtonian, such as air and water, and complex viscoelastic fluids, such as polymers and molten plastics. Part I of the dissertation is devoted to the design and verification of sensor-free and feedback-based strategies for controlling the onset of turbulence in channel flows of Newtonian fluids. We use high fidelity simulations of the nonlinear flow dynamics to demonstrate the effectiveness of our model-based approach to flow control design. In Part II, we utilize systems theoretic tools to study transition and turbulence in channel flows of viscoelastic fluids. For flows with strong elastic forces, we demonstrate that flow fluctuations can experience significant amplification even in the absence of inertia. We use our theoretical developments to uncover the underlying physical mechanism that leads to this high amplification. For turbulent flows with polymer additives, we develop a model-based method for analyzing the influence of polymers on drag reduction. We demonstrate that our approach predicts drag reducing trends observed in full-scale numerical simulations. In Part III, we develop mathematical framework and computational tools for calculating frequency responses of spatially distributed systems. Using state-of-the-art automatic spectral collocation techniques and new integral formulation, we show that our approach yields more reliable and accurate solutions than currently available methods.
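
    The frequency-response machinery referred to in Part III has a familiar finite-dimensional analogue, sketched below: for a state-space system, H(iw) = C (iwI - A)^(-1) B, and the amplification of fluctuations is read off from |H| over frequency. The dissertation's contribution concerns the much harder spatially distributed (PDE) case via spectral collocation; the toy oscillator here only illustrates the quantity being computed.

        import numpy as np

        A = np.array([[0.0, 1.0], [-4.0, -0.4]])     # lightly damped oscillator (assumed)
        B = np.array([[0.0], [1.0]])
        C = np.array([[1.0, 0.0]])

        omegas = np.linspace(0.1, 10.0, 500)
        gains = [abs((C @ np.linalg.solve(1j * w * np.eye(2) - A, B)).item())
                 for w in omegas]
        w_peak = omegas[int(np.argmax(gains))]
        print(f"peak amplification {max(gains):.2f} near omega = {w_peak:.2f} rad/s")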

  6. Gender dimorphic ACL strain in response to combined dynamic 3D knee joint loading: implications for ACL injury risk.

    PubMed

    Mizuno, Kiyonori; Andrish, Jack T; van den Bogert, Antonie J; McLean, Scott G

    2009-12-01

    While gender-based differences in knee joint anatomies and laxities are well documented, the potential for them to precipitate gender-dimorphic ACL loading and resultant injury risk has not been considered. To this end, we generated gender-specific models of ACL strain as a function of any six-degree-of-freedom (6DOF) knee joint load state via a combined cadaveric and analytical approach. Continuously varying joint forces and torques were applied to five male and five female cadaveric specimens and recorded along with synchronous knee flexion and ACL strain data. All data (approximately 10,000 samples) were submitted to specimen-specific regression analyses, affording ACL strain predictions as a function of the combined 6DOF knee loads. Following individual model verifications, generalized gender-specific models were generated and subjected to 6DOF external load scenarios consistent with both a clinical examination and a dynamic sports maneuver. The ensuing model-based strain predictions were subsequently examined for gender-based discrepancies. Male and female specimen-specific models predicted ACL strain within 0.51%+/-0.10% and 0.52%+/-0.07% of the measured data, respectively, and explained more than 75% of the associated variance in each case. Predicted female ACL strains were also significantly larger than the respective male values for both simulated 6DOF load scenarios. Outcomes suggest that the female ACL will rupture in response to comparatively smaller external load applications. Future work must address the underlying anatomical/laxity contributions to knee joint mechanical and resultant ACL loading, ultimately affording prevention strategies that may cater to individual joint vulnerabilities.
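
    The regression idea behind the specimen-specific models can be sketched with synthetic data, as below. The paper's models need not be linear and its load states came from cadaver testing; here a linear map from standardized 6DOF loads to percent strain is simply assumed for illustration.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 10_000
        loads = rng.normal(size=(n, 6))              # Fx, Fy, Fz, Mx, My, Mz (standardized)
        true_coef = np.array([0.2, 0.05, 0.6, 0.3, 0.1, 0.4])
        strain = loads @ true_coef + 1.5 + rng.normal(0.0, 0.5, n)   # percent strain

        X = np.hstack([loads, np.ones((n, 1))])      # add an intercept column
        coef, *_ = np.linalg.lstsq(X, strain, rcond=None)
        pred = X @ coef
        r2 = 1 - np.sum((strain - pred) ** 2) / np.sum((strain - strain.mean()) ** 2)
        print(f"fitted coefficients: {np.round(coef, 2)}, R^2 = {r2:.2f}")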

  7. Dynamic Multiple Work Stealing Strategy for Flexible Load Balancing

    NASA Astrophysics Data System (ADS)

    Adnan; Sato, Mitsuhisa

    Lazy-task creation is an efficient method of overcoming the overhead of the grain-size problem in parallel computing. Work stealing is an effective load balancing strategy for parallel computing. In this paper, we present dynamic work stealing strategies in a lazy-task creation technique for efficient fine-grain task scheduling. The basic idea is to control load balancing granularity depending on the number of task parents in a stack. The dynamic-length strategy of work stealing uses run-time information, which is information on the load of the victim, to determine the number of tasks that a thief is allowed to steal. We compare it with the bottommost first work stealing strategy used in StackThread/MP, and the fixed-length strategy of work stealing, where a thief requests to steal a fixed number of tasks, as well as other multithreaded frameworks such as Cilk and OpenMP task implementations. The experiments show that the dynamic-length strategy of work stealing performs well in irregular workloads such as in UTS benchmarks, as well as in regular workloads such as Fibonacci, Strassen's matrix multiplication, FFT, and Sparse-LU factorization. The dynamic-length strategy works better than the fixed-length strategy because it is more flexible than the latter; this strategy can avoid load imbalance due to overstealing.
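
    The three steal-amount policies compared above can be reduced to a few lines, as in the sketch below; the "steal half of the victim's tasks" rule is one plausible instance of a dynamic-length policy, not necessarily the exact rule used in the paper.

        from collections import deque

        def steal(victim: deque, strategy: str, fixed_len: int = 4):
            if not victim:
                return []
            if strategy == "bottommost":        # take the single oldest task
                k = 1
            elif strategy == "fixed":           # take a constant number of tasks
                k = min(fixed_len, len(victim))
            else:                               # "dynamic": scale with victim load
                k = max(1, len(victim) // 2)
            return [victim.popleft() for _ in range(k)]   # steal oldest tasks first

        victim = deque(range(10))               # tasks 0..9; 0 is the oldest
        print(steal(victim, "dynamic"))         # -> [0, 1, 2, 3, 4]
        print(len(victim), "tasks left on the victim")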

  8. Diversifying selection in the wheat stem rust fungus acts predominantly on pathogen-associated gene families and reveals candidate effectors

    PubMed Central

    Sperschneider, Jana; Ying, Hua; Dodds, Peter N.; Gardiner, Donald M.; Upadhyaya, Narayana M.; Singh, Karam B.; Manners, John M.; Taylor, Jennifer M.

    2014-01-01

    Plant pathogens cause severe losses to crop plants and threaten global food production. One striking example is the wheat stem rust fungus, Puccinia graminis f. sp. tritici, which can rapidly evolve new virulent pathotypes in response to resistant host lines. Like several other filamentous fungal and oomycete plant pathogens, its genome features expanded gene families that have been implicated in host-pathogen interactions, possibly encoding effector proteins that interact directly with target host defense proteins. Previous efforts to understand virulence largely relied on the prediction of secreted, small and cysteine-rich proteins as candidate effectors and thus delivered an overwhelming number of candidates. Here, we implement an alternative analysis strategy that uses the signal of adaptive evolution as a line of evidence for effector function, combined with comparative information and expression data. We demonstrate that in planta up-regulated genes that are rapidly evolving are found almost exclusively in pathogen-associated gene families, affirming the impact of host-pathogen co-evolution on genome structure and the adaptive diversification of specialized gene families. In particular, we predict 42 effector candidates that are conserved only across pathogens, induced during infection and rapidly evolving. One of our top candidates has recently been shown to induce genotype-specific hypersensitive cell death in wheat. This shows that comparative genomics incorporating the evolutionary signal of adaptation is powerful for predicting effector candidates for laboratory verification. Our system can be applied to a wide range of pathogens and will give insight into host-pathogen dynamics, ultimately leading to progress in strategies for disease control. PMID:25225496
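
    The triage logic combining the three lines of evidence can be expressed as a simple filter, sketched below with made-up gene records, field names, and thresholds.

        genes = [
            {"id": "gene_A", "in_planta_fc": 8.2, "dn_ds": 1.9, "pathogen_only": True},
            {"id": "gene_B", "in_planta_fc": 0.7, "dn_ds": 2.3, "pathogen_only": True},
            {"id": "gene_C", "in_planta_fc": 5.1, "dn_ds": 0.4, "pathogen_only": False},
        ]
        # Keep genes induced in planta, under diversifying selection (dN/dS > 1),
        # and conserved only across pathogens.
        candidates = [g["id"] for g in genes
                      if g["in_planta_fc"] > 2.0 and g["dn_ds"] > 1.0 and g["pathogen_only"]]
        print(candidates)   # -> ['gene_A']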

  9. The Iterative Design Process in Research and Development: A Work Experience Paper

    NASA Technical Reports Server (NTRS)

    Sullivan, George F. III

    2013-01-01

    The iterative design process is one of many strategies used in new product development. Top-down development strategies, like waterfall development, place a heavy emphasis on planning and simulation. The iterative process, on the other hand, is better suited to the management of small to medium scale projects. Over the past four months, I have worked with engineers at Johnson Space Center on a multitude of electronics projects. By describing the work I have done these last few months, analyzing the factors that have driven design decisions, and examining the testing and verification process, I will demonstrate that iterative design is the obvious choice for research and development projects.

  10. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Final Technical Report, University of Washington, March 2016; dates covered: June 2012 to September 2015. Over the more than three years of the project Verification Games: Crowd-sourced...

  11. 6DOF Testing of the SLS Inertial Navigation Unit

    NASA Technical Reports Server (NTRS)

    Geohagan, Kevin; Bernard, Bill; Oliver, T. Emerson; Leggett, Jared; Strickland, Dennis

    2018-01-01

    The Navigation System on the NASA Space Launch System (SLS) Block 1 vehicle performs initial alignment of the Inertial Navigation System (INS) navigation frame through gyrocompass alignment (GCA). Because the navigation architecture for the SLS Block 1 vehicle is a purely inertial system, the accuracy of the achieved orbit relative to mission requirements is very sensitive to initial alignment accuracy. The assessment of this sensitivity, and many others, via simulation is a part of the SLS Model-Based Design and Model-Based Requirements approach. As a part of the aforementioned, 6DOF Monte Carlo simulation is used in large part to develop and demonstrate verification of program requirements. To facilitate this and the GN&C flight software design process, an SLS-Program-controlled Design Math Model (DMM) of the SLS INS was developed by the SLS Navigation Team. The SLS INS model implements all of the key functions of the hardware, namely GCA, inertial navigation, and FDIR (Fault Detection, Isolation, and Recovery), in support of SLS GN&C design requirements verification. Despite the strong sensitivity to initial alignment, GCA accuracy requirements were not verified by test due to program cost and schedule constraints. Instead, the system relies upon assessments performed using the SLS INS model. In order to verify SLS program requirements by analysis, the SLS INS model is verified and validated against flight hardware. In lieu of direct testing of GCA accuracy in support of requirement verification, the SLS Navigation Team proposed and conducted an engineering test to, among other things, validate the GCA performance and overall behavior of the SLS INS model through comparison with test data. This paper details dynamic hardware testing of the SLS INS, conducted by the SLS Navigation Team at Marshall Space Flight Center's 6DOF Table Facility, in support of GCA performance characterization and INS model validation. A 6DOF motion platform was used to produce 6DOF pad twist and sway dynamics while a simulated SLS flight computer communicated with the INS. Tests conducted include an evaluation of GCA algorithm robustness to increasingly dynamic pad environments, an examination of GCA algorithm stability and accuracy over long durations, and a long-duration static test to gather enough data for Allan variance analysis. Test setup, execution, and data analysis are discussed, including analysis performed in support of SLS INS model validation.
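
    The Allan variance analysis mentioned for the long-duration static test can be sketched as below: integrate the rate signal to angle and evaluate the overlapping Allan variance over a range of cluster times. The gyro signal here is synthetic white noise plus a bias; for white rate noise the Allan deviation should fall off as 1/sqrt(tau).

        import numpy as np

        fs = 100.0                                    # sample rate, Hz (assumed)
        n = 200_000
        rng = np.random.default_rng(7)
        rate = 0.01 + 0.05 * rng.standard_normal(n)   # deg/s: bias + white noise

        theta = np.cumsum(rate) / fs                  # integrated angle, deg
        for m in (10, 30, 100, 300, 1000, 3000):      # cluster sizes, samples
            tau = m / fs
            d = theta[2 * m:] - 2 * theta[m:-m] + theta[:-2 * m]
            avar = np.sum(d ** 2) / (2 * tau ** 2 * d.size)   # overlapping Allan variance
            print(f"tau = {tau:6.1f} s  Allan deviation = {np.sqrt(avar):.5f} deg/s")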

  12. Uncertainty analysis as essential step in the establishment of the dynamic Design Space of primary drying during freeze-drying.

    PubMed

    Mortier, Séverine Thérèse F C; Van Bockstal, Pieter-Jan; Corver, Jos; Nopens, Ingmar; Gernaey, Krist V; De Beer, Thomas

    2016-06-01

    Large molecules, such as biopharmaceuticals, are considered the key driver of growth for the pharmaceutical industry. Freeze-drying is the preferred way to stabilise these products when needed. However, it is an expensive, inefficient, time- and energy-consuming process. During freeze-drying, there are only two main process variables to be set, i.e. the shelf temperature and the chamber pressure, preferably in a dynamic way. This manuscript focuses on the essential use of uncertainty analysis for the determination and experimental verification of the dynamic primary drying Design Space for pharmaceutical freeze-drying. Traditionally, the chamber pressure and shelf temperature are kept constant during primary drying, leading to less optimal process conditions. In this paper it is demonstrated how a mechanistic model of the primary drying step gives the opportunity to determine the optimal dynamic values for both process variables during processing, resulting in a dynamic Design Space with a well-known risk of failure. This allows running the primary drying process step as time-efficiently as possible, thereby guaranteeing that the temperature at the sublimation front does not exceed the collapse temperature. The Design Space is the multidimensional combination and interaction of input variables and process parameters leading to the expected product specifications with a controlled (i.e., high) probability. Therefore, inclusion of parameter uncertainty is an essential part of the definition of the Design Space, although it is often neglected. To quantitatively assess the inherent uncertainty on the parameters of the mechanistic model, an uncertainty analysis was performed to establish the borders of the dynamic Design Space, i.e. a time-varying shelf temperature and chamber pressure, associated with a specific risk of failure. A risk of failure acceptance level of 0.01%, i.e. a 'zero-failure' situation, results in an increased primary drying process time compared to the deterministic dynamic Design Space; however, the risk of failure is under control. Experimental verification revealed that only a risk of failure acceptance level of 0.01% yielded a guaranteed zero-defect quality end-product. The computed process settings with a risk of failure acceptance level of 0.01% reduced the primary drying time by more than half in comparison with a regular, conservative cycle with fixed settings. Copyright © 2016. Published by Elsevier B.V.
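
    The role of the uncertainty analysis can be illustrated with a deliberately crude Monte Carlo sketch: sample the uncertain model parameters, predict the sublimation-front temperature, and accept a shelf-temperature setting only while the estimated probability of exceeding the collapse temperature stays below the 0.01% acceptance level. The linear "front-temperature model" and all numbers below are placeholders, not the mechanistic model of the paper.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 200_000                                   # enough samples to resolve ~1e-4 risks
        T_collapse = -32.0                            # collapse temperature, degC (assumed)
        Kv = rng.normal(20.0, 2.0, n)                 # vial heat-transfer coeff., uncertain
        Rp = rng.normal(1.0, 0.15, n)                 # product resistance, uncertain

        def failure_risk(T_shelf):
            # crude stand-in: warmer shelf, higher Kv, lower Rp raise the front temperature
            T_front = -45.0 + 0.55 * (T_shelf + 20.0) + 0.25 * (Kv - 20.0) - 3.0 * (Rp - 1.0)
            return np.mean(T_front > T_collapse)

        best = None
        for T_shelf in np.arange(-10.0, 25.0, 2.5):
            if failure_risk(T_shelf) > 1e-4:          # 0.01% risk-of-failure acceptance level
                break
            best = T_shelf
        print(f"highest shelf temperature with estimated risk <= 0.01%: {best} degC")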

  13. The Immuno-Dynamics of Conflict Intervention in Social Systems

    PubMed Central

    Krakauer, David C.; Page, Karen; Flack, Jessica

    2011-01-01

    We present statistical evidence and dynamical models for the management of conflict and a division of labor (task specialization) in a primate society. Two broad intervention strategy classes are observed: a dyadic strategy (pacifying interventions) and a triadic strategy (policing interventions). These strategies, their respective degrees of specialization, and their consequences for conflict dynamics can be captured through empirically grounded mathematical models inspired by immuno-dynamics. The spread of aggression, analogous to the proliferation of pathogens, is an epidemiological problem. We show analytically and computationally that policing is an efficient strategy, as it requires only a small proportion of a population to police to reduce conflict contagion. Policing, but not pacifying, is capable of effectively eliminating conflict. These results suggest that, despite implementation differences, there might be universal features of conflict management mechanisms for reducing contagion-like dynamics that apply across biological and social levels. Our analyses further suggest that it can be profitable to conceive of conflict management strategies at the behavioral level as mechanisms of social immunity. PMID:21887221

  14. The immuno-dynamics of conflict intervention in social systems.

    PubMed

    Krakauer, David C; Page, Karen; Flack, Jessica

    2011-01-01

    We present statistical evidence and dynamical models for the management of conflict and a division of labor (task specialization) in a primate society. Two broad intervention strategy classes are observed: a dyadic strategy (pacifying interventions) and a triadic strategy (policing interventions). These strategies, their respective degrees of specialization, and their consequences for conflict dynamics can be captured through empirically grounded mathematical models inspired by immuno-dynamics. The spread of aggression, analogous to the proliferation of pathogens, is an epidemiological problem. We show analytically and computationally that policing is an efficient strategy, as it requires only a small proportion of a population to police to reduce conflict contagion. Policing, but not pacifying, is capable of effectively eliminating conflict. These results suggest that, despite implementation differences, there might be universal features of conflict management mechanisms for reducing contagion-like dynamics that apply across biological and social levels. Our analyses further suggest that it can be profitable to conceive of conflict management strategies at the behavioral level as mechanisms of social immunity.
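
    The contagion analogy invites a toy simulation, sketched below with illustrative rates that are not fit to the primate data: aggression spreads like an infection, and policing intercepts transmission (here via an assumed exponential interception factor). A small policing fraction pushes the effective transmission rate below the recovery rate, so the outbreak cannot take off, which is the qualitative result of the paper.

        import math

        def peak_aggression(p_police, beta=0.5, gamma=0.2, k=20.0, days=300, dt=0.05):
            S, I = 0.99, 0.01                           # susceptible / aggressive fractions
            beta_eff = beta * math.exp(-k * p_police)   # policing intercepts spread
            peak = I
            for _ in range(int(days / dt)):
                new = beta_eff * S * I
                S -= new * dt
                I += (new - gamma * I) * dt
                peak = max(peak, I)
            return peak

        for p in (0.0, 0.05, 0.10, 0.20):
            print(f"policing fraction {p:.2f}: peak aggression {peak_aggression(p):.3f}")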

  15. Enabling Dynamic Security Management of Networked Systems via Device-Embedded Security (Self-Securing Devices)

    DTIC Science & Technology

    2007-01-15

    it can detect specifically proscribed content changes to critical files (e.g., illegal shells inserted into /etc/passwd). Fourth, it can detect the... UNIX password management involves a pair of inter-related files (/etc/passwd and /etc/shadow). The corresponding access patterns seen at the storage... content integrity verification is utilized. As a concrete example, consider a UNIX system password file (/etc/passwd), which consists of a set of well

  16. Statistical Moments in Variable Density Incompressible Mixing Flows

    DTIC Science & Technology

    2015-08-28

    front tracking method: Verification and application to simulation of the primary breakup of a liquid jet. SIAM J. Sci. Comput., 33:1505–1524, 2011. [15... elliptic problem. In case of failure, the Generalized Minimal Residual (GMRES) method [78] is used instead. Then update face velocities as follows: u^(n+1)... of the ACM Solid and Physical Modeling Symposium, pages 159–170, 2008. [51] D. D. Joseph. Fluid dynamics of two miscible liquids with diffusion and

  17. [Model for unplanned self extubation of ICU patients using system dynamics approach].

    PubMed

    Song, Yu Gil; Yun, Eun Kyoung

    2015-04-01

    In this study a system dynamics methodology was used to identify correlations and nonlinear feedback structure among factors affecting unplanned extubation (UE) of ICU patients and to construct and verify a simulation model. Factors affecting UE were identified through a theoretical background established by reviewing the literature and preceding studies and by referencing various statistical data. Related variables were decided through verification of content validity by an expert group. A causal loop diagram (CLD) was made based on the variables. Stock and flow modeling using Vensim PLE Plus Version 6.0b was performed to establish a model for UE. Based on the literature review and expert verification, 18 variables associated with UE were identified and the CLD was prepared. From the prepared CLD, a model was developed by converting it to a stock and flow diagram. Results of the simulation showed that patient stress, patient agitation, restraint application, patient movability, and individual intensive nursing were the variables with the greatest effect on UE probability. To verify agreement of the UE model with real situations, simulation with 5 cases was performed. An equation check and a sensitivity analysis on TIME STEP were executed to validate model integrity. The results show that identification of a proper model enables prediction of UE probability. This prediction allows for adjustment of related factors and provides basic data to develop nursing interventions to decrease UE.
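
    A stock-and-flow model of this kind reduces to coupled difference equations; the sketch below shows the flavor with invented variables and coefficients (it is not the authors' 18-variable Vensim model). Agitation is a stock fed by stress and drained by intensive nursing, and the cumulative UE probability grows with an agitation-driven hazard that restraints attenuate.

        def simulate_ue(stress, nursing, restraint, days=7, dt=0.1):
            agitation, ue_prob = 0.2, 0.0
            for _ in range(int(days / dt)):
                inflow = 0.5 * stress                      # stress builds agitation
                outflow = 0.6 * nursing * agitation        # nursing calms the patient
                agitation += (inflow - outflow) * dt
                hazard = 0.05 * agitation * (1.0 - 0.7 * restraint)
                ue_prob += (1.0 - ue_prob) * hazard * dt   # cumulative UE probability
            return ue_prob

        print(f"high stress, low nursing:  {simulate_ue(0.9, 0.2, 0.0):.2f}")
        print(f"high stress, high nursing: {simulate_ue(0.9, 0.9, 0.5):.2f}")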

  18. Dynamic knowledge representation using agent-based modeling: ontology instantiation and verification of conceptual models.

    PubMed

    An, Gary

    2009-01-01

    The sheer volume of biomedical research threatens to overwhelm the capacity of individuals to effectively process this information. Adding to this challenge is the multiscale nature of both biological systems and the research community as a whole. Given this volume and rate of generation of biomedical information, the research community must develop methods for robust representation of knowledge in order for individuals, and the community as a whole, to "know what they know." Despite increasing emphasis on "data-driven" research, the fact remains that researchers guide their research using intuitively constructed conceptual models derived from knowledge extracted from publications, knowledge that is generally qualitatively expressed using natural language. Agent-based modeling (ABM) is a computational modeling method that is suited to translating the knowledge expressed in biomedical texts into dynamic representations of the conceptual models generated by researchers. The hierarchical object-class orientation of ABM maps well to biomedical ontological structures, facilitating the translation of ontologies into instantiated models. Furthermore, ABM is suited to producing the nonintuitive behaviors that often "break" conceptual models. Verification in this context is focused at determining the plausibility of a particular conceptual model, and qualitative knowledge representation is often sufficient for this goal. Thus, utilized in this fashion, ABM can provide a powerful adjunct to other computational methods within the research process, as well as providing a metamodeling framework to enhance the evolution of biomedical ontologies.
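
    The mapping from ontology classes to agent classes can be made concrete with a minimal sketch; the cell types and the qualitative rule below ("activated macrophages secrete TNF; sustained TNF inflames endothelium") are placeholders showing how a natural-language statement becomes an executable, inspectable rule.

        import random

        class Cell:                                  # ontology root class
            def step(self, env): pass

        class Macrophage(Cell):                      # subclass mirrors ontology hierarchy
            def __init__(self): self.activated = False
            def step(self, env):
                if env["pathogen"] and random.random() < 0.5:
                    self.activated = True
                if self.activated:
                    env["TNF"] += 1                  # qualitative rule made executable

        class Endothelium(Cell):
            def __init__(self): self.inflamed = False
            def step(self, env):
                if env["TNF"] > 5:
                    self.inflamed = True

        random.seed(1)
        env = {"pathogen": True, "TNF": 0}
        agents = [Macrophage() for _ in range(10)] + [Endothelium()]
        for _ in range(5):
            for a in agents:
                a.step(env)
        print("TNF:", env["TNF"], "| endothelium inflamed:", agents[-1].inflamed)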

  19. Practical Formal Verification of MPI and Thread Programs

    NASA Astrophysics Data System (ADS)

    Gopalakrishnan, Ganesh; Kirby, Robert M.

    Large-scale simulation codes in science and engineering are written using the Message Passing Interface (MPI). Shared memory threads are widely used directly, or to implement higher level programming abstractions. Traditional debugging methods for MPI or thread programs are incapable of providing useful formal guarantees about coverage. They get bogged down in the sheer number of interleavings (schedules), often missing shallow bugs. In this tutorial we will introduce two practical formal verification tools: ISP (for MPI C programs) and Inspect (for Pthread C programs). Unlike other formal verification tools, ISP and Inspect run directly on user source codes (much like a debugger). They pursue only the relevant set of process interleavings, using our own customized Dynamic Partial Order Reduction algorithms. For a given test harness, DPOR allows these tools to guarantee the absence of deadlocks, instrumented MPI object leaks and communication races (using ISP), and shared memory races (using Inspect). ISP and Inspect have been used to verify large pieces of code: in excess of 10,000 lines of MPI/C for ISP in under 5 seconds, and about 5,000 lines of Pthread/C code in a few hours (and much faster with the use of a cluster or by exploiting special cases such as symmetry) for Inspect. We will also demonstrate the Microsoft Visual Studio and Eclipse Parallel Tools Platform integrations of ISP (these will be available on the LiveCD).
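
    The interleaving explosion that ISP and Inspect prune can be seen in miniature below: a naive checker enumerates every program-order-respecting schedule of two threads performing a non-atomic read-increment-write on a shared counter and finds that a lost update is reachable. This toy explores all schedules exhaustively; dynamic partial order reduction would visit only a representative subset.

        from itertools import permutations

        def run(schedule):
            shared, regs = 0, {}
            for tid, op in schedule:
                if op == "read":
                    regs[tid] = shared           # thread-local register
                else:
                    shared = regs[tid] + 1       # non-atomic increment's write half
            return shared

        ops = [(0, "read"), (0, "write"), (1, "read"), (1, "write")]
        results = set()
        for perm in set(permutations(ops)):
            # keep only schedules that preserve each thread's program order
            if all(perm.index((t, "read")) < perm.index((t, "write")) for t in (0, 1)):
                results.add(run(perm))
        print("reachable final counter values:", sorted(results))   # [1, 2] -> a race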

  20. Space Shuttle Day-of-Launch Trajectory Design and Verification

    NASA Technical Reports Server (NTRS)

    Harrington, Brian E.

    2010-01-01

    A top priority of any launch vehicle is to insert as much mass into the desired orbit as possible. This requirement must be traded against vehicle capability in terms of dynamic control, thermal constraints, and structural margins. The vehicle is certified to a specific structural envelope which will yield certain performance characteristics of mass to orbit. Some envelopes cannot be certified generically and must be checked with each mission design. The most sensitive envelopes require an assessment on the day-of-launch. To further minimize vehicle loads while maximizing vehicle performance, a day-of-launch trajectory can be designed. This design is optimized according to that day's wind and atmospheric conditions, which will increase the probability of launch. The day-of-launch trajectory verification is critical to the vehicle's safety. The Day-Of-Launch I-Load Uplink (DOLILU) is the process by which the Space Shuttle Program redesigns the vehicle steering commands to fit that day's environmental conditions and then rigorously verifies the integrated vehicle trajectory's loads, controls, and performance. The Shuttle methodology is very similar to other United States unmanned launch vehicles. By extension, this method would be similar to the methods employed for any future NASA launch vehicles. This presentation will provide an overview of the Shuttle's day-of-launch trajectory optimization and verification as an example of a more generic application of day-of-launch design and validation.

  1. BIGHORN Computational Fluid Dynamics Theory, Methodology, and Code Verification & Validation Benchmark Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xia, Yidong; Andrs, David; Martineau, Richard Charles

    This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method used for solving compressible fluid flow problems is presented next. A Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration is also presented. The multi-fluid formulation is still being developed; although it is not yet complete, BIGHORN has been designed to handle multi-fluid problems. Due to the flexibility of the underlying MOOSE framework, BIGHORN is quite extensible and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification & validation benchmark test problems for BIGHORN. The intent of this suite is to provide baseline comparison data that demonstrates the performance of the BIGHORN solution methods on problems that vary in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods and to suggest best practices when using BIGHORN.

  2. The Complexity of Quantitative Concurrent Parity Games

    DTIC Science & Technology

    2004-11-01

    for each player. In this paper we study only zero-sum games [20, 11], where the objectives of the two players are strictly competitive. In other words... Aided Verification, volume 1102 of LNCS, pages 75–86. Springer, 1996. [14] R.J. Lipton, E. Markakis, and A. Mehta. Playing large games using simple... strategies. In EC 03: Electronic Commerce, pages 36–41. ACM Press, 2003. [15] D.A. Martin. The determinacy of Blackwell games. The Journal of Symbolic

  3. United States National Strategy and Defense Policy Objectives After Chemical Disarmament

    DTIC Science & Technology

    1989-03-19

    toxins. Because the 1972 BWT Convention does not adequately define toxins, the Soviets have a good case for considering artificial toxins as chemical... commits all parties to negotiate "in good faith" toward "the recognized objective of effective prohibition of chemical weapons." The discussions on... argue: "that the effectiveness of verification measures is enhanced by a high level of chemical defense. Good defense greatly raises the scale of

  4. Multibody dynamical modeling for spacecraft docking process with spring-damper buffering device: A new validation approach

    NASA Astrophysics Data System (ADS)

    Daneshjou, Kamran; Alibakhshi, Reza

    2018-01-01

    In the current manuscript, the process of spacecraft docking, one of the main risky operations in an on-orbit servicing mission, is modeled based on unconstrained multibody dynamics. A spring-damper buffering device is utilized here in the docking probe-cone system for micro-satellites. Because impact inevitably occurs during the docking process and markedly affects the motion characteristics of multibody systems, a continuous contact force model must be considered. The spring-damper buffering device, which keeps the spacecraft stable in orbit when impact occurs, connects a base (cylinder) inserted in the chaser satellite to the end of the docking probe. Furthermore, by considering a revolute joint equipped with a torsional shock absorber between the base and the chaser satellite, the docking probe can experience both translational and rotational motions simultaneously. Although the spacecraft docking process with buffering mechanisms may be modeled by constrained multibody dynamics, this paper presents a simple and efficient formulation that eliminates the surplus generalized coordinates and solves the impact docking problem based on unconstrained Lagrangian mechanics. In an example problem, model verification is first accomplished by comparing the computed results with those recently reported in the literature. Second, according to a new alternative validation approach based on the constrained multibody problem, the accuracy of the presented model is also evaluated; this verification approach can be applied to solve constrained multibody problems indirectly with minimum effort. The time history of the impact force, the influence of system flexibility, and the physical interaction between the shock absorber and the impact-induced penetration depth are the issues examined in this paper. Third, MATLAB/SIMULINK multibody dynamic analysis software is applied to build an impact docking model that validates the computed results, and the trajectories of both satellites are then investigated to assess the success of the capture process.
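
    The abstract does not state which continuous contact force law is used; a Hunt-Crossley-type spring-damper force is one common choice for docking impact models and is sketched below with illustrative parameters, as an assumption rather than the authors' exact model.

    ```python
    # A common continuous contact force law (Hunt-Crossley form) of the
    # kind used for impact in multibody docking models. The exponent and
    # the damping construction are one standard choice, and k, n, e, v0
    # are illustrative values, not the paper's.

    def contact_force(delta, delta_dot, k=2.0e6, n=1.5, e=0.8, v0=0.5):
        """Normal contact force from penetration depth delta (m) and
        penetration rate delta_dot (m/s); returns 0 when separated."""
        if delta <= 0.0:
            return 0.0
        # Hysteresis damping factor chi = 3k(1 - e^2)/(4 v0), with v0 the
        # initial impact velocity (a frequently used approximation).
        chi = 3.0 * k * (1.0 - e**2) / (4.0 * v0)
        return k * delta**n + chi * delta**n * delta_dot

    print(f"{contact_force(1e-4, 0.3):.2f} N at 0.1 mm penetration")
    ```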

  5. Transferring control demands across incidental learning tasks – stronger sequence usage in serial reaction task after shortcut option in letter string checking

    PubMed Central

    Gaschler, Robert; Marewski, Julian N.; Wenke, Dorit; Frensch, Peter A.

    2014-01-01

    After incidentally learning about a hidden regularity, participants can either continue to solve the task as instructed or, alternatively, apply a shortcut. Past research suggests that the amount of conflict implied by adopting a shortcut seems to bias the decision for vs. against continuing instruction-coherent task processing. We explored whether this decision might transfer from one incidental learning task to the next. Theories that conceptualize strategy change in incidental learning as a learning-plus-decision phenomenon suggest that high demands to adhere to instruction-coherent task processing in Task 1 will impede shortcut usage in Task 2, whereas low control demands will foster it. We sequentially applied two established incidental learning tasks differing in stimuli, responses and hidden regularity (the alphabet verification task followed by the serial reaction task, SRT). While some participants experienced a complete redundancy in the task material of the alphabet verification task (low demands to adhere to instructions), for others the redundancy was only partial. Thus, shortcut application would have led to errors (high demands to follow instructions). The low control demand condition showed the strongest usage of the fixed and repeating sequence of responses in the SRT. The transfer results are in line with the learning-plus-decision view of strategy change in incidental learning, rather than with resource theories of self-control. PMID:25506336

  6. Rapid Verification of Candidate Serological Biomarkers Using Gel-based, Label-free Multiple Reaction Monitoring

    PubMed Central

    Tang, Hsin-Yao; Beer, Lynn A.; Barnhart, Kurt T.; Speicher, David W.

    2011-01-01

    Stable isotope dilution-multiple reaction monitoring-mass spectrometry (SID-MRM-MS) has emerged as a promising platform for verification of serological candidate biomarkers. However, cost and time needed to synthesize and evaluate stable isotope peptides, optimize spike-in assays, and generate standard curves, quickly becomes unattractive when testing many candidate biomarkers. In this study, we demonstrate that label-free multiplexed MRM-MS coupled with major protein depletion and 1-D gel separation is a time-efficient, cost-effective initial biomarker verification strategy requiring less than 100 μl serum. Furthermore, SDS gel fractionation can resolve different molecular weight forms of targeted proteins with potential diagnostic value. Because fractionation is at the protein level, consistency of peptide quantitation profiles across fractions permits rapid detection of quantitation problems for specific peptides from a given protein. Despite the lack of internal standards, the entire workflow can be highly reproducible, and long-term reproducibility of relative protein abundance can be obtained using different mass spectrometers and LC methods with external reference standards. Quantitation down to ~200 pg/mL could be achieved using this workflow. Hence, the label-free GeLC-MRM workflow enables rapid, sensitive, and economical initial screening of large numbers of candidate biomarkers prior to setting up SID-MRM assays or immunoassays for the most promising candidate biomarkers. PMID:21726088

  7. Rapid verification of candidate serological biomarkers using gel-based, label-free multiple reaction monitoring.

    PubMed

    Tang, Hsin-Yao; Beer, Lynn A; Barnhart, Kurt T; Speicher, David W

    2011-09-02

    Stable isotope dilution-multiple reaction monitoring-mass spectrometry (SID-MRM-MS) has emerged as a promising platform for verification of serological candidate biomarkers. However, cost and time needed to synthesize and evaluate stable isotope peptides, optimize spike-in assays, and generate standard curves quickly becomes unattractive when testing many candidate biomarkers. In this study, we demonstrate that label-free multiplexed MRM-MS coupled with major protein depletion and 1D gel separation is a time-efficient, cost-effective initial biomarker verification strategy requiring less than 100 μL of serum. Furthermore, SDS gel fractionation can resolve different molecular weight forms of targeted proteins with potential diagnostic value. Because fractionation is at the protein level, consistency of peptide quantitation profiles across fractions permits rapid detection of quantitation problems for specific peptides from a given protein. Despite the lack of internal standards, the entire workflow can be highly reproducible, and long-term reproducibility of relative protein abundance can be obtained using different mass spectrometers and LC methods with external reference standards. Quantitation down to ~200 pg/mL could be achieved using this workflow. Hence, the label-free GeLC-MRM workflow enables rapid, sensitive, and economical initial screening of large numbers of candidate biomarkers prior to setting up SID-MRM assays or immunoassays for the most promising candidate biomarkers.

  8. Verification assessment of piston boundary conditions for Lagrangian simulation of compressible flow similarity solutions

    DOE PAGES

    Ramsey, Scott D.; Ivancic, Philip R.; Lilieholm, Jennifer F.

    2015-12-10

    This work is concerned with the use of similarity solutions of the compressible flow equations as benchmarks or verification test problems for finite-volume compressible flow simulation software. In practice, this effort can be complicated by the infinite spatial/temporal extent of many candidate solutions or “test problems.” Methods can be devised with the intention of ameliorating this inconsistency with the finite nature of computational simulation; the exact strategy will depend on the code and problem archetypes under investigation. For example, self-similar shock wave propagation can be represented in Lagrangian compressible flow simulations as rigid boundary-driven flow, even if no such “piston” is present in the counterpart mathematical similarity solution. The purpose of this work is to investigate in detail the methodology of representing self-similar shock wave propagation as a piston-driven flow in the context of various test problems featuring simple closed-form solutions of infinite spatial/temporal extent. The closed-form solutions allow for the derivation of similarly closed-form piston boundary conditions (BCs) for use in Lagrangian compressible flow solvers. Finally, the consequences of utilizing these BCs (as opposed to directly initializing the self-similar solution in a computational spatial grid) are investigated in terms of common code verification analysis metrics (e.g., shock strength/position errors and global convergence rates).

  9. Verification assessment of piston boundary conditions for Lagrangian simulation of compressible flow similarity solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsey, Scott D.; Ivancic, Philip R.; Lilieholm, Jennifer F.

    This work is concerned with the use of similarity solutions of the compressible flow equations as benchmarks or verification test problems for finite-volume compressible flow simulation software. In practice, this effort can be complicated by the infinite spatial/temporal extent of many candidate solutions or “test problems.” Methods can be devised with the intention of ameliorating this inconsistency with the finite nature of computational simulation; the exact strategy will depend on the code and problem archetypes under investigation. For example, self-similar shock wave propagation can be represented in Lagrangian compressible flow simulations as rigid boundary-driven flow, even if no such “piston” is present in the counterpart mathematical similarity solution. The purpose of this work is to investigate in detail the methodology of representing self-similar shock wave propagation as a piston-driven flow in the context of various test problems featuring simple closed-form solutions of infinite spatial/temporal extent. The closed-form solutions allow for the derivation of similarly closed-form piston boundary conditions (BCs) for use in Lagrangian compressible flow solvers. Finally, the consequences of utilizing these BCs (as opposed to directly initializing the self-similar solution in a computational spatial grid) are investigated in terms of common code verification analysis metrics (e.g., shock strength/position errors and global convergence rates).

  10. Nonlinear Earthquake Analysis of Reinforced Concrete Frames with Fiber and Bernoulli-Euler Beam-Column Element

    PubMed Central

    Karaton, Muhammet

    2014-01-01

    A beam-column element based on the Euler-Bernoulli beam theory is investigated for nonlinear dynamic analysis of reinforced concrete (RC) structural elements. The stiffness matrix of this element is obtained using the rigidity method. A solution technique that includes a nonlinear dynamic substructure procedure is developed for dynamic analyses of RC frames. A predictor-corrector form of the Bossak-α method is applied for the dynamic integration scheme. To verify the numerical solutions, experimental data for an RC column element are compared with numerical results obtained from the proposed solution technique. Furthermore, nonlinear cyclic analysis results for a portal reinforced concrete frame are obtained to compare the proposed solution technique with a fiber element based on the flexibility method. Finally, seismic damage analyses of an 8-story RC frame structure with a soft story are investigated for cases of lumped/distributed mass and load. Damage regions, propagation, and intensities according to both approaches are examined. PMID:24578667
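
    For reference, the Bossak-α scheme in its commonly cited form is shown below; the parameter choices are the standard unconditionally stable ones, not necessarily those used in the paper.

    ```latex
    % Bossak-alpha time integration in its commonly cited form; the
    % parameter choices shown are the standard ones, not necessarily
    % those used in the paper.
    \begin{align}
      (1-\alpha_B)\,M\,\ddot{u}_{n+1} + \alpha_B\,M\,\ddot{u}_{n}
        + C\,\dot{u}_{n+1} + K\,u_{n+1} &= F_{n+1},\\
      u_{n+1} &= u_n + \Delta t\,\dot{u}_n
        + \Delta t^2\big[(\tfrac{1}{2}-\beta)\,\ddot{u}_n + \beta\,\ddot{u}_{n+1}\big],\\
      \dot{u}_{n+1} &= \dot{u}_n
        + \Delta t\big[(1-\gamma)\,\ddot{u}_n + \gamma\,\ddot{u}_{n+1}\big],
    \end{align}
    % with gamma = 1/2 - alpha_B and beta = (1 - alpha_B)^2 / 4 giving
    % second-order accuracy and unconditional stability for alpha_B <= 0.
    % The predictor advances u and u-dot with the (n+1) acceleration terms
    % omitted; the corrector restores them after solving for the new
    % acceleration.
    ```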

  11. Dynamic Analysis and Control of Lightweight Manipulators with Flexible Parallel Link Mechanisms. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Lee, Jeh Won

    1990-01-01

    The objective is the theoretical analysis and experimental verification of the dynamics and control of a two-link flexible manipulator with a flexible parallel link mechanism. Nonlinear equations of motion of the lightweight manipulator are derived by the Lagrangian method in symbolic form to better understand the structure of the dynamic model. The resulting equations of motion have a structure which is useful to reduce the number of terms calculated, to check correctness, or to extend the model to higher order. A manipulator with a flexible parallel link mechanism is a constrained dynamic system whose equations are sensitive to numerical integration error. This constrained system is solved using singular value decomposition of the constraint Jacobian matrix. Elastic motion is expressed by the assumed mode method. Mode shape functions of each link are chosen using load-interfaced component mode synthesis. The discrepancies between the analytical model and the experiment are explained using a simplified and a detailed finite element model.
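
    A minimal numerical sketch of the SVD-based treatment of the constraint Jacobian mentioned above is given below, using toy matrices; the thesis's exact formulation may differ.

    ```python
    # Sketch of constrained dynamics via SVD of the constraint Jacobian:
    # M qdd = f + J^T lambda with J qdd = b. The null-space basis from
    # the SVD eliminates the Lagrange multipliers. Matrices are toy values.
    import numpy as np

    def constrained_accel(M, f, J, b):
        # Particular solution satisfying J qdd_p = b (least squares).
        qdd_p = np.linalg.pinv(J) @ b
        # Null-space basis N of J: rows of Vt beyond the numerical rank.
        U, s, Vt = np.linalg.svd(J)
        rank = int(np.sum(s > 1e-10))
        N = Vt[rank:].T                       # satisfies J @ N == 0
        # Premultiplying M qdd = f + J^T lambda by N^T kills the
        # multipliers: solve N^T M N z = N^T (f - M qdd_p) for z.
        z = np.linalg.solve(N.T @ M @ N, N.T @ (f - M @ qdd_p))
        return qdd_p + N @ z

    M = np.diag([2.0, 1.0, 1.0])              # toy mass matrix
    f = np.array([0.0, -9.81, 1.0])           # applied generalized forces
    J = np.array([[1.0, 1.0, 0.0]])           # one constraint row, J qdd = b
    b = np.array([0.0])
    print(constrained_accel(M, f, J, b))
    ```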

  12. Dynamic fracture toughness of ASME SA508 Class 2a ASME SA533 grade A Class 2 base and heat affected zone material and applicable weld metals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Logsdon, W.A.; Begley, J.A.; Gottshall, C.L.

    1978-03-01

    The ASME Boiler and Pressure Vessel Code, Section III, Article G-2000, requires that dynamic fracture toughness data be developed for materials with specified minimum yield strengths greater than 50 ksi to provide verification and utilization of the ASME specified minimum reference toughness K_IR curve. In order to qualify ASME SA508 Class 2a and ASME SA533 Grade A Class 2 pressure vessel steels (minimum yield strengths of 65 kip/in² and 70 kip/in², respectively) per this requirement, dynamic fracture toughness tests were performed on these materials. All dynamic fracture toughness values of SA508 Class 2a base and HAZ material, SA533 Grade A Class 2 base and HAZ material, and applicable weld metals exceeded the ASME specified minimum reference toughness K_IR curve.

  13. Verification of a 2 kWe Closed-Brayton-Cycle Power Conversion System Mechanical Dynamics Model

    NASA Technical Reports Server (NTRS)

    Ludwiczak, Damian R.; Le, Dzu K.; McNelis, Anne M.; Yu, Albert C.; Samorezov, Sergey; Hervol, Dave S.

    2005-01-01

    Vibration test data from an operating 2 kWe closed-Brayton-cycle (CBC) power conversion system (PCS) located at the NASA Glenn Research Center was used for a comparison with a dynamic disturbance model of the same unit. This effort was performed to show that a dynamic disturbance model of a CBC PCS can be developed that can accurately predict the torque and vibration disturbance fields of such class of rotating machinery. The ability to accurately predict these disturbance fields is required before such hardware can be confidently integrated onto a spacecraft mission. Accurate predictions of CBC disturbance fields will be used for spacecraft control/structure interaction analyses and for understanding the vibration disturbances affecting the scientific instrumentation onboard. This paper discusses how test cell data measurements for the 2 kWe CBC PCS were obtained, the development of a dynamic disturbance model used to predict the transient torque and steady state vibration fields of the same unit, and a comparison of the two sets of data.

  14. Experimental Results from the Thermal Energy Storage-1 (TES-1) Flight Experiment

    NASA Technical Reports Server (NTRS)

    Wald, Lawrence W.; Tolbert, Carol; Jacqmin, David

    1995-01-01

    The Thermal Energy Storage-1 (TES-1) is a flight experiment that flew on the Space Shuttle Columbia (STS-62), in March 1994, as part of the OAST-2 mission. TES-1 is the first experiment in a four-experiment suite designed to provide data for understanding the long-duration microgravity behavior of thermal energy storage fluoride salts that undergo repeated melting and freezing. Such data have never been obtained before and have direct application to the development of space-based solar dynamic (SD) power systems. These power systems will store solar energy in a thermal energy storage salt such as lithium fluoride or calcium fluoride. The stored energy is extracted during the shade portion of the orbit. This enables the solar dynamic power system to provide constant electrical power over the entire orbit. Analytical computer codes have been developed for predicting the performance of a space-based solar dynamic power system. Experimental verification of the analytical predictions is needed prior to using the analytical results for future space power design applications. The four TES flight experiments will be used to obtain the needed experimental data. This paper focuses on the flight results from the first experiment, TES-1, in comparison to the predicted results from the Thermal Energy Storage Simulation (TESSIM) analytical computer code. The TES-1 conceptual development, hardware design, final development, and system verification testing were accomplished at the NASA Lewis Research Center (LeRC). TES-1 was developed under the In-Space Technology Experiment Program (IN-STEP), which sponsors NASA, industry, and university flight experiments designed to enable and enhance space flight technology. The IN-STEP Program is sponsored by the Office of Space Access and Technology (OSAT).

  15. Development of emergent processing loops as a system of systems concept

    NASA Astrophysics Data System (ADS)

    Gainey, James C., Jr.; Blasch, Erik P.

    1999-03-01

    This paper describes an engineering approach toward implementing the current neuroscientific understanding of how the primate brain fuses, or integrates, 'information' in the decision-making process. We describe a System of Systems (SoS) design for improving the overall performance, capabilities, operational robustness, and user confidence in identification (ID) systems and show how it could be applied to biometrics security. We use the Physio-associative temporal sensor integration algorithm (PATSIA), which is motivated by observed functions and interactions of the thalamus, hippocampus, and cortical structures in the brain. PATSIA utilizes signal theory mathematics to model how the human efficiently perceives and uses information from the environment. The hybrid architecture implements a possible SoS-level description of the Joint Directors of US Laboratories Fusion Working Group's functional description involving five levels of fusion and their associated definitions. This SoS architecture proposes dynamic sensor and knowledge-source integration by implementing multiple emergent processing loops for predicting, feature extracting, matching, and searching both static and dynamic databases, like MSTAR's PEMS loops. Biologically, this effort demonstrates these objectives by modeling similar processes from the eyes, ears, and somatosensory channels, through the thalamus, and to the cortices as appropriate, while using the hippocampus for short-term memory search and storage as necessary. The particular approach demonstrated incorporates commercially available speaker verification and face recognition software and hardware to collect data and extract features for the PATSIA. The PATSIA maximizes the confidence levels for target identification or verification in dynamic situations using a belief filter. The proof of concept described here is easily adaptable and scalable to other military and nonmilitary sensor fusion applications.

  16. Multi-site assessment of the precision and reproducibility of multiple reaction monitoring–based measurements of proteins in plasma

    PubMed Central

    Addona, Terri A; Abbatiello, Susan E; Schilling, Birgit; Skates, Steven J; Mani, D R; Bunk, David M; Spiegelman, Clifford H; Zimmerman, Lisa J; Ham, Amy-Joan L; Keshishian, Hasmik; Hall, Steven C; Allen, Simon; Blackman, Ronald K; Borchers, Christoph H; Buck, Charles; Cardasis, Helene L; Cusack, Michael P; Dodder, Nathan G; Gibson, Bradford W; Held, Jason M; Hiltke, Tara; Jackson, Angela; Johansen, Eric B; Kinsinger, Christopher R; Li, Jing; Mesri, Mehdi; Neubert, Thomas A; Niles, Richard K; Pulsipher, Trenton C; Ransohoff, David; Rodriguez, Henry; Rudnick, Paul A; Smith, Derek; Tabb, David L; Tegeler, Tony J; Variyath, Asokan M; Vega-Montoto, Lorenzo J; Wahlander, Åsa; Waldemarson, Sofia; Wang, Mu; Whiteaker, Jeffrey R; Zhao, Lei; Anderson, N Leigh; Fisher, Susan J; Liebler, Daniel C; Paulovich, Amanda G; Regnier, Fred E; Tempst, Paul; Carr, Steven A

    2010-01-01

    Verification of candidate biomarkers relies upon specific, quantitative assays optimized for selective detection of target proteins, and is increasingly viewed as a critical step in the discovery pipeline that bridges unbiased biomarker discovery to preclinical validation. Although individual laboratories have demonstrated that multiple reaction monitoring (MRM) coupled with isotope dilution mass spectrometry can quantify candidate protein biomarkers in plasma, reproducibility and transferability of these assays between laboratories have not been demonstrated. We describe a multilaboratory study to assess reproducibility, recovery, linear dynamic range and limits of detection and quantification of multiplexed, MRM-based assays, conducted by NCI-CPTAC. Using common materials and standardized protocols, we demonstrate that these assays can be highly reproducible within and across laboratories and instrument platforms, and are sensitive to low µg/ml protein concentrations in unfractionated plasma. We provide data and benchmarks against which individual laboratories can compare their performance and evaluate new technologies for biomarker verification in plasma. PMID:19561596

  17. Performance verification and system parameter identification of spacecraft tape recorder control servo

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, A. K.

    1979-01-01

    Design adequacy of the lead-lag compensator of the frequency loop, accuracy checking of the analytical expression for the electrical motor transfer function, and performance evaluation of the speed control servo of the digital tape recorder used on-board the 1976 Viking Mars Orbiters and Voyager 1977 Jupiter-Saturn flyby spacecraft are analyzed. The transfer functions of the most important parts of a simplified frequency loop used for test simulation are described and ten simulation cases are reported. The first four of these cases illustrate the method of selecting the most suitable transfer function for the hysteresis synchronous motor, while the rest verify and determine the servo performance parameters and alternative servo compensation schemes. It is concluded that the linear methods provide a starting point for the final verification/refinement of servo design by nonlinear time response simulation and that the variation of the parameters of the static/dynamic Coulomb friction is as expected in a long-life space mission environment.

  18. Seismic behavior of a low-rise horizontal cylindrical tank

    NASA Astrophysics Data System (ADS)

    Fiore, Alessandra; Rago, Carlo; Vanzi, Ivo; Greco, Rita; Briseghella, Bruno

    2018-05-01

    Cylindrical storage tanks are widely used for various types of liquids, including hazardous contents, thus requiring suitable and careful design for seismic actions. The study herein presented deals with the dynamic analysis of a ground-based horizontal cylindrical tank containing butane and with its safety verification. The analyses are based on a detailed finite element (FE) model; a simplified one-degree-of-freedom idealization is also set up and used for verification of the FE results. Particular attention is paid to sloshing and asynchronous seismic input effects. Sloshing effects are investigated according to the current literature state of the art. An efficient methodology based on an "impulsive-convective" decomposition of the container-fluid motion is adopted for the calculation of the seismic force. The effects of asynchronous ground motion are studied by suitable pseudo-static analyses. Comparison between seismic action effects, obtained with and without consideration of sloshing and asynchronous seismic input, shows a rather important influence of these conditions on the final results.
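
    The "impulsive-convective" decomposition referred to above is, in its generic Housner-type form, a two-mass idealization of the liquid. The schematic relations below are standard in tank seismic design and are not the paper's specific coefficients.

    ```latex
    % Generic impulsive-convective decomposition (Housner-type idealization);
    % the split shown is schematic, not the paper's specific coefficients.
    \begin{align}
      m_\ell &= m_i + m_c, \\
      V_{\text{base}} &= m_i\,S_a(T_i,\xi_i) + m_c\,S_a(T_c,\xi_c),
    \end{align}
    % m_i: impulsive liquid mass moving rigidly with the shell;
    % m_c: convective (sloshing) mass attached through an equivalent spring;
    % S_a(T, xi): spectral acceleration at period T and damping ratio xi
    % (convective damping is typically much lower, e.g. xi_c ~ 0.5%).
    ```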

  19. Specification and Verification of Web Applications in Rewriting Logic

    NASA Astrophysics Data System (ADS)

    Alpuente, María; Ballis, Demis; Romero, Daniel

    This paper presents a Rewriting Logic framework that formalizes the interactions between Web servers and Web browsers through a communicating protocol abstracting HTTP. The proposed framework includes a scripting language that is powerful enough to model the dynamics of complex Web applications by encompassing the main features of the most popular Web scripting languages (e.g. PHP, ASP, Java Servlets). We also provide a detailed characterization of browser actions (e.g. forward/backward navigation, page refresh, and new window/tab openings) via rewrite rules, and show how our models can be naturally model-checked by using the Linear Temporal Logic of Rewriting (LTLR), which is a Linear Temporal Logic specifically designed for model-checking rewrite theories. Our formalization is particularly suitable for verification purposes, since it allows one to perform in-depth analyses of many subtle aspects related to Web interaction. Finally, the framework has been completely implemented in Maude, and we report on some successful experiments that we conducted by using the Maude LTLR model-checker.

  20. Performance assessment of FY-3C/MERSI on early orbit

    NASA Astrophysics Data System (ADS)

    Hu, Xiuqing; Xu, Na; Wu, Ronghua; Chen, Lin; Min, Min; Wang, Ling; Xu, Hanlie; Sun, Ling; Yang, Zhongdong; Zhang, Peng

    2014-11-01

    FY-3C/MERSI features some remarkable improvements over the previous MERSI instruments, including better spectral response function (SRF) consistency among the detectors within one band, an increased capability for lunar observation through the space view (SV), and improved radiometric response stability of the solar bands. During the in-orbit verification (IOV) commissioning phase, early results indicating representative MERSI performance were derived, including the signal-to-noise ratio (SNR), dynamic range, MTF, band-to-band (B2B) registration, calibration bias, and instrument stability. The SNRs of the solar bands (Bands 1-4 and 6-20) were largely beyond the specifications, except for two NIR bands. The in-flight calibration and verification of these bands also rely heavily on vicarious techniques such as the China Radiometric Calibration Sites (CRCS), cross-calibration, lunar calibration, DCC calibration, stability monitoring using Pseudo-Invariant Calibration Sites (PICS), and multi-site radiance simulation. This paper presents the results of these calibration methods and of monitoring the instrument degradation during the early on-orbit period.

  1. Evaluation of an expert system for fault detection, isolation, and recovery in the manned maneuvering unit

    NASA Technical Reports Server (NTRS)

    Rushby, John; Crow, Judith

    1990-01-01

    The authors explore issues in the specification, verification, and validation of artificial intelligence (AI) based software, using a prototype fault detection, isolation and recovery (FDIR) system for the Manned Maneuvering Unit (MMU). They use this system as a vehicle for exploring issues in the semantics of C-Language Integrated Production System (CLIPS)-style rule-based languages, the verification of properties relating to safety and reliability, and the static and dynamic analysis of knowledge-based systems. This analysis reveals errors and shortcomings in the MMU FDIR system and raises a number of issues concerning software engineering in CLIPS. The authors came to realize that the MMU FDIR system does not conform to conventional definitions of AI software, despite the fact that it was intended and indeed presented as an AI system. The authors discuss this apparent disparity and related questions such as the role of AI techniques in space and aircraft operations and the suitability of CLIPS for critical applications.

  2. Design Considerations and Experimental Verification of a Rail Brake Armature Based on Linear Induction Motor Technology

    NASA Astrophysics Data System (ADS)

    Sakamoto, Yasuaki; Kashiwagi, Takayuki; Hasegawa, Hitoshi; Sasakawa, Takashi; Fujii, Nobuo

    This paper describes the design considerations and experimental verification of an LIM rail brake armature. In order to generate power and maximize the braking force density despite the limited area between the armature and the rail and the limited space available for installation, we studied a design method that is suitable for designing an LIM rail brake armature; we considered adoption of a ring winding structure. To examine the validity of the proposed design method, we developed a prototype ring winding armature for the rail brakes and examined its electromagnetic characteristics in a dynamic test system with roller rigs. By repeating various tests, we confirmed that unnecessary magnetic field components, which were expected to be present under high speed running condition or when a ring winding armature was used, were not present. Further, the necessary magnetic field component and braking force attained the desired values. These studies have helped us to develop a basic design method that is suitable for designing the LIM rail brake armatures.

  3. Experimental Validation of L1 Adaptive Control: Rohrs' Counterexample in Flight

    NASA Technical Reports Server (NTRS)

    Xargay, Enric; Hovakimyan, Naira; Dobrokhodov, Vladimir; Kaminer, Issac; Kitsios, Ioannis; Cao, Chengyu; Gregory, Irene M.; Valavani, Lena

    2010-01-01

    The paper presents new results on the verification and in-flight validation of an L1 adaptive flight control system, and proposes a general methodology for verification and validation of adaptive flight control algorithms. The proposed framework is based on Rohrs' counterexample, a benchmark problem presented in the early 80s to show the limitations of adaptive controllers developed at that time. In this paper, the framework is used to evaluate the performance and robustness characteristics of an L1 adaptive control augmentation loop implemented onboard a small unmanned aerial vehicle. Hardware-in-the-loop simulations and flight test results confirm the ability of the L1 adaptive controller to maintain stability and predictable performance of the closed loop adaptive system in the presence of general (artificially injected) unmodeled dynamics. The results demonstrate the advantages of L1 adaptive control as a verifiable robust adaptive control architecture with the potential of reducing flight control design costs and facilitating the transition of adaptive control into advanced flight control systems.

  4. TESTOSTERONE AND SPORT: CURRENT PERSPECTIVES

    PubMed Central

    Wood, Ruth I.; Stanton, Steven J.

    2011-01-01

    Testosterone and other anabolic-androgenic steroids enhance athletic performance in men and women. As a result, exogenous androgen is banned from most competitive sports. However, due to variability in endogenous secretion, and similarities with exogenous testosterone, it has been challenging to establish allowable limits for testosterone in competition. Endogenous androgen production is dynamically regulated by both exercise and winning in competition. Furthermore, testosterone may promote athletic performance, not only through its long-term anabolic actions, but also through rapid effects on behavior. In women, excess production of endogenous testosterone due to inborn disorders of sexual development (DSD) may convey a competitive advantage. For many years, female competitors have been subject to tests of sexual genotype and phenotype known as gender verification. Although gender verification has not identified any normal man competing as a woman, this process has identified women athletes with DSD. As understanding of DSD has expanded in recent years, women with DSD are increasingly able to continue athletic competition. PMID:21983229

  5. Transfer function verification and block diagram simplification of a very high-order distributed pole closed-loop servo by means of non-linear time-response simulation

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, A. K.

    1975-01-01

    Linear frequency domain methods are inadequate in analyzing the 1975 Viking Orbiter (VO75) digital tape recorder servo due to dominant nonlinear effects such as servo signal limiting, unidirectional servo control, and static/dynamic Coulomb friction. The frequency loop (speed control) servo of the VO75 tape recorder is used to illustrate the analytical tools and methodology of system redundancy elimination and high order transfer function verification. The paper compares time-domain performance parameters derived from a series of nonlinear time responses with the available experimental data in order to select the best possible analytical transfer function representation of the tape transport (mechanical segment of the tape recorder) from several possible candidates. The study also shows how an analytical time-response simulation taking into account most system nonlinearities can pinpoint system redundancy and overdesign stemming from a strictly empirical design approach. System order reduction is achieved through truncation of individual transfer functions and elimination of redundant blocks.

  6. Using Dynamic Interface Modeling and Simulation to Develop a Launch and Recovery Flight Simulation for a UH-60A Blackhawk

    NASA Technical Reports Server (NTRS)

    Sweeney, Christopher; Bunnell, John; Chung, William; Giovannetti, Dean; Mikula, Julie; Nicholson, Bob; Roscoe, Mike

    2001-01-01

    The Joint Shipboard Helicopter Integration Process (JSHIP) is a Joint Test and Evaluation (JT&E) program sponsored by the Office of the Secretary of Defense (OSD). Under the JSHIP program is a simulation effort referred to as the Dynamic Interface Modeling and Simulation System (DIMSS). The purpose of DIMSS is to develop and test the processes and mechanisms that facilitate ship-helicopter interface testing via man-in-the-loop ground-based flight simulators. Specifically, the DIMSS charter is to develop an accredited process for using a flight simulator to determine the wind-over-the-deck (WOD) launch and recovery flight envelope for the UH-60A ship/helicopter combination. DIMSS is a collaborative effort between the NASA Ames Research Center and OSD. OSD determines the T&E and warfighter training requirements, provides the programmatics and dynamic interface T&E experience, and conducts ship/aircraft interface tests for validating the simulation. NASA provides the research and development element, the simulation facility, and simulation technical experience. This paper highlights the benefits of the NASA/JSHIP collaboration and details the achievements of the project in terms of modeling and simulation. The Vertical Motion Simulator (VMS) at NASA Ames Research Center offers the capability to simulate a wide range of simulation cueing configurations, including visual, aural, and body-force cueing devices. The system's flexibility enables switching configurations to allow back-to-back evaluation and comparison of different levels of cueing fidelity in determining minimum training requirements. The investigation required development and integration of several major simulation systems at the VMS. A new UH-60A Blackhawk interchangeable cab that provides an out-the-window (OTW) field-of-view (FOV) of 220 degrees in azimuth and 70 degrees in elevation was built. Modeling efforts involved integrating Computational Fluid Dynamics (CFD) generated data of an LHA ship airwake and integrating a real-time ship motion model developed from a batch model from the Naval Surface Warfare Center. Engineering development and integration of a three degrees-of-freedom (DOF) dynamic seat, which simulates high-frequency rotor-dynamics-dependent motion cues for use in conjunction with the large motion system, was accomplished. An LHA visual model in several different levels of resolution was developed, along with an aural cueing system in which three separate fidelity levels can be selected. VMS also integrated a PC-based E&S simFUSION system to investigate cost-effective IG alternatives. The DIMSS project consists of three phases that follow an approved Verification, Validation, and Accreditation (VV&A) process. The first phase supports the accreditation of the individual subsystems and models. The second follows the verification and validation of the integrated subsystems and models, and addresses fidelity requirements of the integrated models and subsystems. The third and final phase allows the verification and validation of the full system integration. This VV&A process addresses the utility of the simulated WOD launch and recovery envelope. Simulations supporting the first two stages have been completed, and the data are currently being reviewed and analyzed.

  7. Efficient and Scalable Graph Similarity Joins in MapReduce

    PubMed Central

    Chen, Yifan; Zhang, Weiming; Tang, Jiuyang

    2014-01-01

    Along with the emergence of massive graph-modeled data, it is of great importance to investigate graph similarity joins due to their wide applications for multiple purposes, including data cleaning and near-duplicate detection. This paper considers graph similarity joins with edit distance constraints, which return pairs of graphs such that their edit distances are no larger than a given threshold. Leveraging the MapReduce programming model, we propose MGSJoin, a scalable algorithm following the filtering-verification framework for efficient graph similarity joins. It relies on counting overlapping graph signatures for filtering out nonpromising candidates. To address the potential issue of too many key-value pairs in the filtering phase, spectral Bloom filters are introduced to reduce the number of key-value pairs. Furthermore, we integrate the multiway join strategy to boost the verification, where a MapReduce-based method is proposed for GED calculation. The superior efficiency and scalability of the proposed algorithms are demonstrated by extensive experimental results. PMID:25121135

  8. Efficient and scalable graph similarity joins in MapReduce.

    PubMed

    Chen, Yifan; Zhao, Xiang; Xiao, Chuan; Zhang, Weiming; Tang, Jiuyang

    2014-01-01

    Along with the emergence of massive graph-modeled data, it is of great importance to investigate graph similarity joins due to their wide applications for multiple purposes, including data cleaning and near-duplicate detection. This paper considers graph similarity joins with edit distance constraints, which return pairs of graphs such that their edit distances are no larger than a given threshold. Leveraging the MapReduce programming model, we propose MGSJoin, a scalable algorithm following the filtering-verification framework for efficient graph similarity joins. It relies on counting overlapping graph signatures for filtering out nonpromising candidates. To address the potential issue of too many key-value pairs in the filtering phase, spectral Bloom filters are introduced to reduce the number of key-value pairs. Furthermore, we integrate the multiway join strategy to boost the verification, where a MapReduce-based method is proposed for GED calculation. The superior efficiency and scalability of the proposed algorithms are demonstrated by extensive experimental results.
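
    The filtering-verification pattern underlying MGSJoin can be illustrated compactly. The sketch below joins strings under edit distance with a q-gram count filter rather than graphs under GED, but the structure (a cheap signature-overlap filter followed by expensive verification) is the same; the threshold and data are illustrative.

    ```python
    # Filtering-verification join sketch: q-gram count filter, then exact
    # edit-distance verification. Strings stand in for graphs here.
    from collections import Counter
    from itertools import combinations

    def qgrams(s, q=2):
        """Bag (multiset) of overlapping q-grams of s."""
        return Counter(s[i:i + q] for i in range(len(s) - q + 1))

    def edit_distance(a, b):
        """Standard DP Levenshtein distance (the costly verification step)."""
        dp = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            prev, dp[0] = dp[0], i
            for j, cb in enumerate(b, 1):
                prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1,
                                         prev + (ca != cb))
        return dp[-1]

    def similarity_join(strings, tau, q=2):
        sigs = {s: qgrams(s, q) for s in strings}
        results = []
        for a, b in combinations(strings, 2):
            # Count filter: one edit destroys at most q q-grams, so a pair
            # within distance tau must share at least this many q-grams.
            required = max(len(a), len(b)) - q + 1 - q * tau
            overlap = sum((sigs[a] & sigs[b]).values())
            if overlap < required:
                continue                       # pruned without verification
            if edit_distance(a, b) <= tau:     # verification
                results.append((a, b))
        return results

    print(similarity_join(["graph", "graphs", "grape", "table"], tau=1))
    ```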

  9. Summary of the 2014 Sandia V&V Challenge Workshop

    DOE PAGES

    Schroeder, Benjamin B.; Hu, Kenneth T.; Mullins, Joshua Grady; ...

    2016-02-19

    A discussion of the five responses to the 2014 Sandia Verification and Validation (V&V) Challenge Problem, presented within this special issue, is provided hereafter. Overviews of the challenge problem workshop, workshop participants, and the problem statement are also included, along with brief summaries of the teams' responses to the challenge problem. Issues that arose throughout the responses that are deemed applicable to the general verification, validation, and uncertainty quantification (VVUQ) community are the main focal point of this paper. The discussion is organized around a big-picture comparison of data and model usage, VVUQ activities, and the differentiating conceptual themes behind the teams' VVUQ strategies. Significant differences are noted in the teams' approaches toward all VVUQ activities, and those deemed most relevant are discussed. Beyond the specific details of VVUQ implementations, thematic concepts are found to create differences among the approaches; some of the major themes are discussed. Lastly, an encapsulation of the key contributions, the lessons learned, and advice for the future is presented.

  10. SCA security verification on wireless sensor network node

    NASA Astrophysics Data System (ADS)

    He, Wei; Pizarro, Carlos; de la Torre, Eduardo; Portilla, Jorge; Riesgo, Teresa

    2011-05-01

    Side channel attack (SCA) differs from traditional mathematical attacks. It bypasses exhaustive mathematical calculation and instead targets specific points in the cryptographic algorithm to reveal confidential information from the running crypto-devices. Since the introduction of SCA by Paul Kocher et al. [1], it has been considered one of the most critical threats to resource-restricted but security-demanding applications, such as wireless sensor networks. In this paper, we focus our work on SCA-concerned security verification on wireless sensor networks (WSNs). A detailed setup of the platform and an analysis of the results of DPA (power attack) and EMA (electromagnetic attack) are presented. The setup follows a low-cost approach to making effective SCAs, while also surveying the weaknesses of WSNs in resisting SCA attacks, especially the EM attack. Finally, SCA-prevention suggestions based on a differential security strategy for the FPGA hardware implementation in WSNs are given, helping to reach an improved compromise between security and cost.
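
    As a concrete illustration of the class of power attack evaluated here, the toy sketch below runs a first-order difference-of-means DPA against simulated Hamming-weight leakage. The S-box (a random permutation), leakage model, and noise level are stand-ins, not the platform or cipher from the paper.

    ```python
    # Toy first-order DPA in the spirit of Kocher's attack: recover a key
    # byte from simulated traces leaking the Hamming weight of an S-box
    # output. Everything here is illustrative; real attacks use full traces.
    import numpy as np

    rng = np.random.default_rng(1)
    SBOX = rng.permutation(256)          # stand-in nonlinear S-box (not AES's)
    HW = np.array([bin(x).count("1") for x in range(256)])

    SECRET_KEY = 0x3C
    N = 2000
    plaintexts = rng.integers(0, 256, N)
    # Leakage model: one power sample = HW(Sbox(p XOR k)) + Gaussian noise.
    traces = HW[SBOX[plaintexts ^ SECRET_KEY]] + rng.normal(0.0, 1.0, N)

    def dpa_guess(plaintexts, traces):
        best, best_delta = None, -1.0
        for g in range(256):
            # Predict the LSB of the S-box output under key guess g and
            # split the traces into two sets by that predicted bit.
            bit = SBOX[plaintexts ^ g] & 1
            delta = abs(traces[bit == 1].mean() - traces[bit == 0].mean())
            if delta > best_delta:           # largest difference of means wins
                best, best_delta = g, delta
        return best

    print(f"recovered key byte: {dpa_guess(plaintexts, traces):#04x}")
    ```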

  11. Design and Verification of a Distributed Communication Protocol

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  12. Method and apparatus for routing data in an inter-nodal communications lattice of a massively parallel computer system by dynamically adjusting local routing strategies

    DOEpatents

    Archer, Charles Jens; Musselman, Roy Glenn; Peters, Amanda; Pinnow, Kurt Walter; Swartz, Brent Allen; Wallenfelt, Brian Paul

    2010-03-16

    A massively parallel computer system contains an inter-nodal communications network of node-to-node links. Each node implements a respective routing strategy for routing data through the network, the routing strategies not necessarily being the same in every node. The routing strategies implemented in the nodes are dynamically adjusted during application execution to shift network workload as required. Preferably, adjustment of routing policies in selective nodes is performed at synchronization points. The network may be dynamically monitored, and routing strategies adjusted according to detected network conditions.
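
    A hypothetical sketch of the core idea, with invented strategy names and a toy load metric: each node re-selects its local routing strategy at a synchronization point based on the traffic it has observed since the last one.

    ```python
    # Hypothetical sketch of per-node routing strategy adjustment at
    # synchronization points. Strategy names and the load metric are
    # illustrative, not taken from the patent.

    STRATEGIES = {
        # route-axis choice given (dx, dy) displacement to destination
        "x_first": lambda dx, dy: "X" if dx else "Y",
        "y_first": lambda dx, dy: "Y" if dy else "X",
    }

    class Node:
        def __init__(self):
            self.strategy = "x_first"
            self.link_load = {"X": 0, "Y": 0}    # packets sent per axis

        def route(self, dx, dy):
            axis = STRATEGIES[self.strategy](dx, dy)
            self.link_load[axis] += 1
            return axis

        def synchronize(self):
            # At a synchronization point, shift work off the hotter
            # outbound axis by flipping the local strategy.
            if self.link_load["X"] > 2 * self.link_load["Y"]:
                self.strategy = "y_first"
            elif self.link_load["Y"] > 2 * self.link_load["X"]:
                self.strategy = "x_first"
            self.link_load = {"X": 0, "Y": 0}

    n = Node()
    for _ in range(100):
        n.route(dx=3, dy=1)       # traffic skewed toward the X axis
    n.synchronize()
    print(n.strategy)             # -> "y_first": strategy adapted to load
    ```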

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    VANNONI, MICHAEL G.; BIRINGER, KENT L.; TROST, LAWRENCE C.

    Missiles are attractive weapon systems because of their flexibility, survivability, and relatively low cost. Consequently, many nations are seeking to build missile forces, resulting in regional arms races. Missile forces can be both stabilizing (e.g., providing a survivable force for deterrence) and destabilizing (e.g., creating strategic asymmetries). Efforts to control missile proliferation must account for these effects. A number of strategies to control the destabilizing effects of missiles were developed during the Cold War. Some of these strategies are applicable to regional missile control, but new approaches, tailored to regional geographic and security conditions, are needed. Regional missile nonproliferation can be pursued in a variety of ways: reducing the demand for missiles by decreasing the perception of national threats; restricting the export of missiles and associated equipment by supplier countries; restricting information describing missile technology; limiting missile development activities such as flight or engine tests; restricting the operational deployment of existing missile forces; and reducing existing missile forces by number and/or type. Even when development is complete, limits on deployment within range of potential targets or limits on operational readiness can help stabilize potential missile confrontations. Implementing these strategies often involves the collection and exchange of information about activities related to missile development or deployment. Monitoring is the process of collecting information used for subsequent verification of commitments. A systematic approach to implementing verification is presented that identifies areas where monitoring could support missile nonproliferation agreements. The paper presents both non-technical and technical techniques for monitoring. Examples of non-technical techniques are declarations about planned test launches or on-site inspections. Examples of technical monitoring include remote monitoring (i.e., a sensor that is physically present at a facility) and remote sensing (i.e., a sensor that records activity without being physically present at a facility).

  14. Identification of Multiple Novel Protein Biomarkers Shed by Human Serous Ovarian Tumors into the Blood of Immunocompromised Mice and Verified in Patient Sera

    PubMed Central

    Beer, Lynn A.; Wang, Huan; Tang, Hsin-Yao; Cao, Zhijun; Chang-Wong, Tony; Tanyi, Janos L.; Zhang, Rugang; Liu, Qin; Speicher, David W.

    2013-01-01

    The most cancer-specific biomarkers in blood are likely to be proteins shed directly by the tumor rather than less specific inflammatory or other host responses. The use of xenograft mouse models together with in-depth proteome analysis for identification of human proteins in the mouse blood is an under-utilized strategy that can clearly identify proteins shed by the tumor. In the current study, 268 human proteins shed into mouse blood from human OVCAR-3 serous tumors were identified based upon human vs. mouse species differences using a four-dimensional plasma proteome fractionation strategy. A multi-step prioritization and verification strategy was subsequently developed to efficiently select some of the most promising biomarkers from this large number of candidates. A key step was parallel analysis of human proteins detected in the tumor supernatant, because substantially greater sequence coverage for many of the human proteins initially detected in the xenograft mouse plasma confirmed assignments as tumor-derived human proteins. Verification of candidate biomarkers in patient sera was facilitated by in-depth, label-free quantitative comparisons of serum pools from patients with ovarian cancer and benign ovarian tumors. The only proteins that advanced to multiple reaction monitoring (MRM) assay development were those that exhibited increases in ovarian cancer patients compared with benign tumor controls. MRM assays were facilely developed for all 11 novel biomarker candidates selected by this process and analysis of larger pools of patient sera suggested that all 11 proteins are promising candidate biomarkers that should be further evaluated on individual patient blood samples. PMID:23544127

  15. Star tracking method based on multiexposure imaging for intensified star trackers.

    PubMed

    Yu, Wenbo; Jiang, Jie; Zhang, Guangjun

    2017-07-20

    The requirements for the dynamic performance of star trackers are rapidly increasing with the development of space exploration technologies. However, insufficient knowledge of the angular acceleration has largely decreased the performance of the existing star tracking methods, and star trackers may even fail to track under highly dynamic conditions. This study proposes a star tracking method based on multiexposure imaging for intensified star trackers. The accurate estimation model of the complete motion parameters, including the angular velocity and angular acceleration, is established according to the working characteristic of multiexposure imaging. The estimation of the complete motion parameters is utilized to generate the predictive star image accurately. Therefore, the correct matching and tracking between stars in the real and predictive star images can be reliably accomplished under highly dynamic conditions. Simulations with specific dynamic conditions are conducted to verify the feasibility and effectiveness of the proposed method. Experiments with real starry night sky observation are also conducted for further verification. Simulations and experiments demonstrate that the proposed method is effective and shows excellent performance under highly dynamic conditions.
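
    The prediction step enabled by estimating both angular velocity and angular acceleration can be sketched in one dimension as constant-acceleration propagation; the values below are illustrative, and a real tracker would propagate the full attitude and reproject each star into the image plane.

    ```python
    # 1-D sketch of the prediction step: propagate a star's boresight-frame
    # angle using estimated angular velocity and acceleration. Values are
    # illustrative, not from the paper.

    def predict_angle(theta, omega, alpha, dt):
        """Constant-acceleration propagation: theta + omega*dt + 0.5*alpha*dt^2."""
        return theta + omega * dt + 0.5 * alpha * dt**2

    theta0 = 0.10          # current star angle (rad)
    omega = 0.05           # estimated angular rate (rad/s)
    alpha = 0.20           # estimated angular acceleration (rad/s^2)
    dt = 0.1               # frame interval (s)

    pred = predict_angle(theta0, omega, alpha, dt)
    naive = theta0 + omega * dt          # velocity-only prediction
    print(f"with alpha: {pred:.4f} rad; velocity-only: {naive:.4f} rad")
    # Under high dynamics, the acceleration term keeps the predicted
    # position inside the matching window, which is what lets the
    # star tracking continue.
    ```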

  16. Cooperation guided by the coexistence of imitation dynamics and aspiration dynamics in structured populations

    NASA Astrophysics Data System (ADS)

    Xu, Kuangyi; Li, Kun; Cong, Rui; Wang, Long

    2017-02-01

    In the framework of evolutionary game theory, two fundamentally different mechanisms, the imitation process and aspiration-driven dynamics, can be adopted by players to update their strategies. In the former case, individuals imitate the strategy of a more successful peer, while in the latter case individuals change their strategies based on a comparison of the payoffs they collect in the game to their own aspiration levels. Here we explore how cooperation evolves under the coexistence of these two dynamics. Intriguingly, cooperation reaches its lowest level when a certain moderate fraction of individuals follow the aspiration-driven rule while the others follow the pairwise comparison rule. Furthermore, when individuals can adjust their update rules besides their strategies, either imitation dynamics or aspiration-driven dynamics will finally take over the entire population, and the stationary cooperation level is determined by the outcome of the competition between these two dynamics. We find that appropriate synergetic effects and a moderate aspiration level boost the fixation probability of aspiration-driven dynamics most effectively. Our work may be helpful in understanding the cooperative behavior induced by the coexistence of imitation dynamics and aspiration dynamics in society.
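
    A minimal simulation sketch of the two update rules coexisting in one population is given below; the game payoffs, Fermi temperature, and aspiration level are illustrative choices, not the paper's parameters.

    ```python
    # Imitation (Fermi pairwise comparison) vs. aspiration-driven updating
    # on a ring of players in a donation-type game. Parameters are
    # illustrative only.
    import math
    import random

    B, C = 3.0, 1.0                # benefit and cost of cooperation
    K = 0.5                        # selection intensity (Fermi rule)
    ASPIRATION = 1.0               # payoff a player deems "good enough"

    def payoff(me, left, right):
        # Sum over the two neighbors: receive B per cooperating neighbor,
        # pay C per interaction when cooperating (me, left, right in {0,1}).
        return sum(B * other - C * me for other in (left, right))

    def step(strategies, rules):
        n = len(strategies)
        i = random.randrange(n)
        pi = payoff(strategies[i], strategies[i - 1], strategies[(i + 1) % n])
        if rules[i] == "imitation":
            j = random.choice([i - 1, (i + 1) % n])
            pj = payoff(strategies[j], strategies[j - 1],
                        strategies[(j + 1) % n])
            # Fermi rule: adopt the neighbor's strategy with smooth probability.
            if random.random() < 1.0 / (1.0 + math.exp(-(pj - pi) / K)):
                strategies[i] = strategies[j]
        else:
            # Aspiration-driven: switch when the payoff falls short.
            if random.random() < 1.0 / (1.0 + math.exp(-(ASPIRATION - pi) / K)):
                strategies[i] = 1 - strategies[i]

    random.seed(0)
    n = 100
    strategies = [random.randint(0, 1) for _ in range(n)]        # 1 = cooperate
    rules = ["imitation" if random.random() < 0.5 else "aspiration"
             for _ in range(n)]
    for _ in range(20000):
        step(strategies, rules)
    print("cooperation level:", sum(strategies) / n)
    ```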

  17. Simulation of spacecraft attitude dynamics using TREETOPS and model-specific computer Codes

    NASA Technical Reports Server (NTRS)

    Cochran, John E.; No, T. S.; Fitz-Coy, Norman G.

    1989-01-01

    The simulation of spacecraft attitude dynamics and control using the generic, multi-body code called TREETOPS and other codes written especially to simulate particular systems is discussed. Differences in the methods used to derive equations of motion--Kane's method for TREETOPS and the Lagrangian and Newton-Euler methods, respectively, for the other two codes--are considered. Simulation results from the TREETOPS code are compared with those from the other two codes for two example systems. One system is a chain of rigid bodies; the other consists of two rigid bodies attached to a flexible base body. Since the computer codes were developed independently, consistent results serve as a verification of the correctness of all the programs. Differences in the results are discussed. Results for the two-rigid-body, one-flexible-body system are useful also as information on multi-body, flexible, pointing payload dynamics.

  18. Influence of backup bearings and support structure dynamics on the behavior of rotors with active supports

    NASA Technical Reports Server (NTRS)

    Flowers, George T.

    1994-01-01

    Substantial progress has been made toward the goals of this research effort in the past six months. A simplified rotor model with a flexible shaft and backup bearings has been developed. The model is based upon the work of Ishii and Kirk. Parameter studies of the behavior of this model are currently being conducted. A simple rotor model which includes a flexible disk and bearings with clearance has been developed and the dynamics of the model investigated. The study consists of simulation work coupled with experimental verification. The work is documented in the attached paper. A rotor model based upon the T-501 engine has been developed which includes backup bearing effects. The dynamics of this model are currently being studied with the objective of verifying the conclusions obtained from the simpler models. Parallel simulation runs are being conducted using an ANSYS based finite element model of the T-501.

  19. Verification, Validation, and Solution Quality in Computational Physics: CFD Methods Applied to Ice Sheet Physics

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    2005-01-01

    Procedures and methods for verification of coding algebra and for validations of models and calculations used in the aerospace computational fluid dynamics (CFD) community would be efficacious if used by the glacier dynamics modeling community. This paper presents some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modeling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modeling community, and establishes a context for these within an overall solution quality assessment. Finally, a vision of a new information architecture and interactive scientific interface is introduced and advocated.
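
    One standard code-verification ingredient of the kind this record recommends is an observed order-of-accuracy check on systematically refined grids; a minimal sketch, with made-up error values, follows.

```python
# Hedged sketch of an observed order-of-accuracy check. e_coarse and
# e_fine are discretization errors on two grids refined by a factor r;
# the sample values below are invented for illustration.
import math

def observed_order(e_coarse, e_fine, r=2.0):
    # p = log(e_coarse / e_fine) / log(r)
    return math.log(e_coarse / e_fine) / math.log(r)

print(observed_order(4.0e-3, 1.0e-3))   # -> 2.0 for a second-order scheme
```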

  20. Design and Performance Evaluation of an Electro-Hydraulic Camless Engine Valve Actuator for Future Vehicle Applications

    PubMed Central

    Nam, Kanghyun; Cho, Kwanghyun; Park, Sang-Shin; Choi, Seibum B.

    2017-01-01

    This paper details the new design and dynamic simulation of an electro-hydraulic camless engine valve actuator (EH-CEVA) and experimental verification with lift position sensors. In general, camless engine technologies have been known for improving fuel efficiency, enhancing power output, and reducing emissions of internal combustion engines. Electro-hydraulic valve actuators are used to eliminate the camshaft of an existing internal combustion engines and used to control the valve timing and valve duration independently. This paper presents novel electro-hydraulic actuator design, dynamic simulations, and analysis based on design specifications required to satisfy the operation performances. An EH-CEVA has initially been designed and modeled by means of a powerful hydraulic simulation software, AMESim, which is useful for the dynamic simulations and analysis of hydraulic systems. Fundamental functions and performances of the EH-CEVA have been validated through comparisons with experimental results obtained in a prototype test bench. PMID:29258270

  1. Material Model Evaluation of a Composite Honeycomb Energy Absorber

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Annett, Martin S.; Fasanella, Edwin L.; Polanco, Michael A.

    2012-01-01

    A study was conducted to evaluate four different material models in predicting the dynamic crushing response of solid-element-based models of a composite honeycomb energy absorber, designated the Deployable Energy Absorber (DEA). Dynamic crush tests of three DEA components were simulated using the nonlinear, explicit transient dynamic code LS-DYNA. In addition, a full-scale crash test of an MD-500 helicopter, retrofitted with DEA blocks, was simulated. The four material models used to represent the DEA included: *MAT_CRUSHABLE_FOAM (Mat 63), *MAT_HONEYCOMB (Mat 26), *MAT_SIMPLIFIED_RUBBER/FOAM (Mat 181), and *MAT_TRANSVERSELY_ANISOTROPIC_CRUSHABLE_FOAM (Mat 142). Test-analysis calibration metrics included simple percentage error comparisons of initial peak acceleration, sustained crush stress, and peak compaction acceleration of the DEA components. In addition, the Roadside Safety Verification and Validation Program (RSVVP) was used to assess similarities and differences between the experimental and analytical curves for the full-scale crash test.

  2. Design and Performance Evaluation of an Electro-Hydraulic Camless Engine Valve Actuator for Future Vehicle Applications.

    PubMed

    Nam, Kanghyun; Cho, Kwanghyun; Park, Sang-Shin; Choi, Seibum B

    2017-12-18

    This paper details the new design and dynamic simulation of an electro-hydraulic camless engine valve actuator (EH-CEVA) and experimental verification with lift position sensors. In general, camless engine technologies have been known for improving fuel efficiency, enhancing power output, and reducing emissions of internal combustion engines. Electro-hydraulic valve actuators are used to eliminate the camshaft of an existing internal combustion engines and used to control the valve timing and valve duration independently. This paper presents novel electro-hydraulic actuator design, dynamic simulations, and analysis based on design specifications required to satisfy the operation performances. An EH-CEVA has initially been designed and modeled by means of a powerful hydraulic simulation software, AMESim, which is useful for the dynamic simulations and analysis of hydraulic systems. Fundamental functions and performances of the EH-CEVA have been validated through comparisons with experimental results obtained in a prototype test bench.

  3. Transport composite fuselage technology: Impact dynamics and acoustic transmission

    NASA Technical Reports Server (NTRS)

    Jackson, A. C.; Balena, F. J.; Labarge, W. L.; Pei, G.; Pitman, W. A.; Wittlin, G.

    1986-01-01

    A program was performed to develop and demonstrate the impact dynamics and acoustic transmission technology for a composite fuselage which meets the design requirements of a 1990 large transport aircraft without substantial weight and cost penalties. The program developed the analytical methodology for the prediction of acoustic transmission behavior of advanced composite stiffened shell structures. The methodology predicted that the interior noise level in a composite fuselage due to turbulent boundary layer will be less than in a comparable aluminum fuselage. The verification of these analyses will be performed by NASA Langley Research Center using a composite fuselage shell fabricated by filament winding. The program also developed analytical methodology for the prediction of the impact dynamics behavior of lower fuselage structure constructed with composite materials. Development tests were performed to demonstrate that the composite structure designed to the same operating load requirement can have at least the same energy absorption capability as aluminum structure.

  4. Heave-pitch-roll analysis and testing of air cushion landing systems

    NASA Technical Reports Server (NTRS)

    Boghani, A. B.; Captain, K. M.; Wormley, D. N.

    1978-01-01

    The analytical tools (analysis and computer simulation) needed to explain and predict the dynamic operation of air cushion landing systems (ACLS) are described. The following tasks were performed: the development of improved analytical models for the fan and the trunk; formulation of a heave-pitch-roll analysis for the complete ACLS; development of a general purpose computer simulation to evaluate landing and taxi performance of an ACLS-equipped aircraft; and the verification and refinement of the analysis by comparison with test data obtained through lab testing of a prototype cushion. Demonstrations of simulation capabilities through typical landing and taxi simulations of an ACLS aircraft are given. Initial results show that fan dynamics have a major effect on system performance. Comparison with lab test data (zero forward speed) indicates that the analysis can predict most of the key static and dynamic parameters (pressure, deflection, acceleration, etc.) within a margin of 10 to 25 percent.

  5. Comparative analysis of dynamic pricing strategies for managed lanes.

    DOT National Transportation Integrated Search

    2015-06-01

    The objective of this research is to investigate and compare the performances of different : dynamic pricing strategies for managed lanes facilities. These pricing strategies include real-time : traffic responsive methods, as well as refund options a...

  6. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
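
    To give a feel for the Horn-clause intermediate form, here is a hedged sketch using z3py's HORN engine (Spacer, a back end SeaHorn can target) to encode a tiny loop's verification conditions; the example program and clause names are illustrative, not SeaHorn's actual encoding of any benchmark.

```python
# Horn-clause encoding of:  x = 0; while (x < 5) x = x + 1; assert x == 5
from z3 import (Function, IntSort, BoolSort, Ints, ForAll, Implies, And,
                Not, SolverFor)

Inv = Function('Inv', IntSort(), BoolSort())   # unknown loop invariant
x, xp = Ints('x xp')

s = SolverFor('HORN')
s.add(ForAll([x], Implies(x == 0, Inv(x))))                                # init
s.add(ForAll([x, xp], Implies(And(Inv(x), x < 5, xp == x + 1), Inv(xp))))  # step
s.add(ForAll([x], Implies(And(Inv(x), Not(x < 5)), x == 5)))               # safety
print(s.check())   # sat: an inductive invariant exists, the assertion holds
```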

  7. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important steps in the process. Verification insures that the CFD code is solving the equations as intended (no errors in the implementation). This is typically done either through the method of manufactured solutions (MMS) or through careful step-by-step comparisons with other verified codes. After the verification step is concluded, validation is performed to document the ability of the turbulence model to represent different types of flow physics. Validation can involve a large number of test case comparisons with experiments, theory, or DNS. Organized workshops have proved to be valuable resources for the turbulence modeling community in its pursuit of turbulence modeling verification and validation. Workshop contributors using different CFD codes run the same cases, often according to strict guidelines, and compare results. Through these comparisons, it is often possible to (1) identify codes that have likely implementation errors, and (2) gain insight into the capabilities and shortcomings of different turbulence models to predict the flow physics associated with particular types of flows. These are valuable lessons because they help bring consistency to CFD codes by encouraging the correction of faulty programming and facilitating the adoption of better models. They also sometimes point to specific areas needed for improvement in the models. In this paper, several recent workshops are summarized primarily from the point of view of turbulence modeling verification and validation. 
Furthermore, the NASA Langley Turbulence Modeling Resource website is described. The purpose of this site is to provide a central location where RANS turbulence models are documented, and test cases, grids, and data are provided. The goal of this paper is to provide an abbreviated survey of turbulence modeling verification and validation efforts, summarize some of the outcomes, and give some ideas for future endeavors in this area.
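
    A hedged sketch of the method of manufactured solutions mentioned above, reduced to a scalar 1D advection-diffusion equation for brevity; the manufactured field and symbols are arbitrary choices, not a case from the workshops.

```python
# MMS idea: pick a solution u, then derive the source term S that makes
# u exactly satisfy the modified PDE  u_t + a*u_x - nu*u_xx = S.
import sympy as sp

x, t, nu, a = sp.symbols('x t nu a')
u = sp.sin(sp.pi * x) * sp.exp(-t)          # manufactured solution (assumed)

S = sp.diff(u, t) + a * sp.diff(u, x) - nu * sp.diff(u, x, 2)
print(sp.simplify(S))

# A code solving the PDE with this S and matching initial/boundary data
# should converge to u at the discretization's formal order, which is
# checked on a sequence of refined grids.
```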

  8. An analytical Study on Dynamics of Public Procurement System and Bidding-Strategy in Local Contractor's Management

    NASA Astrophysics Data System (ADS)

    Ninomiya, Hitoshi; Nanerikawa, Susumu

    Public procurement systems, such as Overall-Evaluation bidding, have been changing dynamically for local public works in Japan. However, some characteristics of Bidding-Strategy and the procurement system have not been sufficiently clarified. This paper attempts to analyze the system dynamics and mechanism of Overall-Evaluation by developing a new simulation model focused on Bidding-Strategy, and to propose some improvement scenarios.

  9. Hypothesis testing in students: Sequences, stages, and instructional strategies

    NASA Astrophysics Data System (ADS)

    Moshman, David; Thompson, Pat A.

    Six sequences in the development of hypothesis-testing conceptions are proposed, involving (a) interpretation of the hypothesis; (b) the distinction between using theories and testing theories; (c) the consideration of multiple possibilities; (d) the relation of theory and data; (e) the nature of verification and falsification; and (f) the relation of truth and falsity. An alternative account is then provided involving three global stages: concrete operations, formal operations, and a postformal metaconstructive stage. Relative advantages and difficulties of the stage and sequence conceptualizations are discussed. Finally, three families of teaching strategy are distinguished, which emphasize, respectively: (a) social transmission of knowledge; (b) carefully sequenced empirical experience by the student; and (c) self-regulated cognitive activity of the student. It is argued on the basis of Piaget's theory that the last of these plays a crucial role in the construction of such logical reasoning strategies as those involved in testing hypotheses.

  10. Hypothesis-Testing Demands Trustworthy Data—A Simulation Approach to Inferential Statistics Advocating the Research Program Strategy

    PubMed Central

    Krefeld-Schwalb, Antonia; Witte, Erich H.; Zenker, Frank

    2018-01-01

    In psychology as elsewhere, the main statistical inference strategy to establish empirical effects is null-hypothesis significance testing (NHST). The recent failure to replicate allegedly well-established NHST-results, however, implies that such results lack sufficient statistical power, and thus feature unacceptably high error-rates. Using data-simulation to estimate the error-rates of NHST-results, we advocate the research program strategy (RPS) as a superior methodology. RPS integrates Frequentist with Bayesian inference elements, and leads from a preliminary discovery against a (random) H0-hypothesis to a statistical H1-verification. Not only do RPS-results feature significantly lower error-rates than NHST-results, RPS also addresses key-deficits of a “pure” Frequentist and a standard Bayesian approach. In particular, RPS aggregates underpowered results safely. RPS therefore provides a tool to regain the trust the discipline had lost during the ongoing replicability-crisis. PMID:29740363

  11. Hypothesis-Testing Demands Trustworthy Data-A Simulation Approach to Inferential Statistics Advocating the Research Program Strategy.

    PubMed

    Krefeld-Schwalb, Antonia; Witte, Erich H; Zenker, Frank

    2018-01-01

    In psychology as elsewhere, the main statistical inference strategy to establish empirical effects is null-hypothesis significance testing (NHST). The recent failure to replicate allegedly well-established NHST-results, however, implies that such results lack sufficient statistical power, and thus feature unacceptably high error-rates. Using data-simulation to estimate the error-rates of NHST-results, we advocate the research program strategy (RPS) as a superior methodology. RPS integrates Frequentist with Bayesian inference elements, and leads from a preliminary discovery against a (random) H 0 -hypothesis to a statistical H 1 -verification. Not only do RPS-results feature significantly lower error-rates than NHST-results, RPS also addresses key-deficits of a "pure" Frequentist and a standard Bayesian approach. In particular, RPS aggregates underpowered results safely. RPS therefore provides a tool to regain the trust the discipline had lost during the ongoing replicability-crisis.
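
    As a hedged sketch of the data-simulation idea, the snippet below estimates by Monte Carlo how often an underpowered two-sample t-test misses a small true effect; the sample size, effect size, and trial count are illustrative choices, not the paper's settings.

```python
# Monte Carlo estimate of power / type II error for a two-sample t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, d, trials = 20, 0.3, 10_000      # per-group n and true effect size d

rejections = 0
for _ in range(trials):
    a = rng.normal(0.0, 1.0, n)
    b = rng.normal(d, 1.0, n)
    _, p = stats.ttest_ind(a, b)
    rejections += (p < 0.05)

power = rejections / trials
print(f"power ~ {power:.2f}, type II error ~ {1 - power:.2f}")
```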

  12. Adaptive Control Based Harvesting Strategy for a Predator-Prey Dynamical System.

    PubMed

    Sen, Moitri; Simha, Ashutosh; Raha, Soumyendu

    2018-04-23

    This paper deals with designing a harvesting control strategy for a predator-prey dynamical system, with parametric uncertainties and exogenous disturbances. A feedback control law for the harvesting rate of the predator is formulated such that the population dynamics is asymptotically stabilized at a positive operating point, while maintaining a positive, steady state harvesting rate. The hierarchical block strict feedback structure of the dynamics is exploited in designing a backstepping control law, based on Lyapunov theory. In order to account for unknown parameters, an adaptive control strategy has been proposed in which the control law depends on an adaptive variable which tracks the unknown parameter. Further, a switching component has been incorporated to robustify the control performance against bounded disturbances. Proofs have been provided to show that the proposed adaptive control strategy ensures asymptotic stability of the dynamics at a desired operating point, as well as exact parameter learning in the disturbance-free case and learning with bounded error in the disturbance prone case. The dynamics, with uncertainty in the death rate of the predator, subjected to a bounded disturbance has been simulated with the proposed control strategy.
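
    The sketch below simulates a predator-prey system with a state-feedback harvesting rate on the predator. It uses a plain proportional law with a positive baseline as a stand-in; the record's actual controller is an adaptive backstepping design with a switching term, and every parameter value here is an assumption.

```python
# Hedged sketch: predator-prey dynamics with a harvested predator.
import numpy as np
from scipy.integrate import solve_ivp

r, K_cap = 1.0, 10.0        # prey growth rate and carrying capacity (assumed)
a_pred, e, m = 0.6, 0.5, 0.2
x_star = 4.0                # desired prey operating point (assumed)
h0, k_fb = 0.1, 0.8         # baseline harvest and feedback gain (assumed)

def dynamics(t, z):
    x, y = z                # prey, predator densities
    # Harvest more predators when prey falls below the operating point,
    # keeping the rate non-negative and positive at steady state.
    h = max(0.0, h0 + k_fb * (x_star - x))
    dx = r * x * (1 - x / K_cap) - a_pred * x * y
    dy = e * a_pred * x * y - m * y - h * y
    return [dx, dy]

sol = solve_ivp(dynamics, (0, 50), [2.0, 1.0])
print(sol.y[:, -1])         # state near the end of the horizon
```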

  13. A Low-Cost Data Acquisition System for Automobile Dynamics Applications

    PubMed Central

    González, Alejandro; Vinolas, Jordi

    2018-01-01

    This project addresses the need for the implementation of low-cost acquisition technology in the field of vehicle engineering: the design, development, manufacture, and verification of a low-cost Arduino-based data acquisition platform to be used in <80 Hz data acquisition in vehicle dynamics, using low-cost accelerometers. In addition to this, a comparative study is carried out of professional vibration acquisition technologies and low-cost systems, obtaining optimum results for low- and medium-frequency operations with an error of 2.19% on road tests. It is therefore concluded that these technologies are applicable to the automobile industry, thereby allowing the project costs to be reduced and thus facilitating access to this kind of research that requires limited resources. PMID:29382039

  14. A Low-Cost Data Acquisition System for Automobile Dynamics Applications.

    PubMed

    González, Alejandro; Olazagoitia, José Luis; Vinolas, Jordi

    2018-01-27

    This project addresses the need for the implementation of low-cost acquisition technology in the field of vehicle engineering: the design, development, manufacture, and verification of a low-cost Arduino-based data acquisition platform to be used in <80 Hz data acquisition in vehicle dynamics, using low-cost accelerometers. In addition to this, a comparative study is carried out of professional vibration acquisition technologies and low-cost systems, obtaining optimum results for low- and medium-frequency operations with an error of 2.19% on road tests. It is therefore concluded that these technologies are applicable to the automobile industry, thereby allowing the project costs to be reduced and thus facilitating access to this kind of research that requires limited resources.
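
    On the host side, such a platform typically streams samples over USB serial; a hedged Python reader is sketched below, where the port name, baud rate, and comma-separated line format are assumptions rather than the paper's protocol.

```python
# Hedged sketch of a host-side reader for a low-cost serial DAQ board
# streaming accelerometer samples at well under 80 Hz.
import serial  # pyserial

with serial.Serial("/dev/ttyACM0", 115200, timeout=1.0) as port:
    for _ in range(100):
        line = port.readline().decode(errors="ignore").strip()
        if not line:
            continue  # timeout or partial line
        # Assumed line format: "t_ms,ax,ay,az"
        t_ms, ax, ay, az = (float(v) for v in line.split(","))
        print(t_ms, ax, ay, az)
```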

  15. Benchmark solution of the dynamic response of a spherical shell at finite strain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Versino, Daniele; Brock, Jerry S.

    2016-09-28

    Our paper describes the development of high fidelity solutions for the study of homogeneous (elastic and inelastic) spherical shells subject to dynamic loading and undergoing finite deformations. The goal of the activity is to provide high accuracy results that can be used as benchmark solutions for the verification of computational physics codes. The equilibrium equations for the geometrically non-linear problem are solved through mode expansion of the displacement field, and the boundary conditions are enforced in a strong form. Time integration is performed through high-order implicit Runge–Kutta schemes. Finally, we evaluate accuracy and convergence of the proposed method by means of numerical examples with finite deformations and material non-linearities and inelasticity.
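
    For flavor, here is a minimal implicit time integrator of the family the authors use (implicit Runge-Kutta), shown as the second-order implicit midpoint rule on a linear oscillator; the fixed-point inner solve and all values are illustrative, not the paper's high-order scheme.

```python
# Implicit midpoint rule: y_{n+1} = y_n + h*f(t_n + h/2, (y_n + y_{n+1})/2),
# written in stage form k = f(t + h/2, y + h/2 * k).
import numpy as np

def implicit_midpoint(f, y0, t0, t1, steps):
    y, t = np.asarray(y0, float), t0
    h = (t1 - t0) / steps
    for _ in range(steps):
        k = f(t + h / 2, y)          # initial guess for the stage
        for _ in range(50):          # fixed-point iteration (assumed solver)
            k = f(t + h / 2, y + h / 2 * k)
        y, t = y + h * k, t + h
    return y

# Example: oscillator y'' + y = 0 as a first-order system; one full period
# should return the state close to the initial condition [1, 0].
f = lambda t, y: np.array([y[1], -y[0]])
print(implicit_midpoint(f, [1.0, 0.0], 0.0, 2 * np.pi, 200))
```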

  16. Methods for evaluating the predictive accuracy of structural dynamic models

    NASA Technical Reports Server (NTRS)

    Hasselman, T. K.; Chrostowski, Jon D.

    1990-01-01

    Uncertainty of frequency response using the fuzzy set method and on-orbit response prediction using laboratory test data to refine an analytical model are emphasized with respect to large space structures. Two aspects of the fuzzy set approach were investigated relative to its application to large structural dynamics problems: (1) minimizing the number of parameters involved in computing possible intervals; and (2) the treatment of extrema which may occur in the parameter space enclosed by all possible combinations of the important parameters of the model. Extensive printer graphics were added to the SSID code to help facilitate model verification, and an application of this code to the LaRC Ten Bay Truss is included in the appendix to illustrate this graphics capability.

  17. Coherent optimal control of photosynthetic molecules

    NASA Astrophysics Data System (ADS)

    Caruso, F.; Montangero, S.; Calarco, T.; Huelga, S. F.; Plenio, M. B.

    2012-04-01

    We demonstrate theoretically that open-loop quantum optimal control techniques can provide efficient tools for the verification of various quantum coherent transport mechanisms in natural and artificial light-harvesting complexes under realistic experimental conditions. To assess the feasibility of possible biocontrol experiments, we introduce the main settings and derive optimally shaped and robust laser pulses that allow for the faithful preparation of specified initial states (such as localized excitation or coherent superposition, i.e., propagating and nonpropagating states) of the photosystem and probe efficiently the subsequent dynamics. With these tools, different transport pathways can be discriminated, which should facilitate the elucidation of genuine quantum dynamical features of photosystems and therefore enhance our understanding of the role that coherent processes may play in actual biological complexes.

  18. Research on flow stress model and dynamic recrystallization model of X12CrMoWVNbN10-1-1 steel

    NASA Astrophysics Data System (ADS)

    Sui, Da-shan; Wang, Wei; Fu, Bo; Cui, Zhen-shan

    2013-05-01

    Plastic deformation behavior of X12CrMoWVNbN10-1-1 ferrite heat-resistant steel was studied systematically at high temperature. The stress-strain curves were measured at temperatures of 950°C to 1250°C and strain rates of 0.0005 s⁻¹ to 0.1 s⁻¹ using a Gleeble thermo-mechanical simulator. The flow stress model and dynamic recrystallization model were established based on the Laasraoui two-stage model. The activation energy was calculated and the model parameters were determined from the experimental results and the Sellars creep equation. Verification showed that the calculated results agree well with the experimental data.
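
    A hedged sketch of the Sellars-type (hyperbolic-sine Arrhenius) relation underlying such flow-stress models: compute the Zener-Hollomon parameter and invert for stress. The activation energy and fitted constants below are placeholders, not the values identified for this steel.

```python
# Z = eps_dot * exp(Q/(R*T));  Z = A_c * [sinh(alpha*sigma)]**n
import math

R = 8.314            # J/(mol K)
Q = 450e3            # activation energy [J/mol], placeholder
A_c, alpha, n = 1.0e16, 0.012, 5.0   # fitted constants, placeholders

def peak_stress(strain_rate, T_kelvin):
    # Zener-Hollomon parameter for the given deformation condition.
    Z = strain_rate * math.exp(Q / (R * T_kelvin))
    # Invert the hyperbolic-sine law for sigma (units set by alpha, here MPa).
    return math.asinh((Z / A_c) ** (1.0 / n)) / alpha

print(peak_stress(0.01, 1273.15))   # stress at 1000 degC, 0.01 1/s
```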

  19. A complex systems approach to evaluate HIV prevention in metropolitan areas: preliminary implications for combination intervention strategies.

    PubMed

    Marshall, Brandon D L; Paczkowski, Magdalena M; Seemann, Lars; Tempalski, Barbara; Pouget, Enrique R; Galea, Sandro; Friedman, Samuel R

    2012-01-01

    HIV transmission among injecting and non-injecting drug users (IDU, NIDU) is a significant public health problem. Continuing propagation in endemic settings and emerging regional outbreaks have indicated the need for comprehensive and coordinated HIV prevention. We describe the development of a conceptual framework and calibration of an agent-based model (ABM) to examine how combinations of interventions may reduce and potentially eliminate HIV transmission among drug-using populations. A multidisciplinary team of researchers from epidemiology, sociology, geography, and mathematics developed a conceptual framework based on prior ethnographic and epidemiologic research. An ABM was constructed and calibrated through an iterative design and verification process. In the model, "agents" represent IDU, NIDU, and non-drug users who interact with each other and within risk networks, engaging in sexual and, for IDUs, injection-related risk behavior over time. Agents also interact with simulated HIV prevention interventions (e.g., syringe exchange programs, substance abuse treatment, HIV testing) and initiate antiretroviral treatment (ART) in a stochastic manner. The model was constructed to represent the New York metropolitan statistical area (MSA) population, and calibrated by comparing output trajectories for various outcomes (e.g., IDU/NIDU prevalence, HIV prevalence and incidence) against previously validated MSA-level data. The model closely approximated HIV trajectories in IDU and NIDU observed in New York City between 1992 and 2002, including a linear decrease in HIV prevalence among IDUs. Exploratory results are consistent with empirical studies demonstrating that the effectiveness of a combination of interventions, including syringe exchange expansion and ART provision, dramatically reduced HIV prevalence among IDUs during this time period. Complex systems models of adaptive HIV transmission dynamics can be used to identify potential collective benefits of hypothetical combination prevention interventions. Future work will seek to inform novel strategies that may lead to more effective and equitable HIV prevention strategies for drug-using populations.
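
    A heavily simplified skeleton of such an agent-based model is sketched below; the agent attributes, transmission probabilities, and ART hook are illustrative placeholders, not the calibrated New York MSA model.

```python
# Minimal ABM skeleton: agents in a risk network, stochastic transmission,
# and stochastic ART initiation as one intervention hook.
import random
from dataclasses import dataclass, field

@dataclass
class Agent:
    kind: str                      # "IDU", "NIDU", or "nonuser"
    hiv: bool = False
    on_art: bool = False
    partners: list = field(default_factory=list)

def step(agents, p_transmit=0.01, p_art=0.05):
    for a in agents:
        if a.hiv and not a.on_art and random.random() < p_art:
            a.on_art = True        # stochastic ART initiation
        for b in a.partners:       # risk-network contacts
            if a.hiv and not b.hiv:
                risk = p_transmit * (0.1 if a.on_art else 1.0)
                if random.random() < risk:
                    b.hiv = True

a, b = Agent("IDU", hiv=True), Agent("IDU")
a.partners.append(b)
step([a, b])
print(b.hiv)
```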

  20. Gravity Fields and Interiors of the Saturnian Satellites

    NASA Technical Reports Server (NTRS)

    Rappaport, N. J.; Armstrong, J. W.; Asmar, Sami W.; Iess, L.; Tortora, P.; Somenzi, L.; Zingoni, F.

    2006-01-01

    This viewgraph presentation reviews the Gravity Science Objectives and accomplishments of the Cassini Radio Science Team: (1) Mass and density of icy satellites (2) Quadrupole field of Titan and Rhea (3) Dynamic Love number of Titan (4) Moment of inertia of Titan (in collaboration with the Radar Team) (5) Gravity field of Saturn. The proposed measurements for the extended tour are: (1) Quadrupole field of Enceladus (2) More accurate measurement of Titan k2 (3) Local gravity/topography correlations for Iapetus (4) Verification/disproof of "Pioneer anomaly".

  1. Virtual Factory Framework for Supporting Production Planning and Control.

    PubMed

    Kibira, Deogratias; Shao, Guodong

    2017-01-01

    Developing optimal production plans for smart manufacturing systems is challenging because shop floor events change dynamically. A virtual factory incorporating engineering tools, simulation, and optimization generates and communicates performance data to guide wise decision making for different control levels. This paper describes such a platform specifically for production planning. We also discuss verification and validation of the constituent models. A case study of a machine shop is used to demonstrate data generation for production planning in a virtual factory.

  2. Verification of a Non-Hydrostatic Dynamical Core Using Horizontally Spectral Element Vertically Finite Difference Method: 2D Aspects

    DTIC Science & Technology

    2014-04-01

    The Euler equations are in flux form based on the hydrostatic pressure vertical coordinate, the same as that used in the Weather Research and Forecasting (WRF) Model, but with a hybrid sigma...

  3. A computer simulation of Skylab dynamics and attitude control for performance verification and operational support

    NASA Technical Reports Server (NTRS)

    Buchanan, H.; Nixon, D.; Joyce, R.

    1974-01-01

    A simulation of the Skylab attitude and pointing control system (APCS) is outlined and discussed. Implementation is via a large hybrid computer and includes those factors affecting system momentum management, propellant consumption, and overall vehicle performance. The important features of the flight system are discussed; the mathematical models necessary for this treatment are outlined; and the decisions involved in implementation are discussed. A brief summary of the goals and capabilities of this tool is also included.

  4. Analytic, empirical and delta method temperature derivatives of D-D and D-T fusion reactivity formulations, as a means of verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langenbrunner, James R.; Booker, Jane M.

    We examine the derivatives with respect to temperature for various deuterium-tritium (D-T) and deuterium-deuterium (D-D) fusion-reactivity formulations. Langenbrunner and Makaruk [1] studied this as a means of understanding the time and temperature domain of reaction history measured in dynamic fusion experiments. Here, we consider the temperature-derivative dependence of fusion reactivity as a means of exercising and verifying the consistency of the various reactivity formulations.
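
    The consistency check described here can be mimicked by comparing an analytic temperature derivative of a reactivity formula against a central-difference estimate; the Gamow-type D-T parameterization below is a textbook-style stand-in, not one of the paper's formulations, and the prefactor is arbitrary.

```python
# sigv(T) ~ C * T**(-2/3) * exp(-B * T**(-1/3)), T in keV (classical form).
import math

B, C = 19.94, 1.0   # B ~ 19.94 keV**(1/3) for D-T; C arbitrary here

def sigv(T):
    return C * T ** (-2.0 / 3.0) * math.exp(-B * T ** (-1.0 / 3.0))

def dsigv_dT_analytic(T):
    # d(ln sigv)/dT = -2/(3T) + B/(3*T**(4/3)), so multiply by sigv(T).
    return sigv(T) * (-2.0 / (3.0 * T) + B / (3.0 * T ** (4.0 / 3.0)))

def dsigv_dT_numeric(T, h=1e-4):
    return (sigv(T + h) - sigv(T - h)) / (2 * h)

T = 10.0
print(dsigv_dT_analytic(T), dsigv_dT_numeric(T))   # should agree closely
```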

  5. Heterogeneous delivering capability promotes traffic efficiency in complex networks

    NASA Astrophysics Data System (ADS)

    Zhu, Yan-Bo; Guan, Xiang-Min; Zhang, Xue-Jun

    2015-12-01

    Traffic is one of the most fundamental dynamical processes in networked systems. Assuming homogeneous delivery capability of nodes, the global dynamic routing strategy proposed by Ling et al. [Phys. Rev. E 81, 016113 (2010)] makes full use of dynamic information during the delivery process and thus reaches a quite high network capacity. In this paper, building on the global dynamic routing strategy, we propose a heterogeneous delivery-capability allocation strategy for nodes on scale-free networks that takes node degree into account. It is found that the network capacity, as well as other indices reflecting transportation efficiency, is further improved. Our work may be useful for the design of more efficient routing strategies in communication or transportation systems.
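
    A hedged sketch of the idea: give each node a delivery capacity that grows with its degree, layered on a queue-aware (global dynamic) path choice. The capacity rule C_i = 1 + beta*k_i and all constants are assumptions for illustration.

```python
import networkx as nx

beta = 0.5
G = nx.barabasi_albert_graph(200, 3, seed=1)   # scale-free test network
queue = {i: 0 for i in G.nodes}                # instantaneous queue lengths

# Heterogeneous delivery capability: hubs forward more packets per step.
capacity = {i: 1 + int(beta * G.degree[i]) for i in G.nodes}

def dynamic_path(src, dst):
    # Global-dynamic-routing flavor: prefer paths whose nodes currently
    # hold short queues (edge weight = 1 hop + downstream queue length).
    return nx.dijkstra_path(G, src, dst, weight=lambda u, v, d: 1 + queue[v])

print(len(dynamic_path(0, 199)), max(capacity.values()))
```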

  6. A Perspective on Computational Human Performance Models as Design Tools

    NASA Technical Reports Server (NTRS)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  7. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  8. Evolutionary dynamics of fearfulness and boldness.

    PubMed

    Ji, Ting; Zhang, Boyu; Sun, Yuehua; Tao, Yi

    2009-02-21

    A negative relationship between reproductive effort and survival is consistent with life-history theory. Evolutionary dynamics and the evolutionarily stable strategy (ESS) for the trade-off between survival and reproduction are investigated using a simple model with two phenotypes, fearfulness and boldness. The dynamical stability of the pure-strategy model and analysis of ESS conditions reveal that: (i) the simple coexistence of fearfulness and boldness is impossible; (ii) a small population size is favorable to fearfulness, but a large population size is favorable to boldness, i.e., neither fearfulness nor boldness is always favored by natural selection; and (iii) the dynamics of population density is crucial for a proper understanding of the strategy dynamics.

  9. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    18 Conservation of Power and Water Resources, § 281.213 Data Verification Committee: (a) Each interstate pipeline shall establish a Data Verification... (e) The Data Verification Committee shall prepare a report concerning the proposed index of...

  10. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    18 Conservation of Power and Water Resources, § 281.213 Data Verification Committee: (a) Each interstate pipeline shall establish a Data Verification... (e) The Data Verification Committee shall prepare a report concerning the proposed index of...

  11. The politics of verification and the control of nuclear tests, 1945-1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallagher, N.W.

    1990-01-01

    This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. Clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.

  12. Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.

    1983-01-01

    A number of methodologies for verifying systems, and computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: the STP theorem prover; design verification of SIFT; high-level language code verification; assembly-language-level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

  13. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    § 1065.920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the... verifications and analysis. It may also be necessary to limit the range of conditions under which the PEMS can... additional information or analysis to support your conclusions. (b) Overall verification. This paragraph (b...

  14. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  15. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    30 Mineral Resources, ... CONTINENTAL SHELF, Platforms and Structures, Platform Verification Program. § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication...

  16. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    30 Mineral Resources, Platforms and Structures, Platform Verification Program. § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...

  17. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  18. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  19. A Fluid Structure Interaction Strategy with Application to Low Reynolds Number Flapping Flight

    DTIC Science & Technology

    2010-01-01

    The equations governing the dynamics of the flow and the structure are simultaneously advanced in time using a predictor-corrector strategy. Dynamic fluid grid adaptation is implemented to reduce the number of grid points and computation costs.

  20. Gender Dimorphic ACL Strain In Response to Combined Dynamic 3D Knee Joint Loading: Implications for ACL Injury Risk

    PubMed Central

    Mizuno, Kiyonori; Andrish, Jack T.; van den Bogert, Antonie J.; McLean, Scott G.

    2009-01-01

    While gender-based differences in knee joint anatomies/laxities are well documented, the potential for them to precipitate gender-dimorphic ACL loading and resultant injury risk has not been considered. To this end, we generated gender-specific models of ACL strain as a function of any six degrees of freedom (6DOF) knee joint load state via a combined cadaveric and analytical approach. Continuously varying joint forces and torques were applied to five male and five female cadaveric specimens and recorded along with synchronous knee flexion and ACL strain data. All data (~10,000 samples) were submitted to specimen-specific regression analyses, affording ACL strain predictions as a function of the combined 6DOF knee loads. Following individual model verifications, generalized gender-specific models were generated and subjected to 6DOF external load scenarios consistent with both a clinical examination and a dynamic sports maneuver. The ensuing model-based strain predictions were subsequently examined for gender-based discrepancies. Male and female specimen-specific models predicted ACL strain within 0.51% ± 0.10% and 0.52% ± 0.07% of the measured data respectively, and explained more than 75% of the associated variance in each case. Predicted female ACL strains were also significantly larger than respective male values for both simulated 6DOF load scenarios. Outcomes suggest that the female ACL will rupture in response to comparatively smaller external load applications. Future work must address the underlying anatomical/laxity contributions to knee joint mechanics and resultant ACL loading, ultimately affording prevention strategies that may cater to individual joint vulnerabilities. PMID:19464897
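
    The specimen-specific regression step can be pictured as an ordinary least-squares fit of ACL strain against the six load components; the synthetic data below merely stand in for the roughly 10,000 cadaveric samples per specimen, and the coefficients are invented.

```python
# Least-squares fit of strain vs. six load components (synthetic data).
import numpy as np

rng = np.random.default_rng(3)
loads = rng.normal(size=(10_000, 6))          # 6DOF forces/torques
true_coef = np.array([0.8, -0.2, 0.5, 1.1, -0.4, 0.3])
strain = loads @ true_coef + 0.5 * rng.normal(size=10_000)

# Fit with an intercept column; the record reports ~0.5% strain prediction
# error and R^2 > 0.75 for its specimen-specific models.
coef, *_ = np.linalg.lstsq(np.c_[loads, np.ones(len(loads))], strain,
                           rcond=None)
print(coef[:6])
```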
