Science.gov

Sample records for battery algorithm verification

  1. Battery algorithm verification and development using hardware-in-the-loop testing

    NASA Astrophysics Data System (ADS)

    He, Yongsheng; Liu, Wei; Koch, Brian J.

    Battery algorithms play a vital role in hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), extended-range electric vehicles (EREVs), and electric vehicles (EVs). The energy management of hybrid and electric propulsion systems needs to rely on accurate information on the state of the battery in order to determine the optimal electric drive without abusing the battery. In this study, a cell-level hardware-in-the-loop (HIL) system is used to verify and develop state of charge (SOC) and power capability predictions of embedded battery algorithms for various vehicle applications. Two different batteries were selected as representative examples to illustrate the battery algorithm verification and development procedure. One is a lithium-ion battery with a conventional metal oxide cathode, which is a power battery for HEV applications. The other is a lithium-ion battery with an iron phosphate (LiFePO4) cathode, which is an energy battery for applications in PHEVs, EREVs, and EVs. The battery cell HIL testing provided valuable data and critical guidance to evaluate the accuracy of the developed battery algorithms, to accelerate future battery algorithm development and improvement, and to reduce hybrid/electric vehicle system development time and costs.
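
    The abstract does not reproduce the embedded algorithms themselves, so the sketch below shows only the generic SOC bookkeeping that such cell-level HIL testing would exercise: plain coulomb counting blended with an open-circuit-voltage correction. The capacity, OCV table, and blending weight are invented for illustration and are not taken from the study.

```python
# Minimal coulomb-counting SOC estimator of the kind a cell-level HIL rig
# might exercise; parameters and the OCV table are illustrative only.
import numpy as np

CAPACITY_AH = 2.3          # assumed nominal cell capacity
# Hypothetical open-circuit-voltage lookup (SOC fraction -> volts)
OCV_SOC = np.array([0.0, 0.2, 0.5, 0.8, 1.0])
OCV_V   = np.array([3.0, 3.25, 3.3, 3.33, 3.4])

def update_soc(soc, current_a, dt_s, capacity_ah=CAPACITY_AH):
    """Propagate SOC by coulomb counting (discharge current positive)."""
    return float(np.clip(soc - current_a * dt_s / (capacity_ah * 3600.0), 0.0, 1.0))

def soc_from_ocv(voltage_v):
    """Rest-voltage correction: invert the OCV table by interpolation."""
    return float(np.interp(voltage_v, OCV_V, OCV_SOC))

# Example: one second of 1 C discharge followed by a rest-voltage correction
soc = 0.8
soc = update_soc(soc, current_a=2.3, dt_s=1.0)
soc = 0.5 * soc + 0.5 * soc_from_ocv(3.3)   # crude blend with the OCV estimate
print(round(soc, 4))
```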

  2. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

    The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.

  3. Universal charge algorithm for telecommunication batteries

    SciTech Connect

    Tsenter, B.; Schwartzmiller, F.

    1997-12-01

    Three chemistries are used extensively in today's portable telecommunication devices: nickel-cadmium, nickel-metal hydride, and lithium-ion. Nickel-cadmium and nickel-metal hydride batteries (also referred to as nickel-based batteries) are well known, while lithium-ion batteries are less known. A universal charging algorithm should satisfactorily charge all chemistries while providing recognition among them. Total Battery Management, Inc. (TBM) has developed individual charging algorithms for nickel-based and lithium-ion batteries and a procedure for recognition, if necessary, to incorporate in a universal algorithm. TBM's charging philosophy is to first understand the battery from a chemical point of view and then provide an electronic solution.
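
    The abstract does not spell out how chemistry recognition works, so the following sketch uses a common rule of thumb (resting per-cell voltage) to pick between nickel-based and lithium-ion charge regimes; the thresholds and charge-mode descriptions are generic assumptions, not TBM's actual criteria.

```python
# Illustrative chemistry recognition and charge-mode selection.
def detect_chemistry(cell_voltage_v):
    """Guess chemistry from resting per-cell voltage (hypothetical thresholds)."""
    if cell_voltage_v >= 2.5:
        return "li-ion"          # healthy Li-ion cells rest well above 2.5 V
    return "nickel-based"        # NiCd/NiMH cells rest near 1.2-1.4 V

def select_charge_mode(chemistry):
    """Map chemistry to a canonical charge regime."""
    return {
        "li-ion": "constant current then constant voltage (CC-CV)",
        "nickel-based": "constant current with -dV/dt and dT/dt termination",
    }[chemistry]

for v in (1.35, 3.7):
    chem = detect_chemistry(v)
    print(v, chem, "->", select_charge_mode(chem))
```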

  4. Fusing face-verification algorithms and humans.

    PubMed

    O'Toole, Alice J; Abdi, Hervé; Jiang, Fang; Phillips, P Jonathon

    2007-10-01

    It has been demonstrated recently that state-of-the-art face-recognition algorithms can surpass human accuracy at matching faces over changes in illumination. The ranking of algorithms and humans by accuracy, however, does not provide information about whether algorithms and humans perform the task comparably or whether algorithms and humans can be fused to improve performance. In this paper, we fused humans and algorithms using partial least squares regression (PLSR). In the first experiment, we applied PLSR to face-pair similarity scores generated by seven algorithms participating in the Face Recognition Grand Challenge. The PLSR produced an optimal weighting of the similarity scores, which we tested for generality with a jackknife procedure. Fusing the algorithms' similarity scores using the optimal weights produced a twofold reduction of error rate over the most accurate algorithm. Next, human-subject-generated similarity scores were added to the PLSR analysis. Fusing humans and algorithms increased the performance to near-perfect classification accuracy. These results are discussed in terms of maximizing face-verification accuracy with hybrid systems consisting of multiple algorithms and humans. PMID:17926698
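
    As a rough illustration of score-level fusion with PLSR, the sketch below fits scikit-learn's PLSRegression to synthetic similarity scores; the data, the number of components, and the 0.5 decision threshold are assumptions, and unlike the paper no jackknife generalization test is performed.

```python
# Sketch of score-level fusion with partial least squares regression; the
# data are synthetic, not the FRGC similarity scores used in the paper.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_pairs = 200
labels = rng.integers(0, 2, n_pairs)                  # 1 = same face, 0 = different
# Seven noisy "algorithm" similarity scores plus one "human" score per face pair
scores = labels[:, None] + rng.normal(0, 1.0, (n_pairs, 8))

pls = PLSRegression(n_components=3)
pls.fit(scores, labels.astype(float))                 # learns a weighting of the scores
fused = pls.predict(scores).ravel()                   # fused similarity score

accuracy = np.mean((fused > 0.5) == labels)
print(f"fused accuracy on training pairs: {accuracy:.2f}")
```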

  5. Testing Conducted for Lithium-Ion Cell and Battery Verification

    NASA Technical Reports Server (NTRS)

    Reid, Concha M.; Miller, Thomas B.; Manzo, Michelle A.

    2004-01-01

    The NASA Glenn Research Center has been conducting in-house testing in support of NASA's Lithium-Ion Cell Verification Test Program, which is evaluating the performance of lithium-ion cells and batteries for NASA mission operations. The test program is supported by NASA's Office of Aerospace Technology under the NASA Aerospace Flight Battery Systems Program, which serves to bridge the gap between the development of technology advances and the realization of these advances into mission applications. During fiscal year 2003, much of the in-house testing effort focused on the evaluation of a flight battery originally intended for use on the Mars Surveyor Program 2001 Lander. Results of this testing will be compared with the results for similar batteries being tested at the Jet Propulsion Laboratory, the Air Force Research Laboratory, and the Naval Research Laboratory. Ultimately, this work will be used to validate lithium-ion battery technology for future space missions. The Mars Surveyor Program 2001 Lander battery was characterized at several different voltages and temperatures before life-cycle testing was begun. During characterization, the battery displayed excellent capacity and efficiency characteristics across a range of temperatures and charge/discharge conditions. Currently, the battery is undergoing life-cycle testing at 0 °C and 40-percent depth of discharge under low-Earth-orbit (LEO) conditions.

  6. Control Algorithms Charge Batteries Faster

    NASA Technical Reports Server (NTRS)

    2012-01-01

    On March 29, 2011, NASA's Mercury Surface, Space Environment, Geochemistry and Ranging (MESSENGER) spacecraft beamed a milestone image to Earth: the first photo of Mercury taken from orbit around the solar system's innermost planet. (MESSENGER is also the first spacecraft to orbit Mercury.) Like most of NASA's deep space probes, MESSENGER is enabled by a complex power system that allows its science instruments and communications to function continuously as it travels millions of miles from Earth. "Typically, there isn't one particular power source that can support the entire mission," says Linda Taylor, electrical engineer in Glenn Research Center's Power Systems Analysis Branch. "If you have solar arrays and you are in orbit, at some point you're going to be in eclipse." Because of this, Taylor explains, spacecraft like MESSENGER feature hybrid power systems. MESSENGER is powered by a two-panel solar array coupled with a nickel hydrogen battery. The solar arrays provide energy to the probe and charge the battery; when the spacecraft's orbit carries it behind Mercury and out of the Sun's light, the spacecraft switches to battery power to continue operations. Typically, hybrid systems with multiple power inputs and a battery acting alternately as storage and a power source require multiple converters to handle the power flow between the devices, Taylor says. (Power converters change the qualities of electrical energy, such as from alternating current to direct current, or between different levels of voltage or frequency.) This contributes to a pair of major concerns for spacecraft design. "Weight and size are big drivers for any space application," Taylor says, noting that every pound added to a space vehicle incurs significant costs. For an innovative solution to managing power flows in a lightweight, cost-effective manner, NASA turned to a private industry partner.

  7. Formal verification of an oral messages algorithm for interactive consistency

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.

  8. Battery Technology Life Verification Test Manual Revision 1

    SciTech Connect

    Jon P. Christophersen

    2012-12-01

    The purpose of this Technology Life Verification Test (TLVT) Manual is to help guide developers in their effort to successfully commercialize advanced energy storage devices such as battery and ultracapacitor technologies. The experimental design and data analysis discussed herein are focused on automotive applications based on the United States Advanced Battery Consortium (USABC) electric vehicle, hybrid electric vehicle, and plug-in hybrid electric vehicle (EV, HEV, and PHEV, respectively) performance targets. However, the methodology can be equally applied to other applications as well. This manual supersedes the February 2005 version of the TLVT Manual (Reference 1). It includes criteria for statistically-based life test matrix designs as well as requirements for test data analysis and reporting. Calendar life modeling and estimation techniques, including a user's guide to the corresponding software tool, are now provided in the Battery Life Estimator (BLE) Manual (Reference 2).

  9. Effect of object identification algorithms on feature based verification scores

    NASA Astrophysics Data System (ADS)

    Weniger, Michael; Friederichs, Petra

    2015-04-01

    Many modern spatial verification techniques rely on feature identification algorithms. We study the importance of the choice of algorithm and its parameters for the resulting scores. SAL is used as an example to show that these choices have a statistically significant impact on the distributions of object dependent scores. Non-continuous operators used for feature identification are identified as the underlying reason for the observed stability issues, with implications for many feature based verification techniques.

  10. Verification of IEEE Compliant Subtractive Division Algorithms

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.; Leathrum, James F., Jr.

    1996-01-01

    A parameterized definition of subtractive floating point division algorithms is presented and verified using PVS. The general algorithm is proven to satisfy a formal definition of an IEEE standard for floating point arithmetic. The utility of the general specification is illustrated using a number of different instances of the general algorithm.

  11. Formal verification of a fault tolerant clock synchronization algorithm

    NASA Technical Reports Server (NTRS)

    Rushby, John; Vonhenke, Frieder

    1989-01-01

    A formal specification and mechanically assisted verification of the interactive convergence clock synchronization algorithm of Lamport and Melliar-Smith is described. Several technical flaws in the analysis given by Lamport and Melliar-Smith were discovered, even though their presentation is unusually precise and detailed. It seems that these flaws were not detected by informal peer scrutiny. The flaws are discussed and a revised presentation of the analysis is given that not only corrects the flaws but is also more precise and easier to follow. Some of the corrections to the flaws require slight modifications to the original assumptions underlying the algorithm and to the constraints on its parameters, and thus change the external specifications of the algorithm. The formal analysis of the interactive convergence clock synchronization algorithm was performed using the Enhanced Hierarchical Development Methodology (EHDM) formal specification and verification environment. This application of EHDM provides a demonstration of some of the capabilities of the system.

  12. Monitoring and Commissioning Verification Algorithms for CHP Systems

    SciTech Connect

    Brambley, Michael R.; Katipamula, Srinivas; Jiang, Wei

    2008-03-31

    This document provides the algorithms for CHP system performance monitoring and commissioning verification (CxV). It starts by presenting system-level and component-level performance metrics, followed by descriptions of algorithms for performance monitoring and commissioning verification, using the metrics presented earlier. Verification of commissioning is accomplished essentially by comparing actual measured performance to benchmarks for performance provided by the system integrator and/or component manufacturers. The results of these comparisons are then automatically interpreted to provide conclusions regarding whether the CHP system and its components have been properly commissioned and, where problems are found, guidance is provided for corrections. A discussion of uncertainty handling is then provided, which is followed by a description of how simulation models can be used to generate data for testing the algorithms. A model is described for simulating a CHP system consisting of a micro-turbine, an exhaust-gas heat recovery unit that produces hot water, an absorption chiller and a cooling tower. The process for using this model for generating data for testing the algorithms for a selected set of faults is described. The next section applies the algorithms developed to CHP laboratory and field data to illustrate their use. The report then concludes with a discussion of the need for laboratory testing of the algorithms on physical CHP systems and identification of the recommended next steps.
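
    A minimal sketch of the benchmark-comparison step described above is given below; the metric names, benchmark values, and the 10% tolerance are placeholders rather than the report's actual thresholds.

```python
# Benchmark-comparison check of the kind the CxV algorithms perform;
# metric names, benchmarks, and tolerance are placeholders.
def verify_metric(measured, benchmark, rel_tolerance=0.10):
    """Flag a component if measured performance falls below benchmark minus tolerance."""
    shortfall = (benchmark - measured) / benchmark
    return shortfall <= rel_tolerance, shortfall

checks = {
    "microturbine electrical efficiency": (0.24, 0.26),   # (measured, benchmark)
    "heat recovery effectiveness": (0.55, 0.70),
}
for metric, (meas, bench) in checks.items():
    ok, short = verify_metric(meas, bench)
    status = "commissioned OK" if ok else f"investigate ({short:.0%} below benchmark)"
    print(f"{metric}: {status}")
```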

  13. Formal Verification of Air Traffic Conflict Prevention Bands Algorithms

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony J.; Munoz, Cesar A.; Dowek, Gilles

    2010-01-01

    In air traffic management, a pairwise conflict is a predicted loss of separation between two aircraft, referred to as the ownship and the intruder. A conflict prevention bands system computes ranges of maneuvers for the ownship that characterize regions in the airspace that are either conflict-free or 'don't go' zones that the ownship has to avoid. Conflict prevention bands are surprisingly difficult to define and analyze. Errors in the calculation of prevention bands may result in incorrect separation assurance information being displayed to pilots or air traffic controllers. This paper presents provably correct 3-dimensional prevention bands algorithms for ranges of track angle, ground speed, and vertical speed maneuvers. The algorithms have been mechanically verified in the Prototype Verification System (PVS). The verification presented in this paper extends in a non-trivial way that of previously published 2-dimensional algorithms.

  14. Mechanical verification of a schematic Byzantine clock synchronization algorithm

    NASA Technical Reports Server (NTRS)

    Shankar, Natarajan

    1991-01-01

    Schneider generalizes a number of protocols for Byzantine fault tolerant clock synchronization and presents a uniform proof for their correctness. The authors present a machine checked proof of this schematic protocol that revises some of the details in Schneider's original analysis. The verification was carried out with the EHDM system developed at the SRI Computer Science Laboratory. The mechanically checked proofs include the verification that the egocentric mean function used in Lamport and Melliar-Smith's Interactive Convergence Algorithm satisfies the requirements of Schneider's protocol.
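
    For readers unfamiliar with the egocentric mean, the sketch below gives a plain-Python rendering of it as usually described for the Interactive Convergence Algorithm: readings that differ from one's own clock by more than a threshold are replaced by one's own value before averaging. This is only an illustration, not the EHDM-verified specification.

```python
# Egocentric mean as usually described for Interactive Convergence:
# discard (replace with one's own value) any reading farther than delta
# from one's own clock, then average. Illustrative sketch only.
def egocentric_mean(own_reading, readings, delta):
    adjusted = [r if abs(r - own_reading) <= delta else own_reading for r in readings]
    return sum(adjusted) / len(adjusted)

# Three well-behaved clocks and one faulty outlier
print(egocentric_mean(own_reading=10.0,
                      readings=[10.0, 10.2, 9.9, 42.0],
                      delta=1.0))   # the faulty 42.0 is replaced by 10.0
```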

  15. Modeling and verification of a lithium iron phosphate battery pack system for automotive applications

    NASA Astrophysics Data System (ADS)

    Guo, Lin

    In recent years, lithium-chemistry-based batteries have gained popularity with all automotive manufacturers. Thousands of battery cells are put into a battery pack to satisfy the power consumption needs of vehicles using electric traction. Managing the battery pack for hybrid and electric vehicles is a challenging problem. Despite their advantages in power density and charge retention, lithium-ion batteries do not handle over-charge and over-discharge very well compared to other battery chemistries. Therefore, creating an accurate model to predict the battery pack behavior is essential in research and development for battery management systems. This work presents a general technique to extend accepted modeling methodologies for single cells to models for large packs. The theoretical framework is accompanied by a parameter identification process based on the circuit model and by experimental verification procedures supporting the validity of this approach.
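
    The sketch below illustrates the general idea of extending a single-cell model to a pack: a one-RC-branch equivalent-circuit cell is scaled by the series/parallel arrangement under the usual assumption of identical, balanced cells. All parameter values and the linear OCV curve are invented for the example and are not the dissertation's identified parameters.

```python
# Single-cell equivalent-circuit model (OCV + series resistance + one RC
# branch) scaled to a series/parallel pack of identical cells; values are
# illustrative only.
import numpy as np

class CellModel:
    def __init__(self, r0=0.002, r1=0.001, c1=2000.0, capacity_ah=20.0):
        self.r0, self.r1, self.c1 = r0, r1, c1
        self.capacity_ah = capacity_ah
        self.soc, self.v_rc = 1.0, 0.0

    def ocv(self):
        return 3.2 + 0.2 * self.soc                  # crude linear OCV for a LiFePO4-like cell

    def step(self, current_a, dt_s):
        """Discharge current positive; returns terminal voltage."""
        self.soc -= current_a * dt_s / (self.capacity_ah * 3600.0)
        self.v_rc += dt_s * (current_a / self.c1 - self.v_rc / (self.r1 * self.c1))
        return self.ocv() - current_a * self.r0 - self.v_rc

def pack_voltage(cell, pack_current_a, dt_s, n_series=100, n_parallel=2):
    """Identical-cell pack: cell current is pack current / number of parallel strings."""
    v_cell = cell.step(pack_current_a / n_parallel, dt_s)
    return n_series * v_cell

cell = CellModel()
print(round(pack_voltage(cell, pack_current_a=40.0, dt_s=1.0), 1))
```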

  16. ON THE VERIFICATION AND VALIDATION OF GEOSPATIAL IMAGE ANALYSIS ALGORITHMS

    SciTech Connect

    Roberts, Randy S.; Trucano, Timothy G.; Pope, Paul A.; Aragon, Cecilia R.; Jiang, Ming; Wei, Thomas; Chilton, Lawrence; Bakel, A. J.

    2010-07-25

    Verification and validation (V&V) of geospatial image analysis algorithms is a difficult task and is becoming increasingly important. While there are many types of image analysis algorithms, we focus on developing V&V methodologies for algorithms designed to provide textual descriptions of geospatial imagery. In this paper, we present a novel methodological basis for V&V that employs a domain-specific ontology, which provides a naming convention for a domain-bounded set of objects and a set of named relationships between these objects. We describe a validation process that proceeds through objectively comparing benchmark imagery, produced using the ontology, with algorithm results. As an example, we describe how the proposed V&V methodology would be applied to algorithms designed to provide textual descriptions of facilities.

  17. Research on registration algorithm for check seal verification

    NASA Astrophysics Data System (ADS)

    Wang, Shuang; Liu, Tiegen

    2008-03-01

    Nowadays seals play an important role in China. With the development of the social economy, the traditional method of manual check seal identification can no longer meet the needs of banking transactions. This paper focuses on pre-processing and registration algorithms for check seal verification, using the theory of image processing and pattern recognition. First of all, the complex characteristics of check seals are analyzed. To eliminate the differences arising from producing conditions and the disturbance caused by background and writing in the check image, many methods are used in the pre-processing stage of check seal verification, such as color component transformation, linear transformation to a gray-scale image, median filtering, Otsu thresholding, and the closing and labeling operations of mathematical morphology. After these processes, a clean binary seal image can be obtained. On the basis of the traditional registration algorithm, a two-level registration method comprising rough and precise registration is proposed. The precise registration stage resolves the deflection angle to within 0.1°. This paper introduces the concepts of difference inside and difference outside and uses the percentages of difference inside and difference outside to judge whether a seal is real or fake. Experimental results on a large set of check seals are satisfactory. They show that the methods and algorithms presented are robust to noisy sealing conditions and tolerate within-class differences satisfactorily.
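
    The OpenCV sketch below walks through a pre-processing chain of the kind described above (grayscale conversion, median filtering, Otsu thresholding, morphological closing, connected-component labelling); the synthetic ring image stands in for a scanned check seal, and the parameters are illustrative.

```python
# Illustrative seal pre-processing chain with OpenCV; a synthetic red ring
# on a light background stands in for a scanned check seal.
import cv2
import numpy as np

img = np.full((200, 200, 3), 230, np.uint8)
cv2.circle(img, (100, 100), 60, (0, 0, 180), 8)                  # fake "seal" ring

gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
gray = cv2.medianBlur(gray, 5)                                   # median filtering
_, binary = cv2.threshold(gray, 0, 255,
                          cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)   # Otsu thresholding
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)       # morphological closing
n_labels, labels = cv2.connectedComponents(closed)               # labelling
print("foreground components found:", n_labels - 1)
```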

  18. Imaging for dismantlement verification: information management and analysis algorithms

    SciTech Connect

    Robinson, Sean M.; Jarman, Kenneth D.; Pitts, W. Karl; Seifert, Allen; Misner, Alex C.; Woodring, Mitchell L.; Myjak, Mitchell J.

    2012-01-11

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.

  19. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    NASA Astrophysics Data System (ADS)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

    NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a six-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases, and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the onboard DRMs will be used to determine the vertical width of the telemetry band about the signal. University of Texas-Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks have been implemented, including
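
    The histogram-based signal-finding step can be illustrated with a few lines of NumPy: photon event times are binned and bins whose counts rise well above the background level mark the surface return. The event counts, bin width, and the 5-sigma significance threshold below are assumptions for the example, not the flight Receiver Algorithm settings.

```python
# Toy histogram-based signal finding: correlated surface echoes stand out
# above randomly distributed background photons. Numbers are synthetic.
import numpy as np

rng = np.random.default_rng(1)
background = rng.uniform(0.0, 100.0, 5000)           # random noise photons (microseconds)
surface = rng.normal(42.0, 0.05, 300)                # correlated surface echoes
events = np.concatenate([background, surface])

counts, edges = np.histogram(events, bins=1000, range=(0.0, 100.0))
threshold = counts.mean() + 5.0 * counts.std()       # assumed 5-sigma significance threshold
signal_bins = np.flatnonzero(counts > threshold)
print("signal located near",
      [round(0.5 * (edges[i] + edges[i + 1]), 2) for i in signal_bins], "us")
```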

  20. A SIFT feature based registration algorithm in automatic seal verification

    NASA Astrophysics Data System (ADS)

    He, Jin; Ding, Xuewen; Zhang, Hao; Liu, Tiegen

    2012-11-01

    A SIFT (Scale Invariant Feature Transform) feature based registration algorithm is presented to prepare for seal verification, especially for the verification of high-quality counterfeit sample seals. The similarities and the spatial relationships between the matched SIFT features are combined for the registration. SIFT features extracted from the binary model seal and sample seal images are matched according to their similarities. The matching rate is used to identify a sample seal as similar to its model seal. For such a similar sample seal, the false matches are eliminated according to the position relationship. Then the homography between the model seal and the sample seal is constructed and named HS; the theoretical homography is named H. The accuracy of registration is evaluated by the Frobenius norm of H-HS. In experiments, translation, filling, and rotation transformations are applied to seals with different shapes, stroke numbers, and structures. After registering the transformed seals and their model seals, the maximum value of the Frobenius norm of their H-HS is not more than 0.03. The results show that this algorithm can accomplish accurate registration, which is invariant to translation, filling, and rotation transformations, with no limit on seal shapes, stroke numbers, and structures.
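
    The sketch below reproduces the registration-accuracy check in miniature with OpenCV: SIFT features are matched between a model image and a transformed copy, a homography HS is estimated, and its Frobenius distance to the known transform H is reported. Blurred random texture stands in for real seal images, and the matching and RANSAC settings are generic defaults rather than the authors' choices.

```python
# SIFT matching + homography estimation, with registration accuracy scored
# by the Frobenius norm ||H - HS||. Synthetic texture replaces real seals.
import cv2
import numpy as np

rng = np.random.default_rng(2)
model = rng.uniform(0, 255, (300, 300)).astype(np.uint8)
model = cv2.GaussianBlur(model, (5, 5), 0)            # smooth noise so SIFT finds blobs

angle = 10.0                                          # known rotation (degrees)
H = np.vstack([cv2.getRotationMatrix2D((150, 150), angle, 1.0), [0, 0, 1]])
sample = cv2.warpPerspective(model, H, (300, 300))

sift = cv2.SIFT_create()
k1, d1 = sift.detectAndCompute(model, None)
k2, d2 = sift.detectAndCompute(sample, None)
matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(d1, d2)
src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

Hs, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)  # estimated homography
print("||H - HS||_F =", np.linalg.norm(H - Hs))
```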

  1. Imaging for dismantlement verification: information management and analysis algorithms

    SciTech Connect

    Seifert, Allen; Miller, Erin A.; Myjak, Mitchell J.; Robinson, Sean M.; Jarman, Kenneth D.; Misner, Alex C.; Pitts, W. Karl; Woodring, Mitchell L.

    2010-09-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute. However, this process must be performed with care. Computing the perimeter, area, and intensity of an object, for example, might reveal sensitive information relating to shape, size, and material composition. This paper presents three analysis algorithms that reduce full image information to non-sensitive feature information. Ultimately, the algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We evaluate the algorithms on both their technical performance in image analysis, and their application with and without an explicitly constructed information barrier. The underlying images can be highly detailed, since they are dynamically generated behind the information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography.

  2. On the Formal Verification of Conflict Detection Algorithms

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Butler, Ricky W.; Carreno, Victor A.; Dowek, Gilles

    2001-01-01

    Safety assessment of new air traffic management systems is a main issue for civil aviation authorities. Standard techniques such as testing and simulation have serious limitations in new systems that are significantly more autonomous than the older ones. In this paper, we present an innovative approach, based on formal verification, for establishing the correctness of conflict detection systems. Fundamental to our approach is the concept of trajectory, which is a continuous path in the x-y plane constrained by physical laws and operational requirements. From the model of trajectories, we extract, and formally prove, high-level properties that can serve as a framework to analyze conflict scenarios. We use the Airborne Information for Lateral Spacing (AILS) alerting algorithm as a case study of our approach.

  3. Formal Verification of a Conflict Resolution and Recovery Algorithm

    NASA Technical Reports Server (NTRS)

    Maddalon, Jeffrey; Butler, Ricky; Geser, Alfons; Munoz, Cesar

    2004-01-01

    New air traffic management concepts distribute the duty of traffic separation among system participants. As a consequence, these concepts have a greater dependency and rely heavily on on-board software and hardware systems. One example of a new on-board capability in a distributed air traffic management system is air traffic conflict detection and resolution (CD&R). Traditional methods for safety assessment such as human-in-the-loop simulations, testing, and flight experiments may not be sufficient for this highly distributed system as the set of possible scenarios is too large to have a reasonable coverage. This paper proposes a new method for the safety assessment of avionics systems that makes use of formal methods to drive the development of critical systems. As a case study of this approach, the mechanical verification of an algorithm for air traffic conflict resolution and recovery called RR3D is presented. The RR3D algorithm uses a geometric optimization technique to provide a choice of resolution and recovery maneuvers. If the aircraft adheres to these maneuvers, they will bring the aircraft out of conflict and the aircraft will follow a conflict-free path to its original destination. Verification of RR3D is carried out using the Prototype Verification System (PVS).

  4. Using Small-Step Refinement for Algorithm Verification in Computer Science Education

    ERIC Educational Resources Information Center

    Simic, Danijela

    2015-01-01

    Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyse similar…

  5. Specification of Selected Performance Monitoring and Commissioning Verification Algorithms for CHP Systems

    SciTech Connect

    Brambley, Michael R.; Katipamula, Srinivas

    2006-10-06

    Pacific Northwest National Laboratory (PNNL) is assisting the U.S. Department of Energy (DOE) Distributed Energy (DE) Program by developing advanced control algorithms that would lead to development of tools to enhance performance and reliability, and reduce emissions of distributed energy technologies, including combined heat and power technologies. This report documents phase 2 of the program, providing a detailed functional specification for algorithms for performance monitoring and commissioning verification, scheduled for development in FY 2006. The report identifies the systems for which algorithms will be developed, the specific functions of each algorithm, metrics which the algorithms will output, and inputs required by each algorithm.

  6. An Algorithmic Form Of Verification Of Appointed Phases Of The Project Documentation For A Building Investment

    NASA Astrophysics Data System (ADS)

    Kochanek, Anna

    2015-12-01

    The process of area development and planning in compliance with conditions outlined in the Zoning Scheme is significant because of the current rapid development of rural and urban areas. The verification of project documentation in terms of observing constant and nationally binding norms, legislation and local laws is based on certain standards. In order to streamline the process of verification undertaken by the relevant public authorities, it is necessary to create formal algorithms that will automate the existing method of control of architecture-building documentation. The objective of this article is algorithmisation of the project documentation verification allowing further streamlining and automation of the process.

  7. ECG Sensor Card with Evolving RBP Algorithms for Human Verification.

    PubMed

    Tseng, Kuo-Kun; Huang, Huang-Nan; Zeng, Fufu; Tu, Shu-Yi

    2015-01-01

    It is known that cardiac and respiratory rhythms in electrocardiograms (ECGs) are highly nonlinear and non-stationary. As a result, most traditional time-domain algorithms are inadequate for characterizing the complex dynamics of the ECG. This paper proposes a new ECG sensor card and a statistical-based ECG algorithm, with the aid of a reduced binary pattern (RBP), with the aim of achieving faster ECG human identity recognition with high accuracy. The proposed algorithm has one advantage that previous ECG algorithms lack-the waveform complex information and de-noising preprocessing can be bypassed; therefore, it is more suitable for non-stationary ECG signals. Experimental results tested on two public ECG databases (MIT-BIH) from MIT University confirm that the proposed scheme is feasible with excellent accuracy, low complexity, and speedy processing. To be more specific, the advanced RBP algorithm achieves high accuracy in human identity recognition and is executed at least nine times faster than previous algorithms. Moreover, based on the test results from a long-term ECG database, the evolving RBP algorithm also demonstrates superior capability in handling long-term and non-stationary ECG signals. PMID:26307995
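
    The exact reduced binary pattern is defined in the paper; the sketch below implements a generic RBP-style feature (rise/fall bits packed into codewords and histogrammed) simply to show why such statistics sidestep fiducial detection and amplitude offsets. The signals and the histogram-intersection matcher are illustrative assumptions.

```python
# Generic reduced-binary-pattern style feature: successive samples are
# compared to produce a bit stream, fixed-length codewords are histogrammed,
# and identities are matched by histogram intersection. Illustrative only.
import numpy as np

def rbp_histogram(signal, word_len=8):
    bits = (np.diff(signal) > 0).astype(int)                  # 1 if the signal rose
    n_words = len(bits) // word_len
    words = bits[:n_words * word_len].reshape(n_words, word_len)
    codes = words @ (1 << np.arange(word_len))                # pack bits into integers
    hist = np.bincount(codes, minlength=2 ** word_len).astype(float)
    return hist / hist.sum()

def similarity(sig_a, sig_b):
    """Histogram intersection between two feature vectors (1.0 = identical)."""
    return np.minimum(rbp_histogram(sig_a), rbp_histogram(sig_b)).sum()

t = np.linspace(0, 10, 3000)
person_a = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 7 * t)
person_b = np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.sin(2 * np.pi * 13 * t)
# A DC-shifted copy of A matches perfectly, since only rises/falls matter
print("A vs shifted A:", round(similarity(person_a, person_a + 0.01), 3),
      " A vs B:", round(similarity(person_a, person_b), 3))
```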

  8. ECG Sensor Card with Evolving RBP Algorithms for Human Verification

    PubMed Central

    Tseng, Kuo-Kun; Huang, Huang-Nan; Zeng, Fufu; Tu, Shu-Yi

    2015-01-01

    It is known that cardiac and respiratory rhythms in electrocardiograms (ECGs) are highly nonlinear and non-stationary. As a result, most traditional time-domain algorithms are inadequate for characterizing the complex dynamics of the ECG. This paper proposes a new ECG sensor card and a statistical-based ECG algorithm, with the aid of a reduced binary pattern (RBP), with the aim of achieving faster ECG human identity recognition with high accuracy. The proposed algorithm has one advantage that previous ECG algorithms lack—the waveform complex information and de-noising preprocessing can be bypassed; therefore, it is more suitable for non-stationary ECG signals. Experimental results tested on two public ECG databases (MIT-BIH) from MIT University confirm that the proposed scheme is feasible with excellent accuracy, low complexity, and speedy processing. To be more specific, the advanced RBP algorithm achieves high accuracy in human identity recognition and is executed at least nine times faster than previous algorithms. Moreover, based on the test results from a long-term ECG database, the evolving RBP algorithm also demonstrates superior capability in handling long-term and non-stationary ECG signals. PMID:26307995

  9. Hill-Climbing Attacks and Robust Online Signature Verification Algorithm against Hill-Climbing Attacks

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo

    Attacks using hill-climbing methods have been reported as a vulnerability of biometric authentication systems. In this paper, we propose a robust online signature verification algorithm against such attacks. Specifically, the attack considered in this paper is a hill-climbing forged data attack. Artificial forgeries are generated offline by using the hill-climbing method, and the forgeries are input to a target system to be attacked. In this paper, we analyze the menace of hill-climbing forged data attacks using six types of hill-climbing forged data and propose a robust algorithm by incorporating the hill-climbing method into an online signature verification algorithm. Experiments to evaluate the proposed system were performed using a public online signature database. The proposed algorithm showed improved performance against this kind of attack.

  10. Verification Studies for Multi-Fluid Plasma Algorithms with Applications to Fast MHD Physics

    NASA Astrophysics Data System (ADS)

    Becker, Joe; Hakim, Ammar; Loverich, John; Stoltz, Peter

    2011-10-01

    In this paper we present a series of verification studies for finite volume algorithms in Nautilus, a numerical solver for fluid plasmas. Results include a set of typical Euler, Maxwell, MHD and Two-fluid benchmarks. In addition results and algorithms for a set of hyperbolic gauge cleaning schemes that can be applied to the MHD and Two-fluid systems using finite volume type methods will be presented. Finally we move onto applications in field reversed configuration (FRC) plasmas.

  11. Building a medical image processing algorithm verification database

    NASA Astrophysics Data System (ADS)

    Brown, C. Wayne

    2000-06-01

    The design of a database containing head Computed Tomography (CT) studies is presented, along with a justification for the database's composition. The database will be used to validate software algorithms that screen normal head CT studies from studies that contain pathology. The database is designed to have the following major properties: (1) a size sufficient for statistical viability, (2) inclusion of both normal (no pathology) and abnormal scans, (3) inclusion of scans due to equipment malfunction, technologist error, and uncooperative patients, (4) inclusion of data sets from multiple scanner manufacturers, (5) inclusion of data sets from different gender and age groups, and (6) three independent diagnoses of each data set. Designed correctly, the database will provide a partial basis for FDA (United States Food and Drug Administration) approval of image processing algorithms for clinical use. Our goal for the database is the proof of viability of screening head CTs for normal anatomy using computer algorithms. To put this work into context, a classification scheme for 'computer aided diagnosis' systems is proposed.

  12. Performance analysis results of a battery fuel gauge algorithm at multiple temperatures

    NASA Astrophysics Data System (ADS)

    Balasingam, B.; Avvari, G. V.; Pattipati, K. R.; Bar-Shalom, Y.

    2015-01-01

    Evaluating a battery fuel gauge (BFG) algorithm is a challenging problem due to the fact that there are no reliable mathematical models to represent the complex features of a Li-ion battery, such as hysteresis and relaxation effects, temperature effects on parameters, aging, power fade (PF), and capacity fade (CF) with respect to the chemical composition of the battery. The existing literature is largely focused on developing different BFG strategies, and BFG validation has received little attention. In this paper, using hardware-in-the-loop (HIL) data collected from three Li-ion batteries at nine different temperatures ranging from -20 °C to 40 °C, we demonstrate detailed validation results of a BFG algorithm. The BFG validation is based on three different BFG validation metrics; we provide implementation details of these three metrics by proposing three different BFG validation load profiles that satisfy varying levels of user requirements.

  13. Plenoptic camera image simulation for reconstruction algorithm verification

    NASA Astrophysics Data System (ADS)

    Schwiegerling, Jim

    2014-09-01

    Plenoptic cameras have emerged in recent years as a technology for capturing light field data in a single snapshot. A conventional digital camera can be modified with the addition of a lenslet array to create a plenoptic camera. Two distinct camera forms have been proposed in the literature. The first has the camera image focused onto the lenslet array. The lenslet array is placed over the camera sensor such that each lenslet forms an image of the exit pupil onto the sensor. The second plenoptic form has the lenslet array relaying the image formed by the camera lens to the sensor. We have developed a raytracing package that can simulate images formed by a generalized version of the plenoptic camera. Several rays from each sensor pixel are traced backwards through the system to define a cone of rays emanating from the entrance pupil of the camera lens. Objects that lie within this cone are integrated to lead to a color and exposure level for that pixel. To speed processing, three-dimensional objects are approximated as a series of planes at different depths. Repeating this process for each pixel in the sensor leads to a simulated plenoptic image on which different reconstruction algorithms can be tested.

  14. Algorithm for ventricular capture verification based on the mechanical evoked response.

    PubMed

    Yaacoby, E; Akselrod, S; Eldar, M; Glikson, M

    2005-07-01

    Automatic pacemaker capture verification is important for maintaining safety and low energy consumption in pacemaker patients. A new algorithm was developed, based on impedance measurement between pacing electrode poles, which reflects the distribution of the conducting medium between the poles and changes with effective contraction. Data acquired during pacemaker implant in 17 subjects were analysed, with intracardiac impedance recorded while pacing was performed in the ventricle at varying energies, resulting in multiple-captured and non-captured beats. The impedance signals of all captured/non-captured beats were analysed using three different algorithms, based on the morphology of the impedance signal. The algorithm decision for each beat was compared with an actual capture or non-capture, as determined from the simultaneous recording of surface ECG. Two of the three algorithms (Z1 and Zn) were based on impedance values, and one (Z'n) was based on the first derivative of the impedance. Z1 was based on a single sample, whereas Zn and Z'n were based on several samples in each beat. The total accuracy for each was Z1: 43%, Zn: 87%, Z'n: 92%. It was concluded that impedance-based capture verification is feasible, that a multiple rather than single sample approach for signal classification is both feasible and superior, and that first derivative analysis with multiple samples (Z'n) provides the best results. PMID:16255434
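
    The three decision rules can be pictured schematically as below, where Z1 uses one impedance sample, Zn averages several samples, and Z'n thresholds the first derivative over several samples; the waveforms and thresholds are invented for the sketch, and the clinical details remain those of the paper.

```python
# Schematic renderings of the Z1, Zn, and Z'n decision rules; thresholds,
# latencies, and waveforms are invented for illustration.
import numpy as np

def z1(beat, idx=20, threshold=1.5):
    """Single-sample rule: captured if impedance at a fixed latency exceeds a threshold."""
    return beat[idx] > threshold

def zn(beat, window=slice(15, 30), threshold=1.2):
    """Multi-sample rule: captured if mean impedance in a window exceeds a threshold."""
    return beat[window].mean() > threshold

def zn_prime(beat, window=slice(10, 30), threshold=0.08):
    """Derivative rule: captured if peak dZ/dt in a window exceeds a threshold."""
    return np.diff(beat)[window].max() > threshold

t = np.arange(50)
captured_beat = 2.0 * np.exp(-((t - 22) / 6.0) ** 2)      # impedance rise from contraction
non_captured_beat = 0.2 * np.ones(50)                      # essentially flat impedance
for name, rule in (("Z1", z1), ("Zn", zn), ("Z'n", zn_prime)):
    print(name, rule(captured_beat), rule(non_captured_beat))
```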

  15. Experimental verification of an interpolation algorithm for improved estimates of animal position

    NASA Astrophysics Data System (ADS)

    Schell, Chad; Jaffe, Jules S.

    2004-07-01

    This article presents experimental verification of an interpolation algorithm that was previously proposed in Jaffe [J. Acoust. Soc. Am. 105, 3168-3175 (1999)]. The goal of the algorithm is to improve estimates of both target position and target strength by minimizing a least-squares residual between noise-corrupted target measurement data and the output of a model of the sonar's amplitude response to a target at a set of known locations. Although this positional estimator was shown to be a maximum likelihood estimator, in principle, experimental verification was desired because of interest in understanding its true performance. Here, the accuracy of the algorithm is investigated by analyzing the correspondence between a target's true position and the algorithm's estimate. True target position was measured by precise translation of a small test target (bead) or from the analysis of images of fish from a coregistered optical imaging system. Results with the stationary spherical test bead in a high signal-to-noise environment indicate that a large increase in resolution is possible, while results with commercial aquarium fish indicate a smaller increase is obtainable. However, in both experiments the algorithm provides improved estimates of target position over those obtained by simply accepting the angular positions of the sonar beam with maximum output as target position. In addition, increased accuracy in target strength estimation is possible by considering the effects of the sonar beam patterns relative to the interpolated position. A benefit of the algorithm is that it can be applied "ex post facto" to existing data sets from commercial multibeam sonar systems when only the beam intensities have been stored after suitable calibration.
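
    The core of the interpolation idea can be sketched as a small least-squares search: choose the candidate position (and closed-form target strength) that best explains the measured per-beam amplitudes under a beam-pattern model. The Gaussian beam model, angles, and noise level below are assumptions for illustration, not the sonar model used in the paper.

```python
# Least-squares position/strength estimation against a modeled per-beam
# response; the Gaussian beam pattern and grid are assumptions.
import numpy as np

beam_angles = np.array([-6.0, -3.0, 0.0, 3.0, 6.0])       # beam axes (degrees)
beamwidth = 3.0                                            # beam-pattern width proxy (degrees)

def beam_response(target_angle, strength):
    """Model amplitude on each beam for a point target at target_angle."""
    return strength * np.exp(-((beam_angles - target_angle) / beamwidth) ** 2)

# Simulated noisy measurement of a target at 1.7 degrees with strength 2.0
rng = np.random.default_rng(3)
measured = beam_response(1.7, 2.0) + rng.normal(0, 0.05, beam_angles.size)

def residual(angle):
    template = beam_response(angle, 1.0)
    strength = measured @ template / (template @ template)   # closed-form LS strength
    return np.sum((measured - strength * template) ** 2)

candidates = np.linspace(-6, 6, 241)                       # grid finer than beam spacing
best = candidates[np.argmin([residual(a) for a in candidates])]
print("estimated angle:", round(float(best), 2), "degrees")
```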

  16. Experimental verification of an interpolation algorithm for improved estimates of animal position.

    PubMed

    Schell, Chad; Jaffe, Jules S

    2004-07-01

    This article presents experimental verification of an interpolation algorithm that was previously proposed in Jaffe [J. Acoust. Soc. Am. 105, 3168-3175 (1999)]. The goal of the algorithm is to improve estimates of both target position and target strength by minimizing a least-squares residual between noise-corrupted target measurement data and the output of a model of the sonar's amplitude response to a target at a set of known locations. Although this positional estimator was shown to be a maximum likelihood estimator, in principle, experimental verification was desired because of interest in understanding its true performance. Here, the accuracy of the algorithm is investigated by analyzing the correspondence between a target's true position and the algorithm's estimate. True target position was measured by precise translation of a small test target (bead) or from the analysis of images of fish from a coregistered optical imaging system. Results with the stationary spherical test bead in a high signal-to-noise environment indicate that a large increase in resolution is possible, while results with commercial aquarium fish indicate a smaller increase is obtainable. However, in both experiments the algorithm provides improved estimates of target position over those obtained by simply accepting the angular positions of the sonar beam with maximum output as target position. In addition, increased accuracy in target strength estimation is possible by considering the effects of the sonar beam patterns relative to the interpolated position. A benefit of the algorithm is that it can be applied "ex post facto" to existing data sets from commercial multibeam sonar systems when only the beam intensities have been stored after suitable calibration. PMID:15295985

  17. Multi-objective optimization of lithium-ion battery model using genetic algorithm approach

    NASA Astrophysics Data System (ADS)

    Zhang, Liqiang; Wang, Lixin; Hinds, Gareth; Lyu, Chao; Zheng, Jun; Li, Junfu

    2014-12-01

    A multi-objective parameter identification method for modeling of Li-ion battery performance is presented. Terminal voltage and surface temperature curves at 15 °C and 30 °C are used as four identification objectives. The Pareto fronts of two types of Li-ion battery are obtained using the modified multi-objective genetic algorithm NSGA-II and the final identification results are selected using the multiple criteria decision making method TOPSIS. The simulated data using the final identification results are in good agreement with experimental data under a range of operating conditions. The validation results demonstrate that the modified NSGA-II and TOPSIS algorithms can be used as robust and reliable tools for identifying parameters of multi-physics models for many types of Li-ion batteries.
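
    As an illustration of the final selection step, the sketch below applies a plain TOPSIS ranking to a made-up Pareto front of four identification objectives (voltage and temperature errors at 15 °C and 30 °C); equal weights are assumed, and the numbers are not from the paper.

```python
# Minimal TOPSIS selection over a synthetic Pareto front of cost objectives.
import numpy as np

def topsis(objectives, weights=None):
    """Rank alternatives by closeness to the ideal point (all objectives minimized)."""
    obj = np.asarray(objectives, dtype=float)
    w = np.ones(obj.shape[1]) / obj.shape[1] if weights is None else np.asarray(weights)
    norm = obj / np.linalg.norm(obj, axis=0)            # vector-normalize each objective
    v = norm * w
    ideal, anti_ideal = v.min(axis=0), v.max(axis=0)    # objectives are costs (errors)
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti_ideal, axis=1)
    return d_worst / (d_best + d_worst)                 # higher closeness = better

# Rows: candidate parameter sets; columns: voltage and temperature errors
# at 15 C and 30 C (four objectives, mirroring the paper's setup)
pareto_front = [[0.010, 0.012, 0.40, 0.55],
                [0.014, 0.009, 0.35, 0.60],
                [0.012, 0.011, 0.50, 0.45]]
closeness = topsis(pareto_front)
print("selected candidate:", int(np.argmax(closeness)), closeness.round(3))
```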

  18. Adaptive Kalman filter based state of charge estimation algorithm for lithium-ion battery

    NASA Astrophysics Data System (ADS)

    Zheng, Hong; Liu, Xu; Wei, Min

    2015-09-01

    In order to improve the accuracy of the battery state of charge (SOC) estimation, in this paper we take a lithium-ion battery as an example to study the adaptive Kalman filter based SOC estimation algorithm. First, the second-order battery system model is introduced, with temperature and charge rate incorporated into the model. Then, the temperature and charge rate are used, together with the parameters of an adaptive Kalman filter based estimation algorithm, to estimate the battery SOC. Afterwards, numerical simulation verifies that, in the ideal case, the accuracy of SOC estimation can be enhanced by adding these two elements, namely the temperature and charge rate. Finally, the actual road conditions are simulated with ADVISOR, and the simulation results show that the proposed method improves the accuracy of battery SOC estimation under actual road conditions. Thus, its application scope in engineering is greatly expanded. Project supported by the National Natural Science Foundation of China (Grant Nos. 61004048 and 61201010).
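
    For readers who want the structure of a Kalman-filter SOC estimator, the sketch below uses a simplified first-order RC model with a linear OCV curve and fixed noise covariances; the paper's algorithm is adaptive, second-order, and temperature- and rate-dependent, so this is only a structural outline with made-up parameters.

```python
# Compact Kalman-filter SOC estimator for a first-order RC cell model with a
# linear OCV curve; all parameter values are invented for the sketch.
import numpy as np

dt, Q_ah = 1.0, 2.0                 # time step (s), cell capacity (Ah)
R0, R1, C1 = 0.05, 0.02, 1000.0     # ohmic and RC-branch parameters (assumed)
k0, k1 = 3.0, 1.0                   # linear OCV: v_oc = k0 + k1 * soc

A = np.array([[1.0, 0.0],
              [0.0, np.exp(-dt / (R1 * C1))]])                  # state: [soc, v_rc]
B = np.array([[-dt / (Q_ah * 3600.0)],
              [R1 * (1.0 - np.exp(-dt / (R1 * C1)))]])
Hm = np.array([[k1, -1.0]])                                     # v = k0 + k1*soc - v_rc - R0*i

x, P = np.array([[0.5], [0.0]]), np.eye(2) * 0.1
Qn, Rn = np.diag([1e-7, 1e-5]), 1e-3

def kf_step(x, P, current_a, v_meas):
    x = A @ x + B * current_a                                   # predict
    P = A @ P @ A.T + Qn
    y = v_meas - (k0 + (Hm @ x).item() - R0 * current_a)        # innovation
    S = (Hm @ P @ Hm.T).item() + Rn
    K = P @ Hm.T / S                                            # Kalman gain
    x = x + K * y                                               # update
    P = (np.eye(2) - K @ Hm) @ P
    return x, P

for _ in range(60):                                             # one minute of 1 A discharge
    v_meas = 3.8 - 0.05 * 1.0                                   # crude "measured" terminal voltage
    x, P = kf_step(x, P, current_a=1.0, v_meas=v_meas)
print("estimated SOC:", round(float(x[0, 0]), 3))
```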

  19. Verification of the ASTER/TIR atmospheric correction algorithm based on water surface emissivity retrieved

    NASA Astrophysics Data System (ADS)

    Tonooka, Hideyuki; Palluconi, Frank D.

    2002-02-01

    The standard atmospheric correction algorithm for the five thermal infrared (TIR) bands of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) is currently based on radiative transfer computations with global assimilation data on a pixel-by-pixel basis. In the present paper, we verify this algorithm using 100 ASTER scenes globally acquired during the early mission period. In this verification, the max-min difference (MMD) of the water surface emissivity retrieved from each scene is used as an atmospheric correction error index, since the water surface emissivity is well known; if the retrieved MMD is large, the atmospheric correction error will also probably be large. The results show that the error in the MMD retrieved by the standard atmospheric correction algorithm and a typical temperature/emissivity separation algorithm is strongly correlated with precipitable water vapor, latitude, elevation, and surface temperature. The expected error in the retrieved MMD is about 0.05 for a precipitable water vapor of 3 cm.

  20. Remaining Sites Verification Package for the 120-B-1, 105-B Battery Acid Sump, Waste Site Reclassification Form 2006-057

    SciTech Connect

    L. M. Dittmer

    2006-09-25

    The 120-B-1 waste site, located in the 100-BC-1 Operable Unit of the Hanford Site, consisted of a concrete battery acid sump that was used from 1944 to 1969 to neutralize the spent sulfuric acid from lead cell batteries of emergency power packs and the emergency lighting system. The battery acid sump was associated with the 105-B Reactor Building and was located adjacent to the building's northwest corner. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.

  1. Environmental technology verification report: Rechargeable alkaline household battery system, Rayovac Corporation Renewal[trademark]

    SciTech Connect

    Escarda, T.; Lewis, N.

    1999-03-01

    The EPA's ETV Program, in partnership with recognized testing organizations, objectively and systematically documents the performance of commercial-ready technologies. Together, with the full participation of the technology developer, they develop plans, conduct tests, collect and analyze data, and report findings. Rayovac redesigned their alkaline household batteries so that they could be recharged. The additional charge cycles extend battery life by increasing the energy capacity, which benefits the environment by generating less waste. The design changes include increased void space, and addition of lead and silver. The Rayovac Renewal[trademark] Rechargeable Alkaline Battery System consists of rechargeable alkaline zinc-manganese dioxide 1.5-volt batteries, in sizes AAA, AA, C, and D, and a recharging device for the batteries. Typical consumer applications of household batteries include toys and games, portable audio equipment, cameras, sporting goods equipment, test equipment, personal care products, hearing aids, portable data terminals, sub-notebook computers and personal digital assistants, watches, flashlights, lanterns, and cellular phones. Such applications typically require continuous currents of up to 400 milliamperes (mA), which is within the range of the Renewal[trademark] batteries, sized AA, C, and D. Size AAA can supply up to 150 mA continuous current, which is sufficient for applications such as clocks.

  2. Environmental technology verification report: Rechargeable alkaline household battery system, Rayovac Corporation Renewal{trademark}

    SciTech Connect

    Escarda, T.; Lewis, N.

    1999-03-01

    The EPA's ETV Program, in partnership with recognized testing organizations, objectively and systematically documents the performance of commercial-ready technologies. Together, with the full participation of the technology developer, they develop plans, conduct tests, collect and analyze data, and report findings. Rayovac redesigned their alkaline household batteries so that they could be recharged. The additional charge cycles extend battery life by increasing the energy capacity, which benefits the environment by generating less waste. The design changes include increased void space, and addition of lead and silver. The Rayovac Renewal{trademark} Rechargeable Alkaline Battery System consists of rechargeable alkaline zinc-manganese dioxide 1.5-volt batteries, in sizes AAA, AA, C, and D, and a recharging device for the batteries. Typical consumer applications of household batteries include toys and games, portable audio equipment, cameras, sporting goods equipment, test equipment, personal care products, hearing aids, portable data terminals, sub-notebook computers and personal digital assistants, watches, flashlights, lanterns, and cellular phones. Such applications typically require continuous currents of up to 400 milliamperes (mA), which is within the range of the Renewal{trademark} batteries, sized AA, C, and D. Size AAA can supply up to 150 mA continuous current, which is sufficient for applications such as clocks.

  3. Comparison and quantitative verification of mapping algorithms for whole-genome bisulfite sequencing.

    PubMed

    Kunde-Ramamoorthy, Govindarajan; Coarfa, Cristian; Laritsky, Eleonora; Kessler, Noah J; Harris, R Alan; Xu, Mingchu; Chen, Rui; Shen, Lanlan; Milosavljevic, Aleksandar; Waterland, Robert A

    2014-04-01

    Coupling bisulfite conversion with next-generation sequencing (Bisulfite-seq) enables genome-wide measurement of DNA methylation, but poses unique challenges for mapping. However, despite a proliferation of Bisulfite-seq mapping tools, no systematic comparison of their genomic coverage and quantitative accuracy has been reported. We sequenced bisulfite-converted DNA from two tissues from each of two healthy human adults and systematically compared five widely used Bisulfite-seq mapping algorithms: Bismark, BSMAP, Pash, BatMeth and BS Seeker. We evaluated their computational speed and genomic coverage and verified their percentage methylation estimates. With the exception of BatMeth, all mappers covered >70% of CpG sites genome-wide and yielded highly concordant estimates of percentage methylation (r(2) ≥ 0.95). Fourfold variation in mapping time was found between BSMAP (fastest) and Pash (slowest). In each library, 8-12% of genomic regions covered by Bismark and Pash were not covered by BSMAP. An experiment using simulated reads confirmed that Pash has an exceptional ability to uniquely map reads in genomic regions of structural variation. Independent verification by bisulfite pyrosequencing generally confirmed the percentage methylation estimates by the mappers. Of these algorithms, Bismark provides an attractive combination of processing speed, genomic coverage and quantitative accuracy, whereas Pash offers considerably higher genomic coverage. PMID:24391148

  4. Comparison and quantitative verification of mapping algorithms for whole-genome bisulfite sequencing

    PubMed Central

    Kunde-Ramamoorthy, Govindarajan; Coarfa, Cristian; Laritsky, Eleonora; Kessler, Noah J.; Harris, R. Alan; Xu, Mingchu; Chen, Rui; Shen, Lanlan; Milosavljevic, Aleksandar; Waterland, Robert A.

    2014-01-01

    Coupling bisulfite conversion with next-generation sequencing (Bisulfite-seq) enables genome-wide measurement of DNA methylation, but poses unique challenges for mapping. However, despite a proliferation of Bisulfite-seq mapping tools, no systematic comparison of their genomic coverage and quantitative accuracy has been reported. We sequenced bisulfite-converted DNA from two tissues from each of two healthy human adults and systematically compared five widely used Bisulfite-seq mapping algorithms: Bismark, BSMAP, Pash, BatMeth and BS Seeker. We evaluated their computational speed and genomic coverage and verified their percentage methylation estimates. With the exception of BatMeth, all mappers covered >70% of CpG sites genome-wide and yielded highly concordant estimates of percentage methylation (r² ≥ 0.95). Fourfold variation in mapping time was found between BSMAP (fastest) and Pash (slowest). In each library, 8–12% of genomic regions covered by Bismark and Pash were not covered by BSMAP. An experiment using simulated reads confirmed that Pash has an exceptional ability to uniquely map reads in genomic regions of structural variation. Independent verification by bisulfite pyrosequencing generally confirmed the percentage methylation estimates by the mappers. Of these algorithms, Bismark provides an attractive combination of processing speed, genomic coverage and quantitative accuracy, whereas Pash offers considerably higher genomic coverage. PMID:24391148

  5. Battery

    NASA Astrophysics Data System (ADS)

    1980-11-01

    Contents: Outlook for lead, zinc and cadmium in India; Future for lead production and recycling - a British view; AKERLOW lead recovery plant; Expanded lead battery grids; Resume of first solder seminar in India; Automatic paste soldering adds sparks to zinc-carbon batteries; 122-ton lead battery used for testing BEST facility; Press release on Pb 80; Research and development; Second International Symposium on Industrial and Oriented Basic Electrochemistry; Industry news; Book review and new publications; Battery abstracts.

  6. A Probabilistic Mass Estimation Algorithm for a Novel 7- Channel Capacitive Sample Verification Sensor

    NASA Technical Reports Server (NTRS)

    Wolf, Michael

    2012-01-01

    A document describes an algorithm created to estimate the mass placed on a sample verification sensor (SVS) designed for lunar or planetary robotic sample return missions. A novel SVS measures the capacitance between a rigid bottom plate and an elastic top membrane in seven locations. As additional sample material (soil and/or small rocks) is placed on the top membrane, the deformation of the membrane increases the capacitance. The mass estimation algorithm addresses both the calibration of each SVS channel, and also addresses how to combine the capacitances read from each of the seven channels into a single mass estimate. The probabilistic approach combines the channels according to the variance observed during the training phase, and provides not only the mass estimate, but also a value for the certainty of the estimate. SVS capacitance data is collected for known masses under a wide variety of possible loading scenarios, though in all cases, the distribution of sample within the canister is expected to be approximately uniform. A capacitance-vs-mass curve is fitted to this data, and is subsequently used to determine the mass estimate for a single channel's capacitance reading during the measurement phase. This results in seven different mass estimates, one for each SVS channel. Moreover, the variance of the calibration data is used to place a Gaussian probability distribution function (pdf) around this mass estimate. To blend these seven estimates, the seven pdfs are combined into a single Gaussian distribution function, providing the final mean and variance of the estimate. This blending technique essentially takes the final estimate as an average of the estimates of the seven channels, weighted by the inverse of the channel's variance.
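
    The blending step described above amounts to inverse-variance weighting of seven Gaussian estimates. A minimal sketch of that fusion is given below; the function name and the numerical values are hypothetical illustrations, not taken from the SVS design.

    ```python
    import numpy as np

    def fuse_channel_estimates(means, variances):
        """Combine per-channel mass estimates (each a Gaussian with its own
        mean and variance) into one inverse-variance-weighted estimate."""
        means = np.asarray(means, dtype=float)
        variances = np.asarray(variances, dtype=float)
        weights = 1.0 / variances                      # weight = inverse variance
        fused_mean = np.sum(weights * means) / np.sum(weights)
        fused_variance = 1.0 / np.sum(weights)         # variance of the fused estimate
        return fused_mean, fused_variance

    # Example: seven hypothetical channel estimates (grams) and their calibration variances
    mass_est = [101.2, 99.8, 100.5, 102.0, 98.9, 100.1, 101.0]
    var_est = [4.0, 2.5, 3.0, 6.0, 2.0, 3.5, 5.0]
    m, v = fuse_channel_estimates(mass_est, var_est)
    print(f"fused mass = {m:.2f} g, 1-sigma = {v**0.5:.2f} g")
    ```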

  7. Real-Time Simulation for Verification and Validation of Diagnostic and Prognostic Algorithms

    NASA Technical Reports Server (NTRS)

    Aguilar, Robert; Luu, Chuong; Santi, Louis M.; Sowers, T. Shane

    2005-01-01

    To verify that a health management system (HMS) performs as expected, a virtual system simulation capability, including interaction with the associated platform or vehicle, very likely will need to be developed. The rationale for developing this capability is discussed and includes the limited capability to seed faults into the actual target system due to the risk of potential damage to high value hardware. The capability envisioned would accurately reproduce the propagation of a fault or failure as observed by sensors located at strategic locations on and around the target system and would also accurately reproduce the control system and vehicle response. In this way, HMS operation can be exercised over a broad range of conditions to verify that it meets requirements for accurate, timely response to actual faults with adequate margin against false and missed detections. An overview is also presented of a real-time rocket propulsion health management system laboratory which is available for future rocket engine programs. The health management elements and approaches of this lab are directly applicable for future space systems. In this paper the various components are discussed and the general fault detection, diagnosis, isolation, and response (FDIR) concept is presented. Additionally, the complexities of V&V (Verification and Validation) for advanced algorithms and the simulation capabilities required to meet the changing state-of-the-art in HMS are discussed.

  8. Extraction of battery parameters using a multi-objective genetic algorithm with a non-linear circuit model

    NASA Astrophysics Data System (ADS)

    Malik, Aimun; Zhang, Zheming; Agarwal, Ramesh K.

    2014-08-01

    There is a need for a battery model that can accurately describe the battery performance for an electrical system, such as the electric drive train of electric vehicles. In this paper, both linear and non-linear equivalent circuit models (ECM) are employed as a means of extracting the battery parameters that can be used to model the performance of a battery. The linear and non-linear equivalent circuit models differ in the numbers of capacitances and resistances; the non-linear model has an added circuit; however, their numerical characteristics are equivalent. A multi-objective genetic algorithm is employed to accurately extract the values of the battery model parameters. The battery model parameters are obtained for several existing industrial batteries as well as for two recently proposed high performance batteries. Once the model parameters are optimally determined, the results demonstrate that both linear and non-linear equivalent circuit models can predict with acceptable accuracy the performance of various batteries of different sizes, characteristics, capacities, and materials. However, comparisons of the results with catalog and experimental data show that the predictions using the non-linear equivalent circuit model are slightly better than those of the linear model, yielding voltages that are closer to the manufacturers' values.
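
    As background for how an equivalent circuit model maps parameters to terminal voltage, the sketch below simulates a simple one-RC Thevenin ECM and fits its parameters to a measured voltage trace with SciPy's differential evolution. This is a hedged, single-objective stand-in for the paper's multi-objective genetic algorithm; the model order, parameter bounds, and data handling are assumptions for illustration only.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    def simulate_ecm(params, current, dt, ocv):
        """Terminal voltage of a one-RC Thevenin ECM:
        V = OCV - I*R0 - V1, with dV1/dt = -V1/(R1*C1) + I/C1
        (discharge current taken as positive)."""
        r0, r1, c1 = params
        current = np.asarray(current, dtype=float)
        v1 = 0.0
        v_out = np.empty_like(current)
        for k, i_k in enumerate(current):
            v1 += dt * (-v1 / (r1 * c1) + i_k / c1)
            v_out[k] = ocv - i_k * r0 - v1
        return v_out

    def fit_ecm(current, voltage, dt, ocv):
        """Fit R0, R1, C1 by minimizing the RMS error against measured voltage."""
        def cost(p):
            return np.sqrt(np.mean((simulate_ecm(p, current, dt, ocv) - voltage) ** 2))
        bounds = [(1e-4, 0.1), (1e-4, 0.1), (10.0, 1e5)]  # ohm, ohm, farad (assumed ranges)
        return differential_evolution(cost, bounds, seed=1).x
    ```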

  9. Development of a charge algorithm for the optimized charging of a 120-V flooded lead-acid lighthouse battery with forced electrolyte destratification. Final report

    SciTech Connect

    Nowak, D.

    1989-10-01

    Proper charging was identified as the most important requirement for the reliable and economical operation of a battery that is part of the hybrid power system for remote lighthouses. Therefore, a charge algorithm was developed to optimize charging of a flooded lead-acid battery with forced electrolyte destratification. This algorithm is independent of the operating temperature, the state of charge, and the battery age. It controls charging according to the weakest battery module in the pack and, over the course of several cycles, automatically equalizes the performance of the modules in the battery pack without excessive overcharging. The charge algorithm prevents overheating due to bad battery connectors and, more generally, responds to all causes of poor charge acceptance by treating the battery gently during charging.
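
    The core idea, limiting the pack charge current according to the weakest module, can be sketched as a simple per-step control rule like the one below. All thresholds, names, and the back-off rule are hypothetical; the report's actual algorithm additionally handles temperature compensation and forced destratification.

    ```python
    def charge_step(module_voltages, module_temps, current,
                    v_limit=2.45, t_limit=50.0, i_min=0.5, step=0.5):
        """One control step of a hypothetical weakest-module charge scheme:
        the module with the highest per-cell voltage (or temperature) limits
        the pack current, so no module is overcharged or overheated.
        Thresholds are illustrative, not values from the report."""
        worst_v = max(module_voltages)   # the weakest module reaches the limit first
        worst_t = max(module_temps)
        if worst_v > v_limit or worst_t > t_limit:
            current = max(i_min, current - step)   # back off the charge current
        return current
    ```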

  10. Thermal contact algorithms in SIERRA mechanics : mathematical background, numerical verification, and evaluation of performance.

    SciTech Connect

    Copps, Kevin D.; Carnes, Brian R.

    2008-04-01

    We examine algorithms for the finite element approximation of thermal contact models. We focus on the implementation of thermal contact algorithms in SIERRA Mechanics. Following the mathematical formulation of models for tied contact and resistance contact, we present three numerical algorithms: (1) the multi-point constraint (MPC) algorithm, (2) a resistance algorithm, and (3) a new generalized algorithm. We compare and contrast both the correctness and performance of the algorithms in three test problems. We tabulate the convergence rates of global norms of the temperature solution on sequentially refined meshes. We present the results of a parameter study of the effect of contact search tolerances. We outline best practices in using the software for predictive simulations, and suggest future improvements to the implementation.

  11. Optimal Battery Sizing in Photovoltaic Based Distributed Generation Using Enhanced Opposition-Based Firefly Algorithm for Voltage Rise Mitigation

    PubMed Central

    Wong, Ling Ai; Shareef, Hussain; Mohamed, Azah; Ibrahim, Ahmad Asrul

    2014-01-01

    This paper presents the application of enhanced opposition-based firefly algorithm in obtaining the optimal battery energy storage systems (BESS) sizing in photovoltaic generation integrated radial distribution network in order to mitigate the voltage rise problem. Initially, the performance of the original firefly algorithm is enhanced by utilizing the opposition-based learning and introducing inertia weight. After evaluating the performance of the enhanced opposition-based firefly algorithm (EOFA) with fifteen benchmark functions, it is then adopted to determine the optimal size for BESS. Two optimization processes are conducted where the first optimization aims to obtain the optimal battery output power on an hourly basis and the second optimization aims to obtain the optimal BESS capacity by considering the state of charge constraint of BESS. The effectiveness of the proposed method is validated by applying the algorithm to the 69-bus distribution system and by comparing the performance of EOFA with conventional firefly algorithm and gravitational search algorithm. Results show that EOFA has the best performance comparatively in terms of mitigating the voltage rise problem. PMID:25054184
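
    A minimal sketch of the two ingredients named above, opposition-based initialization and a firefly move with an inertia weight, follows. The placement of the inertia weight and all parameter values are assumptions for illustration; the paper's exact EOFA formulation and its BESS cost function are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def opposition_init(n, lb, ub, cost):
        """Opposition-based initialization: generate a random population plus its
        'opposite' (lb + ub - x) and keep the better half of the union.
        lb, ub: 1-D NumPy arrays of lower/upper bounds."""
        pop = rng.uniform(lb, ub, size=(n, len(lb)))
        opp = lb + ub - pop
        union = np.vstack([pop, opp])
        costs = np.array([cost(x) for x in union])
        return union[np.argsort(costs)[:n]]

    def firefly_step(pop, cost, lb, ub, beta0=1.0, gamma=1.0, alpha=0.2, w=0.9):
        """One firefly iteration; the inertia weight w here scales the random walk
        term (one common convention; the paper's formulation may differ)."""
        costs = np.array([cost(x) for x in pop])
        new_pop = pop.copy()
        for i in range(len(pop)):
            for j in range(len(pop)):
                if costs[j] < costs[i]:                          # j is brighter (lower cost)
                    r2 = np.sum((pop[i] - pop[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)           # attractiveness decays with distance
                    rand = rng.uniform(-0.5, 0.5, size=pop.shape[1])
                    new_pop[i] = new_pop[i] + beta * (pop[j] - new_pop[i]) + w * alpha * rand
            new_pop[i] = np.clip(new_pop[i], lb, ub)
        return new_pop
    ```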

  12. Optimal battery sizing in photovoltaic based distributed generation using enhanced opposition-based firefly algorithm for voltage rise mitigation.

    PubMed

    Wong, Ling Ai; Shareef, Hussain; Mohamed, Azah; Ibrahim, Ahmad Asrul

    2014-01-01

    This paper presents the application of enhanced opposition-based firefly algorithm in obtaining the optimal battery energy storage systems (BESS) sizing in photovoltaic generation integrated radial distribution network in order to mitigate the voltage rise problem. Initially, the performance of the original firefly algorithm is enhanced by utilizing the opposition-based learning and introducing inertia weight. After evaluating the performance of the enhanced opposition-based firefly algorithm (EOFA) with fifteen benchmark functions, it is then adopted to determine the optimal size for BESS. Two optimization processes are conducted where the first optimization aims to obtain the optimal battery output power on an hourly basis and the second optimization aims to obtain the optimal BESS capacity by considering the state of charge constraint of BESS. The effectiveness of the proposed method is validated by applying the algorithm to the 69-bus distribution system and by comparing the performance of EOFA with conventional firefly algorithm and gravitational search algorithm. Results show that EOFA has the best performance comparatively in terms of mitigating the voltage rise problem. PMID:25054184

  13. Double-patterning decomposition, design compliance, and verification algorithms at 32nm hp

    NASA Astrophysics Data System (ADS)

    Tritchkov, Alexander; Glotov, Petr; Komirenko, Sergiy; Sahouria, Emile; Torres, Andres; Seoud, Ahmed; Wiaux, Vincent

    2008-10-01

    Double patterning (DP) technology is one of the main candidates for RET of critical layers at 32nm hp. DP technology is a strong RET technique that must be considered throughout the IC design and post-tapeout flows. We present a complete DP technology strategy including a DRC/DFM component, physical synthesis support and mask synthesis. In particular, the methodology contains: - DRC-like layout DP compliance and design verification functions; - A parameterization scheme that codifies manufacturing knowledge and capability; - Judicious use of physical effect simulation to improve double-patterning quality; - An efficient, high capacity mask synthesis function for post-tapeout processing; - A verification function to determine the correctness and quality of a DP solution. Double patterning technology requires decomposition of the design to relax the pitch and effectively allows processing with k1 factors smaller than the theoretical Rayleigh limit of 0.25. The traditional DP process, Litho-Etch-Litho-Etch (LELE) [1], requires an additional develop and etch step, which eliminates the resolution degradation that occurs when multiple exposures are processed in the same resist layer. The theoretical k1 for a double-patterning technology applied to a 32nm half-pitch design using a 1.35NA 193nm imaging system is 0.44, whereas the k1 for a single-patterning of this same design would be 0.22 [2], which is sub-resolution. This paper demonstrates the methods developed at Mentor Graphics for double patterning design compliance and decomposition in an effort to minimize the impact of mask-to-mask registration and process variance. It also demonstrates verification solution implementation in the chip design flow and post-tapeout flow.
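
    The k1 values quoted above follow directly from the resolution relation HP = k1·λ/NA; a quick check with the stated numbers (193 nm wavelength, NA = 1.35):

    ```latex
    k_1^{\text{single}} = \frac{\mathrm{HP}\cdot\mathrm{NA}}{\lambda}
                        = \frac{32\,\mathrm{nm}\times 1.35}{193\,\mathrm{nm}} \approx 0.224,
    \qquad
    k_1^{\text{DP}} = \frac{64\,\mathrm{nm}\times 1.35}{193\,\mathrm{nm}} \approx 0.448,
    ```

    in line with the 0.22 and 0.44 figures quoted above, since decomposition onto two masks doubles the effective half-pitch that each exposure must resolve.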

  14. Estimation of stratospheric NO2 from nadir-viewing satellites: The MPI-C TROPOMI verification algorithm

    NASA Astrophysics Data System (ADS)

    Beirle, Steffen; Wagner, Thomas

    2015-04-01

    The retrieval of tropospheric column densities of NO2 requires the subtraction of the stratospheric fraction from the total columns derived by DOAS. Here we present a modified reference sector method, which estimates the stratosphere over "clean" regions, as well as over clouded scenarios in which the tropospheric column is shielded. The selection of "clean" pixels is realized gradually by assigning weighting factors to the individual ground pixels, instead of applying binary flags. Global stratospheric fields are then compiled by "weighted convolution". In a second iteration, unphysical negative tropospheric residues are suppressed by adjusting the weights accordingly. This algorithm is foreseen as the "verification algorithm" for the upcoming TROPOMI on S5p. We show the resulting stratospheric estimates and tropospheric residues for a test data set based on OMI observations. The dependencies on the a priori settings (definition of weighting factors and convolution kernels) are discussed, and the results are compared to other products, in particular to DOMINO v.2 (based on assimilation, similar to the TROPOMI prototype algorithm) and the NASA standard product (based on a similar reference-region-type approach).
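
    The "weighted convolution" step can be read as a weight-normalized smoothing of the gridded column field, where near-zero weights mark pixels that should not influence the stratospheric estimate. A generic sketch under that assumption (not the MPI-C implementation; array shapes and the kernel choice are illustrative):

    ```python
    import numpy as np
    from scipy.signal import convolve2d

    def weighted_convolution(values, weights, kernel):
        """Weight-normalized smoothing: 'clean' pixels (weight near 1) dominate,
        while pixels with likely tropospheric contamination (weight near 0)
        contribute little. values, weights: 2-D arrays of the same shape;
        kernel: 2-D smoothing kernel."""
        num = convolve2d(values * weights, kernel, mode="same", boundary="symm")
        den = convolve2d(weights, kernel, mode="same", boundary="symm")
        return num / np.maximum(den, 1e-12)   # guard against all-zero weights
    ```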

  15. Verification of visual odometry algorithms with an OpenGL-based software tool

    NASA Astrophysics Data System (ADS)

    Skulimowski, Piotr; Strumillo, Pawel

    2015-05-01

    We present a software tool called a stereovision egomotion sequence generator that was developed for testing visual odometry (VO) algorithms. Various approaches to single and multicamera VO algorithms are reviewed first, and then a reference VO algorithm that has served to demonstrate the program's features is described. The program offers simple tools for defining virtual static three-dimensional scenes and arbitrary six-degrees-of-freedom motion paths within such scenes, and outputs sequences of stereovision images, disparity ground-truth maps, and segmented scene images. A simple script language is proposed that simplifies tests of VO algorithms for user-defined scenarios. The program's capabilities are demonstrated by testing a reference VO technique that employs stereoscopy and feature tracking.

  16. Verification of new cloud discrimination algorithm using GOSAT TANSO-CAI in the Amazon

    NASA Astrophysics Data System (ADS)

    Oishi, Y.; Ishida, H.; Nakajima, T. Y.

    2015-12-01

    Greenhouse gases Observing SATellite (GOSAT) was launched in 2009 to measure the global atmospheric CO2 and CH4 concentrations. GOSAT is equipped with two sensors: the Thermal And Near-infrared Sensor for carbon Observation-Fourier Transform Spectrometer (TANSO-FTS) and the Cloud and Aerosol Imager (TANSO-CAI). The presence of clouds in the instantaneous field-of-view (IFOV) of the FTS leads to incorrect estimates of the concentrations. Thus, FTS data suspected to be cloud-contaminated must be identified using a CAI cloud discrimination algorithm and rejected. Conversely, overestimation of clouds reduces the amount of FTS data that can be used to estimate the greenhouse gas concentrations. This becomes a serious problem in tropical rainforest regions such as the Amazon, where very few FTS data remain after cloud screening. Preparations for the launch of GOSAT-2 in fiscal 2017 are in progress. To improve the accuracy of the greenhouse gas concentration estimates, the existing CAI cloud discrimination algorithm needs to be refined. For this reason, a new cloud discrimination algorithm using support vector machines (SVM) was developed. Visual inspection can use locally optimized thresholds, whereas the existing CAI cloud discrimination algorithm uses common thresholds worldwide; visual inspection is therefore more accurate than these algorithms in regions without surfaces such as ice and snow, where clouds and ground are difficult to discriminate. In this study we evaluated the accuracy of the new cloud discrimination algorithm by comparing it with the existing CAI cloud discrimination algorithm and with visual inspections of the same CAI images in the Amazon. We will present our latest results.
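
    To make the SVM step concrete, the toy sketch below trains a two-class SVM on hypothetical per-pixel reflectance features. The feature set, values, and parameters are invented for illustration and are not the actual inputs of the GOSAT/GOSAT-2 algorithm.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    # Hypothetical training set: rows are CAI pixels, columns are band reflectances.
    X = np.array([[0.08, 0.07, 0.06, 0.05],   # clear forest pixel
                  [0.45, 0.48, 0.50, 0.47],   # bright cloud pixel
                  [0.10, 0.09, 0.08, 0.07],
                  [0.52, 0.55, 0.53, 0.50]])
    y = np.array([0, 1, 0, 1])                # 0 = clear, 1 = cloudy (labels from visual inspection)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    clf.fit(X, y)
    print(clf.predict([[0.40, 0.42, 0.44, 0.41]]))   # expected: cloudy
    ```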

  17. Verification of the Astronomical Almanac's algorithm for approximating the position of the sun

    NASA Astrophysics Data System (ADS)

    Zheng, Lin; Shen, Guotu; Cai, Jiguang; Dong, Zhanhai; Gao, Jing

    2013-09-01

    With the depletion of conventional resources, it is important to develop clean solar energy to help solve the problem of energy shortage. Obtaining an accurate position of the sun is a prerequisite for using solar energy efficiently. An accurate solar position comprises two quantities: elevation and azimuth. In this paper, Joseph J. Michalsky's algorithm for calculating the solar position, taken from the American Astronomical Almanac, is verified. The algorithm was originally written by Joseph J. Michalsky as a FORTRAN program. Here it has been adapted to Visual C++ to calculate the solar elevation and azimuth, and errors and less accurate portions have been corrected. The Chinese Astronomical Almanac for the year 1985 does not tabulate elevation and azimuth; the quantities needed to calculate them, the right ascension, the declination, and the Greenwich mean sidereal time, are tabulated in the Almanac. Comparing these variables calculated from the algorithm with the data from the Chinese Astronomical Almanac for the year 1985, the largest differences are only 0.01°, 0.01° and 0.0001 h, respectively, which indirectly confirms the accuracy of the algorithm. The measured data, which include only elevation, come from Basic Data of Geography in China, written by the Institute of Geography, Chinese Academy of Sciences. Comparing the elevation given by the algorithm with the measured data shows that the algorithm can calculate the position of the sun accurately to some extent. The paper also details the conversion from the local real solar time to Universal Time, because the time in Basic Data of Geography in China is the local real solar time. Finally, we note that the 0.01° accuracy mentioned in other papers is not the accuracy of the elevation and azimuth of the sun, but the accuracy of the right ascension and declination. It's easy to understand why the difference of the results
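
    For reference, the step from the tabulated quantities (right ascension, declination, and Greenwich mean sidereal time) to elevation and azimuth uses standard spherical-astronomy relations. The sketch below is a generic illustration, not Michalsky's code; it neglects refraction and parallax and assumes scalar inputs.

    ```python
    import numpy as np

    def solar_alt_az(ra_hours, dec_deg, gmst_hours, lat_deg, lon_deg_east):
        """Convert right ascension, declination and Greenwich mean sidereal time
        to elevation (altitude) and azimuth at a given site."""
        lmst = (gmst_hours + lon_deg_east / 15.0) % 24.0       # local sidereal time [h]
        ha = np.radians((lmst - ra_hours) * 15.0)              # hour angle [rad]
        dec = np.radians(dec_deg)
        lat = np.radians(lat_deg)

        sin_alt = np.sin(dec) * np.sin(lat) + np.cos(dec) * np.cos(lat) * np.cos(ha)
        alt = np.arcsin(np.clip(sin_alt, -1.0, 1.0))

        cos_az = (np.sin(dec) - np.sin(alt) * np.sin(lat)) / (np.cos(alt) * np.cos(lat))
        az = np.arccos(np.clip(cos_az, -1.0, 1.0))             # azimuth measured from north
        if np.sin(ha) > 0:                                     # sun in the western sky
            az = 2.0 * np.pi - az
        return np.degrees(alt), np.degrees(az)
    ```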

  18. Tuning of Kalman Filter Parameters via Genetic Algorithm for State-of-Charge Estimation in Battery Management System

    PubMed Central

    Ting, T. O.; Lim, Eng Gee

    2014-01-01

    In this work, a state-space battery model is derived mathematically to estimate the state-of-charge (SoC) of a battery system. Subsequently, Kalman filter (KF) is applied to predict the dynamical behavior of the battery model. Results show an accurate prediction as the accumulated error, in terms of root-mean-square (RMS), is a very small value. From this work, it is found that different sets of Q and R values (KF's parameters) can be applied for better performance and hence lower RMS error. This is the motivation for the application of a metaheuristic algorithm. Hence, the result is further improved by applying a genetic algorithm (GA) to tune Q and R parameters of the KF. In an online application, a GA can be applied to obtain the optimal parameters of the KF before its application to a real plant (system). This simply means that the instantaneous response of the KF is not affected by the time-consuming GA as this approach is applied only once to obtain the optimal parameters. The relevant workable MATLAB source codes are given in the appendix to ease future work and analysis in this area. PMID:25162041
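
    A minimal scalar sketch of the kind of filter being tuned is shown below: coulomb counting as the state model and a linearized open-circuit-voltage relation as the measurement model. The model structure, parameter names, and the linear OCV assumption are simplifications for illustration; the paper's state-space model (and its MATLAB code) differ.

    ```python
    def soc_kalman_step(soc, P, current, v_meas, dt, capacity_As, Q, R,
                        ocv_slope, ocv_offset, r0):
        """One predict/update step of a scalar Kalman filter for SoC.
        State model: coulomb counting, SoC_k+1 = SoC_k - I*dt/capacity.
        Measurement model (hypothetical linearized OCV): V = ocv_slope*SoC + ocv_offset - I*r0.
        Q and R are the process/measurement noise variances (the quantities a GA would tune)."""
        # Predict
        soc_pred = soc - current * dt / capacity_As
        P_pred = P + Q
        # Update
        H = ocv_slope
        v_pred = ocv_slope * soc_pred + ocv_offset - current * r0
        K = P_pred * H / (H * P_pred * H + R)       # Kalman gain
        soc_new = soc_pred + K * (v_meas - v_pred)
        P_new = (1.0 - K * H) * P_pred
        return soc_new, P_new
    ```

    Wrapping this step in a loop over a recorded current/voltage profile and scoring the RMS estimation error for candidate (Q, R) pairs is exactly the kind of cost function a GA can minimize offline.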

  19. Verification measurements and clinical evaluation of the iPlan RT Monte Carlo dose algorithm for 6 MV photon energy

    NASA Astrophysics Data System (ADS)

    Petoukhova, A. L.; van Wingerden, K.; Wiggenraad, R. G. J.; van de Vaart, P. J. M.; van Egmond, J.; Franken, E. M.; van Santvoort, J. P. C.

    2010-08-01

    This study presents data for verification of the iPlan RT Monte Carlo (MC) dose algorithm (BrainLAB, Feldkirchen, Germany). MC calculations were compared with pencil beam (PB) calculations and verification measurements in phantoms with lung-equivalent material, air cavities or bone-equivalent material to mimic head and neck and thorax and in an Alderson anthropomorphic phantom. Dosimetric accuracy of MC for the micro-multileaf collimator (MLC) simulation was tested in a homogeneous phantom. All measurements were performed using an ionization chamber and Kodak EDR2 films with Novalis 6 MV photon beams. Dose distributions measured with film and calculated with MC in the homogeneous phantom are in excellent agreement for oval, C and squiggle-shaped fields and for a clinical IMRT plan. For a field with completely closed MLC, MC is much closer to the experimental result than the PB calculations. For fields larger than the dimensions of the inhomogeneities the MC calculations show excellent agreement (within 3%/1 mm) with the experimental data. MC calculations in the anthropomorphic phantom show good agreement with measurements for conformal beam plans and reasonable agreement for dynamic conformal arc and IMRT plans. For 6 head and neck and 15 lung patients a comparison of the MC plan with the PB plan was performed. Our results demonstrate that MC is able to accurately predict the dose in the presence of inhomogeneities typical for head and neck and thorax regions with reasonable calculation times (5-20 min). Lateral electron transport was well reproduced in MC calculations. We are planning to implement MC calculations for head and neck and lung cancer patients.

  20. A practical algorithm for autonomous integrity verification using the pseudo range residual

    NASA Astrophysics Data System (ADS)

    Parkinson, Bradford W.; Axelrad, Penina

    A practical method for using the range residual parameter as a test statistic for GPS satellite failure detection and isolation is presented. The method was developed using empirical results from Monte Carlo computer simulations which indicated performance in terms of false alarm and missed detections. Tests of the system show that the algorithm can always detect failed satellites with 100 m bias errors and can detect failed satellites with 50 m bias errors 99.9 percent of the time. Isolation of a failed satellite can be accomplished 72.2 percent of the time for a 100 m bias and 50.5 percent of the time for a 50 m bias.
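
    A textbook version of the pseudorange-residual test statistic behind this kind of receiver autonomous integrity monitoring is sketched below; the geometry-matrix formulation and the normalization are generic, not necessarily the paper's exact definitions.

    ```python
    import numpy as np

    def range_residual_statistic(G, rho_meas, rho_pred):
        """Least-squares pseudorange residual test statistic.
        G: n x 4 geometry matrix (unit line-of-sight vectors and clock column);
        rho_meas - rho_pred: n linearized pseudorange residual inputs."""
        dz = rho_meas - rho_pred
        dx, *_ = np.linalg.lstsq(G, dz, rcond=None)   # least-squares position/clock correction
        r = dz - G @ dx                               # post-fit residual vector
        n = G.shape[0]
        return np.sqrt(r @ r / (n - 4))               # RMS residual per degree of freedom

    # A fault is declared when the statistic exceeds a threshold chosen from the
    # desired false-alarm probability (e.g., from chi-square tables).
    ```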

  1. Multi-objective optimal design of lithium-ion battery packs based on evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Severino, Bernardo; Gana, Felipe; Palma-Behnke, Rodrigo; Estévez, Pablo A.; Calderón-Muñoz, Williams R.; Orchard, Marcos E.; Reyes, Jorge; Cortés, Marcelo

    2014-12-01

    Lithium-battery energy storage systems (LiBESS) are increasingly being used in electric mobility and stationary applications. Despite their increasing use and improvements in the technology, there are still challenges associated with cost reduction, increasing lifetime and capacity, and higher safety. A correct battery thermal management system (BTMS) design is critical to achieving these goals. In this paper, a general framework for obtaining optimal BTMS designs is proposed. Due to the trade-off between the BTMS's design goals and the complex modeling of the thermal response inside the battery pack, this paper proposes to solve this problem using a novel Multi-Objective Particle Swarm Optimization (MOPSO) approach. A theoretical case of a module with 6 cells and a real case of a pack used in a solar race car are presented. The results show the capabilities of the proposed methodology, in which improved designs for battery packs are obtained.

  2. Verification of IMRT dose calculations using AAA and PBC algorithms in dose buildup regions.

    PubMed

    Oinam, Arun S; Singh, Lakhwant

    2010-01-01

    The purpose of this comparative study was to test the accuracy of anisotropic analytical algorithm (AAA) and pencil beam convolution (PBC) algorithms of Eclipse treatment planning system (TPS) for dose calculations in the low- and high-dose buildup regions. AAA and PBC algorithms were used to create two intensity-modulated radiotherapy (IMRT) plans of the same optimal fluence generated from a clinically simulated oropharynx case in an in-house fabricated head and neck phantom. The TPS computed buildup doses were compared with the corresponding measured doses in the phantom using thermoluminescence dosimeters (TLD 100). Analysis of dose distribution calculated using PBC and AAA shows an increase in gamma value in the dose buildup region indicating large dose deviation. For the surface areas of 1, 50 and 100 cm2, PBC overestimates doses as compared to AAA calculated values in the range of 1.34%-3.62% at 0.6 cm depth, 1.74%-2.96% at 0.4 cm depth, and 1.96%-4.06% at 0.2 cm depth, respectively. In the high-dose buildup region, AAA calculated doses were lower by an average of -7.56% (SD = 4.73%), while PBC overestimated by 3.75% (SD = 5.70%) as compared to TLD measured doses at 0.2 cm depth. However, at 0.4 and 0.6 cm depth, PBC overestimated TLD measured doses by 5.84% (SD = 4.38%) and 2.40% (SD = 4.63%), respectively, while AAA underestimated the TLD measured doses by -0.82% (SD = 4.24%) and -1.10% (SD = 4.14%) at the same respective depths. In the low-dose buildup region, both AAA and PBC overestimated the TLD measured doses at all depths except -2.05% (SD = 10.21%) by AAA at 0.2 cm depth. The differences between AAA and PBC at all depths were statistically significant (p < 0.05) in the high-dose buildup region, whereas they were not statistically significant in the low-dose buildup region. In conclusion, AAA calculated the dose more accurately than PBC in the clinically important high-dose buildup region at 0.4 cm and 0.6 cm depths. The use of an Orfit cast increases the dose buildup
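
    For readers unfamiliar with the gamma metric referenced above, it combines dose difference and distance-to-agreement into a single pass/fail quantity; in its standard form (with dose tolerance ΔD and distance tolerance Δd, e.g. 3%/3 mm):

    ```latex
    \gamma(\mathbf{r}_m) \;=\; \min_{\mathbf{r}_c}
    \sqrt{\frac{\lVert \mathbf{r}_c - \mathbf{r}_m \rVert^{2}}{\Delta d^{\,2}}
          + \frac{\bigl(D_c(\mathbf{r}_c) - D_m(\mathbf{r}_m)\bigr)^{2}}{\Delta D^{2}}},
    \qquad \gamma \le 1 \ \text{passes},
    ```

    where the subscripts c and m denote the calculated and measured distributions; larger gamma values therefore indicate larger dose deviations, as noted in the abstract.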

  3. Generation of synthetic image sequences for the verification of matching and tracking algorithms for deformation analysis

    NASA Astrophysics Data System (ADS)

    Bethmann, F.; Jepping, C.; Luhmann, T.

    2013-04-01

    This paper reports on a method for the generation of synthetic image data for almost arbitrary static or dynamic 3D scenarios. Image data generation is based on pre-defined 3D objects, object textures, camera orientation data and their imaging properties. The procedure does not focus on the creation of photo-realistic images under consideration of complex imaging and reflection models as they are used by common computer graphics programs. In contrast, the method is designed with main emphasis on geometrically correct synthetic images without radiometric impact. The calculation process includes photogrammetric distortion models, hence cameras with arbitrary geometric imaging characteristics can be applied. Consequently, image sets can be created that are consistent with mathematical photogrammetric models and can be used as sub-pixel accurate data for the assessment of high-precision photogrammetric processing methods. In the first instance the paper describes the process of image simulation under consideration of colour value interpolation, MTF/PSF and so on. Subsequently the geometric quality of the synthetic images is evaluated with ellipse operators. Finally, simulated image sets are used to investigate matching and tracking algorithms as they have been developed at IAPG for deformation measurement in car safety testing.

  4. Basic dosimetric verification in water of the anisotropic analytical algorithm for Varian, Elekta and Siemens linacs.

    PubMed

    Cozzi, Luca; Nicolini, Giorgia; Vanetti, Eugenio; Clivio, Alessandro; Glashörster, Marco; Schiefer, Hans; Fogliata, Antonella

    2008-01-01

    Since early 2007 a new version of the Anisotropic Analytical Algorithm (AAA) for photon dose calculations was released by Varian Medical Systems for clinical usage on Elekta linacs and also, with some restrictions, for Siemens linacs. Basic validation studies were performed and reported for beams from three linacs: 4, 6 and 15 MV for an Elekta Synergy; 6 and 15 MV for a Siemens Primus; and, as a reference, 6 and 15 MV from a Varian Clinac 2100C/D. Generally, AAA calculations reproduced measured data well, and small deviations were observed for open and wedged fields. PDD curves showed, on average, differences between calculation and measurement smaller than 1% or 1.2 mm for Elekta beams, 1% or 1.8 mm for Siemens beams and 1% or 1 mm for Varian beams. Profiles in the flattened region matched measurements with deviations smaller than 1% for Elekta and Varian beams, 2% for Siemens. Percentage differences in Output Factors were on average as small as 1%. PMID:18705613

  5. Development and verification of algorithms for spacecraft formation flight using the SPHERES testbed: application to TPF

    NASA Astrophysics Data System (ADS)

    Kong, Edmund M.; Hilstad, Mark O.; Nolet, Simon; Miller, David W.

    2004-10-01

    The MIT Space Systems Laboratory and Payload Systems Inc. have developed the SPHERES testbed for NASA and DARPA as a risk-tolerant medium for the development and maturation of spacecraft formation flight and docking algorithms. The testbed, which is designed to operate both onboard the International Space Station and on the ground, provides researchers with a unique long-term, replenishable, and upgradeable platform for the validation of high-risk control and autonomy technologies critical to the operation of distributed spacecraft missions such as the proposed formation flying interferometer version of Terrestrial Planet Finder (TPF). In November 2003, a subset of the key TPF-like maneuvers was performed onboard NASA's KC-135 microgravity facility, followed by 2-D demonstrations of two and three spacecraft maneuvers at the Marshall Space Flight Center (MSFC) in June 2004. Due to the short experiment duration, only elements of a TPF lost-in-space maneuver were implemented and validated. The longer experiment time at the MSFC flat-floor facility allows more elaborate maneuvers such as array spin-up/down, array resizing and array rotation to be tested, but in a less representative environment. The results obtained from these experiments are presented together with the basic estimator and control building blocks used in these experiments.

  6. Verification of the Solar Dynamics Observatory High Gain Antenna Pointing Algorithm Using Flight Data

    NASA Technical Reports Server (NTRS)

    Bourkland, Kristin L.; Liu, Kuo-Chia

    2011-01-01

    Flight data analysis is presented that shows the readback delay does not have a negative impact on gimbal control. The decision was made to consider implementing two of the jitter mitigation techniques on board the spacecraft: stagger stepping and the NSR. Flight data from two sets of handovers, one set without jitter mitigation and the other with mitigation enabled, were examined. The trajectory of the predicted handover was compared with the measured trajectory for the two cases, showing that tracking was not negatively impacted by the addition of the jitter mitigation techniques. Additionally, the individual gimbal steps were examined, and it was confirmed that the stagger stepping and NSRs worked as designed. An Image Quality Test was performed to determine the amount of cumulative jitter from the reaction wheels, HGAs, and instruments during various combinations of typical operations. In this paper, the flight results are examined from a test where the HGAs are following the path of a nominal handover with stagger stepping on and HMI NSRs enabled. In this case, the reaction wheels are moving at low speed and the instruments are taking pictures in their standard sequence. The flight data shows the level of jitter that the instruments see when their shutters are open. The HGA-induced jitter is well within the jitter requirement when the stagger step and NSR mitigation options are enabled. The SDO HGA pointing algorithm was designed to achieve nominal antenna pointing at the ground station, perform slews during handover season, and provide three HGA-induced jitter mitigation options without compromising pointing objectives. During the commissioning phase, flight data sets were collected to verify the HGA pointing algorithm and demonstrate its jitter mitigation capabilities.

  7. Identification of individuals with ADHD using the Dean-Woodcock sensory motor battery and a boosted tree algorithm.

    PubMed

    Finch, Holmes W; Davis, Andrew; Dean, Raymond S

    2015-03-01

    The accurate and early identification of individuals with pervasive conditions such as attention deficit hyperactivity disorder (ADHD) is crucial to ensuring that they receive appropriate and timely assistance and treatment. Heretofore, identification of such individuals has proven somewhat difficult, typically involving clinical decision making based on descriptions and observations of behavior, in conjunction with the administration of cognitive assessments. The present study reports on the use of a sensory motor battery in conjunction with a recursive partitioning computer algorithm, boosted trees, to develop a prediction heuristic for identifying individuals with ADHD. Results of the study demonstrate that this method is able to do so with accuracy rates of over 95%, much higher than the popular logistic regression model against which it was compared. Implications of these results for practice are provided. PMID:24771321
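
    As an illustration of the boosted-tree classification approach (not the study's data, features, or tuning), a scikit-learn sketch with synthetic subtest scores might look like this:

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score

    # Hypothetical feature matrix: rows are examinees, columns are sensory-motor
    # subtest scores (values and labels are synthetic, for illustration only).
    rng = np.random.default_rng(42)
    X = rng.normal(loc=100.0, scale=15.0, size=(200, 8))
    y = rng.integers(0, 2, size=200)          # 1 = ADHD, 0 = control (synthetic labels)

    clf = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05, max_depth=3)
    scores = cross_val_score(clf, X, y, cv=5)  # cross-validated accuracy on the toy data
    print("cross-validated accuracy:", scores.mean())
    ```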

  8. TH-E-BRE-11: Adaptive-Beamlet Based Finite Size Pencil Beam (AB-FSPB) Dose Calculation Algorithm for Independent Verification of IMRT and VMAT

    SciTech Connect

    Park, C; Arhjoul, L; Yan, G; Lu, B; Li, J; Liu, C

    2014-06-15

    Purpose: In current IMRT and VMAT settings, the use of a sophisticated dose calculation procedure is inevitable in order to account for the complex treatment fields created by MLCs. As a consequence, the independent volumetric dose verification procedure is time consuming, which affects the efficiency of the clinical workflow. In this study, the authors present an efficient pencil-beam-based dose calculation algorithm that minimizes the computational procedure while preserving the accuracy. Methods: The computational time of the Finite Size Pencil Beam (FSPB) algorithm is proportional to the number of infinitesimal identical beamlets that constitute the arbitrary field shape. In AB-FSPB, the dose distribution from each beamlet is mathematically modelled such that the beamlets representing an arbitrary field shape no longer need to be infinitesimal or identical. In consequence, it is possible to represent an arbitrary field shape with a combination of a minimal number of beamlets of different sizes. Results: On comparing FSPB with AB-FSPB, the complexity of the algorithm has been reduced significantly. For a 25 by 25 cm2 square field, 1 beamlet of 25 by 25 cm2 was sufficient to calculate dose in AB-FSPB, whereas in conventional FSPB, a minimum of 2500 beamlets of 0.5 by 0.5 cm2 size were needed to calculate a dose comparable to the result computed by the Treatment Planning System (TPS). The algorithm was also found to be GPU compatible to maximize its computational speed. On calculating the 3D dose of an IMRT plan (~30 control points) and a VMAT plan (~90 control points) with grid size 2.0 mm (200 by 200 by 200), the dose could be computed within 3-5 and 10-15 seconds, respectively. Conclusion: The authors have developed an efficient pencil-beam-type dose calculation algorithm called AB-FSPB. Its fast computation along with GPU compatibility has shown performance better than conventional FSPB. This completely enables the implementation of AB-FSPB in the clinical environment for independent
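
    The idea common to all finite-size pencil beam algorithms is a weighted superposition of beamlet dose kernels; in generic notation (not the authors' exact formulation):

    ```latex
    D(\mathbf{r}) \;=\; \sum_{i=1}^{N} w_i \, K_i(\mathbf{r} - \mathbf{r}_i),
    ```

    where w_i is the weight of beamlet i and K_i its dose kernel. AB-FSPB keeps this superposition but lets the beamlets take different sizes, so a large open region is covered by one analytically modelled beamlet instead of thousands of identical 0.5 by 0.5 cm2 beamlets, which is where the reduction in N (and in computation time) comes from.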

  9. Dosimetric verification and clinical evaluation of a new commercially available Monte Carlo-based dose algorithm for application in stereotactic body radiation therapy (SBRT) treatment planning.

    PubMed

    Fragoso, Margarida; Wen, Ning; Kumar, Sanath; Liu, Dezhi; Ryu, Samuel; Movsas, Benjamin; Munther, Ajlouni; Chetty, Indrin J

    2010-08-21

    Modern cancer treatment techniques, such as intensity-modulated radiation therapy (IMRT) and stereotactic body radiation therapy (SBRT), have greatly increased the demand for more accurate treatment planning (structure definition, dose calculation, etc) and dose delivery. The ability to use fast and accurate Monte Carlo (MC)-based dose calculations within a commercial treatment planning system (TPS) in the clinical setting is now becoming more of a reality. This study describes the dosimetric verification and initial clinical evaluation of a new commercial MC-based photon beam dose calculation algorithm, within the iPlan v.4.1 TPS (BrainLAB AG, Feldkirchen, Germany). Experimental verification of the MC photon beam model was performed with film and ionization chambers in water phantoms and in heterogeneous solid-water slabs containing bone and lung-equivalent materials for a 6 MV photon beam from a Novalis (BrainLAB) linear accelerator (linac) with a micro-multileaf collimator (m3 MLC). The agreement between calculated and measured dose distributions in the water phantom verification tests was, on average, within 2%/1 mm (high dose/high gradient) and was within +/-4%/2 mm in the heterogeneous slab geometries. Example treatment plans in the lung show significant differences between the MC and one-dimensional pencil beam (PB) algorithms within iPlan, especially for small lesions in the lung, where electronic disequilibrium effects are emphasized. Other user-specific features in the iPlan system, such as options to select dose to water or dose to medium, and the mean variance level, have been investigated. Timing results for typical lung treatment plans show the total computation time (including that for processing and I/O) to be less than 10 min for 1-2% mean variance (running on a single PC with 8 Intel Xeon X5355 CPUs, 2.66 GHz). Overall, the iPlan MC algorithm is demonstrated to be an accurate and efficient dose algorithm, incorporating robust tools for MC

  10. Dosimetric verification and clinical evaluation of a new commercially available Monte Carlo-based dose algorithm for application in stereotactic body radiation therapy (SBRT) treatment planning

    NASA Astrophysics Data System (ADS)

    Fragoso, Margarida; Wen, Ning; Kumar, Sanath; Liu, Dezhi; Ryu, Samuel; Movsas, Benjamin; Munther, Ajlouni; Chetty, Indrin J.

    2010-08-01

    Modern cancer treatment techniques, such as intensity-modulated radiation therapy (IMRT) and stereotactic body radiation therapy (SBRT), have greatly increased the demand for more accurate treatment planning (structure definition, dose calculation, etc) and dose delivery. The ability to use fast and accurate Monte Carlo (MC)-based dose calculations within a commercial treatment planning system (TPS) in the clinical setting is now becoming more of a reality. This study describes the dosimetric verification and initial clinical evaluation of a new commercial MC-based photon beam dose calculation algorithm, within the iPlan v.4.1 TPS (BrainLAB AG, Feldkirchen, Germany). Experimental verification of the MC photon beam model was performed with film and ionization chambers in water phantoms and in heterogeneous solid-water slabs containing bone and lung-equivalent materials for a 6 MV photon beam from a Novalis (BrainLAB) linear accelerator (linac) with a micro-multileaf collimator (m3 MLC). The agreement between calculated and measured dose distributions in the water phantom verification tests was, on average, within 2%/1 mm (high dose/high gradient) and was within ±4%/2 mm in the heterogeneous slab geometries. Example treatment plans in the lung show significant differences between the MC and one-dimensional pencil beam (PB) algorithms within iPlan, especially for small lesions in the lung, where electronic disequilibrium effects are emphasized. Other user-specific features in the iPlan system, such as options to select dose to water or dose to medium, and the mean variance level, have been investigated. Timing results for typical lung treatment plans show the total computation time (including that for processing and I/O) to be less than 10 min for 1-2% mean variance (running on a single PC with 8 Intel Xeon X5355 CPUs, 2.66 GHz). Overall, the iPlan MC algorithm is demonstrated to be an accurate and efficient dose algorithm, incorporating robust tools for MC

  11. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification....

  12. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification....

  13. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification....

  14. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification....

  15. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification....

  16. Algorithm development and verification of UASCM for multi-dimension and multi-group neutron kinetics model

    SciTech Connect

    Si, S.

    2012-07-01

    The Universal Algorithm of Stiffness Confinement Method (UASCM) for neutron kinetics model of multi-dimensional and multi-group transport equations or diffusion equations has been developed. The numerical experiments based on transport theory code MGSNM and diffusion theory code MGNEM have demonstrated that the algorithm has sufficient accuracy and stability. (authors)

  17. Development and verification of an analytical algorithm to predict absorbed dose distributions in ocular proton therapy using Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Koch, Nicholas C.; Newhauser, Wayne D.

    2010-02-01

    Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.

  18. Differential evolution algorithm based photonic structure design: numerical and experimental verification of subwavelength λ/5 focusing of light

    NASA Astrophysics Data System (ADS)

    Bor, E.; Turduev, M.; Kurt, H.

    2016-08-01

    Photonic structure designs based on optimization algorithms provide superior properties compared to those using intuition-based approaches. In the present study, we numerically and experimentally demonstrate subwavelength focusing of light using wavelength scale absorption-free dielectric scattering objects embedded in an air background. An optimization algorithm based on differential evolution integrated into the finite-difference time-domain method was applied to determine the locations of each circular dielectric object with a constant radius and refractive index. The multiobjective cost function defined inside the algorithm ensures strong focusing of light with low intensity side lobes. The temporal and spectral responses of the designed compact photonic structure provided a beam spot size in air with a full width at half maximum value of 0.19λ, where λ is the wavelength of light. The experiments were carried out in the microwave region to verify numerical findings, and very good agreement between the two approaches was found. The subwavelength light focusing is associated with a strong interference effect due to nonuniformly arranged scatterers and an irregular index gradient. Improving the focusing capability of optical elements by surpassing the diffraction limit of light is of paramount importance in optical imaging, lithography, data storage, and strong light-matter interaction.
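
    For orientation, a bare-bones DE/rand/1/bin loop is sketched below. It is a generic illustration of the optimizer named above; in the paper each cost evaluation is an FDTD simulation of the scatterer layout, which is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def differential_evolution(cost, bounds, pop_size=30, F=0.7, CR=0.9, n_gen=200):
        """Minimal DE/rand/1/bin. bounds: array of shape (dim, 2)."""
        lb, ub = bounds[:, 0], bounds[:, 1]
        dim = len(lb)
        pop = rng.uniform(lb, ub, size=(pop_size, dim))
        costs = np.array([cost(x) for x in pop])
        for _ in range(n_gen):
            for i in range(pop_size):
                idx = [k for k in range(pop_size) if k != i]
                a, b, c = pop[rng.choice(idx, 3, replace=False)]
                mutant = np.clip(a + F * (b - c), lb, ub)        # mutation
                cross = rng.random(dim) < CR                     # binomial crossover mask
                cross[rng.integers(dim)] = True                  # ensure at least one gene crosses
                trial = np.where(cross, mutant, pop[i])
                c_trial = cost(trial)
                if c_trial < costs[i]:                           # greedy selection
                    pop[i], costs[i] = trial, c_trial
        return pop[np.argmin(costs)], costs.min()
    ```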

  19. Differential evolution algorithm based photonic structure design: numerical and experimental verification of subwavelength λ/5 focusing of light

    PubMed Central

    Bor, E.; Turduev, M.; Kurt, H.

    2016-01-01

    Photonic structure designs based on optimization algorithms provide superior properties compared to those using intuition-based approaches. In the present study, we numerically and experimentally demonstrate subwavelength focusing of light using wavelength scale absorption-free dielectric scattering objects embedded in an air background. An optimization algorithm based on differential evolution integrated into the finite-difference time-domain method was applied to determine the locations of each circular dielectric object with a constant radius and refractive index. The multiobjective cost function defined inside the algorithm ensures strong focusing of light with low intensity side lobes. The temporal and spectral responses of the designed compact photonic structure provided a beam spot size in air with a full width at half maximum value of 0.19λ, where λ is the wavelength of light. The experiments were carried out in the microwave region to verify numerical findings, and very good agreement between the two approaches was found. The subwavelength light focusing is associated with a strong interference effect due to nonuniformly arranged scatterers and an irregular index gradient. Improving the focusing capability of optical elements by surpassing the diffraction limit of light is of paramount importance in optical imaging, lithography, data storage, and strong light-matter interaction. PMID:27477060

  20. Differential evolution algorithm based photonic structure design: numerical and experimental verification of subwavelength λ/5 focusing of light.

    PubMed

    Bor, E; Turduev, M; Kurt, H

    2016-01-01

    Photonic structure designs based on optimization algorithms provide superior properties compared to those using intuition-based approaches. In the present study, we numerically and experimentally demonstrate subwavelength focusing of light using wavelength scale absorption-free dielectric scattering objects embedded in an air background. An optimization algorithm based on differential evolution integrated into the finite-difference time-domain method was applied to determine the locations of each circular dielectric object with a constant radius and refractive index. The multiobjective cost function defined inside the algorithm ensures strong focusing of light with low intensity side lobes. The temporal and spectral responses of the designed compact photonic structure provided a beam spot size in air with a full width at half maximum value of 0.19λ, where λ is the wavelength of light. The experiments were carried out in the microwave region to verify numerical findings, and very good agreement between the two approaches was found. The subwavelength light focusing is associated with a strong interference effect due to nonuniformly arranged scatterers and an irregular index gradient. Improving the focusing capability of optical elements by surpassing the diffraction limit of light is of paramount importance in optical imaging, lithography, data storage, and strong light-matter interaction. PMID:27477060

  1. Evaluation of the Eclipse eMC algorithm for bolus electron conformal therapy using a standard verification dataset.

    PubMed

    Carver, Robert L; Sprunger, Conrad P; Hogstrom, Kenneth R; Popple, Richard A; Antolak, John A

    2016-01-01

    The purpose of this study was to evaluate the accuracy and calculation speed of electron dose distributions calculated by the Eclipse electron Monte Carlo (eMC) algorithm for use with bolus electron conformal therapy (ECT). The recent commercial availability of bolus ECT technology requires further validation of the eMC dose calculation algorithm. eMC-calculated electron dose distributions for bolus ECT have been compared to previously measured TLD-dose points throughout patient-based cylindrical phantoms (retromolar trigone and nose), whose axial cross sections were based on the mid-PTV (planning treatment volume) CT anatomy. The phantoms consisted of SR4 muscle substitute, SR4 bone substitute, and air. The treatment plans were imported into the Eclipse treatment planning system, and electron dose distributions calculated using 1% and < 0.2% statistical uncertainties. The accuracy of the dose calculations using moderate smoothing and no smoothing was evaluated. Dose differences (eMC-calculated less measured dose) were evaluated in terms of absolute dose difference, where 100% equals the given dose, as well as distance to agreement (DTA). Dose calculations were also evaluated for calculation speed. Results from the eMC for the retromolar trigone phantom using 1% statistical uncertainty without smoothing showed calculated dose at 89% (41/46) of the measured TLD-dose points was within 3% dose difference or 3 mm DTA of the measured value. The average dose difference was -0.21%, and the net standard deviation was 2.32%. Differences as large as 3.7% occurred immediately distal to the mandible bone. Results for the nose phantom, using 1% statistical uncertainty without smoothing, showed calculated dose at 93% (53/57) of the measured TLD-dose points within 3% dose difference or 3 mm DTA. The average dose difference was 1.08%, and the net standard deviation was 3.17%. Differences as large as 10% occurred lateral to the nasal air cavities. Including smoothing had

  2. Verification of dynamic initial pointing algorithm on two-dimensional rotating platform based on GPS/INS

    NASA Astrophysics Data System (ADS)

    Yang, Baohua; Wang, Juanjuan; Wang, Jian

    2015-10-01

    In order to achieve rapid establishment of long-distance laser communication links, adopting a GPS/INS integrated navigation system (GINS) is an effective approach for completing the initial pointing of dynamic laser communication. Firstly, we present a dynamic initial pointing algorithm (DIPA), which obtains the pointing angle (PA) by processing the real-time data received from the GINS. Next, the feasibility of the pointing system is analyzed, and the hardware system as well as the PC software is designed. Then, outdoor experiments are carried out to verify the DIPA. Finally, the correctness and reliability of the pointing system are analyzed.

  3. Verification and application of the extended spectral deconvolution algorithm (SDA+) methodology to estimate aerosol fine and coarse mode extinction coefficients in the marine boundary layer

    NASA Astrophysics Data System (ADS)

    Kaku, K. C.; Reid, J. S.; O'Neill, N. T.; Quinn, P. K.; Coffman, D. J.; Eck, T. F.

    2014-10-01

    The spectral deconvolution algorithm (SDA) and SDA+ (extended SDA) methodologies can be employed to separate the fine and coarse mode extinction coefficients from measured total aerosol extinction coefficients, but their common use is currently limited to AERONET (AErosol RObotic NETwork) aerosol optical depth (AOD). Here we provide the verification of the SDA+ methodology on a non-AERONET aerosol product, by applying it to fine and coarse mode nephelometer and particle soot absorption photometer (PSAP) data sets collected in the marine boundary layer. Using data sets collected on research vessels by NOAA-PMEL (National Oceanic and Atmospheric Administration - Pacific Marine Environmental Laboratory), we demonstrate that with accurate input, SDA+ is able to predict the fine and coarse mode scattering and extinction coefficient partition in global data sets representing a range of aerosol regimes. However, in low-extinction regimes commonly found in the clean marine boundary layer, SDA+ output accuracy is sensitive to instrumental calibration errors. This work was extended to the calculation of coarse and fine mode scattering coefficients with similar success. This effort not only verifies the application of the SDA+ method to in situ data, but by inference verifies the method as a whole for a host of applications, including AERONET. Study results open the door to much more extensive use of nephelometers and PSAPs, with the ability to calculate fine and coarse mode scattering and extinction coefficients in field campaigns that do not have the resources to explicitly measure these values.

  4. Formal Verification of Safety Properties for Aerospace Systems Through Algorithms Based on Exhaustive State-Space Exploration

    NASA Technical Reports Server (NTRS)

    Ciardo, Gianfranco

    2004-01-01

    The Runway Safety Monitor (RSM) designed by Lockheed Martin is part of NASA's effort to reduce aviation accidents. We developed a Petri net model of the RSM protocol and used the model checking functions of our tool SMART to investigate a number of safety properties in RSM. To mitigate the impact of state-space explosion, we built a highly discretized model of the system, obtained by partitioning the monitored runway zone into a grid of smaller volumes and by considering scenarios involving only two aircraft. The model also assumes that there are no communication failures, such as bad input from radar or lack of incoming data, thus it relies on a consistent view of reality by all participants. In spite of these simplifications, we were able to expose potential problems in the RSM conceptual design. Our findings were forwarded to the design engineers, who undertook corrective action. Additionally, the results stress the efficiency attained by the new model checking algorithms implemented in SMART, and demonstrate their applicability to real-world systems. Attempts to verify RSM with NuSMV and SPIN have failed due to excessive memory consumption.

  5. Dosimetric verification of the anisotropic analytical algorithm in lung equivalent heterogeneities with and without bone equivalent heterogeneities

    PubMed Central

    Ono, Kaoru; Endo, Satoru; Tanaka, Kenichi; Hoshi, Masaharu; Hirokawa, Yutaka

    2010-01-01

    Purpose: In this study, the authors evaluated the accuracy of dose calculations performed by the convolution/superposition based anisotropic analytical algorithm (AAA) in lung equivalent heterogeneities with and without bone equivalent heterogeneities. Methods: Calculations of PDDs using the AAA and Monte Carlo simulations (MCNP4C) were compared to ionization chamber measurements with a heterogeneous phantom consisting of lung equivalent and bone equivalent materials. Both 6 and 10 MV photon beams of 4×4 and 10×10 cm2 field sizes were used for the simulations. Furthermore, changes of energy spectrum with depth for the heterogeneous phantom using MCNP were calculated. Results: The ionization chamber measurements and MCNP calculations in a lung equivalent phantom were in good agreement, having an average deviation of only 0.64±0.45%. For both 6 and 10 MV beams, the average deviation was less than 2% for the 4×4 and 10×10 cm2 fields in the water-lung equivalent phantom and the 4×4 cm2 field in the water-lung-bone equivalent phantom. Maximum deviations for the 10×10 cm2 field in the lung equivalent phantom before and after the bone slab were 5.0% and 4.1%, respectively. The Monte Carlo simulation demonstrated an increase of the low-energy photon component in these regions, more for the 10×10 cm2 field compared to the 4×4 cm2 field. Conclusions: The Monte Carlo simulation shows that the low-energy photon component increases sharply in larger fields when there is a significant presence of bone equivalent heterogeneities. This leads to great changes in the build-up and build-down at the interfaces of different density materials. The AAA calculation modeling of the effect is not deemed to be sufficiently accurate. PMID:20879604

  6. Dosimetric verification of the anisotropic analytical algorithm in lung equivalent heterogeneities with and without bone equivalent heterogeneities

    SciTech Connect

    Ono, Kaoru; Endo, Satoru; Tanaka, Kenichi; Hoshi, Masaharu; Hirokawa, Yutaka

    2010-08-15

    Purpose: In this study, the authors evaluated the accuracy of dose calculations performed by the convolution/superposition based anisotropic analytical algorithm (AAA) in lung equivalent heterogeneities with and without bone equivalent heterogeneities. Methods: Calculations of PDDs using the AAA and Monte Carlo simulations (MCNP4C) were compared to ionization chamber measurements with a heterogeneous phantom consisting of lung equivalent and bone equivalent materials. Both 6 and 10 MV photon beams of 4×4 and 10×10 cm² field sizes were used for the simulations. Furthermore, changes of energy spectrum with depth for the heterogeneous phantom using MCNP were calculated. Results: The ionization chamber measurements and MCNP calculations in a lung equivalent phantom were in good agreement, having an average deviation of only 0.64±0.45%. For both 6 and 10 MV beams, the average deviation was less than 2% for the 4×4 and 10×10 cm² fields in the water-lung equivalent phantom and the 4×4 cm² field in the water-lung-bone equivalent phantom. Maximum deviations for the 10×10 cm² field in the lung equivalent phantom before and after the bone slab were 5.0% and 4.1%, respectively. The Monte Carlo simulation demonstrated an increase of the low-energy photon component in these regions, more for the 10×10 cm² field compared to the 4×4 cm² field. Conclusions: The Monte Carlo simulation shows that the low-energy photon component increases sharply in larger fields when there is a significant presence of bone equivalent heterogeneities. This leads to great changes in the build-up and build-down at the interfaces of different density materials. The AAA calculation modeling of the effect is not deemed to be sufficiently accurate.

  7. Crewed Space Vehicle Battery Safety Requirements

    NASA Technical Reports Server (NTRS)

    Jeevarajan, Judith A.; Darcy, Eric C.

    2014-01-01

    This requirements document is applicable to all batteries on crewed spacecraft, including vehicle, payload, and crew equipment batteries. It defines the specific provisions required to design a battery that is safe for ground personnel and crew members to handle and/or operate during all applicable phases of crewed missions, safe for use in the enclosed environment of a crewed space vehicle, and safe for use in launch vehicles, as well as in unpressurized spaces adjacent to the habitable portion of a space vehicle. The required provisions encompass hazard controls, design evaluation, and verification. The extent of the hazard controls and verification required depends on the applicability and credibility of the hazard to the specific battery design and applicable missions under review. Evaluation of the design and verification program results shall be completed prior to certification for flight and ground operations. This requirements document is geared toward the designers of battery systems to be used in crewed vehicles, crew equipment, crew suits, or batteries to be used in crewed vehicle systems and payloads (or experiments). This requirements document also applies to ground handling and testing of flight batteries. Specific design and verification requirements for a battery are dependent upon the battery chemistry, capacity, complexity, charging, environment, and application. The variety of battery chemistries available, combined with the variety of battery-powered applications, results in each battery application having specific, unique requirements pertinent to the specific battery application. However, there are basic requirements for all battery designs and applications, which are listed in section 4. Section 5 includes a description of hazards and controls and also includes requirements.

  8. Button batteries

    MedlinePlus

    Swallowing batteries ... These devices use button batteries: Calculators Cameras Hearing aids Penlights Watches ... If a person puts the battery up their nose and breathes it further in, ... problems Cough Pneumonia (if the battery goes unnoticed) ...

  9. Button batteries

    MedlinePlus

    These devices use button batteries: Calculators Cameras Hearing aids Penlights Watches ... locate the battery. Blood and urine tests. Bronchoscopy . Camera placed down the throat into the lungs to ...

  10. Batteries for terrestrial applications

    SciTech Connect

    Kulin, T.M.

    1998-07-01

    Extensive research has been conducted in the design and manufacture of very long life vented and sealed maintenance free nickel-cadmium aircraft batteries. These batteries have also been used in a number of terrestrial applications with good success. This study presents an overview of the Ni-Cd chemistry and technology as well as detailed analysis of the advantages and disadvantages of the Ni-Cd couple for terrestrial applications. The performance characteristics of both sealed and vented Ni-Cd's are presented. Various charge algorithms are examined and evaluated for effectiveness and ease of implementation. Hardware requirements for charging are also presented and evaluated. The discharge characteristics of vented and sealed Ni-Cd's are presented and compared to other battery chemistries. The performance of Ni-Cd's under extreme environmental conditions is also compared to other battery chemistries. The history of various terrestrial applications is reviewed and some of the lessons learned are presented. Applications discussed include the NASA Middeck Payload Battery, Raytheon Aegis Missile System Battery, THAAD Launcher battery, and the Titan IV battery. The suitability of the Ni-Cd chemistry for other terrestrial applications such as electric vehicles and Uninterruptible Power Supply is discussed.
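
    The charge-algorithm evaluation mentioned above typically covers the negative-delta-V termination used for nickel-based cells. The following minimal sketch shows that termination check; the 10 mV threshold and the sample readings are illustrative values, not results from the study.

      def negative_delta_v_terminate(voltage_history, drop_mv=10.0):
          """Detect the -dV charge-termination condition commonly used for
          nickel-based cells: stop charging once the cell voltage has fallen
          a set amount below the peak seen during the charge (threshold is
          illustrative)."""
          if not voltage_history:
              return False
          peak = max(voltage_history)
          return (peak - voltage_history[-1]) * 1000.0 >= drop_mv

      readings = [1.40, 1.43, 1.45, 1.452, 1.449, 1.441]   # volts, sampled during charge
      print(negative_delta_v_terminate(readings))           # True: 11 mV below the peak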

  11. Geometric verification

    NASA Technical Reports Server (NTRS)

    Grebowsky, G. J.

    1982-01-01

    Present LANDSAT data formats are reviewed to clarify how the geodetic location and registration capabilities were defined for P-tape products and RBV data. Since there is only one geometric model used in the master data processor, geometric location accuracy of P-tape products depends on the absolute accuracy of the model and registration accuracy is determined by the stability of the model. Due primarily to inaccuracies in data provided by the LANDSAT attitude management system, desired accuracies are obtained only by using ground control points and a correlation process. The verification of system performance with regards to geodetic location requires the capability to determine pixel positions of map points in a P-tape array. Verification of registration performance requires the capability to determine pixel positions of common points (not necessarily map points) in 2 or more P-tape arrays for a given world reference system scene. Techniques for registration verification can be more varied and automated since map data are not required. The verification of LACIE extractions is used as an example.

  12. Space Station battery system design and development

    NASA Technical Reports Server (NTRS)

    Haas, R. J.; Chawathe, A. K.; Van Ommering, G.

    1988-01-01

    The Space Station Electric Power System will rely on nickel-hydrogen batteries in its photovoltaic power subsystem for energy storage to support eclipse and contingency operations. These 81-Ah batteries will be designed for a 5-year life capability and are configured as orbital replaceable units (ORUs), permitting replacement of worn-out batteries over the anticipated 30-year Station life. This paper describes the baseline design and the development plans for the battery assemblies, the battery ORUs and the battery system. Key elements reviewed are the cells, mechanical and thermal design of the assembly, the ORU approach and interfaces, and the electrical design of the battery system. The anticipated operational approach is discussed, covering expected performance as well as the processor-controlled charge management and discharge load allocation techniques. Development plans cover verification of materials, cells, assemblies and ORUs, as well as system-level test and analyses.

  13. 40 CFR 1065.355 - H2O and CO2 interference verification for CO NDIR analyzers.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct these other measurements to test the compensation algorithms during the...

  14. 40 CFR 1065.355 - H2O and CO2 interference verification for CO NDIR analyzers.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct these other measurements to test the compensation algorithms during the...

  15. 40 CFR 1065.355 - H2O and CO2 interference verification for CO NDIR analyzers.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct these other measurements to test the compensation algorithms during the...

  16. 40 CFR 1065.355 - H2O and CO2 interference verification for CO NDIR analyzers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct these other measurements to test the compensation algorithms during the...

  17. Use of COTS Batteries on ISS and Shuttle

    NASA Technical Reports Server (NTRS)

    Jeevarajan, Judith A.

    2004-01-01

    This presentation focuses on COTS battery testing for energy content, toxicity, hazards, failure modes, and controls for different battery chemistries. It also discusses the current program requirements, the challenges with COTS batteries in manned vehicles, the COTS methodology, and JSC test details, and gives a list of incidents from consumer protection safety commissions. The battery test process involved testing new batteries for engineering certification, qualification, and flight acceptance, at both the cell and battery level, covering environmental, performance, and abuse testing. The conclusions and recommendations were that high risk is undertaken with the use of COTS batteries, that hazard control verification is required to allow the use of these batteries on manned space flights, that failures during use cannot be understood if different failure scenarios are not tested on the ground, and that testing is currently performed on small sample numbers due to restrictions on cost and time. They recommend testing of larger sample sizes to gain more confidence in the operation of the hazard controls.

  18. ETV - VERIFICATION TESTING (ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM)

    EPA Science Inventory

    Verification testing is a major component of the Environmental Technology Verification (ETV) program. The ETV Program was instituted to verify the performance of innovative technical solutions to problems that threaten human health or the environment and was created to substantia...

  19. Dry cell battery poisoning

    MedlinePlus

    Batteries - dry cell ... Acidic dry cell batteries contain: Manganese dioxide Ammonium chloride Alkaline dry cell batteries contain: Sodium hydroxide Potassium hydroxide Lithium dioxide dry cell batteries ...

  20. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.

  1. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building can be modeled which lies in the path of the plume, or a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR is properly executing all algorithms and transferring data. Hand calculations were also performed to ensure proper application of methodologies.

  2. A 3D superposition pencil beam dose calculation algorithm for a 60Co therapy unit and its verification by MC simulation

    NASA Astrophysics Data System (ADS)

    Koncek, O.; Krivonoska, J.

    2014-11-01

    The MCNP Monte Carlo code was used to simulate the collimating system of the 60Co therapy unit to calculate the primary and scattered photon fluences as well as the electron contamination incident to the isocentric plane as functions of the irradiation field size. Furthermore, a Monte Carlo simulation for the generation of polyenergetic Pencil Beam Kernels (PBKs) was performed using the calculated photon and electron spectra. The PBK was analytically fitted to speed up the dose calculation using the convolution technique in homogeneous media. The quality of the PBK fit was verified by comparing the calculated and simulated 60Co broad beam profiles and depth dose curves in a homogeneous water medium. The inhomogeneity correction coefficients were derived from the PBK simulation of an inhomogeneous slab phantom consisting of various materials. The inhomogeneity calculation model is based on the changes in the PBK radial displacement and on the change of the forward and backward electron scattering. The inhomogeneity correction is derived from the electron density values gained from a complete 3D CT array and considers the different electron densities through which the pencil beam is propagated as well as the electron density values located between the interaction point and the point of dose deposition. Important aspects and details of the algorithm implementation are also described in this study.
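
    As a rough illustration of the convolution step described above, the sketch below convolves a 2D incident fluence map with a radially symmetric kernel on a regular grid. The Gaussian placeholder kernel and the field geometry are assumptions; they stand in for the MCNP-derived, analytically fitted 60Co kernel used in the study.

      import numpy as np
      from scipy.signal import fftconvolve

      def dose_plane(fluence, kernel):
          """Convolve an incident fluence map with a pencil beam kernel to
          obtain the dose in one plane of a homogeneous medium. Both arrays
          share the same grid spacing; the kernel here is a placeholder
          Gaussian, not the fitted 60Co kernel from the study."""
          return fftconvolve(fluence, kernel, mode="same")

      # Placeholder 10 cm x 10 cm open field on a 1 mm grid.
      x = np.arange(-100, 101)
      xx, yy = np.meshgrid(x, x)
      fluence = ((np.abs(xx) <= 50) & (np.abs(yy) <= 50)).astype(float)
      kernel = np.exp(-(xx ** 2 + yy ** 2) / (2 * 15.0 ** 2))
      kernel /= kernel.sum()
      dose = dose_plane(fluence, kernel)
      print(dose.shape, dose.max())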

  3. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed by considering the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming at solving the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
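
    The abstract does not give the virtual force equations; the minimal sketch below shows one common form of incremental refinement, in which each anchor pulls or pushes the position estimate in proportion to the range error. The step size, iteration count, and anchor layout are illustrative assumptions.

      import math

      def refine_position(x, y, anchors, ranges, step=0.1, iters=200):
          """Incrementally refine an unknown node's (x, y) estimate.
          Each anchor exerts a virtual force along the line to the node,
          proportional to the difference between the measured range and
          the range implied by the current estimate."""
          for _ in range(iters):
              fx = fy = 0.0
              for (ax, ay), r in zip(anchors, ranges):
                  dx, dy = x - ax, y - ay
                  d = math.hypot(dx, dy) or 1e-9
                  err = d - r                # positive: estimate too far from anchor
                  fx -= err * dx / d         # pull toward / push away from anchor
                  fy -= err * dy / d
              x += step * fx
              y += step * fy
          return x, y

      anchors = [(0, 0), (10, 0), (0, 10)]
      ranges = [math.hypot(4, 3), math.hypot(6, 3), math.hypot(4, 7)]  # true node at (4, 3)
      print(refine_position(5.0, 5.0, anchors, ranges))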

  4. An Overview of the NASA Aerospace Flight Battery Systems Program

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.

    2003-01-01

    The NASA Aerospace Flight Battery Systems Program is an agency-wide effort aimed at ensuring the quality, safety, reliability and performance of flight battery systems for NASA applications. The program provides for the validation of primary and secondary cell and battery level technology advances to ensure their availability and readiness for use in NASA missions. It serves to bridge the gap between the development of technology advances and the realization and incorporation of these advances into mission applications. The program is led by the Glenn Research Center and involves funded task activities at each of the NASA mission centers and JPL. The overall products are safe, reliable, high quality batteries for mission applications. The products are defined along three product lines: 1. Battery Systems Technology - Elements of this task area cover the systems aspects of battery operation and generally apply across chemistries. This includes the development of guidelines documents, the establishment and maintenance of a central battery database that serves as a central repository for battery characterization and verification test data from tests performed under the support of this program, the NASA Battery Workshop, and general test facility support. 2. Secondary Battery Technology - This task area focuses on the validation of battery technology for nickel-cadmium, nickel-hydrogen, nickel-metal-hydride and lithium-ion secondary battery systems. Standardized test regimes are used to validate the quality of a cell lot or cell design for flight applications. In this area, efforts are now concentrated on the validation and verification of lithium-ion battery technology for aerospace applications. 3. Primary Battery Technology - The safety and reliability aspects for primary lithium battery systems that are used in manned operations on the Shuttle and International Space Station are addressed in the primary battery technology task area. An overview of the task areas

  5. The 2004 NASA Aerospace Battery Workshop

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Topics covered include: Super NiCd(TradeMark) Energy Storage for Gravity Probe-B Relativity Mission; Hubble Space Telescope 2004 Battery Update; The Development of Hermetically Sealed Aerospace Nickel-Metal Hydride Cell; Serial Charging Test on High Capacity Li-Ion Cells for the Orbiter Advanced Hydraulic Power System; Cell Equalization of Lithium-Ion Cells; The Long-Term Performance of Small-Cell Batteries Without Cell-Balancing Electronics; Identification and Treatment of Lithium Battery Cell Imbalance under Flight Conditions; Battery Control Boards for Li-Ion Batteries on Mars Exploration Rovers; Cell Over Voltage Protection and Balancing Circuit of the Lithium-Ion Battery; Lithium-Ion Battery Electronics for Aerospace Applications; Lithium-Ion Cell Charge Control Unit; Lithium Ion Battery Cell Bypass Circuit Test Results at the U.S. Naval Research Laboratory; High Capacity Battery Cell By-Pass Switches: High Current Pulse Testing of Lithium-Ion Battery By-Pass Switches to Verify Their Ability to Withstand Short-Circuits; Incorporation of Physics-Based, Spatially-Resolved Battery Models into System Simulations; A Monte Carlo Model for Li-Ion Battery Life Projections; Thermal Behavior of Large Lithium-Ion Cells; Thermal Imaging of Aerospace Battery Cells; High Rate Designed 50 Ah Li-Ion Cell for LEO Applications; Evaluation of Corrosion Behavior in Aerospace Lithium-Ion Cells; Performance of AEA 80 Ah Battery Under GEO Profile; LEO Li-Ion Battery Testing; A Review of the Feasibility Investigation of Commercial Laminated Lithium-Ion Polymer Cells for Space Applications; Lithium-Ion Verification Test Program; Panasonic Small Cell Testing for AHPS; Lithium-Ion Small Cell Battery Shorting Study; Low-Earth-Orbit and Geosynchronous-Earth-Orbit Testing of 80 Ah Batteries under Real-Time Profiles; Update on Development of Lithium-Ion Cells for Space Applications at JAXA; Foreign Comparative Technology: Launch Vehicle Battery Cell Testing; 20V, 40 Ah Lithium Ion Polymer

  6. Reserve battery

    SciTech Connect

    Thiess, G.H.

    1988-12-27

    A reserve battery is described comprising: a battery cell compartment; an electrolyte reservoir containing pressurized electrolyte fluid; an elongate member formed of rigid material having interior walls defining a closed orifice between the battery cell compartment and the electrolyte fluid reservoir; and the elongate member including a groove adjacent the orifice to define a frangible portion such that upon angular displacement of the elongate member the elongate member is severed at the frangible portion to open the orifice and allow pressurized electrolyte fluid to be conveyed through the orifice to the battery cell compartment.

  7. power battery

    NASA Astrophysics Data System (ADS)

    Yunyun, Zhang; Guoqing, Zhang; Weixiong, Wu; Weixiong, Liang

    2014-07-01

    Under hard acceleration or hill climbing of (hybrid) electric vehicles, the battery temperature increases rapidly. High temperature decreases the battery cycle life, increases the risk of thermal runaway, and can even cause a battery to explode, making battery temperature management an important consideration in the safe use of electric vehicles. A study of increasing the heat transfer area from the beginning of the design phase has been conducted to determine and enhance the heat dissipation on the battery surface. Both experimental and simulation methods were used to analyze the cooling performance under identical battery capacities and heights. Optimal external dimensions and cell sizes, with consideration of better battery workability, were obtained from the analysis. The heat transfer coefficients were investigated in order to regulate the battery temperature within the safe operating range. It was found that the temperature of the test battery stayed below the safety-critical level when the cell was designed with 180 mm × 30 mm × 185 mm dimensions and the surface heat transfer coefficient was at least 20 W m-2 K-1.
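
    A minimal sketch of the lumped steady-state heat balance implied above, Q = h·A·(T - T_amb), using the cell dimensions and the 20 W m-2 K-1 surface heat transfer coefficient cited in the abstract; the 15 W heat generation figure is an assumption for illustration.

      def steady_state_temperature(q_gen_w, width_m, depth_m, height_m,
                                   h_w_m2k, t_ambient_c=25.0):
          """Lumped steady-state cell temperature: all generated heat leaves
          through the outer surface by convection, Q = h * A * (T - T_amb)."""
          area = 2 * (width_m * depth_m + width_m * height_m + depth_m * height_m)
          return t_ambient_c + q_gen_w / (h_w_m2k * area)

      # Cell of 180 mm x 30 mm x 185 mm dissipating an assumed 15 W, with the
      # 20 W m-2 K-1 surface heat transfer coefficient cited above.
      print(steady_state_temperature(15.0, 0.180, 0.030, 0.185, 20.0))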

  8. Verifying a Computer Algorithm Mathematically.

    ERIC Educational Resources Information Center

    Olson, Alton T.

    1986-01-01

    Presents an example of mathematics from an algorithmic point of view, with emphasis on the design and verification of this algorithm. The program involves finding roots for algebraic equations using the half-interval search algorithm. The program listing is included. (JN)
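
    The article's program listing is not reproduced in this record; the sketch below is a generic half-interval (bisection) search of the kind it describes, applied to a sample algebraic equation.

      def half_interval_root(f, lo, hi, tol=1e-10):
          """Find a root of f on [lo, hi] by repeatedly halving the interval.
          Requires f(lo) and f(hi) to have opposite signs."""
          if f(lo) * f(hi) > 0:
              raise ValueError("f must change sign on the interval")
          while hi - lo > tol:
              mid = (lo + hi) / 2.0
              if f(lo) * f(mid) <= 0:
                  hi = mid
              else:
                  lo = mid
          return (lo + hi) / 2.0

      # Root of x^3 - 2x - 5 between 2 and 3 (approximately 2.0945514815).
      print(half_interval_root(lambda x: x**3 - 2*x - 5, 2.0, 3.0))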

  9. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  10. NASA Aerospace Flight Battery Program: Generic Safety, Handling and Qualification Guidelines for Lithium-Ion (Li-Ion) Batteries; Availability of Source Materials for Lithium-Ion (Li-Ion) Batteries; Maintaining Technical Communications Related to Aerospace Batteries (NASA Aerospace Battery Workshop). Volume 1, Part 1

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.; Brewer, Jeffrey C.; Bugga, Ratnakumar V.; Darcy, Eric C.; Jeevarajan, Judith A.; McKissock, Barbara I.; Schmitz, Paul C.

    2010-01-01

    This NASA Aerospace Flight Battery Systems Working Group was chartered within the NASA Engineering and Safety Center (NESC). The Battery Working Group was tasked to complete tasks and to propose proactive work to address battery related, agency-wide issues on an annual basis. In its first year of operation, this proactive program addressed various aspects of the validation and verification of aerospace battery systems for NASA missions. Studies were performed, issues were discussed and in many cases, test programs were executed to generate recommendations and guidelines to reduce risk associated with various aspects of implementing battery technology in the aerospace industry. This document contains Part 1 - Volume I: Generic Safety, Handling and Qualification Guidelines for Lithium-Ion (Li-Ion) Batteries, Availability of Source Materials for Lithium-Ion (Li-Ion) Batteries, and Maintaining Technical Communications Related to Aerospace Batteries (NASA Aerospace Battery Workshop).

  11. Mathematical Modeling of Ni/H2 and Li-Ion Batteries

    NASA Technical Reports Server (NTRS)

    Weidner, John W.; White, Ralph E.; Dougal, Roger A.

    2001-01-01

    The modelling effort outlined in this viewgraph presentation encompasses the following topics: 1) Electrochemical Deposition of Nickel Hydroxide; 2) Deposition rates of thin films; 3) Impregnation of porous electrodes; 4) Experimental Characterization of Nickel Hydroxide; 5) Diffusion coefficients of protons; 6) Self-discharge rates (i.e., oxygen-evolution kinetics); 7) Hysteresis between charge and discharge; 8) Capacity loss on cycling; 9) Experimental Verification of the Ni/H2 Battery Model; 10) Mathematical Modeling of Li-Ion Batteries; 11) Experimental Verification of the Li-Ion Battery Model; 12) Integrated Power System Models for Satellites; and 13) Experimental Verification of Integrated-Systems Model.

  12. Flat battery

    SciTech Connect

    Buckler, S.A.; Cohen, F.S.; Kennedy, D.P.

    1980-12-30

    A description is given of the method of making a thin flat laminar battery comprising the steps of coating a substrate with a dispersion of zinc powder and water to produce an anode slurry, and thereafter diffusing electrolytes into said anode slurry; and electrical cells and batteries made by this process.

  13. A Battery Health Monitoring Framework for Planetary Rovers

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Kulkarni, Chetan Shrikant

    2014-01-01

    Batteries have seen an increased use in electric ground and air vehicles for commercial, military, and space applications as the primary energy source. An important aspect of using batteries in such contexts is battery health monitoring. Batteries must be carefully monitored such that the battery health can be determined, and end of discharge and end of usable life events may be accurately predicted. For planetary rovers, battery health estimation and prediction is critical to mission planning and decision-making. We develop a model-based approach utilizing computationally efficient and accurate electrochemistry models of batteries. An unscented Kalman filter yields state estimates, which are then used to predict the future behavior of the batteries and, specifically, end of discharge. The prediction algorithm accounts for possible future power demands on the rover batteries in order to provide meaningful results and an accurate representation of prediction uncertainty. The framework is demonstrated on a set of lithium-ion batteries powering a rover at NASA.
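
    A minimal sketch of the prediction step described above, assuming a simple open-circuit-voltage-minus-IR battery model in place of the electrochemistry model and unscented Kalman filter used in the paper. The parameters, the OCV curve, and the Gaussian future-load assumption are illustrative.

      import random

      def predict_eod(soc, capacity_ah, r_ohm, ocv, v_eod=3.0,
                      dt=1.0, n_samples=200, mean_load_a=2.0, sd_load_a=0.5):
          """Monte Carlo prediction of time to end of discharge (seconds).
          Starting from the estimated state of charge, each sample draws a
          random future current demand and simulates a simple OCV-minus-IR
          model forward until the terminal voltage drops below v_eod."""
          times = []
          for _ in range(n_samples):
              s, t = soc, 0.0
              i = max(0.1, random.gauss(mean_load_a, sd_load_a))
              while ocv(s) - i * r_ohm > v_eod and s > 0.0:
                  s -= i * dt / (capacity_ah * 3600.0)
                  t += dt
              times.append(t)
          return sum(times) / len(times), times

      # Illustrative linear OCV curve for a single Li-ion cell.
      ocv = lambda s: 3.2 + 0.9 * s
      mean_t, samples = predict_eod(soc=0.8, capacity_ah=2.2, r_ohm=0.05, ocv=ocv)
      print(mean_t)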

  14. Paintable Battery

    NASA Astrophysics Data System (ADS)

    Singh, Neelam; Galande, Charudatta; Miranda, Andrea; Mathkar, Akshay; Gao, Wei; Reddy, Arava Leela Mohana; Vlad, Alexandru; Ajayan, Pulickel M.

    2012-06-01

    If the components of a battery, including electrodes, separator, electrolyte and the current collectors can be designed as paints and applied sequentially to build a complete battery, on any arbitrary surface, it would have significant impact on the design, implementation and integration of energy storage devices. Here, we establish a paradigm change in battery assembly by fabricating rechargeable Li-ion batteries solely by multi-step spray painting of its components on a variety of materials such as metals, glass, glazed ceramics and flexible polymer substrates. We also demonstrate the possibility of interconnected modular spray painted battery units to be coupled to energy conversion devices such as solar cells, with possibilities of building standalone energy capture-storage hybrid devices in different configurations.

  15. Paintable battery.

    PubMed

    Singh, Neelam; Galande, Charudatta; Miranda, Andrea; Mathkar, Akshay; Gao, Wei; Reddy, Arava Leela Mohana; Vlad, Alexandru; Ajayan, Pulickel M

    2012-01-01

    If the components of a battery, including electrodes, separator, electrolyte and the current collectors can be designed as paints and applied sequentially to build a complete battery, on any arbitrary surface, it would have significant impact on the design, implementation and integration of energy storage devices. Here, we establish a paradigm change in battery assembly by fabricating rechargeable Li-ion batteries solely by multi-step spray painting of its components on a variety of materials such as metals, glass, glazed ceramics and flexible polymer substrates. We also demonstrate the possibility of interconnected modular spray painted battery units to be coupled to energy conversion devices such as solar cells, with possibilities of building standalone energy capture-storage hybrid devices in different configurations. PMID:22745900

  16. Paintable Battery

    PubMed Central

    Singh, Neelam; Galande, Charudatta; Miranda, Andrea; Mathkar, Akshay; Gao, Wei; Reddy, Arava Leela Mohana; Vlad, Alexandru; Ajayan, Pulickel M.

    2012-01-01

    If the components of a battery, including electrodes, separator, electrolyte and the current collectors can be designed as paints and applied sequentially to build a complete battery, on any arbitrary surface, it would have significant impact on the design, implementation and integration of energy storage devices. Here, we establish a paradigm change in battery assembly by fabricating rechargeable Li-ion batteries solely by multi-step spray painting of its components on a variety of materials such as metals, glass, glazed ceramics and flexible polymer substrates. We also demonstrate the possibility of interconnected modular spray painted battery units to be coupled to energy conversion devices such as solar cells, with possibilities of building standalone energy capture-storage hybrid devices in different configurations. PMID:22745900

  17. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic as opposed to low-level boolean equivalence verification such as that done using BDD's and Model Checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  18. Aerospace Nickel-cadmium Cell Verification

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.; Strawn, D. Michael; Hall, Stephen W.

    2001-01-01

    During the early years of satellites, NASA successfully flew "NASA-Standard" nickel-cadmium (Ni-Cd) cells manufactured by GE/Gates/SAFT on a variety of spacecraft. In 1992 a NASA Battery Review Board determined that the strategy of a NASA Standard Cell and Battery Specification and the accompanying NASA control of a standard manufacturing control document (MCD) for Ni-Cd cells and batteries was unwarranted. As a result of that determination, standards were abandoned and the use of cells other than the NASA Standard was required. In order to gain insight into the performance and characteristics of the various aerospace Ni-Cd products available, tasks were initiated within the NASA Aerospace Flight Battery Systems Program that involved the procurement and testing of representative aerospace Ni-Cd cell designs. A standard set of test conditions was established in order to provide similar information about the products from various vendors. The objective of this testing was to provide independent verification of representative commercial flight cells available in the marketplace today. This paper will provide a summary of the verification tests run on cells from various manufacturers: Sanyo 35 Ampere-hour (Ah) standard and 35 Ah advanced Ni-Cd cells, SAFT 50 Ah Ni-Cd cells and Eagle-Picher 21 Ah Magnum and 21 Ah Super Ni-Cd(TM) cells from Eagle-Picher were put through a full evaluation. A limited number of 18 and 55 Ah cells from Acme Electric were also tested to provide an initial evaluation of the Acme aerospace cell designs. Additionally, 35 Ah aerospace design Ni-MH cells from Sanyo were evaluated under the standard conditions established for this program. The test program is essentially complete. The cell design parameters, the verification test plan and the details of the test results will be discussed.

  19. Battery pack

    SciTech Connect

    Weaver, R.J.; Brittingham, D.C.; Basta, J.C.

    1993-07-06

    A battery pack is described, having a center of mass, for use with a medical instrument including a latch, an ejector, and an electrical connector, the battery pack comprising: energy storage means for storing electrical energy; latch engagement means, physically coupled to the energy storage means, for engaging the latch; ejector engagement means, physically coupled to the energy storage means, for engaging the ejector; and connector engagement means, physically coupled to the energy storage means, for engaging the connector, the latch engagement means, ejector engagement means, and connector engagement means being substantially aligned in a plane offset from the center of mass of the battery pack.

  20. Verification of Adaptive Systems

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui; Vassev, Emil; Hinchey, Mike; Rouff, Christopher; Buskens, Richard

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance for them. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  1. Performance of the Lester battery charger in electric vehicles

    NASA Technical Reports Server (NTRS)

    Vivian, H. C.; Bryant, J. A.

    1984-01-01

    Tests are performed on an improved battery charger. The primary purpose of the testing is to develop test methodologies for battery charger evaluation. Tests are developed to characterize the charger in terms of its charge algorithm and to assess the effects of battery initial state of charge and temperature on charger and battery efficiency. Tests show this charger to be a considerable improvement in the state of the art for electric vehicle chargers.

  2. 40 CFR 1065.350 - H2O interference verification for CO2 NDIR analyzers.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... response to CO2. If the NDIR analyzer uses compensation algorithms that utilize measurements of other gases... compensation algorithms during the analyzer interference verification. (c) System requirements. A CO2...

  3. 40 CFR 1065.350 - H2O interference verification for CO2 NDIR analyzers.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... response to CO2. If the NDIR analyzer uses compensation algorithms that utilize measurements of other gases... compensation algorithms during the analyzer interference verification. (c) System requirements. A CO2...

  4. 40 CFR 1065.375 - Interference verification for N2O analyzers.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... certain analyzers by causing a response similar to N2O. If the analyzer uses compensation algorithms that... other measurements to test the compensation algorithms during the analyzer interference verification....

  5. 40 CFR 1065.375 - Interference verification for N2O analyzers.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... certain analyzers by causing a response similar to N2O. If the analyzer uses compensation algorithms that... other measurements to test the compensation algorithms during the analyzer interference verification....

  6. 40 CFR 1065.350 - H2O interference verification for CO2 NDIR analyzers.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... response to CO2. If the NDIR analyzer uses compensation algorithms that utilize measurements of other gases... compensation algorithms during the analyzer interference verification. (c) System requirements. A CO2...

  7. 40 CFR 1065.375 - Interference verification for N2O analyzers.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... certain analyzers by causing a response similar to N2O. If the analyzer uses compensation algorithms that... other measurements to test the compensation algorithms during the analyzer interference verification....

  8. 40 CFR 1065.375 - Interference verification for N2O analyzers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... certain analyzers by causing a response similar to N2O. If the analyzer uses compensation algorithms that... other measurements to test the compensation algorithms during the analyzer interference verification....

  9. 40 CFR 1065.350 - H2O interference verification for CO2 NDIR analyzers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... response to CO2. If the NDIR analyzer uses compensation algorithms that utilize measurements of other gases... compensation algorithms during the analyzer interference verification. (c) System requirements. A CO2...

  10. Bipolar battery

    DOEpatents

    Kaun, Thomas D.

    1992-01-01

    A bipolar battery having a plurality of cells. The bipolar battery includes: a negative electrode; a positive electrode and a separator element disposed between the negative electrode and the positive electrode, the separator element electrically insulating the electrodes from one another; an electrolyte disposed within at least one of the negative electrode, the positive electrode and the separator element; and an electrode containment structure including a cup-like electrode holder.

  11. Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems

    SciTech Connect

    Anderson, S R; Bihari, B L; Salari, K; Woodward, C S

    2006-12-29

    As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.

  12. Reserve battery

    SciTech Connect

    Theiss, G.H.

    1990-05-15

    This patent describes a reserve battery. It comprises: a battery cell compartment defined by housing walls surrounding battery cells and having an open top; a lower bulkhead member spanning the open top of the battery cell compartment and having fill tubes depending from a downwardly facing surface of the lower bulkhead member, one fill tube being provided for each of the battery cells, and each fill tube having internal walls defining a passageway between the interior of the battery cell compartment and an upwardly facing surface of the lower bulkhead member; an upper bulkhead member having a downwardly facing surface opposite and spaced apart from the upwardly facing surface of the lower bulkhead member to form a bulkhead cavity; an elastic reservoir bag in an expanded state containing an electrolyte fluid under pressure and having an opening connected to a passageway to the bulkhead cavity; operable means for sealing the passageway between the reservoir bag opening and the cavity; and housing walls defining a containment for the reservoir bag.

  13. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  14. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  15. Runtime Verification with State Estimation

    NASA Technical Reports Server (NTRS)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

    We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, thus making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM model of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of a state sequence, given an observation sequence) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
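
    For reference, the classic HMM forward algorithm that the paper extends is sketched below; the property-checking extension itself is not reproduced, and the matrices are illustrative.

      import numpy as np

      def forward(pi, A, B, obs):
          """Classic HMM forward algorithm: probability of an observation
          sequence given initial distribution pi, transition matrix A and
          emission matrix B (rows: states, columns: symbols)."""
          alpha = pi * B[:, obs[0]]
          for o in obs[1:]:
              alpha = (alpha @ A) * B[:, o]
          return alpha.sum()

      # Two hidden states, two observable symbols (illustrative numbers).
      pi = np.array([0.6, 0.4])
      A = np.array([[0.7, 0.3],
                    [0.4, 0.6]])
      B = np.array([[0.9, 0.1],
                    [0.2, 0.8]])
      print(forward(pi, A, B, [0, 1, 0]))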

  16. Modular Battery Charge Controller

    NASA Technical Reports Server (NTRS)

    Button, Robert; Gonzalez, Marcelo

    2009-01-01

    A new approach to masterless, distributed, digital-charge control for batteries requiring charge control has been developed and implemented. This approach is required in battery chemistries that need cell-level charge control for safety and is characterized by the use of one controller per cell, resulting in redundant sensors for critical components, such as voltage, temperature, and current. The charge controllers in a given battery interact in a masterless fashion for the purpose of cell balancing, charge control, and state-of-charge estimation. This makes the battery system invariably fault-tolerant. The single-fault failure due to the use of a single charge controller (CC) was solved by implementing one CC per cell and linking them via an isolated communication bus [e.g., controller area network (CAN)] in a masterless fashion so that the failure of one or more CCs will not impact the remaining functional CCs. Each micro-controller-based CC digitizes the cell voltage (V_cell), two cell temperatures, and the voltage across the switch (V); the latter variable is used in conjunction with V_cell to estimate the bypass current for a given bypass resistor. Furthermore, CC1 digitizes the battery current (I1) and battery voltage (V_batt), and CC5 digitizes a second battery current (I2). As a result, redundant readings are taken for temperature, battery current, and battery voltage through the summation of the individual cell voltages, given that each CC knows the voltage of the other cells. For the purpose of cell balancing, each CC periodically and independently transmits its cell voltage and stores the received cell voltage of the other cells in an array. The position in the array depends on the identifier (ID) of the transmitting CC. After eight cell voltage receptions, the array is checked to see if one or more cells did not transmit. If one or more transmissions are missing, the missing cell(s) is (are) eliminated from cell
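
    A minimal sketch of two of the per-cell computations described above: estimating the bypass current from the voltage measured across the bypass switch, and a simple masterless balancing rule based on the cell voltages received over the bus. The resistor value and voltage threshold are illustrative assumptions, not values from the design.

      def bypass_current(v_cell, v_switch, r_bypass=10.0):
          """Estimate the current through the bypass resistor from the cell
          voltage and the voltage measured across the bypass switch
          (resistor value is illustrative)."""
          return (v_cell - v_switch) / r_bypass

      def should_bypass(own_v, all_cell_v, threshold=0.010):
          """Masterless balancing rule: a charge controller bypasses its own
          cell when that cell is more than `threshold` volts above the lowest
          cell voltage it has received from the other controllers."""
          return own_v - min(all_cell_v) > threshold

      cell_voltages = [3.92, 3.95, 3.91, 3.96]     # voltages received from the other CCs
      print(should_bypass(3.96, cell_voltages))     # True: 50 mV above the lowest cell
      print(bypass_current(3.96, 0.2))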

  17. NASA Aerospace Flight Battery Program: Wet Life of Nickel-Hydrogen (Ni-H2) Batteries. Volume 1, Part 3

    NASA Technical Reports Server (NTRS)

    Jung, David S.; Lee, Leonine S.; Manzo, Michelle A.

    2010-01-01

    This NASA Aerospace Flight Battery Systems Working Group was chartered within the NASA Engineering and Safety Center (NESC). The Battery Working Group was tasked to complete tasks and to propose proactive work to address battery related, agency-wide issues on an annual basis. In its first year of operation, this proactive program addressed various aspects of the validation and verification of aerospace battery systems for NASA missions. Studies were performed, issues were discussed and in many cases, test programs were executed to generate recommendations and guidelines to reduce risk associated with various aspects of implementing battery technology in the aerospace industry. This document contains Part 3 - Volume I: Wet Life of Nickel-Hydrogen (Ni-H2) Batteries of the program's operations.

  18. NASA Aerospace Flight Battery Program: Recommendations for Technical Requirements for Inclusion in Aerospace Battery Procurements. Volume 1, Part 2

    NASA Technical Reports Server (NTRS)

    Jung, David S.; Manzo, Michelle A.

    2010-01-01

    This NASA Aerospace Flight Battery Systems Working Group was chartered within the NASA Engineering and Safety Center (NESC). The Battery Working Group was tasked to complete tasks and to propose proactive work to address battery related, agency-wide issues on an annual basis. In its first year of operation, this proactive program addressed various aspects of the validation and verification of aerospace battery systems for NASA missions. Studies were performed, issues were discussed and in many cases, test programs were executed to generate recommendations and guidelines to reduce risk associated with various aspects of implementing battery technology in the aerospace industry. This document contains Part 2 - Volume I: Recommendations for Technical Requirements for Inclusion in Aerospace Battery Procurements of the program's operations.

  19. Battery housing

    SciTech Connect

    Skinner, N. G.

    1985-03-19

    The present invention comprises a battery housing suitable for holding a battery which may generate a dangerously high level of internal pressure. The housing includes a receptacle having a vent passage covered by a rupture disc, the rupture disc in turn covered by a diffuser head having a longitudinal bore therein extending from the rupture disc to a blind end, the bore being traversed by at least one lateral passage leading to the exterior of the housing. Upon reaching a predetermined internal pressure level, the rupture disc ruptures and vents the interior of the housing safely to the exterior through the lateral passage.

  20. RADIOACTIVE BATTERY

    DOEpatents

    Birden, J.H.; Jordan, K.C.

    1959-11-17

    A radioactive battery which includes a capsule containing the active material and a thermopile associated therewith is presented. The capsule is both a shield to stop the radiation and thereby make the battery safe to use, and an energy converter. The intense radioactive decay taking place inside is converted to useful heat at the capsule surface. The heat is conducted to the hot thermojunctions of a thermopile. The cold junctions of the thermopile are thermally insulated from the heat source, so that a temperature difference occurs between the hot and cold junctions, causing an electrical current of a constant magnitude to flow.

  1. Robust recursive impedance estimation for automotive lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Fridholm, Björn; Wik, Torsten; Nilsson, Magnus

    2016-02-01

    Recursive algorithms, such as recursive least squares (RLS) or Kalman filters, are commonly used in battery management systems to estimate the electrical impedance of the battery cell. However, these algorithms can in some cases run into problems with bias and even divergence of the estimates. This article illuminates problems that can arise in online estimation using recursive methods, and lists modifications to handle these issues. An algorithm is also proposed that estimates the impedance by separating the problem into two parts; one estimating the ohmic resistance with an RLS approach, and another where the dynamic effects are estimated using an adaptive Kalman filter (AKF) that is novel in the battery field. The algorithm produces robust estimates of the ohmic resistance and time constant of the battery cell in closed loop with SoC estimation, as demonstrated both in simulations and with experimental data from a lithium-ion battery cell.
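
    A minimal sketch of the RLS part only, estimating the ohmic resistance from differenced voltage and current samples (dV ≈ -R·dI); the adaptive Kalman filter for the dynamic part is not reproduced, and the forgetting factor and synthetic data are illustrative.

      def rls_resistance(voltage, current, lam=0.99, r0=0.01, p0=1.0):
          """Recursive least squares on differenced samples, dV = -R * dI.
          Returns the resistance estimate after each sample; `lam` is the
          forgetting factor."""
          r, p = r0, p0
          estimates = []
          for k in range(1, len(voltage)):
              dv = voltage[k] - voltage[k - 1]
              di = current[k] - current[k - 1]
              gain = p * di / (lam + di * p * di)
              r += gain * (-dv - r * di)      # innovation on -dV = R * dI
              p = (p - gain * di * p) / lam
              estimates.append(r)
          return estimates

      # Synthetic data: a 50 mOhm cell, current stepping between 0 A and 5 A.
      true_r, ocv = 0.05, 3.7
      current = [0, 5, 5, 0, 5, 0, 5, 5, 0, 5]
      voltage = [ocv - true_r * i for i in current]
      print(rls_resistance(voltage, current)[-1])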

  2. Digital Batteries

    NASA Astrophysics Data System (ADS)

    Hubler, Alfred

    2009-03-01

    The energy density in conventional capacitors is limited by sparking. We present nano-capacitor arrays, where - like in laser diodes and quantum wells [1] - quantization prevents dielectric breakthrough. We show that the energy density and the power/weight ratio are very high, possibly larger than in hydrogen [2]. Digital batteries are a potential clean energy source for cars, laptops, and mobile devices. The technology is related to flash drives. However, because of the high energy density, safety is a concern. Digital batteries can be easily and safely charged and discharged. In the discharged state they pose no danger. Even if a charged digital battery were to explode, it would produce no radioactive waste, no long-term radiation, and probably could be designed to produce no noxious chemicals. We discuss methodologies to prevent shorts and other measures to make digital batteries safe. [1] H. Higuraskh, A. Toriumi, F. Yamaguchi, K. Kawamura, A. Hubler, Correlation Tunnel Device, U. S. Patent No. 5,679,961 (1997) [2] Alfred Hubler, http://server10.how-why.com/blog/

  3. Testing activities at the National Battery Test Laboratory

    NASA Astrophysics Data System (ADS)

    Hornstra, F.; Deluca, W. H.; Mulcahey, T. P.

    The National Battery Test Laboratory (NBTL) is an Argonne National Laboratory facility for testing, evaluating, and studying advanced electric storage batteries. The facility tests batteries developed under Department of Energy programs and from private industry. These include batteries intended for future electric vehicle (EV) propulsion, electric utility load leveling (LL), and solar energy storage. Since becoming operational, the NBTL has evaluated well over 1400 cells (generally in the form of three- to six-cell modules, but up to 140-cell batteries) of various technologies. Performance characterization assessments are conducted under a series of charge/discharge cycles with constant current, constant power, peak power, and computer simulated dynamic load profile conditions. Flexible charging algorithms are provided to accommodate the specific needs of each battery under test. Special studies are conducted to explore and optimize charge procedures, to investigate the impact of unique load demands on battery performance, and to analyze the thermal management requirements of battery systems.
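
    One of the test modes listed above, constant-power discharge, amounts to recomputing the current command from the measured terminal voltage at every sample. A minimal sketch, with an illustrative current limit:

      def constant_power_current(p_set_w, v_measured, i_max=400.0):
          """Current command for one sample of a constant-power discharge
          step: I = P / V, clamped to the tester's current limit
          (limit value is illustrative)."""
          return min(p_set_w / max(v_measured, 1e-6), i_max)

      # As a pack discharges and its voltage sags, the current rises to hold power.
      for v in (180.0, 170.0, 160.0, 150.0):
          print(v, round(constant_power_current(25000.0, v), 1))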

  4. Controllers for Battery Chargers and Battery Chargers Therefrom

    NASA Technical Reports Server (NTRS)

    Elmes, John (Inventor); Kersten, Rene (Inventor); Pepper, Michael (Inventor)

    2014-01-01

    A controller for a battery charger that includes a power converter has parametric sensors for providing a sensed Vin signal, a sensed Vout signal and a sensed Iout signal. A battery current regulator (BCR) is coupled to receive the sensed Iout signal and an Iout reference, and outputs a first duty cycle control signal. An input voltage regulator (IVR) receives the sensed Vin signal and a Vin reference. The IVR provides a second duty cycle control signal. A processor receives the sensed Iout signal and utilizes a Maximum Power Point Tracking (MPPT) algorithm, and provides the Vin reference to the IVR. A selection block forwards one of the first and second duty cycle control signals as a duty cycle control signal to the power converter. Dynamic switching between the first and second duty cycle control signals maximizes the power delivered to the battery.
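
    A minimal control-loop sketch of the selection idea described above follows; the PI regulators, their gains, and the rule of forwarding the smaller of the two duty commands are illustrative assumptions for a solar-fed charger, not the patented controller itself.

      class PI:
          """Simple proportional-integral regulator (gains are illustrative)."""
          def __init__(self, kp, ki, dt):
              self.kp, self.ki, self.dt, self.integ = kp, ki, dt, 0.0

          def update(self, error):
              self.integ += error * self.dt
              return self.kp * error + self.ki * self.integ

      def charger_step(v_in, i_out, v_in_ref, i_out_ref, bcr, ivr):
          """One control step: compute both duty commands and forward one of them.

          bcr regulates the battery current toward i_out_ref; ivr regulates the
          converter input voltage toward the MPPT-supplied v_in_ref. Forwarding
          the smaller command is an assumed selection rule that keeps both the
          battery-current limit and the input-voltage reference respected.
          """
          d_bcr = bcr.update(i_out_ref - i_out)              # battery current regulator
          d_ivr = ivr.update(v_in - v_in_ref)                # input voltage regulator
          return min(max(min(d_bcr, d_ivr), 0.0), 0.95)      # select and clamp duty cycle

      # Example wiring; all numbers are placeholders, not values from the patent
      bcr = PI(kp=0.05, ki=1.0, dt=1e-3)
      ivr = PI(kp=0.02, ki=0.5, dt=1e-3)
      print(charger_step(v_in=17.8, i_out=4.8, v_in_ref=17.5, i_out_ref=5.0, bcr=bcr, ivr=ivr))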

  5. Algorithms and Algorithmic Languages.

    ERIC Educational Resources Information Center

    Veselov, V. M.; Koprov, V. M.

    This paper is intended as an introduction to a number of problems connected with the description of algorithms and algorithmic languages, particularly the syntaxes and semantics of algorithmic languages. The terms "letter, word, alphabet" are defined and described. The concept of the algorithm is defined and the relation between the algorithm and…

  6. Batteries: Overview of Battery Cathodes

    SciTech Connect

    Doeff, Marca M

    2010-07-12

    The very high theoretical capacity of lithium (3829 mAh/g) provided a compelling rationale from the 1970's onward for development of rechargeable batteries employing the elemental metal as an anode. The realization that some transition metal compounds undergo reductive lithium intercalation reactions reversibly allowed use of these materials as cathodes in these devices, most notably, TiS{sub 2}. Another intercalation compound, LiCoO{sub 2}, was described shortly thereafter but, because it was produced in the discharged state, was not considered to be of interest by battery companies at the time. Due to difficulties with the rechargeability of lithium and related safety concerns, however, alternative anodes were sought. The graphite intercalation compound (GIC) LiC{sub 6} was considered an attractive candidate but the high reactivity with commonly used electrolytic solutions containing organic solvents was recognized as a significant impediment to its use. The development of electrolytes that allowed the formation of a solid electrolyte interface (SEI) on surfaces of the carbon particles was a breakthrough that enabled commercialization of Li-ion batteries. In 1990, Sony announced the first commercial batteries based on a dual Li ion intercalation system. These devices are assembled in the discharged state, so that it is convenient to employ a prelithiated cathode such as LiCoO{sub 2} with the commonly used graphite anode. After charging, the batteries are ready to power devices. The practical realization of high energy density Li-ion batteries revolutionized the portable electronics industry, as evidenced by the widespread market penetration of mobile phones, laptop computers, digital music players, and other lightweight devices since the early 1990s. In 2009, worldwide sales of Li-ion batteries for these applications alone were US$ 7 billion. Furthermore, their performance characteristics (Figure 1) make them attractive for traction applications such as hybrid

  7. Performance of the Lester battery charger in electric vehicles

    SciTech Connect

    Vivian, H.C.; Bryant, J.A.

    1984-04-15

    Tests were performed on an improved battery charger manufactured by Lester Electrical of Nebraska, Inc. This charger was installed in a South Coast Technology Rabbit No. 4, which was equipped with lead-acid batteries produced by ESB Company. The primary purpose of the testing was to develop test methodologies for battery charger evaluation. To this end tests were developed to characterize the charger in terms of its charge algorithm and to assess the effects of battery initial state of charge and temperature on charger and battery efficiency. Tests showed this charger to be a considerable improvement in the state of the art for electric vehicle chargers.

  8. Secure optical verification using dual phase-only correlation

    NASA Astrophysics Data System (ADS)

    Liu, Wei; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun; Liu, Shutian

    2015-02-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminative auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as the necessary numerical simulations are carried out to support the proposed verification method.
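
    For readers unfamiliar with phase-only correlation itself, the NumPy sketch below computes a basic single-stage correlation peak between an identity key and a shifted copy; the dual-correlation structure, nonlinear encoding, and threshold design of the paper are not reproduced, and the small regularization constant is an assumption.

      import numpy as np

      def phase_only_correlation(f, g):
          """Return the phase-only correlation surface of two equally sized arrays.

          Only the spectral phase is kept, which yields a sharp peak at the relative
          shift when f and g match. This is plain single-stage POC, not the paper's
          dual-correlation verification scheme.
          """
          F = np.fft.fft2(f)
          G = np.fft.fft2(g)
          cross = np.conj(F) * G
          cross /= np.abs(cross) + 1e-12          # discard magnitude, keep phase only
          return np.real(np.fft.ifft2(cross))

      rng = np.random.default_rng(0)
      key = rng.random((64, 64))
      probe = np.roll(key, shift=(3, 5), axis=(0, 1))   # same key, translated
      poc = phase_only_correlation(key, probe)
      print(poc.max(), np.unravel_index(poc.argmax(), poc.shape))  # sharp peak at (3, 5)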

  9. Systematic study of source mask optimization and verification flows

    NASA Astrophysics Data System (ADS)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique, and a systematic study of the different flows and their possible unification is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonality and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with quality metrics of DOF and MEEF is examined.

  10. Metal-Air Batteries

    SciTech Connect

    Zhang, Jiguang; Bruce, Peter G.; Zhang, Gregory

    2011-08-01

    Metal-air batteries have much higher specific energies than most currently available primary and rechargeable batteries. Recent advances in electrode materials and electrolytes, as well as new designs of metal-air batteries, have attracted intensive effort, especially in the development of lithium-air batteries. The general principles of metal-air batteries will be reviewed in this chapter. The materials, preparation methods, and performance of metal-air batteries will be discussed. Two main types of metal-air batteries, Zn-air and Li-air, will be discussed in detail. Other types of metal-air batteries will also be described.

  11. Batteries for Electric Vehicles

    NASA Technical Reports Server (NTRS)

    Conover, R. A.

    1985-01-01

    Report summarizes results of tests on "near-term" electrochemical batteries (batteries approaching commercial production). Nickel/iron, nickel/zinc, and advanced lead/acid batteries included in tests and compared with conventional lead/acid batteries. Batteries operated in electric vehicles at constant speed and on a repetitive schedule of accelerating, coasting, and braking.

  12. Battery Safety Basics

    ERIC Educational Resources Information Center

    Roy, Ken

    2010-01-01

    Batteries commonly used in flashlights and other household devices produce hydrogen gas as a product of zinc electrode corrosion. The amount of gas produced is affected by the batteries' design and charge rate. Dangerous levels of hydrogen gas can be released if battery types are mixed, batteries are damaged, batteries are of different ages, or…

  13. Automated synthesis and verification of configurable DRAM blocks for ASIC's

    NASA Technical Reports Server (NTRS)

    Pakkurti, M.; Eldin, A. G.; Kwatra, S. C.; Jamali, M.

    1993-01-01

    A highly flexible embedded DRAM compiler is developed which can generate DRAM blocks in the range of 256 bits to 256 Kbits. The compiler is capable of automatically verifying the functionality of the generated DRAM modules. The fully automated verification capability is a key feature that ensures the reliability of the generated blocks. The compiler's architecture, algorithms, verification techniques and the implementation methodology are presented.

  14. Battery separator

    SciTech Connect

    Balouskus, R.A.; Feinberg, S.C.; Lundquist, J.T.; Lundsager, C.B.

    1980-09-23

    A battery separator and a method of forming the same is described. The separator has good electrical conductivity and a high degree of inhibition to dendrite formation, and is in the form of a thin sheet formed from a substantially uniform mixture of a thermoplastic rubber and a filler in a volume ratio of from about 1:0.15 to 1:0.6. The thermoplastic rubber is preferably a styrene/elastomer/styrene block copolymer.

  15. Speaker Verification Using Subword Neural Tree Networks.

    NASA Astrophysics Data System (ADS)

    Liou, Han-Sheng

    1995-01-01

    In this dissertation, a new neural-network-based algorithm for text-dependent speaker verification is presented. The algorithm uses a set of concatenated Neural Tree Networks (NTNs) trained on subword units to model a password. In contrast to conventional stochastic approaches, which model the subword units by Hidden Markov Models (HMMs), the new approach utilizes a discriminative training scheme to train an NTN for each subword unit. Two types of subword units are investigated: phone-like units (PLUs) and HMM state-based units (HSUs). The training of the models includes the following steps. The training utterances of a password are first segmented into subword units using an HMM-based segmentation method. An NTN is then trained for each subword unit. In order to capture the temporal information, which is relatively important in text-dependent speaker verification, the proposed paradigm integrates the discriminatory ability of the NTN with the temporal models of the HMM. A new scoring method using phonetic weighting to improve speaker verification performance is also introduced. The proposed algorithms are evaluated by experiments on a TI isolated-word database, the YOHO database, and several hundred utterances collected over a telephone channel. Performance improvements are obtained over conventional techniques.

  16. Multiple duty battery

    SciTech Connect

    Cohen, F.S.; Hyland, A.L.

    1980-05-20

    A laminar battery capable of providing multiple currents and capacities at different voltages is described in which electrical access is provided to intermediate cells in the battery by conductive metal terminal layers incorporated in the structure of the battery.

  17. 9-Volt Battery Safety

    MedlinePlus

    ... and negative posts are close together. If a metal object touches the two posts of a 9- ... 9-volt batteries were thrown away with other metal items. Storing 9-volt batteries ... Keep batteries ...

  18. Bipolar-Battery Construction

    NASA Technical Reports Server (NTRS)

    Rippel, Wally E.; Edwards, Dean B.

    1988-01-01

    Bipolar batteries fabricated in continuous quasi-automated process. Components of battery configured so processing steps run sequentially. Key components of battery, bipolar plate and bipolar separator, fabricated separately and later joined together.

  19. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out; once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit can be accomplished without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  20. Verification of RADTRAN

    SciTech Connect

    Kanipe, F.L.; Neuhauser, K.S.

    1995-12-31

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes.

  1. NASA Aerospace Flight Battery Program: Recommendations for Technical Requirements for Inclusion in Aerospace Battery Procurements. Volume 2/Part 2

    NASA Technical Reports Server (NTRS)

    Jung, David S.; Manzo, Michelle A.

    2010-01-01

    This NASA Aerospace Flight Battery Systems Working Group was chartered within the NASA Engineering and Safety Center (NESC). The Battery Working Group was tasked to complete tasks and to propose proactive work to address battery related, agency-wide issues on an annual basis. In its first year of operation, this proactive program addressed various aspects of the validation and verification of aerospace battery systems for NASA missions. Studies were performed, issues were discussed and in many cases, test programs were executed to generate recommendations and guidelines to reduce risk associated with various aspects of implementing battery technology in the aerospace industry. This document contains Part 2 - Volume II Appendix A to Part 2 - Volume I.

  2. Battery cell feedthrough apparatus

    DOEpatents

    Kaun, Thomas D.

    1995-01-01

    A compact, hermetic feedthrough apparatus comprising interfitting sleeve portions constructed of chemically-stable materials to permit unique battery designs and increase battery life and performance.

  3. Implementing efficient dynamic formal verification methods for MPI programs.

    SciTech Connect

    Vakkalanka, S.; DeLisi, M.; Gopalakrishnan, G.; Kirby, R. M.; Thakur, R.; Gropp, W.; Mathematics and Computer Science; Univ. of Utah; Univ. of Illinois

    2008-01-01

    We examine the problem of formally verifying MPI programs for safety properties through an efficient dynamic (runtime) method in which the processes of a given MPI program are executed under the control of an interleaving scheduler. To ensure full coverage for given input test data, the algorithm must take into consideration MPI's out-of-order completion semantics. The algorithm must also ensure that nondeterministic constructs (e.g., MPI wildcard receive matches) are executed in all possible ways. Our new algorithm rewrites wildcard receives to specific receives, one for each sender that can potentially match with the receive. It then recursively explores each case of the specific receives. The list of potential senders matching a receive is determined through a runtime algorithm that exploits MPI's operation ordering semantics. Our verification tool ISP that incorporates this algorithm efficiently verifies several programs and finds bugs missed by existing informal verification tools.
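
    As a toy illustration of the matching-exploration idea only (not the ISP tool or real MPI semantics), the sketch below recursively enumerates every way a sequence of wildcard receives could be matched against its potential senders; the data structures and names are invented for the example.

      def explore(receives, pending_sends, schedule=()):
          """Recursively explore every legal matching of wildcard receives to sends.

          receives is a list of sets, each holding the ranks whose sends could match
          that wildcard receive; pending_sends maps a rank to its number of unmatched
          sends. Each complete assignment corresponds to one interleaving to verify.
          """
          if not receives:
              print("schedule:", schedule)            # one complete matching explored
              return 1
          first, rest = receives[0], receives[1:]
          count = 0
          for rank in sorted(first):
              if pending_sends.get(rank, 0) > 0:      # this sender can still match
                  pending_sends[rank] -= 1            # commit the specific receive
                  count += explore(rest, pending_sends, schedule + (rank,))
                  pending_sends[rank] += 1            # backtrack, try the next sender
          return count

      # Two wildcard receives; ranks 1 and 2 each have one pending send
      n = explore([{1, 2}, {1, 2}], {1: 1, 2: 1})
      print("matchings explored:", n)                 # 2 distinct matchings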

  4. Verification of a Viscous Computational Aeroacoustics Code Using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.

  5. Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Hixon, Ray

    2015-01-01

    The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.

  6. Goddard high resolution spectrograph science verification and data analysis

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The data analysis performed to support the Orbital Verification (OV) and Science Verification (SV) of the GHRS was in the areas of the Digicon detector's performance and stability, wavelength calibration, and geomagnetically induced image motion. The results of the analyses are briefly described; detailed results are given in the form of attachments. Specialized software was developed for the analyses. Calibration files were formatted according to the specifications in a Space Telescope Science report. IRAS images of the Large Magellanic Cloud were restored using a blocked iterative algorithm. The algorithm works with the raw data scans without regridding or interpolating the data onto an equally spaced image grid.

  7. Prognostics of Lithium-Ion Batteries Based on Wavelet Denoising and DE-RVM

    PubMed Central

    Zhang, Chaolong; He, Yigang; Yuan, Lifeng; Xiang, Sheng; Wang, Jinping

    2015-01-01

    Lithium-ion batteries are widely used in many electronic systems. It is therefore important, yet very difficult, to estimate a lithium-ion battery's remaining useful life (RUL). One important reason is that the measured battery capacity data are often subject to different levels of noise pollution. In this paper, a novel battery capacity prognostics approach is presented to estimate the RUL of lithium-ion batteries. Wavelet denoising is performed with different thresholds in order to weaken the strong noise and remove the weak noise. A relevance vector machine (RVM) improved by a differential evolution (DE) algorithm is utilized to estimate the battery RUL based on the denoised data. An experiment including a battery No. 5 capacity prognostics case and a battery No. 18 capacity prognostics case is conducted and validates that the proposed approach can predict the trend of the battery capacity trajectory closely and estimate the battery RUL accurately. PMID:26413090

  8. Prognostics of Lithium-Ion Batteries Based on Wavelet Denoising and DE-RVM.

    PubMed

    Zhang, Chaolong; He, Yigang; Yuan, Lifeng; Xiang, Sheng; Wang, Jinping

    2015-01-01

    Lithium-ion batteries are widely used in many electronic systems. It is therefore important, yet very difficult, to estimate a lithium-ion battery's remaining useful life (RUL). One important reason is that the measured battery capacity data are often subject to different levels of noise pollution. In this paper, a novel battery capacity prognostics approach is presented to estimate the RUL of lithium-ion batteries. Wavelet denoising is performed with different thresholds in order to weaken the strong noise and remove the weak noise. A relevance vector machine (RVM) improved by a differential evolution (DE) algorithm is utilized to estimate the battery RUL based on the denoised data. An experiment including a battery No. 5 capacity prognostics case and a battery No. 18 capacity prognostics case is conducted and validates that the proposed approach can predict the trend of the battery capacity trajectory closely and estimate the battery RUL accurately. PMID:26413090
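
    As a rough illustration of the wavelet-denoising preprocessing described above, the sketch below cleans a noisy capacity sequence with PyWavelets and then extrapolates the fade trend to a failure threshold with a simple polynomial fit; the polynomial stands in for the paper's DE-tuned relevance vector machine, and the wavelet, threshold rule, and failure level are assumed values.

      import numpy as np
      import pywt  # PyWavelets

      def denoise_capacity(capacity, wavelet="db4", level=3):
          """Soft-threshold the detail coefficients of a capacity sequence."""
          coeffs = pywt.wavedec(capacity, wavelet, level=level)
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate
          thr = sigma * np.sqrt(2 * np.log(len(capacity)))      # universal threshold
          coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
          return pywt.waverec(coeffs, wavelet)[: len(capacity)]

      def estimate_rul(capacity, fail_level, order=2):
          """Extrapolate the denoised fade trend to the failure threshold.

          A polynomial fit is used purely as a placeholder for the paper's
          DE-optimized RVM predictor.
          """
          cycles = np.arange(len(capacity))
          poly = np.poly1d(np.polyfit(cycles, capacity, order))
          k = len(capacity)
          while poly(k) > fail_level and k < 10 * len(capacity):
              k += 1
          return k - len(capacity)            # remaining cycles until predicted failure

      rng = np.random.default_rng(1)
      true_cap = 2.0 - 0.003 * np.arange(120) - 5e-6 * np.arange(120) ** 2
      noisy_cap = true_cap + 0.02 * rng.standard_normal(120)
      print("estimated RUL (cycles):", estimate_rul(denoise_capacity(noisy_cap), fail_level=1.4))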

  9. Piezonuclear battery

    DOEpatents

    Bongianni, Wayne L.

    1992-01-01

    A piezonuclear battery generates output power arising from the piezoelectric voltage produced from radioactive decay particles interacting with a piezoelectric medium. Radioactive particle energy may directly create an acoustic wave in the piezoelectric medium or a moderator may be used to generate collision particles for interacting with the medium. In one embodiment a radioactive material (.sup.252 Cf) with an output of about 1 microwatt produced a 12 nanowatt output (1.2% conversion efficiency) from a piezoelectric copolymer of vinylidene fluoride/trifluorethylene.

  10. Galileo Probe Battery System

    NASA Technical Reports Server (NTRS)

    Dagarin, B. P.; Taenaka, R. K.; Stofel, E. J.

    1997-01-01

    The conclusions of the Galileo probe battery system are: the battery performance met mission requirements with margin; extensive ground-based and flight tests of batteries prior to probe separation from orbiter provided good prediction of actual entry performance at Jupiter; and the Li-SO2 battery was an important choice for the probe's main power.

  11. Verification of the time evolution of cosmological simulations via hypothesis-driven comparative and quantitative visualization

    SciTech Connect

    Hsu, Chung-hsing; Ahrens, James P; Heitmann, Katrin

    2009-01-01

    We describe a visualization assisted process for the verification of cosmological simulation codes. The need for code verification stems from the requirement for very accurate predictions in order to interpret observational data confidently. We compare different simulation algorithms in order to reliably predict differences in simulation results and understand their dependence on input parameter settings.

  12. Galileo probe battery system -- An update

    SciTech Connect

    Dagarin, B.P.; Taenaka, R.K.; Stofel, E.J.

    1996-11-01

    NASA's Galileo 6-year trip to Jupiter is in its final phase. The mission consists of a Jovian Orbiter and an atmospheric entry Probe. The Probe is designed to coast autonomously for up to 190 days and turn itself on 6 hours prior to entry. It will then descend through the upper atmosphere for 50 to 75 minutes with the aid of an 8-foot parachute. This paper discusses sources of electrical power for the Probe and battery testing at the systems level. Described are the final production phase, qualification, and systems testing prior to and following launch, as well as decisions made regarding the Probe separation Li/SO{sub 2} battery configuration. In addition, the paper briefly describes the thermal battery verification program. The main power source comprises three Li/SO{sub 2} battery modules containing 13 D-sized cell strings per module. These modules are required to retain capacity for 7.5 years and support a 150-day clock, ending with a 7-hour mission sequence of increasing loads from 0.15 A to 9.5 A during the last 30 minutes. The main power source is supplemented by two thermal batteries (CaCrO{sub 4}-Ca), which will be used for firing the pyrotechnic initiators during the atmospheric entry.

  13. Lithium Ion Batteries

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Lithium ion batteries, which use a new battery chemistry, are being developed under cooperative agreements between Lockheed Martin, Ultralife Battery, and the NASA Lewis Research Center. The unit cells are made in flat (prismatic) shapes that can be connected in series and parallel to achieve desired voltages and capacities. These batteries will soon be marketed to commercial original-equipment manufacturers and thereafter will be available for military and space use. Current NiCd batteries offer about 35 W-hr/kg compared with 110 W-hr/kg for current lithium ion batteries. Our ultimate target for these batteries is 200 W-hr/kg.

  14. Alkaline battery operational methodology

    DOEpatents

    Sholklapper, Tal; Gallaway, Joshua; Steingart, Daniel; Ingale, Nilesh; Nyce, Michael

    2016-08-16

    Methods of using specific operational charge and discharge parameters to extend the life of alkaline batteries are disclosed. The methods can be used with any commercial primary or secondary alkaline battery, as well as with newer alkaline battery designs, including batteries with flowing electrolyte. The methods include cycling batteries within a narrow operating voltage window, with minimum and maximum cut-off voltages that are set based on battery characteristics and environmental conditions. The narrow voltage window decreases available capacity but allows the batteries to be cycled for hundreds or thousands of times.
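
    A minimal simulation sketch of the voltage-window cycling idea follows; the cutoff voltages, currents, and the toy cell model are invented placeholders, since the patent ties the actual limits to battery characteristics and environmental conditions.

      class ToyCell:
          """Crude cell stand-in: voltage rises or falls linearly with charge throughput."""
          def __init__(self, voltage=1.2):
              self.voltage = voltage

          def step(self, current_a, dt_s=60.0):
              self.voltage += 0.0005 * current_a * dt_s   # invented coefficient

      def cycle_within_window(cell, v_min=0.9, v_max=1.55, i_chg=0.5, i_dis=-0.5):
          """Run one charge/discharge cycle bounded by a narrow voltage window.

          The cutoffs and currents here are illustrative only; the patent derives
          them from battery characteristics and environmental conditions.
          """
          steps = 0
          while cell.voltage < v_max:          # charge until the upper cutoff
              cell.step(i_chg)
              steps += 1
          while cell.voltage > v_min:          # discharge until the lower cutoff
              cell.step(i_dis)
              steps += 1
          return steps

      print("steps in one windowed cycle:", cycle_within_window(ToyCell()))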

  15. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  16. Wind gust warning verification

    NASA Astrophysics Data System (ADS)

    Primo, Cristina

    2016-07-01

    Operational meteorological centres around the world increasingly include warnings among their regular forecast products. Warnings are issued to alert the public about extreme weather situations that might occur, leading to damage and losses. In forecasting these extreme events, meteorological centres help their potential users prevent the damage or losses they might suffer. However, verifying these warnings requires specific methods, not only because such events happen rarely, but also because a new temporal dimension is added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that might appear when dealing with warning verification and proposes some new verification approaches that can be applied to wind warnings. These new techniques are then applied to a real-life example, the verification of wind gust warnings at the German Meteorological Centre ("Deutscher Wetterdienst"). Finally, the results obtained from the latter are discussed.

  17. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Deney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.

  18. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  19. TFE verification program

    NASA Astrophysics Data System (ADS)

    1994-01-01

    This is the final semiannual progress report for the Thermionic Fuel Element (TFE) Verification Program. A decision was made in August 1993 to begin a Close Out Program on October 1, 1993. Final reports summarizing the design analyses and test activities of the TFE Verification Program will be written as stand-alone documents for each task. The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein includes evaluated test data, design evaluations, the results of analyses, and the significance of the results.

  20. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification program.

  1. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  2. Battery separator

    SciTech Connect

    Giovannoni, R.T.; Kung, J.K.J.; Choi, W.M.

    1987-10-13

    This patent describes a battery system composed of at least one pair of electrodes of opposite polarity, an electrolyte and a separator positioned between electrodes of opposite polarity. The improvement comprises that the separator is a microporous sheet composed of a substantially uniform composition of A. from 7 to 50 weight percent of a polymer mixture, the mixture formed from (a) from about 95 to about 40 weight percent of polyolefin formed from ethylene, propylene or mixtures thereof or a mixture of the polyolefins having a weight average molecular weight of at least about 3,000,000; and (b) from about 5 to about 60 weight percent of a polymeric blend formed from a polyethylene terpolymer and a vinyl or vinylidene halide polymer in a weight ratio of 19:1 to 1:3, the polyethylene terpolymer formed from (1) ethylene monomer, (2) at least one ethylenically unsaturated organic monomer selected from the group consisting of esters of unsaturated C/sub 3/-C/sub 20/ mono- or dicarboxylic acids, vinyl esters of saturated C/sub 2/-C/sub 18/ carboxylic acids, vinyl alkyl ethers wherein the alkyl group has 1-18 carbon atoms, vinyl or vinylidene halides, acrylonitrile, methacrylonitrile, norbornene, alpha-olefins of 3-12 carbon atoms, and vinyl aromatic compounds, and, (3) an additional monomer selected from the group consisting of ethylenically unsaturated C/sub 3/-C/sub 20/ carboxylic acids, carbon monoxide, and sulfur dioxide; B. from 93 to 50 weight percent of a filler which is substantially inert with respect to the battery electrodes and electrolyte; and C. from 0 to 20 weight percent of plasticizer for at least one of the polymers of the composition.

  3. Electrochemistry-based Battery Modeling for Prognostics

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Kulkarni, Chetan Shrikant

    2013-01-01

    Batteries are used in a wide variety of applications. In recent years, they have become popular as a source of power for electric vehicles such as cars, unmanned aerial vehicles, and commercial passenger aircraft. In such application domains, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. To implement such technologies, it is crucial to understand how batteries work and to capture that knowledge in the form of models that can be used by monitoring, diagnosis, and prognosis algorithms. In this work, we develop electrochemistry-based models of lithium-ion batteries that capture the significant electrochemical processes, are computationally efficient, capture the effects of aging, and are of suitable accuracy for reliable EOD prediction in a variety of usage profiles. This paper reports on the progress of such a model, with results demonstrating the model validity and accurate EOD predictions.
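
    The sketch below illustrates only the end-of-discharge prediction step, using a deliberately simplified equivalent-circuit surrogate (constant internal resistance and a linear OCV-SOC curve) rather than the electrochemistry-based model developed in the paper; every parameter value is invented.

      def predict_eod(soc, capacity_ah, load_a, dt_s=1.0, r_int=0.05,
                      v_cutoff=3.0, ocv=lambda s: 3.0 + 1.2 * s):
          """Simulate a constant-current discharge until the cutoff voltage.

          Returns the predicted time to end of discharge in seconds. The linear
          OCV curve, internal resistance, and cutoff are placeholders for the
          paper's electrochemistry-based model.
          """
          t = 0.0
          while soc > 0.0:
              v = ocv(soc) - r_int * load_a                    # terminal voltage under load
              if v <= v_cutoff:
                  return t                                     # EOD reached at the cutoff
              soc -= load_a * dt_s / (3600.0 * capacity_ah)    # Coulomb counting
              t += dt_s
          return t                                             # depleted before the cutoff

      # Example: a 2.2 Ah cell at 80% SOC under a 2 A load
      print("predicted time to EOD (s):", predict_eod(soc=0.8, capacity_ah=2.2, load_a=2.0))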

  4. Sensor-fusion-based biometric identity verification

    SciTech Connect

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.

  5. Context Effects in Sentence Verification.

    ERIC Educational Resources Information Center

    Kiger, John I.; Glass, Arnold L.

    1981-01-01

    Three experiments examined what happens to reaction time to verify easy items when they are mixed with difficult items in a verification task. Subjects' verification of simple arithmetic equations and sentences took longer when placed in a difficult list. Difficult sentences also slowed the verification of easy arithmetic equations. (Author/RD)

  6. Model-based condition monitoring for lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Kim, Taesic; Wang, Yebin; Fang, Huazhen; Sahinoglu, Zafer; Wada, Toshihiro; Hara, Satoshi; Qiao, Wei

    2015-11-01

    Condition monitoring for batteries involves tracking changes in physical parameters and operational states such as state of health (SOH) and state of charge (SOC), and is fundamentally important for building high-performance and safety-critical battery systems. A model-based condition monitoring strategy is developed in this paper for lithium-ion batteries on the basis of an electrical circuit model incorporating hysteresis effect. It systematically integrates 1) a fast upper-triangular and diagonal recursive least squares algorithm for parameter identification of the battery model, 2) a smooth variable structure filter for the SOC estimation, and 3) a recursive total least squares algorithm for estimating the maximum capacity, which indicates the SOH. The proposed solution enjoys advantages including high accuracy, low computational cost, and simple implementation, and therefore is suitable for deployment and use in real-time embedded battery management systems (BMSs). Simulations and experiments validate the effectiveness of the proposed strategy.
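
    As an illustration of the capacity-estimation ingredient only, the sketch below fits the maximum capacity from noisy (delta-SOC, accumulated-charge) pairs using a batch total least squares solution via the SVD; the paper's recursive TLS, UD-factorized RLS, and smooth variable structure filter are not reproduced, and the data are synthetic.

      import numpy as np

      def tls_capacity(delta_soc, delta_ah):
          """Estimate capacity Q (Ah) from noisy pairs where delta_ah ~ Q * delta_soc.

          Total least squares (errors in both variables) via the SVD: the normal of
          the best-fit line is the right singular vector of the smallest singular
          value. This batch solution stands in for the recursive TLS in the paper.
          """
          a = np.column_stack([delta_soc, delta_ah])
          _, _, vt = np.linalg.svd(a - a.mean(axis=0), full_matrices=False)
          x, y = vt[-1]                 # normal vector of the best-fit line
          return -x / y                 # slope of the line, i.e. the capacity

      rng = np.random.default_rng(2)
      q_true = 2.3                                                # Ah
      dsoc = rng.uniform(0.05, 0.4, size=200)                     # true SOC swings
      dsoc_meas = dsoc + 0.01 * rng.standard_normal(200)          # noisy SOC estimates
      dah_meas = q_true * dsoc + 0.02 * rng.standard_normal(200)  # noisy Coulomb counts
      print("estimated capacity (Ah):", tls_capacity(dsoc_meas, dah_meas))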

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  8. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Video processing creates technical animation sequences using studio quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades. Computer systems Co-op, Tim Weatherford, performing computer graphics verification. Part of Co-op brochure.

  9. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is Verification Acceleration Possible? - Increasing the visibility of the internal nodes of the FPGA results in much faster debug time. - Forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? - No, this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  10. Telescope performance verification

    NASA Astrophysics Data System (ADS)

    Swart, Gerhard P.; Buckley, David A. H.

    2004-09-01

    While Systems Engineering appears to be widely applied on the very large telescopes, it is lacking in the development of many of the medium and small telescopes currently in progress. The latter projects rely heavily on the experience of the project team, verbal requirements and conjecture based on the successes and failures of other telescopes. Furthermore, it is considered an unaffordable luxury to "close-the-loop" by carefully analysing and documenting the requirements and then verifying the telescope's compliance with them. In this paper the authors contend that a Systems Engineering approach is a keystone in the development of any telescope and that verification of the telescope's performance is not only an important management tool but also forms the basis upon which successful telescope operation can be built. The development of the Southern African Large Telescope (SALT) has followed such an approach and is now in the verification phase of its development. Parts of the SALT verification process will be discussed in some detail to illustrate the suitability of this approach, including oversight by the telescope shareholders, recording of requirements and results, design verification and performance testing. Initial test results will be presented where appropriate.

  11. Program verification document for the ASTP flight program

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The various segments of the Apollo Soyuz Test Project (ASTP) flight program were verified. This included checks on the following: general verification, reference systems and transformations, launch preparations, boost navigation and guidance, orbital navigation and guidance, time bases, discretes, and interrupts, launch vehicle attitude control, switch selector processing, digital command system, real time telemetry and data compression, and algorithms.

  12. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions: A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e.: the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests

  13. Quick charge battery

    SciTech Connect

    Parise, R.J.

    1998-07-01

    Electric and hybrid electric vehicles (EVs and HEVs) will become a significant reality in the near future of the automotive industry. Both types of vehicles will need a means to store energy on board. For the present, the method of choice would be lead-acid batteries, with the HEV having auxiliary power supplied by a small internal combustion engine. One of the main drawbacks to lead-acid batteries is internal heat generation as a natural consequence of the charging process as well as resistance losses. This limits the re-charging rate to the battery pack for an EV which has a range of about 80 miles. A quick turnaround on recharge is needed but not yet possible. One of the limiting factors is the heat buildup. For the HEV the auxiliary power unit provides a continuous charge to the battery pack. Therefore heat generation in the lead-acid battery is a constant problem that must be addressed. Presented here is a battery that is capable of quick charging, the Quick Charge Battery with Thermal Management. This is an electrochemical battery, typically a lead-acid battery, without the inherent thermal management problems that have been present in the past. The battery can be used in an all-electric vehicle, a hybrid-electric vehicle or an internal combustion engine vehicle, as well as in other applications that utilize secondary batteries. This is not restricted to only lead-acid batteries. The concept and technology are flexible enough to use in any secondary battery application where thermal management of the battery must be addressed, especially during charging. Any battery with temperature constraints can benefit from this advancement in the state of the art of battery manufacturing. This can also include nickel-cadmium, metal-air, nickel hydroxide, zinc-chloride or any other type of battery whose performance is affected by the temperature control of the interior as well as the exterior of the battery.

  14. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten-week project on flexible multibody modeling, verification, and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted for gathering information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  15. Nickel cadmium battery evaluation, modeling, and application in an electric vehicle

    NASA Astrophysics Data System (ADS)

    Lynch, William Alfred

    A battery testing facility was set up in the battery evaluation laboratory. This system includes a set of current regulators, which were fabricated in the UMass Lowell labs, and a PC-based data acquisition system. Batteries were charged or discharged at any rate within system ratings, and data including battery voltage, current, temperature, and impedance were stored by a PC. STM5.140 type nickel-cadmium electric vehicle batteries were subjected to various test procedures using the battery testing facility. The results from these tests were used to determine battery characteristics. An electrical component battery model was also developed using the test data. The validity of the battery model was verified through experimental testing, and it was found to be accurate. Additionally, improved battery charging algorithms were developed which resulted in significant improvements in battery efficiency. Electric car operation with STM5.140 type batteries was evaluated. Realistic road test data were analyzed experimentally and using the battery model. No battery abuse was found under EV driving conditions. The performance of the STM5.140 battery under abuse conditions was evaluated and it was found that it performs reasonably well under all abuse conditions tested. The model and test methodologies may be incorporated into complete electric vehicle models in order to assist in the design and operation of current and future electric vehicles.

  16. Introduction to battery design

    SciTech Connect

    Nees, J.M.

    1983-05-01

    It is the purpose of this presentation on battery design to provide data and procedures that will enable the lead acid battery engineer to design replacement batteries for automotive application. Although the data and procedures cited in this presentation refer primarily to automotive batteries, they can be applied in principle to the design of other types of lead acid batteries. As the materials and processes will differ between battery manufacturers, the design criteria for each manufacturer will be subject to these differences and the data presented should be used accordingly.

  17. Multimodal Speaker Verification Based on Electroglottograph Signal and Glottal Activity Detection

    NASA Astrophysics Data System (ADS)

    Ćirović, Zoran; Milosavljević, Milan; Banjac, Zoran

    2010-12-01

    To achieve robust speaker verification, we propose a multimodal method which includes additional nonaudio features and a glottal activity detector. An electroglottograph (EGG) is applied as the nonaudio sensor. Parameters of the EGG signal are used to augment the conventional audio feature vector. The algorithm for EGG parameterization is based on the shape of the idealized waveform and the glottal activity detector. We compare our algorithm with a conventional one in terms of verification accuracy in a high-noise environment. All experiments are performed using a Gaussian Mixture Model recognition system. The results show a significant improvement in text-independent speaker verification in a high-noise environment and an opportunity for further improvements in this area.
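
    A bare-bones sketch of GMM-based verification scoring follows, using scikit-learn; the fused audio-plus-EGG feature extraction, the glottal activity detector, and the actual model sizes are not reproduced here, and the feature dimension, component counts, and decision threshold are invented.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(3)

      # Stand-ins for fused audio + EGG feature vectors (one row per frame)
      target_train = rng.normal(loc=0.5, scale=1.0, size=(500, 12))
      background_train = rng.normal(loc=0.0, scale=1.2, size=(2000, 12))
      test_utterance = rng.normal(loc=0.5, scale=1.0, size=(120, 12))

      # Speaker model and universal background model (component counts are guesses)
      speaker_gmm = GaussianMixture(n_components=8, covariance_type="diag",
                                    random_state=0).fit(target_train)
      ubm = GaussianMixture(n_components=16, covariance_type="diag",
                            random_state=0).fit(background_train)

      # Average log-likelihood ratio over the test frames; accept above a threshold
      llr = speaker_gmm.score(test_utterance) - ubm.score(test_utterance)
      print("log-likelihood ratio:", llr, "->", "accept" if llr > 0.0 else "reject")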

  18. NASA Aerospace Flight Battery Program: Generic Safety, Handling and Qualification Guidelines for Lithium-Ion (Li-Ion) Batteries; Availability of Source Materials for Lithium-Ion (Li-Ion) Batteries; Maintaining Technical Communications Related to Aerospace Batteries (NASA Aerospace Battery Workshop). Volume 2, Part 1

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.; Brewer, Jeffrey C.; Bugga, Ratnakumar V.; Darcy, Eric C.; Jeevarajan, Judith A.; McKissock, Barbara I.; Schmitz, Paul C.

    2010-01-01

    This NASA Aerospace Flight Battery Systems Working Group was chartered within the NASA Engineering and Safety Center (NESC). The Battery Working Group was tasked to complete tasks and to propose proactive work to address battery related, agency-wide issues on an annual basis. In its first year of operation, this proactive program addressed various aspects of the validation and verification of aerospace battery systems for NASA missions. Studies were performed, issues were discussed and in many cases, test programs were executed to generate recommendations and guidelines to reduce risk associated with various aspects of implementing battery technology in the aerospace industry. This report contains the Appendices to the findings from the first year of the program's operations.

  19. Hubble Space Telescope On-orbit NiH2 Battery Performance

    NASA Technical Reports Server (NTRS)

    Rao, Gopalakrishna M.; Krol, Stanley J., Jr.

    2002-01-01

    This paper summarizes the Hubble Space Telescope (HST) nickel-hydrogen (NiH2) battery performance from launch to the present time. Over the life of the HST, vehicle configuration, charge system degradation and failures, together with thermal design limitations, have had a significant effect on the capacity of the HST batteries. Changes made to the charge system configuration in order to protect against power system failures and to maintain battery thermal stability resulted in undercharging of the batteries. This undercharging resulted in decreased usable battery capacity as well as battery cell voltage/capacity divergence. This cell divergence was made evident during on-orbit battery capacity measurements by a relatively shallow slope of the discharge curve following the discharge knee. Early efforts to improve the battery performance have been successful. On-orbit capacity measurement data indicate increases in the usable battery capacity of all six batteries as well as improvements in the battery cell voltage/capacity divergence. Additional measures have been implemented to improve battery performance; however, failures within the HST Power Control Unit (PCU) have prevented verification of battery status. As this PCU fault prevents the execution of on-orbit capacity testing, the HST Project has based the battery capacity on trends, which utilize previous on-orbit battery capacity test data, for science mission and servicing mission planning. Servicing Mission 3B (SM-3B) in March 2002 replaced the faulty PCU. Following the servicing mission, on-orbit capacity testing resumed. A summary of battery performance since launch is reviewed in this paper.

  20. Fusion strategies for boosting cancelable online signature verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    2010-04-01

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. As one solution, we proposed a scheme to enhance the performance of a cancelable approach for online signature verification by combining scores calculated from two transformed datasets generated using two keys. Generally, the same verification algorithm is used for transformed data as for raw (non-transformed) data in cancelable approaches, and, in our previous work, a verification system developed for a non-transformed dataset was used to calculate the scores from transformed data. In this paper, we modify the verification system by using transformed data for training. Several experiments were performed using public databases, and the experimental results show that the modification of the verification system improved the performance. Our cancelable system combines two scores to make a decision. Several fusion strategies are also considered, and the experimental results are reported here.
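
    The score-combination step itself can be shown in a few lines; the weighted-sum rule, weight, and threshold below are generic illustrations rather than the specific fusion strategies evaluated in the paper, and scores are assumed to be normalized to [0, 1].

      def fuse_and_decide(score_key1, score_key2, w=0.5, threshold=0.6):
          """Combine the matching scores from the two transformed datasets.

          A weighted sum is one common fusion rule; max, min, or product rules
          could be swapped in at the same place.
          """
          fused = w * score_key1 + (1.0 - w) * score_key2
          return fused, fused >= threshold          # (fused score, accept decision)

      # Example scores from verifiers trained on the two key-specific transforms
      print(fuse_and_decide(0.72, 0.55))            # -> (0.635, True)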

  1. Ionene membrane battery separator

    NASA Technical Reports Server (NTRS)

    Moacanin, J.; Tom, H. Y.

    1969-01-01

    Ionic transport characteristics of ionenes, insoluble membranes from soluble polyelectrolyte compositions, are studied for possible application in a battery separator. Effectiveness of the thin film of separator membrane essentially determines battery lifetime.

  2. Battery cell feedthrough apparatus

    DOEpatents

    Kaun, T.D.

    1995-03-14

    A compact, hermetic feedthrough apparatus is described comprising interfitting sleeve portions constructed of chemically-stable materials to permit unique battery designs and increase battery life and performance. 8 figs.

  3. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named ‘Industrial Methodology for Process Verification in Research’ or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by ‘crowd-sourcing’ to an interested community (www.sbvimprover.com). Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  4. Cancelable face verification using optical encryption and authentication.

    PubMed

    Taheri, Motahareh; Mozaffari, Saeed; Keshavarzi, Parviz

    2015-10-01

    In a cancelable biometric system, each instance of enrollment is distorted by a transform function, and the output should not be retransformable to the original data. This paper presents a new cancelable face verification system in the encrypted domain. Encrypted facial images are generated by a double random phase encoding (DRPE) algorithm using two keys (RPM1 and RPM2). To make the system noninvertible, a photon counting (PC) method is utilized, which requires a photon distribution mask (PDM) for information reduction. Verification of sparse images that are not recognizable by direct visual inspection is performed by an unconstrained minimum average correlation energy filter. In the proposed method, the encryption keys (RPM1, RPM2, and PDM) are used on the sender side, and the receiver needs only encrypted images and correlation filters. In this manner, the system preserves privacy even if the correlation filters are obtained by an adversary. Performance of the PC-DRPE verification system is evaluated under illumination variation, pose changes, and facial expression. Experimental results show that utilizing encrypted images not only addresses security concerns but also enhances verification performance. This improvement can be attributed to the fact that, in the proposed system, the face verification problem is converted to a key verification task. PMID:26479930
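
    A minimal NumPy sketch of the double random phase encoding step is given below, assuming the two masks RPM1 and RPM2 are uniform random phase arrays; the photon-counting step and the correlation-filter verification described in the abstract are omitted.

      import numpy as np

      def drpe_encrypt(img, rpm1, rpm2):
          """DRPE: apply an input-plane phase mask, then a Fourier-plane phase mask."""
          field = img * np.exp(2j * np.pi * rpm1)                      # input-plane mask
          spectrum = np.fft.fft2(field) * np.exp(2j * np.pi * rpm2)    # Fourier-plane mask
          return np.fft.ifft2(spectrum)                                # complex encrypted image

      rng = np.random.default_rng(0)
      face = rng.random((64, 64))                                      # placeholder "face" image
      rpm1, rpm2 = rng.random((64, 64)), rng.random((64, 64))
      encrypted = drpe_encrypt(face, rpm1, rpm2)
      print(encrypted.dtype, encrypted.shape)                          # complex128 (64, 64)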

  5. SLA battery separators

    SciTech Connect

    Fujita, Y.

    1986-10-01

    Since they first appeared in the early 1970s, sealed lead acid (SLA) batteries have been a rapidly growing factor in the battery industry - in rechargeable, deep-cycle, and automotive storage systems. The key to these sealed batteries is the binderless, absorptive glass microfiber separator, which permits oxygen generated on charge to recombine within the cell. The result is no free acid, no outgassing, and longer life. The batteries are described.

  6. Handbook of Battery Materials

    NASA Astrophysics Data System (ADS)

    Besenhard, J. O.

    1999-04-01

    Batteries find their applications in an increasing range of everyday products: discmen, mobile phones and electric cars need very different battery types. This handbook gives a concise survey of the materials used in modern battery technology. The physico-chemical fundamentals are treated, as are the environmental and recycling aspects. It will be a profound reference source for anyone working in the research and development of new battery systems, whether chemist, physicist or engineer.

  7. Battery Review Board

    NASA Technical Reports Server (NTRS)

    Vaughn, Chester

    1993-01-01

    The topics covered are presented in viewgraph form: NASA Battery Review Board Charter; membership, board chronology; background; statement of problem; summary of problems with 50 AH standard Ni-Cd; activities for near term programs utilizing conventional Ni-Cd; present projects scheduled to use NASA standard Ni-Cd; other near-term NASA programs requiring secondary batteries; recommended direction for future programs; future cell/battery procurement strategy; and the NASA Battery Program.

  8. Battery test plan

    NASA Astrophysics Data System (ADS)

    Barnett, J. H.; Carter, C. L.; Blickwedel, T. W.; Todd, D. E.

    1982-06-01

    An approach to testing electric vehicle batteries is described. Each individual module and vehicle battery pack is given an identification that is traceable through its history. Computer-controlled battery capacity testing equipment is used. Two types of testing are performed - acceptance and operational. Records of tests are maintained as computer-generated outputs. The results of the testing are documented in a report on the individual battery products of a manufacturer.

  9. RESRAD-BUILD verification.

    SciTech Connect

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-31

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft® Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  10. Robust verification analysis

    NASA Astrophysics Data System (ADS)

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data- and expert-informed error estimation, including uncertainties for both the solution itself and the order of convergence. Our method produces high-quality results for well-behaved cases, relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
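
    A much-simplified illustration of the idea is sketched below, using Richardson-style triplet estimates of the observed convergence order and a median in place of the paper's constrained-optimization machinery; the synthetic solution sequence is invented for the example.

      import numpy as np

      def observed_orders(values, refinement_ratio=2.0):
          """Per-triplet observed convergence orders from successive mesh solutions."""
          orders = []
          for f_coarse, f_mid, f_fine in zip(values, values[1:], values[2:]):
              e_coarse = abs(f_mid - f_coarse)
              e_fine = abs(f_fine - f_mid)
              orders.append(np.log(e_coarse / e_fine) / np.log(refinement_ratio))
          return orders

      solutions = [1.250, 1.0625, 1.015625, 1.00390625]   # synthetic 2nd-order sequence
      orders = observed_orders(solutions)
      print(orders, "median:", np.median(orders))          # all close to 2.0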

  11. TFE verification program

    NASA Astrophysics Data System (ADS)

    1990-03-01

    The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a Thermionic Fuel Element (TFE) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown. Five prior programs form the basis for the TFE Verification Program: (1) AEC/NASA program of the 1960s and early 1970s; (2) SP-100 concept development program; (3) SP-100 thermionic technology program; (4) Thermionic irradiations program in TRIGA in FY-88; and (5) Thermionic Technology Program in 1986 and 1987.

  12. TFE Verification Program

    SciTech Connect

    Not Available

    1990-03-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized, but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown on Fig. 1-1. Five prior programs form the basis for the TFE Verification Program: (1) AEC/NASA program of the 1960s and early 1970s; (2) SP-100 concept development program; (3) SP-100 thermionic technology program; (4) Thermionic irradiations program in TRIGA in FY-86; and (5) Thermionic Technology Program in 1986 and 1987. 18 refs., 64 figs., 43 tabs.

  13. Sodium sulfur battery seal

    DOEpatents

    Topouzian, Armenag

    1980-01-01

    This invention is directed to a seal for a sodium sulfur battery in which a flexible diaphragm sealing elements respectively engage opposite sides of a ceramic component of the battery which separates an anode compartment from a cathode compartment of the battery.

  14. Electric Vehicle Battery Challenge

    ERIC Educational Resources Information Center

    Roman, Harry T.

    2014-01-01

    A serious drawback to electric vehicles [batteries only] is the idle time needed to recharge their batteries. In this challenge, students can develop ideas and concepts for battery change-out at automotive service stations. Such a capability would extend the range of electric vehicles.

  15. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.

  16. Electrochemical model based charge optimization for lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Pramanik, Sourav; Anwar, Sohel

    2016-05-01

    In this paper, we propose the design of a novel optimal strategy for charging the lithium-ion battery based on an electrochemical battery model that is aimed at improved performance. A performance index that aims at minimizing the charging effort along with a minimum deviation from the rated maximum thresholds for cell temperature and charging current has been defined. The method proposed in this paper aims at achieving a faster charging rate while maintaining safe limits for various battery parameters. Safe operation of the battery is achieved by including the battery bulk temperature as a control component in the performance index, which is of critical importance for electric vehicles. Another important aspect of the performance objective proposed here is the efficiency of the algorithm, which would allow higher charging rates without compromising the internal electrochemical kinetics of the battery, thereby preventing abusive conditions and improving long-term durability. A more realistic model, based on battery electro-chemistry, has been used for the design of the optimal algorithm as opposed to the conventional equivalent circuit models. To solve the optimization problem, Pontryagin's principle has been used, which is very effective for constrained optimization problems with both state and input constraints. Simulation results show that the proposed optimal charging algorithm is capable of shortening the charging time of a lithium-ion cell while maintaining the temperature constraint when compared with standard constant-current charging. The designed method also maintains the internal states within limits that avoid abusive operating conditions.
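
    The sketch below is not the paper's Pontryagin-based optimization; it is a toy rule-based charging loop, with invented cell and thermal parameters, meant only to illustrate charging under current and temperature constraints.

      # Toy constraint-aware charging loop; all parameter values are assumptions.
      def charge(soc=0.2, temp=25.0, i_max=4.0, temp_max=40.0, dt=1.0):
          capacity_as = 3600.0 * 2.0        # 2 Ah cell, in ampere-seconds (assumed)
          history = []
          while soc < 0.999:
              current = i_max if temp < temp_max else 0.5 * i_max        # back off on heat
              soc = min(1.0, soc + current * dt / capacity_as)
              temp += 0.002 * current**2 * dt - 0.01 * (temp - 25.0) * dt  # crude thermal model
              history.append((soc, temp, current))
          return history

      trace = charge()
      print(f"charged in {len(trace)} s, final temperature {trace[-1][1]:.1f} C")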

  17. Enhanced Verification Test Suite for Physics Simulation Codes

    SciTech Connect

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of greater
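
    As a small illustration of the "acceptance criterion" element of a strong-sense verification benchmark, the sketch below compares an invented code result against an exact solution and checks an error norm against an assumed tolerance; the problem, numbers, and tolerance are placeholders, not part of the proposed suite.

      import numpy as np

      def l2_error(numerical, exact):
          """Root-mean-square difference between code output and benchmark solution."""
          return float(np.sqrt(np.mean((numerical - exact) ** 2)))

      x = np.linspace(0.0, 1.0, 101)
      exact = np.sin(np.pi * x) * np.exp(-np.pi**2 * 0.01)      # assumed exact solution
      numerical = exact + 1e-4 * np.sin(3 * np.pi * x)          # pretend code output
      tolerance = 5e-4                                          # assumed acceptance criterion

      err = l2_error(numerical, exact)
      print(f"L2 error = {err:.2e}; accept = {err <= tolerance}")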

  18. Continuous verification using multimodal biometrics.

    PubMed

    Sim, Terence; Zhang, Sheng; Janakiraman, Rajkumar; Kumar, Sandeep

    2007-04-01

    Conventional verification systems, such as those controlling access to a secure room, do not usually require the user to reauthenticate himself for continued access to the protected resource. This may not be sufficient for high-security environments in which the protected resource needs to be continuously monitored for unauthorized use. In such cases, continuous verification is needed. In this paper, we present the theory, architecture, implementation, and performance of a multimodal biometrics verification system that continuously verifies the presence of a logged-in user. Two modalities are currently used--face and fingerprint--but our theory can be readily extended to include more modalities. We show that continuous verification imposes additional requirements on multimodal fusion when compared to conventional verification systems. We also argue that the usual performance metrics of false accept and false reject rates are insufficient yardsticks for continuous verification and propose new metrics against which we benchmark our system. PMID:17299225
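
    A toy sketch of the continuous-verification idea follows: a trust value that decays between biometric observations and is refreshed when a face or fingerprint score arrives. The decay constant, blending weight, and threshold are assumptions; the paper's fusion model and metrics are more elaborate than this.

      def update_trust(trust, dt, observation=None, half_life=30.0, blend=0.7):
          """Decay trust over dt seconds, then blend in a new matcher score if one arrived."""
          trust *= 0.5 ** (dt / half_life)
          if observation is not None:
              trust = blend * observation + (1.0 - blend) * trust
          return trust

      trust, threshold = 1.0, 0.4
      for dt, obs in [(10, None), (10, 0.9), (60, None), (5, 0.2)]:
          trust = update_trust(trust, dt, obs)
          print(f"trust={trust:.2f} locked-out={trust < threshold}")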

  19. Graphene nanoarchitecture in batteries.

    PubMed

    Wei, Di; Astley, Michael R; Harris, Nadine; White, Richard; Ryhänen, Tapani; Kivioja, Jani

    2014-08-21

    We compare three different carbon nanoarchitectures used to produce standard coin cell batteries: graphene monolayer, graphite paper and graphene foam. The batteries' electrochemical performances are characterised using cyclic voltammetry, constant-current discharge and dynamic galvanostatic techniques. Even though graphene is the fundamental building block of graphite, its properties are intrinsically different when used in batteries because there is no ion intercalation in graphene. The nanoarchitecture of the graphene electrode is shown to have a strong influence over the battery's electrochemical performance. This provides a versatile way to design battery electrodes for different demands. PMID:24990483

  20. Chemically rechargeable battery

    NASA Technical Reports Server (NTRS)

    Graf, James E. (Inventor); Rowlette, John J. (Inventor)

    1984-01-01

    Batteries (50) containing oxidized, discharged metal electrodes, such as an iron-air battery, are charged by removing and storing the electrolyte in a reservoir (98) and pumping fluid reductant such as formalin (aqueous formaldehyde) from a storage tank (106) into the battery in contact with the surfaces of the electrodes. After sufficient iron hydroxide has been reduced to iron, the spent reductant is drained, the electrodes are rinsed with water from a rinse tank (102), and the electrolyte in the reservoir (98) is then returned to the battery. The battery can be slowly electrically charged when in overnight storage but can be quickly charged in about 10 minutes by the chemical procedure of the invention.

  1. The Dark Energy Survey Science Verification Shear Catalog

    NASA Astrophysics Data System (ADS)

    Jarvis, Michael; Sheldon, Erin; Zuntz, Joe; Bridle, Sarah; Kacprzak, Tomasz; Dark Energy Survey Collaboration

    2015-04-01

    We present results of the weak lensing analysis of the Dark Energy Survey (DES) science verification data. The science verification (SV) data use the same telescope and camera as the full DES, but the data were taken during commissioning time in the year prior to the start of the DES. We have performed a large battery of null tests to look for systematic errors in the shear values. The catalogs pass all tests at the levels required for doing weak lensing science with the SV data. We will show here the results of some of the more interesting tests. We will also briefly mention some plans for improvements to the pipeline to help meet the more stringent demands of the full 5-year DES survey.

  2. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification; nevertheless, none of the earlier quantum money constructions is known to possess it.

  3. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification; nevertheless, none of the earlier quantum money constructions is known to possess it.

  4. NASA Aerospace Flight Battery Program: Wet Life of Nickel-Hydrogen (Ni-H2) Batteries. Volume 2, Part 3; Appendices

    NASA Technical Reports Server (NTRS)

    Jung, David S,; Lee, Leonine S.; Manzo, Michelle A.

    2010-01-01

    This NASA Aerospace Flight Battery Systems Working Group was chartered within the NASA Engineering and Safety Center (NESC). The Battery Working Group was tasked to complete assigned work and to propose proactive tasks addressing battery-related, agency-wide issues on an annual basis. In its first year of operation, this proactive program addressed various aspects of the validation and verification of aerospace battery systems for NASA missions. Studies were performed, issues were discussed, and in many cases test programs were executed to generate recommendations and guidelines to reduce the risk associated with various aspects of implementing battery technology in the aerospace industry. This document contains Part 3 - Volume II Appendices to Part 3 - Volume I.

  5. Automatic battery analyzer

    SciTech Connect

    Dougherty, T.J.; Frailing, C.E.

    1980-03-11

    Apparatus for automatically testing automotive-type, lead acid storage batteries is disclosed in which three separate tests are made and the results thereof compared to predetermined standards in a specified order to maximize the information obtained about the battery. The three tests measure (1) whether the battery meets its cold cranking rating by drawing a predetermined load current therefrom for a predetermined period of time and determining whether the battery terminal voltage is above a specified level at the end of that period, (2) whether the battery terminal voltage is above another specified level at the end of a predetermined period of time following the completion of the first test, and (3) whether the internal resistance is acceptably low. If the battery passes the first test, it is known to be acceptable. If the battery fails the first test and passes the second test, it is known to be unacceptable. If the battery fails the first and second tests, the third test is performed. If the battery then passes the third test, it is known to be acceptable but to require a recharge, whereas if the battery then fails the third test the acceptability of the battery is then not yet determined and it must be recharged and retested.
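
    The three-test decision sequence in the abstract can be restated as a small decision function; the sketch below is our reading of that logic, with test and verdict labels of our own choosing rather than the patent's.

      def battery_verdict(passes_cold_crank, recovers_voltage, resistance_ok):
          """Restate the three-test sequence described in the abstract."""
          if passes_cold_crank:                 # test 1: load test at cold-cranking current
              return "acceptable"
          if recovers_voltage:                  # test 2: terminal voltage after the load period
              return "unacceptable"
          if resistance_ok:                     # test 3: internal resistance check
              return "acceptable - recharge required"
          return "undetermined - recharge and retest"

      print(battery_verdict(False, False, True))   # "acceptable - recharge required"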

  6. Alkaline quinone flow battery.

    PubMed

    Lin, Kaixiang; Chen, Qing; Gerhardt, Michael R; Tong, Liuchuan; Kim, Sang Bok; Eisenach, Louise; Valle, Alvaro W; Hardee, David; Gordon, Roy G; Aziz, Michael J; Marshak, Michael P

    2015-09-25

    Storage of photovoltaic and wind electricity in batteries could solve the mismatch problem between the intermittent supply of these renewable resources and variable demand. Flow batteries permit more economical long-duration discharge than solid-electrode batteries by using liquid electrolytes stored outside of the battery. We report an alkaline flow battery based on redox-active organic molecules that are composed entirely of Earth-abundant elements and are nontoxic, nonflammable, and safe for use in residential and commercial environments. The battery operates efficiently with high power density near room temperature. These results demonstrate the stability and performance of redox-active organic molecules in alkaline flow batteries, potentially enabling cost-effective stationary storage of renewable energy. PMID:26404834

  7. Silicon Carbide Radioisotope Batteries

    NASA Technical Reports Server (NTRS)

    Rybicki, George C.

    2005-01-01

    The substantial radiation resistance and large bandgap of SiC semiconductor materials make them an attractive candidate for application in a high-efficiency, long-life radioisotope battery. To evaluate their potential in this application, simulated batteries were constructed using SiC diodes and the alpha particle emitter americium-241 (Am-241) or the beta particle emitter promethium-147 (Pm-147). The Am-241 based battery showed high initial power output and an initial conversion efficiency of approximately 16%, but the power output decayed 52% in 500 hours due to radiation damage. In contrast, the Pm-147 based battery showed a similar power output level and an initial conversion efficiency of approximately 0.6%, but no degradation was observed in 500 hours. However, the Pm-147 battery required approximately 1000 times the particle fluence of the Am-241 battery to achieve a similar power output. The advantages and disadvantages of each type of battery and suggestions for future improvements will be discussed.

  8. TU-A-9A-03: Development and Verification of a Forward Model That Assists in Iterative Post-Processing Algorithms Used to Reduce Blur in Compton Backscatter Imaging Systems

    SciTech Connect

    Juneja, B; Gilland, D; Hintenlang, D; Doxsee, K; Bova, F

    2014-06-15

    Purpose: In Compton Backscatter Imaging (CBI), the source and detector reside on the same side of the patient. We previously demonstrated the applicability of CBI systems for medical purposes using an industrial system. To assist in post-processing images from a CBI system, a forward model based on radiation absorption and scatter principles has been developed. Methods: The forward model was developed in C++ using raytracing to track particles. The algorithm accepts phantoms of any size and resolution to calculate the fraction of incident photons scattered back to the detector, and can perform these calculations for any detector geometry and source specification. To validate the model, results were compared to MCNP-X, a Monte Carlo-based simulation code, for various combinations of source specifications, detector geometries, and phantom compositions. Results: The model verified that the backscatter signal to the detector was based on three interaction probabilities: a) attenuation of photons going into the phantom, b) Compton scatter of photons toward the detector, and c) attenuation of photons coming out of the phantom. The results from the MCNP-X simulations and the forward model differed by 1% to 5%. This difference was less than 1% for energies higher than 30 keV, but was up to 4% for lower energies. At 50 keV, the difference was less than 1% for multiple detector widths and for both homogeneous and heterogeneous phantoms. Conclusion: As part of the optimization of a medical CBI system, an efficient and accurate forward model was constructed in C++ to estimate the output of a CBI system. The model characterized individual components contributing to CBI output and increased computational efficiency over Monte Carlo simulations. It is now used in the development of novel post-processing algorithms that reduce image blur by reversing undesired contributions from outside the region of interest.
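
    A single-scatter, single-ray sketch of the three factors named above is shown below; the attenuation and scatter coefficients are placeholders rather than the authors' calibrated values.

      import numpy as np

      def backscatter_fraction(mu_in, mu_out, scatter_prob, depths):
          """Sum per-depth contributions exp(-mu_in*z) * P_scatter * exp(-mu_out*z)."""
          dz = depths[1] - depths[0]
          weights = np.exp(-mu_in * depths) * scatter_prob * np.exp(-mu_out * depths)
          return float(np.sum(weights) * dz)

      depths = np.linspace(0.0, 5.0, 500)          # cm into the phantom (assumed)
      frac = backscatter_fraction(mu_in=0.2, mu_out=0.22, scatter_prob=0.05, depths=depths)
      print(f"estimated backscatter fraction per incident photon: {frac:.3f}")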

  9. Verification Test of Power Fluctuation Suppression System for Large PV

    NASA Astrophysics Data System (ADS)

    Noro, Yasuhiro; Naoi, Shinya; Toba, Koji; Kimura, Misao; Minegishi, Toshiaki; Shimizu, Masanao; Aoki, Shinichi; Okuda, Yasuo

    Large-scale photovoltaic (PV) generation stations are expected to spread in the future. However, the output power of renewable energy sources such as PV is affected by weather conditions and tends to be unstable. As a result, the penetration of PV power stations makes it difficult to maintain the frequency of the power system within the allowable range. The authors have developed a suppression system to stabilize the output power fluctuation of a large PV generation station. To reduce short-term fluctuation, storage batteries using SCiB™ cells are employed. In this paper, verification test results are explained, and simulation results to improve control performance are also shown.
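
    As a generic illustration of fluctuation suppression (not the authors' control law), the sketch below lets the grid see a moving average of a synthetic PV profile while the battery absorbs or supplies the difference; the window length and the PV profile are assumptions.

      import numpy as np

      def smooth_with_battery(pv_power, window=30):
          kernel = np.ones(window) / window
          grid_power = np.convolve(pv_power, kernel, mode="same")   # what the grid sees
          battery_power = grid_power - pv_power                     # + = discharge, - = charge
          return grid_power, battery_power

      t = np.arange(600)                                            # seconds
      pv = 1.0 + 0.3 * np.sin(2 * np.pi * t / 60) \
           + 0.1 * np.random.default_rng(1).standard_normal(t.size)
      grid, batt = smooth_with_battery(pv)
      print(f"PV std = {pv.std():.3f} MW, grid std = {grid.std():.3f} MW")
      print(f"peak battery power = {np.max(np.abs(batt)):.3f} MW")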

  10. 29 CFR 1926.441 - Batteries and battery charging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 8 2013-07-01 2013-07-01 false Batteries and battery charging. 1926.441 Section 1926.441... for Special Equipment § 1926.441 Batteries and battery charging. (a) General requirements—(1) Batteries of the unsealed type shall be located in enclosures with outside vents or in well ventilated...

  11. 29 CFR 1926.441 - Batteries and battery charging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 8 2012-07-01 2012-07-01 false Batteries and battery charging. 1926.441 Section 1926.441... for Special Equipment § 1926.441 Batteries and battery charging. (a) General requirements—(1) Batteries of the unsealed type shall be located in enclosures with outside vents or in well ventilated...

  12. 29 CFR 1926.441 - Batteries and battery charging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 8 2014-07-01 2014-07-01 false Batteries and battery charging. 1926.441 Section 1926.441... for Special Equipment § 1926.441 Batteries and battery charging. (a) General requirements—(1) Batteries of the unsealed type shall be located in enclosures with outside vents or in well ventilated...

  13. 29 CFR 1926.441 - Batteries and battery charging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 8 2010-07-01 2010-07-01 false Batteries and battery charging. 1926.441 Section 1926.441... for Special Equipment § 1926.441 Batteries and battery charging. (a) General requirements—(1) Batteries of the unsealed type shall be located in enclosures with outside vents or in well ventilated...

  14. 29 CFR 1926.441 - Batteries and battery charging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 8 2011-07-01 2011-07-01 false Batteries and battery charging. 1926.441 Section 1926.441... for Special Equipment § 1926.441 Batteries and battery charging. (a) General requirements—(1) Batteries of the unsealed type shall be located in enclosures with outside vents or in well ventilated...

  15. Exploring the Model Design Space for Battery Health Management

    NASA Technical Reports Server (NTRS)

    Saha, Bhaskar; Quach, Cuong Chi; Goebel, Kai Frank

    2011-01-01

    Battery Health Management (BHM) is a core enabling technology for the success and widespread adoption of the emerging electric vehicles of today. Although battery chemistries have been studied in detail in the literature, an accurate run-time battery life prediction algorithm has eluded us. Current reliability-based techniques are insufficient to manage the use of such batteries when they are an active power source with frequently varying loads in uncertain environments. The amount of usable charge of a battery for a given discharge profile is not only dependent on the starting state-of-charge (SOC), but also on other factors like battery health and the discharge or load profile imposed. This paper presents a Particle Filter (PF) based BHM framework with plug-and-play modules for battery models and uncertainty management. The batteries are modeled at three different levels of granularity with associated uncertainty distributions, encoding the basic electrochemical processes of a lithium-polymer battery. The effects of different choices in the model design space are explored in the context of prediction performance in an electric unmanned aerial vehicle (UAV) application with emulated flight profiles.
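
    A minimal bootstrap particle filter for SOC tracking is sketched below to make the plug-and-play idea concrete; the linear voltage model and all noise levels are invented, whereas the paper uses electrochemistry-based models at several levels of granularity.

      import numpy as np

      rng = np.random.default_rng(42)
      N = 1000
      particles = rng.normal(0.9, 0.02, N)            # initial SOC belief
      weights = np.ones(N) / N

      def voltage(soc):
          return 3.0 + 1.2 * soc                      # crude open-circuit-voltage proxy (assumed)

      true_soc, drain, meas_noise = 0.9, 0.001, 0.02
      for step in range(200):
          true_soc -= drain
          z = voltage(true_soc) + rng.normal(0.0, meas_noise)

          particles -= drain + rng.normal(0.0, 0.0005, N)                # propagate with process noise
          weights *= np.exp(-0.5 * ((z - voltage(particles)) / meas_noise) ** 2)
          weights /= weights.sum()

          if 1.0 / np.sum(weights**2) < N / 2:                           # resample when degenerate
              idx = rng.choice(N, size=N, p=weights)
              particles, weights = particles[idx], np.ones(N) / N

      print(f"true SOC {true_soc:.3f}, estimate {np.average(particles, weights=weights):.3f}")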

  16. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for the normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
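
    The sketch below reproduces the flavor of such a check, with SciPy's Latin hypercube engine standing in for Sandia's LHS code: draw stratified samples for a normal distribution and apply the Kolmogorov-Smirnov test against the target distribution.

      from scipy.stats import qmc, norm, kstest

      engine = qmc.LatinHypercube(d=1, seed=7)
      u = engine.random(n=1000).ravel()                 # stratified uniforms on (0, 1)
      samples = norm.ppf(u, loc=10.0, scale=2.0)        # transform to N(10, 2**2)

      res = kstest(samples, "norm", args=(10.0, 2.0))
      print(f"KS statistic = {res.statistic:.4f}, p-value = {res.pvalue:.3f}")
      print(f"sample mean = {samples.mean():.3f}, sample std = {samples.std(ddof=1):.3f}")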

  17. Standard Missile Block IV battery

    SciTech Connect

    Martin, J.

    1996-11-01

    During the 1980s, a trend in automatic primary battery technologies was the replacement of silver-zinc batteries by thermal battery designs. The Standard Missile 2 (SM-2) Block IV development is a noteworthy reversal of this trend. The SM-2 Block IV battery was originally attempted as a thermal battery, with multiple companies attempting to develop a thermal battery design. These attempts failed to yield a production thermal battery. A decision to pursue a silver-zinc battery design resulted in the development of a single silver-zinc battery that supplies both the SM-2 Block IV (the original thermal battery design goal) and the projected power requirements of the evolving SM-2 Block IVA. Several advancements in silver-zinc battery technology were utilized in this design that improve producibility and extend the boundaries of silver-zinc batteries.

  18. 1992 five year battery forecast

    SciTech Connect

    Amistadi, D.

    1992-12-01

    Five-year trends for automotive and industrial batteries are projected. Topics covered include: SLI shipments; lead consumption; automotive batteries (5-year annual growth rates); industrial batteries (standby power and motive power); estimated average battery life by area/country for 1989; US motor vehicle registrations; replacement battery shipments; potential lead consumption in electric vehicles; BCI recycling rates for lead-acid batteries; US average car/light truck battery life; channels of distribution; replacement battery inventory at the end of July; and the second US battery shipment forecast.

  19. Charging the new batteries -- IC controllers track new technologies

    SciTech Connect

    Mammano, R.A.

    1995-07-01

    Demands for portability have fueled significant developments in new battery technology. These developments have resulted in many more options in selecting the battery type for a particular project, but since most applications today opt for rechargeable battery systems, the availability of battery-charging solutions can become an equally important criterion in the selection process. Complicating this process are the demands for fast but safe charging, with charge algorithms easily implemented in low-cost hardware. With the higher levels of complexity attendant with these more demanding algorithms, solutions have come primarily from the integrated circuit industry, and the purpose of this paper is to provide a few examples of the latest efforts in this arena, specifically as addressed to lead-acid, nickel metal-hydride, and lithium-ion technologies.

  20. Electric-vehicle batteries

    NASA Astrophysics Data System (ADS)

    Oman, Henry; Gross, Sid

    1995-02-01

    Electric vehicles that can't reach trolley wires need batteries. In the early 1900s, electric cars disappeared when owners found that replacing the car's worn-out lead-acid battery cost more than a new gasoline-powered car. Most of today's electric cars are still propelled by lead-acid batteries. General Motors in their prototype Impact, for example, used starting-lighting-ignition batteries, which deliver lots of power for demonstrations but have a life of less than 100 deep discharges. Now promising alternative technology has challenged the worldwide lead miners, refiners, and battery makers into forming a consortium that sponsors research into making better lead-acid batteries. Horizon's new bipolar battery delivered 50 watt-hours per kg (Wh/kg), compared with 20 for ordinary transport-vehicle batteries. The alternatives deliver from 80 Wh/kg (nickel-metal hydride) up to 200 Wh/kg (zinc-bromine). A Fiat Panda traveled 260 km on a single charge of its zinc-bromine battery. A German 3.5-ton postal truck traveled 300 km on a single charge of its 650-kg (146 Wh/kg) zinc-air battery. Its top speed was 110 km per hour.

  1. The 1975 GSFC Battery Workshop

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The proceedings of the 1975 Goddard Space Flight Center Battery Workshop are presented. The major topics of discussion were nickel cadmium batteries and, to a lesser extent, nickel hydrogen batteries. Battery design, manufacturing techniques, testing programs, and electrochemical characteristics were considered. The utilization of these batteries for spacecraft power supplies was given particular attention.

  2. TFE Verification Program

    SciTech Connect

    Not Available

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  3. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the OpenSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  4. HDL to verification logic translator

    NASA Astrophysics Data System (ADS)

    Gambles, J. W.; Windley, P. J.

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  5. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  6. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  7. Interface Generation and Compositional Verification in JavaPathfinder

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina

    2009-01-01

    We present a novel algorithm for interface generation of software components. Given a component, our algorithm uses learning techniques to compute a permissive interface representing legal usage of the component. Unlike our previous work, this algorithm does not require knowledge about the component's environment. Furthermore, in contrast to other related approaches, our algorithm computes permissive interfaces even in the presence of non-determinism in the component. Our algorithm is implemented in the JavaPathfinder model checking framework for UML statechart components. We have also added support for automated assume-guarantee style compositional verification in JavaPathfinder, using component interfaces. We report on the application of the presented approach to the generation of interfaces for flight software components.

  8. A Study of a Network-Flow Algorithm and a Noncorrecting Algorithm for Test Assembly.

    ERIC Educational Resources Information Center

    Armstrong, R. D.; And Others

    1996-01-01

    When the network-flow algorithm (NFA) and the average growth approximation algorithm (AGAA) were used for automated test assembly with American College Test and Armed Services Vocational Aptitude Battery item banks, results indicate that reasonable error in item parameters is not harmful for test assembly using NFA or AGAA. (SLD)

  9. Delivery verification and dose reconstruction in tomotherapy

    NASA Astrophysics Data System (ADS)

    Kapatoes, Jeffrey Michael

    2000-11-01

    It has long been a desire in photon-beam radiation therapy to make use of the significant fraction of the beam exiting the patient to infer how much of the beam energy was actually deposited in the patient. With a linear accelerator and corresponding exit detector mounted on the same ring gantry, tomotherapy provides a unique opportunity to accomplish this. Dose reconstruction describes the process in which the full three-dimensional dose actually deposited in a patient is computed. Dose reconstruction requires two inputs: an image of the patient at the time of treatment and the actual energy fluence delivered. Dose is reconstructed by computing the dose in the CT with the verified energy fluence using any model-based algorithm such as convolution/superposition or Monte Carlo. In tomotherapy, the CT at the time of treatment is obtained by megavoltage CT, the merits of which have been studied and proven. The actual energy fluence delivered to the patient is computed in a process called delivery verification. Methods for delivery verification and dose reconstruction in tomotherapy were investigated in this work. It is shown that delivery verification can be realized by a linear model of the tomotherapy system. However, due to the measurements required with this initial approach, clinical implementation would be difficult. Therefore, a clinically viable method for delivery verification was established, the details of which are discussed. With the verified energy fluence from delivery verification, an assessment of the accuracy and usefulness of dose reconstruction is performed. The latter two topics are presented in the context of a generalized dose comparison tool developed for intensity modulated radiation therapy. Finally, the importance of having a CT from the time of treatment for reconstructing the dose is shown. This is currently a point of contention in modern clinical radiotherapy and it is proven that using the incorrect CT for dose reconstruction can lead
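
    The "linear model" idea behind delivery verification can be pictured with a generic least-squares inversion: the exit-detector signal is modeled as a system matrix times the delivered energy fluence, and the fluence is recovered from the measured signal. The matrix, sizes, and noise level below are random filler, not a calibrated tomotherapy model.

      import numpy as np

      rng = np.random.default_rng(3)
      n_detectors, n_fluence_bins = 64, 20
      S = np.abs(rng.normal(size=(n_detectors, n_fluence_bins)))   # stand-in system matrix
      w_true = np.abs(rng.normal(size=n_fluence_bins))             # "actual" energy fluence
      d = S @ w_true + rng.normal(scale=0.01, size=n_detectors)    # noisy exit-detector signal

      w_est, *_ = np.linalg.lstsq(S, d, rcond=None)                # recover the delivered fluence
      print(f"max fluence error: {np.max(np.abs(w_est - w_true)):.4f}")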

  10. Fundamentals of battery dynamics

    NASA Astrophysics Data System (ADS)

    Jossen, Andreas

    Modern applications, such as wireless communication systems or hybrid electric vehicles, operate with high power fluctuations. For some applications, where the power frequencies are high (above some 10 or 100 Hz), it is possible to filter the high frequencies using passive components; yet this results in additional costs. In other applications, where the dynamic time constants range up to some seconds, filtering cannot be done. Batteries are hence operated under dynamic loads. But what happens under these dynamic operating conditions? This paper describes the fundamentals of the dynamic characteristics of batteries in a frequency range from some MHz down to the mHz range. As the dynamic behaviour depends on the actual state of charge (SOC) and the state of health (SOH), it is possible to gain information on the battery state by analysing the dynamic behaviour. High dynamic loads can influence the battery temperature, the battery performance and the battery lifetime.
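
    As a textbook illustration of why battery impedance is frequency dependent (not a model taken from the paper), the sketch below evaluates a simple Randles-type equivalent circuit over the mHz-to-MHz range discussed; all component values are arbitrary examples.

      import numpy as np

      def randles_impedance(f, r_series=0.05, r_ct=0.03, c_dl=1.0):
          """Series resistance plus charge-transfer resistance in parallel with a
          double-layer capacitance (no Warburg element in this sketch)."""
          omega = 2 * np.pi * f
          z_parallel = r_ct / (1 + 1j * omega * r_ct * c_dl)
          return r_series + z_parallel

      freqs = np.logspace(-3, 6, 10)          # 1 mHz to 1 MHz
      for f, z in zip(freqs, randles_impedance(freqs)):
          print(f"{f:9.3e} Hz: |Z| = {abs(z)*1000:6.2f} mOhm, "
                f"phase = {np.degrees(np.angle(z)):6.2f} deg")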