Research on Generating Method of Embedded Software Test Document Based on Dynamic Model
NASA Astrophysics Data System (ADS)
Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying
2018-03-01
This paper presents a dynamic model-based test document generation method for embedded software that automatically generates two documents: the test requirements specification and the configuration item test document. The method captures dynamic test requirements in dynamic models, so that dynamic test requirement traceability can be generated easily. It automatically produces standardized test requirements and test documentation, mitigates inconsistency and incompleteness in document-related content, and improves efficiency.
Model Based Analysis and Test Generation for Flight Software
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep
2009-01-01
We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.
A Model-Based Method for Content Validation of Automatically Generated Test Items
ERIC Educational Resources Information Center
Zhang, Xinxin; Gierl, Mark
2016-01-01
The purpose of this study is to describe a methodology to recover the item model used to generate multiple-choice test items with a novel graph theory approach. Beginning with the generated test items and working backward to recover the original item model provides a model-based method for validating the content used to automatically generate test…
Testing Strategies for Model-Based Development
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.
2006-01-01
This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.
Loss of feed flow, steam generator tube rupture and steam line break thermohydraulic experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendler, O J; Takeuchi, K; Young, M Y
1986-10-01
The Westinghouse Model Boiler No. 2 (MB-2) steam generator test model at the Engineering Test Facility in Tampa, Florida, was reinstrumented and modified for performing a series of tests simulating steam generator accident transients. The transients simulated were: loss of feed flow, steam generator tube rupture, and steam line break events. This document presents a description of (1) the model boiler and the associated test facility, (2) the tests performed, and (3) the analyses of the test results.
Evaluating the Psychometric Characteristics of Generated Multiple-Choice Test Items
ERIC Educational Resources Information Center
Gierl, Mark J.; Lai, Hollis; Pugh, Debra; Touchie, Claire; Boulais, André-Philippe; De Champlain, André
2016-01-01
Item development is a time- and resource-intensive process. Automatic item generation integrates cognitive modeling with computer technology to systematically generate test items. To date, however, items generated using cognitive modeling procedures have received limited use in operational testing situations. As a result, the psychometric…
An Efficient Functional Test Generation Method For Processors Using Genetic Algorithms
NASA Astrophysics Data System (ADS)
Hudec, Ján; Gramatová, Elena
2015-07-01
The paper presents a new functional test generation method for processor testing based on genetic algorithms and evolutionary strategies. The tests are generated over an instruction set architecture and a processor description. Such functional tests belong to software-oriented testing. The quality of the tests is evaluated by code coverage of the processor description using simulation. The presented test generation method uses VHDL models of processors and the professional simulator ModelSim. The rules, parameters and fitness functions were defined for various genetic algorithms used in automatic test generation. Functionality and effectiveness were evaluated using the RISC-type processor DP32.
Modal Survey of ETM-3, A 5-Segment Derivative of the Space Shuttle Solid Rocket Booster
NASA Technical Reports Server (NTRS)
Nielsen, D.; Townsend, J.; Kappus, K.; Driskill, T.; Torres, I.; Parks, R.
2005-01-01
The complex interactions between internal motor generated pressure oscillations and motor structural vibration modes associated with the static test configuration of a Reusable Solid Rocket Motor have potential to generate significant dynamic thrust loads in the 5-segment configuration (Engineering Test Motor 3). Finite element model load predictions for worst-case conditions were generated based on extrapolation of a previously correlated 4-segment motor model. A modal survey was performed on the largest rocket motor to date, Engineering Test Motor #3 (ETM-3), to provide data for finite element model correlation and validation of model generated design loads. The modal survey preparation included pretest analyses to determine an efficient analysis set selection using the Effective Independence Method and test simulations to assure critical test stand component loads did not exceed design limits. Historical Reusable Solid Rocket Motor modal testing, ETM-3 test analysis model development and pre-test loads analyses, as well as test execution, and a comparison of results to pre-test predictions are discussed.
Model-Based GUI Testing Using Uppaal at Novo Nordisk
NASA Astrophysics Data System (ADS)
Hjort, Ulrik H.; Illum, Jacob; Larsen, Kim G.; Petersen, Michael A.; Skou, Arne
This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML state machine model and generates a test suite satisfying some testing criterion, such as edge or state coverage, and converts the individual test cases into a scripting language that can be automatically executed against the target. The tool has significantly reduced the time required for test construction and generation, and reduced the number of test scripts while increasing the coverage.
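The abstract does not describe the tool's internals, but the core step of deriving an edge-covering test suite from a state machine and emitting it as executable script commands can be sketched briefly. The following Python sketch is illustrative only; the GUI state machine, the greedy coverage strategy, and the script syntax are invented for the example and are not the Novo Nordisk tool.

```python
from collections import deque

# Hypothetical GUI state machine: (state, event) -> next_state
transitions = {
    ("Home", "menu"):     "Menu",
    ("Menu", "select"):   "Settings",
    ("Menu", "back"):     "Home",
    ("Settings", "back"): "Menu",
}

def edge_covering_paths(start="Home"):
    """Build event sequences from `start` until every transition (edge)
    has been exercised at least once (edge coverage)."""
    uncovered = set(transitions)
    suite = []
    while uncovered:
        # Breadth-first search for a short path that reaches an uncovered edge.
        queue, seen, found = deque([(start, [])]), {start}, None
        while queue and found is None:
            state, path = queue.popleft()
            for (s, event), nxt in transitions.items():
                if s != state:
                    continue
                new_path = path + [(s, event, nxt)]
                if (s, event) in uncovered:
                    found = new_path
                    break
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, new_path))
        if found is None:
            break  # remaining edges are unreachable from the start state
        suite.append(found)
        uncovered -= {(s, e) for s, e, _ in found}
    return suite

def to_script(path):
    """Render one abstract test case as lines of a made-up scripting language."""
    return "\n".join(f"press '{event}'   # expect screen '{nxt}'"
                     for _, event, nxt in path)

for i, case in enumerate(edge_covering_paths(), 1):
    print(f"-- test case {i} --")
    print(to_script(case))
```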
Application for managing model-based material properties for simulation-based engineering
Hoffman, Edward L [Alameda, CA
2009-03-03
An application for generating a property set associated with a constitutive model of a material includes a first program module adapted to receive test data associated with the material and to extract loading conditions from the test data. A material model driver is adapted to receive the loading conditions and a property set and operable in response to the loading conditions and the property set to generate a model response for the material. A numerical optimization module is adapted to receive the test data and the model response and operable in response to the test data and the model response to generate the property set.
Path generation algorithm for UML graphic modeling of aerospace test software
NASA Astrophysics Data System (ADS)
Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao
2018-03-01
Traditionally, aerospace software test engineers rely on their own experience and on communication with software developers to describe the software under test and to write test cases by hand, which is time-consuming, inefficient, and prone to gaps. With the high-reliability model-based testing (MBT) tools developed by our company, a single modeling pass can automatically generate test case documents efficiently and accurately. Accurately describing a process with a UML model depends on the paths that can be reached through it, and existing path generation algorithms are either too simple, unable to combine branch paths and paths with loops, or so elaborate that they produce meaningless path arrangements that are superfluous for aerospace software testing. Drawing on our aerospace engineering experience, we have tailor-developed a path generation algorithm for UML graphic models of aerospace test software.
Formal methods for test case generation
NASA Technical Reports Server (NTRS)
Rushby, John (Inventor); De Moura, Leonardo Mendonga (Inventor); Hamon, Gregoire (Inventor)
2011-01-01
The invention relates to the use of model checkers to generate efficient test sets for hardware and software systems. The method provides for extending existing tests to reach new coverage targets; searching *to* some or all of the uncovered targets in parallel; searching in parallel *from* some or all of the states reached in previous tests; and slicing the model relative to the current set of coverage targets. The invention provides efficient test case generation and test set formation. Deep regions of the state space can be reached within allotted time and memory. The approach has been applied to use of the model checkers of SRI's SAL system and to model-based designs developed in Stateflow. Stateflow models achieving complete state and transition coverage in a single test case are reported.
The Role of Item Models in Automatic Item Generation
ERIC Educational Resources Information Center
Gierl, Mark J.; Lai, Hollis
2012-01-01
Automatic item generation represents a relatively new but rapidly evolving research area where cognitive and psychometric theories are used to produce tests that include items generated using computer technology. Automatic item generation requires two steps. First, test development specialists create item models, which are comparable to templates…
Test Generator for MATLAB Simulations
NASA Technical Reports Server (NTRS)
Henry, Joel
2011-01-01
MATLAB Automated Test Tool, version 3.0 (MATT 3.0) is a software package that provides automated tools that reduce the time needed for extensive testing of simulation models that have been constructed in the MATLAB programming language by use of the Simulink and Real-Time Workshop programs. MATT 3.0 runs on top of the MATLAB engine application-program interface to communicate with the Simulink engine. MATT 3.0 automatically generates source code from the models, generates custom input data for testing both the models and the source code, and generates graphs and other presentations that facilitate comparison of the outputs of the models and the source code for the same input data. Context-sensitive and fully searchable help is provided in HyperText Markup Language (HTML) format.
Universal Verification Methodology Based Register Test Automation Flow.
Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu
2016-05-01
In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we propose a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models into a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe register specifications in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. We also automate all the steps involved in integrating the test-bench and generating test cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test case. With the proposed flow, designers can save a considerable amount of time when verifying the functionality of registers.
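As a rough illustration of the spreadsheet-to-IP-XACT step described above, the sketch below converts a few CSV rows into a simplified IP-XACT-like register description. The column names, the absence of namespaces, and the reduced element set are assumptions made for the example; a real flow would follow the full IEEE 1685 schema and the expectations of the vendor tool.

```python
import csv, io
import xml.etree.ElementTree as ET

# Hypothetical spreadsheet export: one row per register
csv_text = """name,offset,size,access,reset
CTRL,0x00,32,read-write,0x00000000
STATUS,0x04,32,read-only,0x00000001
"""

def rows_to_ipxact(rows, block_name="regs"):
    """Build a simplified, namespace-free stand-in for an IP-XACT addressBlock."""
    block = ET.Element("addressBlock")
    ET.SubElement(block, "name").text = block_name
    for row in rows:
        reg = ET.SubElement(block, "register")
        ET.SubElement(reg, "name").text = row["name"]
        ET.SubElement(reg, "addressOffset").text = row["offset"]
        ET.SubElement(reg, "size").text = row["size"]
        ET.SubElement(reg, "access").text = row["access"]
        reset = ET.SubElement(reg, "reset")
        ET.SubElement(reset, "value").text = row["reset"]
    return ET.tostring(block, encoding="unicode")

rows = list(csv.DictReader(io.StringIO(csv_text)))
print(rows_to_ipxact(rows))
```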
Model-Driven Test Generation of Distributed Systems
NASA Technical Reports Server (NTRS)
Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin
2012-01-01
This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.
Test-Case Generation using an Explicit State Model Checker Final Report
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Gao, Jimin
2003-01-01
In the project 'Test-Case Generation using an Explicit State Model Checker' we have extended an existing tools infrastructure for formal modeling to export Java code so that we can use the NASA Ames tool Java Pathfinder (JPF) for test case generation. We have completed a translator from our source language RSML-e to Java and conducted initial studies of how JPF can be used as a testing tool. In this final report, we provide a detailed description of the translation approach as implemented in our tools.
Test Input Generation for Red-Black Trees using Abstraction
NASA Technical Reports Server (NTRS)
Visser, Willem; Pasareanu, Corina S.; Pelanek, Radek
2005-01-01
We consider the problem of test input generation for code that manipulates complex data structures. Test inputs are sequences of method calls from the data structure interface. We describe test input generation techniques that rely on state matching to avoid generation of redundant tests. Exhaustive techniques use explicit state model checking to explore all the possible test sequences up to predefined input sizes. Lossy techniques rely on abstraction mappings to compute and store abstract versions of the concrete states; they explore under-approximations of all the possible test sequences. We have implemented the techniques on top of the Java PathFinder model checker and we evaluate them using a Java implementation of red-black trees.
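The exhaustive-with-state-matching idea is easy to prototype outside a model checker. The sketch below enumerates insert/delete call sequences up to a fixed length against a stand-in container (a Python frozenset instead of an actual red-black tree) and prunes sequences whose abstracted state has already been visited; the abstraction used here (the set of stored keys) is only a placeholder for the tree-shape abstractions used in the paper.

```python
from collections import deque

VALUES = (1, 2, 3)          # small input domain
MAX_CALLS = 4               # bound on test sequence length

def apply_call(call, state):
    """Apply one interface call to an immutable stand-in state (a frozenset of keys)."""
    op, v = call
    return state | {v} if op == "insert" else state - {v}

def generate_tests():
    tests, visited = [], {frozenset()}
    queue = deque([((), frozenset())])      # (call sequence, concrete state)
    while queue:
        seq, state = queue.popleft()
        if len(seq) == MAX_CALLS:
            continue
        for op in ("insert", "delete"):
            for v in VALUES:
                call = (op, v)
                new_state = apply_call(call, state)
                abstract = new_state        # placeholder abstraction mapping
                if abstract in visited:
                    continue                # state matching: prune redundant test
                visited.add(abstract)
                new_seq = seq + (call,)
                tests.append(new_seq)
                queue.append((new_seq, new_state))
    return tests

for t in generate_tests():
    print(" -> ".join(f"{op}({v})" for op, v in t))
```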
Renewable Energy Generation and Storage Models | Grid Modernization | NREL
Projects: Generator, Plant, and Storage Modeling, Simulation, and Validation; Power Hardware-in-the-Loop Testing. NREL researchers are developing combined software-and-hardware simulation testing methods known as power hardware-in-the-loop testing.
Test pattern generation for ILA sequential circuits
NASA Technical Reports Server (NTRS)
Feng, YU; Frenzel, James F.; Maki, Gary K.
1993-01-01
An efficient method of generating test patterns for sequential machines implemented using one-dimensional, unilateral, iterative logic arrays (ILA's) of BTS pass transistor networks is presented. Based on a transistor level fault model, the method affords a unique opportunity for real-time fault detection with improved fault coverage. The resulting test sets are shown to be equivalent to those obtained using conventional gate level models, thus eliminating the need for additional test patterns. The proposed method advances the simplicity and ease of the test pattern generation for a special class of sequential circuitry.
Operator's Manual for Waveform Generator Model RPG-6236-A
DOT National Transportation Integrated Search
1988-02-01
The waveform generator, described in this manual, provides a reference signal standard for use in testing the performance of crash test data acquisition systems. During the test, the waveform generator provides the signal inputs to the data acquisiti...
Phase I Experimental Testing of a Generic Submarine Model in the DSTO Low Speed Wind Tunnel
2012-07-01
used during the tests, along with the test methodology. A subset of the data gathered is also presented and briefly discussed. Overall, the force... total pressure probe when positioned close to the model. Results: selected results from the testing of the generic submarine model in the... Appendix B summarises the test conditions. Smoke Generator and Probe: an Aerotech smoke generator and probe were used for visualisation of
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia, Marie-Paule, E-mail: marie-paule.garcia@univ-brest.fr; Villoing, Daphnée; McKay, Erin
Purpose: The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. Methods: The TestDose software allows handling the whole pipeline from virtual patient generation to resulting planar and SPECT images and dosimetry calculations. The originality of their approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores user’s imaging requirements and generates automatically command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Results: Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutical treatment was implemented using the 4D XCAT model. Whole-body “step and shoot” acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry computation performed on the ICRP 110 model is also presented. Conclusions: The proposed platform offers a generic framework to implement any scintigraphic imaging protocols and voxel/organ-based dosimetry computation. Thanks to the modular nature of TestDose, other imaging modalities could be supported in the future such as positron emission tomography.
Garcia, Marie-Paule; Villoing, Daphnée; McKay, Erin; Ferrer, Ludovic; Cremonesi, Marta; Botta, Francesca; Ferrari, Mahila; Bardiès, Manuel
2015-12-01
The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. The TestDose software allows handling the whole pipeline from virtual patient generation to resulting planar and SPECT images and dosimetry calculations. The originality of their approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit gate offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on gate to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores user's imaging requirements and generates automatically command files used as input for gate. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant gate input files are generated for the virtual patient model and associated pharmacokinetics. Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutical treatment was implemented using the 4D XCAT model. Whole-body "step and shoot" acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry computation performed on the ICRP 110 model is also presented. The proposed platform offers a generic framework to implement any scintigraphic imaging protocols and voxel/organ-based dosimetry computation. Thanks to the modular nature of TestDose, other imaging modalities could be supported in the future such as positron emission tomography.
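The per-compartment simulation and pharmacokinetic weighting step described in these two abstracts lends itself to a very small numerical sketch. In the example below, each compartment's projection is simulated only once (random images stand in for GATE output) and the final image at a given time point is the sum of projections weighted by that compartment's time-activity value; the compartment names and kinetics are invented for illustration and are not the Octreoscan™ data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for per-compartment projections simulated once (e.g., by GATE)
compartments = {"liver":     rng.random((64, 64)),
                "kidneys":   rng.random((64, 64)),
                "remainder": rng.random((64, 64))}

def activity(compartment, t_hours):
    """Toy mono-exponential time-activity curves (MBq), purely illustrative."""
    a0, half_life = {"liver":     (50.0,  8.0),
                     "kidneys":   (30.0,  4.0),
                     "remainder": (20.0, 12.0)}[compartment]
    return a0 * 0.5 ** (t_hours / half_life)

def image_at(t_hours):
    """Aggregate weighted compartment projections into one planar image."""
    return sum(activity(name, t_hours) * projection
               for name, projection in compartments.items())

for t in (1, 4, 24):
    img = image_at(t)
    print(f"t = {t:>2} h  total counts (arbitrary units): {img.sum():.1f}")
```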
MATTS- A Step Towards Model Based Testing
NASA Astrophysics Data System (ADS)
Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.
2016-08-01
In this paper we describe a Model Based approach to testing of on-board software and compare it with traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity driven by the need for autonomy and serious application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications. To solve the problems addressed above the software engineering process has to be improved at least for two aspects: 1) Software design and 2) Software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time and resource consuming activity in the software development process. Generating a short, but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can also be partially automated. The basic idea behind the presented study was to start from a formal model (e.g. State Machines), generate abstract test cases which are then converted to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to an on-board software. Results were collected and evaluated wrt. applicability, cost-efficiency, effectiveness at fault finding, and scalability.
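A minimal sketch of the abstract-to-concrete step, under the assumption that the formal model is a Mealy-style state machine: an abstract test case is a sequence of stimuli, and the concrete test case pairs each stimulus with the output the on-board software is expected to produce. The machine below is invented for illustration and is not the satellite software model from the study.

```python
# Hypothetical Mealy machine: (state, input) -> (next_state, expected_output)
MODEL = {
    ("IDLE",    "cmd_start"):  ("RUNNING", "ack"),
    ("RUNNING", "cmd_status"): ("RUNNING", "telemetry"),
    ("RUNNING", "cmd_stop"):   ("IDLE",    "ack"),
}

def concretize(abstract_case, initial="IDLE"):
    """Turn an abstract test case (a list of inputs) into concrete
    (input, expected_output) pairs by walking the model."""
    state, concrete = initial, []
    for stimulus in abstract_case:
        next_state, output = MODEL[(state, stimulus)]
        concrete.append((stimulus, output))
        state = next_state
    return concrete

# Abstract test case generated, for example, for transition coverage
abstract = ["cmd_start", "cmd_status", "cmd_stop"]
for stimulus, expected in concretize(abstract):
    print(f"send {stimulus:<12} expect {expected}")
```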
NASA Astrophysics Data System (ADS)
Boakye-Boateng, Nasir Abdulai
The growing demand for wind power integration into the generation mix prompts the need to subject these systems to stringent performance requirements. This study sought to identify the required tools and procedures needed to perform real-time simulation studies of Doubly-Fed Induction Generator (DFIG) based wind generation systems as a basis for performing more practical tests of reliability and performance for both grid-connected and islanded wind generation systems. The author focused on developing a platform for wind generation studies and, in addition, tested the performance of two DFIG models on the platform real-time simulation model: an average SimPowerSystems™ DFIG wind turbine, and a detailed DFIG based wind turbine using ARTEMiS™ components. The platform model implemented here consists of a high voltage transmission system with four integrated wind farm models consisting in total of 65 DFIG based wind turbines, and it was developed and tested on OPAL-RT's eMEGASim™ Real-Time Digital Simulator.
Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder
NASA Technical Reports Server (NTRS)
Staats, Matt
2009-01-01
We present work on a prototype tool based on the JavaPathfinder (JPF) model checker for automatically generating tests satisfying the MC/DC code coverage criterion. Using the Eclipse IDE, developers and testers can quickly instrument Java source code with JPF annotations covering all MC/DC coverage obligations, and JPF can then be used to automatically generate tests that satisfy these obligations. The prototype extension to JPF enables various tasks useful in automatic test generation to be performed, such as test suite reduction and execution of generated tests.
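To make the MC/DC criterion concrete: for each condition in a decision, the test suite must contain a pair of inputs that differ only in that condition and flip the decision's outcome. The brute-force sketch below enumerates such independence pairs for an invented three-condition decision; it is adequate only for small decisions and is not how the JPF extension works internally.

```python
from itertools import product

def decision(a, b, c):
    # Invented example decision with three conditions
    return (a or b) and c

def mcdc_pairs(dec, n_conditions):
    """For each condition index, list assignment pairs that differ only in
    that condition and change the decision outcome (independence pairs)."""
    pairs = {i: [] for i in range(n_conditions)}
    for assignment in product((False, True), repeat=n_conditions):
        for i in range(n_conditions):
            flipped = list(assignment)
            flipped[i] = not flipped[i]
            flipped = tuple(flipped)
            # Count each unordered pair once and keep only outcome-flipping pairs
            if assignment < flipped and dec(*assignment) != dec(*flipped):
                pairs[i].append((assignment, flipped))
    return pairs

for cond, plist in mcdc_pairs(decision, 3).items():
    print(f"condition {cond}: independence pairs {plist}")
```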
The new Kuznets cycle: a test of the Easterlin-Wachter-Wachter hypothesis.
Ahlburg, D A
1982-01-01
The aim of this paper is to evaluate the Easterlin-Wachter-Wachter model of the effect of the size of one generation on the size of the succeeding generation. An attempt is made "to identify and test empirically each component of the Easterlin-Wachter-Wachter model..., to show how the components collapse to give a closed demographic model of generation size, and to investigate the impacts of relative cohort size on the economic performance of a cohort." The models derived are then used to generate forecasts of the U.S. birth rate to the year 2050. The results provide support for the major components of the original model. excerpt
NASA Technical Reports Server (NTRS)
Stankovic, Ana V.
2003-01-01
Professor Stankovic will be developing and refining Simulink-based models of the PM alternator and comparing the simulation results with experimental measurements taken from the unit. Her first task is to validate the models using the experimental data. Her next task is to develop alternative control techniques for the application of the Brayton Cycle PM Alternator in a nuclear electric propulsion vehicle. The control techniques will be first simulated using the validated models then tried experimentally with hardware available at NASA. Testing and simulation of a 2 kW PM synchronous generator with diode bridge output is described. The parameters of a synchronous PM generator have been measured and used in simulation. Test procedures have been developed to verify the PM generator model with diode bridge output. Experimental and simulation results are in excellent agreement.
Positive and negative generation effects in source monitoring.
Riefer, David M; Chien, Yuchin; Reimer, Jason F
2007-10-01
Research is mixed as to whether self-generation improves memory for the source of information. We propose the hypothesis that positive generation effects (better source memory for self-generated information) occur in reality-monitoring paradigms, while negative generation effects (better source memory for externally presented information) tend to occur in external source-monitoring paradigms. This hypothesis was tested in an experiment in which participants read or generated words, followed by a memory test for the source of each word (read or generated) and the word's colour. Meiser and Bröder's (2002) multinomial model for crossed source dimensions was used to analyse the data, showing that source memory for generation (reality monitoring) was superior for the generated words, while source memory for word colour (external source monitoring) was superior for the read words. The model also revealed the influence of strong response biases in the data, demonstrating the usefulness of formal modelling when examining generation effects in source monitoring.
Mathematical modeling to predict residential solid waste generation.
Benítez, Sara Ojeda; Lozano-Olvera, Gabriela; Morelos, Raúl Adalberto; Vega, Carolina Armijo de
2008-01-01
One of the challenges faced by waste management authorities is determining the amount of waste generated by households in order to establish waste management systems, as well as trying to charge rates compatible with the principle applied worldwide, and design a fair payment system for households according to the amount of residential solid waste (RSW) they generate. The goal of this research work was to establish mathematical models that correlate the generation of RSW per capita to the following variables: education, income per household, and number of residents. This work was based on data from a study on generation, quantification and composition of residential waste in a Mexican city in three stages. In order to define prediction models, five variables were identified and included in the model. For each waste sampling stage a different mathematical model was developed, in order to find the model that showed the best linear relation to predict residential solid waste generation. Later on, models to explore the combination of included variables and select those which showed a higher R(2) were established. The tests applied were for normality, multicollinearity and heteroskedasticity. Another model, formulated with four variables, was generated and the Durbin-Watson test was applied to it. Finally, a general mathematical model is proposed to predict residential waste generation, which accounts for 51% of the total.
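The kind of linear prediction model described above can be reproduced in outline with an ordinary least-squares fit. The numbers below are fabricated placeholders for the survey data (education, income, residents), so only the procedure, not the coefficients or the R-squared value, is meaningful.

```python
import numpy as np

# Placeholder survey data: [education (years), household income, residents]
X = np.array([[ 9, 4500, 4],
              [12, 6200, 3],
              [16, 9800, 2],
              [ 6, 3100, 5],
              [14, 7400, 3]], dtype=float)
# Placeholder per-capita RSW generation (kg/person/day)
y = np.array([0.82, 0.95, 1.10, 0.70, 1.02])

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

y_hat = A @ coef
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print("intercept and coefficients:", np.round(coef, 4))
print("R^2 on the fitting data:", round(r2, 3))
```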
Multi-Fidelity Framework for Modeling Combustion Instability
2016-07-27
generated from the reduced-domain dataset. Evaluations of the framework are performed based on simplified test problems for a model rocket combustor showing...
Karpušenkaitė, Aistė; Ruzgas, Tomas; Denafas, Gintaras
2018-05-01
The aim of the study was to create a hybrid forecasting method that could produce higher accuracy forecasts than previously used 'pure' time series methods. The mentioned methods were already tested with total automotive waste, hazardous automotive waste, and total medical waste generation, but demonstrated at least a 6% error rate in different cases, and efforts were made to decrease it even more. The newly developed hybrid models used a random start generation method to incorporate the advantages of different time-series methods, which helped to increase the accuracy of forecasts by 3%-4% in the hazardous automotive waste and total medical waste generation cases; the new model did not increase the accuracy of total automotive waste generation forecasts. The developed models' abilities to produce short- and mid-term forecasts were tested using different prediction horizons.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Majumdar, S.
1997-02-01
Available models for predicting failure of flawed and unflawed steam generator tubes under normal operating, accident, and severe accident conditions are reviewed. Tests conducted in the past, though limited, tended to show that the earlier flow-stress model for part-through-wall axial cracks overestimated the damaging influence of deep cracks. This observation was confirmed by further tests at high temperatures, as well as by finite-element analysis. A modified correlation for deep cracks can correct this shortcoming of the model. Recent tests have shown that lateral restraint can significantly increase the failure pressure of tubes with unsymmetrical circumferential cracks. This observation was confirmed by finite-element analysis. The rate-independent flow stress models that are successful at low temperatures cannot predict the rate-sensitive failure behavior of steam generator tubes at high temperatures. Therefore, a creep rupture model for predicting failure was developed and validated by tests under various temperature and pressure loadings that can occur during postulated severe accidents.
A Path to an Instructional Science: Data-Generated vs. Postulated Models
ERIC Educational Resources Information Center
Gropper, George L.
2016-01-01
Psychological testing can serve as a prototype on which to base a data-generated approach to instructional design. In "testing batteries" tests are used to predict achievement. In the proposed approach batteries of prescriptions would be used to produce achievement. In creating "test batteries" tests are selected for their…
Generating Focused Molecule Libraries for Drug Discovery with Recurrent Neural Networks
2017-01-01
In de novo drug design, computational strategies are used to generate novel molecules with good affinity to the desired biological target. In this work, we show that recurrent neural networks can be trained as generative models for molecular structures, similar to statistical language models in natural language processing. We demonstrate that the properties of the generated molecules correlate very well with the properties of the molecules used to train the model. In order to enrich libraries with molecules active toward a given biological target, we propose to fine-tune the model with small sets of molecules, which are known to be active against that target. Against Staphylococcus aureus, the model reproduced 14% of 6051 hold-out test molecules that medicinal chemists designed, whereas against Plasmodium falciparum (Malaria), it reproduced 28% of 1240 test molecules. When coupled with a scoring function, our model can perform the complete de novo drug design cycle to generate large sets of novel molecules for drug discovery. PMID:29392184
A Micro-Computer Model for Army Air Defense Training.
1985-03-01
generator. The period is 32763 numbers generated before a repetitive sequence is encountered on the development system. Chi-squared tests for frequency... positions in the test array. This was done with several different random number seeds. In each case 32763 random numbers were generated before a...
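The period and frequency checks mentioned in this report are straightforward to reproduce for any small pseudo-random generator. The sketch below measures the period of a deliberately weak linear congruential generator and runs a simple chi-squared frequency test over ten bins; the generator constants are arbitrary examples and are not those of the original development system.

```python
def lcg(seed, a=1103, c=12345, m=32768):
    """A deliberately small linear congruential generator (illustrative constants)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

def period(seed, limit=200000):
    """Count how many numbers are generated before the sequence repeats."""
    seen = {}
    for i, x in enumerate(lcg(seed)):
        if x in seen:
            return i - seen[x]
        seen[x] = i
        if i > limit:
            return None

def chi_squared_frequency(seed, n=32768, bins=10, m=32768):
    """Chi-squared statistic for equal-frequency bins over n generated numbers."""
    counts = [0] * bins
    gen = lcg(seed)
    for _ in range(n):
        counts[next(gen) * bins // m] += 1
    expected = n / bins
    return sum((c - expected) ** 2 / expected for c in counts)

print("measured period:", period(1))
print("chi-squared statistic (9 degrees of freedom):",
      round(chi_squared_frequency(1), 2))
```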
Fast solver for large scale eddy current non-destructive evaluation problems
NASA Astrophysics Data System (ADS)
Lei, Naiguang
Eddy current testing plays a very important role in non-destructive evaluations of conducting test samples. Based on Faraday's law, an alternating magnetic field source generates induced currents, called eddy currents, in an electrically conducting test specimen. The eddy currents generate induced magnetic fields that oppose the direction of the inducing magnetic field in accordance with Lenz's law. In the presence of discontinuities in material property or defects in the test specimen, the induced eddy current paths are perturbed and the associated magnetic fields can be detected by coils or magnetic field sensors, such as Hall elements or magneto-resistance sensors. Due to the complexity of the test specimen and the inspection environments, the availability of theoretical simulation models is extremely valuable for studying the basic field/flaw interactions in order to obtain a fuller understanding of non-destructive testing phenomena. Theoretical models of the forward problem are also useful for training and validation of automated defect detection systems. Theoretical models generate defect signatures that are expensive to replicate experimentally. In general, modelling methods can be classified into two categories: analytical and numerical. Although analytical approaches offer closed-form solutions, these are generally not obtainable, largely due to the complex sample and defect geometries, especially in three-dimensional space. Numerical modelling has become popular with advances in computer technology and computational methods. However, due to the huge time consumption in the case of large scale problems, accelerations/fast solvers are needed to enhance numerical models. This dissertation describes a numerical simulation model for eddy current problems using finite element analysis. Validation of the accuracy of this model is demonstrated via comparison with experimental measurements of steam generator tube wall defects. These simulations, which generate two-dimensional raster scan data, typically take one to two days on a dedicated eight-core PC. A novel direct integral solver for eddy current problems and a GPU-based implementation are also investigated in this research to reduce the computational time.
ERIC Educational Resources Information Center
Gierl, Mark J.; Lai, Hollis
2013-01-01
Changes to the design and development of our educational assessments are resulting in the unprecedented demand for a large and continuous supply of content-specific test items. One way to address this growing demand is with automatic item generation (AIG). AIG is the process of using item models to generate test items with the aid of computer…
Materials Database Development for Ballistic Impact Modeling
NASA Technical Reports Server (NTRS)
Pereira, J. Michael
2007-01-01
A set of experimental data is being generated under the Fundamental Aeronautics Program Supersonics project to help create and validate accurate computational impact models of jet engine impact events. The data generated will include material property data generated at a range of different strain rates, from 1×10⁻⁴/sec to 5×10⁴/sec, over a range of temperatures. In addition, carefully instrumented ballistic impact tests will be conducted on flat plates and curved structures to provide material and structural response information to help validate the computational models. The material property data and the ballistic impact data will be generated using materials from the same lot, as far as possible. It was found in preliminary testing that the surface finish of test specimens has an effect on measured high strain rate tension response of AL2024. Both the maximum stress and maximum elongation are greater on specimens with a smoother finish. This report gives an overview of the testing that is being conducted and presents results of preliminary testing of the surface finish study.
Ground Reaction Forces Generated During Rhythmical Squats as a Dynamic Loads of the Structure
NASA Astrophysics Data System (ADS)
Pantak, Marek
2017-10-01
Dynamic forces generated by moving persons can lead to excessive vibration of long-span, slender, and lightweight structures such as floors, stairs, stadium stands and footbridges. These dynamic forces are generated during walking, running, jumping and rhythmical body swaying in the vertical or horizontal direction, etc. In this paper, mathematical models of the Ground Reaction Forces (GRFs) generated during squats are presented. The elaborated models were compared to the GRFs measured during laboratory tests carried out by the author over a wide range of frequencies using a force platform. Moreover, the GRF models were evaluated in dynamic numerical analyses and dynamic field tests of an exemplary structure (a steel footbridge).
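Models of this kind are commonly written as a Fourier series superimposed on the static body weight; the sketch below evaluates such a form for a person performing rhythmic squats. The body weight, squat frequency, and harmonic coefficients are invented placeholders and are not the values identified in the paper.

```python
import numpy as np

def grf(t, weight_n=750.0, f_squat=2.0,
        alphas=(0.35, 0.10, 0.04), phases=(0.0, 0.3, 0.6)):
    """Vertical GRF as F(t) = G * (1 + sum_i a_i * sin(2*pi*i*f*t + phi_i))."""
    harmonics = sum(a * np.sin(2 * np.pi * (i + 1) * f_squat * t + phi)
                    for i, (a, phi) in enumerate(zip(alphas, phases)))
    return weight_n * (1.0 + harmonics)

t = np.linspace(0.0, 2.0, 400)          # two seconds of motion
force = grf(t)
print(f"peak force: {force.max():.0f} N, minimum force: {force.min():.0f} N")
```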
Object-Oriented Modeling of an Energy Harvesting System Based on Thermoelectric Generators
NASA Astrophysics Data System (ADS)
Nesarajah, Marco; Frey, Georg
This paper deals with the modeling of an energy harvesting system based on thermoelectric generators (TEGs), and the validation of the model by means of a test bench. TEGs are capable of improving the overall energy efficiency of energy systems, e.g. combustion engines or heating systems, by using the remaining waste heat to generate electrical power. Previously, a component-oriented model of the TEG itself was developed in the Modelica® language. With this model any TEG can be described and simulated given the material properties and the physical dimensions. This model has now been extended with the surrounding components to a complete model of a thermoelectric energy harvesting system. In addition to the TEG, the model contains the cooling system, the heat source, and the power electronics. To validate the simulation model, a test bench was built and installed on an oil-fired household heating system. The paper reports results of the measurements and discusses the validity of the developed simulation models. Furthermore, the efficiency of the proposed energy harvesting system is derived, and possible improvements based on design variations tested in the simulation model are proposed.
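For orientation, the electrical side of such a TEG model reduces to a Seebeck voltage source behind an internal resistance. The short sketch below computes load voltage and delivered power over a sweep of load resistances and reports the matched-load maximum; the Seebeck coefficient, internal resistance, and temperature difference are example values, not parameters of the system modeled in the paper.

```python
import numpy as np

ALPHA = 0.05      # effective module Seebeck coefficient (V/K), example value
R_INT = 2.0       # internal electrical resistance (ohm), example value
DELTA_T = 80.0    # temperature difference across the module (K), example value

def teg_output(r_load):
    """Open-circuit voltage, load current, and delivered power of a simple TEG model."""
    v_oc = ALPHA * DELTA_T
    i = v_oc / (R_INT + r_load)
    return v_oc, i, i ** 2 * r_load

loads = np.linspace(0.5, 8.0, 16)
powers = [teg_output(r)[2] for r in loads]
best = loads[int(np.argmax(powers))]
print(f"max power {max(powers):.2f} W at R_load = {best:.2f} ohm "
      f"(matched-load optimum is R_load = R_INT = {R_INT} ohm)")
```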
2016-10-04
model of 1.24 m with the PGAD and control surface... Design and manufacture of the gust generator (frame, blades, actuation and control system)... In Chapter 3, a gust generator with two rotating blades was designed and manufactured to induce transverse turbulence for wind tunnel tests. A CFD... velocity at 8C (eight times the blade chord length) achieved 1.3%. In Chapter 4, the wind tunnel test of the scaled wing model is presented, including the...
Steiger, Andrea E; Fend, Helmut A; Allemand, Mathias
2015-02-01
The vulnerability model states that low self-esteem functions as a predictor for the development of depressive symptoms whereas the scar model assumes that these symptoms leave scars in individuals resulting in lower self-esteem. Both models have received empirical support, however, they have only been tested within individuals and not across generations (i.e., between family members). Thus, we tested the scope of these competing models by (a) investigating whether the effects hold from adolescence to middle adulthood (long-term vulnerability and scar effects), (b) whether the effects hold across generations (intergenerational vulnerability and scar effects), and (c) whether intergenerational effects are mediated by parental self-esteem and depressive symptoms and parent-child discord. We used longitudinal data from adolescence to middle adulthood (N = 1,359) and from Generation 1 adolescents (G1) to Generation 2 adolescents (G2) (N = 572 parent-child pairs). Results from latent cross-lagged regression analyses demonstrated that both adolescent self-esteem and depressive symptoms were prospectively related to adult self-esteem and depressive symptoms 3 decades later. That is, both the vulnerability and scar models are valid over decades with stronger effects for the vulnerability model. Across generations, we found a substantial direct transmission effect from G1 to G2 adolescent depressive symptoms but no evidence for the proposed intergenerational vulnerability and scar effect or for any of the proposed mediating mechanisms. PsycINFO Database Record (c) 2015 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Żukowicz, Marek; Markiewicz, Michał
2016-09-01
The aim of the article is to present a mathematical definition of the object model known in computer science as TreeList, and to show the application of this model in designing an evolutionary algorithm whose purpose is to generate structures based on this object. The first chapter introduces the reader to the problem of presenting data using the TreeList object. The second chapter describes the problem of testing data structures based on TreeList. The third presents a mathematical model of the TreeList object and the parameters used to determine the utility of structures created through this model and in the evolutionary strategy that generates these structures for testing purposes. The last chapter provides a brief summary and plans for future research related to the algorithm presented in the article.
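The paper defines the model formally; the sketch below only conveys the flavor of the approach. Random tree-like structures (each node holding a list of children, loosely mirroring a TreeList) are evolved toward a target size and depth with a toy fitness function; the fitness criterion and mutation operator are invented stand-ins for the utility parameters described in the article.

```python
import random

random.seed(42)
TARGET_NODES, TARGET_DEPTH = 15, 4

def random_tree(depth=0):
    """A node is simply a list of children; leaves are empty lists."""
    if depth >= TARGET_DEPTH or random.random() < 0.3:
        return []
    return [random_tree(depth + 1) for _ in range(random.randint(1, 3))]

def size(tree):  return 1 + sum(size(c) for c in tree)
def depth(tree): return 1 + max((depth(c) for c in tree), default=0)

def fitness(tree):
    # Smaller is better: distance from the target node count and depth
    return abs(size(tree) - TARGET_NODES) + abs(depth(tree) - TARGET_DEPTH)

def mutate(tree):
    # Occasionally replace subtrees and occasionally add a fresh subtree
    clone = [mutate(c) if random.random() < 0.2 else c for c in tree]
    if random.random() < 0.3:
        clone.append(random_tree(TARGET_DEPTH - 1))
    return clone

population = [random_tree() for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness)
    parents = population[:5]
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]

best = min(population, key=fitness)
print("best structure: size", size(best), "depth", depth(best), "fitness", fitness(best))
```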
Daschewski, M; Kreutzbruck, M; Prager, J
2015-12-01
In this work we experimentally verify the theoretical prediction of the recently published Energy Density Fluctuation Model (EDF-model) of thermo-acoustic sound generation. Particularly, we investigate experimentally the influence of thermal inertia of an electrically conductive film on the efficiency of thermal airborne ultrasound generation predicted by the EDF-model. Unlike widely used theories, the EDF-model predicts that the thermal inertia of the electrically conductive film is a frequency-dependent parameter. Its influence grows non-linearly with the increase of excitation frequency and reduces the efficiency of the ultrasound generation. Thus, this parameter is the major limiting factor for the efficient thermal airborne ultrasound generation in the MHz-range. To verify this theoretical prediction experimentally, five thermo-acoustic emitter samples consisting of Indium-Tin-Oxide (ITO) coatings of different thicknesses (from 65 nm to 1.44 μm) on quartz glass substrates were tested for airborne ultrasound generation in a frequency range from 10 kHz to 800 kHz. For the measurement of thermally generated sound pressures a laser Doppler vibrometer combined with a 12 μm thin polyethylene foil was used as the sound pressure detector. All tested thermo-acoustic emitter samples showed a resonance-free frequency response in the entire tested frequency range. The thermal inertia of the heat producing film acts as a low-pass filter and reduces the generated sound pressure with the increasing excitation frequency and the ITO film thickness. The difference of generated sound pressure levels for samples with 65 nm and 1.44 μm thickness is in the order of about 6 dB at 50 kHz and of about 12 dB at 500 kHz. A comparison of sound pressure levels measured experimentally and those predicted by the EDF-model shows for all tested emitter samples a relative error of less than ±6%. Thus, experimental results confirm the prediction of the EDF-model and show that the model can be applied for design and optimization of thermo-acoustic airborne ultrasound emitters. Copyright © 2015 Elsevier B.V. All rights reserved.
Swales, Henry; Banko, Richard; Coakley, David
2015-06-03
Aquantis 2.5 MW Ocean Current Generation Device, Tow Tank Dynamic Test Rig Drawings and Bill of Materials. This submission contains information on the equipment for the scaled model tow tank testing. The information includes hardware, test protocols, and plans.
ERIC Educational Resources Information Center
Sohr-Preston, Sara L.; Scaramella, Laura V.; Martin, Monica J.; Neppl, Tricia K.; Ontai, Lenna; Conger, Rand
2013-01-01
This third-generation, longitudinal study evaluated a family investment perspective on family socioeconomic status (SES), parental investments in children, and child development. The theoretical framework was tested for first-generation parents (G1), their children (G2), and the children of the second generation (G3). G1 SES was expected to…
ERIC Educational Resources Information Center
Bogiages, Christopher A.; Lotter, Christine
2011-01-01
In their research, scientists generate, test, and modify scientific models. These models can be shared with others and demonstrate a scientist's understanding of how the natural world works. Similarly, students can generate and modify models to gain a better understanding of the content, process, and nature of science (Kenyon, Schwarz, and Hug…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shurupov, A. V.; Zavalova, V. E., E-mail: zavalova@fites.ru; Kozlov, A. V.
The report presents the results of the development and field testing of a mobile test facility based on a helical magnetic cumulative generator (MCGTF). The system is designed for full-scale modeling of lightning currents to study the safety of power plants of any type, including nuclear power plants. Advanced technologies of high-energy physics for solving both engineering and applied problems underlie this pilot project. The energy from the magnetic cumulative generator (MCG) is transferred to a high-impedance load with a high efficiency of more than 50% using pulse transformer coupling. Modeling of the dynamics of the MCG operating in a circuit with lumped parameters allows one to apply the law of inductance output during operation of the MCG, thus providing the required front of the current pulse in the load without using any switches. The results of field testing of the MCGTF are presented for both the ground loop and the model load. The ground loop generates a load resistance of 2–4 Ω. In the tests, the ohmic resistance of the model load is 10 Ω. It is shown that the current pulse parameters recorded in the resistive-inductive load are close to the calculated values.
Modeling and Simulation of U-tube Steam Generator
NASA Astrophysics Data System (ADS)
Zhang, Mingming; Fu, Zhongguang; Li, Jinyao; Wang, Mingfei
2018-03-01
This article focuses on the modeling and simulation of a U-tube natural circulation steam generator. The research is based on the simuworks system simulation software platform. By analyzing the structural characteristics and the operating principle of the U-tube steam generator, the model is divided into 14 control volumes, including the primary side, secondary side, down channel and steam plenum, etc. The model depends completely on conservation laws, and it is applied in several simulation tests. The results show that the model is capable of properly simulating the dynamic response of the U-tube steam generator.
NASA Technical Reports Server (NTRS)
Culpepper, William X.; ONeill, Pat; Nicholson, Leonard L.
2000-01-01
An internuclear cascade and evaporation model has been adapted to estimate the LET spectrum generated during testing with 200 MeV protons. The model-generated heavy ion LET spectrum is compared to the heavy ion LET spectrum seen on orbit. This comparison is the basis for predicting single event failure rates from heavy ions using results from a single proton test. Of equal importance, this spectra comparison also establishes an estimate of the risk of encountering a failure mode on orbit that was not detected during proton testing. Verification of the general results of the model is presented based on experiments, individual part test results, and flight data. Acceptance of this model and its estimate of remaining risk opens the hardware verification philosophy to the consideration of radiation testing with high energy protons at the board and box level instead of the more standard method of individual part testing with low energy heavy ions.
Significance Testing in Confirmatory Factor Analytic Models.
ERIC Educational Resources Information Center
Khattab, Ali-Maher; Hocevar, Dennis
Traditionally, confirmatory factor analytic models are tested against a null model of total independence. Using randomly generated factors in a matrix of 46 aptitude tests, this approach is shown to be unlikely to reject even random factors. An alternative null model, based on a single general factor, is suggested. In addition, an index of model…
Action perception as hypothesis testing.
Donnarumma, Francesco; Costantini, Marcello; Ambrosini, Ettore; Friston, Karl; Pezzulo, Giovanni
2017-04-01
We present a novel computational model that describes action perception as an active inferential process that combines motor prediction (the reuse of our own motor system to predict perceived movements) and hypothesis testing (the use of eye movements to disambiguate amongst hypotheses). The system uses a generative model of how (arm and hand) actions are performed to generate hypothesis-specific visual predictions, and directs saccades to the most informative places of the visual scene to test these predictions - and underlying hypotheses. We test the model using eye movement data from a human action observation study. In both the human study and our model, saccades are proactive whenever context affords accurate action prediction; but uncertainty induces a more reactive gaze strategy, via tracking the observed movements. Our model offers a novel perspective on action observation that highlights its active nature based on prediction dynamics and hypothesis testing. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
Functional test generation for digital circuits described with a declarative language: LUSTRE
NASA Astrophysics Data System (ADS)
Almahrous, Mazen
1990-08-01
A functional approach to the test generation problem starting from a high level description is proposed. The circuit under test is modeled using the LUSTRE high-level data-flow description language. The different LUSTRE primitives are translated to a SATAN-format graph in order to evaluate the testability of the circuit and to generate test sequences. Another method is defined for testing complex circuits comprising an operative part and a control part. It consists of checking experiments for the control part observed through the operative part. It was applied to the automata generated from a LUSTRE description of the circuit.
R. Johnson; K. Jayawickrama
2003-01-01
Gains from various orchard strategies were modeled. The scenario tested 2,000 first-generation open-pollinated families, from which orchards of 20 selections were formed, using either parents, progeny or both. This was followed by a second-generation breeding population in which 200 full-sib families were tested, followed by a second-generation orchard of 20 selections....
Gas Generation Testing of Spherical Resorcinol-Formaldehyde (sRF) Resin
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colburn, Heather A.; Bryan, Samuel A.; Camaioni, Donald M.
This report describes gas generation testing of the spherical resorcinol-formaldehyde (sRF) resin that was conducted to support the technology maturation of the LAWPS facility. The current safety basis for the LAWPS facility is based primarily on two studies that had limited or inconclusive data sets. The two studies indicated a 40% increase in hydrogen generation rate of water (as predicted by the Hu model) with sRF resin over water alone. However, the previous studies did not test the range of conditions (process fluids and temperatures) that are expected in the LAWPS facility. Additionally, the previous studies did not obtain replicate test results or comparable liquid-only control samples. All of the testing described in this report, conducted with water, 0.45M nitric acid, and waste simulants with and without sRF resin, returned hydrogen generation rates that are within the current safety basis for the facility of 1.4 times the Hu model output for water.
An Extended IEEE 118-Bus Test System With High Renewable Penetration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pena, Ivonne; Martinez-Anido, Carlo Brancucci; Hodge, Bri-Mathias
This article describes a new publicly available version of the IEEE 118-bus test system, named NREL-118. The database is based on the transmission representation (buses and lines) of the IEEE 118-bus test system, with a reconfigured generation representation using three regions of the US Western Interconnection from the latest Western Electricity Coordinating Council (WECC) 2024 Common Case [1]. Time-synchronous hourly load, wind, and solar time series are provided for over one year (8784 hours). The public database presented and described in this manuscript will allow researchers to model a test power system using detailed transmission, generation, load, wind, and solar data. This database includes key additional features that add to the current IEEE 118-bus test model, such as: the inclusion of 10 generation technologies with different heat rate functions, minimum stable levels and ramping rates, GHG emissions rates, regulation and contingency reserves, and hourly time series data for one full year for load, wind and solar generation.
Modeling Tsunami Wave Generation Using a Two-layer Granular Landslide Model
NASA Astrophysics Data System (ADS)
Ma, G.; Kirby, J. T., Jr.; Shi, F.; Grilli, S. T.; Hsu, T. J.
2016-12-01
Tsunamis can be generated by subaerial or submarine landslides in reservoirs, lakes, fjords, bays and oceans. Compared to seismogenic tsunamis, landslide or submarine mass failure (SMF) tsunamis are normally characterized by relatively shorter wave lengths and stronger wave dispersion, and potentially may generate large wave amplitudes locally and high run-up along adjacent coastlines. Due to a complex interplay between the landslide and tsunami waves, accurate simulation of landslide motion as well as tsunami generation is a challenging task. We develop and test a new two-layer model for granular landslide motion and tsunami wave generation. The landslide is described as a saturated granular flow, accounting for intergranular stresses governed by Coulomb friction. Tsunami wave generation is simulated by the three-dimensional non-hydrostatic wave model NHWAVE, which is capable of capturing wave dispersion efficiently using a small number of discretized vertical levels. Depth-averaged governing equations for the granular landslide are derived in a slope-oriented coordinate system, taking into account the dynamic interaction between the lower-layer granular landslide and upper-layer water motion. The model is tested against laboratory experiments on impulsive wave generation by subaerial granular landslides. Model results illustrate a complex interplay between the granular landslide and tsunami waves, and they reasonably predict not only the tsunami wave generation but also the granular landslide motion from initiation to deposition.
Vessel Noise Affects Beaked Whale Behavior: Results of a Dedicated Acoustic Response Study
2012-08-01
Gaussian models were checked with the Shapiro-Wilk test (normality), the Breusch-Pagan test (heteroscedasticity), and the Durbin-Watson test (independence) on foraging duration. Vessel-generated noise may disrupt behavior: an experiment involving the exposure of target whale groups to intense vessel-generated noise tested how these exposures influenced the foraging behavior of Blainville's beaked whales.
Model verification of large structural systems. [space shuttle model response
NASA Technical Reports Server (NTRS)
Lee, L. T.; Hasselman, T. K.
1978-01-01
A computer program for the application of parameter identification to the structural dynamic models of the space shuttle and other large models with hundreds of degrees of freedom is described. Finite element, dynamic, analytic, and modal models are used to represent the structural system. The interface with math models is such that output from any structural analysis program applied to any structural configuration can be used directly. Processed data from either sine-sweep tests or resonant dwell tests are directly usable. The program uses measured modal data to condition the prior analytic model so as to improve the frequency match between model and test. A Bayesian estimator generates an improved analytical model, and a linear estimator is used in an iterative fashion on the highly nonlinear equations. Mass and stiffness scaling parameters are generated for an improved finite element model, and the optimum set of parameters is obtained in one step.
Using the Kaleidoscope Career Model to Examine Generational Differences in Work Attitudes
ERIC Educational Resources Information Center
Sullivan, Sherry E.; Forret, Monica L.; Carraher, Shawn M.; Mainiero, Lisa A.
2009-01-01
Purpose: The purpose of this paper is to examine, utilising the Kaleidoscope Career Model, whether members of the Baby Boom generation and Generation X differ in their needs for authenticity, balance, and challenge. Design/methodology/approach: Survey data were obtained from 982 professionals located across the USA. Correlations, t-tests, and…
Evaluation of Generation Alternation Models in Evolutionary Robotics
NASA Astrophysics Data System (ADS)
Oiso, Masashi; Matsumura, Yoshiyuki; Yasuda, Toshiyuki; Ohkura, Kazuhiro
For efficient implementation of Evolutionary Algorithms (EA) in a desktop grid computing environment, we propose a new generation alternation model called Grid-Oriented-Deletion (GOD), based on comparison with the conventional techniques. In previous research, generation alternation models have generally been evaluated using test functions; however, their exploration performance on real problems such as Evolutionary Robotics (ER) has not yet been made clear. Therefore we investigate the relationship between the exploration performance of an EA on an ER problem and its generation alternation model. We applied four generation alternation models to Evolutionary Multi-Robotics (EMR), a package-pushing problem, to investigate their exploration performance. The results show that GOD is more effective than the other conventional models.
Development of the mathematical model for design and verification of acoustic modal analysis methods
NASA Astrophysics Data System (ADS)
Siner, Alexander; Startseva, Maria
2016-10-01
To reduce turbofan noise it is necessary to develop methods, collectively called modal analysis, for analyzing the sound field generated by the blade machinery. Because modal analysis methods are difficult, and testing them against full-scale measurements is expensive and tedious, it is necessary to construct mathematical models that allow modal analysis algorithms to be tested quickly and cheaply. In this work, a model is presented that allows single modes to be set in the channel and the generated sound field to be analyzed. Modal analysis of the sound generated by a ring array of point sound sources is performed. A comparison of experimental and numerical modal analysis results is also presented.
Second Generation Crop Yield Models Review
NASA Technical Reports Server (NTRS)
Hodges, T. (Principal Investigator)
1982-01-01
Second generation yield models, including crop growth simulation models and plant process models, may be suitable for large area crop yield forecasting in the yield model development project. Subjective and objective criteria for model selection are defined and models which might be selected are reviewed. Models may be selected to provide submodels as input to other models; for further development and testing; or for immediate testing as forecasting tools. A plant process model may range in complexity from several dozen submodels simulating (1) energy, carbohydrates, and minerals; (2) change in biomass of various organs; and (3) initiation and development of plant organs, to a few submodels simulating key physiological processes. The most complex models cannot be used directly in large area forecasting but may provide submodels which can be simplified for inclusion into simpler plant process models. Both published and unpublished models which may be used for development or testing are reviewed. Several other models, currently under development, may become available at a later date.
DOE Office of Scientific and Technical Information (OSTI.GOV)
West, W.S.
Progress during the period includes completion of the SNAP 7C system tests, completion of safety analysis for the SNAP 7A and C systems, assembly and initial testing of SNAP 7A, assembly of a modified reliability model, and assembly of a 10-W generator. Other activities include completion of thermal and safety analyses for SNAP 7B and D generators and fuel processing for these generators. (J.R.D.)
Automatic 3d Building Model Generations with Airborne LiDAR Data
NASA Astrophysics Data System (ADS)
Yastikli, N.; Cetin, Z.
2017-11-01
LiDAR systems have become more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, three-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems and has major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication, mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications; thus, an approach for automatic 3D building model generation is needed, in a simple and quick way, for the many studies that include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is the aim. An approach is proposed for automatic 3D building model generation that includes automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification uses hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules were performed to improve the classification results using different test areas identified in the study area. The proposed approach has been tested in a study area in Zekeriyakoy, Istanbul, which has partly open areas, forest areas and many types of buildings, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The results obtained for the study area verified that 3D building models can be generated automatically and successfully from raw LiDAR point cloud data.
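As a rough illustration of the kind of hierarchical, rule-based point classification described above, the sketch below labels points as ground, building or vegetation from two derived attributes; the attribute names (height_above_ground, planarity) and all thresholds are illustrative assumptions, not the rules used in the study.

```python
# Minimal sketch of hierarchical, rule-based classification of LiDAR points.
# The point attributes and thresholds are illustrative, not the study's rules.
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float
    z: float
    height_above_ground: float  # metres above the estimated terrain surface
    planarity: float            # 0..1, from a local neighbourhood plane fit

def classify(p: Point) -> str:
    # Rule 1: points close to the terrain are ground.
    if p.height_above_ground < 0.3:
        return "ground"
    # Rule 2: elevated, highly planar points are candidate roofs (buildings).
    if p.height_above_ground > 2.5 and p.planarity > 0.8:
        return "building"
    # Rule 3: elevated but irregular points are treated as vegetation.
    if p.height_above_ground > 0.5:
        return "vegetation"
    return "unclassified"

if __name__ == "__main__":
    cloud = [Point(0, 0, 102.1, 0.1, 0.90),
             Point(5, 3, 110.4, 8.4, 0.93),
             Point(7, 2, 106.0, 4.1, 0.35)]
    print([classify(p) for p in cloud])  # ['ground', 'building', 'vegetation']
```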
Lessons Learned During Instrument Testing for the Thermal Infrared Sensor (TIRS)
NASA Technical Reports Server (NTRS)
Peabody, Hume L.; Otero, Veronica; Neuberger, David
2013-01-01
The Thermal InfraRed Sensor (TIRS) instrument, set to launch on the Landsat Data Continuity Mission in 2013, features a passively cooled telescope and IR detectors which are actively cooled by a two stage cryocooler. In order to proceed to the instrument level test campaign, at least one full functional test was required, necessitating a thermal vacuum test to sufficiently cool the detectors and demonstrate performance. This was fairly unique in that this test occurred before the Pre Environmental Review, but it yielded significant knowledge gains before the planned instrument level test. During the pre-PER test, numerous discrepancies were found between the model and the actual hardware, which were revealed by poor correlation between model predictions and test data. With the inclusion of pseudo-balance points, the test also provided an opportunity to perform a pre-correlation to test data prior to the instrument level test campaign. Various lessons were learned during this test related to modeling and design of both the flight hardware and the Ground Support Equipment and test setup. The lessons learned in the pre-PER test resulted in a better test setup for the instrument level test and the completion of the final instrument model correlation in a shorter period of time. Upon completion of the correlation, the flight predictions were generated, including the full suite of off-nominal cases and some new cases defined by the spacecraft. For some of these new cases, some components revealed limit exceedances, in particular for a portion of the hardware that could not be tested due to its size and chamber limitations. Further lessons were learned during the completion of flight predictions. With a correlated detailed instrument model, significant efforts were made to generate a reduced model suitable for observatory level analyses. This proved a major effort, both to generate an appropriate network and to convert the final model to the required format, and yielded additional lessons learned. In spite of all the challenges encountered by TIRS, the instrument was successfully delivered to the spacecraft and will soon be tested at observatory level in preparation for a successful mission launch.
Generating Test Templates via Automated Theorem Proving
NASA Technical Reports Server (NTRS)
Kancherla, Mani Prasad
1997-01-01
Testing can be used during the software development process to maintain fidelity between evolving specifications, program designs, and code implementations. We use a form of specification-based testing that employs an automated theorem prover to generate test templates. A similar approach was developed using a model checker on state-intensive systems. This method applies to systems with functional rather than state-based behaviors. The approach allows incomplete specifications to be used to aid in the generation of tests for potential failure cases. We illustrate the technique on the canonical triangle testing problem and discuss its use in the analysis of a spacecraft scheduling system.
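To make the template idea concrete, here is a minimal sketch of specification-based test-template generation for the triangle problem mentioned above: each template is a predicate derived from the specification, and a witness test case is found by brute-force search, which stands in for the theorem prover used in the paper.

```python
# Minimal sketch of test-template generation for triangle classification.
# Each template is a predicate on (a, b, c); instantiate() finds a witness by
# simple enumeration, standing in for the theorem prover / model finder.
from itertools import product

def valid(a, b, c):
    return a + b > c and a + c > b and b + c > a

templates = {
    "equilateral":    lambda a, b, c: valid(a, b, c) and a == b == c,
    "isosceles":      lambda a, b, c: valid(a, b, c) and (a == b or b == c or a == c)
                                      and not a == b == c,
    "scalene":        lambda a, b, c: valid(a, b, c) and a != b and b != c and a != c,
    "not_a_triangle": lambda a, b, c: not valid(a, b, c),
}

def instantiate(pred, bound=10):
    """Return the first (a, b, c) in a small search space satisfying pred."""
    for a, b, c in product(range(1, bound + 1), repeat=3):
        if pred(a, b, c):
            return a, b, c
    return None

if __name__ == "__main__":
    for name, pred in templates.items():
        print(name, "->", instantiate(pred))
```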
Xenograft model for therapeutic drug testing in recurrent respiratory papillomatosis.
Ahn, Julie; Bishop, Justin A; Akpeng, Belinda; Pai, Sara I; Best, Simon R A
2015-02-01
Identifying effective treatment for papillomatosis is limited by a lack of animal models, and there is currently no preclinical model for testing potential therapeutic agents. We hypothesized that xenografting of papilloma may facilitate in vivo drug testing to identify novel treatment options. A biopsy of fresh tracheal papilloma was xenografted into a NOD-scid-IL2Rgamma(null) (NSG) mouse. The xenograft began growing after 5 weeks and was serially passaged over multiple generations. Each generation showed a consistent log-growth pattern, and in all xenografts, the presence of the human papillomavirus (HPV) genome was confirmed by polymerase chain reaction (PCR). Histopathologic analysis demonstrated that the squamous architecture of the original papilloma was maintained in each generation. In vivo drug testing with bevacizumab (5 mg/kg i.p. twice weekly for 3 weeks) showed a dramatic therapeutic response compared to saline control. We report here the first successful case of serial xenografting of a tracheal papilloma in vivo with a therapeutic response observed with drug testing. In severely immunocompromised mice, the HPV genome and squamous differentiation of the papilloma can be maintained for multiple generations. This is a feasible approach to identify therapeutic agents in the treatment of recurrent respiratory papillomatosis. © The Author(s) 2014.
Development and optimization of a stove-powered thermoelectric generator
NASA Astrophysics Data System (ADS)
Mastbergen, Dan
Almost a third of the world's population still lacks access to electricity. Most of these people use biomass stoves for cooking which produce significant amounts of wasted thermal energy, but no electricity. Less than 1% of this energy in the form of electricity would be adequate for basic tasks such as lighting and communications. However, an affordable and reliable means of accomplishing this is currently nonexistent. The goal of this work is to develop a thermoelectric generator to convert a small amount of wasted heat into electricity. Although this concept has been around for decades, previous attempts have failed due to insufficient analysis of the system as a whole, leading to ineffective and costly designs. In this work, a complete design process is undertaken including concept generation, prototype testing, field testing, and redesign/optimization. Detailed component models are constructed and integrated to create a full system model. The model encompasses the stove operation, thermoelectric module, heat sinks, charging system and battery. A 3000 cycle endurance test was also conducted to evaluate the effects of operating temperature, module quality, and thermal interface quality on the generator's reliability, lifetime and cost effectiveness. The results from this testing are integrated into the system model to determine the lowest system cost in $/Watt over a five year period. Through this work the concept of a stove-based thermoelectric generator is shown to be technologically and economically feasible. In addition, a methodology is developed for optimizing the system for specific regional stove usage habits.
Validating EHR documents: automatic schematron generation using archetypes.
Pfeiffer, Klaus; Duftschmid, Georg; Rinner, Christoph
2014-01-01
The goal of this study was to examine whether Schematron schemas can be generated from archetypes. The openEHR Java reference API was used to transform an archetype into an object model, which was then extended with context elements. The model was processed and the constraints were transformed into corresponding Schematron assertions. A prototype of the generator for the reference model HL7 v3 CDA R2 was developed and successfully tested. Preconditions for its reusability with other reference models were set. Our results indicate that an automated generation of Schematron schemas is possible with some limitations.
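A minimal sketch of the core transformation, constraints into Schematron assertions, is given below; the constraint tuples and XPath contexts are invented examples and are far simpler than the openEHR archetype and CDA structures handled by the actual generator.

```python
# Minimal sketch of turning constraint tuples into a Schematron schema.
# The contexts, tests and messages are illustrative placeholders only.
from xml.sax.saxutils import escape, quoteattr

constraints = [
    # (context XPath, test XPath, human-readable message)
    ("/ClinicalDocument/recordTarget", "count(patientRole) = 1",
     "Exactly one patientRole is required."),
    ("//observation/value", "@unit = 'mmHg'",
     "Blood pressure values must be given in mmHg."),
]

def to_schematron(rules):
    lines = ['<schema xmlns="http://purl.oclc.org/dsdl/schematron">', '  <pattern>']
    for context, test, message in rules:
        lines.append(f'    <rule context={quoteattr(context)}>')
        lines.append(f'      <assert test={quoteattr(test)}>{escape(message)}</assert>')
        lines.append('    </rule>')
    lines += ['  </pattern>', '</schema>']
    return "\n".join(lines)

if __name__ == "__main__":
    print(to_schematron(constraints))
```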
Nonlinear model for offline correction of pulmonary waveform generators.
Reynolds, Jeffrey S; Stemple, Kimberly J; Petsko, Raymond A; Ebeling, Thomas R; Frazer, David G
2002-12-01
Pulmonary waveform generators consisting of motor-driven piston pumps are frequently used to test respiratory-function equipment such as spirometers and peak expiratory flow (PEF) meters. Gas compression within these generators can produce significant distortion of the output flow-time profile. A nonlinear model of the generator was developed along with a method to compensate for gas compression when testing pulmonary function equipment. The model and correction procedure were tested on an Assess Full Range PEF meter and a Micro DiaryCard PEF meter. The tests were performed using the 26 American Thoracic Society standard flow-time waveforms as the target flow profiles. Without correction, the pump loaded with the higher resistance Assess meter resulted in ten waveforms having a mean square error (MSE) higher than 0.001 L²/s². Correction of the pump for these ten waveforms resulted in a mean decrease in MSE of 87.0%. When loaded with the Micro DiaryCard meter, the uncorrected pump outputs included six waveforms with MSE higher than 0.001 L²/s². Pump corrections for these six waveforms resulted in a mean decrease in MSE of 58.4%.
Parametric analysis of ATM solar array.
NASA Technical Reports Server (NTRS)
Singh, B. K.; Adkisson, W. B.
1973-01-01
The paper discusses the methods used for the calculation of ATM solar array performance characteristics and provides the parametric analysis of solar panels used in SKYLAB. To predict the solar array performance under conditions other than test conditions, a mathematical model has been developed. Four computer programs have been used to convert the solar simulator test data to the parametric curves. The first performs module summations, the second determines average solar cell characteristics which will cause a mathematical model to generate a curve matching the test data, the third is a polynomial fit program which determines the polynomial equations for the solar cell characteristics versus temperature, and the fourth program uses the polynomial coefficients generated by the polynomial curve fit program to generate the parametric data.
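The polynomial-fit step performed by the third program can be illustrated with a short sketch: fit low-order polynomials of solar-cell characteristics versus temperature and evaluate them to produce parametric curves. The sample temperatures, currents and voltages below are invented placeholders, not SKYLAB test data.

```python
# Minimal sketch of fitting solar-cell characteristics as polynomials in
# temperature and evaluating them to generate parametric curves.
# The sample data are invented, not solar-simulator test values.
import numpy as np

temp = np.array([-50.0, 0.0, 25.0, 60.0, 100.0])  # cell temperature, deg C
isc  = np.array([2.31, 2.38, 2.42, 2.48, 2.55])   # short-circuit current, A
voc  = np.array([0.68, 0.62, 0.59, 0.55, 0.50])   # open-circuit voltage, V

# Fit 2nd-order polynomials Isc(T) and Voc(T).
isc_coeff = np.polyfit(temp, isc, 2)
voc_coeff = np.polyfit(temp, voc, 2)

def cell_characteristics(T):
    """Evaluate the fitted characteristics at temperature T (deg C)."""
    return np.polyval(isc_coeff, T), np.polyval(voc_coeff, T)

if __name__ == "__main__":
    for T in (-20.0, 40.0, 80.0):
        i, v = cell_characteristics(T)
        print(f"T={T:6.1f} C  Isc={i:.3f} A  Voc={v:.3f} V")
```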
ERIC Educational Resources Information Center
Beard, John; Yaprak, Attila
A content analysis model for assessing advertising themes and messages generated primarily for United States markets to overcome barriers in the cultural environment of international markets was developed and tested. The model is based on three primary categories for generating, evaluating, and executing advertisements: rational, emotional, and…
Validation and Verification of LADEE Models and Software
NASA Technical Reports Server (NTRS)
Gundy-Burlet, Karen
2013-01-01
The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a Model-Based Software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight Software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational sciences models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection and code coverage analyses are utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.
OMV: A simplified mathematical model of the orbital maneuvering vehicle
NASA Technical Reports Server (NTRS)
Teoh, W.
1984-01-01
A model of the orbital maneuvering vehicle (OMV) is presented which contains several simplifications. A set of hand controller signals may be used to control the motion of the OMV. Model verification is carried out using a sequence of tests. The dynamic variables generated by the model are compared, whenever possible, with the corresponding analytical variables. The results of the tests show conclusively that the present model is behaving correctly. Further, this model interfaces properly with the state vector transformation module (SVX) developed previously. Correct command sentence sequences are generated by the OMV and SVX system, and these command sequences can be used to drive the flat floor simulation system at MSFC.
Solid waste forecasting using modified ANFIS modeling.
Younes, Mohammad K; Nopiah, Z M; Basri, N E Ahmad; Basri, H; Abushammala, Mohammed F M; K N A, Maulud
2015-10-01
Solid waste prediction is crucial for sustainable solid waste management. Accurate waste generation records are usually hard to obtain in developing countries, which complicates the modelling process. Solid waste generation is related to demographic, economic, and social factors; however, these factors vary widely with population and economic growth. The objective of this research is to determine the demographic and economic factors that most influence solid waste generation using a systematic approach, and then to develop a model to forecast solid waste generation using a modified Adaptive Neural Inference System (MANFIS). The model evaluation was performed using Root Mean Square Error (RMSE), Mean Absolute Error (MAE) and the coefficient of determination (R²). The results show that the best input variables are the population age groups 0-14, 15-64, and above 65 years, and the best model structure is 3 triangular fuzzy membership functions and 27 fuzzy rules. The model has been validated using testing data; the resulting training RMSE, MAE and R² were 0.2678, 0.045 and 0.99, respectively, while for the testing phase RMSE = 3.986, MAE = 0.673 and R² = 0.98. To date, few attempts have been made to predict annual solid waste generation in developing countries. This paper presents modeling of annual solid waste generation using modified ANFIS: a systematic approach to search for the most influencing factors and then modify the ANFIS structure to simplify the model. The proposed method can be used to forecast waste generation in developing countries where accurate, reliable data are not always available. Moreover, annual solid waste prediction is essential for sustainable planning.
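The three evaluation metrics quoted above are standard; a minimal sketch of how they are computed is shown below with invented observed and predicted series, purely to make the definitions explicit.

```python
# Minimal sketch of the RMSE, MAE and R^2 metrics used to evaluate the model.
# The observed/predicted series are placeholders, not the paper's data.
import numpy as np

def rmse(y, y_hat):
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

def mae(y, y_hat):
    return float(np.mean(np.abs(y - y_hat)))

def r2(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return float(1.0 - ss_res / ss_tot)

if __name__ == "__main__":
    observed  = np.array([120.0, 135.0, 150.0, 168.0, 190.0])  # e.g. kt/year
    predicted = np.array([118.5, 137.2, 149.0, 170.1, 186.9])
    print(rmse(observed, predicted), mae(observed, predicted), r2(observed, predicted))
```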
Stirling Convertor Performance Mapping Test Results for Future Radioisotope Power Systems
NASA Astrophysics Data System (ADS)
Qiu, Songgang; Peterson, Allen A.; Faultersack, Franklyn D.; Redinger, Darin L.; Augenblick, John E.
2004-02-01
Long-life radioisotope-fueled generators based on free-piston Stirling convertors are an energy-conversion solution for future space applications. The high efficiency of Stirling machines makes them more attractive than the thermoelectric generators currently used in space. Stirling Technology Company (STC) has been performance-testing its Stirling generators to provide data for potential system integration contractors. This paper describes the most recent test results from the STC RemoteGen™ 55 W-class Stirling generators (RG-55). Comparisons are made between the new data and previous Stirling thermodynamic simulation models. Performance-mapping tests are presented including variations in: internal charge pressure, cold end temperature, hot end temperature, alternator temperature, input power, and variation of control voltage.
Geant4 hadronic physics for space radiation environment.
Ivantchenko, Anton V; Ivanchenko, Vladimir N; Molina, Jose-Manuel Quesada; Incerti, Sebastien L
2012-01-01
To test and to develop Geant4 (Geometry And Tracking version 4) Monte Carlo hadronic models with a focus on applications in a space radiation environment. The Monte Carlo simulations have been performed using the Geant4 toolkit. Binary (BIC), its extension for incident light ions (BIC-ion), and Bertini (BERT) cascades were used as the main Monte Carlo generators; for comparison purposes, some other models were tested too. The hadronic testing suite has been used as a primary tool for model development and validation against experimental data. The Geant4 pre-compound (PRECO) and de-excitation (DEE) models were revised and improved. Proton, neutron, pion, and ion nuclear interactions were simulated with the recent version of Geant4 9.4 and were compared with experimental data from thin and thick target experiments. The Geant4 toolkit offers a large set of models allowing effective simulation of interactions of particles with matter. We have tested different Monte Carlo generators with our hadronic testing suite, and accordingly we can propose an optimal configuration of Geant4 models for the simulation of the space radiation environment.
Automated Test Case Generation for an Autopilot Requirement Prototype
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Rungta, Neha; Feary, Michael
2011-01-01
Designing safety-critical automation with robust human interaction is a difficult task that is susceptible to a number of known Human-Automation Interaction (HAI) vulnerabilities. It is therefore essential to develop automated tools that provide support both in the design and in the rapid evaluation of such automation. The Automation Design and Evaluation Prototyping Toolset (ADEPT) enables the rapid development of an executable specification for automation behavior and user interaction. ADEPT supports a number of analysis capabilities, thus enabling the detection of HAI vulnerabilities early in the design process, when modifications are less costly. In this paper, we advocate the introduction of a new capability to model-based prototyping tools such as ADEPT. The new capability is based on symbolic execution, which allows us to automatically generate quality test suites based on the system design. Symbolic execution is used to generate both user input and test oracles: user input drives the testing of the system implementation, and test oracles ensure that the system behaves as designed. We present early results in the context of a component of the Autopilot system modeled in ADEPT, and discuss the challenges of test case generation in the HAI domain.
Aerodynamic stability analysis of NASA J85-13/planar pressure pulse generator installation
NASA Technical Reports Server (NTRS)
Chung, K.; Hosny, W. M.; Steenken, W. G.
1980-01-01
A digital computer simulation model for the J85-13/Planar Pressure Pulse Generator (P3 G) test installation was developed by modifying an existing General Electric compression system model. This modification included the incorporation of a novel method for describing the unsteady blade lift force. This approach significantly enhanced the capability of the model to handle unsteady flows. In addition, the frequency response characteristics of the J85-13/P3G test installation were analyzed in support of selecting instrumentation locations to avoid standing wave nodes within the test apparatus and thus, low signal levels. The feasibility of employing explicit analytical expression for surge prediction was also studied.
Zhou, Anne Q; Hsueh, Loretta; Roesch, Scott C; Vaughn, Allison A; Sotelo, Frank L; Lindsay, Suzanne; Klonoff, Elizabeth A
2016-02-01
Federal and state policies are based on data from surveys that examine sexual-related cognitions and behaviors through self-reports of attitudes and actions. No study has yet examined their factorial invariance--specifically, whether the relationship between items assessing sexual behavior and their underlying construct differs depending on gender, ethnicity/race, or age. This study examined the factor structure of four items from the sexual behavior questionnaire of the National Health and Nutrition Examination Survey (NHANES). As NHANES provided different versions of the survey per gender, invariance was tested across gender to determine whether subsequent tests across ethnicity/race and generation could be done across gender. Items were not invariant across gender groups, so data files for women and men were not collapsed. Across ethnicity/race for both genders, and across generation for women, items were configurally invariant, and exhibited metric invariance across Latino/Latina and Black participants for both genders. Across generation for men, the configural invariance model could not be identified, so the baseline models were examined. The four-item, one-factor model fit well for the Millennial and Generation X groups but was a poor fit for the Baby Boomer and Silent Generation groups, suggesting that gender moderated the invariance across generation. Thus, comparisons between ethnic/racial and generational groups should not be made between the genders or even within gender. Findings highlight the need for programs and interventions that promote a more inclusive definition of "having had sex."
McFarland, Michael J; Palmer, Glenn R; Rasmussen, Steve L; Kordich, Micheal M; Pollet, Dean A; Jensen, James A; Lindsay, Mitchell H
2006-07-01
The U.S. Department of Defense-approved activities conducted at the Utah Test and Training Range (UTTR) include both operational readiness test firing of intercontinental ballistic missile (ICBM) motors, as well as the destruction of obsolete or otherwise unusable ICBM motors through open burn/open detonation (OB/OD). Within the Utah Division of Air Quality, these activities have been identified as having the potential to generate unacceptable noise levels, as well as significant amounts of volatile organic compounds (VOCs). Hill Air Force Base, UT, has completed a series of field tests at the UTTR in which sound-monitoring surveillance of OB/OD activities was conducted to validate the Sound Intensity Prediction System (SIPS) model. Using results generated by the SIPS model to support the decision to detonate, the UTTR successfully disposed of missile motors having an aggregate net explosive weight (NEW) of 81,374 lb without generating adverse noise levels within populated areas. In conjunction with collecting noise-monitoring data, air emissions were collected to support the development of air emission factors for both static missile motor firings and OB/OD activities. Through the installation of 15 ground-based air samplers, the generation of combustion-fixed gases, VOCs, and chlorides was monitored during the 81,374-lb NEW detonation event. Comparison of field measurements to predictions generated from the US Navy energetic combustion pollutant formation model, POLU4WN, indicated that, as the detonation fire ball expanded, organic compounds, as well as CO, continued to oxidize as the combustion gases mixed with ambient air. VOC analysis of air samplers confirmed the presence of chloromethane, vinyl chloride, benzene, toluene, and 2-methyl-1-propene. Qualitative chloride analysis indicated that gaseous HCl was generated at low concentrations, if at all.
Testing statistical self-similarity in the topology of river networks
Troutman, Brent M.; Mantilla, Ricardo; Gupta, Vijay K.
2010-01-01
Recent work has demonstrated that the topological properties of real river networks deviate significantly from predictions of Shreve's random model. At the same time the property of mean self-similarity postulated by Tokunaga's model is well supported by data. Recently, a new class of network model called random self-similar networks (RSN) that combines self-similarity and randomness has been introduced to replicate important topological features observed in real river networks. We investigate if the hypothesis of statistical self-similarity in the RSN model is supported by data on a set of 30 basins located across the continental United States that encompass a wide range of hydroclimatic variability. We demonstrate that the generators of the RSN model obey a geometric distribution, and self-similarity holds in a statistical sense in 26 of these 30 basins. The parameters describing the distribution of interior and exterior generators are tested to be statistically different and the difference is shown to produce the well-known Hack's law. The inter-basin variability of RSN parameters is found to be statistically significant. We also test generator dependence on two climatic indices, mean annual precipitation and radiative index of dryness. Some indication of climatic influence on the generators is detected, but this influence is not statistically significant with the sample size available. Finally, two key applications of the RSN model to hydrology and geomorphology are briefly discussed.
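A minimal sketch of the statistical step described above, fitting a geometric distribution to generator values and checking the fit with a chi-square test, is shown below; the sample is synthetic rather than data from the 30 study basins.

```python
# Minimal sketch: MLE fit of a geometric distribution to network "generators"
# followed by a chi-square goodness-of-fit test. The sample is synthetic.
import numpy as np
from scipy.stats import chisquare

rng = np.random.default_rng(0)
generators = rng.geometric(p=0.45, size=500) - 1   # values 0, 1, 2, ...

# MLE for a geometric distribution on {0, 1, 2, ...}: p_hat = 1 / (1 + mean).
p_hat = 1.0 / (1.0 + generators.mean())

# Observed frequencies for k = 0..K-1, with the tail k >= K pooled into one bin.
K = 6
observed = np.array([(generators == k).sum() for k in range(K)] +
                    [(generators >= K).sum()], dtype=float)
expected = np.array([p_hat * (1 - p_hat) ** k for k in range(K)] +
                    [(1 - p_hat) ** K]) * generators.size

# One parameter (p) was estimated from the data, so one extra degree of freedom is lost.
stat, pval = chisquare(observed, expected, ddof=1)
print(f"p_hat={p_hat:.3f}  chi2={stat:.2f}  p-value={pval:.3f}")
```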
COI Structural Analysis Presentation
NASA Technical Reports Server (NTRS)
Cline, Todd; Stahl, H. Philip (Technical Monitor)
2001-01-01
This report discusses the structural analysis of the Next Generation Space Telescope Mirror System Demonstrator (NMSD) developed by Composite Optics Incorporated (COI) in support of the Next Generation Space Telescope (NGST) project. The mirror was submitted to Marshall Space Flight Center (MSFC) for cryogenic testing and evaluation. Once at MSFC, the mirror was lowered to approximately 40 K and the optical surface distortions were measured. Alongside this experiment, an analytical model was developed and used to compare to the test results. A NASTRAN finite element model was provided by COI and a thermal model was developed from it. Using the thermal model, steady state nodal temperatures were calculated based on the predicted environment of the large cryogenic test chamber at MSFC. This temperature distribution was applied in the structural analysis to solve for the deflections of the optical surface. Finally, these deflections were submitted for optical analysis and comparison to the interferometer test data.
A Test Generation Framework for Distributed Fault-Tolerant Algorithms
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn; Bushnell, David; Miner, Paul; Pasareanu, Corina S.
2009-01-01
Heavyweight formal methods such as theorem proving have been successfully applied to the analysis of safety critical fault-tolerant systems. Typically, the models and proofs performed during such analysis do not inform the testing process of actual implementations. We propose a framework for generating test vectors from specifications written in the Prototype Verification System (PVS). The methodology uses a translator to produce a Java prototype from a PVS specification. Symbolic (Java) PathFinder is then employed to generate a collection of test cases. A small example is employed to illustrate how the framework can be used in practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krupowicz, J.J.; Scott, D.B.; Fink, G.C.
Corrosion results obtained from the post-test non-destructive and destructive examinations of an alternative materials model steam generator are described in this final report. The model operated under representative thermal and hydraulic and accelerated (high seawater contaminant concentration) steam generator secondary water chemistry conditions. Total exposure consisted of 114 steaming days under all volatile treatment (AVT) chemistry conditions followed by 282 fault steaming days at a 30 ppM chloride concentration in the secondary bulk water. Various support plate and lattice strip support designs incorporated Types 347, 405, 409 and SCR-3 stainless steels; Alloys 600 and 690; and carbon steel. Heat transfer tube materials included Alloy 600 in various heat treated conditions, Alloy 690, and Alloy 800. All tubing materials in this test exhibited moderate pitting, primarily in the sludge pile region above the tubesheet.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krupowicz, J.J.; Scott, D.B.; Rentler, R.M.
Corrosion results obtained from the post-test non-destructive and destructive examinations of an alternative materials model steam generator are described in this final report. The model operated under representative thermal and hydraulic and accelerated (high fresh water contaminant concentration) steam generator secondary water chemistry conditions. Total exposure consisted of 114 steaming days under all volatile treatment (AVT) chemistry conditions followed by 358 fault steaming days at a 40 ppM sulfate concentration in the secondary bulk water. Various support plate and lattice strip support designs incorporated Types 347, 405, 409 and SCR-3 stainless steels; Alloys 600 and 690; and carbon steel. Heat transfer tube materials included Alloy 600 in various heat treated conditions, Alloy 690, and Alloy 800. All tubing materials in this test exhibited significant general corrosion beneath thick surface deposits.
Parametric vs. non-parametric daily weather generator: validation and comparison
NASA Astrophysics Data System (ADS)
Dubrovsky, Martin
2016-04-01
As climate models (GCMs and RCMs) fail to satisfactorily reproduce the real-world surface weather regime, various statistical methods are applied to downscale GCM/RCM outputs into site-specific weather series. Stochastic weather generators are among the most favoured downscaling methods, capable of producing realistic (observed-like) meteorological inputs for agrological, hydrological and other impact models used in assessing the sensitivity of various ecosystems to climate change/variability. Among their advantages, the generators may (i) produce arbitrarily long multi-variate synthetic weather series representing both present and changed climates (in the latter case, the generators are commonly modified by GCM/RCM-based climate change scenarios), (ii) be run at various time steps and for multiple weather variables (the generators reproduce the correlations among variables), and (iii) be interpolated (and run also for sites where no weather data are available to calibrate the generator). This contribution will compare two stochastic daily weather generators in terms of their ability to reproduce various features of daily weather series. M&Rfi is a parametric generator: a Markov chain model is used to model precipitation occurrence, precipitation amount is modelled by the Gamma distribution, and a first-order autoregressive model is used to generate non-precipitation surface weather variables. The non-parametric GoMeZ generator is based on the nearest-neighbours resampling technique, making no assumption about the distribution of the variables being generated. Various settings of both weather generators will be assumed in the present validation tests. The generators will be validated in terms of (a) extreme temperature and precipitation characteristics (annual and 30-year extremes and maxima of the duration of hot/cold/dry/wet spells) and (b) selected validation statistics developed within the frame of the VALUE project. The tests will be based on observational weather series from several European stations available from the ECA&D database.
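The parametric generator's structure (Markov-chain occurrence, Gamma-distributed amounts, autoregressive non-precipitation variables) can be sketched in a few lines; the transition probabilities, Gamma parameters and AR(1) coefficients below are illustrative assumptions, not calibrated M&Rfi values.

```python
# Minimal sketch of a parametric daily weather generator: a first-order Markov
# chain for wet/dry days, Gamma precipitation amounts on wet days, and an AR(1)
# process for a non-precipitation variable (a temperature anomaly here).
# All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(1)

p_wet_given_dry, p_wet_given_wet = 0.25, 0.65   # occurrence transition probabilities
gamma_shape, gamma_scale = 0.8, 6.0             # wet-day amounts, mm
ar1_coeff, ar1_sigma = 0.7, 2.0                 # temperature anomaly, deg C

def generate(n_days=365):
    wet, temp_anom = False, 0.0
    precip, temps = [], []
    for _ in range(n_days):
        p_wet = p_wet_given_wet if wet else p_wet_given_dry
        wet = rng.random() < p_wet
        precip.append(rng.gamma(gamma_shape, gamma_scale) if wet else 0.0)
        temp_anom = ar1_coeff * temp_anom + rng.normal(0.0, ar1_sigma)
        temps.append(temp_anom)
    return np.array(precip), np.array(temps)

if __name__ == "__main__":
    p, t = generate()
    print(f"wet days: {(p > 0).sum()},  mean wet-day amount: {p[p > 0].mean():.1f} mm")
```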
Using Apex To Construct CPM-GOMS Models
NASA Technical Reports Server (NTRS)
John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger
2006-01-01
A process for automatically generating computational models of human/computer interactions, as well as graphical and textual representations of the models, has been built on the conceptual foundation of a method known in the art as CPM-GOMS. This method is so named because it combines (1) the task decomposition of analysis according to an underlying method known in the art as the goals, operators, methods, and selection (GOMS) method with (2) a model of human resource usage at the level of cognitive, perceptual, and motor (CPM) operations. CPM-GOMS models have made accurate predictions about the behaviors of skilled computer users in routine tasks, but heretofore such models have been generated in a tedious, error-prone manual process. In the present process, CPM-GOMS models are generated automatically from a hierarchical task decomposition expressed by use of a computer program, known as Apex, designed previously to model human behavior in complex, dynamic tasks. An inherent capability of Apex for the scheduling of resources automates the difficult task of interleaving the cognitive, perceptual, and motor resources that underlie common task operators (e.g., move and click mouse). The user interface of Apex automatically generates Program Evaluation Review Technique (PERT) charts, which enable modelers to visualize the complex parallel behavior represented by a model. Because interleaving and the generation of displays to aid visualization are automated, it is now feasible to construct arbitrarily long sequences of behaviors. The process was tested by using Apex to create a CPM-GOMS model of a relatively simple human/computer-interaction task and comparing the time predictions of the model with measurements of the times taken by human users in performing the various steps of the task. The task was to withdraw $80 in cash from an automated teller machine (ATM). For the test, a Visual Basic mockup of an ATM was created, with a provision for input from (and measurement of the performance of) the user via a mouse. The times predicted by the automatically generated model turned out to approximate the measured times fairly well. While these results are promising, there is a need for further development of the process. Moreover, it will also be necessary to test other, more complex models: the actions required of the user in the ATM task are too sequential to involve substantial parallelism and interleaving and, hence, do not serve as an adequate test of the unique strength of CPM-GOMS models to accommodate parallelism and interleaving.
MMPP Traffic Generator for the Testing of the SCAR 2 Fast Packet Switch
NASA Technical Reports Server (NTRS)
Chren, William A., Jr.
1995-01-01
A prototype MMPP Traffic Generator (TG) has been designed for testing of the COMSAT-supplied SCAR II Fast Packet Switch. By generating packets distributed according to a Markov-Modulated Poisson Process (MMPP) model, it allows assessment of the switch performance under traffic conditions that are more realistic than could be generated using the COMSAT-supplied Traffic Generator Module. The MMPP model is widely believed to accurately model real-world superimposed voice and data communications traffic. The TG was designed to be, as much as possible, a "drop-in" replacement for the COMSAT Traffic Generator Module. The latter fit on two Altera EPM7256EGC 192-pin CPLDs and produced traffic for one switch input port. No board changes are necessary because the TG has been partitioned to use the existing board traces. The TG, consisting of parts "TGDATPROC" and "TGRAMCTL", must merely be reprogrammed into the Altera devices of the same name. However, the 040 controller software must be modified to provide TG initialization data. This data will be given in Section II.
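A minimal sketch of a two-state MMPP packet source is shown below to illustrate the traffic model: a Markov chain switches between a low-rate and a high-rate state, and per-slot packet counts are Poisson at the current state's rate. The rates and transition probabilities are illustrative, not those used for the SCAR II tests.

```python
# Minimal sketch of a two-state Markov-Modulated Poisson Process (MMPP)
# packet source, discretised into time slots. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(2)

rates = np.array([0.2, 2.0])          # mean packets per slot in states 0 and 1
switch = np.array([[0.95, 0.05],      # P(next state | current state)
                   [0.10, 0.90]])

def mmpp_arrivals(n_slots=1000, state=0):
    counts = np.empty(n_slots, dtype=int)
    for t in range(n_slots):
        counts[t] = rng.poisson(rates[state])   # Poisson arrivals at current rate
        state = rng.choice(2, p=switch[state])  # Markov state transition
    return counts

if __name__ == "__main__":
    arrivals = mmpp_arrivals()
    print(f"mean rate: {arrivals.mean():.2f} packets/slot, "
          f"peak: {arrivals.max()} packets in one slot")
```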
NASA Astrophysics Data System (ADS)
Kwok, Yu Fat
The main objective of this study is to develop a model for the determination of the optimum testing interval (OTI) of non-redundant standby plants. This study focuses on the emergency power generators in tall buildings in Hong Kong. The reliability model that is developed is applicable to any non-duplicated standby plant. In a tall building, the mobilisation of occupants is constrained by its height and the building's internal layout. Occupants' safety, amongst other safety considerations, depends highly on the reliability of the fire detection and protection system, which in turn depends on the reliability of the emergency power generation plants. A thorough literature survey shows that the practice used in determining the OTI in nuclear plants is generally applicable. Historically, the OTI in these plants is determined by balancing the testing downtime and the reliability gained from frequent testing. However, testing downtime does not exist in plants like emergency power generators; subsequently, more sophisticated models have taken repair downtime into consideration. In this study, the algorithms for the determination of the OTI, and hence the reliability of standby plants, are reconsidered, and a new concept is introduced into the subject. A new model is developed for these purposes which embraces more realistic factors found in practice; system aging and the finite life cycle of the standby plant are considered. Somewhat more pragmatically, the Optimum Overhauling Interval can also be determined from this new model. System unavailability grows with time but can be reset by a test or overhaul. Contrary to fixed testing intervals, the OTI is determined whenever the system point unavailability exceeds a certain level, which depends on the reliability requirement of the standby system. An optimum testing plan for lowering this level to the 'minimum useful unavailability' level (see section 9.1 for more elaboration) can be determined by the new model presented. Cost effectiveness is accounted for by a new parameter 'tau min', the minimum testing interval (MTI). The MTI optimises the total number of tests and the total number of overhauls when the costs for each are available. The model sets up criteria for test and overhaul and for 'announcing' the end of system life. The usefulness of the model is validated by a detailed analysis of the operating parameters from 8,500 maintenance records collected for emergency power generation plants in high rise buildings in Hong Kong. (Abstract shortened by UMI.)
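A minimal sketch of the scheduling idea, test whenever the point unavailability would exceed an allowed level, subject to a minimum interval, is given below; the exponential failure model, failure rate, allowed unavailability and minimum interval are all illustrative assumptions, not values from the thesis.

```python
# Minimal sketch: schedule standby-plant tests so that point unavailability
# (here 1 - exp(-lambda*t), reset by each test) stays below an allowed level,
# subject to a minimum testing interval. All numbers are illustrative.
import math

failure_rate = 1.0e-4          # failures per hour (assumed constant here)
u_allowed = 0.02               # maximum acceptable point unavailability
tau_min = 24.0 * 7             # minimum testing interval (hours), cost constraint
horizon = 24.0 * 365 * 2       # planning horizon: two years

def next_interval():
    """Time for unavailability 1 - exp(-lambda*t) to reach the allowed level."""
    t = -math.log(1.0 - u_allowed) / failure_rate
    return max(t, tau_min)

if __name__ == "__main__":
    t, tests = 0.0, []
    interval = next_interval()
    while t + interval <= horizon:
        t += interval
        tests.append(t)
    print(f"interval = {interval / 24:.1f} days, {len(tests)} tests over the horizon")
```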
2016-08-01
...expanded upon the relationship between GR and SGK1 in the context of enzalutamide-driven prostate cancer. We have generated CRISPR/Cas9 cell lines... Milestones: generate SGK1-overexpressing cell models (50%, ongoing); clone SGK1 CRISPR (100%, complete); generate SGK1-deficient cell models (75%, ongoing); test... ...driven enzalutamide resistance, GR-expressing enzalutamide-resistant prostate cancer cells expressing CRISPR/Cas9 and a guide targeting SGK1 (sgSGK1
NASA Technical Reports Server (NTRS)
Chambers, Jeffrey A.
1994-01-01
Finite element analysis is regularly used during the engineering cycle of mechanical systems to predict the response to static, thermal, and dynamic loads. The finite element model (FEM) used to represent the system is often correlated with physical test results to determine the validity of the analytical results provided. Results from dynamic testing provide one means for performing this correlation. One of the most common methods of measuring accuracy is classical modal testing, whereby vibratory mode shapes are compared to the mode shapes provided by finite element analysis. The degree of correlation between the test and analytical mode shapes can be shown mathematically using the cross orthogonality check. A great deal of time and effort can be expended in generating the set of test-acquired mode shapes needed for the cross orthogonality check. In most situations, response data from vibration tests are digitally processed to generate the mode shapes from a combination of modal parameters, forcing functions, and recorded response data. An alternate method is proposed in which the same correlation of analytical and test-acquired mode shapes can be achieved without conducting the modal survey. Instead, a procedure is detailed in which a minimum of test information, specifically the acceleration response data from a random vibration test, is used to generate a set of equivalent local accelerations to be applied to the reduced analytical model at discrete points corresponding to the test measurement locations. The static solution of the analytical model then produces a set of deformations that, once normalized, can be used to represent the test-acquired mode shapes in the cross orthogonality relation. The method proposed has been shown to provide accurate results for both a simple analytical model and a complex space flight structure.
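For reference, the cross orthogonality check mentioned above is commonly computed as C = Phi_test^T M Phi_FEM with mass-normalised mode shapes, where diagonal terms near one and small off-diagonal terms indicate good correlation; the sketch below uses small invented matrices to make the computation explicit.

```python
# Minimal sketch of the cross-orthogonality check between test-derived and
# analytical mode shapes: C = Phi_test^T * M * Phi_fem with mass-normalised
# shapes. The mass matrix and mode shapes are small, invented examples.
import numpy as np

M = np.diag([2.0, 1.5, 1.0])                       # reduced mass matrix
phi_fem  = np.array([[0.5, 0.4],
                     [0.6, -0.1],
                     [0.4, -0.8]])
phi_test = phi_fem + 0.02 * np.random.default_rng(3).normal(size=phi_fem.shape)

def mass_normalise(phi, M):
    gen_mass = np.einsum('ij,ik,kj->j', phi, M, phi)   # diag(phi^T M phi)
    return phi / np.sqrt(gen_mass)

phi_fem_n = mass_normalise(phi_fem, M)
phi_test_n = mass_normalise(phi_test, M)
cross = phi_test_n.T @ M @ phi_fem_n
print(np.round(cross, 3))   # close to the identity for well-correlated modes
```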
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haslinger, K.H.
Tube-to-tube support interaction characteristics were determined experimentally on a single-tube, multi-span geometry representative of the Westinghouse Model 51 steam generator economizer design. The results, in part, became input for an autoclave-type wear test program on steam generator tubes performed by Kraftwerk Union (KWU). More importantly, the test data reported here have been used to validate two analytical wear prediction codes: the WECAN code, which was developed by Westinghouse, and the ABAQUS code, which has been enhanced for EPRI by Foster Wheeler to enable simulation of gap conditions (including fluid film effects) for various support geometries.
A HWIL test facility of infrared imaging laser radar using direct signal injection
NASA Astrophysics Data System (ADS)
Wang, Qian; Lu, Wei; Wang, Chunhui; Wang, Qi
2005-01-01
Laser radar has been widely used in recent years, and hardware-in-the-loop (HWIL) testing of laser radar has become important because of its low cost and high fidelity compared with on-the-fly testing and all-digital simulation. Scene generation and projection are two key technologies of hardware-in-the-loop testing of laser radar, and they are complicated because the 3D images result from time delay. The scene generation process begins with the definition of the target geometry, reflectivity, and range. The real-time 3D scene generation computer is PC-based hardware, and the 3D target models were modeled using 3dsMAX. The scene generation software was written in C and OpenGL and is executed to extract the Z-buffer from the bit planes to main memory as a range image. These pixels contain each target position x, y, z and its respective intensity and range value. Work on expensive optical injection technologies for scene projection, such as LDP arrays, VCSEL arrays, and DMDs, and on the associated scene generation is ongoing, but optical scene projection is complicated and often unaffordable. In this paper a cheaper test facility is described that uses direct electronic injection to provide range images for laser radar testing. The electronic delay and pulse shaping circuits inject the scenes directly into the seeker's signal processing unit.
Computer-aided-engineering system for modeling and analysis of ECLSS integration testing
NASA Technical Reports Server (NTRS)
Sepahban, Sonbol
1987-01-01
The accurate modeling and analysis of two-phase fluid networks found in environmental control and life support systems is presently undertaken by computer-aided engineering (CAE) techniques whose generalized fluid dynamics package can solve arbitrary flow networks. The CAE system for integrated test bed modeling and analysis will also furnish interfaces and subsystem/test-article mathematical models. Three-dimensional diagrams of the test bed are generated by the system after performing the requisite simulation and analysis.
Development of the GPM Observatory Thermal Vacuum Test Model
NASA Technical Reports Server (NTRS)
Yang, Kan; Peabody, Hume
2012-01-01
A software-based thermal modeling process was documented for generating the thermal panel settings necessary to simulate worst-case on-orbit flight environments in an observatory-level thermal vacuum test setup. The method for creating such a thermal model involved four major steps: (1) determining the major thermal zones for test as indicated by the major dissipating components on the spacecraft, then mapping the major heat flows between these components; (2) finding the flight-equivalent sink temperatures for these test thermal zones; (3) determining the thermal test ground support equipment (GSE) design and initial thermal panel settings based on the equivalent sink temperatures; and (4) adjusting the panel settings in the test model to match heat flows and temperatures with the flight model. The observatory test thermal model developed from this process allows quick predictions of the performance of the thermal vacuum test design. In this work, the method described above was applied to the Global Precipitation Measurement (GPM) core observatory spacecraft, a joint project between NASA and the Japanese Aerospace Exploration Agency (JAXA) which is currently being integrated at NASA Goddard Space Flight Center for launch in early 2014. From preliminary results, the thermal test model generated from this process shows that the heat flows and temperatures match fairly well with the flight thermal model, indicating that the test model can simulate fairly accurately the conditions on-orbit. However, further analysis is needed to determine the best test configuration possible to validate the GPM thermal design before the start of environmental testing later this year. Also, while this analysis method has been applied solely to GPM, it should be emphasized that the same process can be applied to any mission to develop an effective test setup and panel settings which accurately simulate on-orbit thermal environments.
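Step (2) above is often done with a simple radiative balance; a minimal sketch is shown below, computing an equivalent sink temperature from the absorbed environmental heat, the surface area, and the emissivity as T_sink = (Q_abs / (sigma * epsilon * A))^(1/4). The formula and the numbers used are an illustrative assumption, not the GPM analysis itself.

```python
# Minimal sketch of a radiative equivalent sink temperature for a test thermal
# zone, assuming a simple grey-body balance. All values are illustrative.
STEFAN_BOLTZMANN = 5.670374419e-8   # W m^-2 K^-4

def equivalent_sink_temperature(q_absorbed_w, area_m2, emissivity):
    """Sink temperature giving the same net radiative exchange as the orbit case."""
    return (q_absorbed_w / (STEFAN_BOLTZMANN * emissivity * area_m2)) ** 0.25

if __name__ == "__main__":
    # A 0.8 m^2 radiator with emissivity 0.85 absorbing 120 W of environmental flux.
    t_sink = equivalent_sink_temperature(120.0, 0.8, 0.85)
    print(f"equivalent sink temperature: {t_sink:.1f} K ({t_sink - 273.15:.1f} C)")
```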
NASA Technical Reports Server (NTRS)
Aoyagi, Kiyoshi; Olson, Lawrence E.; Peterson, Randall L.; Yamauchi, Gloria K.; Ross, James C.; Norman, Thomas R.
1987-01-01
Time-averaged aerodynamic loads are estimated for each of the vane sets in the National Full-Scale Aerodynamic Complex (NFAC). The methods used to compute global and local loads are presented. Experimental inputs used to calculate these loads are based primarily on data obtained from tests conducted in the NFAC 1/10-Scale Vane-Set Test Facility and from tests conducted in the NFAC 1/50-Scale Facility. For those vane sets located directly downstream of either the 40- by 80-ft test section or the 80- by 120-ft test section, aerodynamic loads caused by the impingement of model-generated wake vortices and model-generated jet and propeller wakes are also estimated.
Crash Testing and Simulation of a Cessna 172 Aircraft: Pitch Down Impact Onto Soft Soil
NASA Technical Reports Server (NTRS)
Fasanella, Edwin L.; Jackson, Karen E.
2016-01-01
During the summer of 2015, NASA Langley Research Center conducted three full-scale crash tests of Cessna 172 (C-172) aircraft at the NASA Langley Landing and Impact Research (LandIR) Facility. The first test represented a flare-to-stall emergency or hard landing onto a rigid surface. The second test, which is the focus of this paper, represented a controlled-flight-into-terrain (CFIT) with a nose-down pitch attitude of the aircraft, which impacted onto soft soil. The third test, also conducted onto soil, represented a CFIT with a nose-up pitch attitude of the aircraft, which resulted in a tail strike condition. These three crash tests were performed for the purpose of evaluating the performance of Emergency Locator Transmitters (ELTs) and to generate impact test data for model validation. LS-DYNA finite element models were generated to simulate the three test conditions. This paper describes the model development and presents test-analysis comparisons of acceleration and velocity time-histories, as well as a comparison of the time sequence of events for Test 2 onto soft soil.
System and Method for Modeling the Flow Performance Features of an Object
NASA Technical Reports Server (NTRS)
Jorgensen, Charles (Inventor); Ross, James (Inventor)
1997-01-01
The method and apparatus include a neural network for generating a model of an object in a wind tunnel from performance data on the object. The network is trained from test input signals (e.g., leading edge flap position, trailing edge flap position, angle of attack, other geometric configurations, and power settings) and test output signals (e.g., lift, drag, pitching moment, or other performance features). In one embodiment, the neural network training method employs a modified Levenberg-Marquardt optimization technique. The model can be generated 'real time' as wind tunnel testing proceeds. Once trained, the model is used to estimate performance features associated with the aircraft, given geometric configuration and/or power setting inputs. The invention can also be applied in other similar static flow modeling applications in aerodynamics, hydrodynamics, fluid dynamics, and other such disciplines, for example the static testing of cars, sails, foils, propellers, keels, rudders, turbines, fins, and the like in a wind tunnel, water trough, or other flowing medium.
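As a rough illustration of fitting a small neural-network surrogate of one performance feature by Levenberg-Marquardt least squares, the sketch below fits a lift-like curve versus angle of attack using SciPy's "lm" solver on synthetic data; the network size, variable names and data are assumptions and stand in for the patent's modified training method.

```python
# Minimal sketch: fit a tiny one-hidden-layer network to a lift-like curve
# vs. angle of attack by Levenberg-Marquardt least squares (SciPy's 'lm').
# Data, layer size and names are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(4)
alpha = np.linspace(-5.0, 15.0, 30)                       # angle of attack, deg
cl = 0.11 * alpha + 0.2 - 0.0004 * alpha**3 + rng.normal(0, 0.02, alpha.size)

H = 3  # hidden units; parameter vector = [w1(H), b1(H), w2(H), b2]

def predict(theta, x):
    w1, b1, w2, b2 = theta[:H], theta[H:2 * H], theta[2 * H:3 * H], theta[-1]
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2

def residuals(theta):
    return predict(theta, alpha) - cl

theta0 = 0.1 * rng.standard_normal(3 * H + 1)
fit = least_squares(residuals, theta0, method="lm")
print("residual RMS:", np.sqrt(np.mean(fit.fun**2)))
print("predicted CL at alpha=8 deg:", predict(fit.x, np.array([8.0]))[0])
```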
Development of the CCP-200 mathematical model for Syzran CHPP using the Thermolib software package
NASA Astrophysics Data System (ADS)
Usov, S. V.; Kudinov, A. A.
2016-04-01
A simplified cycle diagram of the CCP-200 power generating unit of Syzran CHPP, containing two PG6111FA gas turbines with generators, two heat-recovery boilers KUP-110/15-8.0/0.7-540/200, and one Siemens SST-600 steam turbine (one-cylinder with two variable heat extraction units of 60/75 MW in heat-extraction and condensing modes, respectively) with S-GEN5-100 generators, is presented. Results of experimental guarantee tests of the CCP-200 steam-gas unit are given. A brief description of the Thermolib application for the MATLAB Simulink software package is given. Basic equations used in Thermolib for modeling thermo-technical processes are given. Mathematical models of the gas-turbine plant, heat-recovery steam generator, steam turbine, and integrated plant for the CCP-200 power generating unit of Syzran CHPP were developed with the help of MATLAB Simulink and Thermolib. Simulations at different ambient temperature values were used in order to obtain the characteristics of the developed mathematical model. A graphic comparison of selected characteristics of the CCP-200 simulation model (gas temperature behind the gas turbine, gas turbine and combined cycle plant capacity, high- and low-pressure steam consumption, and feed water consumption for the high- and low-pressure economizers) with the actual characteristics of the steam-gas unit obtained in the experimental (field) guarantee tests at different ambient temperatures is shown. It is shown that, for the chosen degree of complexity, the characteristics of the CCP-200 simulation model developed in Thermolib adequately correspond to the actual characteristics of the steam-gas unit obtained in the experimental (field) guarantee tests; this allows the developed mathematical model to be considered adequate and acceptable for further work.
Power Control for Direct-Driven Permanent Magnet Wind Generator System with Battery Storage
Guang, Chu Xiao; Ying, Kong
2014-01-01
The objective of this paper is to construct a wind generator system (WGS) loss model that accounts for the losses of the wind turbine and the generator, with the aim of optimizing the maximum effective output power and turbine speed. Given that the wind generator system has inertia and is nonlinear, the dynamic model takes advantage of the duty cycle of the Buck converter and employs feedback linearization to design the optimized turbine-speed tracking controller and the load power controller. Building on this, the paper proposes a dual-mode dynamic coordination strategy based on an auxiliary load to reduce the influence of mode conversion on the lifetime of the battery. Optimized speed tracking, rapid power tracking, and the reduction of redundant power during mode conversion were verified on a 5 kW wind generator system test platform. Using the generator output power as the capture target was also shown to be effective. PMID:25050405
Yu, S; Gao, S; Gan, Y; Zhang, Y; Ruan, X; Wang, Y; Yang, L; Shi, J
2016-04-01
Quantitative structure-property relationship modelling can be a valuable alternative method to replace or reduce experimental testing. In particular, some endpoints such as octanol-water (KOW) and organic carbon-water (KOC) partition coefficients of polychlorinated biphenyls (PCBs) are easier to predict and various models have been already developed. In this paper, two different methods, which are multiple linear regression based on the descriptors generated using Dragon software and hologram quantitative structure-activity relationships, were employed to predict suspended particulate matter (SPM) derived log KOC and generator column, shake flask and slow stirring method derived log KOW values of 209 PCBs. The predictive ability of the derived models was validated using a test set. The performances of all these models were compared with EPI Suite™ software. The results indicated that the proposed models were robust and satisfactory, and could provide feasible and promising tools for the rapid assessment of the SPM derived log KOC and generator column, shake flask and slow stirring method derived log KOW values of PCBs.
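The multiple-linear-regression branch of this approach can be sketched as follows (Python; random placeholder descriptors stand in for the Dragon-generated ones, and the HQSAR and EPI Suite comparisons are not reproduced):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)

# Placeholder descriptor matrix (standing in for values Dragon would compute for
# each of the 209 PCB congeners) and a synthetic log KOW-like response.
n_congeners, n_descriptors = 209, 5
X = rng.normal(size=(n_congeners, n_descriptors))
true_coefs = np.array([1.2, -0.8, 0.5, 0.0, 0.3])
y = X @ true_coefs + 6.0 + rng.normal(scale=0.2, size=n_congeners)

# Split into a training set and an external test set, fit the MLR model,
# and check predictive ability on the held-out compounds.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
mlr = LinearRegression().fit(X_tr, y_tr)
print("external test-set R^2:", round(r2_score(y_te, mlr.predict(X_te)), 3))
```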
Thermal Model Predictions of Advanced Stirling Radioisotope Generator Performance
NASA Technical Reports Server (NTRS)
Wang, Xiao-Yen J.; Fabanich, William Anthony; Schmitz, Paul C.
2014-01-01
This paper presents recent thermal model results of the Advanced Stirling Radioisotope Generator (ASRG). The three-dimensional (3D) ASRG thermal power model was built using the Thermal Desktop(trademark) thermal analyzer. The model was correlated with ASRG engineering unit test data and ASRG flight unit predictions from Lockheed Martin's (LM's) I-deas(trademark) TMG thermal model. The auxiliary cooling system (ACS) of the ASRG is also included in the ASRG thermal model. The ACS is designed to remove waste heat from the ASRG so that it can be used to heat spacecraft components. The performance of the ACS is reported under nominal conditions and during a Venus flyby scenario. The results for the nominal case are validated with data from Lockheed Martin. Transient thermal analysis results of ASRG for a Venus flyby with a representative trajectory are also presented. In addition, model results of an ASRG mounted on a Cassini-like spacecraft with a sunshade are presented to show a way to mitigate the high temperatures of a Venus flyby. It was predicted that the sunshade can lower the temperature of the ASRG alternator by 20 C for the representative Venus flyby trajectory. The 3D model also was modified to predict generator performance after a single Advanced Stirling Convertor failure. The geometry of the Microtherm HT insulation block on the outboard side was modified to match deformation and shrinkage observed during testing of a prototypic ASRG test fixture by LM. Test conditions and test data were used to correlate the model by adjusting the thermal conductivity of the deformed insulation to match the post-heat-dump steady state temperatures. Results for these conditions showed that the performance of the still-functioning inboard ACS was unaffected.
ERIC Educational Resources Information Center
Aspelmeier, Jeffery E.; Love, Michael M.; McGill, Lauren A.; Elliott, Ann N.; Pierce, Thomas W.
2012-01-01
The role of generational status (first-generation vs. continuing-generation college students) as a moderator of the relationship between psychological factors and college outcomes was tested to determine whether generational status acts as a risk factor or as a sensitizing factor. The sample consisted of 322 undergraduate students who completed…
Design and Test of Pseudorandom Number Generator Using a Star Network of Lorenz Oscillators
NASA Astrophysics Data System (ADS)
Cho, Kenichiro; Miyano, Takaya
We have recently developed a chaos-based stream cipher based on augmented Lorenz equations as a star network of Lorenz subsystems. In our method, the augmented Lorenz equations are used as a pseudorandom number generator. In this study, we propose a new method based on the augmented Lorenz equations for generating binary pseudorandom numbers and evaluate its security using the statistical tests of SP800-22 published by the National Institute of Standards and Technology, in comparison with the performances of other chaotic dynamical models used as binary pseudorandom number generators. We further propose a faster version of the proposed method and evaluate its security using the statistical tests of TestU01 published by L’Ecuyer and Simard.
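To make the idea concrete, here is a hedged sketch in Python that integrates the standard Lorenz equations (not the authors' augmented star-network system) and thresholds one state variable to emit bits; a real generator would use the augmented equations and be judged by the SP800-22 and TestU01 batteries rather than the crude monobit check shown here.

```python
import numpy as np

def lorenz_bits(n_bits, dt=0.01, stride=50, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Generate bits by thresholding the x-variable of a Lorenz flow.

    Illustration only: the paper's generator uses augmented Lorenz equations
    (a star network of Lorenz subsystems), not this plain single system.
    """
    x, y, z = 1.0, 1.0, 1.0
    bits = []
    while len(bits) < n_bits:
        for _ in range(stride):  # several Euler steps between emitted bits
            dx = sigma * (y - x)
            dy = x * (rho - z) - y
            dz = x * y - beta * z
            x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        bits.append(1 if x > 0.0 else 0)
    return np.array(bits, dtype=np.uint8)

stream = lorenz_bits(10000)
print("fraction of ones:", stream.mean())  # crude monobit sanity check
```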
Eichhorn, S; Mendoza Garcia, A; Polski, M; Spindler, J; Stroh, A; Heller, M; Lange, R; Krane, M
2017-06-01
The provision of sufficient chest compression is among the most important factors influencing patient survival during cardiopulmonary resuscitation (CPR). One approach to optimize the quality of chest compressions is to use mechanical-resuscitation devices. The aim of this study was to compare a new device for chest compression (corpuls cpr) with an established device (LUCAS II). We used a mechanical thorax model consisting of a chest with variable stiffness and an integrated heart chamber which generated blood flow dependent on the compression depth and waveform. The method of blood-flow generation could be changed between direct cardiac-compression mode and thoracic-pump mode. Different chest-stiffness settings and compression modes were tested to generate various blood-flow profiles. Additionally, an endurance test at high stiffness was performed to measure overall performance and compression consistency. Both resuscitation machines were able to compress the model thorax with a frequency of 100/min and a depth of 5 cm, independent of the chosen chest stiffness. Both devices passed the endurance test without difficulty. The corpuls cpr device was able to generate about 10-40% more blood flow than the LUCAS II device, depending on the model settings. In most scenarios, the corpuls cpr device also generated a higher blood pressure than the LUCAS II. The peak compression forces during CPR were about 30% higher using the corpuls cpr device than with the LUCAS II. In this study, the corpuls cpr device achieved better blood flow and pressure outcomes than the LUCAS II device. Further examination in an animal model is required to confirm the findings of this preliminary study.
NASA Astrophysics Data System (ADS)
Kolb, Kimberly E.; Choi, Hee-sue S.; Kaur, Balvinder; Olson, Jeffrey T.; Hill, Clayton F.; Hutchinson, James A.
2016-05-01
The US Army's Communications Electronics Research, Development and Engineering Center (CERDEC) Night Vision and Electronic Sensors Directorate (referred to as NVESD) is developing a virtual detection, recognition, and identification (DRI) testing methodology using simulated imagery as a means of augmenting the field testing component of sensor performance evaluation, which is expensive, resource intensive, time consuming, and limited to the available target(s) and existing atmospheric visibility and environmental conditions at the time of testing. Existing simulation capabilities such as the Digital Imaging Remote Sensing Image Generator (DIRSIG) and NVESD's Integrated Performance Model Image Generator (NVIPM-IG) can be combined with existing detection algorithms to reduce cost/time, minimize testing risk, and allow virtual/simulated testing using full spectral and thermal object signatures, as well as those collected in the field. NVESD has developed an end-to-end capability to demonstrate the feasibility of this approach. Simple detection algorithms have been used on the degraded images generated by NVIPM-IG to determine the relative performance of the algorithms on both DIRSIG-simulated and collected images. Evaluating the degree to which the algorithm performance agrees between simulated versus field collected imagery is the first step in validating the simulated imagery procedure.
Testing collapse models by a thermometer
NASA Astrophysics Data System (ADS)
Bahrami, M.
2018-05-01
Collapse models postulate that space is filled with a collapse noise field, inducing quantum Brownian motions, which are dominant during the measurement, thus causing collapse of the wave function. An important manifestation of the collapse noise field, if any, is thermal energy generation, thus disturbing the temperature profile of a system. The experimental investigation of a collapse-driven heating effect has provided, so far, the most promising test of collapse models against standard quantum theory. In this paper, we calculate the collapse-driven heat generation for a three-dimensional multi-atomic Bravais lattice by solving stochastic Heisenberg equations. We perform our calculation for the mass-proportional continuous spontaneous localization collapse model with nonwhite noise. We obtain the temperature distribution of a sphere under stationary-state and insulated surface conditions. However, the exact quantification of the collapse-driven heat-generation effect highly depends on the actual value of cutoff in the collapse noise spectrum.
Zhang, Z; Jewett, D L
1994-01-01
Due to model misspecification, currently-used Dipole Source Localization (DSL) methods may contain Multiple-Generator Errors (MulGenErrs) when fitting simultaneously-active dipoles. The size of the MulGenErr is a function of both the model used, and the dipole parameters, including the dipoles' waveforms (time-varying magnitudes). For a given fitting model, by examining the variation of the MulGenErrs (or the fit parameters) under different waveforms for the same generating-dipoles, the accuracy of the fitting model for this set of dipoles can be determined. This method of testing model misspecification can be applied to evoked potential maps even when the parameters of the generating-dipoles are unknown. The dipole parameters fitted in a model should only be accepted if the model can be shown to be sufficiently accurate.
NASA Technical Reports Server (NTRS)
Makel, Darby B.; Rosenberg, Sanders D.
1990-01-01
The formation and deposition of carbon (soot) was studied in the Carbon Deposition Model for Oxygen-Hydrocarbon Combustion Program. An empirical, 1-D model for predicting soot formation and deposition in LO2/hydrocarbon gas generators/preburners was derived. The experimental data required to anchor the model were identified and a test program to obtain the data was defined. In support of the model development, cold flow mixing experiments using a high injection density injector were performed. The purpose of this investigation was to advance the state-of-the-art in LO2/hydrocarbon gas generator design by developing a reliable engineering model of gas generator operation. The model was formulated to account for the influences of fluid dynamics, chemical kinetics, and gas generator hardware design on soot formation and deposition.
Analysis of subsonic wind tunnel with variation shape rectangular and octagonal on test section
NASA Astrophysics Data System (ADS)
Rhakasywi, D.; Ismail; Suwandi, A.; Fadhli, A.
2018-02-01
Good design in the aerodynamics field requires a well-designed wind tunnel. The wind tunnel design required in this case must be capable of generating laminar flow. This research investigated wind tunnel models with rectangular and octagonal test-section variations, with the objective of generating laminar flow in the test section. The research method used a numerical CFD (Computational Fluid Dynamics) approach and manual analysis to analyze the internal flow in the test section. The CFD simulation results and manual analysis indicate that the optimal design for generating laminar flow in the test section is an octagonal shape without filled corners.
Racial and Cultural Factors Affecting the Mental Health of Asian Americans
Miller, Matthew J.; Yang, Minji; Farrell, Jerome A.; Lin, Li-Ling
2011-01-01
In this study, we employed structural equation modeling to test the degree to which racism-related stress, acculturative stress, and bicultural self-efficacy were predictive of mental health in a predominantly community-based sample of 367 Asian American adults. We also tested whether bicultural self-efficacy moderated the relationship between acculturative stress and mental health. Finally, we examined whether generational status moderated the impact of racial and cultural predictors of mental health by testing our model across immigrant and U.S.-born samples. Results indicated that our hypothesized structural model represented a good fit to the total sample data. While racism-related stress, acculturative stress, and bicultural self-efficacy were significant predictors of mental health in the total sample analyses, our generational analyses revealed a differential predictive pattern across generational status. Finally, we found that the buffering effect of bicultural self-efficacy on the relationship between acculturative stress and mental health was significant for U.S.-born individuals only. Implications for research and service delivery are explored. PMID:21977934
NASA Astrophysics Data System (ADS)
Schott, John R.; Brown, Scott D.; Raqueno, Rolando V.; Gross, Harry N.; Robinson, Gary
1999-01-01
The need for robust image data sets for algorithm development and testing has prompted the consideration of synthetic imagery as a supplement to real imagery. The unique ability of synthetic image generation (SIG) tools to supply per-pixel truth allows algorithm writers to test difficult scenarios that would require expensive collection and instrumentation efforts. In addition, SIG data products can supply the user with `actual' truth measurements of the entire image area that are not subject to measurement error thereby allowing the user to more accurately evaluate the performance of their algorithm. Advanced algorithms place a high demand on synthetic imagery to reproduce both the spectro-radiometric and spatial character observed in real imagery. This paper describes a synthetic image generation model that strives to include the radiometric processes that affect spectral image formation and capture. In particular, it addresses recent advances in SIG modeling that attempt to capture the spatial/spectral correlation inherent in real images. The model is capable of simultaneously generating imagery from a wide range of sensors allowing it to generate daylight, low-light-level and thermal image inputs for broadband, multi- and hyper-spectral exploitation algorithms.
NASA Shuttle Orbiter Reinforced Carbon Carbon (RCC) Crack Repair Arc-Jet Testing
NASA Technical Reports Server (NTRS)
Clark, ShawnDella; Larin, Max; Rochelle, Bill
2007-01-01
This NASA study demonstrates the capability for testing NOAX-repaired RCC crack models in high-temperature environments representative of the Shuttle Orbiter during reentry. Analysis methods have provided correlation of test data with flight predictions. NOAX repair material for RCC is flown on every STS flight in the event such a repair is needed. Two final test reports are being generated on the arc-jet results (both calibration model runs and repaired model runs).
Performance evaluation of an automotive thermoelectric generator
NASA Astrophysics Data System (ADS)
Dubitsky, Andrei O.
Around 40% of the total fuel energy in typical internal combustion engines (ICEs) is rejected to the environment in the form of exhaust gas waste heat. Efficient recovery of this waste heat in automobiles can promise a fuel economy improvement of 5%. The thermal energy can be harvested through thermoelectric generators (TEGs) utilizing the Seebeck effect. In the present work, a versatile test bench has been designed and built in order to simulate conditions found on test vehicles. This allows experimental performance evaluation and model validation of automotive thermoelectric generators. An electrically heated exhaust gas circuit and a circulator based coolant loop enable integrated system testing of hot and cold side heat exchangers, thermoelectric modules (TEMs), and thermal interface materials at various scales. A transient thermal model of the coolant loop was created in order to design a system which can maintain constant coolant temperature under variable heat input. Additionally, as electrical heaters cannot match the transient response of an ICE, modelling was completed in order to design a relaxed exhaust flow and temperature history utilizing the system thermal lag. This profile reduced required heating power and gas flow rates by over 50%. The test bench was used to evaluate a DOE/GM initial prototype automotive TEG and validate analytical performance models. The maximum electrical power generation was found to be 54 W with a thermal conversion efficiency of 1.8%. It has been found that thermal interface management is critical for achieving maximum system performance, with novel designs being considered for further improvement.
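For orientation, the reported figures imply roughly 3 kW of heat entering the hot-side heat exchanger, since the thermal conversion efficiency is the electrical output divided by the hot-side heat input (a back-of-the-envelope check using only the numbers quoted above):

```python
p_electrical = 54.0   # W, maximum electrical power reported for the prototype TEG
efficiency = 0.018    # reported thermal conversion efficiency (1.8%)

q_hot = p_electrical / efficiency       # heat absorbed by the hot-side heat exchanger
q_rejected = q_hot - p_electrical       # heat that must be carried away by the coolant loop
print(f"hot-side heat input ~{q_hot:.0f} W, rejected to coolant ~{q_rejected:.0f} W")
```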
Requirements-Based Conformance Testing of ARINC 653 Real-Time Operating Systems
NASA Astrophysics Data System (ADS)
Maksimov, Andrey
2010-08-01
Requirements-based testing is emphasized in avionics certification documents because this strategy has been found to be the most effective at revealing errors. This paper describes the unified requirements-based approach to the creation of conformance test suites for mission-critical systems. The approach uses formal machine-readable specifications of requirements and finite state machine model for test sequences generation on-the-fly. The paper also presents the test system for automated test generation for ARINC 653 services built on this approach. Possible application of the presented approach to various areas of avionics embedded systems testing is discussed.
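A minimal sketch of on-the-fly test-sequence generation from a finite state machine model is given below in Python; the states and operations are hypothetical and are not taken from the ARINC 653 service specifications.

```python
import random

# Hypothetical FSM of a process life cycle: state -> {operation: expected next state}.
FSM = {
    "DORMANT": {"create": "READY"},
    "READY":   {"start": "RUNNING", "delete": "DORMANT"},
    "RUNNING": {"suspend": "WAITING", "stop": "READY"},
    "WAITING": {"resume": "RUNNING"},
}

def generate_sequence(max_steps=8, seed=0):
    """Walk the FSM and emit (current state, operation, expected next state) test steps."""
    rng = random.Random(seed)
    state, steps = "DORMANT", []
    for _ in range(max_steps):
        op = rng.choice(sorted(FSM[state]))
        nxt = FSM[state][op]
        steps.append((state, op, nxt))
        state = nxt
    return steps

for state, op, expected in generate_sequence():
    print(f"in {state:8s} apply {op:8s} expect {expected}")
```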
Verification of component mode techniques for flexible multibody systems
NASA Technical Reports Server (NTRS)
Wiens, Gloria J.
1990-01-01
Investigations were conducted into the modeling aspects of flexible multibodies undergoing large angular displacements. Models were to be generated and analyzed through application of computer simulation packages employing 'component mode synthesis' techniques. The Multibody Modeling, Verification and Control Laboratory (MMVC) plan was implemented, which includes running experimental tests on flexible multibody test articles. From these tests, data was to be collected for later correlation and verification of the theoretical results predicted by the modeling and simulation process.
Validation of two (parametric vs non-parametric) daily weather generators
NASA Astrophysics Data System (ADS)
Dubrovsky, M.; Skalak, P.
2015-12-01
As the climate models (GCMs and RCMs) fail to satisfactorily reproduce the real-world surface weather regime, various statistical methods are applied to downscale GCM/RCM outputs into site-specific weather series. Stochastic weather generators are among the most popular downscaling methods, capable of producing realistic (observed-like) meteorological inputs for agrological, hydrological and other impact models used in assessing the sensitivity of various ecosystems to climate change/variability. To name their advantages, the generators may (i) produce arbitrarily long multi-variate synthetic weather series representing both present and changed climates (in the latter case, the generators are commonly modified by GCM/RCM-based climate change scenarios), (ii) be run in various time steps and for multiple weather variables (the generators reproduce the correlations among variables), and (iii) be interpolated (and run also for sites where no weather data are available to calibrate the generator). This contribution will compare two stochastic daily weather generators in terms of their ability to reproduce various features of the daily weather series. M&Rfi is a parametric generator: a Markov chain model is used to model precipitation occurrence, precipitation amount is modelled by the Gamma distribution, and a first-order autoregressive model is used to generate non-precipitation surface weather variables. The non-parametric GoMeZ generator is based on the nearest-neighbours resampling technique, making no assumption on the distribution of the variables being generated. Various settings of both weather generators will be assumed in the present validation tests. The generators will be validated in terms of (a) extreme temperature and precipitation characteristics (annual and 30-year extremes and maxima of the duration of hot/cold/dry/wet spells); and (b) selected validation statistics developed within the frame of the VALUE project. The tests will be based on observational weather series from several European stations available from the ECA&D database. Acknowledgements: The weather generator is developed and validated within the frame of the projects WG4VALUE (sponsored by the Ministry of Education, Youth and Sports of CR) and VALUE (COST ES 1102 action).
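The M&Rfi-style parametric structure just described can be sketched compactly: a two-state Markov chain for wet/dry occurrence, Gamma-distributed amounts on wet days, and a first-order autoregressive process for a non-precipitation variable such as temperature. All parameter values below are arbitrary illustration values, not calibrated ones.

```python
import numpy as np

rng = np.random.default_rng(42)

def generate_daily_weather(n_days=365,
                           p_wet_after_dry=0.25, p_wet_after_wet=0.60,  # Markov chain
                           gamma_shape=0.8, gamma_scale=6.0,            # wet-day amounts (mm)
                           t_mean=10.0, ar1=0.8, t_sigma=3.0):          # AR(1) temperature
    wet = False
    precip, temp = np.zeros(n_days), np.zeros(n_days)
    t_anom = 0.0
    for d in range(n_days):
        # Precipitation occurrence follows a first-order two-state Markov chain.
        p_wet = p_wet_after_wet if wet else p_wet_after_dry
        wet = rng.random() < p_wet
        precip[d] = rng.gamma(gamma_shape, gamma_scale) if wet else 0.0
        # Temperature is an AR(1) anomaly around a fixed mean (no seasonality here).
        t_anom = ar1 * t_anom + rng.normal(scale=t_sigma * np.sqrt(1.0 - ar1**2))
        temp[d] = t_mean + t_anom
    return precip, temp

precip, temp = generate_daily_weather()
print("wet-day fraction:", round(float((precip > 0).mean()), 2),
      "| mean temperature:", round(float(temp.mean()), 1), "degC")
```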
Price vs. Performance: The Value of Next Generation Fighter Aircraft
2007-03-01
forms. Both the semi-log and log-log forms were plagued with heteroskedasticity (according to the Breusch-Pagan/Cook-Weisberg test). The RDT&E models...from 1949-present were used to construct two models – one based on procurement costs and one based on research, design, test, and evaluation (RDT&E...fighter aircraft hedonic models include several different categories of variables. Aircraft procurement costs and research, design, test, and
Automatic item generation implemented for measuring artistic judgment aptitude.
Bezruczko, Nikolaus
2014-01-01
Automatic item generation (AIG) is a broad class of methods that are being developed to address psychometric issues arising from internet and computer-based testing. In general, issues emphasize efficiency, validity, and diagnostic usefulness of large scale mental testing. Rapid prominence of AIG methods and their implicit perspective on mental testing is bringing painful scrutiny to many sacred psychometric assumptions. This report reviews basic AIG ideas, then presents conceptual foundations, image model development, and operational application to artistic judgment aptitude testing.
Summary of CPAS EDU Testing Analysis Results
NASA Technical Reports Server (NTRS)
Romero, Leah M.; Bledsoe, Kristin J.; Davidson, John.; Engert, Meagan E.; Fraire, Usbaldo, Jr.; Galaviz, Fernando S.; Galvin, Patrick J.; Ray, Eric S.; Varela, Jose
2015-01-01
The Orion program's Capsule Parachute Assembly System (CPAS) project is currently conducting its third generation of testing, the Engineering Development Unit (EDU) series. This series utilizes two test articles, a dart-shaped Parachute Compartment Drop Test Vehicle (PCDTV) and a capsule-shaped Parachute Test Vehicle (PTV), both of which include a full-size, flight-like parachute system and require a pallet delivery system for aircraft extraction. To date, 15 tests have been completed, including six with PCDTVs and nine with PTVs. Two of the PTV tests included the Forward Bay Cover (FBC) provided by Lockheed Martin. Advancements in modeling techniques applicable to parachute fly-out, vehicle rate of descent, torque, and load train also occurred during the EDU testing series. An upgrade from a composite to an independent parachute simulation allowed parachute modeling at a higher level of fidelity than during previous generations. The complexity of separating the test vehicles from their pallet delivery systems necessitated the use of the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulator for modeling mated-vehicle aircraft extraction and separation. This paper gives an overview of each EDU test and summarizes the development of CPAS analysis tools and techniques during EDU testing.
NASA Technical Reports Server (NTRS)
Antle, John M.; Basso, Bruno; Conant, Richard T.; Godfray, H. Charles J.; Jones, James W.; Herrero, Mario; Howitt, Richard E.; Keating, Brian A.; Munoz-Carpena, Rafael; Rosenzweig, Cynthia
2016-01-01
This paper presents ideas for a new generation of agricultural system models that could meet the needs of a growing community of end-users exemplified by a set of Use Cases. We envision new data, models and knowledge products that could accelerate the innovation process that is needed to achieve the goal of sustainable local, regional and global food security. We identify desirable features for models, and describe some of the potential advances that we envisage for model components and their integration. We propose an implementation strategy that would link a "pre-competitive" space for model development to a "competitive space" for knowledge product development, and through private-public partnerships for new data infrastructure. Specific model improvements would be based on further testing and evaluation of existing models, the development and testing of modular model components and integration, and linkages of model integration platforms to new data management and visualization tools.
Antle, John M; Basso, Bruno; Conant, Richard T; Godfray, H Charles J; Jones, James W; Herrero, Mario; Howitt, Richard E; Keating, Brian A; Munoz-Carpena, Rafael; Rosenzweig, Cynthia; Tittonell, Pablo; Wheeler, Tim R
2017-07-01
This paper presents ideas for a new generation of agricultural system models that could meet the needs of a growing community of end-users exemplified by a set of Use Cases. We envision new data, models and knowledge products that could accelerate the innovation process that is needed to achieve the goal of sustainable local, regional and global food security. We identify desirable features for models, and describe some of the potential advances that we envisage for model components and their integration. We propose an implementation strategy that would link a "pre-competitive" space for model development to a "competitive space" for knowledge product development, and through private-public partnerships for new data infrastructure. Specific model improvements would be based on further testing and evaluation of existing models, the development and testing of modular model components and integration, and linkages of model integration platforms to new data management and visualization tools.
NASA Technical Reports Server (NTRS)
Blotzer, Michael J.; Woods, Jody L.
2009-01-01
This viewgraph presentation reviews computational fluid dynamics as a tool for modelling the dispersion of carbon monoxide at the Stennis Space Center's A3 Test Stand. The contents include: 1) Constellation Program; 2) Constellation Launch Vehicles; 3) J2X Engine; 4) A-3 Test Stand; 5) Chemical Steam Generators; 6) Emission Estimates; 7) Located in Existing Test Complex; 8) Computational Fluid Dynamics; 9) Computational Tools; 10) CO Modeling; 11) CO Model results; and 12) Next steps.
We tested two methods for dataset generation and model construction, and three tree-classifier variants to identify the most parsimonious and thematically accurate mapping methodology for the SW ReGAP project. Competing methodologies were tested in the East Great Basin mapping un...
Gen 2.0 Mixer/Ejector Nozzle Test at LSAF June 1995 to July 1996
NASA Technical Reports Server (NTRS)
Arney, L. D.; Sandquist, D. L.; Forsyth, D. W.; Lidstone, G. L.; Long-Davis, Mary Jo (Technical Monitor)
2005-01-01
Testing of the HSCT Generation 2.0 nozzle model hardware was conducted at the Boeing Low Speed Aeroacoustic Facility, LSAF. Concurrent measurements of noise and thrust were made at critical takeoff design conditions for a variety of mixer/ejector model hardware. Design variables such as suppressor area ratio, mixer area ratio, liner type and thickness, ejector length, lobe penetration, and mixer chute shape were tested. Parallel testing was conducted at G.E.'s Cell 41 acoustic free jet facility to augment the LSAF test. The results from the Gen 2.0 testing are being used to help shape the current nozzle baseline configuration and guide the efforts in the upcoming Generation 2.5 and 3.0 nozzle tests. The Gen 2.0 results have been included in the total airplane system studies conducted at MDC and Boeing to provide updated noise and thrust performance estimates.
Analysis of screeching in a cold flow jet experiment
NASA Technical Reports Server (NTRS)
Wang, M. E.; Slone, R. M., Jr.; Robertson, J. E.; Keefe, L.
1975-01-01
The screech phenomenon observed in a one-sixtieth-scale model space shuttle test of solid rocket booster exhaust flow noise has been investigated. A critical review is given of the cold flow test data representative of Space Shuttle launch configurations to define those parameters which contribute to screech generation. An acoustic feedback mechanism is found to be responsible for the generation of screech. A simple equation which permits prediction of the screech frequency in terms of basic testing parameters, such as the jet exhaust Mach number and the separation distance from the nozzle exit to the surface of the model launch pad, is presented and found to be in good agreement with the test data. Finally, techniques are recommended to eliminate or reduce the screech.
Method to Generate Full-Span Ice Shape on Swept Wing Using Icing Tunnel Data
NASA Technical Reports Server (NTRS)
Lee, Sam; Camello, Stephanie
2015-01-01
There is a collaborative research program by NASA, FAA, ONERA, and university partners to improve the fidelity of experimental and computational simulation methods for swept-wing ice accretion formations and the resultant aerodynamic effects on large transport aircraft. This research utilizes a 65%-scale Common Research Model as the baseline configuration. In order to generate the ice shapes for the aerodynamic testing, ice-accretion testing will be conducted in the NASA Icing Research Tunnel utilizing hybrid models at the 20%, 64%, and 83% spanwise locations. The models will have full-scale leading edges with truncated chord in order to fit the IRT test section. The ice shapes from the IRT tests will be digitized using a commercially available articulated-arm 3D laser scanning system. The methodology to acquire 3D ice shapes using a laser scanner was developed and validated in a previous research effort. Each of these models will yield a 1.5 ft span of ice that can be used. However, a full-span ice accretion will require a 75 ft span of ice. This means there will be large gaps between these spanwise ice sections that must be filled, while maintaining all of the important aerodynamic features. A method was developed to generate a full-span ice shape from the three 1.5 ft span ice shapes from the three models.
Li, Jie; Na, Lixin; Ma, Hao; Zhang, Zhe; Li, Tianjiao; Lin, Liqun; Li, Qiang; Sun, Changhao; Li, Ying
2015-01-01
The effects of prenatal nutrition on adult cognitive function have been reported for one generation. However, human evidence for multigenerational effects is lacking. We examined whether prenatal exposure to the Chinese famine of 1959–61 affects adult cognitive function in two consecutive generations. In this retrospective family cohort study, we investigated 1062 families consisting of 2124 parents and 1215 offspring. We assessed parental and offspring cognitive performance by means of a comprehensive test battery. Generalized linear regression model analysis in the parental generation showed that prenatal exposure to famine was associated with an 8.1 (95% CI 5.8 to 10.4) second increase in trail making test part A, a 7.0 (1.5 to 12.5) second increase in trail making test part B, and a 5.5 (−7.3 to −3.7) score decrease in the Stroop color-word test in adulthood, after adjustment for potential confounders. In the offspring generation, linear mixed model analysis found no significant association between parental prenatal exposure to famine and offspring cognitive function in adulthood after adjustment for potential confounders. In conclusion, prenatal exposure to severe malnutrition is negatively associated with visual-motor skill, mental flexibility, and selective attention in adulthood. However, these associations are limited to only one generation. PMID:26333696
Calculating Nozzle Side Loads using Acceleration Measurements of Test-Based Models
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Ruf, Joe
2007-01-01
As part of a NASA/MSFC research program to evaluate the effect of different nozzle contours on the well-known but poorly characterized "side load" phenomena, we attempt to back out the net force on a sub-scale nozzle during cold-flow testing using acceleration measurements. Because modeling the test facility dynamics is problematic, new techniques for creating a "pseudo-model" of the facility and nozzle directly from modal test results are applied. Extensive verification procedures were undertaken, resulting in a loading scale factor necessary for agreement between test- and model-based frequency response functions. Side loads are then obtained by applying a wide-band random load onto the system model, obtaining nozzle response PSDs, and iterating both the amplitude and frequency of the input until a good comparison of the response with the measured response PSD for a specific time point is obtained. The final calculated loading can be used to compare different nozzle profiles for assessment during rocket engine nozzle development and as a basis for accurate design of the nozzle and engine structure to withstand these loads. The techniques applied within this procedure have extensive applicability to timely and accurate characterization of all test fixtures used for modal testing. A viewgraph presentation on a model-test-based pseudo-model used to calculate side loads on rocket engine nozzles is included. The topics include: 1) Side Loads in Rocket Nozzles; 2) Present Side Loads Research at NASA/MSFC; 3) Structural Dynamic Model Generation; 4) Pseudo-Model Generation; 5) Implementation; 6) Calibration of Pseudo-Model Response; 7) Pseudo-Model Response Verification; 8) Inverse Force Determination; 9) Results; and 10) Recent Work.
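For a linear structure the response PSD equals |H(f)|² times the input force PSD, which is the relationship exploited when the input amplitude is iterated until the predicted nozzle response matches the measurement. A single-degree-of-freedom sketch of that scaling step is shown below; the mass, damping, stiffness, and "measured" PSD are all hypothetical stand-ins.

```python
import numpy as np

# Hypothetical single-DOF frequency response function H(f) = 1 / (k - (2*pi*f)^2 * m + i*2*pi*f*c).
m, c, k = 2.0, 15.0, 4.0e5                 # kg, N*s/m, N/m  (resonance near 71 Hz)
f = np.linspace(1.0, 200.0, 2000)          # Hz
w = 2.0 * np.pi * f
H = 1.0 / (k - m * w**2 + 1j * c * w)

# Stand-in "measured" displacement response PSD (units^2/Hz).
S_measured = 1e-9 * np.exp(-((f - 70.0) / 15.0) ** 2)

# Start from a flat wide-band input force PSD and scale its amplitude so the
# predicted response PSD best matches the measurement in a least-squares sense.
S_input = np.full_like(f, 1e-2)            # N^2/Hz, initial guess
S_pred = np.abs(H) ** 2 * S_input
scale = np.sum(S_measured * S_pred) / np.sum(S_pred ** 2)

print("input-PSD scale factor:", scale)
print("peak predicted response after scaling:", float((scale * S_pred).max()))
```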
Tan, S; Hu, A; Wilson, T; Ladak, H; Haase, P; Fung, K
2012-04-01
(1) To investigate the efficacy of a computer-generated three-dimensional laryngeal model for laryngeal anatomy teaching; (2) to explore the relationship between students' spatial ability and acquisition of anatomical knowledge; and (3) to assess participants' opinion of the computerised model. Forty junior doctors were randomised to undertake laryngeal anatomy study supplemented by either a three-dimensional computer model or two-dimensional images. Outcome measurements comprised a laryngeal anatomy test, the modified Vandenberg and Kuse mental rotation test, and an opinion survey. Mean scores ± standard deviations for the anatomy test were 15.7 ± 2.0 for the 'three dimensions' group and 15.5 ± 2.3 for the 'standard' group (p = 0.7222). Pearson's correlation between the rotation test scores and the scores for the spatial ability questions in the anatomy test was 0.4791 (p = 0.086, n = 29). Opinion survey answers revealed significant differences in respondents' perceptions of the clarity and 'user friendliness' of, and their preferences for, the three-dimensional model as regards anatomical study. The three-dimensional computer model was equivalent to standard two-dimensional images, for the purpose of laryngeal anatomy teaching. There was no association between students' spatial ability and functional anatomy learning. However, students preferred to use the three-dimensional model.
Comparing fire spread algorithms using equivalence testing and neutral landscape models
Brian R. Miranda; Brian R. Sturtevant; Jian Yang; Eric J. Gustafson
2009-01-01
We demonstrate a method to evaluate the degree to which a meta-model approximates spatial disturbance processes represented by a more detailed model across a range of landscape conditions, using neutral landscapes and equivalence testing. We illustrate this approach by comparing burn patterns produced by a relatively simple fire spread algorithm with those generated by...
Testing Product Generation in Software Product Lines Using Pairwise for Features Coverage
NASA Astrophysics Data System (ADS)
Pérez Lamancha, Beatriz; Polo Usaola, Macario
A Software Product Line (SPL) is "a set of software-intensive systems sharing a common, managed set of features that satisfy the specific needs of a particular market segment or mission and that are developed from a common set of core assets in a prescribed way". Variability is a central concept that permits the generation of different products of the family by reusing core assets. It is captured through features which, for an SPL, define its scope. Features are represented in a feature model, which is later used to generate the products from the line. From the testing point of view, testing all the possible combinations in feature models is not practical because: (1) the number of possible combinations (i.e., combinations of features for composing products) may be intractable, and (2) some combinations may contain incompatible features. Thus, this paper addresses the problem by implementing combinatorial testing techniques adapted to the SPL context.
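A greedy illustration of pairwise (2-wise) product selection over a toy feature model is sketched below in Python; real SPL feature models also carry constraints that exclude incompatible feature combinations, which are omitted here.

```python
from itertools import combinations, product

# Toy feature model: each feature offers a set of alternative options.
features = {
    "GUI": ["swing", "web"],
    "DB": ["mysql", "sqlite", "none"],
    "Security": ["on", "off"],
}

names = list(features)
all_products = [dict(zip(names, combo)) for combo in product(*features.values())]

def pairs(prod):
    """All (feature=value, feature=value) pairs exercised by a single product."""
    items = sorted(prod.items())
    return {frozenset(p) for p in combinations(items, 2)}

uncovered = set().union(*(pairs(p) for p in all_products))
suite = []
while uncovered:
    # Greedily pick the candidate product that covers the most uncovered pairs.
    best = max(all_products, key=lambda p: len(pairs(p) & uncovered))
    suite.append(best)
    uncovered -= pairs(best)

print(f"{len(suite)} products cover all pairs (out of {len(all_products)} possible):")
for p in suite:
    print(" ", p)
```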
McFarland, Michael J; Palmer, Glenn R; Kordich, Micheal M; Pollet, Dean A; Jensen, James A; Lindsay, Mitchell H
2005-08-01
The U.S. Department of Defense-approved activities conducted at the Utah Test and Training Range (UTTR) include both operational readiness test firing of intercontinental ballistic missile motors as well as the destruction of obsolete or otherwise unusable intercontinental ballistic missile motors through open burn/open detonation (OB/OD). Within the Utah Division of Air Quality, these activities have been identified as having the potential to generate unacceptable noise levels, as well as significant amounts of hazardous air pollutants. Hill Air Force Base, UT, has completed a series of field tests at the UTTR in which sound-monitoring surveillance of OB/OD activities was conducted to validate the Sound Intensity Prediction System (SIPS) model. Using results generated by the SIPS model to support the decision to detonate, the UTTR successfully disposed of missile motors having an aggregate net explosive weight (NEW) of 56,500 lbs without generating adverse noise levels within populated areas. These results suggest that, under appropriate conditions, missile motors of even larger NEW may be detonated without exceeding regulatory noise limits. In conjunction with collecting noise monitoring data, air quality data was collected to support the development of air emission factors for both static missile motor firings and OB/OD activities. Through the installation of 15 ground-based air samplers, the generation of combustion fixed gases, hazardous air pollutants, and chlorides was monitored during the 56,500-lb NEW detonation event. Comparison of field measurements to predictions generated from the U.S. Navy's energetic combustion pollutant formation model, POLU4WN, indicated that, as the detonation fireball expanded from ground zero, organic compounds as well as carbon monoxide continued to oxidize as the hot gases reacted with ambient air. Hazardous air pollutant analysis of the air samplers confirmed the presence of chloromethane, benzene, toluene, 1,2-propadiene, and 2-methyl-1-propene, whereas the absence of hydrogen chloride gas suggested that free chlorine is not generated during the combustion process.
NASA Astrophysics Data System (ADS)
Einspigel, D.; Sachl, L.; Martinec, Z.
2014-12-01
We present the DEBOT model, which is a new global barotropic ocean model. The DEBOT model is primarily designed for modelling of ocean flow generated by the tidal attraction of the Moon and the Sun; however, it can be used for other ocean applications where a barotropic model is sufficient, for instance tsunami wave propagation. The model has been thoroughly tested by several different methods: 1) a synthetic example involving tsunami-like wave propagation of an initial Gaussian depression and testing of the conservation of integral invariants, 2) a benchmark study against another barotropic model, the LSGbt model, and 3) comparison of the results of realistic simulations with data from tide gauge measurements around the world. The test computations prove the validity of the numerical code and demonstrate the ability of the DEBOT model to simulate realistic ocean tides. The DEBOT model will principally be applied in related geophysical disciplines, for instance in investigations of the influence of the ocean tides on the geomagnetic field or the Earth's rotation. A module for modelling the secondary poloidal magnetic field generated by an ocean flow is already implemented in the DEBOT model and preliminary results will be presented. The future aim is to assimilate magnetic data provided by the Swarm satellite mission into the ocean flow model.
Does rational selection of training and test sets improve the outcome of QSAR modeling?
Martin, Todd M; Harten, Paul; Young, Douglas M; Muratov, Eugene N; Golbraikh, Alexander; Zhu, Hao; Tropsha, Alexander
2012-10-22
Prior to using a quantitative structure activity relationship (QSAR) model for external predictions, its predictive power should be established and validated. In the absence of a true external data set, the best way to validate the predictive ability of a model is to perform its statistical external validation. In statistical external validation, the overall data set is divided into training and test sets. Commonly, this splitting is performed using random division. Rational splitting methods can divide data sets into training and test sets in an intelligent fashion. The purpose of this study was to determine whether rational division methods lead to more predictive models compared to random division. A special data splitting procedure was used to facilitate the comparison between random and rational division methods. For each toxicity end point, the overall data set was divided into a modeling set (80% of the overall set) and an external evaluation set (20% of the overall set) using random division. The modeling set was then subdivided into a training set (80% of the modeling set) and a test set (20% of the modeling set) using rational division methods and by using random division. The Kennard-Stone, minimal test set dissimilarity, and sphere exclusion algorithms were used as the rational division methods. The hierarchical clustering, random forest, and k-nearest neighbor (kNN) methods were used to develop QSAR models based on the training sets. For kNN QSAR, multiple training and test sets were generated, and multiple QSAR models were built. The results of this study indicate that models based on rational division methods generate better statistical results for the test sets than models based on random division, but the predictive power of both types of models is comparable.
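A compact sketch of one of the rational division methods named above, the Kennard-Stone algorithm, is given below in Python: the training set is seeded with the two most distant compounds in descriptor space and then grown by repeatedly adding the compound farthest from those already selected (the descriptor matrix here is a random placeholder).

```python
import numpy as np

def kennard_stone(X, n_train):
    """Indices of a training subset chosen by the Kennard-Stone algorithm."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    first, second = np.unravel_index(np.argmax(d), d.shape)     # two most distant points
    selected = [int(first), int(second)]
    remaining = [i for i in range(len(X)) if i not in selected]
    while len(selected) < n_train:
        # For each remaining point, find its distance to the nearest selected point,
        # then pick the point for which that nearest distance is largest.
        min_d = d[np.ix_(remaining, selected)].min(axis=1)
        nxt = remaining[int(np.argmax(min_d))]
        selected.append(nxt)
        remaining.remove(nxt)
    return selected

X = np.random.default_rng(3).normal(size=(50, 4))   # placeholder descriptor matrix
train_idx = kennard_stone(X, n_train=40)
test_idx = [i for i in range(len(X)) if i not in train_idx]
print(len(train_idx), "training compounds,", len(test_idx), "test compounds")
```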
Generation and detection of plasmonic nanobubbles in zebrafish.
Lukianova-Hleb, E Y; Santiago, C; Wagner, D S; Hafner, J H; Lapotko, D O
2010-06-04
The zebrafish embryo has been evaluated as an in vivo model for plasmonic nanobubble (PNB) generation and detection at nanoscale. The embryo is easily observed and manipulated utilizing the same methodology as for application of PNBs in vitro. Injection and irradiation of gold nanoparticles with a short laser pulse resulted in generation of PNBs in zebrafish with similar parameters as for PNBs generated in water and cultured living cells. These PNBs do not result in systemic damage, thus we demonstrated an in vivo model for rapid and precise testing of plasmonic nanotechnologies.
NASA Technical Reports Server (NTRS)
Chen, Ping-Chih (Inventor)
2013-01-01
This invention is a ground flutter testing system without a wind tunnel, called Dry Wind Tunnel (DWT) System. The DWT system consists of a Ground Vibration Test (GVT) hardware system, a multiple input multiple output (MIMO) force controller software, and a real-time unsteady aerodynamic force generation software, that is developed from an aerodynamic reduced order model (ROM). The ground flutter test using the DWT System operates on a real structural model, therefore no scaled-down structural model, which is required by the conventional wind tunnel flutter test, is involved. Furthermore, the impact of the structural nonlinearities on the aeroelastic stability can be included automatically. Moreover, the aeroservoelastic characteristics of the aircraft can be easily measured by simply including the flight control system in-the-loop. In addition, the unsteady aerodynamics generated computationally is interference-free from the wind tunnel walls. Finally, the DWT System can be conveniently and inexpensively carried out as a post GVT test with the same hardware, only with some possible rearrangement of the shakers and the inclusion of additional sensors.
ERIC Educational Resources Information Center
Steiger, Andrea E.; Fend, Helmut A.; Allemand, Mathias
2015-01-01
The vulnerability model states that low self-esteem functions as a predictor for the development of depressive symptoms whereas the scar model assumes that these symptoms leave scars in individuals resulting in lower self-esteem. Both models have received empirical support, however, they have only been tested within individuals and not across…
Symbolic PathFinder: Symbolic Execution of Java Bytecode
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Rungta, Neha
2010-01-01
Symbolic Pathfinder (SPF) combines symbolic execution with model checking and constraint solving for automated test case generation and error detection in Java programs with unspecified inputs. In this tool, programs are executed on symbolic inputs representing multiple concrete inputs. Values of variables are represented as constraints generated from the analysis of Java bytecode. The constraints are solved using off-the shelf solvers to generate test inputs guaranteed to achieve complex coverage criteria. SPF has been used successfully at NASA, in academia, and in industry.
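The core idea (collect the path condition along each branch and hand it to a constraint solver to obtain concrete test inputs) can be illustrated with the Z3 Python bindings; this is only a stand-in for SPF's analysis of Java bytecode and its own off-the-shelf solvers.

```python
from z3 import Int, Solver, And, Not, sat

# Program under test (conceptually): if (x > 10 && y < x) {...} else {...}
x, y = Int("x"), Int("y")

path_conditions = {
    "then-branch": And(x > 10, y < x),
    "else-branch": Not(And(x > 10, y < x)),
}

for path, condition in path_conditions.items():
    solver = Solver()
    solver.add(condition)
    if solver.check() == sat:
        model = solver.model()
        # model_completion fills in values for variables the path does not constrain.
        print(f"{path}: test input x={model.eval(x, model_completion=True)}, "
              f"y={model.eval(y, model_completion=True)}")
    else:
        print(f"{path}: infeasible path")
```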
Improved animal models for testing gene therapy for atherosclerosis.
Du, Liang; Zhang, Jingwan; De Meyer, Guido R Y; Flynn, Rowan; Dichek, David A
2014-04-01
Gene therapy delivered to the blood vessel wall could augment current therapies for atherosclerosis, including systemic drug therapy and stenting. However, identification of clinically useful vectors and effective therapeutic transgenes remains at the preclinical stage. Identification of effective vectors and transgenes would be accelerated by availability of animal models that allow practical and expeditious testing of vessel-wall-directed gene therapy. Such models would include humanlike lesions that develop rapidly in vessels that are amenable to efficient gene delivery. Moreover, because human atherosclerosis develops in normal vessels, gene therapy that prevents atherosclerosis is most logically tested in relatively normal arteries. Similarly, gene therapy that causes atherosclerosis regression requires gene delivery to an existing lesion. Here we report development of three new rabbit models for testing vessel-wall-directed gene therapy that either prevents or reverses atherosclerosis. Carotid artery intimal lesions in these new models develop within 2-7 months after initiation of a high-fat diet and are 20-80 times larger than lesions in a model we described previously. Individual models allow generation of lesions that are relatively rich in either macrophages or smooth muscle cells, permitting testing of gene therapy strategies targeted at either cell type. Two of the models include gene delivery to essentially normal arteries and will be useful for identifying strategies that prevent lesion development. The third model generates lesions rapidly in vector-naïve animals and can be used for testing gene therapy that promotes lesion regression. These models are optimized for testing helper-dependent adenovirus (HDAd)-mediated gene therapy; however, they could be easily adapted for testing of other vectors or of different types of molecular therapies, delivered directly to the blood vessel wall. Our data also supports the promise of HDAd to deliver long-term therapy from vascular endothelium without accelerating atherosclerotic disease.
A Statistical Method to Distinguish Functional Brain Networks
Fujita, André; Vidal, Maciel C.; Takahashi, Daniel Y.
2017-01-01
One major problem in neuroscience is the comparison of functional brain networks of different populations, e.g., distinguishing the networks of controls and patients. Traditional algorithms are based on search for isomorphism between networks, assuming that they are deterministic. However, biological networks present randomness that cannot be well modeled by those algorithms. For instance, functional brain networks of distinct subjects of the same population can be different due to individual characteristics. Moreover, networks of subjects from different populations can be generated through the same stochastic process. Thus, a better hypothesis is that networks are generated by random processes. In this case, subjects from the same group are samples from the same random process, whereas subjects from different groups are generated by distinct processes. Using this idea, we developed a statistical test called ANOGVA to test whether two or more populations of graphs are generated by the same random graph model. Our simulations' results demonstrate that we can precisely control the rate of false positives and that the test is powerful to discriminate random graphs generated by different models and parameters. The method also showed to be robust for unbalanced data. As an example, we applied ANOGVA to an fMRI dataset composed of controls and patients diagnosed with autism or Asperger. ANOGVA identified the cerebellar functional sub-network as statistically different between controls and autism (p < 0.001). PMID:28261045
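A schematic of the permutation-test idea in Python is given below: two groups of random graphs are compared through a distance between their mean adjacency spectra, and group labels are shuffled to obtain a p-value. The actual ANOGVA statistic is defined on graph spectral densities, so this sketch conveys only the general flavor.

```python
import numpy as np

rng = np.random.default_rng(7)

def erdos_renyi(n, p):
    """Adjacency matrix of an undirected Erdos-Renyi random graph."""
    upper = np.triu((rng.random((n, n)) < p).astype(float), 1)
    return upper + upper.T

def mean_spectrum(graphs):
    """Mean sorted adjacency spectrum over a group of graphs."""
    return np.mean([np.sort(np.linalg.eigvalsh(g)) for g in graphs], axis=0)

def group_distance(group_a, group_b):
    return np.linalg.norm(mean_spectrum(group_a) - mean_spectrum(group_b))

# Two populations generated by different random graph models.
group_a = [erdos_renyi(30, 0.20) for _ in range(20)]
group_b = [erdos_renyi(30, 0.30) for _ in range(20)]

observed = group_distance(group_a, group_b)
pooled = group_a + group_b
n_a, n_perm, exceed = len(group_a), 500, 0
for _ in range(n_perm):
    idx = rng.permutation(len(pooled))
    if group_distance([pooled[i] for i in idx[:n_a]],
                      [pooled[i] for i in idx[n_a:]]) >= observed:
        exceed += 1
print("permutation p-value:", (exceed + 1) / (n_perm + 1))
```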
Visual Persons Behavior Diary Generation Model based on Trajectories and Pose Estimation
NASA Astrophysics Data System (ADS)
Gang, Chen; Bin, Chen; Yuming, Liu; Hui, Li
2018-03-01
The behavior patterns of persons are an important output of surveillance analysis. This paper focuses on a generation model for a visual diary of person behavior. The pipeline includes person detection, tracking, and behavior classification. The deep convolutional neural model YOLO (You Only Look Once) v2 is adopted for the person detection module. Multi-person tracking is based on this detection framework, with the Hungarian assignment algorithm used for matching. The person appearance model combines an HSV color model and a hash-code model, and the person's motion is estimated by a Kalman filter. Detected objects are matched to existing tracklets through the combined appearance and motion-location distance using the Hungarian assignment method. A long continuous trajectory for each person is obtained by a spatial-temporal linking algorithm, and face recognition information is used to identify the trajectory. The identified trajectories can then be used to generate the visual diary of person behavior based on scene context information and person action estimation. The relevant modules are tested on public data sets and on our own captured video sets. The test results show that the method can generate a visual person behavior diary with reasonable accuracy.
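The detection-to-tracklet matching step can be sketched with SciPy's Hungarian solver (linear_sum_assignment); the appearance and motion cost matrices below are placeholder numbers rather than actual HSV/hash-code similarities and Kalman-predicted distances.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Placeholder cost terms: rows are existing tracklets, columns are new detections.
appearance_cost = np.array([[0.2, 0.9, 0.8],
                            [0.7, 0.1, 0.6],
                            [0.8, 0.7, 0.3]])
motion_cost = np.array([[0.1, 0.8, 0.9],
                        [0.9, 0.2, 0.7],
                        [0.6, 0.8, 0.2]])

# Combine appearance and motion-location distances (weights are illustrative);
# the Hungarian algorithm returns the minimum-cost one-to-one assignment.
cost = 0.6 * appearance_cost + 0.4 * motion_cost
track_idx, det_idx = linear_sum_assignment(cost)
for t, d in zip(track_idx, det_idx):
    print(f"tracklet {t} <- detection {d} (cost {cost[t, d]:.2f})")
```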
Nicholson, Daren T; Chalk, Colin; Funnell, W Robert J; Daniel, Sam J
2006-11-01
The use of computer-generated 3-dimensional (3-D) anatomical models to teach anatomy has proliferated. However, there is little evidence that these models are educationally effective. The purpose of this study was to test the educational effectiveness of a computer-generated 3-D model of the middle and inner ear. We reconstructed a fully interactive model of the middle and inner ear from a magnetic resonance imaging scan of a human cadaver ear. To test the model's educational usefulness, we conducted a randomised controlled study in which 28 medical students completed a Web-based tutorial on ear anatomy that included the interactive model, while a control group of 29 students took the tutorial without exposure to the model. At the end of the tutorials, both groups were asked a series of 15 quiz questions to evaluate their knowledge of 3-D relationships within the ear. The intervention group's mean score on the quiz was 83%, while that of the control group was 65%. This difference in means was highly significant (P < 0.001). Our findings stand in contrast to the handful of previous randomised controlled trials that evaluated the effects of computer-generated 3-D anatomical models on learning. The equivocal and negative results of these previous studies may be due to the limitations of these studies (such as small sample size) as well as the limitations of the models that were studied (such as a lack of full interactivity). Given our positive results, we believe that further research is warranted concerning the educational effectiveness of computer-generated anatomical models.
A simple topography-driven, calibration-free runoff generation model
NASA Astrophysics Data System (ADS)
Gao, H.; Birkel, C.; Hrachowitz, M.; Tetzlaff, D.; Soulsby, C.; Savenije, H. H. G.
2017-12-01
Determining the amount of runoff generation from rainfall occupies a central place in rainfall-runoff modelling. Moreover, reading landscapes and developing calibration-free runoff generation models that adequately reflect land surface heterogeneities remains the focus of much hydrological research. In this study, we created a new method to estimate runoff generation - HAND-based Storage Capacity curve (HSC) which uses a topographic index (HAND, Height Above the Nearest Drainage) to identify hydrological similarity and partially the saturated areas of catchments. We then coupled the HSC model with the Mass Curve Technique (MCT) method to estimate root zone storage capacity (SuMax), and obtained the calibration-free runoff generation model HSC-MCT. Both the two models (HSC and HSC-MCT) allow us to estimate runoff generation and simultaneously visualize the spatial dynamic of saturated area. We tested the two models in the data-rich Bruntland Burn (BB) experimental catchment in Scotland with an unusual time series of the field-mapped saturation area extent. The models were subsequently tested in 323 MOPEX (Model Parameter Estimation Experiment) catchments in the United States. HBV and TOPMODEL were used as benchmarks. We found that the HSC performed better in reproducing the spatio-temporal pattern of the observed saturated areas in the BB catchment compared with TOPMODEL which is based on the topographic wetness index (TWI). The HSC also outperformed HBV and TOPMODEL in the MOPEX catchments for both calibration and validation. Despite having no calibrated parameters, the HSC-MCT model also performed comparably well with the calibrated HBV and TOPMODEL, highlighting the robustness of the HSC model to both describe the spatial distribution of the root zone storage capacity and the efficiency of the MCT method to estimate the SuMax. Moreover, the HSC-MCT model facilitated effective visualization of the saturated area, which has the potential to be used for broader geoscience studies beyond hydrology.
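As a loose illustration of the HAND idea referenced above, and not the published HSC storage-capacity formulation or the MCT coupling, the sketch below treats cells with a low Height Above the Nearest Drainage as the first to saturate. The saturated_fraction helper and the toy HAND values are assumptions for illustration only.

```python
# Hedged, simplified illustration of the HAND idea: cells with small Height
# Above the Nearest Drainage saturate first. Conceptual sketch only.
import numpy as np

def saturated_fraction(hand_values, water_level):
    """Fraction of catchment cells whose HAND lies below a notional water level."""
    hand_values = np.asarray(hand_values, dtype=float)
    return np.mean(hand_values <= water_level)

# Toy catchment: HAND values (metres) for 10 grid cells.
hand = [0.2, 0.5, 1.0, 1.5, 2.0, 3.0, 4.5, 6.0, 8.0, 12.0]
for level in (0.5, 2.0, 6.0):
    print(f"water level {level} m -> saturated fraction {saturated_fraction(hand, level):.1f}")
```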
Monte Carlo Simulation Using HyperCard and Lotus 1-2-3.
ERIC Educational Resources Information Center
Oulman, Charles S.; Lee, Motoko Y.
Monte Carlo simulation is a computer modeling procedure for mimicking observations on a random variable. A random number generator is used in generating the outcome for the events that are being modeled. The simulation can be used to obtain results that otherwise require extensive testing or complicated computations. This paper describes how Monte…
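The same kind of Monte Carlo experiment the paper builds in HyperCard and Lotus 1-2-3 can be expressed in a few lines of Python. The dice example and the monte_carlo_dice helper below are illustrative assumptions, not content from the paper.

```python
# Hedged sketch: estimate the probability that the sum of two dice exceeds 9
# by repeated random sampling.
import random

def monte_carlo_dice(trials=100_000, seed=42):
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials) if rng.randint(1, 6) + rng.randint(1, 6) > 9)
    return hits / trials

print(monte_carlo_dice())   # analytic answer: 6/36 = 0.1667
```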
Calibrating Item Families and Summarizing the Results Using Family Expected Response Functions
ERIC Educational Resources Information Center
Sinharay, Sandip; Johnson, Matthew S.; Williamson, David M.
2003-01-01
Item families, which are groups of related items, are becoming increasingly popular in complex educational assessments. For example, in automatic item generation (AIG) systems, a test may consist of multiple items generated from each of a number of item models. Item calibration or scoring for such an assessment requires fitting models that can…
Engineering High Assurance Distributed Cyber Physical Systems
2015-01-15
decisions: number of interacting agents and co-dependent decisions made in real-time without causing interference. To engineer a high assurance DART... environment specification, architecture definition, domain-specific languages, design patterns, code generation, analysis, test generation, and simulation... include synchronization between the models and source code, debugging at the model level, expression of the design intent, and quality of service
ERIC Educational Resources Information Center
Obschonka, Martin; Silbereisen, Rainer K.; Schmitt-Rodermund, Eva
2012-01-01
Applying a life-span approach of human development and using the example of science-based business idea generation, the authors used structural equation modeling to test a mediation model for predicting entrepreneurial behavior in a sample of German scientists (2 measurement occasions; Time 1, N = 488). It was found that recalled early…
Scaled Tank Test Design and Results for the Aquantis 2.5 MW Ocean Current Generation Device
Swales, Henry; Kils, Ole; Coakley, David B.; Sites, Eric; Mayer, Tyler
2015-06-03
Aquantis 2.5 MW Ocean Current Generation Device, Tow Tank Dynamic Rig Structural Analysis Results. This is the detailed documentation for scaled device testing in a tow tank, including models, drawings, presentations, cost of energy analysis, and structural analysis. This dataset also includes specific information on drivetrain, roller bearing, blade fabrication, mooring, and rotor characteristics.
USDA-ARS?s Scientific Manuscript database
CLIGEN (CLImate GENerator) is a widely used stochastic weather generator that simulates continuous daily precipitation and storm pattern information for hydrological and soil erosion models. Although CLIGEN has been tested in several regions of the world, thorough assessment before applying it to Chi...
Multispectral Remote Sensing of the Earth and Environment Using KHawk Unmanned Aircraft Systems
NASA Astrophysics Data System (ADS)
Gowravaram, Saket
This thesis focuses on the development and testing of the KHawk multispectral remote sensing system for environmental and agricultural applications. KHawk Unmanned Aircraft System (UAS), a small and low-cost remote sensing platform, is used as the test bed for aerial video acquisition. An efficient image geotagging and photogrammetric procedure for aerial map generation is described, followed by a comprehensive error analysis on the generated maps. The developed procedure is also used for generation of multispectral aerial maps including red, near infrared (NIR) and colored infrared (CIR) maps. A robust Normalized Difference Vegetation Index (NDVI) calibration procedure is proposed and validated by ground tests and KHawk flight tests. Finally, the generated aerial maps and their corresponding Digital Elevation Models (DEMs) are used for typical application scenarios including prescribed fire monitoring, initial fire line estimation, and tree health monitoring.
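The NDVI referred to above follows the standard band-ratio definition; the sketch below applies it to two small co-registered band arrays. The ndvi helper and the example band values are assumptions for illustration and do not reproduce the thesis's calibration procedure.

```python
# Hedged sketch of the standard NDVI formula applied to red and NIR band arrays.
import numpy as np

def ndvi(red, nir, eps=1e-9):
    """Normalized Difference Vegetation Index, elementwise over two bands."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (nir - red) / (nir + red + eps)

red_band = np.array([[0.10, 0.20], [0.30, 0.05]])
nir_band = np.array([[0.50, 0.40], [0.35, 0.45]])
print(ndvi(red_band, nir_band))   # healthy vegetation -> values near +1
```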
Metastatic melanoma moves on: translational science in the era of personalized medicine.
Levesque, Mitchell P; Cheng, Phil F; Raaijmakers, Marieke I G; Saltari, Annalisa; Dummer, Reinhard
2017-03-01
Progress in understanding and treating metastatic melanoma is the result of decades of basic and translational research as well as the development of better in vitro tools for modeling the disease. Here, we review the latest therapeutic options for metastatic melanoma and the known genetic and non-genetic mechanisms of resistance to these therapies, as well as the in vitro toolbox that has provided the greatest insights into melanoma progression. These include next-generation sequencing technologies and more complex 2D and 3D cell culture models to functionally test the data generated by genomics approaches. The combination of hypothesis generating and hypothesis testing paradigms reviewed here will be the foundation for the next phase of metastatic melanoma therapies in the coming years.
NASA Astrophysics Data System (ADS)
Pickett, Derek Kyle
Due to an increased interest in sustainable energy, biodiesel has become much more widely used in the last several years. Glycerin, one major waste component in biodiesel production, can be converted into a hydrogen-rich synthesis gas to be used in an engine generator to recover energy from the biodiesel production process. This thesis contains information detailing the production, testing, and analysis of a unique synthesis generator rig at the University of Kansas. Chapter 2 gives a complete background of all major components, as well as how they are operated. In addition to component descriptions, methods for operating the system on pure propane, reformed propane, and reformed glycerin, along with the methodology of data acquisition, are described. This chapter will serve as a complete operating manual for future students to continue research on the project. Chapter 3 details the literature review that was completed to better understand fuel reforming of propane and glycerin. This chapter also describes the numerical model produced to estimate the species produced during reformation activities. The model was applied to propane reformation in a proof-of-concept and calibration test before moving to glycerin reformation and its subsequent combustion. Chapter 4 first describes the efforts to apply the numerical model to glycerin using the calibration tools from propane reformation. It then discusses catalytic material preparation and glycerin reformation tests. Gas chromatography analysis of the reformer effluent was completed to compare to theoretical values from the numerical model. Finally, combustion of reformed glycerin was completed for power generation. Tests were completed to compare emissions from syngas combustion and propane combustion.
Simulating the Impact Response of Three Full-Scale Crash Tests of Cessna 172 Aircraft
NASA Technical Reports Server (NTRS)
Jackson, Karen E.; Fasanella, Edwin L.; Littell, Justin D.; Annett, Martin S.; Stimson, Chad M.
2017-01-01
During the summer of 2015, a series of three full-scale crash tests of Cessna 172 aircraft was performed at the Landing and Impact Research Facility located at NASA Langley Research Center. The first test (Test 1) represented a flare-to-stall emergency or hard landing onto a rigid surface. The second test (Test 2) represented a controlled-flight-into-terrain (CFIT) with a nose down pitch attitude of the aircraft, which impacted onto soft soil. The third test (Test 3) also represented a CFIT with a nose up pitch attitude of the aircraft, which resulted in a tail strike condition. Test 3 was also conducted onto soft soil. These crash tests were performed for the purpose of evaluating the performance of Emergency Locator Transmitters and to generate impact test data for model calibration. Finite element models were generated and impact analyses were conducted to simulate the three impact conditions using the commercial nonlinear, transient dynamic finite element code, LS-DYNA®. The objective of this paper is to summarize test-analysis results for the three full-scale crash tests.
Generalized functional linear models for gene-based case-control association studies.
Fan, Ruzong; Wang, Yifan; Mills, James L; Carter, Tonia C; Lobach, Iryna; Wilson, Alexander F; Bailey-Wilson, Joan E; Weeks, Daniel E; Xiong, Momiao
2014-11-01
By using functional data analysis techniques, we developed generalized functional linear models for testing association between a dichotomous trait and multiple genetic variants in a genetic region while adjusting for covariates. Both fixed and mixed effect models are developed and compared. Extensive simulations show that Rao's efficient score tests of the fixed effect models are very conservative since they generate lower type I errors than nominal levels, and global tests of the mixed effect models generate accurate type I errors. Furthermore, we found that the Rao's efficient score test statistics of the fixed effect models have higher power than the sequence kernel association test (SKAT) and its optimal unified version (SKAT-O) in most cases when the causal variants are both rare and common. When the causal variants are all rare (i.e., minor allele frequencies less than 0.03), the Rao's efficient score test statistics and the global tests have similar or slightly lower power than SKAT and SKAT-O. In practice, it is not known whether rare variants or common variants in a gene region are disease related. All we can assume is that a combination of rare and common variants influences disease susceptibility. Thus, the improved performance of our models when the causal variants are both rare and common shows that the proposed models can be very useful in dissecting complex traits. We compare the performance of our methods with SKAT and SKAT-O on real neural tube defects and Hirschsprung's disease datasets. The Rao's efficient score test statistics and the global tests are more sensitive than SKAT and SKAT-O in the real data analysis. Our methods can be used in either gene-disease genome-wide/exome-wide association studies or candidate gene analyses. © 2014 WILEY PERIODICALS, INC.
Generalized Functional Linear Models for Gene-based Case-Control Association Studies
Mills, James L.; Carter, Tonia C.; Lobach, Iryna; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Weeks, Daniel E.; Xiong, Momiao
2014-01-01
By using functional data analysis techniques, we developed generalized functional linear models for testing association between a dichotomous trait and multiple genetic variants in a genetic region while adjusting for covariates. Both fixed and mixed effect models are developed and compared. Extensive simulations show that Rao's efficient score tests of the fixed effect models are very conservative since they generate lower type I errors than nominal levels, and global tests of the mixed effect models generate accurate type I errors. Furthermore, we found that the Rao's efficient score test statistics of the fixed effect models have higher power than the sequence kernel association test (SKAT) and its optimal unified version (SKAT-O) in most cases when the causal variants are both rare and common. When the causal variants are all rare (i.e., minor allele frequencies less than 0.03), the Rao's efficient score test statistics and the global tests have similar or slightly lower power than SKAT and SKAT-O. In practice, it is not known whether rare variants or common variants in a gene are disease-related. All we can assume is that a combination of rare and common variants influences disease susceptibility. Thus, the improved performance of our models when the causal variants are both rare and common shows that the proposed models can be very useful in dissecting complex traits. We compare the performance of our methods with SKAT and SKAT-O on real neural tube defects and Hirschsprung's disease data sets. The Rao's efficient score test statistics and the global tests are more sensitive than SKAT and SKAT-O in the real data analysis. Our methods can be used in either gene-disease genome-wide/exome-wide association studies or candidate gene analyses. PMID:25203683
40 CFR 600.111-93 - Test procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Test procedures. 600.111-93 Section... Emission Regulations for 1978 and Later Model Year Automobiles-Test Procedures § 600.111-93 Test procedures. (a) The test procedures to be followed for generation of the city fuel economy data are those...
40 CFR 600.111-80 - Test procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Test procedures. 600.111-80 Section... Emission Regulations for 1978 and Later Model Year Automobiles-Test Procedures § 600.111-80 Test procedures. (a) The test procedures to be followed for generation of the city fuel economy data are those...
Testing 40 Predictions from the Transtheoretical Model Again, with Confidence
ERIC Educational Resources Information Center
Velicer, Wayne F.; Brick, Leslie Ann D.; Fava, Joseph L.; Prochaska, James O.
2013-01-01
Testing Theory-based Quantitative Predictions (TTQP) represents an alternative to traditional Null Hypothesis Significance Testing (NHST) procedures and is more appropriate for theory testing. The theory generates explicit effect size predictions and these effect size estimates, with related confidence intervals, are used to test the predictions.…
Acoustic Model Testing Chronology
NASA Technical Reports Server (NTRS)
Nesman, Tom
2017-01-01
Scale models have been used for decades to replicate liftoff environments and in particular acoustics for launch vehicles. It is assumed, and analyses support, that the key characteristics of noise generation, propagation, and measurement can be scaled. Over time significant insight was gained not just towards understanding the effects of thruster details, pad geometry, and sound mitigation but also to the physical processes involved. An overview of a selected set of scale model tests is compiled here to illustrate the variety of configurations that have been tested and the fundamental knowledge gained. The selected scale model tests are presented chronologically.
A reduced order, test verified component mode synthesis approach for system modeling applications
NASA Astrophysics Data System (ADS)
Butland, Adam; Avitabile, Peter
2010-05-01
Component mode synthesis (CMS) is a very common approach used for the generation of large system models. In general, these modeling techniques can be separated into two categories: those utilizing a combination of constraint modes and fixed interface normal modes and those based on a combination of free interface normal modes and residual flexibility terms. The major limitation of the methods utilizing constraint modes and fixed interface normal modes is the inability to easily obtain the required information from testing; the result of this limitation is that constraint mode-based techniques are primarily used with numerical models. An alternate approach is proposed which utilizes frequency and shape information acquired from modal testing to update reduced order finite element models using exact analytical model improvement techniques. The connection degrees of freedom are then rigidly constrained in the test verified, reduced order model to provide the boundary conditions necessary for constraint modes and fixed interface normal modes. The CMS approach is then used with this test verified, reduced order model to generate the system model for further analysis. A laboratory structure is used to show the application of the technique with both numerical and simulated experimental components to describe the system and validate the proposed approach. Actual test data is then used in the approach proposed. Due to typical measurement data contaminants that are always included in any test, the measured data is further processed to remove contaminants and is then used in the proposed approach. The final case using improved data with the reduced order, test verified components is shown to produce very acceptable results from the Craig-Bampton component mode synthesis approach. Use of the technique with its strengths and weaknesses are discussed.
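For readers unfamiliar with the Craig-Bampton step mentioned at the end of the abstract, the following is a minimal numpy sketch of a constraint-mode plus fixed-interface-normal-mode reduction. The craig_bampton function and the 4-DOF spring-mass chain are illustrative assumptions, not the paper's laboratory structure or test-verified model.

```python
# Hedged numpy sketch of a Craig-Bampton reduction on an arbitrary example.
import numpy as np
from scipy.linalg import eigh

def craig_bampton(K, M, boundary, n_modes):
    """Reduce (K, M) keeping boundary DOFs plus n_modes fixed-interface modes."""
    dofs = np.arange(K.shape[0])
    interior = np.setdiff1d(dofs, boundary)
    Kii = K[np.ix_(interior, interior)]
    Kib = K[np.ix_(interior, boundary)]
    Mii = M[np.ix_(interior, interior)]
    # Constraint modes: static response of interior DOFs to unit boundary motion.
    psi = -np.linalg.solve(Kii, Kib)
    # Fixed-interface normal modes of the interior partition.
    _, phi = eigh(Kii, Mii)
    phi = phi[:, :n_modes]
    # Assemble the transformation matrix (boundary DOFs first, then modal DOFs).
    nb = len(boundary)
    T = np.zeros((K.shape[0], nb + n_modes))
    T[boundary, :nb] = np.eye(nb)
    T[np.ix_(interior, np.arange(nb))] = psi
    T[np.ix_(interior, nb + np.arange(n_modes))] = phi
    return T.T @ K @ T, T.T @ M @ T

# Toy 4-DOF spring-mass chain with DOF 0 as the connection (boundary) DOF.
K = np.array([[ 2, -1,  0,  0],
              [-1,  2, -1,  0],
              [ 0, -1,  2, -1],
              [ 0,  0, -1,  1]], dtype=float)
M = np.eye(4)
Kr, Mr = craig_bampton(K, M, boundary=np.array([0]), n_modes=2)
print(Kr.shape, Mr.shape)   # reduced to 1 boundary DOF + 2 modal DOFs
```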
Erdoğdu, Utku; Tan, Mehmet; Alhajj, Reda; Polat, Faruk; Rokne, Jon; Demetrick, Douglas
2013-01-01
The availability of enough samples for effective analysis and knowledge discovery has been a challenge in the research community, especially in the area of gene expression data analysis. Thus, the approaches being developed for data analysis have mostly suffered from the lack of enough data to train and test the constructed models. We argue that the process of sample generation could be successfully automated by employing some sophisticated machine learning techniques. An automated sample generation framework could successfully complement the actual sample generation from real cases. This argument is validated in this paper by describing a framework that integrates multiple models (perspectives) for sample generation. We illustrate its applicability for producing new gene expression data samples, a highly demanding area that has not received attention. The three perspectives employed in the process are based on models that are not closely related. The independence eliminates the bias of having the produced approach covering only certain characteristics of the domain and leading to samples skewed towards one direction. The first model is based on the Probabilistic Boolean Network (PBN) representation of the gene regulatory network underlying the given gene expression data. The second model integrates Hierarchical Markov Model (HIMM) and the third model employs a genetic algorithm in the process. Each model learns as much as possible characteristics of the domain being analysed and tries to incorporate the learned characteristics in generating new samples. In other words, the models base their analysis on domain knowledge implicitly present in the data itself. The developed framework has been extensively tested by checking how the new samples complement the original samples. The produced results are very promising in showing the effectiveness, usefulness and applicability of the proposed multi-model framework.
ERIC Educational Resources Information Center
Roduta Roberts, Mary; Alves, Cecilia B.; Chu, Man-Wai; Thompson, Margaret; Bahry, Louise M.; Gotzmann, Andrea
2014-01-01
The purpose of this study was to evaluate the adequacy of three cognitive models, one developed by content experts and two generated from student verbal reports for explaining examinee performance on a grade 3 diagnostic mathematics test. For this study, the items were developed to directly measure the attributes in the cognitive model. The…
AIAA Aerospace America Magazine - Year in Review Article, 2010
NASA Technical Reports Server (NTRS)
Figueroa, Fernando
2010-01-01
NASA Stennis Space Center has implemented a pilot operational Integrated System Health Management (ISHM) capability. The implementation was done for the E-2 Rocket Engine Test Stand and a Chemical Steam Generator (CSG) test article, and validated during operational testing. The CSG test program is a risk mitigation activity to support building of the new A-3 Test Stand, which will be a highly complex facility for testing of engines in high altitude conditions. The foundation of the ISHM capability is a set of knowledge-based integrated domain models for the test stand and CSG, with physical and model-based elements represented by objects; the domain models enable modular and evolutionary ISHM functionality.
Mind the Noise When Identifying Computational Models of Cognition from Brain Activity.
Kolossa, Antonio; Kopp, Bruno
2016-01-01
The aim of this study was to analyze how measurement error affects the validity of modeling studies in computational neuroscience. A synthetic validity test was created using simulated P300 event-related potentials as an example. The model space comprised four computational models of single-trial P300 amplitude fluctuations which differed in terms of complexity and dependency. The single-trial fluctuation of simulated P300 amplitudes was computed on the basis of one of the models, at various levels of measurement error and at various numbers of data points. Bayesian model selection was performed based on exceedance probabilities. At very low numbers of data points, the least complex model generally outperformed the data-generating model. Invalid model identification also occurred at low levels of data quality and under low numbers of data points if the winning model's predictors were closely correlated with the predictors from the data-generating model. Given sufficient data quality and numbers of data points, the data-generating model could be correctly identified, even against models which were very similar to the data-generating model. Thus, a number of variables affects the validity of computational modeling studies, and data quality and numbers of data points are among the main factors relevant to the issue. Further, the nature of the model space (i.e., model complexity, model dependency) should not be neglected. This study provided quantitative results which show the importance of ensuring the validity of computational modeling via adequately prepared studies. The accomplishment of synthetic validity tests is recommended for future applications. Beyond that, we propose to render the demonstration of sufficient validity via adequate simulations mandatory to computational modeling studies.
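The abstract's central point, that sparse or noisy data can lead model comparison to favour a simpler model over the data-generating one, can be illustrated with a toy experiment. The sketch below uses BIC on simulated linear-regression data as a stand-in; it is not the paper's exceedance-probability analysis, and the fit_bic helper and noise settings are assumptions.

```python
# Hedged sketch: with few noisy data points, approximate model comparison
# (here via BIC) can favour a simpler model over the data-generating one.
import numpy as np

def fit_bic(X, y):
    """Ordinary least squares fit and BIC under a Gaussian noise model
    (k counts regression coefficients only, a simplification)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n, k = X.shape
    sigma2 = np.mean(resid ** 2)
    log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return k * np.log(n) - 2 * log_lik

rng = np.random.default_rng(1)
for n in (10, 500):
    x1, x2 = rng.normal(size=n), rng.normal(size=n)
    y = 1.0 * x1 + 0.4 * x2 + rng.normal(scale=2.0, size=n)  # true model uses x1 and x2
    bic_simple = fit_bic(np.column_stack([np.ones(n), x1]), y)
    bic_true = fit_bic(np.column_stack([np.ones(n), x1, x2]), y)
    print(n, "winner:", "simple" if bic_simple < bic_true else "data-generating")
```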
Multi-agent simulation of generation expansion in electricity markets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Botterud, A; Mahalik, M. R.; Veselka, T. D.
2007-06-01
We present a new multi-agent model of generation expansion in electricity markets. The model simulates generation investment decisions of decentralized generating companies (GenCos) interacting in a complex, multidimensional environment. A probabilistic dispatch algorithm calculates prices and profits for new candidate units in different future states of the system. Uncertainties in future load, hydropower conditions, and competitors' actions are represented in a scenario tree, and decision analysis is used to identify the optimal expansion decision for each individual GenCo. We test the model using real data for the Korea power system under different assumptions about market design, market concentration, and GenCos' assumed expectations about their competitors' investment decisions.
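The decision-analysis step described above can be illustrated as expected-profit maximization over a small scenario tree. The candidate units, scenario probabilities and profit figures in the sketch are invented placeholders, not outputs of the authors' model.

```python
# Hedged sketch: a single GenCo's expansion decision as expected-profit
# maximization over a tiny scenario tree (all figures invented).
import numpy as np

# Simulated annual profit (arbitrary units) of candidate units in three future
# scenarios: high load, normal load/hydro, dry-year/low hydro.
profits = {
    "gas_peaker":    np.array([120.0, 40.0, 60.0]),
    "coal_base":     np.array([150.0, 70.0, -20.0]),
    "build_nothing": np.array([0.0, 0.0, 0.0]),
}
scenario_prob = np.array([0.3, 0.5, 0.2])

expected = {unit: float(profit @ scenario_prob) for unit, profit in profits.items()}
best = max(expected, key=expected.get)
print(expected, "->", best)
```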
Implementing a Nuclear Power Plant Model for Evaluating Load-Following Capability on a Small Grid
NASA Astrophysics Data System (ADS)
Arda, Samet Egemen
A pressurized water reactor (PWR) nuclear power plant (NPP) model is introduced into Positive Sequence Load Flow (PSLF) software by General Electric in order to evaluate the load-following capability of NPPs. The nuclear steam supply system (NSSS) consists of a reactor core, hot and cold legs, plenums, and a U-tube steam generator. The physical systems listed above are represented by mathematical models utilizing a state variable lumped parameter approach. A steady-state control program for the reactor, and simple turbine and governor models are also developed. The adequacy of the isolated reactor core, the isolated steam generator, and the complete PWR models is tested in Matlab/Simulink, and dynamic responses are compared with the test results obtained from the H. B. Robinson NPP. Test results illustrate that the developed models represent the dynamic features of real physical systems and are capable of predicting responses due to small perturbations of external reactivity and steam valve opening. Subsequently, the NSSS representation is incorporated into PSLF and coupled with built-in excitation system and generator models. Different simulation cases are run when sudden loss of generation occurs in a small power system which includes hydroelectric and natural gas power plants besides the developed PWR NPP. The conclusion is that the NPP can respond to a disturbance in the power system without exceeding any design and safety limits if appropriate operational conditions, such as achieving the NPP turbine control by adjusting the speed of the steam valve, are met. In other words, the NPP can participate in the control of system frequency and improve the overall power system performance.
NASA Technical Reports Server (NTRS)
Wells, Jason E.; Black, David L.; Taylor, Casey L.
2013-01-01
Exhaust plumes from large solid rocket motors fired at ATK's Promontory test site carry particulates to high altitudes and typically produce deposits that fall on regions downwind of the test area. As populations and communities near the test facility grow, ATK has become increasingly concerned about the impact of motor testing on those surrounding communities. To assess the potential impact of motor testing on the community and to identify feasible mitigation strategies, it is essential to have a tool capable of predicting plume behavior downrange of the test stand. A software package, called PlumeTracker, has been developed and validated at ATK for this purpose. The code is a point model that offers a time-dependent, physics-based description of plume transport and precipitation. The code can utilize either measured or forecasted weather data to generate plume predictions. Next-Generation Radar (NEXRAD) data and field observations from twenty-three historical motor test fires at Promontory were collected to test the predictive capability of PlumeTracker. Model predictions for plume trajectories and deposition fields were found to correlate well with the collected dataset.
Enhanced modeling and simulation of EO/IR sensor systems
NASA Astrophysics Data System (ADS)
Hixson, Jonathan G.; Miller, Brian; May, Christopher
2015-05-01
The testing and evaluation process developed by the Night Vision and Electronic Sensors Directorate (NVESD) Modeling and Simulation Division (MSD) provides end to end systems evaluation, testing, and training of EO/IR sensors. By combining NV-LabCap, the Night Vision Integrated Performance Model (NV-IPM), One Semi-Automated Forces (OneSAF) input sensor file generation, and the Night Vision Image Generator (NVIG) capabilities, NVESD provides confidence to the M&S community that EO/IR sensor developmental and operational testing and evaluation are accurately represented throughout the lifecycle of an EO/IR system. This new process allows for both theoretical and actual sensor testing. A sensor can be theoretically designed in NV-IPM, modeled in NV-IPM, and then seamlessly input into the wargames for operational analysis. After theoretical design, prototype sensors can be measured by using NV-LabCap, then modeled in NV-IPM and input into wargames for further evaluation. The measurement process to high fidelity modeling and simulation can then be repeated again and again throughout the entire life cycle of an EO/IR sensor as needed, to include LRIP, full rate production, and even after Depot Level Maintenance. This is a prototypical example of how an engineering level model and higher level simulations can share models to mutual benefit.
Comparative Analysis of InSAR Digital Surface Models for Test Area Bucharest
NASA Astrophysics Data System (ADS)
Dana, Iulia; Poncos, Valentin; Teleaga, Delia
2010-03-01
This paper presents the results of the interferometric processing of ERS Tandem, ENVISAT and TerraSAR- X for digital surface model (DSM) generation. The selected test site is Bucharest (Romania), a built-up area characterized by the usual urban complex pattern: mixture of buildings with different height levels, paved roads, vegetation, and water bodies. First, the DSMs were generated following the standard interferometric processing chain. Then, the accuracy of the DSMs was analyzed against the SPOT HRS model (30 m resolution at the equator). A DSM derived by optical stereoscopic processing of SPOT 5 HRG data and also the SRTM (3 arc seconds resolution at the equator) DSM have been included in the comparative analysis.
Automatic Traffic-Based Internet Control Message Protocol (ICMP) Model Generation for ns-3
2015-12-01
through visiting the inferred automata; fuzzing of an implementation by generating altered message formats. We tested with 3 versions of Netzob. First... relationships. Afterwards, we used the Automata module to generate state machines using different functions: "generateChainedStateAutomata"... The "generatePTAAutomata" function takes as input several communication sessions and then identifies common paths and merges these into a single automaton. The
Automatic Testcase Generation for Flight Software
NASA Technical Reports Server (NTRS)
Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.
2008-01-01
The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) A blackbox approach that views the system as a blackbox, and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specifications (up to prespecified limits); these inputs are then used to exercise the system. 2) A whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both these approaches are model checking and symbolic execution, as implemented in the Ames' Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing which check only some of the system executions and therefore may miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammars. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive. ICS's in-house coverage tools will be run to measure code coverage. Because the scripts exercise all parts of the grammar, we expect them to provide high code coverage. This blackbox approach is suitable for systems for which we do not have access to the source code. We are applying whitebox test generation to the Spacecraft Health INference Engine (SHINE) that is part of the ISHM system. In TacSat3, SHINE will execute an on-board knowledge base for fault detection and diagnosis. SHINE converts its knowledge base into optimized C code which runs onboard TacSat3. SHINE can translate its rules into an intermediate representation (Java) suitable for analysis with JPF. JPF will analyze SHINE's Java output using symbolic execution, producing testcases that can provide either complete or directed coverage of the code. Automatically generated test suites can provide full code coverage and be quickly regenerated when code changes. Because our tools analyze executable code, they fully cover the delivered code, not just models of the code. This approach also provides a way to generate tests that exercise specific sections of code under specific preconditions. This capability gives us more focused testing of specific sections of code.
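The blackbox approach described above hinges on enumerating every input a grammar can produce up to a bound. The sketch below does this for a made-up miniature command grammar; the GRAMMAR table and the expand function are illustrative assumptions and bear no relation to the actual SCL grammar or the JPF tooling.

```python
# Hedged sketch: enumerate all strings derivable from a tiny grammar up to a
# depth bound, mimicking grammar-based blackbox test input generation.
from itertools import product

GRAMMAR = {
    "<script>": [["<cmd>"], ["<cmd>", ";", "<cmd>"]],
    "<cmd>": [["SET", "<name>", "<value>"], ["GET", "<name>"]],
    "<name>": [["valveA"], ["heater1"]],
    "<value>": [["ON"], ["OFF"]],
}

def expand(symbol, depth):
    """All terminal strings derivable from `symbol` within `depth` expansions."""
    if symbol not in GRAMMAR:
        return [symbol]
    if depth == 0:
        return []
    results = []
    for production in GRAMMAR[symbol]:
        child_options = [expand(s, depth - 1) for s in production]
        if all(child_options):
            for combo in product(*child_options):
                results.append(" ".join(combo))
    return results

scripts = expand("<script>", depth=4)
print(len(scripts))          # every legal script up to the bound
print(scripts[:3])
```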
NASA Technical Reports Server (NTRS)
Ensey, Tyler S.
2013-01-01
During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before they are implemented into hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). The model simulates the physical behavior (function, state, limits and I/O) of each end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated through MATLAB's Simulink program. The intensive model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library component; implement library component references; implement subsystem components; develop a test script; run the test script; develop a user's guide; send the model out for peer review; the model is sent out for verification/validation; if there is empirical data, a validation data package is generated; if there is no empirical data, a verification package is generated; the test results are then reviewed; and finally, the user requests accreditation, and a statement of accreditation is prepared. Once each component model is reviewed and approved, they are intertwined together into one integrated model. This integrated model is then tested itself, through a test script and autotest, so that it can be concluded that all models work conjointly for a single purpose. The component I was assigned, specifically, was a fluid component, a discrete pressure switch. The switch takes a fluid pressure input, and if the pressure is greater than a designated cutoff pressure, the switch would stop fluid flow.
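The discrete pressure switch described at the end of the abstract has simple behaviour that can be sketched outside Simulink. The pressure_switch function and the 150 psi cutoff below are illustrative assumptions only, not the actual component model.

```python
# Hedged sketch of the described behaviour: if inlet pressure exceeds a cutoff,
# the switch trips and flow is stopped. Parameter names/values are illustrative.
def pressure_switch(inlet_pressure_psi, cutoff_psi=150.0):
    """Return (switch_tripped, flow_enabled) for a given inlet pressure."""
    tripped = inlet_pressure_psi > cutoff_psi
    return tripped, not tripped

for p in (100.0, 149.9, 150.1, 200.0):
    tripped, flow = pressure_switch(p)
    print(f"{p:6.1f} psi -> tripped={tripped}, flow={flow}")
```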
SRG110 Stirling Generator Dynamic Simulator Vibration Test Results and Analysis Correlation
NASA Technical Reports Server (NTRS)
Suarez, Vicente J.; Lewandowski, Edward J.; Callahan, John
2006-01-01
The U.S. Department of Energy (DOE), Lockheed Martin (LM), and NASA Glenn Research Center (GRC) have been developing the Stirling Radioisotope Generator (SRG110) for use as a power system for space science missions. The launch environment enveloping potential missions results in a random input spectrum that is significantly higher than historical RPS launch levels and is a challenge for designers. Analysis presented in prior work predicted that tailoring the compliance at the generator-spacecraft interface reduced the dynamic response of the system thereby allowing higher launch load input levels and expanding the range of potential generator missions. To confirm analytical predictions, a dynamic simulator representing the generator structure, Stirling convertors and heat sources was designed and built for testing with and without a compliant interface. Finite element analysis was performed to guide the generator simulator and compliant interface design so that test modes and frequencies were representative of the SRG110 generator. This paper presents the dynamic simulator design, the test setup and methodology, test article modes and frequencies and dynamic responses, and post-test analysis results. With the compliant interface, component responses to an input environment exceeding the SRG110 qualification level spectrum were all within design allowables. Post-test analysis included finite element model tuning to match test frequencies and random response analysis using the test input spectrum. Analytical results were in good overall agreement with the test results and confirmed previous predictions that the SRG110 power system may be considered for a broad range of potential missions, including those with demanding launch environments.
SRG110 Stirling Generator Dynamic Simulator Vibration Test Results and Analysis Correlation
NASA Technical Reports Server (NTRS)
Lewandowski, Edward J.; Suarez, Vicente J.; Goodnight, Thomas W.; Callahan, John
2007-01-01
The U.S. Department of Energy (DOE), Lockheed Martin (LM), and NASA Glenn Research Center (GRC) have been developing the Stirling Radioisotope Generator (SRG110) for use as a power system for space science missions. The launch environment enveloping potential missions results in a random input spectrum that is significantly higher than historical radioisotope power system (RPS) launch levels and is a challenge for designers. Analysis presented in prior work predicted that tailoring the compliance at the generator-spacecraft interface reduced the dynamic response of the system thereby allowing higher launch load input levels and expanding the range of potential generator missions. To confirm analytical predictions, a dynamic simulator representing the generator structure, Stirling convertors and heat sources were designed and built for testing with and without a compliant interface. Finite element analysis was performed to guide the generator simulator and compliant interface design so that test modes and frequencies were representative of the SRG110 generator. This paper presents the dynamic simulator design, the test setup and methodology, test article modes and frequencies and dynamic responses, and post-test analysis results. With the compliant interface, component responses to an input environment exceeding the SRG110 qualification level spectrum were all within design allowables. Post-test analysis included finite element model tuning to match test frequencies and random response analysis using the test input spectrum. Analytical results were in good overall agreement with the test results and confirmed previous predictions that the SRG110 power system may be considered for a broad range of potential missions, including those with demanding launch environments.
Conduction cooled compact laser for the supercam Libsraman instrument
NASA Astrophysics Data System (ADS)
Durand, Eric; Derycke, C.; Boudjemaa, L.; Simon-Boisson, C.; Roucayrol, L.; Perez, R.; Faure, B.; Maurice, S.
2017-09-01
A new conduction cooled compact laser for SuperCam LIBS-RAMAN instrument aboard Mars 2020 Rover is presented. An oscillator generates 30mJ at 1µm with a good spatial quality. A Second Harmonic Generator (SHG) at the oscillator output generates 15 mJ at 532 nm. A RTP electro-optical switch, between the oscillator and SHG, allows the operation mode selection (LIBS or RAMAN). Qualification model of this laser has been built and characterised. Environmental testing of this model is also reported.
Armanini, D G; Monk, W A; Carter, L; Cote, D; Baird, D J
2013-08-01
Evaluation of the ecological status of river sites in Canada is supported by building models using the reference condition approach. However, geography, data scarcity and inter-operability constraints have frustrated attempts to monitor national-scale status and trends. This issue is particularly true in Atlantic Canada, where no ecological assessment system is currently available. Here, we present a reference condition model based on the River Invertebrate Prediction and Classification System approach with regional-scale applicability. To achieve this, we used biological monitoring data collected from wadeable streams across Atlantic Canada together with freely available, nationally consistent geographic information system (GIS) environmental data layers. For the first time, we demonstrated that it is possible to use data generated from different studies, even when collected using different sampling methods, to generate a robust predictive model. This model was successfully generated and tested using GIS-based rather than local habitat variables and showed improved performance when compared to a null model. In addition, ecological quality ratio data derived from the model responded to observed stressors in a test dataset. Implications for future large-scale implementation of river biomonitoring using a standardised approach with global application are presented.
ERIC Educational Resources Information Center
ALTMANN, BERTHOLD
After a brief summary of the test program (described more fully in LI 000 318), the statistical results tabulated as overall "ABC (Approach by Concept)-relevance ratios" and "ABC-recall figures" are presented and reviewed. An abstract model developed in accordance with Max Weber's "Idealtypus" ("Die Objektivitaet…
1977-01-12
This archival photo shows the Voyager Proof Test Model undergoing a mechanical preparation and weight center of gravity test at NASA's Jet Propulsion Laboratory, Pasadena, California, on January 12, 1977. The stack of three white cylinders seen near center is a stand-in for the spacecraft's power generators (called RTGs). Above that, a silvery canister holds the spacecraft's magnetometer in its stowed configuration. https://photojournal.jpl.nasa.gov/catalog/PIA21477
Creep-Fatigue Damage Investigation and Modeling of Alloy 617 at High Temperatures
NASA Astrophysics Data System (ADS)
Tahir, Fraaz
The Very High Temperature Reactor (VHTR) is one of six conceptual designs proposed for Generation IV nuclear reactors. Alloy 617, a solid solution strengthened Ni-base superalloy, is currently the primary candidate material for the tubing of the Intermediate Heat Exchanger (IHX) in the VHTR design. Steady-state operation of the nuclear power plant at elevated temperatures leads to creep deformation, whereas loading transients including startup and shutdown generate fatigue. A detailed understanding of the creep-fatigue interaction in Alloy 617 is necessary before it can be considered as a material for nuclear construction in ASME Boiler and Pressure Vessel Code. Current design codes for components undergoing creep-fatigue interaction at elevated temperatures require creep-fatigue testing data covering the entire range from fatigue-dominant to creep-dominant loading. Classical strain-controlled tests, which produce stress relaxation during the hold period, show a saturation in cycle life with increasing hold periods due to the rapid stress-relaxation of Alloy 617 at high temperatures. Therefore, applying longer hold time in these tests cannot generate creep-dominated failure. In this study, uniaxial isothermal creep-fatigue tests with non-traditional loading waveforms were designed and performed at 850 and 950°C, with an objective of generating test data in the creep-dominant regime. The new loading waveforms are hybrid strain-controlled and force-controlled testing which avoid stress relaxation during the creep hold. The experimental data showed varying proportions of creep and fatigue damage, and provided evidence for the inadequacy of the widely-used time fraction rule for estimating creep damage under creep-fatigue conditions. Micro-scale damage features in failed test specimens, such as fatigue cracks and creep voids, were quantified using a Scanning Electron Microscope (SEM) to find a correlation between creep and fatigue damage. Quantitative statistical imaging analysis showed that the microstructural damage features (cracks and voids) are correlated with a new mechanical driving force parameter. The results from this image-based damage analysis were used to develop a phenomenological life-prediction methodology called the effective time fraction approach. Finally, the constitutive creep-fatigue response of the material at 950°C was modeled using a unified viscoplastic model coupled with a damage accumulation model. The simulation results were used to validate an energy-based constitutive life-prediction model, as a mechanistic model for potential component and structure level creep-fatigue analysis.
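The time fraction rule the abstract critiques is the classical linear creep-fatigue damage summation: fatigue damage accumulated as cycle fractions plus creep damage accumulated as time fractions. A minimal worked sketch is given below, with invented cycle counts, cycles-to-failure and rupture times; the creep_fatigue_damage helper is not from the study.

```python
# Hedged sketch of the classical linear (time-fraction) creep-fatigue summation.
def creep_fatigue_damage(cycle_blocks, hold_blocks):
    """cycle_blocks: (applied cycles, cycles-to-failure) pairs;
    hold_blocks: (hold time, creep rupture time) pairs."""
    fatigue = sum(n / n_f for n, n_f in cycle_blocks)
    creep = sum(t / t_r for t, t_r in hold_blocks)
    return fatigue, creep, fatigue + creep

# Example: 500 cycles with N_f = 2000, plus 300 h of hold where t_r = 1000 h.
print(creep_fatigue_damage([(500, 2000)], [(300.0, 1000.0)]))  # (0.25, 0.3, 0.55)
```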
Carbon deposition model for oxygen-hydrocarbon combustion
NASA Technical Reports Server (NTRS)
Bossard, John A.
1988-01-01
The objectives are to use existing hardware to verify and extend the database generated on the original test programs. The data to be obtained are the carbon deposition characteristics when methane is used at injection densities comparable to full scale values. The database will be extended to include liquid natural gas (LNG) testing at low injection densities for gas generator/preburner conditions. The testing will be performed at mixture ratios between 0.25 and 0.60, and at chamber pressures between 750 and 1500 psi.
Test and evaluation of the Navy half-watt RTG. [Radioisotope Thermoelectric Generator
NASA Technical Reports Server (NTRS)
Rosell, F. E., Jr.; Lane, S. D.; Eggers, P. E.; Gawthrop, W. E.; Rouklove, P. G.; Truscello, V. C.
1976-01-01
The radioisotope thermoelectric generator (RTG) considered is to provide a continuous minimum power output of 0.5 watt at 6.0 to 8.5 volts for a minimum period of 15 years. The mechanical-electrical evaluation phase discussed involved the conduction of shock and vibration tests. The thermochemical-physical evaluation phase consisted of an analysis of the materials and the development of a thermal model. The thermoelectric evaluation phase included the accelerated testing of the thermoelectric modules.
NASA Technical Reports Server (NTRS)
Loyselle, Patricia; Prokopius, Kevin
2011-01-01
Proton exchange membrane (PEM) fuel cell technology is the leading candidate to replace the aging alkaline fuel cell technology, currently used on the Shuttle, for future space missions. This test effort marks the final phase of a 5-yr development program that began under the Second Generation Reusable Launch Vehicle (RLV) Program, transitioned into the Next Generation Launch Technologies (NGLT) Program, and continued under Constellation Systems in the Exploration Technology Development Program. Initially, the engineering model (EM) powerplant was evaluated with respect to its performance as compared to acceptance tests carried out at the manufacturer. This was to determine the sensitivity of the powerplant performance to changes in test environment. In addition, a series of tests were performed with the powerplant in the original standard orientation. This report details the continuing EM benchmark test results in three spatial orientations as well as extended duration testing in the mission profile test. The results from these tests verify the applicability of PEM fuel cells for future NASA missions. The specifics of these different tests are described in the following sections.
Nemesis Autonomous Test System
NASA Technical Reports Server (NTRS)
Barltrop, Kevin J.; Lee, Cin-Young; Horvath, Gregory A,; Clement, Bradley J.
2012-01-01
A generalized framework has been developed for systems validation that can be applied to both traditional and autonomous systems. The framework consists of an automated test case generation and execution system called Nemesis that rapidly and thoroughly identifies flaws or vulnerabilities within a system. By applying genetic optimization and goal-seeking algorithms on the test equipment side, a "war game" is conducted between a system and its complementary nemesis. The end result of the war games is a collection of scenarios that reveals any undesirable behaviors of the system under test. The software provides a reusable framework to evolve test scenarios using genetic algorithms using an operation model of the system under test. It can automatically generate and execute test cases that reveal flaws in behaviorally complex systems. Genetic algorithms focus the exploration of tests on the set of test cases that most effectively reveals the flaws and vulnerabilities of the system under test. It leverages advances in state- and model-based engineering, which are essential in defining the behavior of autonomous systems. It also uses goal networks to describe test scenarios.
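A genetic-algorithm search of the kind described above can be sketched compactly. The toy system_under_test, its hidden flaw, and the GA settings below are assumptions for illustration; they are not Nemesis or its operation models.

```python
# Hedged sketch of the "war game" idea: a genetic algorithm evolves test
# vectors toward inputs that expose a flaw in a toy system under test.
import random

def system_under_test(x):
    """Toy system with a hidden flaw near x = 0.72 (returns an anomaly score)."""
    return max(0.0, 1.0 - abs(x - 0.72) * 10)

def evolve(pop_size=30, generations=40, mutation=0.05, seed=3):
    rng = random.Random(seed)
    population = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=system_under_test, reverse=True)
        parents = scored[: pop_size // 2]                       # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2                                  # crossover
            child += rng.gauss(0.0, mutation)                    # mutation
            children.append(min(1.0, max(0.0, child)))
        population = parents + children
    return max(population, key=system_under_test)

best = evolve()
print(best, system_under_test(best))   # converges toward the flawed region
```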
NASA Astrophysics Data System (ADS)
Lee, S.; Petrykin, V.; Molodyk, A.; Samoilenkov, S.; Kaul, A.; Vavilov, A.; Vysotsky, V.; Fetisov, S.
2014-04-01
The SuperOx and SuperOx Japan LLC companies were founded with the goal of developing a cost-effective technology for second generation HTS (2G HTS) tapes by utilizing a combination of the most advanced chemical and physical deposition techniques, together with implementing original tape architectures. In this paper we present a brief overview of our production and experimental facilities and recent results of 2G HTS tape fabrication, and describe the first tests of the tapes in model cables for AC and DC power application.
Kumar, Atul; Samadder, S R
2017-10-01
Accurate prediction of the quantity of household solid waste generation is essential for effective management of municipal solid waste (MSW). In actual practice, modelling methods are often found useful for precise prediction of MSW generation rate. In this study, two models have been proposed that established the relationships between the household solid waste generation rate and the socioeconomic parameters, such as household size, total family income, education, occupation and fuel used in the kitchen. Multiple linear regression technique was applied to develop the two models, one for the prediction of biodegradable MSW generation rate and the other for non-biodegradable MSW generation rate for individual households of the city Dhanbad, India. The results of the two models showed that the coefficients of determination (R2) were 0.782 for biodegradable waste generation rate and 0.676 for non-biodegradable waste generation rate using the selected independent variables. The accuracy tests of the developed models showed convincing results, as the predicted values were very close to the observed values. Validation of the developed models with a new set of data indicated a good fit for actual prediction purpose with predicted R2 values of 0.76 and 0.64 for biodegradable and non-biodegradable MSW generation rate respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.
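A minimal multiple-linear-regression sketch in the spirit of the models above follows. The household predictors, waste-generation values, and resulting coefficients are fabricated placeholders, not the Dhanbad survey data or the published models.

```python
# Hedged sketch: fit a household waste-generation rate by ordinary least squares.
import numpy as np

# Predictors per household: [household size, monthly income (x1000 INR), years of education]
X = np.array([[3, 20, 10],
              [5, 35, 12],
              [2, 15,  8],
              [6, 50, 16],
              [4, 25, 12],
              [7, 60, 14]], dtype=float)
y = np.array([0.9, 1.6, 0.6, 2.1, 1.2, 2.4])   # kg/household/day (invented)

X_design = np.column_stack([np.ones(len(y)), X])          # add intercept
beta, *_ = np.linalg.lstsq(X_design, y, rcond=None)
y_hat = X_design @ beta
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(beta, 3))
print("R^2:", round(1 - ss_res / ss_tot, 3))
```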
Test aspects of the JPL Viterbi decoder
NASA Technical Reports Server (NTRS)
Breuer, M. A.
1989-01-01
The generation of test vectors and design-for-test aspects of the Jet Propulsion Laboratory (JPL) Very Large Scale Integration (VLSI) Viterbi decoder chip is discussed. Each processor integrated circuit (IC) contains over 20,000 gates. To achieve a high degree of testability, a scan architecture is employed. The logic has been partitioned so that very few test vectors are required to test the entire chip. In addition, since several blocks of logic are replicated numerous times on this chip, test vectors need only be generated for each block, rather than for the entire circuit. These unique blocks of logic have been identified and test sets generated for them. The approach employed for testing was to use pseudo-exhaustive test vectors whenever feasible. That is, each cone of logic is tested exhaustively. Using this approach, no detailed logic design or fault model is required. All faults which modify the function of a block of combinational logic are detected, such as all irredundant single and multiple stuck-at faults.
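Pseudo-exhaustive testing, as used above, exercises each cone of logic over all combinations of its own inputs. The sketch below does this for a made-up 3-input cone; the cone_reference and cone_under_test functions are illustrative assumptions, not the JPL decoder logic.

```python
# Hedged sketch: exhaustively exercise one logic cone and compare it against a
# reference (golden) function.
from itertools import product

def cone_reference(a, b, c):
    """Golden model of one logic cone."""
    return (a and b) or (not c)

def cone_under_test(a, b, c):
    """Implementation being verified (could come from a netlist simulator)."""
    return (a and b) or (not c)

def pseudo_exhaustive(cone_inputs=3):
    failures = []
    for bits in product([False, True], repeat=cone_inputs):
        if cone_under_test(*bits) != cone_reference(*bits):
            failures.append(bits)
    return failures

print(pseudo_exhaustive())   # empty list -> the cone matches its reference
```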
Modular, Semantics-Based Composition of Biosimulation Models
ERIC Educational Resources Information Center
Neal, Maxwell Lewis
2010-01-01
Biosimulation models are valuable, versatile tools used for hypothesis generation and testing, codification of biological theory, education, and patient-specific modeling. Driven by recent advances in computational power and the accumulation of systems-level experimental data, modelers today are creating models with an unprecedented level of…
DSN system performance test Doppler noise models; noncoherent configuration
NASA Technical Reports Server (NTRS)
Bunce, R.
1977-01-01
The newer model for variance, the Allan technique, now adopted for testing, is analyzed in the subject mode. A model is generated (including considerable contribution from the station secondary frequency standard), and rationalized with existing data. The variance model is definitely sound; the Allan technique mates theory and measure. The mean-frequency model is an estimate; this problem is yet to be rigorously resolved. The unaltered defining expressions are nonconvergent, and the observed mean is quite erratic.
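The Allan technique named above has a standard defining expression. A minimal sketch of the non-overlapping Allan variance is given below, applied to synthetic white frequency noise; the data and scale are assumptions, not DSN measurements or the station noise model.

```python
# Hedged sketch of the non-overlapping Allan variance of fractional frequency data.
import numpy as np

def allan_variance(y):
    """Allan variance of consecutive fractional-frequency averages y_1..y_M."""
    y = np.asarray(y, dtype=float)
    diffs = np.diff(y)
    return 0.5 * np.mean(diffs ** 2)   # (1 / (2(M-1))) * sum of squared differences

rng = np.random.default_rng(7)
white_fm = rng.normal(scale=1e-12, size=10_000)     # white frequency noise
print(allan_variance(white_fm))                     # ~ scale**2 = 1e-24
```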
Eglin virtual range database for hardware-in-the-loop testing
NASA Astrophysics Data System (ADS)
Talele, Sunjay E.; Pickard, J. W., Jr.; Owens, Monte A.; Foster, Joseph; Watson, John S.; Amick, Mary Amenda; Anthony, Kenneth
1998-07-01
Realistic backgrounds are necessary to support high-fidelity hardware-in-the-loop testing. Advanced avionics and weapon system sensors are driving the requirement for higher resolution imagery. The model-test-model philosophy being promoted by the T&E community is resulting in the need for backgrounds that are realistic or virtual representations of actual test areas. Combined, these requirements led to a major upgrade of the terrain database used for hardware-in-the-loop testing at the Guided Weapons Evaluation Facility (GWEF) at Eglin Air Force Base, Florida. This paper describes the process used to generate the high-resolution (1-foot) database of ten sites totaling over 20 square kilometers of the Eglin range. This process involved generating digital elevation maps from stereo aerial imagery and classifying ground cover material using the spectral content. These databases were then optimized for real-time operation at 90 Hz.
Improved techniques for thermomechanical testing in support of deformation modeling
NASA Technical Reports Server (NTRS)
Castelli, Michael G.; Ellis, John R.
1992-01-01
The feasibility of generating precise thermomechanical deformation data to support constitutive model development was investigated. Here, the requirement is for experimental data that is free from anomalies caused by less than ideal equipment and procedures. A series of exploratory tests conducted on Hastelloy X showed that generally accepted techniques for strain controlled tests were lacking in at least three areas. Specifically, problems were encountered with specimen stability, thermal strain compensation, and temperature/mechanical strain phasing. The source of these difficulties was identified and improved thermomechanical testing techniques to correct them were developed. These goals were achieved by developing improved procedures for measuring and controlling thermal gradients and by designing a specimen specifically for thermomechanical testing. In addition, innovative control strategies were developed to correctly proportion and phase the thermal and mechanical components of strain. Subsequently, the improved techniques were used to generate deformation data for Hastelloy X over the temperature range, 200 to 1000 C.
A comparative study on different methods of automatic mesh generation of human femurs.
Viceconti, M; Bellingeri, L; Cristofolini, L; Toni, A
1998-01-01
The aim of this study was to evaluate comparatively five methods for automating mesh generation (AMG) when used to mesh a human femur. The five AMG methods considered were: mapped mesh, which provides hexahedral elements through a direct mapping of the element onto the geometry; tetra mesh, which generates tetrahedral elements from a solid model of the object geometry; voxel mesh which builds cubic 8-node elements directly from CT images; and hexa mesh that automatically generated hexahedral elements from a surface definition of the femur geometry. The various methods were tested against two reference models: a simplified geometric model and a proximal femur model. The first model was useful to assess the inherent accuracy of the meshes created by the AMG methods, since an analytical solution was available for the elastic problem of the simplified geometric model. The femur model was used to test the AMG methods in a more realistic condition. The femoral geometry was derived from a reference model (the "standardized femur") and the finite element analyses predictions were compared to experimental measurements. All methods were evaluated in terms of human and computer effort needed to carry out the complete analysis, and in terms of accuracy. The comparison demonstrated that each tested method deserves attention and may be the best for specific situations. The mapped AMG method requires a significant human effort but is very accurate and it allows a tight control of the mesh structure. The tetra AMG method requires a solid model of the object to be analysed but is widely available and accurate. The hexa AMG method requires a significant computer effort but can also be used on polygonal models and is very accurate. The voxel AMG method requires a huge number of elements to reach an accuracy comparable to that of the other methods, but it does not require any pre-processing of the CT dataset to extract the geometry and in some cases may be the only viable solution.
Aeroservoelastic Modeling of Body Freedom Flutter for Control System Design
NASA Technical Reports Server (NTRS)
Ouellette, Jeffrey
2017-01-01
This method is being used by NASA in ongoing collaborations with groups interested in the X-56A flight test program. Model generation for body freedom flutter addresses issues in state consistency, low-frequency dynamics, and unsteady aerodynamics. The approach was applied to the X-56A MUTT and compared to flight test data.
A global reference for caesarean section rates (C-Model): a multicountry cross-sectional study.
Souza, J P; Betran, A P; Dumont, A; de Mucio, B; Gibbs Pickens, C M; Deneux-Tharaux, C; Ortiz-Panozo, E; Sullivan, E; Ota, E; Togoobaatar, G; Carroli, G; Knight, H; Zhang, J; Cecatti, J G; Vogel, J P; Jayaratne, K; Leal, M C; Gissler, M; Morisaki, N; Lack, N; Oladapo, O T; Tunçalp, Ö; Lumbiganon, P; Mori, R; Quintana, S; Costa Passos, A D; Marcolin, A C; Zongo, A; Blondel, B; Hernández, B; Hogue, C J; Prunet, C; Landman, C; Ochir, C; Cuesta, C; Pileggi-Castro, C; Walker, D; Alves, D; Abalos, E; Moises, Ecd; Vieira, E M; Duarte, G; Perdona, G; Gurol-Urganci, I; Takahiko, K; Moscovici, L; Campodonico, L; Oliveira-Ciabati, L; Laopaiboon, M; Danansuriya, M; Nakamura-Pereira, M; Costa, M L; Torloni, M R; Kramer, M R; Borges, P; Olkhanud, P B; Pérez-Cuevas, R; Agampodi, S B; Mittal, S; Serruya, S; Bataglia, V; Li, Z; Temmerman, M; Gülmezoglu, A M
2016-02-01
To generate a global reference for caesarean section (CS) rates at health facilities. Cross-sectional study. Health facilities from 43 countries. Thirty eight thousand three hundred and twenty-four women giving birth from 22 countries for model building and 10,045,875 women giving birth from 43 countries for model testing. We hypothesised that mathematical models could determine the relationship between clinical-obstetric characteristics and CS. These models generated probabilities of CS that could be compared with the observed CS rates. We devised a three-step approach to generate the global benchmark of CS rates at health facilities: creation of a multi-country reference population, building mathematical models, and testing these models. Area under the ROC curves, diagnostic odds ratio, expected CS rate, observed CS rate. According to the different versions of the model, areas under the ROC curves suggested a good discriminatory capacity of C-Model, with summary estimates ranging from 0.832 to 0.844. The C-Model was able to generate expected CS rates adjusted for the case-mix of the obstetric population. We have also prepared an e-calculator to facilitate use of C-Model (www.who.int/reproductivehealth/publications/maternal_perinatal_health/c-model/en/). This article describes the development of a global reference for CS rates. Based on maternal characteristics, this tool was able to generate an individualised expected CS rate for health facilities or groups of health facilities. With C-Model, obstetric teams, health system managers, health facilities, health insurance companies, and governments can produce a customised reference CS rate for assessing use (and overuse) of CS. The C-Model provides a customized benchmark for caesarean section rates in health facilities and systems. © 2015 World Health Organization; licensed by John Wiley & Sons Ltd on behalf of Royal College of Obstetricians and Gynaecologists.
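As a hedged illustration of the general modelling idea (not the actual C-Model equations), the sketch below fits a logistic model of caesarean section from made-up clinical-obstetric covariates, then reports the ROC area together with the case-mix-adjusted expected CS rate and the observed rate.

```python
# Illustrative sketch of the modelling idea behind C-Model: fit a probability model
# of caesarean section from clinical-obstetric covariates, then compare the mean
# predicted probability (expected CS rate) with the observed rate and check
# discrimination with the area under the ROC curve. Variables are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.integers(0, 2, n),          # e.g. previous caesarean (0/1)
    rng.integers(0, 2, n),          # e.g. breech presentation (0/1)
    rng.normal(28, 6, n),           # e.g. maternal age
])
logit = -2.0 + 1.5 * X[:, 0] + 2.0 * X[:, 1] + 0.02 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = LogisticRegression().fit(X, y)
p = model.predict_proba(X)[:, 1]

print("AUC:", roc_auc_score(y, p))              # discriminatory capacity
print("expected CS rate:", p.mean())            # case-mix adjusted expectation
print("observed CS rate:", y.mean())
```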
Preliminary Computational Study for Future Tests in the NASA Ames 9- by 7-Foot Wind Tunnel
NASA Technical Reports Server (NTRS)
Pearl, Jason M.; Carter, Melissa B.; Elmiligui, Alaa A.; WInski, Courtney S.; Nayani, Sudheer N.
2016-01-01
The NASA Advanced Air Vehicles Program, Commercial Supersonics Technology Project seeks to advance tools and techniques to make over-land supersonic flight feasible. In this study, preliminary computational results are presented for future tests in the NASA Ames 9- by 7-foot supersonic wind tunnel to be conducted in early 2016. Shock-plume interactions and their effect on pressure signature are examined for six model geometries. Near-field pressure signatures are assessed using the CFD code USM3D to model the proposed test geometries in free air. Additionally, results obtained using the commercial grid generation software Pointwise (registered trademark) are compared to results using VGRID, the NASA Langley Research Center in-house mesh generation program.
A Hydrogen Peroxide Hot-Jet Simulator for Wind-Tunnel Tests of Turbojet-Exit Models
NASA Technical Reports Server (NTRS)
Runckel, Jack F.; Swihart, John M.
1959-01-01
A turbojet-engine-exhaust simulator which utilizes a hydrogen peroxide gas generator has been developed for powered-model testing in wind tunnels with air exchange. Catalytic decomposition of concentrated hydrogen peroxide provides a convenient and easily controlled method of providing a hot jet with characteristics that correspond closely to the jet of a gas turbine engine. The problems associated with simulation of jet exhausts in a transonic wind tunnel which led to the selection of a liquid monopropellant are discussed. The operation of the jet simulator consisting of a thrust balance, gas generator, exit nozzle, and auxiliary control system is described. Static-test data obtained with convergent nozzles are presented and shown to be in good agreement with ideal calculated values.
A neural model of rule generation in inductive reasoning.
Rasmussen, Daniel; Eliasmith, Chris
2011-01-01
Inductive reasoning is a fundamental and complex aspect of human intelligence. In particular, how do subjects, given a set of particular examples, generate general descriptions of the rules governing that set? We present a biologically plausible method for accomplishing this task and implement it in a spiking neuron model. We demonstrate the success of this model by applying it to the problem domain of Raven's Progressive Matrices, a widely used tool in the field of intelligence testing. The model is able to generate the rules necessary to correctly solve Raven's items, as well as recreate many of the experimental effects observed in human subjects. Copyright © 2011 Cognitive Science Society, Inc.
NASA Astrophysics Data System (ADS)
Sun, Congcong; Wang, Zhijie; Liu, Sanming; Jiang, Xiuchen; Sheng, Gehao; Liu, Tianyu
2017-05-01
Wind power has the advantages of being clean and non-polluting, and the development of bundled wind-thermal generation power systems (BWTGSs) is one of the important means to improve the wind power accommodation rate and implement the “clean alternative” on the generation side. A two-stage optimization strategy for BWTGSs considering wind speed forecasting results and load characteristics is proposed. By taking the short-term wind speed forecasting results of the generation side and the load characteristics of the demand side into account, a two-stage optimization model for BWTGSs is formulated. Using the environmental benefit index of BWTGSs as the objective function, and supply-demand balance and generator operation as the constraints, the first-stage optimization model is developed with chance-constrained programming theory. Using the operation cost of the BWTGSs as the objective function, the second-stage optimization model is developed with a greedy algorithm. An improved PSO algorithm is employed to solve the model, and a numerical test verifies the effectiveness of the proposed strategy.
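The abstract states that an improved PSO is used to solve the model; the sketch below shows only a bare-bones, generic particle swarm minimizing a placeholder operating-cost function, without the paper's improvements or constraint handling.

```python
# Bare-bones particle swarm optimization (PSO) sketch. The cost function is a
# placeholder for the BWTGS operating-cost objective; the paper's "improved PSO"
# and its constraints are not reproduced here.
import numpy as np

def cost(x):
    return np.sum((x - 3.0) ** 2, axis=-1)      # hypothetical operating cost

rng = np.random.default_rng(0)
n_particles, dim, iters = 30, 4, 200
w, c1, c2 = 0.7, 1.5, 1.5                        # inertia and acceleration weights

x = rng.uniform(0, 10, (n_particles, dim))       # particle positions (dispatch levels)
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), cost(x)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(iters):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    val = cost(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best dispatch:", gbest, "cost:", cost(gbest))
```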
Well test mathematical model for fractures network in tight oil reservoirs
NASA Astrophysics Data System (ADS)
Diwu, Pengxiang; Liu, Tongjing; Jiang, Baoyi; Wang, Rui; Yang, Peidie; Yang, Jiping; Wang, Zhaoming
2018-02-01
Well testing, especially build-up testing, has been applied widely in the development of tight oil reservoirs, since it is the only available low-cost way to directly quantify flow ability and formation heterogeneity parameters. However, because of the fracture network near the wellbore, generated by artificial fracturing linking up natural fractures, traditional infinite- and finite-conductivity fracture models usually result in significant deviation in field application. In this work, considering the random distribution of natural fractures, a physical model of the fracture network is proposed, and it shows a composite-model feature at the large scale. Consequently, a nonhomogeneous composite mathematical model is established with a threshold pressure gradient. To solve this model semi-analytically, we propose a solution approach combining the Laplace transform and virtual-argument Bessel functions, and this method is verified by comparison with an existing analytical solution. Type curves generated from the semi-analytical solution indicate that the proposed physical and mathematical model can describe the type-curve characteristics of typical tight oil reservoirs, which show upwarping at late times rather than parallel lines with slopes of 1/2 or 1/4. This means the composite model can be used in the pressure interpretation of artificially fractured wells in tight oil reservoirs.
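Turning a Laplace-domain solution of this kind into time-domain type curves generally requires a numerical inversion step; one common choice is the Gaver-Stehfest algorithm, sketched below on a simple placeholder transform rather than the paper's composite fracture-network model.

```python
# Sketch of Gaver-Stehfest numerical inversion of a Laplace-domain solution back to
# the time domain, a common step when turning semi-analytical well-test solutions
# into type curves. The Laplace-space function below is a placeholder, not the
# paper's composite fracture-network model.
import math

def stehfest_coefficients(N=12):
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            s += (j ** (N // 2) * math.factorial(2 * j)) / (
                math.factorial(N // 2 - j) * math.factorial(j) *
                math.factorial(j - 1) * math.factorial(k - j) *
                math.factorial(2 * j - k))
        V.append(((-1) ** (k + N // 2)) * s)
    return V

def invert(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s)."""
    V = stehfest_coefficients(N)
    ln2_t = math.log(2.0) / t
    return ln2_t * sum(V[k - 1] * F(k * ln2_t) for k in range(1, N + 1))

F = lambda s: 1.0 / (s + 1.0)        # placeholder transform; exact inverse is exp(-t)
for t in (0.5, 1.0, 2.0):
    print(t, invert(F, t), math.exp(-t))
```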
Lockwood, Sarah Y.; Meisel, Jayda E.; Monsma, Frederick J.; Spence, Dana M.
2016-01-01
The process of bringing a drug to market involves many steps, including the preclinical stage, where various properties of the drug candidate molecule are determined. These properties, which include drug absorption, distribution, metabolism, and excretion, are often displayed in a pharmacokinetic (PK) profile. While PK profiles are determined in animal models, in vitro systems that model in vivo processes are available, although each possesses shortcomings. Here, we present a 3D-printed, diffusion-based, and dynamic in vitro PK device. The device contains six flow channels, each with integrated porous membrane-based insert wells. The pores of these membranes enable drugs to freely diffuse back and forth between the flow channels and the inserts, thus enabling both loading and clearance portions of a standard PK curve to be generated. The device is designed to work with 96-well plate technology and consumes single-digit milliliter volumes to generate multiple PK profiles, simultaneously. Generation of PK profiles by use of the device was initially performed with fluorescein as a test molecule. Effects of such parameters as flow rate, loading time, volume in the insert well, and initial concentration of the test molecule were investigated. A prediction model was generated from this data, enabling the user to predict the concentration of the test molecule at any point along the PK profile within a coefficient of variation of ~5%. Depletion of the analyte from the well was characterized and was determined to follow first-order rate kinetics, indicated by statistically equivalent (p > 0.05) depletion half-lives that were independent of the starting concentration. A PK curve for an approved antibiotic, levofloxacin, was generated to show utility beyond the fluorescein test molecule. PMID:26727249
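A small sketch of the first-order depletion check described above, assuming synthetic insert-well concentrations: an exponential decay is fitted at several starting concentrations, and the resulting half-lives (ln 2 / k) should be statistically equivalent if the kinetics are first order.

```python
# Sketch of checking first-order depletion kinetics: fit C(t) = C0 * exp(-k t) to
# sampled concentrations and compare half-lives (t_1/2 = ln 2 / k) across starting
# concentrations. Data below are synthetic stand-ins for insert-well measurements.
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, c0, k):
    return c0 * np.exp(-k * t)

rng = np.random.default_rng(0)
t = np.array([0, 10, 20, 40, 60, 90], dtype=float)       # sampling times, minutes
for c0_true in (1.0, 5.0, 10.0):                          # different starting concentrations
    c = first_order(t, c0_true, 0.03) * (1 + 0.02 * rng.normal(size=t.size))
    (c0_fit, k_fit), _ = curve_fit(first_order, t, c, p0=(c.max(), 0.01))
    print(f"C0={c0_true}: k={k_fit:.4f} 1/min, t_half={np.log(2)/k_fit:.1f} min")
```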
Modeling and Simulation of the Second-Generation Orion Crew Module Air Bag Landing System
NASA Technical Reports Server (NTRS)
Timmers, Richard B.; Welch, Joseph V.; Hardy, Robin C.
2009-01-01
Air bags were evaluated as the landing attenuation system for earth landing of the Orion Crew Module (CM). An important element of the air bag system design process is proper modeling of the proposed configuration to determine if the resulting performance meets requirements. Analysis conducted to date shows that airbags are capable of providing a graceful landing of the CM in nominal and off-nominal conditions such as parachute failure, high horizontal winds, and unfavorable vehicle/ground angle combinations. The efforts presented here surround a second generation of the airbag design developed by ILC Dover, and is based on previous design, analysis, and testing efforts. In order to fully evaluate the second generation air bag design and correlate the dynamic simulations, a series of drop tests were carried out at NASA Langley's Landing and Impact Research (LandIR) facility. The tests consisted of a full-scale set of air bags attached to a full-scale test article representing the Orion Crew Module. The techniques used to collect experimental data, construct the simulations, and make comparisons to experimental data are discussed.
Composite Overwrapped Pressure Vessel (COPV) Stress Rupture Testing
NASA Technical Reports Server (NTRS)
Greene, Nathanael J.; Saulsberry, Regor L.; Leifeste, Mark R.; Yoder, Tommy B.; Keddy, Chris P.; Forth, Scott C.; Russell, Rick W.
2010-01-01
This paper reports stress rupture testing of Kevlar(TradeMark) composite overwrapped pressure vessels (COPVs) at NASA White Sands Test Facility. This 6-year test program was part of the larger effort to predict and extend the lifetime of flight vessels. Tests were performed to characterize control parameters for stress rupture testing, and vessel life was predicted by statistical modeling. One highly instrumented 102-cm (40-in.) diameter Kevlar(TradeMark) COPV was tested to failure (burst) as a single-point model verification. Significant data were generated that will enhance development of improved NDE methods and predictive modeling techniques, and thus better address stress rupture and other composite durability concerns that affect pressure vessel safety, reliability and mission assurance.
Inflation of Unreefed and Reefed Extraction Parachutes
NASA Technical Reports Server (NTRS)
Ray, Eric S.; Varela, Jose G.
2015-01-01
Data from the Orion and several other test programs have been used to reconstruct inflation parameters for 28 ft Do extraction parachutes as well as the parent aircraft pitch response during extraction. The inflation force generated by extraction parachutes is recorded directly during tow tests but is usually inferred from the payload accelerometer during Low Velocity Airdrop Delivery (LVAD) flight test extractions. Inflation parameters are dependent on the type of parent aircraft, number of canopies, and standard vs. high altitude extraction conditions. For standard altitudes, single canopy inflations are modeled as infinite mass, but the non-symmetric inflations in a cluster are modeled as finite mass. High altitude extractions have necessitated reefing the extraction parachutes, which are best modeled as infinite mass for those conditions. Distributions of aircraft pitch profiles and inflation parameters have been generated for use in Monte Carlo simulations of payload extractions.
NASA Astrophysics Data System (ADS)
Zhou, W.; Qiu, G. Y.; Oodo, S. O.; He, H.
2013-03-01
An increasing interest in wind energy and the advance of related technologies have increased the connection of wind power generation into electrical grids. This paper proposes an optimization model for determining the maximum capacity of wind farms in a power system. In this model, generator power output limits, voltage limits and thermal limits of branches in the grid system were considered in order to limit the steady-state security influence of wind generators on the power system. The optimization model was solved by a nonlinear primal-dual interior-point method. An IEEE-30 bus system with two wind farms was tested through simulation studies, and an analysis was conducted to verify the effectiveness of the proposed model. The results indicated that the model is efficient and reasonable.
NASA Technical Reports Server (NTRS)
Marshall, B. A.; Nichols, M. E.
1984-01-01
An experimental investigation (Test OA-309) was conducted using 0.0405-scale Space Shuttle Orbiter Model 16-0 in the North American Aerodynamics Laboratory 7.75 x 11.00-foot Lowspeed Wind Tunnel. The primary purpose was to locate and study any flow conditions or vortices that might have caused damage to the Advanced Flexible Reusable Surface Insulation (AFRSI) during the Space Transportation System STS-6 mission. A secondary objective was to evaluate vortex generators to be used for Wind Tunnel Test OS-314. Flowfield visualization was obtained by means of smoke, tufts, and oil flow. The test was conducted at Mach numbers between 0.07 and 0.23 and at dynamic pressures between 7 and 35 pounds per square foot. The angle-of-attack range of the model was -5 degrees through 35 degrees at 0 or 2 degrees of sideslip, while roll angle was held constant at zero degrees. The vortex generators were studied at angles of 0, 5, 10, and 15 degrees.
Dynamic Analysis and Test Results for an STC Stirling Generator
NASA Astrophysics Data System (ADS)
Qiu, Songgang; Peterson, Allen A.
2004-02-01
Long-life, high-efficiency generators based on free-piston Stirling machines are a future energy-conversion solution for both space and commercial applications. To aid in design and system integration efforts, Stirling Technology Company (STC) has developed dynamic simulation models for the internal moving subassemblies and for complete Stirling convertor assemblies. These dynamic models have been validated using test data from operating prototypes. Simplified versions of these models are presented to help explain the operating characteristics of the Stirling convertor. Power spectrum analysis is presented for the test data for casing acceleration, piston motion, displacer motion, and controller current/voltage during full power operation. The harmonics of a Stirling convertor and its moving components are identified for the STC zener-diode control scheme. The dynamic behavior of each moving component and its contribution to the system dynamics and resultant vibration forces are discussed. Additionally, the effects of a passive balancer and external suspension are predicted by another simplified system model.
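A minimal sketch of the kind of power spectrum analysis mentioned, assuming a synthetic stand-in for a measured signal such as casing acceleration: a Welch periodogram exposes the operating frequency and its harmonics.

```python
# Sketch of power spectrum analysis of a test signal (e.g. casing acceleration):
# a Welch periodogram reveals the convertor operating frequency and its harmonics.
# The synthetic signal below stands in for measured data; all values are assumed.
import numpy as np
from scipy.signal import welch

fs = 2000.0                                   # sample rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)
f0 = 80.0                                     # hypothetical operating frequency
signal = (np.sin(2 * np.pi * f0 * t)
          + 0.3 * np.sin(2 * np.pi * 2 * f0 * t)    # 2nd harmonic
          + 0.1 * np.sin(2 * np.pi * 3 * f0 * t)    # 3rd harmonic
          + 0.05 * np.random.default_rng(0).normal(size=t.size))

f, Pxx = welch(signal, fs=fs, nperseg=4096)
print("dominant spectral line near", f[np.argmax(Pxx)], "Hz")   # ~80 Hz fundamental
# The 2nd and 3rd harmonics near 160 Hz and 240 Hz appear as secondary peaks in Pxx.
```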
Second-Generation Large Civil Tiltrotor 7- by 10-Foot Wind Tunnel Test Data Report
NASA Technical Reports Server (NTRS)
Theodore, Colin R.; Russell, Carl R.; Willink, Gina C.; Pete, Ashley E.; Adibi, Sierra A.; Ewert, Adam; Theuns, Lieselotte; Beierle, Connor
2016-01-01
An approximately 6-percent scale model of the NASA Second-Generation Large Civil Tiltrotor (LCTR2) Aircraft was tested in the U.S. Army 7- by 10-Foot Wind Tunnel at NASA Ames Research Center January 4 to April 19, 2012, and September 18 to November 1, 2013. The full model was tested, along with modified versions in order to determine the effects of the wing tip extensions and nacelles; the wing was also tested separately in the various configurations. In both cases, the wing and nacelles used were adopted from the U.S. Army High Efficiency Tilt Rotor (HETR) aircraft, in order to limit the cost of the experiment. The full airframe was tested in high-speed cruise and low-speed hover flight conditions, while the wing was tested only in cruise conditions, with Reynolds numbers ranging from 0 to 1.4 million. In all cases, the external scale system of the wind tunnel was used to collect data. Both models were mounted to the scale using two support struts attached underneath the wing; the full airframe model also used a third strut attached at the tail. The collected data provides insight into the performance of the preliminary design of the LCTR2 and will be used for computational fluid dynamics (CFD) validation and the development of flight dynamics simulation models.
The Effect of Mini and Midi Anchor Tests on Test Equating
ERIC Educational Resources Information Center
Arikan, Çigdem Akin
2018-01-01
The main purpose of this study is to compare the performance of midi and mini anchor tests in equating test forms based on item response theory. The research was conducted using simulated data generated under the Rasch model. In order to equate two test forms, the nonequivalent groups with anchor items design (internal anchor test) was…
Model-Based Development of Automotive Electronic Climate Control Software
NASA Astrophysics Data System (ADS)
Kakade, Rupesh; Murugesan, Mohan; Perugu, Bhupal; Nair, Mohanan
With the increasing complexity of software in today's products, writing and maintaining thousands of lines of code is a tedious task. Instead, an alternative methodology must be employed. Model-based development is one candidate that offers several benefits and allows engineers to focus on their domain of expertise rather than writing large amounts of code. In this paper, we discuss the application of model-based development to the electronic climate control software of vehicles. A back-to-back testing approach is presented that ensures a flawless and smooth transition from legacy designs to model-based development. The Simulink report generator, used to create design documents from the models, is presented along with its use to run the simulation model and capture the results in the test report. Test automation using a model-based development tool that supports the use of a unique set of test cases for several testing levels, and a test procedure that is independent of software and hardware platform, is also presented.
Statistical Hypothesis Testing in Intraspecific Phylogeography: NCPA versus ABC
Templeton, Alan R.
2009-01-01
Nested clade phylogeographic analysis (NCPA) and approximate Bayesian computation (ABC) have been used to test phylogeographic hypotheses. Multilocus NCPA tests null hypotheses, whereas ABC discriminates among a finite set of alternatives. The interpretive criteria of NCPA are explicit and allow complex models to be built from simple components. The interpretive criteria of ABC are ad hoc and require the specification of a complete phylogeographic model. The conclusions from ABC are often influenced by implicit assumptions arising from the many parameters needed to specify a complex model. These complex models confound many assumptions so that biological interpretations are difficult. Sampling error is accounted for in NCPA, but ABC ignores important sources of sampling error that creates pseudo-statistical power. NCPA generates the full sampling distribution of its statistics, but ABC only yields local probabilities, which in turn make it impossible to distinguish between a good fitting model, a non-informative model, and an over-determined model. Both NCPA and ABC use approximations, but convergences of the approximations used in NCPA are well defined whereas those in ABC are not. NCPA can analyze a large number of locations, but ABC cannot. Finally, the dimensionality of tested hypothesis is known in NCPA, but not for ABC. As a consequence, the “probabilities” generated by ABC are not true probabilities and are statistically non-interpretable. Accordingly, ABC should not be used for hypothesis testing, but simulation approaches are valuable when used in conjunction with NCPA or other methods that do not rely on highly parameterized models. PMID:19192182
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kasten, P.R.; Coobs, J.H.; Lotts, A.L.
1976-04-01
Progress is summarized in studies relating to HTGR fuel reprocessing, refabrication, and recycle; HTGR fuel materials development and performance testing; HTGR PCRV development; HTGR materials investigations; HTGR fuel chemistry; HTGR safety studies; and GCFR irradiation experiments and steam generator modeling.
Next-Generation Image and Sound Processing Strategies: Exploiting the Biological Model
2007-05-01
several video game clips which were recorded while observers interactively played the games. The feature vectors may be derived from either: the...phase, we use a different video game clip to test the model. Frames from the test clip are passed in parallel to a bottom-up saliency model, as well as... video games (Figure 6). We found that the TD model alone predicts where humans look about twice as well as does the BU model alone; in addition, a
Blanco-Martín, Laura; Wolters, Ralf; Rutqvist, Jonny; ...
2016-04-28
The Thermal Simulation for Drift Emplacement heater test is modeled with two simulators for coupled thermal-hydraulic-mechanical processes. Results from the two simulators are in very good agreement. The comparison between measurements and numerical results is also very satisfactory, regarding temperature, drift closure and rock deformation. Concerning backfill compaction, a parameter calibration through inverse modeling was performed due to insufficient data on crushed salt reconsolidation, particularly at high temperatures. We conclude that the two simulators investigated have the capabilities to reproduce the data available, which increases confidence in their use to reliably investigate disposal of heat-generating nuclear waste in saliferous geosystems.
NASA Astrophysics Data System (ADS)
Oishi, Ikuo; Nishijima, Kenichi
2002-03-01
A 70 MW class superconducting model generator was designed, manufactured, and tested from 1988 to 1999 as Phase I of Japan's national project on applications of superconducting technologies to electric power apparatuses, which was commissioned by NEDO as part of the New Sunshine Program of AIST and MITI. Phase II is now being carried out by almost the same organization as Phase I. With the development of the 70 MW class superconducting model generator, technologies for a 200 MW class pilot generator were established. The world's largest output (79 MW), the world's longest continuous operation (1500 h), and other satisfactory characteristics were achieved with the 70 MW class superconducting model generator, and the key design and manufacturing technologies required for the 200 MW class pilot generator were established. This project contributed to the progress of R&D on power apparatuses. Super-GM has started the next project (Phase II), which shall develop the key technologies for a larger-capacity and more compact machine and is scheduled from 2000 to 2003. Phase II shall be the first step toward commercialization of superconducting generators.
Procedure for the Selection and Validation of a Calibration Model I-Description and Application.
Desharnais, Brigitte; Camirand-Lemyre, Félix; Mireault, Pascal; Skinner, Cameron D
2017-05-01
Calibration model selection is required for all quantitative methods in toxicology and more broadly in bioanalysis. This typically involves selecting the equation order (quadratic or linear) and the weighting factor that correctly model the data. Mis-selection of the calibration model will degrade quality control (QC) accuracy, with errors of up to 154%. Unfortunately, simple tools to perform this selection and tests to validate the resulting model are lacking. We present a stepwise, analyst-independent scheme for selection and validation of calibration models. The success rate of this scheme is on average 40% higher than a traditional "fit and check the QC accuracy" method of selecting the calibration model. Moreover, the process was completely automated through a script (available in Supplemental Data 3) running in RStudio (free, open-source software). The need for weighting was assessed through an F-test using the variances of the upper limit of quantification and lower limit of quantification replicate measurements. When weighting was required, the choice between 1/x and 1/x² was determined by calculating which option generated the smallest spread of weighted normalized variances. Finally, model order was selected through a partial F-test. The chosen calibration model was validated through Cramér-von Mises or Kolmogorov-Smirnov normality testing of the standardized residuals. Performance of the different tests was assessed using 50 simulated data sets per possible calibration model (e.g., linear-no weight, quadratic-no weight, linear-1/x, etc.). This first of two papers describes the tests, procedures and outcomes of the developed procedure using real LC-MS-MS results for the quantification of cocaine and naltrexone. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
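A hedged sketch of the stepwise selection logic described above; the thresholds and data are illustrative, the 1/x versus 1/x² spread comparison is omitted, and this is not the authors' R script.

```python
# Simplified sketch of the calibration-model selection steps: (1) F-test on LLOQ vs
# ULOQ replicate variances to decide whether weighting is needed, (2) partial F-test
# for linear vs quadratic order, (3) normality test of the standardized residuals.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Replicate responses at the LLOQ and ULOQ (response variance grows with level here).
lloq_reps = 2.0 * 1.0 + rng.normal(0, 0.02 * 1.0, 6)
uloq_reps = 2.0 * 200.0 + rng.normal(0, 0.02 * 200.0, 6)

def needs_weighting(lo, hi, alpha=0.05):
    """One-sided F-test: is the ULOQ variance larger than the LLOQ variance?"""
    f = np.var(hi, ddof=1) / np.var(lo, ddof=1)
    p = 1.0 - stats.f.cdf(f, len(hi) - 1, len(lo) - 1)
    return p < alpha

# Illustrative calibration curve (concentration vs. instrument response).
x = np.array([1, 2, 5, 10, 20, 50, 100, 200], dtype=float)
y = 2.0 * x + rng.normal(0, 0.02 * x)

# Step 1: weighting decision (choice between 1/x and 1/x^2 omitted; 1/x^2 assumed).
w = 1.0 / x**2 if needs_weighting(lloq_reps, uloq_reps) else np.ones_like(x)

# Step 2: partial F-test for quadratic vs linear order (weighted least squares).
lin = np.polyfit(x, y, 1, w=np.sqrt(w))
quad = np.polyfit(x, y, 2, w=np.sqrt(w))
rss1 = np.sum(w * (y - np.polyval(lin, x)) ** 2)
rss2 = np.sum(w * (y - np.polyval(quad, x)) ** 2)
f_partial = (rss1 - rss2) / (rss2 / (len(x) - 3))
order = 2 if (1.0 - stats.f.cdf(f_partial, 1, len(x) - 3)) < 0.05 else 1

# Step 3: validate via normality testing of the standardized weighted residuals.
coeffs = np.polyfit(x, y, order, w=np.sqrt(w))
resid = (y - np.polyval(coeffs, x)) * np.sqrt(w)
p_norm = stats.kstest(stats.zscore(resid), "norm").pvalue
print("weighted:", not np.allclose(w, 1.0), "| order:", order, "| normality p:", p_norm)
```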
Flapping wing applied to wind generators
NASA Astrophysics Data System (ADS)
Colidiuc, Alexandra; Galetuse, Stelian; Suatean, Bogdan
2012-11-01
New international conditions for the distribution of energy sources and the continuous increase in energy consumption call for new alternative resources that keep the environment clean. This paper offers a new approach for a wind generator based on a theoretical aerodynamic model. This new model of wind generator was used to test the effects of using a bird airfoil instead of a normal wind generator airfoil. The aim is to calculate the efficiency of the new model of wind generator. A representative direction for using renewable energy is the transformation of wind energy into electrical energy with the help of wind turbines; the development of such systems leads to new solutions based on high efficiency, reduced costs and suitability to the implementation conditions.
Using Neural Networks to Generate Inferential Roles for Natural Language
Blouw, Peter; Eliasmith, Chris
2018-01-01
Neural networks have long been used to study linguistic phenomena spanning the domains of phonology, morphology, syntax, and semantics. Of these domains, semantics is somewhat unique in that there is little clarity concerning what a model needs to be able to do in order to provide an account of how the meanings of complex linguistic expressions, such as sentences, are understood. We argue that one thing such models need to be able to do is generate predictions about which further sentences are likely to follow from a given sentence; these define the sentence's “inferential role.” We then show that it is possible to train a tree-structured neural network model to generate very simple examples of such inferential roles using the recently released Stanford Natural Language Inference (SNLI) dataset. On an empirical front, we evaluate the performance of this model by reporting entailment prediction accuracies on a set of test sentences not present in the training data. We also report the results of a simple study that compares human plausibility ratings for both human-generated and model-generated entailments for a random selection of sentences in this test set. On a more theoretical front, we argue in favor of a revision to some common assumptions about semantics: understanding a linguistic expression is not only a matter of mapping it onto a representation that somehow constitutes its meaning; rather, understanding a linguistic expression is mainly a matter of being able to draw certain inferences. Inference should accordingly be at the core of any model of semantic cognition. PMID:29387031
Advanced recovery systems wind tunnel test report
NASA Technical Reports Server (NTRS)
Geiger, R. H.; Wailes, W. K.
1990-01-01
Pioneer Aerospace Corporation (PAC) conducted parafoil wind tunnel testing in the NASA-Ames 80 by 120 test sections of the National Full-Scale Aerodynamic Complex, Moffett Field, CA. The investigation was conducted to determine the aerodynamic characteristics of two scale ram air wings in support of air drop testing and full scale development of Advanced Recovery Systems for the Next Generation Space Transportation System. Two models were tested during this investigation. Both the primary test article, a 1/9 geometric scale model with wing area of 1200 square feet and secondary test article, a 1/36 geometric scale model with wing area of 300 square feet, had an aspect ratio of 3. The test results show that both models were statically stable about a model reference point at angles of attack from 2 to 10 degrees. The maximum lift-drag ratio varied between 2.9 and 2.4 for increasing wing loading.
The DOE/NASA SRG110 Program Overview
NASA Astrophysics Data System (ADS)
Shaltens, R. K.; Richardson, R. L.
2005-12-01
The Department of Energy is developing the Stirling Radioisotope Generator (SRG110) for NASA's Science Mission Directorate for potential surface and deep space missions. The SRG110 is one of two new radioisotope power systems (RPSs) currently being developed for NASA space missions, and is capable of operating in a range of planetary atmospheres and in deep space environments. It has a mass of approximately 27 kg and produces more than 125 We(dc) at beginning of mission (BOM), with a design lifetime of fourteen years. Electrical power is produced by two free-piston Stirling convertors heated by two General Purpose Heat Source (GPHS) modules. The complete SRG110 system is approximately 38 cm x 36 cm and 76 cm long. The SRG110 generator is being designed in three stages: Engineering Model, Qualification Generator, and Flight Generator. Current plans call for the Engineering Model to be fabricated and tested by October 2006. Completion of testing of the Qualification Generator is scheduled for mid-2009. This development is being performed by Lockheed Martin, Valley Forge, PA, and Infinia Corporation, Kennewick, WA, under contract to the Department of Energy, Germantown, MD. Glenn Research Center, Cleveland, Ohio, is providing independent testing and support for the technology transition for the SRG110 Program.
Thermal Analysis and Testing of Fastrac Gas Generator Design
NASA Technical Reports Server (NTRS)
Nguyen, H.
1998-01-01
The Fastrac Engine is being developed by the Marshall Space Flight Center (MSFC) to help meet the goal of substantially reducing the cost of access to space. This engine relies on a simple gas-generator cycle, which burns a small amount of RP-1 and oxygen to provide gas to drive the turbine and then exhausts the spent fuel. The Fastrac program envisions a combination of analysis, design and hot-fire evaluation testing. This paper provides the supporting thermal analysis of the gas generator design. In order to ensure that the design objectives were met, the evaluation tests started at the component level, and a total of 15 tests of different durations have been completed to date at MSFC. The correlated thermal model results are also compared against the hot-fire thermocouple data gathered.
ERIC Educational Resources Information Center
Sengul Avsar, Asiye; Tavsancil, Ezel
2017-01-01
This study analysed polytomous items' psychometric properties according to nonparametric item response theory (NIRT) models. Thus, simulated datasets--three different test lengths (10, 20 and 30 items), three sample distributions (normal, right and left skewed) and three samples sizes (100, 250 and 500)--were generated by conducting 20…
Benchmark dose risk assessment software (BMDS) was designed by EPA to generate dose-response curves and facilitate the analysis, interpretation and synthesis of toxicological data. Partial results of QA/QC testing of the EPA benchmark dose software (BMDS) are presented. BMDS pr...
Methodological Aspects of Evaluation in Primary Reading.
ERIC Educational Resources Information Center
Henry, G.; Grisay, A.
This paper develops a model for generating sets of replicable items for testing a range of reading skills in the primary grades. The procedure is particularly concerned with tests to identify a child's profile in reading achievement and to inform a teacher, principal, or district of the actual level of achievement in reading. Although the model is…
Analytical Investigation of a Reflux Boiler
NASA Technical Reports Server (NTRS)
Simon, William E.; Young, Fred M.; Chambers, Terrence L.
1996-01-01
A thermal model of a single Ultralight Fabric Reflux Tube (UFRT) was constructed and tested against data for an array of such tubes tested in the NASA-JSC facility. Modifications to the single fin model were necessary to accommodate the change in radiation shape factors due to adjacent tubes. There was good agreement between the test data and data generated for the same cases by the thermal model. The thermal model was also used to generate single and linear array data for the lunar environment (the primary difference between the test and lunar data was due to lunar gravity). The model was also used to optimize the linear spacing of the reflux tubes in an array. The optimal spacing of the tubes was recommended to be about 5 tube diameters based on maximizing the heat transfer per unit mass. The model also showed that the thermal conductivity of the Nextel fabric was the major limitation to the heat transfer. This led to a suggestion that the feasibility of jacketing the Nextel fiber bundles with copper strands be investigated. This jacketing arrangement was estimated to be able to double the thermal conductivity of the fabric at a volume concentration of about 12-14%. Doubling the thermal conductivity of the fabric would double the amount of heat transferred at the same steam saturation temperature.
Statechart Analysis with Symbolic PathFinder
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.
2012-01-01
We report here on our on-going work that addresses the automated analysis and test case generation for software systems modeled using multiple Statechart formalisms. The work is motivated by large programs such as NASA Exploration, that involve multiple systems that interact via safety-critical protocols and are designed with different Statechart variants. To verify these safety-critical systems, we have developed Polyglot, a framework for modeling and analysis of model-based software written using different Statechart formalisms. Polyglot uses a common intermediate representation with customizable Statechart semantics and leverages the analysis and test generation capabilities of the Symbolic PathFinder tool. Polyglot is used as follows: First, the structure of the Statechart model (expressed in Matlab Stateflow or Rational Rhapsody) is translated into a common intermediate representation (IR). The IR is then translated into Java code that represents the structure of the model. The semantics are provided as "pluggable" modules.
Spectrophotovoltaic orbital power generation
NASA Technical Reports Server (NTRS)
Knowles, G.; Carroll, J.
1983-01-01
A subscale model of a photovoltaic power system employing spectral splitting and 1000:1 concentration was fabricated and tested. The 10-in. aperture model demonstrated 15.5% efficiency with 86% of the energy produced by a GaAs solar cell and 14% of the energy produced by an Si cell. The calculated efficiency of the system using the same solar cells, but having perfect optics, would be approximately 20%. The model design, component measurements, test results, and mathematical model are presented.
Aerodynamic and acoustic test of a United Technologies model scale rotor at DNW
NASA Technical Reports Server (NTRS)
Yu, Yung H.; Liu, Sandy R.; Jordan, Dave E.; Landgrebe, Anton J.; Lorber, Peter F.; Pollack, Michael J.; Martin, Ruth M.
1990-01-01
The UTC model scale rotors, the DNW wind tunnel, the AFDD rotary wing test stand, the UTRC and AFDD aerodynamic and acoustic data acquisition systems, and the scope of test matrices are discussed and an introduction to the test results is provided. It is pointed out that a comprehensive aero/acoustic database of several configurations of the UTC scaled model rotor has been created. The data is expected to improve understanding of rotor aerodynamics, acoustics, and dynamics, and lead to enhanced analytical methodology and design capabilities for the next generation of rotorcraft.
Clothier, Richard; Starzec, Gemma; Pradel, Lionel; Baxter, Victoria; Jones, Melanie; Cox, Helen; Noble, Linda
2002-01-01
A range of cosmetics formulations with human patch-test data were supplied in a coded form, for the examination of the use of a combined in vitro permeability barrier assay and cell viability assay to generate, and then test, a prediction model for assessing potential human skin patch-test results. The target cells employed were of the Madin Darby canine kidney cell line, which establishes tight junctions and adherens junctions able to restrict the permeability of sodium fluorescein across the barrier of the confluent cell layer. The prediction model for interpretation of the in vitro assay results included initial effects and the recovery profile over 72 hours. A set of the hand-wash, surfactant-based formulations was tested to generate the prediction model, and then six others were evaluated. The model system was then also evaluated with powder laundry detergents and hand moisturisers: their effects were predicted by the in vitro test system. The model was under-predictive for two of the ten hand-wash products. It was over-predictive for the moisturisers (two out of six) and for eight out of ten laundry powders. However, the in vivo human patch-test data were variable, and 19 of the 26 predictions were correct or within 0.5 on the 0-4.0 scale used for the in vivo scores, i.e. within the same variable range reported for the repeat-test hand-wash in vivo data.
NASA Technical Reports Server (NTRS)
Beck, L. R.; Rodriguez, M. H.; Dister, S. W.; Rodriguez, A. D.; Washino, R. K.; Roberts, D. R.; Spanner, M. A.
1997-01-01
A blind test of two remote sensing-based models for predicting adult populations of Anopheles albimanus in villages, an indicator of malaria transmission risk, was conducted in southern Chiapas, Mexico. One model was developed using a discriminant analysis approach, while the other was based on regression analysis. The models were developed in 1992 for an area around Tapachula, Chiapas, using Landsat Thematic Mapper (TM) satellite data and geographic information system functions. Using two remotely sensed landscape elements, the discriminant model was able to successfully distinguish between villages with high and low An. albimanus abundance with an overall accuracy of 90%. To test the predictive capability of the models, multitemporal TM data were used to generate a landscape map of the Huixtla area, northwest of Tapachula, where the models were used to predict risk for 40 villages. The resulting predictions were not disclosed until the end of the test. Independently, An. albimanus abundance data were collected in the 40 randomly selected villages for which the predictions had been made. These data were subsequently used to assess the models' accuracies. The discriminant model accurately predicted 79% of the high-abundance villages and 50% of the low-abundance villages, for an overall accuracy of 70%. The regression model correctly identified seven of the 10 villages with the highest mosquito abundance. This test demonstrated that remote sensing-based models generated for one area can be used successfully in another, comparable area.
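As an illustration of the discriminant-analysis step, the sketch below classifies villages into high and low abundance from two hypothetical landscape-element proportions; the variable names and data are invented, not the Chiapas dataset.

```python
# Sketch of the discriminant-analysis approach: classify villages as high or low
# An. albimanus abundance from the proportions of two remotely sensed landscape
# elements. The landscape variables and data below are hypothetical.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
wetland_fraction = np.r_[rng.normal(0.30, 0.05, 20), rng.normal(0.10, 0.05, 20)]
pasture_fraction = np.r_[rng.normal(0.25, 0.05, 20), rng.normal(0.08, 0.05, 20)]
X = np.column_stack([wetland_fraction, pasture_fraction])
y = np.r_[np.ones(20), np.zeros(20)]          # 1 = high abundance, 0 = low abundance

lda = LinearDiscriminantAnalysis().fit(X, y)
print("overall accuracy:", lda.score(X, y))   # compare with the reported 90% / 70%
```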
ERIC Educational Resources Information Center
Baghaei, Purya; Ravand, Hamdollah
2016-01-01
In this study the magnitudes of local dependence generated by cloze test items and reading comprehension items were compared and their impact on parameter estimates and test precision was investigated. An advanced English as a foreign language reading comprehension test containing three reading passages and a cloze test was analyzed with a…
Correlation of predicted and measured thermal stresses on a truss-type aircraft structure
NASA Technical Reports Server (NTRS)
Jenkins, J. M.; Schuster, L. S.; Carter, A. L.
1978-01-01
A test structure representing a portion of a hypersonic vehicle was instrumented with strain gages and thermocouples. This test structure was then subjected to laboratory heating representative of supersonic and hypersonic flight conditions. A finite element computer model of this structure was developed using several types of elements with the NASA structural analysis (NASTRAN) computer program. Temperature inputs from the test were used to generate predicted model thermal stresses and these were correlated with the test measurements.
Binomial test statistics using Psi functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, Kimiko O.
2007-01-01
For the negative binomial model (probability generating function (p + 1 - pt)^(-k)), a logarithmic derivative is the psi function difference ψ(k + x) - ψ(k); this and its derivatives lead to a test statistic to decide on the validity of a specified model. The test statistic uses a data base, so a comparison between theory and application is available. Note that the test function is not dominated by outliers. Applications to (i) Fisher's tick data, (ii) accident data, and (iii) Weldon's dice data are included.
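The building block of the statistic, the digamma difference ψ(k + x) − ψ(k), can be evaluated directly; the sketch below only computes this term on sample counts and does not reproduce the report's full test-statistic construction.

```python
# Sketch of the psi (digamma) function difference underlying the statistic: for
# negative binomial counts x with shape parameter k, psi(k + x) - psi(k) is the
# logarithmic-derivative term noted above. The full test statistic is not built here.
import numpy as np
from scipy.special import digamma

k = 2.5                                         # assumed negative binomial shape
counts = np.array([0, 1, 1, 2, 3, 5, 0, 2, 4])  # e.g. accident or tick counts
psi_terms = digamma(k + counts) - digamma(k)
print(psi_terms.mean())                         # summary of the log-derivative terms
```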
An Aspect of Political Socialization of Student Movement Participants in Korea.
ERIC Educational Resources Information Center
Park, Byeong-chul
1993-01-01
Tests hypotheses from lineage socialization and generation unit perspectives on Korean student protest participation using 360 self-administered questionnaires collected at 3 Korean universities. Results indicate that these hypotheses are not mutually exclusive but support the generation unit model. (SLD)
This document summarizes the process followed to utilize GT-POWER modeled engine and laboratory engine dyno test data to generate a full engine fuel consumption map which can be used by EPA's ALPHA vehicle simulations.
Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Waithe, Kenrick A.
2004-01-01
A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
Chroma-preserved luma controlling technique using YCbCr color space
NASA Astrophysics Data System (ADS)
Lee, Sooyeon; Kwak, Youngshin; Kim, Youn Jin
2013-02-01
The YCbCr color space, composed of luma and chrominance components, is preferred for its ease of image processing. However, the non-orthogonality between YCbCr components induces an unwanted perceived chroma change when controlling luma values. In this study, a new method was designed to compensate for the unwanted chroma change generated by luma change. For six different YCC_hue angles, data points named 'Original data' were generated with uniformly distributed luma and Cb, Cr values. Weight values were then applied to the luma values of the 'Original data' set, resulting in a 'Test data' set, followed by calculation of the 'new YCC_chroma' that gives the minimum CIECAM02 ΔC between the original and test data. Finally, a mathematical model was developed to predict the YCC_chroma values needed to compensate for the CIECAM02 chroma changes. This model was implemented in a luma-controlling algorithm having constant perceived chroma. The performance was tested numerically using data points and images. After compensation, the CIECAM02 ΔC between the 'Original data' and the compensated 'Test data' is improved by 51.69% compared to that before compensation. When the new model is applied to test images, there is a 32.03% improvement.
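For context, the sketch below shows the basic operation the compensation targets: scaling luma while holding Cb and Cr fixed and converting back to RGB with BT.601 full-range matrices. The paper's CIECAM02-based compensation model itself is not reproduced.

```python
# Sketch of the operation that causes the problem: scale luma Y while holding Cb and
# Cr fixed, then convert back to RGB (BT.601 full-range matrices). The paper's
# CIECAM02-based chroma-compensation model is not reproduced here.
import numpy as np

RGB2YCC = np.array([[ 0.299,     0.587,     0.114   ],
                    [-0.168736, -0.331264,  0.5     ],
                    [ 0.5,      -0.418688, -0.081312]])

def rgb_to_ycbcr(rgb):
    y, cb, cr = RGB2YCC @ rgb
    return np.array([y, cb + 128.0, cr + 128.0])

def ycbcr_to_rgb(ycc):
    y, cb, cr = ycc[0], ycc[1] - 128.0, ycc[2] - 128.0
    return np.linalg.solve(RGB2YCC, np.array([y, cb, cr]))

pixel = np.array([180.0, 120.0, 60.0])
ycc = rgb_to_ycbcr(pixel)
ycc[0] *= 0.8                              # luma scaled down, Cb/Cr left unchanged
print(ycbcr_to_rgb(ycc))                   # the perceived chroma of this RGB shifts
```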
Operator procedure verification with a rapidly reconfigurable simulator
NASA Technical Reports Server (NTRS)
Iwasaki, Yumi; Engelmore, Robert; Fehr, Gary; Fikes, Richard
1994-01-01
Generating and testing procedures for controlling spacecraft subsystems composed of electro-mechanical and computationally realized elements has become a very difficult task. Before a spacecraft can be flown, mission controllers must envision a great variety of situations the flight crew may encounter during a mission and carefully construct procedures for operating the spacecraft in each possible situation. If, despite extensive pre-compilation of control procedures, an unforeseen situation arises during a mission, the mission controller must generate a new procedure for the flight crew in a limited amount of time. In such situations, the mission controller cannot systematically consider and test alternative procedures against models of the system being controlled, because the available simulator is too large and complex to reconfigure, run, and analyze quickly. A rapidly reconfigurable simulation environment that can execute a control procedure and show its effects on system behavior would greatly facilitate generation and testing of control procedures both before and during a mission. The How Things Work project at Stanford University has developed a system called DME (Device Modeling Environment) for modeling and simulating the behavior of electromechanical devices. DME was designed to facilitate model formulation and behavior simulation of device behavior including both continuous and discrete phenomena. We are currently extending DME for use in testing operator procedures, and we have built a knowledge base for modeling the Reaction Control System (RCS) of the space shuttle as a testbed. We believe that DME can facilitate design of operator procedures by providing mission controllers with a simulation environment that meets all these requirements.
Identification of quasi-steady compressor characteristics from transient data
NASA Technical Reports Server (NTRS)
Nunes, K. B.; Rock, S. M.
1984-01-01
The principal goal was to demonstrate that nonlinear compressor map parameters, which govern an in-stall response, can be identified from test data using parameter identification techniques. The tasks included developing and then applying an identification procedure to data generated by NASA LeRC on a hybrid computer. Two levels of model detail were employed. First was a lumped compressor rig model; second was a simplified turbofan model. The main outputs are the tools and procedures generated to accomplish the identification.
NASA Technical Reports Server (NTRS)
Jones, Gregory; Balakrishna, Sundareswara; DeMoss, Joshua; Goodliff, Scott; Bailey, Matthew
2015-01-01
Pressure fluctuations have been measured over the course of several tests in the National Transonic Facility to study unsteady phenomena both with and without the influence of a model. Broadband spectral analysis is used to characterize the length scales of the tunnel. Special attention is given to the large-scale, low-frequency data that influence the Mach number and force and moment variability. This paper also discusses the significance of the vorticity and sound fields that can be related to the Common Research Model and highlights the comparisons to an empty-tunnel configuration. Vortex generators placed at the interface of the test section and wind tunnel diffuser showed promise in reducing the empty-tunnel unsteadiness; however, the vortex generators were ineffective in the presence of a model.
Predicate Argument Structure Analysis for Use Case Description Modeling
NASA Astrophysics Data System (ADS)
Takeuchi, Hironori; Nakamura, Taiga; Yamaguchi, Takahira
In a large software system development project, many documents are prepared and updated frequently. In such a situation, support is needed for looking through these documents easily to identify inconsistencies and to maintain traceability. In this research, we focus on the requirements documents such as use cases and consider how to create models from the use case descriptions in unformatted text. In the model construction, we propose a few semantic constraints based on the features of the use cases and use them for a predicate argument structure analysis to assign semantic labels to actors and actions. With this approach, we show that we can assign semantic labels without enhancing any existing general lexical resources such as case frame dictionaries and design a less language-dependent model construction architecture. By using the constructed model, we consider a system for quality analysis of the use cases and automated test case generation to keep the traceability between document sets. We evaluated the reuse of the existing use cases and generated test case steps automatically with the proposed prototype system from real-world use cases in the development of a system using a packaged application. Based on the evaluation, we show how to construct models with high precision from English and Japanese use case data. Also, we could generate good test cases for about 90% of the real use cases through the manual improvement of the descriptions based on the feedback from the quality analysis system.
Cesca, S.; Battaglia, J.; Dahm, T.; Tessmer, E.; Heimann, S.; Okubo, P.
2008-01-01
The main goal of this study is to improve the modelling of the source mechanism associated with the generation of long period (LP) signals in volcanic areas. Our intent is to evaluate the effects that detailed structural features of the volcanic models play in the generation of LP signal and the consequent retrieval of LP source characteristics. In particular, effects associated with the presence of topography and crustal heterogeneities are here studied in detail. We focus our study on a LP event observed at Kilauea volcano, Hawaii, in 2001 May. A detailed analysis of this event and its source modelling is accompanied by a set of synthetic tests, which aim to evaluate the effects of topography and the presence of low velocity shallow layers in the source region. The forward problem of Green's function generation is solved numerically following a pseudo-spectral approach, assuming different 3-D models. The inversion is done in the frequency domain and the resulting source mechanism is represented by the sum of two time-dependent terms: a full moment tensor and a single force. Synthetic tests show how characteristic velocity structures, associated with shallow sources, may be partially responsible for the generation of the observed long-lasting ringing waveforms. When applying the inversion technique to Kilauea LP data set, inversions carried out for different crustal models led to very similar source geometries, indicating a subhorizontal cracks. On the other hand, the source time function and its duration are significantly different for different models. These results support the indication of a strong influence of crustal layering on the generation of the LP signal, while the assumption of homogeneous velocity model may bring to misleading results. ?? 2008 The Authors Journal compilation ?? 2008 RAS.
D Modelling of AN Indoor Space Using a Rotating Stereo Frame Camera System
NASA Astrophysics Data System (ADS)
Kang, J.; Lee, I.
2016-06-01
Sophisticated indoor design and growing development in urban architecture make indoor spaces more complex. And the indoor spaces are easily connected to public transportations such as subway and train stations. These phenomena allow to transfer outdoor activities to the indoor spaces. Constant development of technology has a significant impact on people knowledge about services such as location awareness services in the indoor spaces. Thus, it is required to develop the low-cost system to create the 3D model of the indoor spaces for services based on the indoor models. In this paper, we thus introduce the rotating stereo frame camera system that has two cameras and generate the indoor 3D model using the system. First, select a test site and acquired images eight times during one day with different positions and heights of the system. Measurements were complemented by object control points obtained from a total station. As the data were obtained from the different positions and heights of the system, it was possible to make various combinations of data and choose several suitable combinations for input data. Next, we generated the 3D model of the test site using commercial software with previously chosen input data. The last part of the processes will be to evaluate the accuracy of the generated indoor model from selected input data. In summary, this paper introduces the low-cost system to acquire indoor spatial data and generate the 3D model using images acquired by the system. Through this experiments, we ensure that the introduced system is suitable for generating indoor spatial information. The proposed low-cost system will be applied to indoor services based on the indoor spatial information.
NASA Technical Reports Server (NTRS)
Kirkman, K. L.; Brown, C. E.; Goodman, A.
1973-01-01
The effectiveness of various candidate aircraft-wing devices for attenuation of trailing vortices generated by large aircraft is evaluated on basis of results of experiments conducted with a 0.03-scale model of a Boeing 747 transport aircraft using a technique developed at the HYDRONAUTICS Ship Model Basin. Emphasis is on the effects produced by these devices in the far-field (up to 8 kilometers downstream of full-scale generating aircraft) where the unaltered vortex-wakes could still be hazardous to small following aircraft. The evaluation is based primarily on quantitative measurements of the respective vortex velocity distributions made by means of hot-film probe traverses in a transverse plane at selected stations downstream. The effects of these altered wakes on rolling moment induced on a small following aircraft are also studied using a modified lifting-surface theory with a synthesized Gates Learjet as a typical example. Lift and drag measurements concurrently obtained in the model tests are used to appraise the effects of each device investigated on the performance characteristics of the generating aircraft.
Gray correlation analysis and prediction models of living refuse generation in Shanghai city.
Liu, Gousheng; Yu, Jianguo
2007-01-01
A better understanding of the factors that affect the generation of municipal living refuse (MLF) and the accurate prediction of its generation are crucial for municipal planning projects and city management. Up to now, most of the design efforts have been based on a rough prediction of MLF without any actual support. In this paper, based on published data of socioeconomic variables and MLF generation from 1990 to 2003 in the city of Shanghai, the main factors that affect MLF generation have been quantitatively studied using the method of gray correlation coefficient. Several gray models, such as GM(1,1), GIM(1), GPPM(1) and GLPM(1), have been studied, and predicted results are verified with subsequent residual test. Results show that, among the selected seven factors, consumption of gas, water and electricity are the largest three factors affecting MLF generation, and GLPM(1) is the optimized model to predict MLF generation. Through this model, the predicted MLF generation in 2010 in Shanghai will be 7.65 million tons. The methods and results developed in this paper can provide valuable information for MLF management and related municipal planning projects.
ERIC Educational Resources Information Center
ALTMANN, BERTHOLD; BROWN, WILLIAM G.
THE FIRST-GENERATION APPROACH BY CONCEPT (ABC) STORAGE AND RETRIEVAL METHOD, A METHOD WHICH UTILIZES AS A SUBJECT APPROACH APPROPRIATE STANDARDIZED ENGLISH-LANGUAGE STATEMENTS PROCESSED AND PRINTED IN A PERMUTED INDEX FORMAT, UNDERWENT A PERFORMANCE TEST, THE PRIMARY OBJECTIVE OF WHICH WAS TO SPOT DEFICIENCIES AND TO DEVELOP A SECOND-GENERATION…
Liu, Richard T.; Alloy, Lauren B.; Mastin, Becky M.; Choi, Jimmy Y.; Boland, Elaine M.; Jenkins, Abby L.
2014-01-01
Although there is substantial evidence documenting the stress generation effect in depression (i.e., the tendency for depression-prone individuals to experience higher rates of life stress to which they contribute), additional research is required to advance current understanding of the specific types of dependent stress (i.e., events influenced by characteristics and attendant behaviors of the individual) relevant to this effect. The present study tested an extension of the stress generation hypothesis, in which the content of dependent stress that is produced by depression-prone individuals is contingent upon, and matches, the nature of their particular vulnerabilities. This extension was tested within the context of two cognitive models (i.e., hopelessness theory [Abramson, Metalsky, & Alloy, 1989] and Cole’s [1990, 1991] competency-based model) and two interpersonal models (i.e., Swann’s [1987] self-verification theory and Coyne’s [1976] interpersonal theory) of depression. Overall, support was obtained for vulnerability-specific stress generation. Specifically, in analyses across vulnerability domains, evidence of stress-generation specificity was found for all domain-specific cognitive vulnerabilities except self-perceived social competence. The within-domain analyses for cognitive vulnerabilities produced more mixed results, but were largely supportive. Additionally, excessive reassurance-seeking was specifically predictive of dependent stress in the social domain, and moderated, but did not mediate, the relation between negative inferential styles overall and in the interpersonal domain and their corresponding generated stress. Finally, no evidence was found for a stress generation effect with negative feedback-seeking. PMID:24679143
Reduced order modeling of head related transfer functions for virtual acoustic displays
NASA Astrophysics Data System (ADS)
Willhite, Joel A.; Frampton, Kenneth D.; Grantham, D. Wesley
2003-04-01
The purpose of this work is to improve the computational efficiency in acoustic virtual applications by creating and testing reduced order models of the head related transfer functions used in localizing sound sources. State space models of varying order were generated from zero-elevation Head Related Impulse Responses (HRIRs) using Kungs Single Value Decomposition (SVD) technique. The inputs to the models are the desired azimuths of the virtual sound sources (from minus 90 deg to plus 90 deg, in 10 deg increments) and the outputs are the left and right ear impulse responses. Trials were conducted in an anechoic chamber in which subjects were exposed to real sounds that were emitted by individual speakers across a numbered speaker array, phantom sources generated from the original HRIRs, and phantom sound sources generated with the different reduced order state space models. The error in the perceived direction of the phantom sources generated from the reduced order models was compared to errors in localization using the original HRIRs.
Harrell-Williams, Leigh; Wolfe, Edward W
2014-01-01
Previous research has investigated the influence of sample size, model misspecification, test length, ability distribution offset, and generating model on the likelihood ratio difference test in applications of item response models. This study extended that research to the evaluation of dimensionality using the multidimensional random coefficients multinomial logit model (MRCMLM). Logistic regression analysis of simulated data reveal that sample size and test length have a large effect on the capacity of the LR difference test to correctly identify unidimensionality, with shorter tests and smaller sample sizes leading to smaller Type I error rates. Higher levels of simulated misfit resulted in fewer incorrect decisions than data with no or little misfit. However, Type I error rates indicate that the likelihood ratio difference test is not suitable under any of the simulated conditions for evaluating dimensionality in applications of the MRCMLM.
Full-Body Musculoskeletal Model for Muscle-Driven Simulation of Human Gait.
Rajagopal, Apoorva; Dembia, Christopher L; DeMers, Matthew S; Delp, Denny D; Hicks, Jennifer L; Delp, Scott L
2016-10-01
Musculoskeletal models provide a non-invasive means to study human movement and predict the effects of interventions on gait. Our goal was to create an open-source 3-D musculoskeletal model with high-fidelity representations of the lower limb musculature of healthy young individuals that can be used to generate accurate simulations of gait. Our model includes bony geometry for the full body, 37 degrees of freedom to define joint kinematics, Hill-type models of 80 muscle-tendon units actuating the lower limbs, and 17 ideal torque actuators driving the upper body. The model's musculotendon parameters are derived from previous anatomical measurements of 21 cadaver specimens and magnetic resonance images of 24 young healthy subjects. We tested the model by evaluating its computational time and accuracy of simulations of healthy walking and running. Generating muscle-driven simulations of normal walking and running took approximately 10 minutes on a typical desktop computer. The differences between our muscle-generated and inverse dynamics joint moments were within 3% (RMSE) of the peak inverse dynamics joint moments in both walking and running, and our simulated muscle activity showed qualitative agreement with salient features from experimental electromyography data. These results suggest that our model is suitable for generating muscle-driven simulations of healthy gait. We encourage other researchers to further validate and apply the model to study other motions of the lower extremity. The model is implemented in the open-source software platform OpenSim. The model and data used to create and test the simulations are freely available at https://simtk.org/home/full_body/, allowing others to reproduce these results and create their own simulations.
Test of a Nb thin film superconducting detector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lacquaniti, V.; Maggi, S.; Menichetti, E.
1993-08-01
Results from tests of several Nb thin film microstrip superconducting detectors are reported. A preliminary measurement of critical radius of the hot spot generated by 5 MeV [alpha]-particles is compared with simple model predictions.
Ba, Zhaoqing; Meng, Fei-Long; Gostissa, Monica; Huang, Pei-Yi; Ke, Qiang; Wang, Zhe; Dao, Mai N; Fujiwara, Yuko; Rajewsky, Klaus; Zhang, Baochun; Alt, Frederick W
2015-06-01
The Epstein-Barr virus (EBV) latent membrane protein 1 (LMP1) contributes to oncogenic human B-cell transformation. Mouse B cells conditionally expressing LMP1 are not predisposed to B-cell malignancies, as LMP1-expressing B cells are eliminated by T cells. However, mice with conditional B-cell LMP1 expression and genetic elimination of α/β and γ/δ T cells ("CLT" mice) die early in association with B-cell lymphoproliferation and lymphomagenesis. Generation of CLT mice involves in-breeding multiple independently segregating alleles. Thus, although introduction of additional activating or knockout mutations into the CLT model is desirable for further B-cell expansion and immunosurveillance studies, doing such experiments by germline breeding is time-consuming, expensive, and sometimes unfeasible. To generate a more tractable model, we generated clonal CLT embryonic stem (ES) cells from CLT embryos and injected them into RAG2-deficient blastocysts to generate chimeric mice, which, like germline CLT mice, harbor splenic CLT B cells and lack T cells. CLT chimeric mice generated by this RAG2-deficient blastocyst complementation ("RDBC") approach die rapidly in association with B-cell lymphoproliferation and lymphoma. Because CLT lymphomas routinely express the activation-induced cytidine deaminase (AID) antibody diversifier, we tested potential AID roles by eliminating the AID gene in CLT ES cells and testing them via RDBC. We found that CLT and AID-deficient CLT ES chimeras had indistinguishable phenotypes, showing that AID is not essential for LMP1-induced lymphomagenesis. Beyond expanding accessibility and utility of CLT mice as a cancer immunotherapy model, our studies provide a new approach for facilitating generation of genetically complex mouse cancer models. ©2015 American Association for Cancer Research.
Ba, Zhaoqing; Meng, Fei-Long; Gostissa, Monica; Huang, Pei-Yi; Ke, Qiang; Wang, Zhe; Dao, Mai N.; Fujiwara, Yuko; Rajewsky, Klaus; Baochun, Zhang; Alt, Frederick W.
2015-01-01
The Epstein-Barr virus (EBV) latent membrane protein 1 (LMP1) contributes to oncogenic human B-cell transformation. Mouse B cells conditionally expressing LMP1 are not predisposed to B-cell malignancies, as LMP1-expressing B cells are eliminated by T cells. However, mice with conditional B-cell LMP1 expression and genetic elimination of α/β and γ/δ T cells (“CLT” mice) die early in association with B-cell lymphoproliferation and lymphomagenesis. Generation of CLT mice involves in-breeding multiple independently segregating alleles. Thus, while introduction of additional activating or knock-out mutations into the CLT model is desirable for further B-cell expansion and immunosurveillance studies, doing such experiments by germline breeding is time-consuming, expensive and sometimes unfeasible. To generate a more tractable model, we generated clonal CLT ES cells from CLT embryos and injected them into RAG2-deficient blastocysts to generate chimeric mice, which like germline CLT mice harbor splenic CLT B cells and lack T cells. CLT chimeric mice generated by this RAG2-deficient blastocyst complementation (“RDBC”) approach die rapidly in association with B-cell lymphoproliferation and lymphoma. As CLT lymphomas routinely express the Activation-Induced Cytidine Deaminase (AID) antibody diversifier, we tested potential AID roles by eliminating the AID gene in CLT ES cells and testing them via RDBC. We found that CLT and AID-deficient CLT ES chimeras had indistinguishable phenotypes, showing that AID is not essential for LMP1-induced lymphomagenesis. Beyond expanding accessibility and utility of CLT mice as a cancer immunotherapy model, our studies provide a new approach for facilitating generation of genetically complex mouse cancer models. PMID:25934172
Development of a low cost test rig for standalone WECS subject to electrical faults.
Himani; Dahiya, Ratna
2016-11-01
In this paper, a contribution to the development of low-cost wind turbine (WT) test rig for stator fault diagnosis of wind turbine generator is proposed. The test rig is developed using a 2.5kW, 1750 RPM DC motor coupled to a 1.5kW, 1500 RPM self-excited induction generator interfaced with a WT mathematical model in LabVIEW. The performance of the test rig is benchmarked with already proven wind turbine test rigs. In order to detect the stator faults using non-stationary signals in self-excited induction generator, an online fault diagnostic technique of DWT-based multi-resolution analysis is proposed. It has been experimentally proven that for varying wind conditions wavelet decomposition allows good differentiation between faulty and healthy conditions leading to an effective diagnostic procedure for wind turbine condition monitoring. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
Structural dynamic testing of composite propfan blades for a cruise missile wind tunnel model
NASA Technical Reports Server (NTRS)
Elgin, Stephen D.; Sutliff, Thomas J.
1993-01-01
The Naval Weapons Center at China Lake, California is currently evaluating a counter rotating propfan system as a means of propulsion for the next generation of cruise missiles. The details and results of a structural dynamic test program are presented for scale model graphite-epoxy composite propfan blades. These blades are intended for use on a cruise missile wind tunnel model. Both dynamic characteristics and strain operating limits of the blades are presented. Complications associated with high strain level fatigue testing methods are also discussed.
Advances in Time Estimation Methods for Molecular Data.
Kumar, Sudhir; Hedges, S Blair
2016-04-01
Molecular dating has become central to placing a temporal dimension on the tree of life. Methods for estimating divergence times have been developed for over 50 years, beginning with the proposal of molecular clock in 1962. We categorize the chronological development of these methods into four generations based on the timing of their origin. In the first generation approaches (1960s-1980s), a strict molecular clock was assumed to date divergences. In the second generation approaches (1990s), the equality of evolutionary rates between species was first tested and then a strict molecular clock applied to estimate divergence times. The third generation approaches (since ∼2000) account for differences in evolutionary rates across the tree by using a statistical model, obviating the need to assume a clock or to test the equality of evolutionary rates among species. Bayesian methods in the third generation require a specific or uniform prior on the speciation-process and enable the inclusion of uncertainty in clock calibrations. The fourth generation approaches (since 2012) allow rates to vary from branch to branch, but do not need prior selection of a statistical model to describe the rate variation or the specification of speciation model. With high accuracy, comparable to Bayesian approaches, and speeds that are orders of magnitude faster, fourth generation methods are able to produce reliable timetrees of thousands of species using genome scale data. We found that early time estimates from second generation studies are similar to those of third and fourth generation studies, indicating that methodological advances have not fundamentally altered the timetree of life, but rather have facilitated time estimation by enabling the inclusion of more species. Nonetheless, we feel an urgent need for testing the accuracy and precision of third and fourth generation methods, including their robustness to misspecification of priors in the analysis of large phylogenies and data sets. © The Author(s) 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Quantum random number generation
Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu; ...
2016-06-28
Quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a highmore » speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness on the device and the random number generation speed.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferraioli, Luigi; Hueller, Mauro; Vitale, Stefano
The scientific objectives of the LISA Technology Package experiment on board of the LISA Pathfinder mission demand accurate calibration and validation of the data analysis tools in advance of the mission launch. The level of confidence required in the mission outcomes can be reached only by intensively testing the tools on synthetically generated data. A flexible procedure allowing the generation of a cross-correlated stationary noise time series was set up. A multichannel time series with the desired cross-correlation behavior can be generated once a model for a multichannel cross-spectral matrix is provided. The core of the procedure comprises a noisemore » coloring, multichannel filter designed via a frequency-by-frequency eigendecomposition of the model cross-spectral matrix and a subsequent fit in the Z domain. The common problem of initial transients in a filtered time series is solved with a proper initialization of the filter recursion equations. The noise generator performance was tested in a two-dimensional case study of the closed-loop LISA Technology Package dynamics along the two principal degrees of freedom.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garbett, K; Mendler, O J; Gardner, G C
In PWR steam generator tube rupture (SGTR) faults, a direct pathway for the release of radioactive fission products can exist if there is a coincident stuck-open safety relief valve (SORV) or if the safety relief valve is cycled. In addition to the release of fission products from the bulk steam generator water by moisture carryover, there exists the possibility that some primary coolant may be released without having first mixed with the bulk water - a process called primary coolant bypassing. The MB-2 Phase II test program was designed specifically to identify the processes for droplet carryover during SGTR faultsmore » and to provide data of sufficient accuracy for use in developing physical models and computer codes to describe activity release. The test program consisted of sixteen separate tests designed to cover a range of steady-state and transient fault conditions. These included a full SGTR/SORV transient simulation, two SGTR overfill tests, ten steady-state SGTR tests at water levels ranging from very low levels in the bundle up to those when the dryer was flooded, and three moisture carryover tests without SGTR. In these tests the influence of break location and the effect of bypassing the dryer were also studied. In a final test the behavior with respect to aerosol particles in a dry steam generator, appropriate to a severe accident fault, was investigated.« less
Software Quality Assurance and Verification for the MPACT Library Generation Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Yuxuan; Williams, Mark L.; Wiarda, Dorothea
This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX andmore » VERA/XSTools repositories are described. The methods and associated tests to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to a degree to (1) ensure that it can be run without user intervention and (2) to ensure that the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library’s release is described in detail.« less
Biomechanical comparison of the human cadaveric pelvis with a fourth generation composite model.
Girardi, Brandon L; Attia, Tarik; Backstein, David; Safir, Oleg; Willett, Thomas L; Kuzyk, Paul R T
2016-02-29
The use of cadavers for orthopaedic biomechanics research is well established, but presents difficulties to researchers in terms of cost, biosafety, availability, and ease of use. High fidelity composite models of human bone have been developed for use in biomechanical studies. While several studies have utilized composite models of the human pelvis for testing orthopaedic reconstruction techniques, few biomechanical comparisons of the properties of cadaveric and composite pelves exist. The aim of this study was to compare the mechanical properties of cadaveric pelves to those of the 4th generation composite model. An Instron ElectroPuls E10000 mechanical testing machine was used to load specimens with orientation, boundary conditions and degrees of freedom that approximated those occurring during the single legged phase of walking, including hip abductor force. Each specimen was instrumented with strain gauge rosettes. Overall specimen stiffness and principal strains were calculated from the test data. Composite specimens showed significantly higher overall stiffness and slightly less overall variability between specimens (composite K=1448±54N/m, cadaver K=832±62N/m; p<0.0001). Strains measured at specific sites in the composite models and cadavers were similar (but did differ) only when the applied load was scaled to overall construct stiffness. This finding regarding strain distribution and the difference in overall stiffness must be accounted for when using these composite models for biomechanics research. Altering the cortical wall thickness or tuning the elastic moduli of the composite material may improve future generations of the composite model. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Sevart, F. D.; Patel, S. M.; Wattman, W. J.
1972-01-01
Testing and evaluation of stability augmentation systems for aircraft flight control were conducted. The flutter suppression system analysis of a scale supersonic transport wing model is described. Mechanization of the flutter suppression system is reported. The ride control synthesis for the B-52 aeroelastic model is discussed. Model analyses were conducted using equations of motion generated from generalized mass and stiffness data.
High temperature fatigue behavior of Haynes 188
NASA Technical Reports Server (NTRS)
Halford, Gary R.; Saltsman, James F.; Kalluri, Sreeramesh
1988-01-01
The high temperature, creep-fatigue behavior of Haynes 188 was investigated as an element in a broader thermomechanical fatigue life prediction model development program at the NASA-Lewis. The models are still in the development stage, but the data that were generated possess intrinsic value on their own. Results generated to date is reported. Data were generated to characterize isothermal low cycle fatigue resistance at temperatures of 316, 704, and 927 C with cyclic failure lives ranging from 10 to more than 20,000. These results follow trends that would be predicted from a knowledge of tensile properties, i.e., as the tensile ductility varies with temperature, so varies the cyclic inelastic straining capacity. Likewise, as the tensile strength decreases, so does the high cyclic fatigue resistance. A few two-minute hold-time cycles at peak compressive strain were included in tests at 760 C. These results were obtained in support of a redesign effort for the Orbital Maneuverable System engine. No detrimental effects on cyclic life were noted despite the added exposure time for creep and oxidation. Finally, a series of simulated thermal fatigue tests, referred to as bithermal fatigue tests, were conducted using 316 C as the minimum and 760 C as the maximum temperature. Only out-of-phase bithermal tests were conducted to date. These test results are intended for use as input to a more general thermomechanical fatigue life prediction model based on the concepts of the total strain version of Strainrange Partitioning.
NASA Astrophysics Data System (ADS)
Rai, Aakash C.; Lin, Chao-Hsin; Chen, Qingyan
2015-02-01
Ozone-terpene reactions are important sources of indoor ultrafine particles (UFPs), a potential health hazard for human beings. Humans themselves act as possible sites for ozone-initiated particle generation through reactions with squalene (a terpene) that is present in their skin, hair, and clothing. This investigation developed a numerical model to probe particle generation from ozone reactions with clothing worn by humans. The model was based on particle generation measured in an environmental chamber as well as physical formulations of particle nucleation, condensational growth, and deposition. In five out of the six test cases, the model was able to predict particle size distributions reasonably well. The failure in the remaining case demonstrated the fundamental limitations of nucleation models. The model that was developed was used to predict particle generation under various building and airliner cabin conditions. These predictions indicate that ozone reactions with human-worn clothing could be an important source of UFPs in densely occupied classrooms and airliner cabins. Those reactions could account for about 40% of the total UFPs measured on a Boeing 737-700 flight. The model predictions at this stage are indicative and should be improved further.
Parametric Testing of Launch Vehicle FDDR Models
NASA Technical Reports Server (NTRS)
Schumann, Johann; Bajwa, Anupa; Berg, Peter; Thirumalainambi, Rajkumar
2011-01-01
For the safe operation of a complex system like a (manned) launch vehicle, real-time information about the state of the system and potential faults is extremely important. The on-board FDDR (Failure Detection, Diagnostics, and Response) system is a software system to detect and identify failures, provide real-time diagnostics, and to initiate fault recovery and mitigation. The ERIS (Evaluation of Rocket Integrated Subsystems) failure simulation is a unified Matlab/Simulink model of the Ares I Launch Vehicle with modular, hierarchical subsystems and components. With this model, the nominal flight performance characteristics can be studied. Additionally, failures can be injected to see their effects on vehicle state and on vehicle behavior. A comprehensive test and analysis of such a complicated model is virtually impossible. In this paper, we will describe, how parametric testing (PT) can be used to support testing and analysis of the ERIS failure simulation. PT uses a combination of Monte Carlo techniques with n-factor combinatorial exploration to generate a small, yet comprehensive set of parameters for the test runs. For the analysis of the high-dimensional simulation data, we are using multivariate clustering to automatically find structure in this high-dimensional data space. Our tools can generate detailed HTML reports that facilitate the analysis.
Prediction of Acoustic Loads Generated by Propulsion Systems
NASA Technical Reports Server (NTRS)
Perez, Linamaria; Allgood, Daniel C.
2011-01-01
NASA Stennis Space Center is one of the nation's premier facilities for conducting large-scale rocket engine testing. As liquid rocket engines vary in size, so do the acoustic loads that they produce. When these acoustic loads reach very high levels they may cause damages both to humans and to actual structures surrounding the testing area. To prevent these damages, prediction tools are used to estimate the spectral content and levels of the acoustics being generated by the rocket engine plumes and model their propagation through the surrounding atmosphere. Prior to the current work, two different acoustic prediction tools were being implemented at Stennis Space Center, each having their own advantages and disadvantages depending on the application. Therefore, a new prediction tool was created, using NASA SP-8072 handbook as a guide, which would replicate the same prediction methods as the previous codes, but eliminate any of the drawbacks the individual codes had. Aside from replicating the previous modeling capability in a single framework, additional modeling functions were added thereby expanding the current modeling capability. To verify that the new code could reproduce the same predictions as the previous codes, two verification test cases were defined. These verification test cases also served as validation cases as the predicted results were compared to actual test data.
Estimation of the Regression Effect Using a Latent Trait Model.
ERIC Educational Resources Information Center
Quinn, Jimmy L.
A logistic model was used to generate data to serve as a proxy for an immediate retest from item responses to a fourth grade standardized reading comprehension test of 45 items. Assuming that the actual test may be considered a pretest and the proxy data may be considered a retest, the effect of regression was investigated using a percentage of…
Ultra High Bypass Integrated System Test
2015-09-14
NASA’s Environmentally Responsible Aviation Project, in collaboration with the Federal Aviation Administration (FAA) and Pratt & Whitney, completed testing of an Ultra High Bypass Ratio Turbofan Model in the 9’ x 15’ Low Speed Wind Tunnel at NASA Glenn Research Center. The fan model is representative of the next generation of efficient and quiet Ultra High Bypass Ratio Turbofan Engine designs.
ERIC Educational Resources Information Center
Pavlik, Philip I. Jr.; Cen, Hao; Koedinger, Kenneth R.
2009-01-01
This paper describes a novel method to create a quantitative model of an educational content domain of related practice item-types using learning curves. By using a pairwise test to search for the relationships between learning curves for these item-types, we show how the test results in a set of pairwise transfer relationships that can be…
Selective interference with image retention and generation: evidence for the workspace model.
van der Meulen, Marian; Logie, Robert H; Della Sala, Sergio
2009-08-01
We address three types of model of the relationship between working memory (WM) and long-term memory (LTM): (a) the gateway model, in which WM acts as a gateway between perceptual input and LTM; (b) the unitary model, in which WM is seen as the currently activated areas of LTM; and (c) the workspace model, in which perceptual input activates LTM, and WM acts as a separate workspace for processing and temporary retention of these activated traces. Predictions of these models were tested, focusing on visuospatial working memory and using dual-task methodology to combine two main tasks (visual short-term retention and image generation) with two interference tasks (irrelevant pictures and spatial tapping). The pictures selectively disrupted performance on the generation task, whereas the tapping selectively interfered with the retention task. Results are consistent with the predictions of the workspace model.
Kinetic Modeling of Accelerated Stability Testing Enabled by Second Harmonic Generation Microscopy.
Song, Zhengtian; Sarkar, Sreya; Vogt, Andrew D; Danzer, Gerald D; Smith, Casey J; Gualtieri, Ellen J; Simpson, Garth J
2018-04-03
The low limits of detection afforded by second harmonic generation (SHG) microscopy coupled with image analysis algorithms enabled quantitative modeling of the temperature-dependent crystallization of active pharmaceutical ingredients (APIs) within amorphous solid dispersions (ASDs). ASDs, in which an API is maintained in an amorphous state within a polymer matrix, are finding increasing use to address solubility limitations of small-molecule APIs. Extensive stability testing is typically performed for ASD characterization, the time frame for which is often dictated by the earliest detectable onset of crystal formation. Here a study of accelerated stability testing on ritonavir, a human immunodeficiency virus (HIV) protease inhibitor, has been conducted. Under the condition for accelerated stability testing at 50 °C/75%RH and 40 °C/75%RH, ritonavir crystallization kinetics from amorphous solid dispersions were monitored by SHG microscopy. SHG microscopy coupled by image analysis yielded limits of detection for ritonavir crystals as low as 10 ppm, which is about 2 orders of magnitude lower than other methods currently available for crystallinity detection in ASDs. The four decade dynamic range of SHG microscopy enabled quantitative modeling with an established (JMAK) kinetic model. From the SHG images, nucleation and crystal growth rates were independently determined.
Acoustic Treatment Design Scaling Methods. Volume 1; Overview, Results, and Recommendations
NASA Technical Reports Server (NTRS)
Kraft, R. E.; Yu, J.
1999-01-01
Scale model fan rigs that simulate new generation ultra-high-bypass engines at about 1/5-scale are achieving increased importance as development vehicles for the design of low-noise aircraft engines. Testing at small scale allows the tests to be performed in existing anechoic wind tunnels, which provides an accurate simulation of the important effects of aircraft forward motion on the noise generation. The ability to design, build, and test miniaturized acoustic treatment panels on scale model fan rigs representative of the fullscale engine provides not only a cost-savings, but an opportunity to optimize the treatment by allowing tests of different designs. The primary objective of this study was to develop methods that will allow scale model fan rigs to be successfully used as acoustic treatment design tools. The study focuses on finding methods to extend the upper limit of the frequency range of impedance prediction models and acoustic impedance measurement methods for subscale treatment liner designs, and confirm the predictions by correlation with measured data. This phase of the program had as a goal doubling the upper limit of impedance measurement from 6 kHz to 12 kHz. The program utilizes combined analytical and experimental methods to achieve the objectives.
NASA Astrophysics Data System (ADS)
Darwiche, Mahmoud Khalil M.
The research presented herein is a contribution to the understanding of the numerical modeling of fully nonlinear, transient water waves. The first part of the work involves the development of a time-domain model for the numerical generation of fully nonlinear, transient waves by a piston type wavemaker in a three-dimensional, finite, rectangular tank. A time-domain boundary-integral model is developed for simulating the evolving fluid field. A robust nonsingular, adaptive integration technique for the assembly of the boundary-integral coefficient matrix is developed and tested. A parametric finite-difference technique for calculating the fluid- particle kinematics is also developed and tested. A novel compatibility and continuity condition is implemented to minimize the effect of the singularities that are inherent at the intersections of the various Dirichlet and/or Neumann subsurfaces. Results are presented which demonstrate the accuracy and convergence of the numerical model. The second portion of the work is a study of the interaction of the numerically-generated, fully nonlinear, transient waves with a bottom-mounted, surface-piercing, vertical, circular cylinder. The numerical model developed in the first part of this dissertation is extended to include the presence of the cylinder at the centerline of the basin. The diffraction of the numerically generated waves by the cylinder is simulated, and the particle kinematics of the diffracted flow field are calculated and reported. Again, numerical results showing the accuracy and convergence of the extended model are presented.
Full body musculoskeletal model for muscle-driven simulation of human gait
Rajagopal, Apoorva; Dembia, Christopher L.; DeMers, Matthew S.; Delp, Denny D.; Hicks, Jennifer L.; Delp, Scott L.
2017-01-01
Objective Musculoskeletal models provide a non-invasive means to study human movement and predict the effects of interventions on gait. Our goal was to create an open-source, three-dimensional musculoskeletal model with high-fidelity representations of the lower limb musculature of healthy young individuals that can be used to generate accurate simulations of gait. Methods Our model includes bony geometry for the full body, 37 degrees of freedom to define joint kinematics, Hill-type models of 80 muscle-tendon units actuating the lower limbs, and 17 ideal torque actuators driving the upper body. The model’s musculotendon parameters are derived from previous anatomical measurements of 21 cadaver specimens and magnetic resonance images of 24 young healthy subjects. We tested the model by evaluating its computational time and accuracy of simulations of healthy walking and running. Results Generating muscle-driven simulations of normal walking and running took approximately 10 minutes on a typical desktop computer. The differences between our muscle-generated and inverse dynamics joint moments were within 3% (RMSE) of the peak inverse dynamics joint moments in both walking and running, and our simulated muscle activity showed qualitative agreement with salient features from experimental electromyography data. Conclusion These results suggest that our model is suitable for generating muscle-driven simulations of healthy gait. We encourage other researchers to further validate and apply the model to study other motions of the lower-extremity. Significance The model is implemented in the open source software platform OpenSim. The model and data used to create and test the simulations are freely available at https://simtk.org/home/full_body/, allowing others to reproduce these results and create their own simulations. PMID:27392337
Generation of Simulated Tracking Data for LADEE Operational Readiness Testing
NASA Technical Reports Server (NTRS)
Woodburn, James; Policastri, Lisa; Owens, Brandon
2015-01-01
Operational Readiness Tests were an important part of the pre-launch preparation for the LADEE mission. The generation of simulated tracking data to stress the Flight Dynamics System and the Flight Dynamics Team was important for satisfying the testing goal of demonstrating that the software and the team were ready to fly the operational mission. The simulated tracking was generated in a manner to incorporate the effects of errors in the baseline dynamical model, errors in maneuver execution and phenomenology associated with various tracking system based components. The ability of the mission team to overcome these challenges in a realistic flight dynamics scenario indicated that the team and flight dynamics system were ready to fly the LADEE mission. Lunar Atmosphere and Dust Environment.
One Dimensional Turing-Like Handshake Test for Motor Intelligence
Karniel, Amir; Avraham, Guy; Peles, Bat-Chen; Levy-Tzedek, Shelly; Nisky, Ilana
2010-01-01
In the Turing test, a computer model is deemed to "think intelligently" if it can generate answers that are not distinguishable from those of a human. However, this test is limited to the linguistic aspects of machine intelligence. A salient function of the brain is the control of movement, and the movement of the human hand is a sophisticated demonstration of this function. Therefore, we propose a Turing-like handshake test, for machine motor intelligence. We administer the test through a telerobotic system in which the interrogator is engaged in a task of holding a robotic stylus and interacting with another party (human or artificial). Instead of asking the interrogator whether the other party is a person or a computer program, we employ a two-alternative forced choice method and ask which of two systems is more human-like. We extract a quantitative grade for each model according to its resemblance to the human handshake motion and name it "Model Human-Likeness Grade" (MHLG). We present three methods to estimate the MHLG. (i) By calculating the proportion of subjects' answers that the model is more human-like than the human; (ii) By comparing two weighted sums of human and model handshakes we fit a psychometric curve and extract the point of subjective equality (PSE); (iii) By comparing a given model with a weighted sum of human and random signal, we fit a psychometric curve to the answers of the interrogator and extract the PSE for the weight of the human in the weighted sum. Altogether, we provide a protocol to test computational models of the human handshake. We believe that building a model is a necessary step in understanding any phenomenon and, in this case, in understanding the neural mechanisms responsible for the generation of the human handshake. PMID:21206462
Flow quality studies of the NASA Lewis Research Center Icing Research Tunnel diffuser
NASA Technical Reports Server (NTRS)
Arrington, E. Allen; Pickett, Mark T.; Sheldon, David W.
1994-01-01
The purpose was to document the airflow characteristics in the diffuser of the NASA Lewis Research Center Icing Research Tunnel and to determine the effects of vortex generators on the flow quality in the diffuser. The results were used to determine how to improve the flow in this portion of the tunnel so that it can be more effectively used as an icing test section and such that overall tunnel efficiency can be improved. The demand for tunnel test time and the desire to test models that are too large for the test section were two of the drivers behind this diffuser study. For all vortex generator configurations tested, the flow quality was improved.
Postbuckling and Growth of Delaminations in Composite Plates Subjected to Axial Compression
NASA Technical Reports Server (NTRS)
Reeder, James R.; Chunchu, Prasad B.; Song, Kyongchan; Ambur, Damodar R.
2002-01-01
The postbuckling response and growth of circular delaminations in flat and curved plates are investigated as part of a study to identify the criticality of delamination locations through the laminate thickness. The experimental results from tests on delaminated plates are compared with finite element analysis results generated using shell models. The analytical prediction of delamination growth is obtained by assessing the strain energy release rate results from the finite element model and comparing them to a mixed-mode fracture toughness failure criterion. The analytical results for onset of delamination growth compare well with experimental results generated using a 3-dimensional displacement visualization system. The record of delamination progression measured in this study has resulted in a fully 3-dimensional test case with which progressive failure models can be validated.
Helicopter noise research at the Langley V/STOL tunnel
NASA Technical Reports Server (NTRS)
Hoad, D. R.; Green, G. C.
1978-01-01
The noise generated from a 1/4-scale AH-1G helicopter configuration was investigated in the Langley V/STOL tunnel. Microphones were installed in positions scaled to those for which flight test data were available. Model and tunnel conditions were carefully set to properly scaled flight conditions. Data presented indicate a high degree of similarity between model and flight test results. It was found that the pressure time history waveforms are very much alike in shape and amplitude. Blade slap when it occurred seemed to be generated in about the same location in the rotor disk as on the flight vehicle. If model and tunnel conditions were properly matched, including inflow turbulence characteristics, the intensity of the blade-slap impulse seemed to correlate well with flight.
Quantum random number generation for loophole-free Bell tests
NASA Astrophysics Data System (ADS)
Mitchell, Morgan; Abellan, Carlos; Amaya, Waldimar
2015-05-01
We describe the generation of quantum random numbers at multi-Gbps rates, combined with real-time randomness extraction, to give very high purity random numbers based on quantum events at most tens of ns in the past. The system satisfies the stringent requirements of quantum non-locality tests that aim to close the timing loophole. We describe the generation mechanism using spontaneous-emission-driven phase diffusion in a semiconductor laser, digitization, and extraction by parity calculation using multi-GHz logic chips. We pay special attention to experimental proof of the quality of the random numbers and analysis of the randomness extraction. In contrast to widely-used models of randomness generators in the computer science literature, we argue that randomness generation by spontaneous emission can be extracted from a single source.
Mathematical model of snake-type multi-directional wave generation
NASA Astrophysics Data System (ADS)
Muarif; Halfiani, Vera; Rusdiana, Siti; Munzir, Said; Ramli, Marwan
2018-01-01
Research on extreme wave generation is one intensive research on water wave study because the fact that the occurrence of this wave in the ocean can cause serious damage to the ships and offshore structures. One method to be used to generate the wave is self-correcting. This method controls the signal on the wavemakers in a wave tank. Some studies also consider the nonlinear wave generation in a wave tank by using numerical approach. Study on wave generation is essential in the effectiveness and efficiency of offshore structure model testing before it can be operated in the ocean. Generally, there are two types of wavemakers implemented in the hydrodynamic laboratory, piston-type and flap-type. The flap-type is preferred to conduct a testing to a ship in deep water. Single flap wavemaker has been explained in many studies yet snake-type wavemaker (has more than one flap) is still a case needed to be examined. Hence, the formulation in controlling the wavemaker need to be precisely analyzed such that the given input can generate the desired wave in the space-limited wave tank. By applying the same analogy and methodhology as the previous study, this article represents multi-directional wave generation by implementing snake-type wavemakers.
Modeling and Simulation of the Second-Generation Orion Crew Module Air Bag Landing System
NASA Technical Reports Server (NTRS)
Timmers, Richard B.; Hardy, Robin C.; Willey, Cliff E.; Welch, Joseph V.
2009-01-01
Air bags were evaluated as the landing attenuation system for earth landing of the Orion Crew Module (CM). Analysis conducted to date shows that airbags are capable of providing a graceful landing of the CM in nominal and off-nominal conditions such as parachute failure, high horizontal winds, and unfavorable vehicle/ground angle combinations, while meeting crew and vehicle safety requirements. The analyses and associated testing presented here surround a second generation of the airbag design developed by ILC Dover, building off of relevant first-generation design, analysis, and testing efforts. In order to fully evaluate the second generation air bag design and correlate the dynamic simulations, a series of drop tests were carried out at NASA Langley s Landing and Impact Research (LandIR) facility in Hampton, Virginia. The tests consisted of a full-scale set of air bags attached to a full-scale test article representing the Orion Crew Module. The techniques used to collect experimental data, develop the simulations, and make comparisons to experimental data are discussed.
Ghose, Soumya; Greer, Peter B; Sun, Jidi; Pichler, Peter; Rivest-Henault, David; Mitra, Jhimli; Richardson, Haylea; Wratten, Chris; Martin, Jarad; Arm, Jameen; Best, Leah; Dowling, Jason A
2017-10-27
In MR only radiation therapy planning, generation of the tissue specific HU map directly from the MRI would eliminate the need of CT image acquisition and may improve radiation therapy planning. The aim of this work is to generate and validate substitute CT (sCT) scans generated from standard T2 weighted MR pelvic scans in prostate radiation therapy dose planning. A Siemens Skyra 3T MRI scanner with laser bridge, flat couch and pelvic coil mounts was used to scan 39 patients scheduled for external beam radiation therapy for localized prostate cancer. For sCT generation a whole pelvis MRI (1.6 mm 3D isotropic T2w SPACE sequence) was acquired. Patients received a routine planning CT scan. Co-registered whole pelvis CT and T2w MRI pairs were used as training images. Advanced tissue specific non-linear regression models to predict HU for the fat, muscle, bladder and air were created from co-registered CT-MRI image pairs. On a test case T2w MRI, the bones and bladder were automatically segmented using a novel statistical shape and appearance model, while other soft tissues were separated using an Expectation-Maximization based clustering model. The CT bone in the training database that was most 'similar' to the segmented bone was then transformed with deformable registration to create the sCT component of the test case T2w MRI bone tissue. Predictions for the bone, air and soft tissue from the separate regression models were successively combined to generate a whole pelvis sCT. The change in monitor units between the sCT-based plans relative to the gold standard CT plan for the same IMRT dose plan was found to be [Formula: see text] (mean ± standard deviation) for 39 patients. The 3D Gamma pass rate was [Formula: see text] (2 mm/2%). The novel hybrid model is computationally efficient, generating an sCT in 20 min from standard T2w images for prostate cancer radiation therapy dose planning and DRR generation.
NASA Astrophysics Data System (ADS)
Ghose, Soumya; Greer, Peter B.; Sun, Jidi; Pichler, Peter; Rivest-Henault, David; Mitra, Jhimli; Richardson, Haylea; Wratten, Chris; Martin, Jarad; Arm, Jameen; Best, Leah; Dowling, Jason A.
2017-11-01
In MR only radiation therapy planning, generation of the tissue specific HU map directly from the MRI would eliminate the need of CT image acquisition and may improve radiation therapy planning. The aim of this work is to generate and validate substitute CT (sCT) scans generated from standard T2 weighted MR pelvic scans in prostate radiation therapy dose planning. A Siemens Skyra 3T MRI scanner with laser bridge, flat couch and pelvic coil mounts was used to scan 39 patients scheduled for external beam radiation therapy for localized prostate cancer. For sCT generation a whole pelvis MRI (1.6 mm 3D isotropic T2w SPACE sequence) was acquired. Patients received a routine planning CT scan. Co-registered whole pelvis CT and T2w MRI pairs were used as training images. Advanced tissue specific non-linear regression models to predict HU for the fat, muscle, bladder and air were created from co-registered CT-MRI image pairs. On a test case T2w MRI, the bones and bladder were automatically segmented using a novel statistical shape and appearance model, while other soft tissues were separated using an Expectation-Maximization based clustering model. The CT bone in the training database that was most ‘similar’ to the segmented bone was then transformed with deformable registration to create the sCT component of the test case T2w MRI bone tissue. Predictions for the bone, air and soft tissue from the separate regression models were successively combined to generate a whole pelvis sCT. The change in monitor units between the sCT-based plans relative to the gold standard CT plan for the same IMRT dose plan was found to be 0.3%+/-0.9% (mean ± standard deviation) for 39 patients. The 3D Gamma pass rate was 99.8+/-0.00 (2 mm/2%). The novel hybrid model is computationally efficient, generating an sCT in 20 min from standard T2w images for prostate cancer radiation therapy dose planning and DRR generation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pitarka, A.
In this project we developed GEN_SRF4, a computer program for generating kinematic rupture models, compatible with the SRF format, using the Irikura and Miyake (2011) asperity-based earthquake rupture model (IM2011, hereafter). IM2011, also known as Irikura's recipe, has been widely used to model and simulate ground motion from earthquakes in Japan. An essential part of the method is its kinematic rupture generation technique, which is based on a deterministic rupture asperity modeling approach. The simplicity of the source model and the efficiency of IM2011 at reproducing ground motion from earthquakes recorded in Japan make it attractive to developers and users of the Southern California Earthquake Center Broadband Platform (SCEC BB platform). Besides writing the code, the objective of our study was to test the transportability of IM2011 to broadband simulation methods used by the SCEC BB platform. Here we test it using the Graves and Pitarka (2010) method, implemented in the platform. We performed broadband (0.1-10 Hz) ground motion simulations for a M6.7 scenario earthquake using rupture models produced with both GEN_SRF4 and the rupture generator of Graves and Pitarka (2016) (GP2016 hereafter). In the simulations we used the same Green's functions and the same approaches for calculating the low-frequency and high-frequency parts of ground motion, respectively.
Li, Qiuying; Pham, Hoang
2017-01-01
In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency combined with testing coverage information, based on a nonhomogeneous Poisson process (NHPP). During the past four decades, many software reliability growth models (SRGMs) based on NHPP have been proposed to estimate software reliability measures, most of which share the following assumptions: 1) it is a common phenomenon that the fault detection rate changes during the testing phase; 2) as a result of imperfect debugging, fault removal is accompanied by a fault re-introduction rate. However, few SRGMs in the literature differentiate between fault detection and fault removal, i.e. they seldom consider imperfect fault removal efficiency. In practical software development, fault removal efficiency cannot always be perfect, i.e. the failures detected might not be removed completely, the original faults might still exist, and new faults might be introduced in the meantime, which is referred to as the imperfect debugging phenomenon. In this study, a model aiming to incorporate the fault introduction rate, fault removal efficiency and testing coverage into software reliability evaluation is developed, using testing coverage to express the fault detection rate and using fault removal efficiency to account for fault repair. We compare the performance of the proposed model with several existing NHPP SRGMs using three sets of real failure data based on five criteria. The results show that the model gives a better fit and better predictive performance.
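For readers unfamiliar with the NHPP model class discussed above, the following minimal sketch fits a classical Goel-Okumoto mean value function to hypothetical cumulative fault counts. It illustrates the model family only, not the authors' proposed model (which additionally incorporates testing coverage, fault introduction and imperfect removal efficiency), and all data and starting values are invented.

```python
# Minimal NHPP reliability-growth sketch: fit the Goel-Okumoto mean value
# function m(t) = a * (1 - exp(-b t)) to cumulative fault counts.
import numpy as np
from scipy.optimize import curve_fit

def mean_value(t, a, b):
    """Expected cumulative number of faults detected by time t."""
    return a * (1.0 - np.exp(-b * t))

# Hypothetical cumulative fault counts observed at weekly test intervals.
weeks = np.arange(1, 11, dtype=float)
faults = np.array([8, 15, 21, 26, 30, 33, 35, 37, 38, 39], dtype=float)

(a_hat, b_hat), _ = curve_fit(mean_value, weeks, faults, p0=(50.0, 0.1))
print(f"estimated total fault content a = {a_hat:.1f}, detection rate b = {b_hat:.3f}")
print("predicted cumulative faults by week 15:", round(mean_value(15.0, a_hat, b_hat), 1))
```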
DOE Office of Scientific and Technical Information (OSTI.GOV)
K. Payette; D. Tillman
During the period July 1, 2001--September 30, 2001, Allegheny Energy Supply Co., LLC (Allegheny) continued construction of the Willow Island cofiring project and completed the installation of the fuel storage facility, the fuel receiving facility, and the processing building. All mechanical equipment has been installed and electrical construction has proceeded. During this time period significant short-term testing of the Albright Generating Station cofiring facility was completed, and the 100-hour test was planned for early October. The testing demonstrated that cofiring at the Albright Generating Station could contribute to a "4P Strategy": reduction of SO2, NOx, mercury, and greenhouse gas emissions. This report summarizes the activities associated with the Designer Opportunity Fuel program, and demonstrations at Willow Island and Albright Generating Stations. It details the construction activities at both sites along with the combustion modeling at the Willow Island site.
Development of dynamic Bayesian models for web application test management
NASA Astrophysics Data System (ADS)
Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.
2018-03-01
The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool that can be used to model complex stochastic dynamic processes. According to the results of the research, mathematical models and methods of dynamic Bayesian networks provide a high coverage of stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connection between individual test assets for multiple time slices. This approach gives an opportunity to present testing as a discrete process with set structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine in one management area individual units and testing components with different functionalities and a direct influence on each other in the process of comprehensive testing of various groups of computer bugs. The application of the proposed models provides an opportunity to use a consistent approach to formalize test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
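As a minimal illustration of the filtering idea behind such dynamic Bayesian models, the sketch below tracks the probability that a software component contains a defect across successive test cycles, using a hand-rolled two-state transition model and noisy test observations. The structure and all probabilities are hypothetical and far simpler than the multi-node test-management networks described above.

```python
# Two-slice dynamic Bayesian filtering sketch: hidden "defect present" state,
# noisy test-failure observation per cycle. Hand-rolled with numpy.
import numpy as np

# P(state_t | state_{t-1}); states: 0 = no defect, 1 = defect
transition = np.array([[0.9, 0.1],
                       [0.2, 0.8]])
# P(test fails | state); tests occasionally miss defects or raise false alarms
p_fail_given_state = np.array([0.05, 0.7])

belief = np.array([0.5, 0.5])              # prior over the hidden state
observations = [True, True, False, True]   # hypothetical test outcomes per cycle

for cycle, failed in enumerate(observations, start=1):
    belief = transition.T @ belief                               # predict the next slice
    likelihood = p_fail_given_state if failed else 1.0 - p_fail_given_state
    belief = likelihood * belief                                 # incorporate the evidence
    belief /= belief.sum()                                       # normalize
    print(f"cycle {cycle}: P(defect) = {belief[1]:.3f}")
```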
Hot-bench simulation of the active flexible wing wind-tunnel model
NASA Technical Reports Server (NTRS)
Buttrill, Carey S.; Houck, Jacob A.
1990-01-01
Two simulations, one batch and one real-time, of an aeroelastically-scaled wind-tunnel model were developed. The wind-tunnel model was a full-span, free-to-roll model of an advanced fighter concept. The batch simulation was used to generate and verify the real-time simulation and to test candidate control laws prior to implementation. The real-time simulation supported hot-bench testing of a digital controller, which was developed to actively control the elastic deformation of the wind-tunnel model. Time scaling was required for hot-bench testing. The wind-tunnel model, the mathematical models for the simulations, the techniques employed to reduce the hot-bench time-scale factors, and the verification procedures are described.
The development of comparative bias index
NASA Astrophysics Data System (ADS)
Aimran, Ahmad Nazim; Ahmad, Sabri; Afthanorhan, Asyraf; Awang, Zainudin
2017-08-01
Structural Equation Modeling (SEM) is a second-generation statistical analysis technique developed for analyzing the inter-relationships among multiple variables in a model simultaneously. The two most commonly used methods in SEM are Covariance-Based Structural Equation Modeling (CB-SEM) and Partial Least Squares Path Modeling (PLS-PM). There have been continuous debates among researchers on the use of PLS-PM over CB-SEM, yet few studies have been conducted to test the bias of CB-SEM and PLS-PM in estimating simulated data. This study intends to address this problem by (a) developing the Comparative Bias Index and (b) testing the performance of CB-SEM and PLS-PM using the developed index. Based on a balanced experimental design, two multivariate normal simulated data sets with distinct specifications of sizes 50, 100, 200 and 500 are generated and analyzed using CB-SEM and PLS-PM.
Model Checking Artificial Intelligence Based Planners: Even the Best Laid Plans Must Be Verified
NASA Technical Reports Server (NTRS)
Smith, Margaret H.; Holzmann, Gerard J.; Cucullu, Gordon C., III; Smith, Benjamin D.
2005-01-01
Automated planning systems (APS) are gaining acceptance for use on NASA missions, as evidenced by APS flown on missions such as Orbiter and Deep Space 1, both of which were commanded by onboard planning systems. The planning system takes high-level goals and expands them onboard into a detailed plan of action that the spacecraft executes. The system must be verified to ensure that the automatically generated plans achieve the goals as expected and do not generate actions that would harm the spacecraft or mission. These systems are typically tested using empirical methods. Formal methods, such as model checking, offer exhaustive or measurable test coverage, which leads to much greater confidence in correctness. This paper describes a formal method based on the SPIN model checker. This method guarantees that possible plans meet certain desirable properties. We express the input model in Promela, the language of SPIN, and express the properties of desirable plans formally.
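The exhaustive-coverage idea can be conveyed with a toy explicit-state search: the Python sketch below enumerates every reachable state of an invented planner-controlled spacecraft model and asserts a safety property in each of them. This is only an illustration of the principle; the work described above performs such verification with the SPIN model checker on Promela models, and the states, actions and property here are hypothetical.

```python
# Toy explicit-state exploration: enumerate all reachable states of a tiny
# spacecraft model and assert a safety property in every one of them.
from collections import deque

# Hypothetical state: (battery_level, instrument_on).
def successors(state):
    battery, instrument_on = state
    nxt = []
    if battery > 1:
        nxt.append((battery - 1, True))    # switch instrument on only with a power margin
    if instrument_on:
        nxt.append((battery, False))       # switch instrument off
    if not instrument_on and battery < 3:
        nxt.append((battery + 1, False))   # recharge only while idle
    return nxt

def safe(state):
    battery, instrument_on = state
    return not (instrument_on and battery == 0)   # never run the instrument on an empty battery

initial = (3, False)
seen, frontier = {initial}, deque([initial])
while frontier:                                   # breadth-first exploration
    s = frontier.popleft()
    assert safe(s), f"safety violation reachable: {s}"
    for n in successors(s):
        if n not in seen:
            seen.add(n)
            frontier.append(n)
print(f"explored {len(seen)} reachable states; the safety property holds in all of them")
```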
NASA Technical Reports Server (NTRS)
Johnson, Paul K.
2007-01-01
NASA Glenn Research Center (GRC) contracted Barber-Nichols of Arvada, CO, to construct a dual Brayton power conversion system for use as a hardware proof of concept and to validate results from a computational code known as the Closed Cycle System Simulation (CCSS). Initial checkout tests were performed at Barber-Nichols to ready the system for delivery to GRC. This presentation describes the system hardware components and lists the types of checkout tests performed, along with a couple of issues encountered while conducting the tests. A description of the CCSS model is also presented. The checkout tests did not focus on generating data; therefore, no test data or model analyses are presented.
The wandering self: Tracking distracting self-generated thought in a cognitively demanding context.
Huijser, Stefan; van Vugt, Marieke K; Taatgen, Niels A
2018-02-01
We investigated how self-referential processing (SRP) affected self-generated thought in a complex working memory (CWM) task to test the predictions of a computational cognitive model. This model described self-generated thought as resulting from competition between task and distracting processes, and predicted that self-generated thought interferes with rehearsal, reducing memory performance. SRP was hypothesized to influence this goal competition process by encouraging distracting self-generated thinking. We used a spatial CWM task to examine whether SRP instigated such thoughts, and employed eye-tracking to examine rehearsal interference in eye movements and self-generated thinking in pupil size. The results showed that SRP was associated with lower performance and higher rates of self-generated thought. Self-generated thought was associated with less rehearsal, and we observed a smaller pupil size during mind wandering. We conclude that SRP can instigate self-generated thought and that goal competition provides a likely explanation for how self-generated thought arises in a demanding task. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Myers, B.; Beard, T. D.; Weiskopf, S. R.; Jackson, S. T.; Tittensor, D.; Harfoot, M.; Senay, G. B.; Casey, K.; Lenton, T. M.; Leidner, A. K.; Ruane, A. C.; Ferrier, S.; Serbin, S.; Matsuda, H.; Shiklomanov, A. N.; Rosa, I.
2017-12-01
Biodiversity and ecosystems services underpin political targets for the conservation of biodiversity; however, previous incarnations of these biodiversity-related targets have not relied on integrated model based projections of possible outcomes based on climate and land use change. Although a few global biodiversity models are available, most biodiversity models lie along a continuum of geography and components of biodiversity. Model-based projections of the future of global biodiversity are critical to support policymakers in the development of informed global conservation targets, but the scientific community lacks a clear strategy for integrating diverse data streams in developing, and evaluating the performance of, such biodiversity models. Therefore, in this paper, we propose a framework for ongoing testing and refinement of model-based projections of biodiversity trends and change, by linking a broad variety of biodiversity models with data streams generated by advances in remote sensing, coupled with new and emerging in-situ observation technologies to inform development of essential biodiversity variables, future global biodiversity targets, and indicators. Our two main objectives are to (1) develop a framework for model testing and refining projections of a broad range of biodiversity models, focusing on global models, through the integration of diverse data streams and (2) identify the realistic outputs that can be developed and determine coupled approaches using remote sensing and new and emerging in-situ observations (e.g., metagenomics) to better inform the next generation of global biodiversity targets.
A Vignette (User's Guide) for “An R Package for Statistical ...
StatCharrms is a graphical user front-end for ease of use in analyzing data generated from OCSPP 890.2200, Medaka Extended One Generation Reproduction Test (MEOGRT) and OCSPP 890.2300, Larval Amphibian Gonad Development Assay (LAGDA). The analyses StatCharrms is capable of performing are: Rao-Scott adjusted Cochran-Armitage test for trend By Slices (RSCABS), a Standard Cochran-Armitage test for trend By Slices (SCABS), mixed effects Cox proportional model, Jonckheere-Terpstra step down trend test, Dunn test, one way ANOVA, weighted ANOVA, mixed effects ANOVA, repeated measures ANOVA, and Dunnett test. This document provides a User's Manual (termed a Vignette by the Comprehensive R Archive Network (CRAN)) for the previously created R-code tool called StatCharrms (Statistical analysis of Chemistry, Histopathology, and Reproduction endpoints using Repeated measures and Multi-generation Studies). The StatCharrms R-code has been publicly available directly from EPA staff since the approval of OCSPP 890.2200 and 890.2300, and is now publicly available on CRAN.
Generation of a Hypomorphic Model of Propionic Acidemia Amenable to Gene Therapy Testing
Guenzel, Adam J; Hofherr, Sean E; Hillestad, Matthew; Barry, Mary; Weaver, Eric; Venezia, Sarah; Kraus, Jan P; Matern, Dietrich; Barry, Michael A
2013-01-01
Propionic acidemia (PA) is a recessive genetic disease that results in an inability to metabolize certain amino acids and odd-chain fatty acids. Current treatment involves restricting consumption of these substrates or liver transplantation. Deletion of the Pcca gene in mice mimics the most severe forms of the human disease. Pcca− mice die within 36 hours of birth, making it difficult to test intravenous systemic therapies in them. We generated an adult hypomorphic model of PA in Pcca− mice using a transgene bearing an A138T mutant of the human PCCA protein. Pcca−/−(A138T) mice have 2% of wild-type PCC activity, survive to adulthood, and have elevations in propionyl-carnitine, methylcitrate, glycine, alanine, lysine, ammonia, and markers associated with cardiomyopathy similar to those in patients with PA. This adult model allowed gene therapy testing by intravenous injection with adenovirus serotype 5 (Ad5) and adeno-associated virus 2/8 (AAV8) vectors. Ad5 mediated more rapid increases in PCCA protein and propionyl-CoA carboxylase (PCC) activity in the liver than AAV8, and both vectors reduced propionylcarnitine and methylcitrate levels. Phenotypic correction was transient with first-generation Ad, whereas AAV8 mediated long-lasting effects. These data suggest that this PA model may be a useful platform for optimizing systemic intravenous therapies for PA. PMID:23648696
Cuesta-Gragera, Ana; Navarro-Fontestad, Carmen; Mangas-Sanjuan, Victor; González-Álvarez, Isabel; García-Arieta, Alfredo; Trocóniz, Iñaki F; Casabó, Vicente G; Bermejo, Marival
2015-07-10
The objective of this paper is to apply a previously developed semi-physiologic pharmacokinetic model implemented in NONMEM to simulate bioequivalence (BE) trials of acetyl salicylic acid (ASA), in order to validate the model performance against ASA human experimental data. ASA is a drug with first-pass hepatic and intestinal metabolism following Michaelis-Menten kinetics that leads to the formation of two main metabolites in two generations (first- and second-generation metabolites). The first aim was to adapt the semi-physiological model for ASA in NONMEM using ASA pharmacokinetic parameters from the literature, reflecting its sequential metabolism. The second aim was to validate this model by comparing the results obtained in NONMEM simulations with published experimental data at a dose of 1000 mg. The validated model was used to simulate bioequivalence trials at 3 dose schemes (100, 1000 and 3000 mg) and with 6 test formulations with decreasing in vivo dissolution rate constants versus the reference formulation (kD 8-0.25 h(-1)). Finally, the third aim was to determine which analyte (parent drug, first-generation or second-generation metabolite) was more sensitive to changes in formulation performance. The validation results showed that the concentration-time curves obtained with the simulations reproduced closely the published experimental data, confirming model performance. The parent drug (ASA) was the analyte that proved most sensitive to the decrease in pharmaceutical quality, with the largest decrease in Cmax and AUC ratio between test and reference formulations. Copyright © 2015 Elsevier B.V. All rights reserved.
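A much-simplified illustration of this kind of simulation is sketched below: a one-compartment model with first-order in vivo dissolution and absorption and Michaelis-Menten elimination is integrated numerically, and the Cmax and AUC of a slowly dissolving test formulation are compared with the reference. All parameter values are hypothetical, and the sketch omits the sequential metabolites and the NONMEM implementation of the published model.

```python
# Simplified bioequivalence-style simulation: one-compartment disposition with
# first-order dissolution/absorption and Michaelis-Menten elimination.
import numpy as np
from scipy.integrate import odeint

def rates(y, t, kd, vmax, km, v):
    solid, gut, central = y
    dissolution = kd * solid
    absorption = 1.0 * gut                     # ka = 1 1/h (assumed)
    conc = central / v
    elimination = vmax * conc / (km + conc)    # Michaelis-Menten elimination
    return [-dissolution, dissolution - absorption, absorption - elimination]

def simulate(kd, dose=1000.0, v=50.0, vmax=300.0, km=5.0):
    t = np.linspace(0, 24, 2401)
    y = odeint(rates, [dose, 0.0, 0.0], t, args=(kd, vmax, km, v))
    conc = y[:, 2] / v
    return conc.max(), np.trapz(conc, t)       # Cmax and AUC(0-24 h)

cmax_ref, auc_ref = simulate(kd=8.0)           # reference formulation
cmax_test, auc_test = simulate(kd=0.25)        # slowest hypothetical test formulation
print(f"Cmax ratio test/reference: {cmax_test / cmax_ref:.2f}")
print(f"AUC ratio test/reference:  {auc_test / auc_ref:.2f}")
```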
NASA Astrophysics Data System (ADS)
Barati Farimani, Amir; Gomes, Joseph; Pande, Vijay
2017-11-01
We have developed a new data-driven model paradigm for the rapid inference and solution of the constitutive equations of fluid mechanics by deep learning models. Using generative adversarial networks (GANs), we train models for the direct generation of solutions to steady state heat conduction and incompressible fluid flow without knowledge of the underlying governing equations. Rather than using artificial neural networks to approximate the solution of the constitutive equations, GANs can directly generate the solutions to these equations conditional upon an arbitrary set of boundary conditions. Both models predict temperature, velocity and pressure fields with high test accuracy (>99.5%). The application of our framework for inferring and generating the solutions of partial differential equations can be applied to any physical phenomenon and can be used to learn directly from experiments where the underlying physical model is complex or unknown. We have also shown that our framework can be used to couple multiple physics simultaneously, making it amenable to tackling multi-physics problems.
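A minimal conditional GAN in the spirit of this approach is sketched below for the simplest possible case, 1-D steady-state heat conduction, where the exact solution conditioned on the two boundary temperatures is a linear profile. The network sizes, training settings and data generation are arbitrary choices made for illustration and are not those of the paper.

```python
# Minimal conditional GAN sketch (PyTorch): generate 1-D steady-state heat
# conduction profiles conditioned on the two boundary temperatures.
import torch
from torch import nn

N = 32                                           # points along the rod
gen = nn.Sequential(nn.Linear(2 + 8, 64), nn.ReLU(), nn.Linear(64, N))
disc = nn.Sequential(nn.Linear(2 + N, 64), nn.ReLU(), nn.Linear(64, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(gen.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-3)
bce = nn.BCELoss()

def real_batch(batch=64):
    bc = torch.rand(batch, 2)                    # left/right boundary temperatures
    x = torch.linspace(0, 1, N)
    return bc, bc[:, :1] * (1 - x) + bc[:, 1:] * x   # exact linear solutions

for step in range(2000):
    bc, real = real_batch()
    z = torch.randn(len(bc), 8)
    fake = gen(torch.cat([bc, z], dim=1))
    # discriminator update: real profiles labelled 1, generated profiles labelled 0
    d_loss = bce(disc(torch.cat([bc, real], 1)), torch.ones(len(bc), 1)) + \
             bce(disc(torch.cat([bc, fake.detach()], 1)), torch.zeros(len(bc), 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # generator update: try to make generated profiles look real to the discriminator
    g_loss = bce(disc(torch.cat([bc, fake], 1)), torch.ones(len(bc), 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

with torch.no_grad():
    bc = torch.tensor([[0.0, 1.0]])
    profile = gen(torch.cat([bc, torch.randn(1, 8)], 1))
print("generated profile endpoints:", profile[0, 0].item(), profile[0, -1].item())
```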
Yuan, Jintao; Yu, Shuling; Zhang, Ting; Yuan, Xuejie; Cao, Yunyuan; Yu, Xingchen; Yang, Xuan; Yao, Wu
2016-06-01
Octanol/water (K(OW)) and octanol/air (K(OA)) partition coefficients are two important physicochemical properties of organic substances. In current practice, K(OW) and K(OA) values of some polychlorinated biphenyls (PCBs) are measured using the generator column method. Quantitative structure-property relationship (QSPR) models can serve as a valuable alternative method, replacing or reducing experimental steps in the determination of K(OW) and K(OA). In this paper, two different methods, i.e., multiple linear regression based on Dragon descriptors and hologram quantitative structure-activity relationship, were used to predict generator-column-derived log K(OW) and log K(OA) values of PCBs. The predictive ability of the developed models was validated using a test set, and the performances of all generated models were compared with those of three previously reported models. All results indicated that the proposed models were robust and satisfactory and can thus be used as alternative models for the rapid assessment of the K(OW) and K(OA) of PCBs. Copyright © 2016 Elsevier Inc. All rights reserved.
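A bare-bones version of the first approach mentioned above (multiple linear regression from descriptors to a partition coefficient, validated on an external test set) is sketched below. The descriptor matrix and property values are randomly generated placeholders, whereas a real model would use computed Dragon-type descriptors and measured generator-column data.

```python
# Minimal QSPR sketch: multiple linear regression with an external test set.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))                                # 60 congeners x 4 descriptors
true_coef = np.array([0.8, -0.3, 0.5, 0.1])
y = 6.0 + X @ true_coef + rng.normal(scale=0.1, size=60)    # synthetic log K(OW)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
model = LinearRegression().fit(X_tr, y_tr)
print("training R^2:", round(model.score(X_tr, y_tr), 3))
print("external test R^2:", round(r2_score(y_te, model.predict(X_te)), 3))
```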
van Strien, Maarten J; Slager, Cornelis T J; de Vries, Bauke; Grêt-Regamey, Adrienne
2016-06-01
Many studies have assessed the effect of landscape patterns on spatial ecological processes by simulating these processes in computer-generated landscapes with varying composition and configuration. To generate such landscapes, various neutral landscape models have been developed. However, the limited set of landscape-level pattern variables included in these models is often inadequate to generate landscapes that reflect real landscapes. In order to achieve more flexibility and variability in the generated landscape patterns, a more complete set of class- and patch-level pattern variables should be implemented in these models. These enhancements have been implemented in Landscape Generator (LG), software that uses optimization algorithms to generate landscapes that match user-defined target values. Originally developed for participatory spatial planning at small scales, we enhanced the usability of LG and demonstrated how it can be used for larger scale ecological studies. First, we used LG to recreate landscape patterns from a real landscape (i.e., a mountainous region in Switzerland). Second, we generated landscape series with incrementally changing pattern variables, which could be used in ecological simulation studies. We found that LG was able to recreate landscape patterns that approximate those of real landscapes. Furthermore, we successfully generated landscape series that would not have been possible with traditional neutral landscape models. LG is a promising novel approach for generating neutral landscapes and enables the testing of new hypotheses regarding the influence of landscape patterns on ecological processes. LG is freely available online.
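The pattern-targeting idea can be illustrated with a toy optimizer: the sketch below hill-climbs single-cell flips on a binary raster toward user-defined targets for class proportion and patch count. It is not the Landscape Generator software itself, and the grid size, targets and cost weights are arbitrary.

```python
# Toy neutral-landscape optimizer: flip cells of a binary raster to approach
# target values for habitat proportion (class level) and patch count (patch level).
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(2)
grid = (rng.random((40, 40)) < 0.3).astype(int)    # 1 = habitat, 0 = matrix
target_prop, target_patches = 0.3, 12

def cost(g):
    _, n_patches = label(g)                        # count habitat patches
    return abs(g.mean() - target_prop) + 0.05 * abs(n_patches - target_patches)

current = cost(grid)
for _ in range(20000):
    r, c = rng.integers(0, 40, size=2)
    grid[r, c] ^= 1                                # flip one cell
    new = cost(grid)
    if new <= current:
        current = new                              # keep improving (or neutral) moves
    else:
        grid[r, c] ^= 1                            # revert worsening moves

_, patches = label(grid)
print(f"habitat proportion: {grid.mean():.3f}, patches: {patches}, cost: {current:.4f}")
```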
Recent R&D status for 70 MW class superconducting generators in the Super-GM project
NASA Astrophysics Data System (ADS)
Ageta, Takasuke
2000-05-01
Three types of 70 MW class superconducting generators, called model machines, have been developed to establish basic technologies for a pilot machine. The series of on-site verification tests was completed in June 1999. The world's highest generator output (79 MW), the world's longest continuous operation (1500 hours) and other excellent results were obtained. The model machine was connected to a commercial power grid and fundamental data were collected for future utilization. It is expected that the fundamental design and manufacturing technologies required for a 200 MW class pilot machine have been established.
NASA Technical Reports Server (NTRS)
Xiang, Xuwu; Smith, Eric A.; Tripoli, Gregory J.
1992-01-01
A hybrid statistical-physical retrieval scheme is explored which combines a statistical approach with an approach based on the development of cloud-radiation models designed to simulate precipitating atmospheres. The algorithm employs the detailed microphysical information from a cloud model as input to a radiative transfer model which generates a cloud-radiation model database. Statistical procedures are then invoked to objectively generate an initial guess composite profile data set from the database. The retrieval algorithm has been tested for a tropical typhoon case using Special Sensor Microwave/Imager (SSM/I) data and has shown satisfactory results.
MULTI-LABORATORY STUDY OF FLOW-INDUCED HEMOLYSIS USING THE FDA BENCHMARK NOZZLE MODEL
Herbertson, Luke H.; Olia, Salim E.; Daly, Amanda; Noatch, Christopher P.; Smith, William A.; Kameneva, Marina V.; Malinauskas, Richard A.
2015-01-01
Multilaboratory in vitro blood damage testing was performed on a simple nozzle model to determine how different flow parameters and blood properties affect device-induced hemolysis and to generate data for comparison with computational fluid dynamics-based predictions of blood damage as part of an FDA initiative for assessing medical device safety. Three independent laboratories evaluated hemolysis as a function of nozzle entrance geometry, flow rate, and blood properties. Bovine blood anticoagulated with acid citrate dextrose solution (2–80 h post-draw) was recirculated through nozzle-containing and paired nozzle-free control loops for 2 h. Controlled parameters included hematocrit (36 ± 1.5%), temperature (25°C), blood volume, flow rate, and pressure. Three nozzle test conditions were evaluated (n = 26–36 trials each): (i) sudden contraction at the entrance with a blood flow rate of 5 L/min, (ii) gradual cone at the entrance with a 6-L/min blood flow rate, and (iii) sudden-contraction inlet at 6 L/min. The blood damage caused only by the nozzle model was calculated by subtracting the hemolysis generated by the paired control loop test. Despite high intralaboratory variability, significant differences among the three test conditions were observed, with the sharp nozzle entrance causing the most hemolysis. Modified index of hemolysis (MIHnozzle) values were 0.292 ± 0.249, 0.021 ± 0.128, and 1.239 ± 0.667 for conditions i–iii, respectively. Porcine blood generated hemolysis results similar to those obtained with bovine blood. Although the interlaboratory hemolysis results are only applicable for the specific blood parameters and nozzle model used here, these empirical data may help to advance computational fluid dynamics models for predicting blood damage. PMID:25180887
Control of Wheel/Rail Noise and Vibration
DOT National Transportation Integrated Search
1982-04-01
An analytical model of the generation of wheel/rail noise has been developed and validated through an extensive series of field tests carried out at the Transportation Test Center using the State of the Art Car. A sensitivity analysis has been perfor...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hammel, T.E.; Srinivas, V.
1978-11-01
This initial definition of the power degradation prediction technique outlines a model for predicting SIG/Galileo mean EOM power using component test data and data from a module power degradation demonstration test program. (LCL)
Immunotoxicant screening and prioritization in the 21st century
Current immunotoxicity testing guidance for drugs, high production volume chemicals and pesticides specifies the use of animal models that measure immune function or evaluation of general indicators of immune system health generated in routine toxicity testing. The assays are ...
An 810 ft/sec soil impact test of a 2-foot diameter model nuclear reactor containment system
NASA Technical Reports Server (NTRS)
Puthoff, R. L.
1972-01-01
A soil impact test was conducted on an 880-pound, 2-foot diameter sphere model. The impact area consisted of backfilled desert earth and rock. The impact generated a crater 5 feet in diameter by 5 feet deep. The model buried itself a total of 15 feet, as measured to the bottom of the model. After impact the containment vessel was pressure checked. No leaks were detected nor cracks observed.
Modeling Regional Seismic Waves from Underground Nuclear Explosion
1989-05-15
consider primarily the long-period tangential motions in this pilot study because less computational effort is involved compared to modeling the P-SV system ... error testing can be a time-consuming endeavor but the basic approach has proven effective in previous studies (Vidale et al., 1985; Helmberger and Vidale ... at various depths in a variety of basin models were generated to test the above hypothesis. When the source is situated in the sediments and when the
Beta Testing of CFD Code for the Analysis of Combustion Systems
NASA Technical Reports Server (NTRS)
Yee, Emma; Wey, Thomas
2015-01-01
A preliminary version of OpenNCC was tested to assess its accuracy in generating steady-state temperature fields for combustion systems at atmospheric conditions using three-dimensional tetrahedral meshes. Meshes were generated from a CAD model of a single-element lean-direct injection combustor, and the latest version of OpenNCC was used to calculate combustor temperature fields. OpenNCC was shown to be capable of generating sustainable reacting flames using a tetrahedral mesh, and the subsequent results were compared to experimental results. While nonreacting flow results closely matched experimental results, a significant discrepancy was present between the code's reacting flow results and experimental results. When wide air circulation regions with high velocities were present in the model, this appeared to create inaccurately high temperature fields. Conversely, low recirculation velocities caused low temperature profiles. These observations will aid in future modification of OpenNCC reacting flow input parameters to improve the accuracy of calculated temperature fields.
NASA Technical Reports Server (NTRS)
Wilbur, Matthew L.; Yeager, William T., Jr.; Sekula, Martin K.
2002-01-01
The vibration reduction capabilities of a model rotor system utilizing controlled, strain-induced blade twisting are examined. The model rotor blades, which utilize piezoelectric active fiber composite actuators, were tested in the NASA Langley Transonic Dynamics Tunnel using open-loop control to determine the effect of active-twist on rotor vibratory loads. The results of this testing have been encouraging, and have demonstrated that active-twist rotor designs offer the potential for significant load reductions in future helicopter rotor systems. Active twist control was found to use less than 1% of the power necessary to operate the rotor system and had a pronounced effect on both rotating- and fixed-system loads, offering reductions in individual harmonic loads of up to 100%. A review of the vibration reduction results obtained is presented, which includes a limited set of comparisons with results generated using the second-generation version of the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD II) rotorcraft comprehensive analysis.
Do Non-Economic Quality of Life Factors Drive Immigration?
ERIC Educational Resources Information Center
Pacheco, Gail Anne; Rossouw, Stephanie; Lewer, Joshua
2013-01-01
This paper contributes to the immigration literature by generating two unique non-economic quality of life (QOL) indices and testing their role on recent migration patterns. Applying the generated QOL indices in conjunction with four independent welfare measures to an augmented gravity model of immigration, this paper finds an insignificant…
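For readers unfamiliar with gravity models of migration, a sketch of such a regression on simulated data is given below: log migrant flows are regressed on origin and destination income, distance, and a non-economic quality-of-life index. The variables and coefficients are placeholders, not the indices or estimates constructed in the paper.

```python
# Sketch of an augmented gravity model of migration on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 300
log_gdp_o = rng.normal(10, 1, n)                  # origin income
log_gdp_d = rng.normal(10, 1, n)                  # destination income
log_dist = rng.normal(8, 0.5, n)                  # bilateral distance
qol_index = rng.normal(0, 1, n)                   # non-economic QOL at destination
log_flow = (1 + 0.6 * log_gdp_d - 0.4 * log_gdp_o - 1.0 * log_dist
            + 0.2 * qol_index + rng.normal(0, 0.3, n))

X = sm.add_constant(np.column_stack([log_gdp_o, log_gdp_d, log_dist, qol_index]))
fit = sm.OLS(log_flow, X).fit()
print(fit.params.round(3))   # constant, origin GDP, destination GDP, distance, QOL
```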
2018-01-01
[Report front matter: attachments covering the NRMM data input requirements and general physics-based model data input requirements; figures including examples of unique surface types, correlating physical testing with simulation, and a simplified tire model; tables including scoring values, accuracy of physics-based models, and accuracy of validation through measurement.]
A New Method for Incremental Testing of Finite State Machines
NASA Technical Reports Server (NTRS)
Pedrosa, Lehilton Lelis Chaves; Moura, Arnaldo Vieira
2010-01-01
The automatic generation of test cases is an important issue for conformance testing of several critical systems. We present a new method for the derivation of test suites when the specification is modeled as a combined Finite State Machine (FSM). A combined FSM is obtained by conjoining previously tested submachines with newly added states. This new concept is used to describe a fault model suitable for incremental testing of new systems, or for retesting modified implementations. For this fault model, only the newly added or modified states need to be tested, thereby considerably reducing the size of the test suites. The new method is a generalization of the well-known W-method and the G-method, but it is scalable, and so it can be used to test FSMs with an arbitrarily large number of states.
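The flavor of W-method-style derivation can be seen in the sketch below, which builds a state cover for a toy Mealy machine, extends it to a transition cover, and concatenates a given characterization set. The machine and characterization set are hypothetical, and the incremental restriction to newly added or modified states, which is the paper's contribution, is not reproduced here.

```python
# Simplified W-method-style test-suite generation for a toy Mealy machine.
from collections import deque

inputs = ["a", "b"]
delta = {("s0", "a"): "s1", ("s0", "b"): "s0",
         ("s1", "a"): "s2", ("s1", "b"): "s0",
         ("s2", "a"): "s2", ("s2", "b"): "s1"}
W = [["a"], ["b"]]                      # assumed characterization set

# State cover: a shortest input sequence reaching each state from s0.
cover = {"s0": []}
queue = deque(["s0"])
while queue:
    s = queue.popleft()
    for x in inputs:
        t = delta[(s, x)]
        if t not in cover:
            cover[t] = cover[s] + [x]
            queue.append(t)

# Transition cover concatenated with the characterization set gives the suite.
tests = [prefix + [x] + w
         for s, prefix in cover.items()
         for x in inputs
         for w in W]

for t in sorted(tests, key=len):
    print("".join(t))
```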
Extraction and representation of common feature from uncertain facial expressions with cloud model.
Wang, Shuliang; Chi, Hehua; Yuan, Hanning; Geng, Jing
2017-12-01
Human facial expressions are a key ingredient in conveying an individual's innate emotion in communication. However, the variation of facial expressions affects the reliable identification of human emotions. In this paper, we present a cloud model to extract facial features for representing human emotion. First, the uncertainties in facial expression are analyzed in the context of the cloud model. The feature extraction and representation algorithm is established under cloud generators. With the forward cloud generator, facial expression images can be re-generated as many times as we like for visually representing the three extracted features, and each feature plays a different role. The effectiveness of the computing model is tested on the Japanese Female Facial Expression database. Three common features are extracted from seven facial expression images. Finally, conclusions and remarks are given.
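The forward normal cloud generator that such models build on can be sketched in a few lines: given an expectation Ex, entropy En and hyper-entropy He, it produces cloud drops and their certainty degrees. The numerical parameters below are arbitrary, and the paper's full facial-feature extraction pipeline is not shown.

```python
# Forward normal cloud generator: drops x and their certainty degrees mu.
import numpy as np

def forward_cloud(ex, en, he, n_drops=1000, seed=0):
    rng = np.random.default_rng(seed)
    en_prime = np.abs(rng.normal(en, he, n_drops))       # per-drop entropy sample
    x = rng.normal(ex, en_prime)                         # drop positions
    mu = np.exp(-(x - ex) ** 2 / (2.0 * en_prime ** 2))  # certainty degree of each drop
    return x, mu

drops, membership = forward_cloud(ex=0.5, en=0.1, he=0.02)
print("mean drop:", round(float(drops.mean()), 3),
      "mean certainty degree:", round(float(membership.mean()), 3))
```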
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roedel, S.
1979-06-01
The purpose of the receptacle test program was to test various types of hermetically sealed electrical receptacles and to select one model as the spaceflight hardware item for SIG/Galileo thermoelectric generators. The design goal of the program was to qualify a hermetic seal integrity of less than or equal to 1 x 10^-9 std cc He/sec-atm at 400°F (204°C) and verify a reliability of 0.95 at a 50% confidence level for a flight mission in excess of 7 years.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Xiongfeng; Yuan, Xiao; Cao, Zhu
Quantum physics can be exploited to generate true random numbers, which play important roles in many applications, especially in cryptography. Genuine randomness from the measurement of a quantum system reveals the inherent nature of quantumness -- coherence, an important feature that differentiates quantum mechanics from classical physics. The generation of genuine randomness is generally considered impossible with only classical means. Based on the degree of trustworthiness on devices, quantum random number generators (QRNGs) can be grouped into three categories. The first category, practical QRNG, is built on fully trusted and calibrated devices and typically can generate randomness at a high speed by properly modeling the devices. The second category is self-testing QRNG, where verifiable randomness can be generated without trusting the actual implementation. The third category, semi-self-testing QRNG, is an intermediate category which provides a tradeoff between the trustworthiness on the device and the random number generation speed.
3D Face Modeling Using the Multi-Deformable Method
Hwang, Jinkyu; Yu, Sunjin; Kim, Joongrock; Lee, Sangyoun
2012-01-01
In this paper, we focus on the accuracy of 3D face modeling techniques using corresponding features in multiple views, which is quite sensitive to feature extraction errors. To solve the problem, we adopt a statistical model-based 3D face modeling approach in a mirror system consisting of two mirrors and a camera. The overall procedure of our 3D facial modeling method has two primary steps: 3D facial shape estimation using a multiple 3D face deformable model and texture mapping using seamless cloning, which is a type of gradient-domain blending. To evaluate our method's performance, we generate 3D faces of 30 individuals and then carry out two tests: an accuracy test and a robustness test. Our method shows not only highly accurate 3D face shape results when compared with the ground truth, but also robustness to feature extraction errors. Moreover, 3D face rendering results intuitively show that our method is more robust to feature extraction errors than other 3D face modeling methods. An additional contribution of our method is that a wide range of face textures can be acquired by the mirror system. Using this texture map, we generate realistic 3D faces for individuals at the end of the paper. PMID:23201976
Acquisition Community Team Dynamics: The Tuckman Model vs. the DAU Model
2007-04-30
courses. These student teams are used to enable the generation of more complex products and to prepare the students for the ... requirement for stage discreteness was met, I developed a stage-separation test that, when applied to the data representing the experience of a ... test the reliability, and validate an improved questionnaire instrument that: - Redefines "Storming" with new storming questions ... Less focused
ERIC Educational Resources Information Center
Freund, Philipp Alexander; Holling, Heinz
2011-01-01
The interpretation of retest scores is problematic because they are potentially affected by measurement and predictive bias, which impact construct validity, and because their size differs as a function of various factors. This paper investigates the construct stability of scores on a figural matrices test and models retest effects at the level of…
NASA Astrophysics Data System (ADS)
Faria, J. M.; Mahomad, S.; Silva, N.
2009-05-01
The deployment of complex safety-critical applications requires rigorous techniques and powerful tools both for the development and V&V stages. Model-based technologies are increasingly being used to develop safety-critical software, and arguably, turning to them can bring significant benefits to such processes, along with new challenges. This paper presents the results of a research project in which we tried to extend current V&V methodologies to apply to UML/SysML models, aiming to answer the demands related to validation issues. Two quite different but complementary approaches were investigated: (i) model checking and (ii) the extraction of robustness test cases from the same models. These two approaches do not overlap and, when combined, they provide a wider-reaching model/design validation ability than each one alone, thus offering improved safety assurance. Results are very encouraging, even though they either fell short of the desired outcome, as shown for model checking, or appear not yet fully matured, as shown for robustness test case extraction. In the case of model checking, it was verified that the automatic model validation process can become fully operational and even expanded in scope once tool vendors help (inevitably) to improve the XMI standard interoperability situation. The robustness test case extraction methodology produced interesting results in its early form but needs further systematisation and consolidation in order to produce results in a more predictable fashion and to reduce reliance on experts' heuristics. Finally, further improvements and innovation research projects were immediately apparent for both investigated approaches, pointing to circumventing current limitations in XMI interoperability on the one hand, and to bringing test case specification onto the same graphical level as the models themselves and then attempting to automate the generation of executable test cases from its standard UML notation on the other.
NASA Common Research Model Test Envelope Extension With Active Sting Damping at NTF
NASA Technical Reports Server (NTRS)
Rivers, Melissa B.; Balakrishna, S.
2014-01-01
The NASA Common Research Model (CRM) high Reynolds number transonic wind tunnel testing program was established to generate an experimental database for applied Computational Fluid Dynamics (CFD) validation studies. During transonic wind tunnel tests, the CRM encounters large sting vibrations when the angle of attack approaches the second pitching moment break, which can sometimes become divergent. CRM transonic test data analysis suggests that sting divergent oscillations are related to negative net sting damping episodes associated with flow separation instability. The National Transonic Facility (NTF) has been addressing remedies to extend polar testing up to and beyond the second pitching moment break point of the test articles using an active piezoceramic damper system for both ambient and cryogenic temperatures. This paper reviews CRM test results to gain understanding of sting dynamics with a simple model describing the mechanics of a sting-model system and presents the performance of the damper under cryogenic conditions.
NASA Astrophysics Data System (ADS)
Lucas, G.; Lénárt, C.; Solymosi, J.
2015-08-01
This paper introduces research on the automatic preparation of remediation plans and navigation data for the precise guidance of heavy machinery in clean-up work after an industrial disaster. The input test data consist of a pollution extent shapefile derived from the processing of hyperspectral aerial survey data from the Kolontár red mud disaster. Three algorithms were developed and the respective scripts were written in Python. The first model aims at drawing a parcel clean-up plan. The model tests four different parcel orientations (0, 90, 45 and 135 degrees) and keeps the plan whose clean-up parcels are least numerous, considering it the optimal spatial configuration. The second model shifts the clean-up parcels of a work plan both vertically and horizontally following a grid pattern with a sampling distance of a fifth of a parcel width and keeps the most optimal shifted version, again with the aim of reducing the final number of parcel features. The last model draws a navigation line in the middle of each clean-up parcel. The models work efficiently and achieve automatic optimized plan generation (parcels and navigation lines). Applying the first model, we demonstrated that, depending on the size and geometry of the features of the contaminated area layer, the number of clean-up parcels generated by the model varies in a range of 4% to 38% from plan to plan. Such a significant variation in the resulting feature numbers shows that identifying the optimal orientation can save work, time and money in remediation. The various tests demonstrated that the model gains efficiency when (1) the individual features of the contaminated area present a significant orientation in their geometry (features are long), and (2) the size of pollution extent features becomes closer to the size of the parcels (scale effect). The second model shows only a 1% difference with the variation of feature number, so it is less interesting for planning optimization applications. The last model simply fulfils the task it was designed for by drawing navigation lines.
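A simplified sketch of the first model's orientation test is given below: the polluted area is approximated by sample points, rotated into each candidate parcel orientation, and the number of occupied fixed-size parcel cells is counted; the orientation needing the fewest parcels is kept. The coordinates and parcel dimensions are hypothetical, and the shapefile handling of the actual Python scripts is omitted.

```python
# Simplified orientation test: count clean-up parcels needed at 0/45/90/135 degrees.
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical polluted area: an elongated cluster oriented roughly at 45 degrees.
t = rng.random(500)
points = np.column_stack([t * 200, t * 200 + rng.normal(0, 8, 500)])
parcel_w, parcel_h = 20.0, 10.0

def parcels_needed(pts, angle_deg):
    a = np.radians(angle_deg)
    rot = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
    p = pts @ rot.T                                        # rotate points into the parcel frame
    cols = np.floor((p[:, 0] - p[:, 0].min()) / parcel_w).astype(int)
    rows = np.floor((p[:, 1] - p[:, 1].min()) / parcel_h).astype(int)
    return len(set(zip(cols, rows)))                       # occupied parcel cells

best_count, best_angle = min((parcels_needed(points, a), a) for a in (0, 45, 90, 135))
print(f"best orientation: {best_angle} degrees, parcels needed: {best_count}")
```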
NASA Technical Reports Server (NTRS)
Sutliff, Daniel L.; Brown, Clifford A.; Walker, Bruce E.
2014-01-01
An Ultrasonic Configurable Fan Artificial Noise Source (UCFANS) was designed, built, and tested in support of the NASA Langley Research Center's 14- by 22-ft wind tunnel test of the Hybrid Wing Body (HWB) full 3-D 5.8 percent scale model. The UCFANS is a 5.8 percent rapid prototype scale model of a high-bypass turbofan engine that can generate the tonal signature of proposed engines using artificial sources (no flow). The purpose of the test was to provide an estimate of the acoustic shielding benefits possible from mounting the engine on the upper surface of an HWB aircraft using the projected signature of the engine currently proposed for the HWB. The modal structures at the rating points were generated from inlet and exhaust nacelle configurations--a flat plate model was used as the shielding surface and vertical control surfaces with correct plan form shapes were also tested to determine their additional impact on shielding. Radiated acoustic data were acquired from a traversing linear array of 13 microphones, spanning 36 in. Two planes perpendicular, and two planes parallel, to the axis of the nacelle were acquired from the array sweep. In each plane the linear array traversed four sweeps, for a total span of 168 in. acquired. The resolution of the sweep is variable, so that points closer to the model are taken at a higher resolution. Contour plots of Sound Pressure Levels, and integrated Power Levels, from nacelle alone and shielded configurations are presented in this paper; as well as the in-duct mode power levels
Improvement of structural models using covariance analysis and nonlinear generalized least squares
NASA Technical Reports Server (NTRS)
Glaser, R. J.; Kuo, C. P.; Wada, B. K.
1992-01-01
The next generation of large, flexible space structures will be too light to support their own weight, requiring a system of structural supports for ground testing. The authors have proposed multiple boundary-condition testing (MBCT), using more than one support condition to reduce uncertainties associated with the supports. MBCT would revise the mass and stiffness matrix, analytically qualifying the structure for operation in space. The same procedure is applicable to other common test conditions, such as empty/loaded tanks and subsystem/system level tests. This paper examines three techniques for constructing the covariance matrix required by nonlinear generalized least squares (NGLS) to update structural models based on modal test data. The methods range from a complicated approach used to generate the simulation data (i.e., the correct answer) to a diagonal matrix based on only two constants. The results show that NGLS is very insensitive to assumptions about the covariance matrix, suggesting that a workable NGLS procedure is possible. The examples also indicate that the multiple boundary condition procedure more accurately reduces errors than individual boundary condition tests alone.
Computational Support of 9x7 Wind Tunnel Test of Sonic Boom Models with Plumes
NASA Technical Reports Server (NTRS)
Jensen, James C.; Denison, Marie; Durston, Don; Cliff, Susan E.
2017-01-01
NASA and its industry partners are performing studies of supersonic aircraft concepts with low sonic boom pressure signatures. The interaction of the nozzle jet flow with the aircrafts' aft components is typically where the greatest uncertainly in the pressure signature is observed with high-fidelity numerical simulations. An extensive wind tunnel test was conducted in February 2016 in the NASA Ames 9- by 7- Foot Supersonic Wind Tunnel to help address the nozzle jet effects on sonic boom. Five test models with a variety of shock generators of differing waveforms and strengths were tested with a convergent-divergent nozzle for a wide range of nozzle pressure ratios. The LAVA unstructured flow solver was used to generate first CFD comparisons with the new experimental database using best practice meshing and analysis techniques for sonic boom vehicle design for all five different configurations. LAVA was also used to redesign the internal flow path of the nozzle and to better understand the flow field in the test section, both of which significantly improved the quality of the test data.
Wu, X; Lund, M S; Sun, D; Zhang, Q; Su, G
2015-10-01
One of the factors affecting the reliability of genomic prediction is the relationship among the animals of interest. This study investigated the reliability of genomic prediction in various scenarios with regard to the relationship between test and training animals, and among animals within the training data set. Different training data sets were generated from EuroGenomics data and a group of Nordic Holstein bulls (born in 2005 and afterwards) as a common test data set. Genomic breeding values were predicted using a genomic best linear unbiased prediction model and a Bayesian mixture model. The results showed that a closer relationship between test and training animals led to a higher reliability of genomic predictions for the test animals, while a closer relationship among training animals resulted in a lower reliability. In addition, the Bayesian mixture model in general led to a slightly higher reliability of genomic prediction, especially for the scenario of distant relationships between training and test animals. Therefore, to prevent a decrease in reliability, constant updates of the training population with animals from more recent generations are required. Moreover, a training population consisting of less-related animals is favourable for reliability of genomic prediction. © 2015 Blackwell Verlag GmbH.
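A minimal GBLUP sketch is given below for orientation: a VanRaden-style genomic relationship matrix is built from simulated genotypes, and genomic breeding values are obtained from the corresponding mixed model equations. The data, variance ratio and scale are illustrative only and do not reflect the EuroGenomics data or the Bayesian mixture model used in the study.

```python
# Minimal GBLUP sketch on simulated marker data.
import numpy as np

rng = np.random.default_rng(4)
n_animals, n_markers = 200, 1000
p = rng.uniform(0.1, 0.9, n_markers)                        # allele frequencies
M = rng.binomial(2, p, size=(n_animals, n_markers)).astype(float)
Z = M - 2 * p                                               # centred genotypes
G = Z @ Z.T / (2 * np.sum(p * (1 - p)))                     # genomic relationship matrix
G += np.eye(n_animals) * 1e-3                               # stabilise the inversion

true_effects = rng.normal(0, 0.05, n_markers)
y = M @ true_effects + rng.normal(0, 1.0, n_animals)        # simulated phenotypes

lam = 1.0                                                   # assumed sigma_e^2 / sigma_g^2
gebv = np.linalg.solve(np.eye(n_animals) + lam * np.linalg.inv(G), y - y.mean())
tbv = M @ true_effects                                      # true breeding values
print("prediction accuracy (corr GEBV, TBV):", round(np.corrcoef(gebv, tbv)[0, 1], 3))
```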
Genetic Analysis of Reduced γ-Tocopherol Content in Ethiopian Mustard Seeds.
García-Navarro, Elena; Fernández-Martínez, José M; Pérez-Vich, Begoña; Velasco, Leonardo
2016-01-01
Ethiopian mustard (Brassica carinata A. Braun) line BCT-6, with reduced γ-tocopherol content in the seeds, has been previously developed. The objective of this research was to conduct a genetic analysis of seed tocopherols in this line. BCT-6 was crossed with the conventional line C-101 and the F1, F2, and BC plant generations were analyzed. Generation mean analysis using individual scaling tests indicated that reduced γ-tocopherol content fitted an additive-dominant genetic model with predominance of additive effects and absence of epistatic interactions. This was confirmed through a joint scaling test and additional testing of the goodness of fit of the model. Conversely, epistatic interactions were identified for total tocopherol content. Estimation of the minimum number of genes suggested that both γ- and total tocopherol content may be controlled by two genes. A positive correlation between total tocopherol content and the proportion of γ-tocopherol was identified in the F2 generation. Additional research on the feasibility of developing germplasm with high tocopherol content and reduced concentration of γ-tocopherol is required.
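The joint scaling test mentioned above can be illustrated with a small weighted least-squares fit of the additive-dominance (m, [a], [d]) expectations to six generation means, followed by a chi-square goodness-of-fit check, as sketched below. The means and standard errors are hypothetical, not the Ethiopian mustard data.

```python
# Joint scaling test sketch (additive-dominance model fitted to generation means).
import numpy as np
from scipy.stats import chi2

# Rows: P1, P2, F1, F2, BC1, BC2; columns: coefficients of m, [a], [d].
X = np.array([[1,  1.0, 0.0],
              [1, -1.0, 0.0],
              [1,  0.0, 1.0],
              [1,  0.0, 0.5],
              [1,  0.5, 0.5],
              [1, -0.5, 0.5]])
means = np.array([12.0, 6.0, 10.5, 9.8, 11.0, 8.4])   # hypothetical generation means
se = np.array([0.3, 0.3, 0.25, 0.2, 0.25, 0.25])      # their standard errors

W = np.diag(1.0 / se ** 2)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ means)  # weighted least squares
resid = means - X @ beta
chi_sq = float(resid @ W @ resid)
dof = len(means) - X.shape[1]
print("estimates of m, [a], [d]:", np.round(beta, 2))
print(f"joint scaling chi-square = {chi_sq:.2f} (df = {dof}, p = {1 - chi2.cdf(chi_sq, dof):.3f})")
```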
Influence of Steering Control Devices Mounted in Cars for the Disabled on Passive Safety
NASA Astrophysics Data System (ADS)
Masiá, J.; Eixerés, B.; Dols, J. F.; Colomina, F. J.
2009-11-01
The purpose of this research is to analyze the influence of steering control devices for disabled people on passive safety. It is based on the advances made in the modelling and simulation of the driver position and in the suit verification test. The influence of these devices is studied through airbag deployment and/or its influence on driver safety. We characterize the different adaptations used in adapted cars that can be found mounted in vehicles, in order to generate models that are verified by experimental tests. A three-dimensional design software package was used to develop the model. The simulations were generated using a dynamic simulation program employing LS-DYNA finite elements. This program plots the geometry and assigns materials. The airbag is shaped, meshed and folded just as it is mounted in current vehicles. The thermodynamic model of expansion of gases is assigned and the contact interfaces are defined. Static tests were carried out on deployment of the airbag to contrast with and to validate the computational models and to measure the behaviour of the airbag when there are steering adaptations mounted in the vehicle.
A Wind-Tunnel Investigation of Tilt-Rotor Gust Alleviation Systems
NASA Technical Reports Server (NTRS)
Ham, N. D.; Whitaker, H. P.
1978-01-01
The alleviation of the effects of gusts on tilt-rotor aircraft by means of active control systems was investigated. The investigation covered the gust generator, the derivation of the equations of motion of the rotor-wing combination, the correlation of these equations with the results of wind tunnel model tests, the use of the equations to design various gust-alleviating active control systems, and the testing and evaluation of these control systems by means of wind tunnel model tests.
A Model Based Security Testing Method for Protocol Implementation
Fu, Yu Long; Xin, Xiao Long
2014-01-01
The security of protocol implementations is important and hard to verify. Since penetration testing is usually based on the experience of the security tester and the specific protocol specifications, a formal and automatic verification method is always required. In this paper, we propose an extended model of IOLTS to describe the legal roles and intruders of security protocol implementations, and then combine them together to generate suitable test cases to verify the security of a protocol implementation. PMID:25105163
Xu, Xiaojuan; Weber, Daniel; Burge, Rebekah; VanAmberg, Kelsey
2016-01-01
The zebrafish has become a useful animal model for studying the effects of environmental contaminants on neurobehavioral development due to its ease of breeding, high number of eggs per female, short generation times, and a well-established avoidance conditioning paradigm. Using avoidance conditioning as the behavioral paradigm, the present study investigated the effects of embryonic exposure to lead (Pb) on learning in adult zebrafish and the third (F3) generation of those fish. In Experiment 1, adult zebrafish that were developmentally exposed to 0.0, 0.1, 1.0 or 10.0μM Pb (2-24h post fertilization) as embryos were trained and tested for avoidance responses. The results showed that adult zebrafish hatched from embryos exposed to 0.0 or 0.1μM Pb learned avoidance responses during training and displayed significantly increased avoidance responses during testing, while those hatched from embryos exposed to 1.0 or 10.0μM Pb displayed no significant increases in avoidance responses from training to testing. In Experiment 2, the F3 generation of zebrafish that were developmentally exposed to an identical exposure regimen as in Experiment 1 were trained and tested for avoidance responses. The results showed that the F3 generation of zebrafish developmentally exposed as embryos to 0.0 or 0.1μM Pb learned avoidance responses during training and displayed significantly increased avoidance responses during testing, while the F3 generation of zebrafish developmentally exposed as embryos to 1.0 or 10.0μM Pb displayed no significant changes in avoidance responses from training to testing. Thus, developmental Pb exposure produced learning impairments that persisted for at least three generations, demonstrating trans-generational effects of embryonic exposure to Pb. Copyright © 2015. Published by Elsevier B.V.
Sociality influences cultural complexity
Muthukrishna, Michael; Shulman, Ben W.; Vasilescu, Vlad; Henrich, Joseph
2014-01-01
Archaeological and ethnohistorical evidence suggests a link between a population's size and structure, and the diversity or sophistication of its toolkits or technologies. Addressing these patterns, several evolutionary models predict that both the size and social interconnectedness of populations can contribute to the complexity of its cultural repertoire. Some models also predict that a sudden loss of sociality or of population will result in subsequent losses of useful skills/technologies. Here, we test these predictions with two experiments that permit learners to access either one or five models (teachers). Experiment 1 demonstrates that naive participants who could observe five models, integrate this information and generate increasingly effective skills (using an image editing tool) over 10 laboratory generations, whereas those with access to only one model show no improvement. Experiment 2, which began with a generation of trained experts, shows how learners with access to only one model lose skills (in knot-tying) more rapidly than those with access to five models. In the final generation of both experiments, all participants with access to five models demonstrate superior skills to those with access to only one model. These results support theoretical predictions linking sociality to cumulative cultural evolution. PMID:24225461
An Approach to Model Based Testing of Multiagent Systems
Nadeem, Aamer
2015-01-01
Autonomous agents perform on behalf of the user to achieve defined goals or objectives. They are situated in a dynamic environment and are able to operate autonomously to achieve their goals. In a multiagent system, agents cooperate with each other to achieve a common goal. Testing of multiagent systems is a challenging task due to the autonomous and proactive behavior of agents. However, testing is required to build confidence into the working of a multiagent system. The Prometheus methodology is a commonly used approach to design multiagent systems. Systematic and thorough testing of each interaction is necessary. This paper proposes a novel approach to testing of multiagent systems based on Prometheus design artifacts. In the proposed approach, different interactions between the agent and actors are considered to test the multiagent system. These interactions include percepts and actions along with messages between the agents, which can be modeled in a protocol diagram. The protocol diagram is converted into a protocol graph, on which different coverage criteria are applied to generate test paths that cover interactions between the agents. A prototype tool has been developed to generate test paths from the protocol graph according to the specified coverage criterion. PMID:25874263
Duffy, Cayla M.; Swanson, Jacob; Northrop, William; Nixon, Joshua P.; Butterick, Tammy A.
2018-01-01
The brain is the central regulator for integration and control of responses to environmental cues. Previous studies suggest that air pollution may directly impact brain health by triggering the onset of chronic neuroinflammation. We hypothesize that nanoparticle components of combustion-generated air pollution may underlie these effects. To test this association, a microglial in vitro biological sensor model was used for testing neuroinflammatory response caused by low-dose nanoparticle exposure. The model was first validated using 20 nm silver nanoparticles (AgNP). Next, neuroinflammatory response was tested after exposure to size-selected 20 nm combustion-generated nanoparticles (CGNP) collected from a modern diesel engine. We show that low concentrations of CGNPs promote low-grade inflammatory response indicated by increased pro-inflammatory cytokine release (tumor necrosis factor-α), similar to that observed after AgNP exposure. We also demonstrate increased production of reactive oxygen species and nuclear factor kappa-light-chain-enhancer of activated B cells (NF-κB) p65 phosphorylation in microglia after CGNP stimulation. Finally, we show conditioned media from CGNP-stimulated microglia significantly reduced hypothalamic neuronal survival in vitro. To our knowledge, this data show for the first time that exposure to AgNP and CGNP elicits microglial neuroinflammatory response through the activation of NF-κB. PMID:29522448
A Model Independent S/W Framework for Search-Based Software Testing
Baik, Jongmoon
2014-01-01
In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of a model changes from one to another, all functions of a search technique must be reimplemented because the model types differ, even if the same search technique is applied. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce this redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when changing the type of a model. PMID:25302314
Sohr-Preston, Sara L.; Scaramella, Laura V.; Martin, Monica J.; Neppl, Tricia K.; Ontai, Lenna; Conger, Rand
2012-01-01
This 3-generation, longitudinal study evaluated a family investment perspective on family socioeconomic status (SES), parental investments in children, and child development. The theoretical framework was tested for first generation parents (G1), their children (G2), and for the children of the second generation (G3). G1 SES was expected to predict clear and responsive parental communication. Parental investments were expected to predict educational attainment and parenting for G2 and vocabulary development for G3. For the 139 families in the study, data were collected when G2 were adolescents and early adults and their oldest biological child (G3) was 3–4 years of age. The results demonstrate the importance of SES and parental investments for the development of children and adolescents across multiple generations. PMID:23199236
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patton, A.D.; Ayoub, A.K.; Singh, C.
1982-07-01
This report describes the structure and operation of prototype computer programs developed for a Monte Carlo simulation model, GENESIS, and for two analytical models, OPCON and OPPLAN. It includes input data requirements and sample test cases.
Methodology for the development of normative data for Spanish-speaking pediatric populations.
Rivera, D; Arango-Lasprilla, J C
2017-01-01
To describe the methodology utilized to calculate reliability and to generate norms for 10 neuropsychological tests for children in Spanish-speaking countries. The study sample consisted of over 4,373 healthy children from nine countries in Latin America (Chile, Cuba, Ecuador, Guatemala, Honduras, Mexico, Paraguay, Peru, and Puerto Rico) and Spain. Inclusion criteria for all countries were age 6 to 17 years, an Intelligence Quotient of ≥80 on the Test of Non-Verbal Intelligence (TONI-2), and a score of <19 on the Children's Depression Inventory. Participants completed 10 neuropsychological tests. Reliability and norms were calculated for all tests. Test-retest analysis showed excellent or good reliability on all tests (r's>0.55; p's<0.001) except M-WCST perseverative errors, whose coefficient magnitude was fair. All scores were normed using multiple linear regressions and standard deviations of residual values. Age, age², sex, and mean level of parental education (MLPE) were included as predictors in the models by country. The non-significant variables (p > 0.05) were removed and the analyses were run again. This is the largest normative study of Spanish-speaking children and adolescents in the world. For the generation of normative data, the method based on linear regression models and the standard deviation of residual values was used. This method allows determination of the specific variables that predict test scores, helps identify and control for collinearity of predictive variables, and generates continuous and more reliable norms than those of traditional methods.
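The norming procedure described above lends itself to a short illustration. The sketch below is a hedged example, not the study's code: it fits a score on age, age squared, sex, and MLPE with ordinary least squares and converts a raw score to a continuous norm (z-score) using the standard deviation of the residuals. The data frame and column names are hypothetical.

```python
# Hedged sketch of regression-based norming (not the study's actual code or data).
# Column names and the input data frame are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def fit_norms(df, score_col):
    """Fit score ~ age + age^2 + sex + MLPE and return the model plus residual SD."""
    X = pd.DataFrame({
        "age": df["age"],
        "age2": df["age"] ** 2,
        "sex": df["sex"],          # e.g. 0 = female, 1 = male
        "mlpe": df["mlpe"],        # mean level of parental education
    })
    X = sm.add_constant(X)
    model = sm.OLS(df[score_col], X).fit()
    # Non-significant predictors (p > 0.05) would be dropped and the model refit,
    # as described in the abstract; that pruning step is omitted here for brevity.
    resid_sd = np.std(model.resid, ddof=X.shape[1])
    return model, resid_sd

def z_score(model, resid_sd, age, sex, mlpe, raw_score):
    """Continuous norm: standardized residual for one child."""
    row = pd.DataFrame({"const": [1.0], "age": [age], "age2": [age ** 2],
                        "sex": [sex], "mlpe": [mlpe]})
    predicted = model.predict(row)[0]
    return (raw_score - predicted) / resid_sd
```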
Life Prediction Model for Grid-Connected Li-ion Battery Energy Storage System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler A; Saxon, Aron R; Keyser, Matthew A
Lithium-ion (Li-ion) batteries are being deployed on the electrical grid for a variety of purposes, such as to smooth fluctuations in solar renewable power generation. The lifetime of these batteries will vary depending on their thermal environment and how they are charged and discharged. Optimal utilization of a battery over its lifetime requires characterization of its performance degradation under different storage and cycling conditions. Aging tests were conducted on commercial graphite/nickel-manganese-cobalt (NMC) Li-ion cells. A general lifetime prognostic model framework is applied to model changes in capacity and resistance as the battery degrades. Across 9 aging test conditions from 0 °C to 55 °C, the model predicts capacity fade with 1.4% RMS error and resistance growth with 15% RMS error. The model, recast in state variable form with 8 states representing separate fade mechanisms, is used to extrapolate lifetime for example applications of the energy storage system integrated with renewable photovoltaic (PV) power generation.
NASA Astrophysics Data System (ADS)
He, Yang-Hui; Jejjala, Vishnu; Matti, Cyril; Nelson, Brent D.; Stillman, Michael
2015-10-01
We present an intriguing and precise interplay between algebraic geometry and the phenomenology of generations of particles. Using the electroweak sector of the MSSM as a testing ground, we compute the moduli space of vacua as an algebraic variety for multiple generations of Standard Model matter and Higgs doublets. The space is shown to have Calabi-Yau, Grassmannian, and toric signatures, which sensitively depend on the number of generations of leptons, as well as inclusion of Majorana mass terms for right-handed neutrinos. We speculate as to why three generations is special.
Smart-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnan, Venkat K; Palmintier, Bryan S; Hodge, Brian S
The National Renewable Energy Laboratory (NREL) in collaboration with Massachusetts Institute of Technology (MIT), Universidad Pontificia Comillas (Comillas-IIT, Spain) and GE Grid Solutions, is working on an ARPA-E GRID DATA project, titled Smart-DS, to create: 1) High-quality, realistic, synthetic distribution network models, and 2) Advanced tools for automated scenario generation based on high-resolution weather data and generation growth projections. Through these advancements, the Smart-DS project is envisioned to accelerate the development, testing, and adoption of advanced algorithms, approaches, and technologies for sustainable and resilient electric power systems, especially in the realm of U.S. distribution systems. This talk will present the goals and overall approach of the Smart-DS project, including the process of creating the synthetic distribution datasets using a reference network model (RNM) and the comprehensive validation process to ensure network realism, feasibility, and applicability to advanced use cases. The talk will provide demonstrations of early versions of synthetic models, along with the lessons learnt from expert engagements to enhance future iterations. Finally, the scenario generation framework, its development plans, and coordination with GRID DATA repository teams to house these datasets for public access will also be discussed.
Maximum wind energy extraction strategies using power electronic converters
NASA Astrophysics Data System (ADS)
Wang, Quincy Qing
2003-10-01
This thesis focuses on maximum wind energy extraction strategies for achieving the highest energy output of variable speed wind turbine power generation systems. Power electronic converters and controls provide the basic platform to accomplish the research of this thesis in both hardware and software aspects. In order to send wind energy to a utility grid, a variable speed wind turbine requires a power electronic converter to convert a variable voltage variable frequency source into a fixed voltage fixed frequency supply. Generic single-phase and three-phase converter topologies, converter control methods for wind power generation, as well as the developed direct drive generator, are introduced in the thesis for establishing variable-speed wind energy conversion systems. Variable speed wind power generation system modeling and simulation are essential methods both for understanding the system behavior and for developing advanced system control strategies. Wind generation system components, including wind turbine, 1-phase IGBT inverter, 3-phase IGBT inverter, synchronous generator, and rectifier, are modeled in this thesis using MATLAB/SIMULINK. The simulation results have been verified by a commercial simulation software package, PSIM, and confirmed by field test results. Since the dynamic time constants for these individual models are much different, a creative approach has also been developed in this thesis to combine these models for entire wind power generation system simulation. An advanced maximum wind energy extraction strategy relies not only on proper system hardware design, but also on sophisticated software control algorithms. Based on a literature review and computer simulation of wind turbine control algorithms, an intelligent maximum wind energy extraction control algorithm is proposed in this thesis. This algorithm has a unique on-line adaptation and optimization capability, which is able to achieve maximum wind energy conversion efficiency through continuously improving the performance of wind power generation systems. This algorithm is independent of wind power generation system characteristics, and does not need wind speed and turbine speed measurements. Therefore, it can be easily implemented into various wind energy generation systems with different turbine inertia and diverse system hardware environments. In addition to the detailed description of the proposed algorithm, computer simulation results are presented in the thesis to demonstrate the advantage of this algorithm. As a final confirmation of the algorithm feasibility, the algorithm has been implemented inside a single-phase IGBT inverter, and tested with a wind simulator system in a research laboratory. Test results were found consistent with the simulation results. (Abstract shortened by UMI.)
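The abstract stresses that the proposed controller needs only electrical feedback, with no wind-speed or turbine-speed sensors. The sketch below is not the thesis's algorithm; it is a generic perturb-and-observe hill-climb that shares that property, nudging a converter setpoint and keeping changes that increase measured output power. The `read_power` and `apply_setpoint` callbacks are hypothetical placeholders for plant interfaces.

```python
# Hedged illustration only: a generic perturb-and-observe (hill-climbing) maximum
# power tracker. It needs only electrical power feedback, not wind-speed or
# turbine-speed measurements, which is the property highlighted in the abstract.
def perturb_and_observe(read_power, apply_setpoint, u0=0.5, step=0.01,
                        u_min=0.0, u_max=1.0, iterations=1000):
    """Climb the power curve by nudging a converter setpoint (e.g., duty cycle)."""
    u = u0
    apply_setpoint(u)
    p_prev = read_power()
    direction = +1
    for _ in range(iterations):
        u = min(max(u + direction * step, u_min), u_max)
        apply_setpoint(u)
        p = read_power()
        if p < p_prev:          # power dropped: reverse the search direction
            direction = -direction
        p_prev = p
    return u
```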
Menze, Bjoern H.; Van Leemput, Koen; Lashkari, Danial; Riklin-Raviv, Tammy; Geremia, Ezequiel; Alberts, Esther; Gruber, Philipp; Wegener, Susanne; Weber, Marc-André; Székely, Gabor; Ayache, Nicholas; Golland, Polina
2016-01-01
We introduce a generative probabilistic model for segmentation of brain lesions in multi-dimensional images that generalizes the EM segmenter, a common approach for modelling brain images using Gaussian mixtures and a probabilistic tissue atlas that employs expectation-maximization (EM) to estimate the label map for a new image. Our model augments the probabilistic atlas of the healthy tissues with a latent atlas of the lesion. We derive an estimation algorithm with closed-form EM update equations. The method extracts a latent atlas prior distribution and the lesion posterior distributions jointly from the image data. It delineates lesion areas individually in each channel, allowing for differences in lesion appearance across modalities, an important feature of many brain tumor imaging sequences. We also propose discriminative model extensions to map the output of the generative model to arbitrary labels with semantic and biological meaning, such as “tumor core” or “fluid-filled structure”, but without a one-to-one correspondence to the hypo-or hyper-intense lesion areas identified by the generative model. We test the approach in two image sets: the publicly available BRATS set of glioma patient scans, and multimodal brain images of patients with acute and subacute ischemic stroke. We find the generative model that has been designed for tumor lesions to generalize well to stroke images, and the generative-discriminative model to be one of the top ranking methods in the BRATS evaluation. PMID:26599702
The S-curve for forecasting waste generation in construction projects.
Lu, Weisheng; Peng, Yi; Chen, Xi; Skitmore, Martin; Zhang, Xiaoling
2016-10-01
Forecasting construction waste generation is the yardstick of any effort by policy-makers, researchers, practitioners and the like to manage construction and demolition (C&D) waste. This paper develops and tests an S-curve model to indicate cumulative waste generation as a project progresses. Using 37,148 disposal records generated from 138 building projects in Hong Kong in four consecutive years from January 2011 to June 2015, a wide range of potential S-curve models are examined, and as a result, the formula that best fits the historical data set is found. The S-curve model is then further linked to project characteristics using artificial neural networks (ANNs) so that it can be used to forecast waste generation in future construction projects. It was found that, among the S-curve models, cumulative logistic distribution is the best formula to fit the historical data. Meanwhile, contract sum, location, public-private nature, and duration can be used to forecast construction waste generation. The study provides contractors with not only an S-curve model to forecast overall waste generation before a project commences, but also with a detailed baseline to benchmark and manage waste during the course of construction. The major contribution of this paper is to the body of knowledge in the field of construction waste generation forecasting. By examining it with an S-curve model, the study elevates construction waste management to a level equivalent to project cost management where the model has already been readily accepted as a standard tool. Copyright © 2016 Elsevier Ltd. All rights reserved.
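A minimal sketch of the core fitting step, assuming hypothetical progress/waste observations: a cumulative-logistic S-curve, the form the paper found to fit best, is fitted with nonlinear least squares. The subsequent ANN link to project characteristics is not reproduced.

```python
# Minimal sketch (not the paper's code): fitting a cumulative-logistic S-curve to
# cumulative waste generation versus project progress. The toy data are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def s_curve(progress, total_waste, midpoint, scale):
    """Cumulative logistic: waste accumulated by a given project progress (0-1)."""
    return total_waste / (1.0 + np.exp(-(progress - midpoint) / scale))

# Hypothetical observations: project progress (0-1) and cumulative waste (tonnes).
progress = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0])
cum_waste = np.array([5, 15, 40, 90, 160, 230, 280, 310, 325, 330], dtype=float)

params, _ = curve_fit(s_curve, progress, cum_waste, p0=[330.0, 0.5, 0.1])
total_waste, midpoint, scale = params
print(f"forecast total waste ~ {total_waste:.0f} t, fastest generation near {midpoint:.0%} progress")
```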
DOE Office of Scientific and Technical Information (OSTI.GOV)
Picologlou, B F; Batenin, V M
1981-01-01
A description of the main results obtained during Tests No. 6 and 7 at the U-25B Facility using the new channel No. 2 is presented. The purpose of these tests was to operate the MHD generator at its design parameters. Described here are new plasma diagnostic devices: a traversing dual electrical probe for determining distribution of electron concentrations, and a traversing probe that includes a pitot tube for measuring total and static pressure, and a light detector for measuring plasma luminescence. Data are presented on heat flux distribution along the channel, the first data of this type obtained for an MHD facility of such size. Results are given of experimental studies of plasma characteristics, gasdynamic, thermal, and electrical MHD channel performance, and temporal and spatial nonuniformities. Typical modes of operation are analyzed by means of local electrical analyses. Computer models are used to obtain predictions for both localized and overall generator characteristics. These theoretical predictions agree closely with the results of the local analyses, as well as with measurements of the overall gasdynamic and electrical characteristics of the generator.
Automated Item Generation with Recurrent Neural Networks.
von Davier, Matthias
2018-03-12
Utilizing technology for automated item generation is not a new idea. However, test items used in commercial testing programs or in research are still predominantly written by humans, in most cases by content experts or professional item writers. Human experts are a limited resource and testing agencies incur high costs in the process of continuous renewal of item banks to sustain testing programs. Using algorithms instead holds the promise of providing unlimited resources for this crucial part of assessment development. The approach presented here deviates in several ways from previous attempts to solve this problem. In the past, automatic item generation relied either on generating clones of narrowly defined item types such as those found in language free intelligence tests (e.g., Raven's progressive matrices) or on an extensive analysis of task components and derivation of schemata to produce items with pre-specified variability that are hoped to have predictable levels of difficulty. It is somewhat unlikely that researchers utilizing these previous approaches would look at the proposed approach with favor; however, recent applications of machine learning show success in solving tasks that seemed impossible for machines not too long ago. The proposed approach uses deep learning to implement probabilistic language models, not unlike what Google brain and Amazon Alexa use for language processing and generation.
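As a rough illustration of the kind of neural language model discussed, the sketch below trains a tiny character-level LSTM on a placeholder corpus and samples new text from it. It is not the author's system; the toy corpus, model size, and training budget are assumptions chosen only to keep the example self-contained.

```python
# Hedged, minimal character-level LSTM language model in PyTorch, illustrating the
# general idea of generating item text with a neural language model.
import torch
import torch.nn as nn

corpus = "Which value of x satisfies 2x + 3 = 11? " * 50   # placeholder "item bank"
chars = sorted(set(corpus))
stoi = {c: i for i, c in enumerate(chars)}

class CharLM(nn.Module):
    def __init__(self, vocab, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, idx, state=None):
        h, state = self.lstm(self.embed(idx), state)
        return self.head(h), state

model = CharLM(len(chars))
optimizer = torch.optim.Adam(model.parameters(), lr=3e-3)
data = torch.tensor([stoi[c] for c in corpus]).unsqueeze(0)

for _ in range(200):                          # quick training on the toy corpus
    logits, _ = model(data[:, :-1])
    loss = nn.functional.cross_entropy(logits.reshape(-1, len(chars)),
                                       data[:, 1:].reshape(-1))
    optimizer.zero_grad(); loss.backward(); optimizer.step()

# Sample a new "item" one character at a time.
idx, state, out = data[:, :1], None, []
for _ in range(60):
    logits, state = model(idx, state)
    probs = torch.softmax(logits[:, -1], dim=-1)
    idx = torch.multinomial(probs, 1)
    out.append(chars[idx.item()])
print("".join(out))
```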
Metallic Rotor Sizing and Performance Model for Flywheel Systems
NASA Technical Reports Server (NTRS)
Moore, Camille J.; Kraft, Thomas G.
2012-01-01
The NASA Glenn Research Center (GRC) is developing flywheel system requirements and designs for terrestrial and spacecraft applications. Several generations of flywheels have been designed and tested at GRC using in-house expertise in motors, magnetic bearings, controls, materials and power electronics. The maturation of a flywheel system from the concept phase to the preliminary design phase is accompanied by maturation of the Integrated Systems Performance model, where estimating relationships are replaced by physics based analytical techniques. The modeling can incorporate results from engineering model testing and emerging detail from the design process.
Fracture prediction using modified mohr coulomb theory for non-linear strain paths using AA3104-H19
NASA Astrophysics Data System (ADS)
Dick, Robert; Yoon, Jeong Whan
2016-08-01
Experimental results from uniaxial tensile tests, bi-axial bulge tests, and disk compression tests for a beverage can AA3104-H19 material are presented. The results from the experimental tests are used to determine material coefficients for both Yld2000 and Yld2004 models. Finite element simulations are developed to study the influence of the material model on the predicted earing profile. It is shown that only the Yld2004 model is capable of accurately predicting the earing profile, as the Yld2000 model only predicts 4 ears. Excellent agreement with the experimental data for earing is achieved using the AA3104-H19 material data and the Yld2004 constitutive model. Mechanical tests are also conducted on the AA3104-H19 to generate fracture data under different stress triaxiality conditions. Tensile tests are performed on specimens with a central hole and notched specimens. Torsion of a double bridge specimen is conducted to generate points near pure shear conditions. The Nakajima test is utilized to produce points in bi-axial tension. The data from the experiments are used to develop the fracture locus in the principal strain space. Mapping from principal strain space to stress triaxiality space, principal stress space, and polar effective plastic strain space is accomplished using a generalized mapping technique. Finite element modeling is used to validate the Modified Mohr-Coulomb (MMC) fracture model in the polar space. Models of a hole expansion during cup drawing and a cup draw/reverse redraw/expand forming sequence demonstrate the robustness of the modified PEPS fracture theory under nonlinear forming paths and accurately predict the onset of failure. The proposed methods can be widely used for predicting failure in applications that undergo nonlinear strain paths, including rigid packaging and automotive forming.
Automated knowledge generation
NASA Technical Reports Server (NTRS)
Myler, Harley R.; Gonzalez, Avelino J.
1988-01-01
The general objectives of the NASA/UCF Automated Knowledge Generation Project were the development of an intelligent software system that could access CAD design data bases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype was developed, based on an object-oriented language (Flavors).
NASA Technical Reports Server (NTRS)
Henderson, Brenda S.; Doty, Mike
2012-01-01
Acoustic and flow-field experiments were conducted on exhaust concepts for the next-generation supersonic commercial aircraft. The concepts were developed by Lockheed Martin (LM), Rolls-Royce Liberty Works (RRLW), and General Electric Global Research (GEGR) as part of an N+2 (next generation forward) aircraft system study initiated by the Supersonics Project in NASA's Fundamental Aeronautics Program. The experiments were conducted in the Aero-Acoustic Propulsion Laboratory at the NASA Glenn Research Center. The exhaust concepts presented here utilized lobed-mixers and ejectors. A powered third-stream was implemented to improve ejector acoustic performance. One concept was found to produce stagnant flow within the ejector and the other produced discrete-frequency tones (due to flow separations within the model) that degraded the acoustic performance of the exhaust concept. NASA's Environmentally Responsible Aviation (ERA) Project has been investigating a Hybrid Wing Body (HWB) aircraft as a possible configuration for meeting N+2 system level goals for noise, emissions, and fuel burn. A recently completed NRA led by Boeing Research and Technology resulted in a full-scale aircraft design and wind tunnel model. This model will be tested acoustically in NASA Langley's 14- by 22-Foot Subsonic Tunnel and will include dual jet engine simulators and broadband engine noise simulators as part of the test campaign. The objectives of the test are to characterize the system level noise, quantify the effects of shielding, and generate a valuable database for prediction method development. Further details of the test and various component preparations are described.
Hase, E; Sato, K; Yonekura, D; Minamikawa, T; Takahashi, M; Yasui, T
2016-11-01
This study aimed to evaluate the histological and mechanical features of tendon healing in a rabbit model with second-harmonic-generation (SHG) imaging and tensile testing. A total of eight male Japanese white rabbits were used for this study. The flexor digitorum tendons in their right leg were sharply transected, and then were repaired by intratendinous stitching. At four weeks post-operatively, the rabbits were killed and the flexor digitorum tendons in both right and left legs were excised and used as specimens for tendon healing (n = 8) and control (n = 8), respectively. Each specimen was examined by SHG imaging, followed by tensile testing, and the results of the two testing modalities were assessed for correlation. While the SHG light intensity of the healing tendon samples was significantly lower than that of the uninjured tendon samples, 2D Fourier transform SHG images showed a clear difference in collagen fibre structure between the uninjured and the healing samples, and among the healing samples. The mean intensity of the SHG image showed a moderate correlation (R² = 0.37) with Young's modulus obtained from the tensile testing. Our results indicate that SHG microscopy may be a potential indicator of tendon healing. Cite this article: E. Hase, K. Sato, D. Yonekura, T. Minamikawa, M. Takahashi, T. Yasui. Evaluation of the histological and mechanical features of tendon healing in a rabbit model with the use of second-harmonic-generation imaging and tensile testing. Bone Joint Res 2016;5:577-585. DOI: 10.1302/2046-3758.511.BJR-2016-0162.R1. © 2016 Yasui et al.
NASA Astrophysics Data System (ADS)
Roadman, Jason Markos
Modern technology operating in the atmospheric boundary layer can always benefit from more accurate wind tunnel testing. While scaled atmospheric boundary layer tunnels have been well developed, tunnels replicating portions of the atmospheric boundary layer turbulence at full scale are a comparatively new concept. Testing at full-scale Reynolds numbers with full-scale turbulence in an "atmospheric wind tunnel" is sought. Many programs could utilize such a tool, including Micro Aerial Vehicle (MAV) development, the wind energy industry, fuel-efficient vehicle design, and the study of bird and insect flight, to name just a few. The small scale of MAVs provides the somewhat unique capability of full-scale Reynolds number testing in a wind tunnel. However, that same small scale creates interactions under real world flight conditions, atmospheric gusts for example, that lead to a need for testing under more complex flows than the standard uniform flow found in most wind tunnels. It is for these reasons that MAVs are used as the initial testing application for the atmospheric gust tunnel. An analytical model for both discrete gusts and a continuous spectrum of gusts is examined. Then, methods for generating gusts in agreement with that model are investigated. Previously used methods are reviewed and a gust generation apparatus is designed. Expected turbulence and gust characteristics of this apparatus are compared with atmospheric data. The construction of an active "gust generator" for a new atmospheric tunnel is reviewed and the turbulence it generates is measured utilizing single and cross hot wires. Results from this grid are compared to atmospheric turbulence and it is shown that various gust strengths can be produced corresponding to weather ranging from calm to quite gusty. An initial test is performed in the atmospheric wind tunnel whereby the effects of various turbulence conditions on transition and separation on the upper surface of a MAV wing are investigated using the surface oil flow visualization technique.
Infrared thermography non-destructive evaluation of lithium-ion battery
NASA Astrophysics Data System (ADS)
Wang, Zi-jun; Li, Zhi-qiang; Liu, Qiang
2011-08-01
The power lithium-ion battery, with its high specific energy, high theoretical capacity, and good cycle-life, is a prime candidate as a power source for electric vehicles (EVs) and hybrid electric vehicles (HEVs). Safety is especially important for large-scale lithium-ion batteries, and thermal analysis in particular is essential for their development and design. Thermal modeling is an effective way to understand the thermal behavior of the lithium-ion battery during charging and discharging. During charging and discharging, the internal heat generation of the lithium-ion battery becomes large, and the resulting temperature rise leads to an uneven temperature distribution that induces partial degradation. Infrared (IR) Non-destructive Evaluation (NDE) has been well developed over decades for materials, structures, and aircraft. Most thermographic methods need thermal excitation of the measured structure. In NDE of a battery, the thermal excitation is the heat generated from the carbon and cobalt electrodes in the electrolyte. A technique named "power function" has been developed to determine the heat released by chemical reactions. In this paper, simulations of the transient response of the temperature distribution in the lithium-ion battery are developed. The key to resolving the safety problem lies in thermal control, including the heat generation and the internal and external heat transfer. Therefore, three-dimensional modelling for capturing geometrical thermal effects on battery thermal abuse behaviour is required. The simulation model contains the heat generation during electrolyte decomposition and an electrical resistance component. Oven tests are simulated by the three-dimensional model, and the discharge test is performed on the test system. Infrared thermography of the discharge is recorded in order to analyze the safety of the lithium-ion power battery. Nondestructive detection is performed for thermal abuse analysis and discharge analysis.
NASA Astrophysics Data System (ADS)
Kut, Stanislaw; Ryzinska, Grazyna; Niedzialek, Bernadetta
2016-01-01
The article presents the results of tests verifying the effectiveness of nine selected elastomeric material models (Neo-Hookean, Mooney with two and three constants, Signorini, Yeoh, Ogden, Arruda-Boyce, Gent, and Marlow), whose material constants were determined from a single material test: uniaxial tension testing. The convergence assessment of the nine analyzed models was made by comparing their performance in an experimental bending test of elastomer samples with the results of FEM numerical calculations for each material model. To calculate the material constants for the analyzed materials, a model was generated from the stress-strain characteristics obtained in an experimental uniaxial tensile test on elastomeric dumbbell samples, taking into account the parameters recorded in its 18th cycle. Using the material constants calculated in this way, numerical simulation of the bending process of elastomeric, parallelepipedic samples was carried out using the MARC/Mentat program.
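A hedged sketch of how constants for the simplest of the listed models could be identified from a uniaxial tension test, assuming the standard incompressible nominal-stress relations for the Neo-Hookean and two-constant Mooney-Rivlin forms; the stretch/stress data points are hypothetical, not the article's measurements.

```python
# Hedged sketch (not the paper's procedure): determining hyperelastic constants from
# a uniaxial tension test using the standard incompressible nominal-stress relations.
import numpy as np
from scipy.optimize import curve_fit

def neo_hookean(stretch, c1):
    # Nominal (engineering) stress for incompressible Neo-Hookean, uniaxial tension.
    return 2.0 * c1 * (stretch - stretch ** -2)

def mooney_rivlin_2(stretch, c1, c2):
    # Two-constant Mooney-Rivlin nominal stress, uniaxial tension.
    return 2.0 * (c1 + c2 / stretch) * (stretch - stretch ** -2)

stretch = np.array([1.05, 1.1, 1.2, 1.4, 1.6, 1.8, 2.0])          # lambda
stress = np.array([0.11, 0.21, 0.40, 0.70, 0.95, 1.18, 1.40])     # MPa (hypothetical)

(c1_nh,), _ = curve_fit(neo_hookean, stretch, stress, p0=[0.5])
(c1_mr, c2_mr), _ = curve_fit(mooney_rivlin_2, stretch, stress, p0=[0.3, 0.2])
print(f"Neo-Hookean C1 = {c1_nh:.3f} MPa; Mooney-Rivlin C10 = {c1_mr:.3f}, C01 = {c2_mr:.3f} MPa")
```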
Intrusion detection using rough set classification.
Zhang, Lian-hua; Zhang, Guan-hua; Zhang, Jie; Bai, Ying-cai
2004-09-01
Recently, machine learning-based intrusion detection approaches have been the subject of extensive research because they can detect both misuse and anomaly. In this paper, rough set classification (RSC), a modern learning algorithm, is used to rank the features extracted for detecting intrusions and to generate intrusion detection models. Feature ranking is a very critical step when building the model. RSC performs feature ranking before generating rules, and converts the feature ranking to a minimal hitting set problem addressed by using a genetic algorithm (GA). In classical approaches using the Support Vector Machine (SVM), this is done by executing many iterations, each of which removes one useless feature. Compared with those methods, our method can avoid many iterations. In addition, a hybrid genetic algorithm is proposed to increase the convergence speed and decrease the training time of RSC. The models generated by RSC take the form of "IF-THEN" rules, which have the advantage of explainability. Tests and comparison of RSC with SVM on DARPA benchmark data showed that for Probe and DoS attacks both RSC and SVM yielded highly accurate results (greater than 99% accuracy on the testing set).
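To make the reduction to a minimal hitting set concrete, the sketch below runs a plain genetic algorithm over hypothetical discernibility sets; it is an illustration of the idea, not the paper's hybrid GA.

```python
# Hedged sketch (not the paper's implementation): a plain genetic algorithm for the
# minimal hitting set problem that rough set feature ranking reduces to. Each set
# lists feature indices, and a valid solution must hit (contain a member of) every set.
import random

def ga_hitting_set(sets, n_features, pop_size=60, generations=200, p_mut=0.05):
    def fitness(mask):
        hits = sum(any(mask[f] for f in s) for s in sets)
        return hits * n_features - sum(mask)   # cover all sets first, then shrink

    pop = [[random.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n_features)            # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]
            children.append(child)
        pop = survivors + children
    best = max(pop, key=fitness)
    return [i for i, g in enumerate(best) if g]

# Toy discernibility sets over 6 features.
print(ga_hitting_set([{0, 2}, {1, 2, 3}, {2, 4}, {3, 5}], n_features=6))
```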
NASA Technical Reports Server (NTRS)
Kadambi, J. R.; Schneider, S. J.; Stewart, W. A.
1986-01-01
The natural circulation of a single-phase fluid in a scale model of a pressurized water reactor system during a postulated degraded-core accident is analyzed. The fluids utilized were water and SF6. The design of the reactor model and the similitude requirements are described. Four LDA tests were conducted: water with 28 kW of heat in the simulated core, with and without the participation of simulated steam generators; water with 28 kW of heat in the simulated core, with the participation of simulated steam generators and with cold upflow of 12 lbm/min from the lower plenum; and SF6 with 0.9 kW of heat in the simulated core and without the participation of the simulated steam generators. For the water tests, the velocity of the water in the center of the core increases with vertical height and continues to increase in the upper plenum. For SF6, it is observed that the velocities are an order of magnitude higher than those of water; however, the velocity patterns are similar.
Electric Water Heater Modeling and Control Strategies for Demand Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diao, Ruisheng; Lu, Shuai; Elizondo, Marcelo A.
2012-07-22
Abstract— Demand response (DR) has great potential to provide balancing services under normal operating conditions and emergency support when a power system is subject to disturbances. Effective control strategies can significantly relieve the balancing burden of conventional generators and reduce investment in generation and transmission expansion. This paper is aimed at modeling electric water heaters (EWH) in households and testing their response to control strategies that implement DR. The open-loop response of EWH to a centralized signal is studied by adjusting temperature settings to provide regulation services, and two types of decentralized controllers are tested to provide frequency support following generator trips. EWH models are included in a simulation platform in DIgSILENT to perform electromechanical simulation, which contains 147 households in a distribution feeder. Simulation results show the dependence of EWH response on water heater usage. These results provide insight into the control strategies needed to achieve better performance in demand response implementation. Index Terms— Centralized control, decentralized control, demand response, electrical water heater, smart grid
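A common way to model an EWH for such studies is a single-node tank with thermostat deadband control. The sketch below is a hedged illustration with placeholder parameters, not the paper's DIgSILENT model.

```python
# Hedged sketch of a single-node electric water heater with thermostat deadband
# control, of the kind commonly used in demand-response studies; parameters are
# illustrative placeholders.
def simulate_ewh(hours=24.0, dt_s=60.0, setpoint=50.0, deadband=2.0,
                 tank_volume_l=200.0, power_w=4500.0, ua_w_per_k=3.0,
                 ambient_c=20.0, draw_lpm=0.0, inlet_c=15.0):
    rho_cp = 4186.0                      # J/(kg K) for water; 1 L is roughly 1 kg
    c_tank = tank_volume_l * rho_cp
    temp, heating = setpoint, False
    energy_kwh = 0.0
    for _ in range(int(hours * 3600 / dt_s)):
        q_loss = ua_w_per_k * (temp - ambient_c)                 # standby loss, W
        q_draw = draw_lpm / 60.0 * rho_cp * (temp - inlet_c)     # hot-water draw, W
        q_heat = power_w if heating else 0.0
        temp += dt_s * (q_heat - q_loss - q_draw) / c_tank
        # Thermostat with deadband; a DR controller would adjust the setpoint here.
        if temp < setpoint - deadband:
            heating = True
        elif temp > setpoint + deadband:
            heating = False
        energy_kwh += q_heat * dt_s / 3.6e6
    return temp, energy_kwh

print(simulate_ewh(draw_lpm=0.5))
```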
Coupling of electromagnetic and structural dynamics for a wind turbine generator
NASA Astrophysics Data System (ADS)
Matzke, D.; Rick, S.; Hollas, S.; Schelenz, R.; Jacobs, G.; Hameyer, K.
2016-09-01
This contribution presents a model interface of a wind turbine generator to represent the reciprocal effects between the mechanical and the electromagnetic system. Therefore, a multi-body-simulation (MBS) model in Simpack is set up and coupled with a quasi-static electromagnetic (EM) model of the generator in Matlab/Simulink via co-simulation. Due to a lack of data regarding the structural properties of the generator, the modal properties of the MBS model are fitted with respect to the results of an experimental modal analysis (EMA) on the reference generator. The method used and the results of this approach are presented in this paper. The MBS model and the interface are set up in such a way that the EM forces can be applied to the structure and the response of the structure can be fed back to the EM model. The results of this co-simulation clearly show an influence of the feedback of the mechanical response, which is mainly damping in the torsional degree of freedom and effects due to eccentricity in the radial direction. The accuracy of these results will be validated via test bench measurements and presented in future work. Furthermore, it is suggested that the EM model be adjusted in future work so that transient effects are represented.
Summary of CPAS Gen II Parachute Analysis
NASA Technical Reports Server (NTRS)
Morris, Aaron L.; Bledsoe, Kristin J.; Fraire, Usbaldo, Jr.; Moore, James W.; Olson, Leah M.; Ray, Eric
2011-01-01
The Orion spacecraft is currently under development by NASA and Lockheed Martin. Like Apollo, Orion will use a series of parachutes to slow its descent and splashdown safely. The Orion parachute system, known as the CEV Parachute Assembly System (CPAS), is being designed by NASA, the Engineering and Science Contract Group (ESCG), and Airborne Systems. The first generation (Gen I) of CPAS testing consisted of thirteen tests and was executed in the 2007-2008 timeframe. The Gen I tests provided an initial understanding of the CPAS parachutes. Knowledge gained from Gen I testing was used to plan the second generation of testing (Gen II). Gen II consisted of six tests: three singleparachute tests, designated as Main Development Tests, and three Cluster Development Tests. Gen II required a more thorough investigation into parachute performance than Gen I. Higher fidelity instrumentation, enhanced analysis methods and tools, and advanced test techniques were developed. The results of the Gen II test series are being incorporated into the CPAS design. Further testing and refinement of the design and model of parachute performance will occur during the upcoming third generation of testing (Gen III). This paper will provide an overview of the developments in CPAS analysis following the end of Gen I, including descriptions of new tools and techniques as well as overviews of the Gen II tests.
Speech-discrimination scores modeled as a binomial variable.
Thornton, A R; Raffin, M J
1978-09-01
Many studies have reported variability data for tests of speech discrimination, and the disparate results of these studies have not been given a simple explanation. Arguments over the relative merits of 25- vs 50-word tests have ignored the basic mathematical properties inherent in the use of percentage scores. The present study models performance on clinical tests of speech discrimination as a binomial variable. A binomial model was developed, and some of its characteristics were tested against data from 4120 scores obtained on the CID Auditory Test W-22. A table for determining significant deviations between scores was generated and compared to observed differences in half-list scores for the W-22 tests. Good agreement was found between predicted and observed values. Implications of the binomial characteristics of speech-discrimination scores are discussed.
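A simplified numerical illustration of the binomial argument, assuming a normal approximation rather than the paper's exact table-generation procedure: the critical difference between two scores shrinks as list length grows, which is the crux of the 25- versus 50-word debate.

```python
# Simplified sketch (not the paper's exact procedure): treat a word-list score as a
# binomial variable and ask how far apart two scores from n-item lists must be to
# exceed chance variation, using a normal approximation to the binomial.
import math

def critical_difference(score_pct, n_items, z=1.96):
    """Approximate 95% critical difference (percentage points) between two
    independent scores when the true recognition probability is score_pct/100."""
    p = score_pct / 100.0
    sd_one = math.sqrt(p * (1.0 - p) / n_items)       # SD of one score (proportion)
    sd_diff = math.sqrt(2.0) * sd_one                 # SD of the score difference
    return 100.0 * z * sd_diff

# Example: a score of 80% on 50-word lists versus 25-word lists.
print(round(critical_difference(80, 50), 1), "points on 50-word lists")
print(round(critical_difference(80, 25), 1), "points on 25-word lists")
```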
Crash Simulation of a Vertical Drop Test of a B737 Fuselage Section with Overhead Bins and Luggage
NASA Technical Reports Server (NTRS)
Jackson, Karen E.; Fasanella, Edwin L.
2004-01-01
The focus of this paper is to describe a crash simulation of a 30-ft/s vertical drop test of a Boeing 737 (B737) fuselage section. The drop test of the 10-ft. long fuselage section of a B737 aircraft was conducted in November of 2000 at the FAA Technical Center in Atlantic City, NJ. The fuselage section was outfitted with two different commercial overhead stowage bins. In addition, 3,229-lbs. of luggage were packed in the cargo hold to represent a maximum take-off weight condition. The main objective of the test was to evaluate the response and failure modes of the overhead stowage bins in a narrow-body transport fuselage section when subjected to a severe, but survivable, impact. A secondary objective of the test was to generate experimental data for correlation with the crash simulation. A full-scale 3-dimensional finite element model of the fuselage section was developed and a crash simulation was conducted using the explicit, nonlinear transient dynamic code, MSC.Dytran. Pre-test predictions of the fuselage and overhead bin responses were generated for correlation with the drop test data. A description of the finite element model and an assessment of the analytical/experimental correlation are presented. In addition, suggestions for modifications to the model to improve correlation are proposed.
Baxter, Nielson T; Koumpouras, Charles C; Rogers, Mary A M; Ruffin, Mack T; Schloss, Patrick D
2016-11-14
There is a significant demand for colorectal cancer (CRC) screening methods that are noninvasive, inexpensive, and capable of accurately detecting early stage tumors. It has been shown that models based on the gut microbiota can complement the fecal occult blood test and fecal immunochemical test (FIT). However, a barrier to microbiota-based screening is the need to collect and store a patient's stool sample. Using stool samples collected from 404 patients, we tested whether the residual buffer containing resuspended feces in FIT cartridges could be used in place of intact stool samples. We found that the bacterial DNA isolated from FIT cartridges largely recapitulated the community structure and membership of patients' stool microbiota and that the abundance of bacteria associated with CRC were conserved. We also found that models for detecting CRC that were generated using bacterial abundances from FIT cartridges were equally predictive as models generated using bacterial abundances from stool. These findings demonstrate the potential for using residual buffer from FIT cartridges in place of stool for microbiota-based screening for CRC. This may reduce the need to collect and process separate stool samples and may facilitate combining FIT and microbiota-based biomarkers into a single test. Additionally, FIT cartridges could constitute a novel data source for studying the role of the microbiome in cancer and other diseases.
NASA Astrophysics Data System (ADS)
Sun, Hao; Wang, Cheng; Wang, Boliang
2011-02-01
We present a hybrid generative-discriminative learning method for human action recognition from video sequences. Our model combines a bag-of-words component with supervised latent topic models. A video sequence is represented as a collection of spatiotemporal words by extracting space-time interest points and describing these points using both shape and motion cues. The supervised latent Dirichlet allocation (sLDA) topic model, which employs discriminative learning using labeled data under a generative framework, is introduced to discover the latent topic structure that is most relevant to action categorization. The proposed algorithm retains most of the desirable properties of generative learning while increasing the classification performance though a discriminative setting. It has also been extended to exploit both labeled data and unlabeled data to learn human actions under a unified framework. We test our algorithm on three challenging data sets: the KTH human motion data set, the Weizmann human action data set, and a ballet data set. Our results are either comparable to or significantly better than previously published results on these data sets and reflect the promise of hybrid generative-discriminative learning approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Shaobu; Lu, Shuai; Zhou, Ning
In interconnected power systems, dynamic model reduction can be applied to generators outside the area of interest to mitigate the computational cost of transient stability studies. This paper presents an approach for deriving the reduced dynamic model of the external area based on dynamic response measurements, which comprises three steps: dynamic-feature extraction, attribution, and reconstruction (DEAR). In the DEAR approach, a feature extraction technique, such as singular value decomposition (SVD), is applied to the measured generator dynamics after a disturbance. Characteristic generators are then identified in the feature attribution step by matching the extracted dynamic features with the highest similarity, forming a suboptimal 'basis' of system dynamics. In the reconstruction step, generator state variables such as rotor angles and voltage magnitudes are approximated with a linear combination of the characteristic generators, resulting in a quasi-nonlinear reduced model of the original external system. The network model is unchanged in the DEAR method. Tests on several IEEE standard systems show that the proposed method achieves better reduction ratios and smaller response errors than traditional coherency aggregation methods.
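The three DEAR steps can be sketched compactly with SVD, a simple similarity score, and least squares. The example below is a hedged illustration on a hypothetical measurement matrix; the paper's actual similarity metric and model handling are not reproduced.

```python
# Hedged sketch of the DEAR idea (feature extraction, attribution, reconstruction);
# the measurement matrix is hypothetical.
import numpy as np

def dear_reduce(responses, n_features, n_characteristic):
    """responses: (n_samples x n_generators) matrix of measured post-disturbance
    dynamics (e.g., rotor angles). Returns characteristic generator indices and the
    coefficients that reconstruct every generator from them."""
    # 1. Feature extraction: dominant left singular vectors of the response matrix.
    U, s, Vt = np.linalg.svd(responses, full_matrices=False)
    features = U[:, :n_features]                      # dominant dynamic shapes

    # 2. Feature attribution: pick generators whose responses best match the features.
    scores = np.abs(features.T @ responses)           # similarity per generator
    characteristic = sorted({int(np.argmax(row)) for row in scores})[:n_characteristic]

    # 3. Reconstruction: least-squares coefficients expressing all generators as
    #    linear combinations of the characteristic ones.
    basis = responses[:, characteristic]
    coeffs, *_ = np.linalg.lstsq(basis, responses, rcond=None)
    return characteristic, coeffs
```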
Rotating rake design for unique measurement of fan-generated spinning acoustic modes
NASA Technical Reports Server (NTRS)
Konno, Kevin E.; Hausmann, Clifford R.
1993-01-01
In light of the current emphasis on noise reduction in subsonic aircraft design, NASA has been actively studying the source of and propagation of noise generated by subsonic fan engines. NASA/LeRC has developed and tested a unique method of accurately measuring these spinning acoustic modes generated by an experimental fan. This mode measuring method is based on the use of a rotating microphone rake. Testing was conducted in the 9 x 15 Low-speed Wind Tunnel. The rotating rake was tested with the Advanced Ducted Propeller (ADP) model. This memorandum discusses the design and performance of the motor/drive system for the fan-synchronized rotating acoustic rake. This novel motor/drive design approach is now being adapted for additional acoustic mode studies in new test rigs as baseline data for the future design of active noise control for subsonic fan engines. Included in this memorandum are the research requirements, motor/drive specifications, test performance results, and a description of the controls and software involved.
NASA Technical Reports Server (NTRS)
Schlundt, D. W.
1976-01-01
The installed performance degradation of a swivel nozzle thrust deflector system obtained during increased vectoring angles of a large-scale test program was investigated and improved. Small-scale models were used to generate performance data for analyzing selected swivel nozzle configurations. A single-swivel nozzle design model with five different nozzle configurations and a twin-swivel nozzle design model, scaled to 0.15 size of the large-scale test hardware, were statically tested at low exhaust pressure ratios of 1.4, 1.3, 1.2, and 1.1 and vectored at four nozzle positions from 0 deg cruise through 90 deg vertical used for the VTOL mode.
ERIC Educational Resources Information Center
Bryant, Alyssa N.
2011-01-01
Based upon a national longitudinal dataset of 14,527 college students generated by the UCLA Spirituality in Higher Education Project, this study used structural equation modeling to test the applicability of a model of ecumenical worldview development for students of diverse genders, races, and worldviews. The model suggests that challenging…
Booth, James F; Naud, Catherine M; Willison, Jeff
2018-03-01
The representation of extratropical cyclones (ETCs) precipitation in general circulation models (GCMs) and a weather research and forecasting (WRF) model is analyzed. This work considers the link between ETC precipitation and dynamical strength and tests if parameterized convection affects this link for ETCs in the North Atlantic Basin. Lagrangian cyclone tracks of ETCs in ERA-Interim reanalysis (ERAI), the GISS and GFDL CMIP5 models, and WRF with two horizontal resolutions are utilized in a compositing analysis. The 20-km resolution WRF model generates stronger ETCs based on surface wind speed and cyclone precipitation. The GCMs and ERAI generate similar composite means and distributions for cyclone precipitation rates, but GCMs generate weaker cyclone surface winds than ERAI. The amount of cyclone precipitation generated by the convection scheme differs significantly across the datasets, with GISS generating the most, followed by ERAI and then GFDL. The models and reanalysis generate relatively more parameterized convective precipitation when the total cyclone-averaged precipitation is smaller. This is partially due to the contribution of parameterized convective precipitation occurring more often late in the ETC life cycle. For reanalysis and models, precipitation increases with both cyclone moisture and surface wind speed, and this is true if the contribution from the parameterized convection scheme is larger or not. This work shows that these different models generate similar total ETC precipitation despite large differences in the parameterized convection, and these differences do not cause unexpected behavior in ETC precipitation sensitivity to cyclone moisture or surface wind speed.
ERIC Educational Resources Information Center
Johnson, Danette Ifert; Mrowka, Kaleigh
2010-01-01
This investigation tests Wittrock's generative learning model as an explanation for the positive relationship found between quizzing and student performance in a number of studies. Results support the theory, suggesting that quizzes structured to include multiple levels of Bloom, Engelhart, Furst, Hill and Krathwohl's (1956) taxonomy, and thereby…
ERIC Educational Resources Information Center
Warrington, Cartmell
2017-01-01
In this study, the Big Five factor model of personality traits theory was tested for its ability to predict or explain Employee Information Security Behavior (EISB), when Generational Cohort (GCOHORT) moderated the relationship between the five factors of personality and EISB. The independent variables (IVs) Extraversion, Agreeableness,…
Technical note: A linear model for predicting δ13 Cprotein.
Pestle, William J; Hubbe, Mark; Smith, Erin K; Stevenson, Joseph M
2015-08-01
Development of a model for the prediction of δ13Cprotein from δ13Ccollagen and Δ13Cap-co. Model-generated values could, in turn, serve as "consumer" inputs for multisource mixture modeling of paleodiet. Linear regression analysis of previously published controlled diet data facilitated the development of a mathematical model for predicting δ13Cprotein (and an experimentally generated error term) from isotopic data routinely generated during the analysis of osseous remains (δ13Cco and Δ13Cap-co). Regression analysis resulted in a two-term linear model (δ13Cprotein (‰) = (0.78 × δ13Cco) − (0.58 × Δ13Cap-co) − 4.7), possessing a high R-value of 0.93 (r² = 0.86, P < 0.01), and experimentally generated error terms of ±1.9‰ for any predicted individual value of δ13Cprotein. This model was tested using isotopic data from Formative Period individuals from northern Chile's Atacama Desert. The model presented here appears to hold significant potential for the prediction of the carbon isotope signature of dietary protein using only such data as are routinely generated in the course of stable isotope analysis of human osseous remains. These predicted values are ideal for use in multisource mixture modeling of dietary protein source contribution. © 2015 Wiley Periodicals, Inc.
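Because the abstract quotes the regression equation directly, it can be implemented as-is. The short sketch below evaluates the published two-term model for a hypothetical collagen value and apatite-collagen spacing.

```python
# Direct implementation of the prediction equation quoted in the abstract
# (delta values in per mil); the +/-1.9 per mil term is the reported prediction error.
def d13c_protein(d13c_collagen, delta13c_apatite_collagen):
    """Predict d13C of dietary protein from bone collagen d13C and the
    apatite-collagen spacing (all values in per mil)."""
    return 0.78 * d13c_collagen - 0.58 * delta13c_apatite_collagen - 4.7

# Example (hypothetical inputs): collagen at -19.0 per mil, spacing of 5.5 per mil.
estimate = d13c_protein(-19.0, 5.5)
print(f"predicted d13C_protein = {estimate:.1f} +/- 1.9 per mil")
```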
Development of a Novel Quantitative Adverse Outcome Pathway Predictive Model for Lung Cancer
Traditional methods for carcinogenicity testing are resource-intensive, retrospective, and time consuming. An increasing testing burden has generated interest in the adverse outcome pathway (AOP) concept as a tool to evaluate chemical safety in a more efficient, rapid and effecti...
Soft computing methods in design of superalloys
NASA Technical Reports Server (NTRS)
Cios, K. J.; Berke, L.; Vary, A.; Sharma, S.
1995-01-01
Soft computing techniques of neural networks and genetic algorithms are used in the design of superalloys. The cyclic oxidation attack parameter K(sub a), generated from tests at NASA Lewis Research Center, is modeled as a function of the superalloy chemistry and test temperature using a neural network. This model is then used in conjunction with a genetic algorithm to obtain an optimized superalloy composition resulting in low K(sub a) values.
Soft Computing Methods in Design of Superalloys
NASA Technical Reports Server (NTRS)
Cios, K. J.; Berke, L.; Vary, A.; Sharma, S.
1996-01-01
Soft computing techniques of neural networks and genetic algorithms are used in the design of superalloys. The cyclic oxidation attack parameter K(sub a), generated from tests at NASA Lewis Research Center, is modelled as a function of the superalloy chemistry and test temperature using a neural network. This model is then used in conjunction with a genetic algorithm to obtain an optimized superalloy composition resulting in low K(sub a) values.
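The two abstracts above describe the same neural network plus genetic algorithm workflow, which can be sketched as follows. Everything in the example is hypothetical (the composition features, the placeholder training data, and the GA settings); it sketches the surrogate-plus-search idea rather than the NASA study's actual model.

```python
# Hedged sketch of the described workflow: train a neural-network surrogate for the
# cyclic oxidation attack parameter Ka from alloy chemistry and test temperature,
# then search composition space with a simple genetic algorithm.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical training data: columns = [Cr, Al, Ti, Mo (wt%), test temperature (C)].
X_train = rng.uniform([5, 1, 0, 0, 1000], [20, 6, 5, 10, 1150], size=(200, 5))
ka_train = rng.uniform(0.1, 10.0, size=200)          # placeholder Ka measurements

surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
surrogate.fit(X_train, ka_train)

def ga_minimize_ka(temperature, pop_size=80, generations=100, sigma=0.3):
    lo, hi = np.array([5, 1, 0, 0]), np.array([20, 6, 5, 10])
    pop = rng.uniform(lo, hi, size=(pop_size, 4))
    for _ in range(generations):
        X = np.column_stack([pop, np.full(pop_size, temperature)])
        ka = surrogate.predict(X)
        parents = pop[np.argsort(ka)[: pop_size // 2]]             # keep lowest Ka
        children = parents + rng.normal(0, sigma, parents.shape)   # mutate
        pop = np.clip(np.vstack([parents, children]), lo, hi)
    X = np.column_stack([pop, np.full(pop_size, temperature)])
    return pop[np.argmin(surrogate.predict(X))]

print("composition (Cr, Al, Ti, Mo wt%) minimizing predicted Ka:", ga_minimize_ka(1100.0))
```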
Quarterly Progress Report: Modeling and Simulation of the Homopolar Motor Test Apparatus
2006-05-01
Quarterly Progress Report: Modeling and Simulation of the Homopolar Motor Test Apparatus (Contract N00014-1-0588). ... superconducting homopolar motor/generator (SCHPMG) machine for ship propulsion. Electrical contact (brush/slip ring) performance is a limiting factor in SCHPMG ... Subject terms: superconducting homopolar motors, inhomogeneous brush wear, polarity dependence, destabilized force.
Role of APOE Isoforms in the Pathogenesis of TBI induced Alzheimer’s Disease
2016-10-01
... deletion, APOE targeted replacement, complex breeding, CCI model optimization, mRNA library generation, high-throughput massively parallel sequencing ... demonstrate that the lack of Abca1 increases amyloid plaques and decreases APOE protein levels in AD-model mice. In this proposal we will test the hypothesis ... Subject terms: injury, inflammatory reaction, transcriptome, high-throughput massively parallel sequencing, mRNA-seq, behavioral testing, memory impairment, recovery.
NASA Technical Reports Server (NTRS)
Irani, E.; Snyder, M. H.
1988-01-01
An averaging total pressure wake rake used by the Cessna Aircraft Company in flight tests of a modified 210 airplane with a laminar flow wing was calibrated in wind tunnel tests against a five-tube pressure probe. The model generating the wake was a full-scale model of the Cessna airplane wing. Indications of drag trends were the same for both instruments.
NASA Technical Reports Server (NTRS)
Pitts, E. R.
1981-01-01
Program converts cell-net data into logic-gate models for use in test and simulation programs. Input consists of either Place, Route, and Fold (PRF) or Place-and-Route-in-Two-Dimensions (PR2D) layout data deck. Output consists of either Test Pattern Generator (TPG) or Logic-Simulation (LOGSIM) logic circuitry data deck. Designer needs to build only logic-gate-model circuit description since program acts as translator. Language is FORTRAN IV.
Torsional Vibration in the National Wind Technology Center’s 2.5-Megawatt Dynamometer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sethuraman, Latha; Keller, Jonathan; Wallen, Robb
2016-08-31
This report documents the torsional drivetrain dynamics of the NWTC's 2.5-megawatt dynamometer as identified experimentally and as calculated using lumped parameter models with known inertia and stiffness parameters. The report is presented in two parts, beginning with the identification of the primary torsional modes followed by the investigation of approaches to damp the torsional vibrations. The key mechanical parameters for the lumped parameter models and the justification for the element grouping used in the derivation of the torsional modes are presented. The sensitivities of the torsional modes to different test article properties are discussed. The oscillations observed from the low-speed and generator torque measurements were used to identify the extent of damping inherently achieved through active and passive compensation techniques. A simplified Simulink model of the dynamometer test article integrating the electro-mechanical power conversion and control features was established to emulate the torque behavior that was observed during testing. The torque response in the high-speed, low-speed, and generator shafts was tested and validated against experimental measurements involving step changes in load with the dynamometer operating under speed-regulation mode. The Simulink model serves as a ready reference to identify the torque sensitivities to various system parameters and to explore opportunities to improve torsional damping under different conditions.
Preliminary Analysis of the Transient Reactor Test Facility (TREAT) with PROTEUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connaway, H. M.; Lee, C. H.
The neutron transport code PROTEUS has been used to perform preliminary simulations of the Transient Reactor Test Facility (TREAT). TREAT is an experimental reactor designed for the testing of nuclear fuels and other materials under transient conditions. It operated from 1959 to 1994, when it was placed on non-operational standby. The restart of TREAT to support the U.S. Department of Energy’s resumption of transient testing is currently underway. Both single assembly and assembly-homogenized full core models have been evaluated. Simulations were performed using a historic set of WIMS-ANL-generated cross-sections as well as a new set of Serpent-generated cross-sections. To support this work, further analyses were also performed using additional codes in order to investigate particular aspects of TREAT modeling. DIF3D and the Monte-Carlo codes MCNP and Serpent were utilized in these studies. MCNP and Serpent were used to evaluate the effect of geometry homogenization on the simulation results and to support code-to-code comparisons. New meshes for the PROTEUS simulations were created using the CUBIT toolkit, with additional meshes generated via conversion of selected DIF3D models to support code-to-code verifications. All current analyses have focused on code-to-code verifications, with additional verification and validation studies planned. The analysis of TREAT with PROTEUS-SN is an ongoing project. This report documents the studies that have been performed thus far, and highlights key challenges to address in future work.
NASA Astrophysics Data System (ADS)
Wang, Hao; Zhang, Fengge; Guan, Tao; Yu, Siyang
2017-09-01
A brushless electrically excited synchronous generator (BEESG) with a hybrid rotor is a novel electrically excited synchronous generator. The BEESG proposed in this paper is composed of a conventional stator with two different sets of windings with different pole numbers, and a hybrid rotor with powerful coupling capacity. The pole number of the rotor is different from those of the stator windings. Thus, an analysis method different from that applied to conventional generators should be applied to the BEESG. In view of this problem, the equivalent circuit and electromagnetic torque expression of the BEESG are derived on the basis of electromagnetic relation of the proposed generator. The generator is simulated and tested experimentally using the established equivalent circuit model. The experimental and simulation data are then analyzed and compared. Results show the validity of the equivalent circuit model.
NASA Astrophysics Data System (ADS)
Dağlarli, Evren; Temeltaş, Hakan
2008-04-01
In this study, behavior generation and self-learning paradigms are investigated for real-time applications of multi-goal mobile robot tasks. The method is capable of generating new behaviors and combines them in order to achieve multi-goal tasks. The proposed method is composed of three layers: a Behavior Generating Module, a Coordination Level, and an Emotion-Motivation Level. The last two levels use hidden Markov models to manage the dynamical structure of behaviors. The kinematic and dynamic models of the mobile robot with non-holonomic constraints are considered in the behavior-based control architecture. The proposed method is tested on a four-wheel-driven and four-wheel-steered mobile robot with constraints in a simulation environment, and results are obtained successfully.
Experimentally generated randomness certified by the impossibility of superluminal signals.
Bierhorst, Peter; Knill, Emanuel; Glancy, Scott; Zhang, Yanbao; Mink, Alan; Jordan, Stephen; Rommal, Andrea; Liu, Yi-Kai; Christensen, Bradley; Nam, Sae Woo; Stevens, Martin J; Shalm, Lynden K
2018-04-01
From dice to modern electronic circuits, there have been many attempts to build better devices to generate random numbers. Randomness is fundamental to security and cryptographic systems and to safeguarding privacy. A key challenge with random-number generators is that it is hard to ensure that their outputs are unpredictable [1-3]. For a random-number generator based on a physical process, such as a noisy classical system or an elementary quantum measurement, a detailed model that describes the underlying physics is necessary to assert unpredictability. Imperfections in the model compromise the integrity of the device. However, it is possible to exploit the phenomenon of quantum non-locality with a loophole-free Bell test to build a random-number generator that can produce output that is unpredictable to any adversary that is limited only by general physical principles, such as special relativity [1-11]. With recent technological developments, it is now possible to carry out such a loophole-free Bell test [12-14,22]. Here we present certified randomness obtained from a photonic Bell experiment and extract 1,024 random bits that are uniformly distributed to within 10^-12. These random bits could not have been predicted according to any physical theory that prohibits faster-than-light (superluminal) signalling and that allows independent measurement choices. To certify and quantify the randomness, we describe a protocol that is optimized for devices that are characterized by a low per-trial violation of Bell inequalities. Future random-number generators based on loophole-free Bell tests may have a role in increasing the security and trust of our cryptographic systems and infrastructure.
NASA Technical Reports Server (NTRS)
Snyder, Gregory A.; Taylor, Lawrence A.; Neal, Clive R.
1992-01-01
A chemical model for simulating the sources of the lunar mare basalts was developed by considering a modified mafic cumulate source formed during the combined equilibrium and fractional crystallization of a lunar magma ocean (LMO). The parameters which influence the initial LMO and its subsequent crystallization are examined, and both trace and major elements are modeled. It is shown that major elements tightly constrain the composition of mare basalt sources and the pathways to their creation. The ability of this LMO model to generate viable mare basalt source regions was tested through a case study involving the high-Ti basalts.
Li, Qiuying; Pham, Hoang
2017-01-01
In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency combined with testing coverage information, based on a nonhomogeneous Poisson process (NHPP). During the past four decades, many software reliability growth models (SRGMs) based on NHPP have been proposed to estimate software reliability measures, most of which share the following assumptions: 1) during the testing phase, the fault detection rate commonly changes over time; 2) as a result of imperfect debugging, fault removal is accompanied by a fault re-introduction rate. However, few SRGMs in the literature differentiate between fault detection and fault removal, i.e. they seldom consider imperfect fault removal efficiency. In practical software development, fault removal efficiency cannot always be perfect: detected failures might not be removed completely, the original faults might still exist, and new faults might be introduced in the process, which is referred to as the imperfect debugging phenomenon. In this study, a model aiming to incorporate fault introduction rate, fault removal efficiency and testing coverage into software reliability evaluation is developed, using testing coverage to express the fault detection rate and using fault removal efficiency to account for fault repair. We compare the performance of the proposed model with several existing NHPP SRGMs using three sets of real failure data based on five criteria. The results show that the model gives better fitting and predictive performance. PMID:28750091
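The abstract does not give the proposed model's equations, so as a generic illustration of the NHPP SRGM family it builds on, the sketch below fits the classic Goel-Okumoto mean value function m(t) = a(1 - e^(-bt)) to hypothetical cumulative failure counts; the data and starting values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    """Mean value function m(t) of the classic Goel-Okumoto NHPP SRGM:
    a = expected total number of faults, b = fault detection rate."""
    return a * (1.0 - np.exp(-b * t))

# Hypothetical cumulative failure counts over testing weeks (illustrative only)
t = np.arange(1, 13)
cum_failures = np.array([5, 9, 13, 16, 18, 20, 21, 22, 23, 23, 24, 24])

(a_hat, b_hat), _ = curve_fit(goel_okumoto, t, cum_failures, p0=[30.0, 0.1])
print(f"estimated total faults a = {a_hat:.1f}, detection rate b = {b_hat:.3f}")
print("predicted cumulative faults by week 20:", round(goel_okumoto(20, a_hat, b_hat), 1))
```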
NASA Astrophysics Data System (ADS)
Seresangtakul, Pusadee; Takara, Tomio
In this paper, the distinctive tones of Thai in running speech are studied. We present rules to synthesize F0 contours of Thai tones in running speech by using the generative model of F0 contours. Following this method, the pitch contours of Thai polysyllabic words, both disyllabic and trisyllabic, were analyzed, and coarticulation effects of Thai tones in running speech were found. Based on the analysis of the polysyllabic words using this model, rules are derived and applied to synthesize Thai polysyllabic tone sequences. We performed listening tests to evaluate the intelligibility of the rules for Thai tone generation. The average intelligibility scores were 98.8% and 96.6% for disyllabic and trisyllabic words, respectively. From these results, the rules for tone generation were shown to be effective. Furthermore, we constructed connecting rules to synthesize suprasegmental F0 contours using the trisyllable training rules' parameters: the parameters of the first, third, and second syllables were assigned to the initial, ending, and remaining syllables of a sentence, respectively. Even with such a simple rule, the synthesized phrases/sentences were completely identified in listening tests. The MOS (mean opinion score) was 3.50, while the original and analysis/synthesis samples scored 4.82 and 3.59, respectively.
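One widely used generative model of F0 contours is the Fujisaki command-response model; the sketch below assumes that form and uses illustrative command parameters only, not the paper's derived rule values.

```python
import numpy as np

def phrase_response(t, alpha=3.0):
    """Phrase-command response Gp(t) = alpha^2 * t * exp(-alpha*t) for t >= 0."""
    tc = np.clip(t, 0, None)
    return np.where(t >= 0, alpha**2 * tc * np.exp(-alpha * tc), 0.0)

def accent_response(t, beta=20.0, gamma=0.9):
    """Accent-command response Ga(t) = min[1 - (1 + beta*t) exp(-beta*t), gamma] for t >= 0."""
    tc = np.clip(t, 0, None)
    g = 1.0 - (1.0 + beta * tc) * np.exp(-beta * tc)
    return np.where(t >= 0, np.minimum(g, gamma), 0.0)

def f0_contour(t, fb=120.0, phrases=((0.0, 0.5),), accents=((0.2, 0.5, 0.4),)):
    """ln F0(t) = ln Fb + sum Ap*Gp(t-T0) + sum Aa*[Ga(t-T1) - Ga(t-T2)].
    phrases: (onset T0, amplitude Ap); accents: (onset T1, offset T2, amplitude Aa)."""
    ln_f0 = np.full_like(t, np.log(fb))
    for t0, ap in phrases:
        ln_f0 += ap * phrase_response(t - t0)
    for t1, t2, aa in accents:
        ln_f0 += aa * (accent_response(t - t1) - accent_response(t - t2))
    return np.exp(ln_f0)

t = np.linspace(0.0, 1.5, 300)   # seconds
f0 = f0_contour(t)               # Hz; illustrative command values only
print(round(float(f0.min()), 1), round(float(f0.max()), 1))
```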
Test and Evaluation Report of the IMED Volumetric Infusion Pump Model 960A
1992-02-01
Aeroacoustic Characteristics of Model Jet Test Facility Flow Conditioners
NASA Technical Reports Server (NTRS)
Kinzie, Kevin W.; Henderson, Brenda S.; Haskin, Harry H.
2005-01-01
An experimental investigation of flow conditioning devices used to suppress internal rig noise in high speed, high temperature experimental jet facilities is discussed. The aerodynamic and acoustic characteristics of a number of devices including pressure loss and extraneous noise generation are measured. Both aerodynamic and acoustic characteristics are strongly dependent on the porosity of the flow conditioner and the closure ratio of the duct system. For unchoked flow conditioners, the pressure loss follows conventional incompressible flow models. However, for choked flow conditioners, a compressible flow model where the duct and flow conditioner system is modeled as a convergent-divergent nozzle can be used to estimate pressure loss. Choked flow conditioners generate significantly more noise than unchoked conditioners. In addition, flow conditioners with small hole diameters or sintered metal felt material generate less self-noise compared to flow conditioners with larger holes.
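A minimal sketch of the convergent-divergent-nozzle view mentioned above: once the conditioner chokes, the mass flow follows the isentropic choked-flow relation and depends only on the upstream total conditions and the open (throat) area. The numerical values below are assumed for illustration, not facility data.

```python
import math

def choked_mass_flow(p0, T0, throat_area, gamma=1.4, R=287.0):
    """Ideal choked mass flow through an effective throat (the open area of the
    flow conditioner), from isentropic compressible-flow relations:
    mdot = A* p0 sqrt(gamma/(R T0)) * (2/(gamma+1))^((gamma+1)/(2(gamma-1)))."""
    term = (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))
    return throat_area * p0 * math.sqrt(gamma / (R * T0)) * term

# Hypothetical duct/flow-conditioner values (not from the test facility):
p0 = 300e3          # upstream total pressure, Pa
T0 = 600.0          # upstream total temperature, K
open_area = 0.002   # total hole area of the conditioner, m^2 (porosity * duct area)
print(f"choked mass flow ~ {choked_mass_flow(p0, T0, open_area):.2f} kg/s")
```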
Pretest analysis of natural circulation on the PWR model PACTEL with horizontal steam generators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kervinen, T.; Riikonen, V.; Ritonummi, T.
A new test facility, the parallel channel test loop (PACTEL), has been designed and built to simulate the major components and system behavior of pressurized water reactors (PWRs) during postulated small- and medium-break loss-of-coolant accidents. Pretest calculations have been performed for the first test series, and the results of these calculations are being used for planning experiments, for adjusting the data acquisition system, and for choosing the optimal position and type of instrumentation. PACTEL is a volumetrically scaled (1:305) model of the VVER-440 PWR. In all the calculated cases, the natural circulation was found to be effective in removing the heat from the core to the steam generator. The loop mass flow rate peaked at 60% mass inventory. The straightening of the loop seals increased the mass flow rate significantly.
Heat generation in aircraft tires under braked rolling conditions
NASA Technical Reports Server (NTRS)
Clark, S. K.; Dodge, R. N.
1984-01-01
An analytical model was developed to approximate the internal temperature distribution in an aircraft tire operating under conditions of unyawed braked rolling. The model employs an array of elements to represent the tire cross section and considers the heat generated within the tire to be caused by the change in strain energy associated with cyclic tire deflection. The additional heating due to tire slip and stresses induced by braking are superimposed on the previously developed free rolling model. An extensive experimental program was conducted to verify temperatures predicted from the analytical model. Data from these tests were compared with calculations over a range of operating conditions. The model results were in reasonably good agreement with measured values.
Dual gait generative models for human motion estimation from a single camera.
Zhang, Xin; Fan, Guoliang
2010-08-01
This paper presents a general gait representation framework for video-based human motion estimation. Specifically, we want to estimate the kinematics of an unknown gait from image sequences taken by a single camera. This approach involves two generative models, called the kinematic gait generative model (KGGM) and the visual gait generative model (VGGM), which represent the kinematics and appearances of a gait by a few latent variables, respectively. The concept of gait manifold is proposed to capture the gait variability among different individuals by which KGGM and VGGM can be integrated together, so that a new gait with unknown kinematics can be inferred from gait appearances via KGGM and VGGM. Moreover, a new particle-filtering algorithm is proposed for dynamic gait estimation, which is embedded with a segmental jump-diffusion Markov Chain Monte Carlo scheme to accommodate the gait variability in a long observed sequence. The proposed algorithm is trained from the Carnegie Mellon University (CMU) Mocap data and tested on the Brown University HumanEva data with promising results.
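For orientation only, the basic particle-filtering loop (predict, weight, resample) can be sketched as below with a placeholder one-dimensional state; the paper's filter additionally embeds a segmental jump-diffusion MCMC step and operates on gait latent variables, which this generic sketch does not reproduce.

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_filter(observations, n_particles=500, proc_std=0.1, obs_std=0.2):
    """Generic bootstrap particle filter for a 1-D latent state (a stand-in for
    the gait latent variables): predict with a random-walk model, weight by a
    Gaussian likelihood, then resample."""
    particles = rng.normal(0.0, 1.0, n_particles)
    estimates = []
    for y in observations:
        # predict: propagate particles through the (placeholder) dynamics
        particles = particles + rng.normal(0.0, proc_std, n_particles)
        # update: weight particles by observation likelihood
        weights = np.exp(-0.5 * ((y - particles) / obs_std) ** 2) + 1e-300
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))
        # resample (multinomial) to avoid weight degeneracy
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return np.array(estimates)

# Hypothetical noisy observations of a slowly drifting latent value
true_state = np.cumsum(rng.normal(0, 0.05, 100))
obs = true_state + rng.normal(0, 0.2, 100)
est = particle_filter(obs)
print("RMSE:", round(float(np.sqrt(np.mean((est - true_state) ** 2))), 3))
```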
Automated, Parametric Geometry Modeling and Grid Generation for Turbomachinery Applications
NASA Technical Reports Server (NTRS)
Harrand, Vincent J.; Uchitel, Vadim G.; Whitmire, John B.
2000-01-01
The objective of this Phase I project is to develop a highly automated software system for rapid geometry modeling and grid generation for turbomachinery applications. The proposed system features a graphical user interface for interactive control, a direct interface to commercial CAD/PDM systems, support for IGES geometry output, and a scripting capability for obtaining a high level of automation and end-user customization of the tool. The developed system is fully parametric and highly automated, and, therefore, significantly reduces the turnaround time for 3D geometry modeling, grid generation and model setup. This facilitates design environments in which a large number of cases need to be generated, such as for parametric analysis and design optimization of turbomachinery equipment. In Phase I we have successfully demonstrated the feasibility of the approach. The system has been tested on a wide variety of turbomachinery geometries, including several impellers and a multi-stage rotor-stator combination. In Phase II, we plan to integrate the developed system with turbomachinery design software and with commercial CAD/PDM software.
Modeling language and cognition with deep unsupervised learning: a tutorial overview
Zorzi, Marco; Testolin, Alberto; Stoianov, Ivilin P.
2013-01-01
Deep unsupervised learning in stochastic recurrent neural networks with many layers of hidden units is a recent breakthrough in neural computation research. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. In this article we discuss the theoretical foundations of this approach and we review key issues related to training, testing and analysis of deep networks for modeling language and cognitive processing. The classic letter and word perception problem of McClelland and Rumelhart (1981) is used as a tutorial example to illustrate how structured and abstract representations may emerge from deep generative learning. We argue that the focus on deep architectures and generative (rather than discriminative) learning represents a crucial step forward for the connectionist modeling enterprise, because it offers a more plausible model of cortical learning as well as a way to bridge the gap between emergentist connectionist models and structured Bayesian models of cognition. PMID:23970869
Modeling language and cognition with deep unsupervised learning: a tutorial overview.
Zorzi, Marco; Testolin, Alberto; Stoianov, Ivilin P
2013-01-01
Deep unsupervised learning in stochastic recurrent neural networks with many layers of hidden units is a recent breakthrough in neural computation research. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. In this article we discuss the theoretical foundations of this approach and we review key issues related to training, testing and analysis of deep networks for modeling language and cognitive processing. The classic letter and word perception problem of McClelland and Rumelhart (1981) is used as a tutorial example to illustrate how structured and abstract representations may emerge from deep generative learning. We argue that the focus on deep architectures and generative (rather than discriminative) learning represents a crucial step forward for the connectionist modeling enterprise, because it offers a more plausible model of cortical learning as well as a way to bridge the gap between emergentist connectionist models and structured Bayesian models of cognition.
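Deep generative networks of this kind are typically built from stacked restricted Boltzmann machines trained layer-wise; below is a minimal numpy sketch of one contrastive-divergence (CD-1) update for a single binary RBM layer, with toy data and dimensions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b_vis, b_hid, lr=0.1):
    """One contrastive-divergence (CD-1) step for a binary RBM.
    v0: batch of visible vectors, shape (batch, n_vis)."""
    # positive phase
    p_h0 = sigmoid(v0 @ W + b_hid)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # negative phase: one Gibbs step
    p_v1 = sigmoid(h0 @ W.T + b_vis)
    p_h1 = sigmoid(p_v1 @ W + b_hid)
    # gradient approximation and parameter update
    batch = v0.shape[0]
    W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / batch
    b_vis += lr * (v0 - p_v1).mean(axis=0)
    b_hid += lr * (p_h0 - p_h1).mean(axis=0)
    return W, b_vis, b_hid

# Toy binary data (e.g. letter-like bit patterns); dimensions are illustrative
n_vis, n_hid = 16, 8
W = rng.normal(0, 0.01, (n_vis, n_hid))
b_vis, b_hid = np.zeros(n_vis), np.zeros(n_hid)
data = (rng.random((100, n_vis)) < 0.3).astype(float)
for epoch in range(50):
    W, b_vis, b_hid = cd1_update(data, W, b_vis, b_hid)
```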
Note: The full function test explosive generator.
Reisman, D B; Javedani, J B; Griffith, L V; Ellsworth, G F; Kuklo, R M; Goerz, D A; White, A D; Tallerico, L J; Gidding, D A; Murphy, M J; Chase, J B
2010-03-01
We have conducted three tests of a new pulsed power device called the full function test. These tests represented the culmination of an effort to establish a high energy pulsed power capability based on high explosive pulsed power (HEPP) technology. This involved an extensive computational modeling, engineering, fabrication, and fielding effort. The experiments were highly successful and a new U.S. record for magnetic energy was obtained.
Improving plant bioaccumulation science through consistent reporting of experimental data.
Fantke, Peter; Arnot, Jon A; Doucette, William J
2016-10-01
Experimental data and models for plant bioaccumulation of organic contaminants play a crucial role for assessing the potential human and ecological risks associated with chemical use. Plants are receptor organisms and direct or indirect vectors for chemical exposures to all other organisms. As new experimental data are generated they are used to improve our understanding of plant-chemical interactions that in turn allows for the development of better scientific knowledge and conceptual and predictive models. The interrelationship between experimental data and model development is an ongoing, never-ending process needed to advance our ability to provide reliable quality information that can be used in various contexts including regulatory risk assessment. However, relatively few standard experimental protocols for generating plant bioaccumulation data are currently available and because of inconsistent data collection and reporting requirements, the information generated is often less useful than it could be for direct applications in chemical assessments and for model development and refinement. We review existing testing guidelines, common data reporting practices, and provide recommendations for revising testing guidelines and reporting requirements to improve bioaccumulation knowledge and models. This analysis provides a list of experimental parameters that will help to develop high quality datasets and support modeling tools for assessing bioaccumulation of organic chemicals in plants and ultimately addressing uncertainty in ecological and human health risk assessments. Copyright © 2016 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yuanyuan; Diao, Ruisheng; Huang, Renke
Maintaining good quality of power plant stability models is of critical importance to ensure the secure and economic operation and planning of today’s power grid with its increasingly stochastic and dynamic behavior. According to North American Electric Reliability Corporation (NERC) standards, all generators in North America with capacities larger than 10 MVA are required to validate their models every five years. Validation is quite costly and can significantly affect the revenue of generator owners, because the traditional staged testing requires generators to be taken offline. Over the past few years, validating and calibrating parameters using online measurements including phasor measurement units (PMUs) and digital fault recorders (DFRs) has been proven to be a cost-effective approach. In this paper, an innovative open-source tool suite is presented for validating power plant models using the PPMV tool, identifying bad parameters with trajectory sensitivity analysis, and finally calibrating parameters using an ensemble Kalman filter (EnKF) based algorithm. The architectural design and the detailed procedures to run the tool suite are presented, with results of tests on a realistic hydro power plant using PMU measurements for 12 different events. The calibrated parameters of the machine, exciter, governor and PSS models demonstrate much better performance than the original models for all the events and show the robustness of the proposed calibration algorithm.
Large liquid rocket engine transient performance simulation system
NASA Technical Reports Server (NTRS)
Mason, J. R.; Southwick, R. D.
1991-01-01
A simulation system, ROCETS, was designed and developed to allow cost-effective computer predictions of liquid rocket engine transient performance. The system allows a user to generate a simulation of any rocket engine configuration using component modules stored in a library through high-level input commands. The system library currently contains 24 component modules, 57 sub-modules and maps, and 33 system routines and utilities. FORTRAN models from other sources can be operated in the system upon inclusion of interface information on comment cards. Operation of the simulation is simplified for the user by run, execution, and output processors. The simulation system makes available steady-state trim balance, transient operation, and linear partial generation. The system utilizes a modern equation solver for efficient operation of the simulations. Transient integration methods include integral and differential forms for the trapezoidal, first order Gear, and second order Gear corrector equations. A detailed technology test bed engine (TTBE) model was generated to be used as the acceptance test of the simulation system. The general level of model detail was that reflected in the Space Shuttle Main Engine DTM. The model successfully obtained steady-state balance in main stage operation and simulated throttle transients, including engine starts and shutdown. A NASA FORTRAN control model was obtained, ROCETS interface installed in comment cards, and operated with the TTBE model in closed-loop transient mode.
SAR STUDY OF NASAL TOXICITY: LESSONS FOR MODELING SMALL TOXICITY DATASETS
Most toxicity data, particularly from whole animal bioassays, are generated without the needs or capabilities of structure-activity relationship (SAR) modeling in mind. Some toxicity endpoints have been of sufficient regulatory concern to warrant large scale testing efforts (e.g....
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quaglioni, S.; Beck, B. R.
The Monte Carlo All Particle Method generator and collision physics library features two models for allowing a particle to either up- or down-scatter due to collisions with material at finite temperature. The two models are presented and compared. Neutron interaction with matter through elastic collisions is used as testing case.
Model-independent test for scale-dependent non-Gaussianities in the cosmic microwave background.
Räth, C; Morfill, G E; Rossmanith, G; Banday, A J; Górski, K M
2009-04-03
We present a model-independent method to test for scale-dependent non-Gaussianities in combination with scaling indices as test statistics. Therefore, surrogate data sets are generated, in which the power spectrum of the original data is preserved, while the higher order correlations are partly randomized by applying a scale-dependent shuffling procedure to the Fourier phases. We apply this method to the Wilkinson Microwave Anisotropy Probe data of the cosmic microwave background and find signatures for non-Gaussianities on large scales. Further tests are required to elucidate the origin of the detected anomalies.
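The core surrogate idea (preserve the power spectrum, randomize the Fourier phases) can be sketched as below; note that this simple version randomizes all phases, whereas the method above shuffles phases only within chosen scale ranges.

```python
import numpy as np

rng = np.random.default_rng(42)

def phase_randomized_surrogate(x):
    """Return a surrogate of the 1-D real signal x with identical power spectrum
    but randomized Fourier phases (full randomization; the scale-dependent
    variant would restrict the shuffling to selected ranges of modes)."""
    X = np.fft.rfft(x)
    random_phases = np.exp(1j * rng.uniform(0, 2 * np.pi, X.size))
    random_phases[0] = 1.0           # keep the mean (zero-frequency) term real
    if x.size % 2 == 0:
        random_phases[-1] = 1.0      # keep the Nyquist term real for even lengths
    return np.fft.irfft(np.abs(X) * random_phases, n=x.size)

x = rng.normal(size=1024) + np.sin(np.linspace(0, 20 * np.pi, 1024))
s = phase_randomized_surrogate(x)
# power spectra of original and surrogate agree to numerical precision
print(np.allclose(np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(s))))
```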
Dsm Extraction and Evaluation from GEOEYE-1 Stereo Imagery
NASA Astrophysics Data System (ADS)
Saldaña, M. M.; Aguilar, M. A.; Aguilar, F. J.; Fernández, I.
2012-07-01
The newest very high resolution (VHR) commercial satellites, such as GeoEye-1 or WorldView-2, open new possibilities for cartographic applications, orthoimage generation and extraction of Digital Surface Models (DSMs). These DSMs are generated by image matching strategies from VHR satellite stereopair imagery, reconstructing the 3D surface corresponding to the first surface view of the earth, containing both microrelief (buildings, trees and so on) and bare terrain. The main aim of this work is to carry out an accuracy assessment test on the DSMs extracted from a GeoEye-1 stereopair captured in August 2011. A LiDAR-derived DSM taken in the same month as the satellite imagery was used as ground truth. The influence of factors such as the number of Ground Control Points (GCPs), the sensor models tested, and the geoid employed to transform ellipsoidal to orthometric heights was evaluated. In this way, different sets of GCPs ranging from 7 to 45, two sensor models and two geoids (EGM96 and EGM08, the latter adapted to the Spanish vertical network by Spain's National Geographic Institute) were tested in this work. The photogrammetric software package used was OrthoEngine from PCI Geomatica v. 10.3.2. OrthoEngine implements both sensor models tested: (i) the physical model developed by Toutin (CCRS) and (ii) the rational function model using rational polynomial coefficients supplied by the vendor and later refined by means of zero-order linear functions (RPC0). When highly accurate and well-distributed GCPs were used, the planimetric and vertical accuracies of DSMs generated from the GeoEye-1 Geo stereopair were always better than 0.5 m. Using only 7 GCPs and RPC0, a vertical accuracy around 0.43 m, measured as standard deviation, was attained. The geoid used by OrthoEngine (EGM96) produced results similar to those of the EGM08 adapted for the Spanish vertical network.
Fully automatic adjoints: a robust and efficient mechanism for generating adjoint ocean models
NASA Astrophysics Data System (ADS)
Ham, D. A.; Farrell, P. E.; Funke, S. W.; Rognes, M. E.
2012-04-01
The problem of generating and maintaining adjoint models is sufficiently difficult that typically only the most advanced and well-resourced community ocean models achieve it. There are two current technologies which each suffer from their own limitations. Algorithmic differentiation, also called automatic differentiation, is employed by models such as the MITGCM [2] and the Alfred Wegener Institute model FESOM [3]. This technique is very difficult to apply to existing code, and requires a major initial investment to prepare the code for automatic adjoint generation. AD tools may also have difficulty with code employing modern software constructs such as derived data types. An alternative is to formulate the adjoint differential equation and to discretise this separately. This approach, known as the continuous adjoint and employed in ROMS [4], has the disadvantage that two different model code bases must be maintained and manually kept synchronised as the model develops. The discretisation of the continuous adjoint is not automatically consistent with that of the forward model, producing an additional source of error. The alternative presented here is to formulate the flow model in the high level language UFL (Unified Form Language) and to automatically generate the model using the software of the FEniCS project. In this approach it is the high level code specification which is differentiated, a task very similar to the formulation of the continuous adjoint [5]. However since the forward and adjoint models are generated automatically, the difficulty of maintaining them vanishes and the software engineering process is therefore robust. The scheduling and execution of the adjoint model, including the application of an appropriate checkpointing strategy is managed by libadjoint [1]. In contrast to the conventional algorithmic differentiation description of a model as a series of primitive mathematical operations, libadjoint employs a new abstraction of the simulation process as a sequence of discrete equations which are assembled and solved. It is the coupling of the respective abstractions employed by libadjoint and the FEniCS project which produces the adjoint model automatically, without further intervention from the model developer. This presentation will demonstrate this new technology through linear and non-linear shallow water test cases. The exceptionally simple model syntax will be highlighted and the correctness of the resulting adjoint simulations will be demonstrated using rigorous convergence tests.
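The discrete-adjoint idea underlying such tools can be illustrated with a toy example independent of FEniCS/libadjoint: for a linear time-stepping model, the gradient of a misfit functional with respect to the initial state is obtained by propagating the final residual backwards with the transposed operator. The operator and observations below are arbitrary stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_steps = 5, 20
A = np.eye(n) + 0.01 * rng.normal(size=(n, n))   # toy linear forward operator

def forward(x0):
    x = x0.copy()
    for _ in range(n_steps):
        x = A @ x
    return x

obs = rng.normal(size=n)

def misfit(x0):
    r = forward(x0) - obs
    return 0.5 * r @ r

def adjoint_gradient(x0):
    """dJ/dx0 via the discrete adjoint: the final residual is propagated
    backwards n_steps times with the transposed operator A^T."""
    lam = forward(x0) - obs          # dJ/dx_N
    for _ in range(n_steps):
        lam = A.T @ lam
    return lam

# verify against a finite-difference check on one component
x0 = rng.normal(size=n)
g = adjoint_gradient(x0)
eps = 1e-6
e0 = np.zeros(n); e0[0] = eps
fd = (misfit(x0 + e0) - misfit(x0 - e0)) / (2 * eps)
print(np.isclose(g[0], fd, rtol=1e-4))
```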
Development of Test-Analysis Models (TAM) for correlation of dynamic test and analysis results
NASA Technical Reports Server (NTRS)
Angelucci, Filippo; Javeed, Mehzad; Mcgowan, Paul
1992-01-01
The primary objective of structural analysis of aerospace applications is to obtain a verified finite element model (FEM). The verified FEM can be used for loads analysis, evaluate structural modifications, or design control systems. Verification of the FEM is generally obtained as the result of correlating test and FEM models. A test analysis model (TAM) is very useful in the correlation process. A TAM is essentially a FEM reduced to the size of the test model, which attempts to preserve the dynamic characteristics of the original FEM in the analysis range of interest. Numerous methods for generating TAMs have been developed in the literature. The major emphasis of this paper is a description of the procedures necessary for creation of the TAM and the correlation of the reduced models with the FEM or the test results. Herein, three methods are discussed, namely Guyan, Improved Reduced System (IRS), and Hybrid. Also included are the procedures for performing these analyses using MSC/NASTRAN. Finally, application of the TAM process is demonstrated with an experimental test configuration of a ten bay cantilevered truss structure.
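A minimal sketch of the Guyan (static) reduction named above, condensing stiffness and mass matrices to a set of master DOFs; the 4-DOF spring-mass chain is an illustrative stand-in for a FEM, not the truss structure of the paper.

```python
import numpy as np

def guyan_reduce(K, M, master):
    """Guyan (static) condensation of stiffness K and mass M to the DOFs listed
    in `master`:  T = [I; -Kss^-1 Ksm],  Kr = T^T K T,  Mr = T^T M T."""
    n = K.shape[0]
    master = np.asarray(master)
    slave = np.setdiff1d(np.arange(n), master)
    Kss = K[np.ix_(slave, slave)]
    Ksm = K[np.ix_(slave, master)]
    T = np.zeros((n, master.size))
    T[master, np.arange(master.size)] = 1.0
    T[np.ix_(slave, np.arange(master.size))] = -np.linalg.solve(Kss, Ksm)
    return T.T @ K @ T, T.T @ M @ T, T

# Toy 4-DOF spring-mass chain, keeping DOFs 0 and 3 as masters (illustrative)
k = 1000.0
K = k * np.array([[ 2, -1,  0,  0],
                  [-1,  2, -1,  0],
                  [ 0, -1,  2, -1],
                  [ 0,  0, -1,  2]], dtype=float)
M = np.diag([1.0, 1.0, 1.0, 1.0])
Kr, Mr, T = guyan_reduce(K, M, master=[0, 3])
print(Kr.round(1))
```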
Building an Evaluation Scale using Item Response Theory.
Lalor, John P; Wu, Hao; Yu, Hong
2016-11-01
Evaluation of NLP methods requires testing against a previously vetted gold-standard test set and reporting standard metrics (accuracy/precision/recall/F1). The current assumption is that all items in a given test set are equal with regards to difficulty and discriminating power. We propose Item Response Theory (IRT) from psychometrics as an alternative means for gold-standard test-set generation and NLP system evaluation. IRT is able to describe characteristics of individual items - their difficulty and discriminating power - and can account for these characteristics in its estimation of human intelligence or ability for an NLP task. In this paper, we demonstrate IRT by generating a gold-standard test set for Recognizing Textual Entailment. By collecting a large number of human responses and fitting our IRT model, we show that our IRT model compares NLP systems with the performance in a human population and is able to provide more insight into system performance than standard evaluation metrics. We show that a high accuracy score does not always imply a high IRT score, which depends on the item characteristics and the response pattern.
Building an Evaluation Scale using Item Response Theory
Lalor, John P.; Wu, Hao; Yu, Hong
2016-01-01
Evaluation of NLP methods requires testing against a previously vetted gold-standard test set and reporting standard metrics (accuracy/precision/recall/F1). The current assumption is that all items in a given test set are equal with regards to difficulty and discriminating power. We propose Item Response Theory (IRT) from psychometrics as an alternative means for gold-standard test-set generation and NLP system evaluation. IRT is able to describe characteristics of individual items - their difficulty and discriminating power - and can account for these characteristics in its estimation of human intelligence or ability for an NLP task. In this paper, we demonstrate IRT by generating a gold-standard test set for Recognizing Textual Entailment. By collecting a large number of human responses and fitting our IRT model, we show that our IRT model compares NLP systems with the performance in a human population and is able to provide more insight into system performance than standard evaluation metrics. We show that a high accuracy score does not always imply a high IRT score, which depends on the item characteristics and the response pattern. PMID:28004039
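As a pointer to how item difficulty and discrimination enter such models, a sketch of the two-parameter logistic (2PL) item response function follows; the item parameters are hypothetical, not the fitted values from the study.

```python
import numpy as np

def p_correct(theta, a, b):
    """Two-parameter logistic (2PL) IRT model: probability that a respondent of
    ability theta answers correctly an item with discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical items: (discrimination a, difficulty b)
items = [(1.2, -1.0), (0.8, 0.0), (2.0, 1.5)]
abilities = np.linspace(-3, 3, 7)
for a, b in items:
    print([round(float(p), 2) for p in p_correct(abilities, a, b)])
```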
Observation-Oriented Modeling: Going beyond "Is It All a Matter of Chance"?
ERIC Educational Resources Information Center
Grice, James W.; Yepez, Maria; Wilson, Nicole L.; Shoda, Yuichi
2017-01-01
An alternative to null hypothesis significance testing is presented and discussed. This approach, referred to as observation-oriented modeling, is centered on model building in an effort to explicate the structures and processes believed to generate a set of observations. In terms of analysis, this novel approach complements traditional methods…
A Generative Approach to the Development of Hidden-Figure Items.
ERIC Educational Resources Information Center
Bejar, Issac I.; Yocom, Peter
This report explores an approach to item development and psychometric modeling that explicitly incorporates knowledge about the mental models used by examinees when solving items into a psychometric model characterizing test performance, as well as into the item development process. The paper focuses on…
Testing Experimental Therapies in a Guinea Pig Model for Hemorrhagic Fever.
Wong, Gary; Bi, Yuhai; Kobinger, Gary; Gao, George F; Qiu, Xiangguo
2018-01-01
Hemorrhagic fever viruses are among the deadliest pathogens known to humans, and often, licensed medical countermeasures are unavailable to prevent or treat infections. Guinea pigs are a commonly used animal for the preclinical development of any experimental candidates, typically to confirm data generated in mice and as a way to validate and support further testing in nonhuman primates. In this chapter, we use Sudan virus (SUDV), a lethal filovirus closely related to Ebola virus, as an example of the steps required for generating a guinea pig-adapted isolate that is used to test a monoclonal antibody-based therapy against viral hemorrhagic fevers.
Life Prediction Model for Grid-Connected Li-ion Battery Energy Storage System: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler A; Saxon, Aron R; Keyser, Matthew A
Lithium-ion (Li-ion) batteries are being deployed on the electrical grid for a variety of purposes, such as to smooth fluctuations in solar renewable power generation. The lifetime of these batteries will vary depending on their thermal environment and how they are charged and discharged. Optimal utilization of a battery over its lifetime requires characterization of its performance degradation under different storage and cycling conditions. Aging tests were conducted on commercial graphite/nickel-manganese-cobalt (NMC) Li-ion cells. A general lifetime prognostic model framework is applied to model changes in capacity and resistance as the battery degrades. Across 9 aging test conditions from 0°C to 55°C, the model predicts capacity fade with 1.4 percent RMS error and resistance growth with 15 percent RMS error. The model, recast in state variable form with 8 states representing separate fade mechanisms, is used to extrapolate lifetime for example applications of the energy storage system integrated with renewable photovoltaic (PV) power generation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Yuping; Zheng, Qipeng P.; Wang, Jianhui
2014-11-01
This paper presents a two-stage stochastic unit commitment (UC) model, which integrates non-generation resources such as demand response (DR) and energy storage (ES) while including risk constraints to balance between cost and system reliability due to the fluctuation of variable generation such as wind and solar power. This paper uses conditional value-at-risk (CVaR) measures to model risks associated with the decisions in a stochastic environment. In contrast to chance-constrained models requiring extra binary variables, risk constraints based on CVaR only involve linear constraints and continuous variables, making them more computationally attractive. The proposed models with risk constraints are able to avoid over-conservative solutions but still ensure system reliability represented by loss of load. Numerical experiments are then conducted to study the effects of non-generation resources on generator schedules and the difference in total expected generation costs with risk consideration. Sensitivity analysis based on reliability parameters is also performed to test the decision preferences of confidence levels and load-shedding loss allowances on generation cost reduction.
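The CVaR measure used in such risk constraints has a linear (Rockafellar-Uryasev) representation; the sketch below evaluates it for a set of hypothetical scenario load-shedding losses, which is the kind of quantity the constraints bound.

```python
import numpy as np

def cvar(losses, probs, alpha=0.95):
    """Conditional value-at-risk of discrete scenario losses:
    CVaR_alpha = min_z  z + 1/(1-alpha) * sum_s p_s * max(loss_s - z, 0).
    For discrete scenarios the alpha-quantile (VaR) minimizes this expression."""
    order = np.argsort(losses)
    losses, probs = losses[order], probs[order]
    cum = np.cumsum(probs)
    var = losses[np.searchsorted(cum, alpha)]          # alpha-quantile (VaR)
    tail = np.maximum(losses - var, 0.0)
    return var + np.sum(probs * tail) / (1.0 - alpha)

# Hypothetical load-shedding cost per scenario and scenario probabilities
losses = np.array([0.0, 0.0, 50.0, 120.0, 400.0, 900.0])
probs = np.array([0.4, 0.3, 0.15, 0.1, 0.04, 0.01])
print(round(float(cvar(losses, probs, alpha=0.95)), 1))   # expected 500.0
```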
Fusion of 3D models derived from TLS and image-based techniques for CH enhanced documentation
NASA Astrophysics Data System (ADS)
Bastonero, P.; Donadio, E.; Chiabrando, F.; Spanò, A.
2014-05-01
Recognizing the various advantages offered by new 3D metric survey technologies in the Cultural Heritage documentation phase, this paper presents some tests of 3D model generation using different methods and of their possible fusion. With the aim of defining the potential and the problems deriving from the integration or fusion of metric data acquired with different survey techniques, the selected test case is an outstanding Cultural Heritage item, presenting both widespread and specific complexities connected to the conservation of historical buildings. The site is the Staffarda Abbey, the most relevant evidence of medieval architecture in Piedmont. This application addresses one of the most topical architectural issues: the opportunity to study and analyze an object as a whole, from two acquisition-sensor locations, terrestrial and aerial. In particular, the work consists in evaluating the possibilities deriving from a simple union or from the fusion of different 3D cloud models of the abbey achieved by multi-sensor techniques. The aerial survey is based on a photogrammetric RPAS (remotely piloted aircraft system) flight, while the terrestrial acquisition was fulfilled by a laser scanning survey. Both techniques allowed us to extract and process different point clouds and to generate the consequent 3D continuous models, which are characterized by different scales, that is to say different resolutions and different contents of detail and precision. Starting from these models, the proposed process, applied to a sample area of the building, aimed to test the generation of a unique 3D model through a fusion of the different sensor point clouds. The descriptive potential and the metric and thematic gains achievable by the final model exceeded those offered by the two detached models.
Full scale wind turbine test of vortex generators mounted on the entire blade
NASA Astrophysics Data System (ADS)
Bak, Christian; Skrzypiński, Witold; Gaunaa, Mac; Villanueva, Hector; Brønnum, Niels F.; Kruse, Emil K.
2016-09-01
Measurements on a heavily instrumented pitch regulated variable speed Vestas V52 850 kW wind turbine situated at the DTU Risø Campus are carried out, where the effect of vortex generators mounted on almost the entire blade is tested with and without leading edge roughness. The measurements are compared to the predictions carried out by a developed design tool, where the effect of vortex generators and leading edge roughness is simulated using engineering models. The measurements showed that if vortex generators are mounted there is an increase in flapwise blade moments if the blades are clean, but also that the loads are almost neutral when vortex generators are installed if there is leading edge roughness on the blades. Finally, it was shown that there was a good agreement between the measurements and the predictions from the design tool.
Software for Estimating Costs of Testing Rocket Engines
NASA Technical Reports Server (NTRS)
Hines, Merlon M.
2004-01-01
A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.
Software for Estimating Costs of Testing Rocket Engines
NASA Technical Reports Server (NTRS)
Hines, Merion M.
2002-01-01
A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.
Software for Estimating Costs of Testing Rocket Engines
NASA Technical Reports Server (NTRS)
Hines, Merlon M.
2003-01-01
A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.
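As a rough illustration of the polynomial labor-time correlation described above, the sketch below fits a third-order polynomial to hypothetical (thrust, labor-hours) pairs and evaluates it; neither the data nor the coefficients are the CEM's.

```python
import numpy as np

# Hypothetical historical data: engine thrust (klbf) vs. total test labor hours
thrust = np.array([10, 25, 60, 100, 200, 350, 500], dtype=float)
labor_hours = np.array([400, 650, 1100, 1600, 2900, 4800, 7200], dtype=float)

# Fit a third-order polynomial, as in the labor-time correlation described above
coeffs = np.polyfit(thrust, labor_hours, deg=3)
estimate = np.polyval(coeffs, 150.0)   # predicted labor hours for a 150-klbf test
print(coeffs.round(4), round(float(estimate), 0))
```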
Rapid Crop Cover Mapping for the Conterminous United States.
Dahal, Devendra; Wylie, Bruce; Howard, Danny
2018-06-05
Timely crop cover maps with sufficient resolution are important components of various environmental planning and research applications. Through the modification and use of a previously developed crop classification model (CCM), which was originally developed to generate historical annual crop cover maps, we hypothesized that such crop cover maps could be generated rapidly during the growing season. Through a process of incrementally removing weekly and monthly independent variables from the CCM and implementing a 'two model mapping' approach, we found it viable to generate conterminous United States-wide rapid crop cover maps at a resolution of 250 m for the current year by the month of September. In this approach, we divided the CCM into one 'crop type model' to handle the classification of nine specific crops and a second, binary model to classify the presence or absence of 'other' crops. Under the two model mapping approach, the training errors were 0.8% and 1.5% for the crop type and binary models, respectively, while test errors were 5.5% and 6.4%, respectively. With spatial mapping accuracies for annual maps reaching upwards of 70%, this approach demonstrates strong potential for generating rapid crop cover maps by the 1st of September.
Applying deep bidirectional LSTM and mixture density network for basketball trajectory prediction
NASA Astrophysics Data System (ADS)
Zhao, Yu; Yang, Rennong; Chevalier, Guillaume; Shah, Rajiv C.; Romijnders, Rob
2018-04-01
Data analytics helps basketball teams to create tactics. However, manual data collection and analytics are costly and ineffective. Therefore, we applied a deep bidirectional long short-term memory (BLSTM) and mixture density network (MDN) approach. This model is not only capable of predicting a basketball trajectory based on real data, but it also can generate new trajectory samples. It is an excellent application to help coaches and players decide when and where to shoot. Its structure is particularly suitable for dealing with time series problems. BLSTM receives forward and backward information at the same time, while stacking multiple BLSTMs further increases the learning ability of the model. Combined with BLSTMs, MDN is used to generate a multi-modal distribution of outputs. Thus, the proposed model can, in principle, represent arbitrary conditional probability distributions of output variables. We tested our model with two experiments on three-pointer datasets from NBA SportVu data. In the hit-or-miss classification experiment, the proposed model outperformed other models in terms of the convergence speed and accuracy. In the trajectory generation experiment, eight model-generated trajectories at a given time closely matched real trajectories.
Application of a nonlinear slug test model
McElwee, C.D.
2001-01-01
Knowledge of the hydraulic conductivity distribution is of utmost importance in understanding the dynamics of an aquifer and in planning the consequences of any action taken upon that aquifer. Slug tests have been used extensively to measure hydraulic conductivity in the last 50 years since Hvorslev's (1951) work. A general nonlinear model based on the Navier-Stokes equation, nonlinear frictional loss, non-Darcian flow, acceleration effects, radius changes in the wellbore, and a Hvorslev model for the aquifer has been implemented in this work. The nonlinear model has three parameters: ??, which is related primarily to radius changes in the water column; A, which is related to the nonlinear head losses; and K, the hydraulic conductivity. An additional parameter has been added representing the initial velocity of the water column at slug initiation and is incorporated into an analytical solution to generate the first time step before a sequential numerical solution generates the remainder of the time solution. Corrections are made to the model output for acceleration before it is compared to the experimental data. Sensitivity analysis and least squares fitting are used to estimate the aquifer parameters and produce some diagnostic results, which indicate the accuracy of the fit. Finally, an example of field data has been presented to illustrate the application of the model to data sets that exhibit nonlinear behavior. Multiple slug tests should be taken at a given location to test for nonlinear effects and to determine repeatability.
An advanced stochastic weather generator for simulating 2-D high-resolution climate variables
NASA Astrophysics Data System (ADS)
Peleg, Nadav; Fatichi, Simone; Paschalis, Athanasios; Molnar, Peter; Burlando, Paolo
2017-07-01
A new stochastic weather generator, Advanced WEather GENerator for a two-dimensional grid (AWE-GEN-2d) is presented. The model combines physical and stochastic approaches to simulate key meteorological variables at high spatial and temporal resolution: 2 km × 2 km and 5 min for precipitation and cloud cover and 100 m × 100 m and 1 h for near-surface air temperature, solar radiation, vapor pressure, atmospheric pressure, and near-surface wind. The model requires spatially distributed data for the calibration process, which can nowadays be obtained by remote sensing devices (weather radar and satellites), reanalysis data sets and ground stations. AWE-GEN-2d is parsimonious in terms of computational demand and therefore is particularly suitable for studies where exploring internal climatic variability at multiple spatial and temporal scales is fundamental. Applications of the model include models of environmental systems, such as hydrological and geomorphological models, where high-resolution spatial and temporal meteorological forcing is crucial. The weather generator was calibrated and validated for the Engelberg region, an area with complex topography in the Swiss Alps. Model test shows that the climate variables are generated by AWE-GEN-2d with a level of accuracy that is sufficient for many practical applications.
Miller, Matthew J; Yang, Minji; Hui, Kayi; Choi, Na-Yeun; Lim, Robert H
2011-07-01
In the present study, we tested a theoretically and empirically derived partially indirect effects acculturation and enculturation model of Asian American college students' mental health and attitudes toward seeking professional psychological help. Latent variable path analysis with 296 self-identified Asian American college students supported the partially indirect effects model and demonstrated the ways in which behavioral acculturation, behavioral enculturation, values acculturation, values enculturation, and acculturation gap family conflict related to mental health and attitudes toward seeking professional psychological help directly and indirectly through acculturative stress. We also tested a generational status moderator hypothesis to determine whether differences in model-implied relationships emerged across U.S.- (n = 185) and foreign-born (n = 107) participants. Consistent with this hypothesis, statistically significant differences in structural coefficients emerged across generational status. Limitations, future directions for research, and counseling implications are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ortensi, Javier; Baker, Benjamin Allen; Schunert, Sebastian
The INL is currently evolving the modeling and simulation (M&S) capability that will enable improved core operation as well as design and analysis of TREAT experiments. This M&S capability primarily uses MAMMOTH, a reactor physics application being developed under the Multi-physics Object Oriented Simulation Environment (MOOSE) framework. MAMMOTH allows the coupling of a number of other MOOSE-based applications. This second year of work has been devoted to the generation of a deterministic reference solution for the full core, the preparation of anisotropic diffusion coefficients, the testing of the SPH equivalence method, and the improvement of the control rod modeling. In addition, this report includes the progress made in the modeling of the M8 core configuration and experiment vehicle since January of this year.
Brayton cycle solarized advanced gas turbine
NASA Technical Reports Server (NTRS)
1986-01-01
Described is the development of a Brayton Engine/Generator Set for solar thermal to electrical power conversion, authorized under DOE/NASA Contract DEN3-181. The objective was to design, fabricate, assemble, and test a small, hybrid, 20-kW Brayton-engine-powered generator set. The latter, called a power conversion assembly (PCA), is designed to operate with solar energy obtained from a parabolic dish concentrator, 11 meters in diameter, or with fossil energy supplied by burning fuels in a combustor, or by a combination of both (hybrid mode). The PCA consists of the Brayton cycle engine, a solar collector, a belt-driven 20-kW generator, and the necessary control systems for automatic operation in solar-only, fuel-only, and hybrid modes to supply electrical power to a utility grid. The original configuration of the generator set used the GTEC Model GTP36-51 gas turbine engine for the PCA prime mover. However, subsequent development of the GTEC Model AGT101 led to its selection as the power source for the PCA. Performance characteristics of the latter, thermally coupled to a solar collector for operation in the solar mode, are presented. The PCA was successfully demonstrated in the fuel-only mode at the GTEC Phoenix, Arizona, facilities prior to its shipment to Sandia National Laboratory in Albuquerque, New Mexico, for installation and testing on a test bed concentrator (parabolic dish). Considerations relative to Brayton-engine development using the all-ceramic AGT101 when it becomes available, which would satisfy the DOE heat engine efficiency goal of 35 to 41 percent, are also discussed in the report.
Inference from clustering with application to gene-expression microarrays.
Dougherty, Edward R; Barrera, Junior; Brun, Marcel; Kim, Seungchan; Cesar, Roberto M; Chen, Yidong; Bittner, Michael; Trent, Jeffrey M
2002-01-01
There are many algorithms to cluster sample data points based on nearness or a similarity measure. Often the implication is that points in different clusters come from different underlying classes, whereas those in the same cluster come from the same class. Stochastically, the underlying classes represent different random processes. The inference is that clusters represent a partition of the sample points according to which process they belong. This paper discusses a model-based clustering toolbox that evaluates cluster accuracy. Each random process is modeled as its mean plus independent noise, sample points are generated, the points are clustered, and the clustering error is the number of points clustered incorrectly according to the generating random processes. Various clustering algorithms are evaluated based on process variance and the key issue of the rate at which algorithmic performance improves with increasing numbers of experimental replications. The model means can be selected by hand to test the separability of expected types of biological expression patterns. Alternatively, the model can be seeded by real data to test the expected precision of that output or the extent of improvement in precision that replication could provide. In the latter case, a clustering algorithm is used to form clusters, and the model is seeded with the means and variances of these clusters. Other algorithms are then tested relative to the seeding algorithm. Results are averaged over various seeds. Output includes error tables and graphs, confusion matrices, principal-component plots, and validation measures. Five algorithms are studied in detail: K-means, fuzzy C-means, self-organizing maps, hierarchical Euclidean-distance-based and correlation-based clustering. The toolbox is applied to gene-expression clustering based on cDNA microarrays using real data. Expression profile graphics are generated and error analysis is displayed within the context of these profile graphics. A large amount of generated output is available over the web.
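The evaluation loop described above can be sketched in a few lines; the profile means, noise level, and use of scikit-learn's K-means are illustrative assumptions rather than the toolbox implementation.

```python
import itertools
import numpy as np
from sklearn.cluster import KMeans

# Sketch of the model-based evaluation: each class is a mean profile plus
# independent noise; points are generated, clustered, and the clustering error
# is the number of points assigned to the wrong generating process
# (minimised over label permutations).
rng = np.random.default_rng(1)
means = np.array([[0.0, 0.0, 1.0, 1.0],          # hypothetical expression profiles
                  [1.0, 1.0, 0.0, 0.0]])
n_per_class, sigma = 50, 0.4

X = np.vstack([m + rng.normal(0, sigma, (n_per_class, means.shape[1])) for m in means])
truth = np.repeat(np.arange(len(means)), n_per_class)

labels = KMeans(n_clusters=len(means), n_init=10, random_state=0).fit_predict(X)
errors = min(np.sum(truth != np.array(perm)[labels])
             for perm in itertools.permutations(range(len(means))))
print(f"clustering error: {errors} of {len(truth)} points")
```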
Real-Time Extended Interface Automata for Software Testing Cases Generation
Yang, Shunkun; Xu, Jiaqi; Man, Tianlong; Liu, Bin
2014-01-01
Testing and verification of the interfaces between software components are particularly important because of the large number of complex interactions, which requires traditional modeling languages to overcome their shortcomings in describing temporal information and in controlling software test inputs. This paper presents the real-time extended interface automata (RTEIA), which adds a clearer and more detailed description of temporal information through the application of time words. We also establish an input interface automaton for every input in order to handle input control and interface coverage flexibly when applied to software testing. Detailed definitions of the RTEIA and the test case generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of a real aircraft braking system. PMID:24892080
Green Energy Options for Consumer-Owned Business
DOE Office of Scientific and Technical Information (OSTI.GOV)
Co-opPlus of Western Massachusetts
2006-05-01
The goal of this project was to define, test, and prototype a replicable business model for consumer-owned cooperatives. The result is a replicable consumer-owned cooperative business model for the generation, interconnection, and distribution of renewable energy that incorporates energy conservation and efficiency improvements.
Correction of electronic record for weighing bucket precipitation gauge measurements
USDA-ARS?s Scientific Manuscript database
Electronic sensors generate valuable streams of forcing and validation data for hydrologic models, but are often subject to noise, which must be removed as part of model input and testing database development. We developed the Automated Precipitation Correction Program (APCP) for weighing bucket preci...
Computer-Based Linguistic Analysis.
ERIC Educational Resources Information Center
Wright, James R.
Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…
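A toy sketch of generation by phrase-structure rules with random lexical substitution is shown below; the grammar and lexicon are invented for illustration and do not reproduce the system described.

```python
import random

# Toy illustration (not the described system): phrase-structure rules expanded
# with random lexical substitution to generate test sentences from a grammar.
rules = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["model"], ["grammar"], ["computer"]],
    "V":   [["tests"], ["generates"]],
}

def expand(symbol):
    if symbol not in rules:                      # terminal word
        return [symbol]
    production = random.choice(rules[symbol])    # random lexical/production choice
    return [word for part in production for word in expand(part)]

random.seed(3)
for _ in range(3):
    print(" ".join(expand("S")))
```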
Integrating satellite imagery with simulation modeling to improve burn severity mapping
Eva C. Karau; Pamela G. Sikkink; Robert E. Keane; Gregory K. Dillon
2014-01-01
Both satellite imagery and spatial fire effects models are valuable tools for generating burn severity maps that are useful to fire scientists and resource managers. The purpose of this study was to test a new mapping approach that integrates imagery and modeling to create more accurate burn severity maps. We developed and assessed a statistical model that combines the...
The use of direct numerical simulation data in turbulence modeling
NASA Technical Reports Server (NTRS)
Mansour, N. N.
1991-01-01
Direct numerical simulations (DNS) of turbulent flows provide a complete data base to develop and to test turbulence models. In this article, the progress made in developing models for the dissipation rate equation is reviewed. New scaling arguments for the various terms in the dissipation rate equation were tested using data from DNS of homogeneous shear flows. Modifications to the epsilon-equation model that take into account near-wall effects were developed using DNS of turbulent channel flows. Testing of new models for flows under mean compression was carried out using data from DNS of isotropically compressed turbulence. In all of these studies the data from the simulations was essential in guiding the model development. The next generation of DNS will be at higher Reynolds numbers, and will undoubtedly lead to improved models for computations of flows of practical interest.
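For reference, the standard high-Reynolds-number modeled dissipation-rate transport equation reads as follows; the models reviewed in the article are variants of this form with near-wall and compression corrections, which are not shown here.

```latex
\frac{D\varepsilon}{Dt} =
  C_{\varepsilon 1}\,\frac{\varepsilon}{k}\,P_k
  \;-\; C_{\varepsilon 2}\,\frac{\varepsilon^{2}}{k}
  \;+\; \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_\varepsilon}\right)
  \frac{\partial \varepsilon}{\partial x_j}\right]
```

Here P_k is the production of turbulent kinetic energy, k the turbulent kinetic energy, nu_t the eddy viscosity, and C_eps1, C_eps2, and sigma_eps are model constants.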
Adjusting HIV prevalence estimates for non-participation: an application to demographic surveillance
McGovern, Mark E.; Marra, Giampiero; Radice, Rosalba; Canning, David; Newell, Marie-Louise; Bärnighausen, Till
2015-01-01
Introduction: HIV testing is a cornerstone of efforts to combat the HIV epidemic, and testing conducted as part of surveillance provides invaluable data on the spread of infection and the effectiveness of campaigns to reduce the transmission of HIV. However, participation in HIV testing can be low, and if respondents systematically select not to be tested because they know or suspect they are HIV positive (and fear disclosure), standard approaches to deal with missing data will fail to remove selection bias. We implemented Heckman-type selection models, which can be used to adjust for missing data that are not missing at random, and established the extent of selection bias in a population-based HIV survey in an HIV hyperendemic community in rural South Africa. Methods: We used data from a population-based HIV survey carried out in 2009 in rural KwaZulu-Natal, South Africa. In this survey, 5565 women (35%) and 2567 men (27%) provided blood for an HIV test. We accounted for missing data using interviewer identity as a selection variable which predicted consent to HIV testing but was unlikely to be independently associated with HIV status. Our approach involved using this selection variable to examine the HIV status of residents who would ordinarily refuse to test, except that they were allocated a persuasive interviewer. Our copula model allows for flexibility when modelling the dependence structure between HIV survey participation and HIV status. Results: For women, our selection model generated an HIV prevalence estimate of 33% (95% CI 27–40) for all people eligible to consent to HIV testing in the survey. This estimate is higher than the estimate of 24% generated when only information from respondents who participated in testing is used in the analysis, and the estimate of 27% when imputation analysis is used to predict missing data on HIV status. For men, we found an HIV prevalence of 25% (95% CI 15–35) using the selection model, compared to 16% among those who participated in testing, and 18% estimated with imputation. We provide new confidence intervals that correct for the fact that the relationship between testing and HIV status is unknown and requires estimation. Conclusions: We confirm the feasibility and value of adopting selection models to account for missing data in population-based HIV surveys and surveillance systems. Elements of survey design, such as interviewer identity, present the opportunity to adopt this approach in routine applications. Where non-participation is high, true confidence intervals are much wider than those generated by standard approaches to dealing with missing data suggest. PMID:26613900
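The core selection problem can be illustrated with a small simulation (hypothetical prevalence and consent rates, not the KwaZulu-Natal data and not the authors' copula model): when consent to test is negatively related to HIV status, the naive prevalence among testers understates the truth.

```python
import numpy as np

# Illustrative simulation of selection bias in survey-based HIV testing.
rng = np.random.default_rng(0)
n = 100_000

hiv = rng.random(n) < 0.30                       # assumed true prevalence of 30%
# assumed consent probabilities: positives are less likely to agree to test
p_consent = np.where(hiv, 0.20, 0.40)
tested = rng.random(n) < p_consent

print(f"true prevalence:            {hiv.mean():.3f}")
print(f"naive prevalence (testers): {hiv[tested].mean():.3f}")
```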
Switching moving boundary models for two-phase flow evaporators and condensers
NASA Astrophysics Data System (ADS)
Bonilla, Javier; Dormido, Sebastián; Cellier, François E.
2015-03-01
The moving boundary method is an appealing approach for the design, testing and validation of advanced control schemes for evaporators and condensers. When it comes to advanced control strategies, not only accurate but fast dynamic models are required. Moving boundary models are fast low-order dynamic models, and they can describe the dynamic behavior with high accuracy. This paper presents a mathematical formulation based on physical principles for two-phase flow moving boundary evaporator and condenser models which support dynamic switching between all possible flow configurations. The models were implemented in a library using the equation-based object-oriented Modelica language. Several integrity tests in steady-state and transient predictions together with stability tests verified the models. Experimental data from a direct steam generation parabolic-trough solar thermal power plant is used to validate and compare the developed moving boundary models against finite volume models.
A New Real - Time Fault Detection Methodology for Systems Under Test. Phase 1
NASA Technical Reports Server (NTRS)
Johnson, Roger W.; Jayaram, Sanjay; Hull, Richard A.
1998-01-01
The purpose of this research is focused on the identification and demonstration of critical technology innovations that will be applied to various applications, viz., automated machine Health Monitoring (HM), real-time data analysis, and control of Systems Under Test (SUT). This new innovation, using a High Fidelity Dynamic Model-based Simulation (HFDMS) approach, will be used to implement a real-time monitoring, Test and Evaluation (T&E) methodology that includes the transient behavior of the system under test. The unique element of this process control technique is the use of high-fidelity, computer-generated dynamic models to replicate the behavior of actual Systems Under Test (SUT). It will provide a dynamic simulation capability that becomes the reference truth model, from which comparisons are made with the actual raw/conditioned data from the test elements.
NASA Technical Reports Server (NTRS)
Collins, Timothy J.; Congdon, William M.; Smeltzer, Stanley S.; Whitley, Karen S.
2005-01-01
The next generation of planetary exploration vehicles will rely heavily on robust aero-assist technologies, especially those that include aerocapture. This paper provides an overview of an ongoing development program, led by NASA Langley Research Center (LaRC) and aimed at introducing high-temperature structures, adhesives, and advanced thermal protection system (TPS) materials into the aeroshell design process. The purpose of this work is to demonstrate TPS materials that can withstand the higher heating rates of NASA's next generation planetary missions, and to validate high-temperature structures and adhesives that can reduce required TPS thickness and total aeroshell mass, thus allowing for larger science payloads. The effort described consists of parallel work in several advanced aeroshell technology areas. The areas of work include high-temperature adhesives, high-temperature composite materials, advanced ablator (TPS) materials, sub-scale demonstration test articles, and aeroshell modeling and analysis. The status of screening test results for a broad selection of available higher-temperature adhesives is presented. It appears that at least one (and perhaps a few) adhesives have working temperatures ranging from 315-400 C (600-750 F), and are suitable for TPS-to-structure bondline temperatures that are significantly above the traditional allowable of 250 C (482 F). The status of mechanical testing of advanced high-temperature composite materials is also summarized. To date, these tests indicate the potential for good material performance at temperatures of at least 600 F. Application of these materials and adhesives to aeroshell systems that incorporate advanced TPS materials may reduce aeroshell TPS mass by 15% - 30%. A brief outline is given of work scheduled for completion in 2006 that will include fabrication and testing of large panels and subscale aeroshell test articles at the Solar-Tower Test Facility located at Kirtland AFB and operated by Sandia National Laboratories. These tests are designed to validate aeroshell manufacturability using advanced material systems, and to demonstrate the maintenance of bondline integrity at realistically high temperatures and heating rates. Finally, a status is given of ongoing aeroshell modeling and analysis efforts which will be used to correlate with experimental testing, and to provide a reliable means of extrapolating to performance under actual flight conditions. The modeling and analysis effort includes a parallel series of experimental tests to determine TPS thermal expansion and other mechanical properties which are required for input to the analysis models.
On the virtues of automated quantitative structure-activity relationship: the new kid on the block.
de Oliveira, Marcelo T; Katekawa, Edson
2018-02-01
Quantitative structure-activity relationship (QSAR) has proved to be an invaluable tool in medicinal chemistry. Data availability at unprecedented levels through various databases has contributed to a resurgence of interest in QSAR. In this context, rapid generation of quality predictive models is highly desirable for hit identification and lead optimization. We showcase the application of an automated QSAR approach, which randomly selects multiple training/test sets and utilizes machine-learning algorithms to generate predictive models. Results demonstrate that AutoQSAR produces models of improved or similar quality to those generated by practitioners in the field but in just a fraction of the time. Despite the potential benefit of the concept to the community, the AutoQSAR opportunity has been largely undervalued.
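The repeated random training/test split idea can be sketched as follows; the synthetic data, random forest learner, and scikit-learn calls are assumptions for illustration and not the AutoQSAR implementation or its descriptors.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Repeated random training/test splits, with model quality averaged over splits.
X, y = make_regression(n_samples=300, n_features=50, noise=10.0, random_state=0)

scores = []
for seed in range(10):                            # multiple random splits
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=seed)
    model = RandomForestRegressor(n_estimators=200, random_state=seed).fit(X_tr, y_tr)
    scores.append(r2_score(y_te, model.predict(X_te)))

print(f"mean test R^2 over splits: {np.mean(scores):.2f} +/- {np.std(scores):.2f}")
```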
Fite, Jennifer E.; Bates, John E.; Holtzworth-Munroe, Amy; Dodge, Kenneth A.; Nay, Sandra Y.; Pettit, Gregory S.
2012-01-01
This study explored the K. A. Dodge (1986) model of social information processing as a mediator of the association between interparental relationship conflict and subsequent offspring romantic relationship conflict in young adulthood. The authors tested 4 social information processing stages (encoding, hostile attributions, generation of aggressive responses, and positive evaluation of aggressive responses) in separate models to explore their independent effects as potential mediators. There was no evidence of mediation for encoding and attributions. However, there was evidence of significant mediation for both the response generation and response evaluation stages of the model. Results suggest that the ability of offspring to generate varied social responses and effectively evaluate the potential outcome of their responses at least partially mediates the intergenerational transmission of relationship conflict. PMID:18540765
The paper gives results of an investigation of the impact of an ozone generator air cleaner on vapor-phase styrene concentrations in a full-scale indoor air quality test chamber. The time history of the concentrations of styrene and ozone is well predicted by a simulation model u...
Direct mechanical torque sensor for model wind turbines
NASA Astrophysics Data System (ADS)
Kang, Hyung Suk; Meneveau, Charles
2010-10-01
A torque sensor is developed to measure the mechanical power extracted by model wind turbines. The torque is measured by mounting the model generator (a small dc motor) through ball bearings to the hub and by preventing its rotation by the deflection of a strain-gauge-instrumented plate. By multiplying the measured torque and rotor angular velocity, a direct measurement of the fluid mechanical power extracted from the flow is obtained. Such a measurement is more advantageous compared to measuring the electrical power generated by the model generator (dc motor), since the electrical power is largely affected by internal frictional, electric and magnetic losses. Calibration experiments are performed, and during testing, the torque sensor is mounted on a model wind turbine in a 3 rows × 3 columns array of wind turbines in a wind tunnel experiment. The resulting electrical and mechanical powers are quantified and compared over a range of applied loads, for three different incoming wind velocities. Also, the power coefficients are obtained as a function of the tip speed ratio. Significant differences between the electrical and mechanical powers are observed, which highlights the importance of using the direct mechanical power measurement for fluid dynamically meaningful results. A direct calibration with the measured current is also explored. The new torque sensor is expected to contribute to more accurate model wind tunnel tests which should provide added flexibility in model studies of the power that can be harvested from wind turbines and wind-turbine farms.
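A worked example of the quantities involved (illustrative numbers, not the experimental values) shows how the measured torque and rotor speed yield the extracted mechanical power, the power coefficient, and the tip speed ratio.

```python
import numpy as np

# Illustrative calculation of mechanical power and power coefficient; all
# numerical values are assumed, not taken from the wind tunnel experiment.
torque = 0.008                    # N*m, torque from the strain-gauge plate (assumed)
rpm = 2400.0                      # rotor speed (assumed)
omega = 2 * np.pi * rpm / 60.0    # rad/s

rho, U, D = 1.2, 10.0, 0.12       # air density, wind speed, rotor diameter (assumed)
area = np.pi * (D / 2) ** 2

p_mech = torque * omega                          # extracted fluid-mechanical power
cp = p_mech / (0.5 * rho * area * U ** 3)        # power coefficient
tsr = omega * (D / 2) / U                        # tip speed ratio
print(f"P_mech = {p_mech:.2f} W, Cp = {cp:.3f}, TSR = {tsr:.2f}")
```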
NASA Astrophysics Data System (ADS)
Kotulla, Ralf
2012-10-01
Over its lifespan Hubble has invested significant effort into detailed observations of galaxies both in the local and distant universe. To extract the physical information from the observed (spectro-)photometry requires detailed and accurate models. Stellar population synthesis models are frequently used to obtain stellar masses, star formation rate, galaxy ages and star formation histories. Chemical evolution models offer another valuable and complementary approach to gain insight into many of the same aspects, yet these two methods have rarely been used in combination. Our proposed next generation of galaxy evolution models will help us improve our understanding of how galaxies form and evolve. Building on GALEV evolutionary synthesis models we incorporate state-of-the-art input physics for stellar evolution of binaries and rotating stars as well as new spectral libraries well matched to the modern observational capabilities. Our improved chemical evolution model allows us to self-consistently trace abundances of individual elements, fully accounting for the increasing initial abundances of successive stellar generations. GALEV will support variable Initial Mass Functions (IMF), enabling us to test recent observational findings of a non-universal IMF by predicting chemical properties and integrated spectra in an integrated and consistent manner. HST is the perfect instrument for testing this approach. Its wide wavelength coverage from UV to NIR enables precise SED fitting, and with its spatial resolution we can compare the inferred chemical evolution to studies of star clusters and resolved stellar populations in nearby galaxies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, Chen; Gupta, Vipul; Huang, Shenyan
The goal of this project is to model long-term creep performance for nickel-base superalloy weldments in high temperature power generation systems. The project uses physics-based modeling methodologies and algorithms for predicting alloy properties in heterogeneous material structures. The modeling methodology will be demonstrated on a gas turbine combustor liner weldment of Haynes 282 precipitate-strengthened nickel-base superalloy. The major developments are: (1) microstructure-property relationships under creep conditions and microstructure characterization, (2) modeling of inhomogeneous microstructure in the superalloy weld, (3) modeling of mesoscale plastic deformation in the superalloy weld, and (4) a constitutive creep model that accounts for weld and base metal microstructure and their long-term evolution. The developed modeling technology is aimed at providing a more efficient and accurate assessment of a material's long-term performance compared with current testing and extrapolation methods. This modeling technology will also accelerate development and qualification of new materials in advanced power generation systems. This document is a final technical report for the project, covering efforts conducted from October 2014 to December 2016.
Preliminary measurements on heat balance in pneumatic tires
NASA Technical Reports Server (NTRS)
Nybakken, G. H.; Collart, D. Y.; Staples, R. J.; Lackey, J. I.; Clark, S. K.; Dodge, R. N.
1973-01-01
A variety of tests was undertaken to determine the nature of heat generation associated with a pneumatic tire operating under various conditions. Tests were conducted to determine the magnitude and distribution of internally generated heat caused by hysteresis in the rubber and ply fabric in an automobile tire operating under conditions of load, pressure, and velocity representative of normal operating conditions. These included tests at various yaw angles and tests with braking applied. In other tests, temperature sensors were mounted on a road surface to measure the effect of a tire rolling over them, and an attempt was made to deduce the magnitude and nature of interfacial friction from the resulting information. In addition, tests were performed using the scratch plate technique to determine the nature of the motion between the tire and road. Finally, a model tire was tested on a roadwheel whose surface covering could be changed, and an optical pyrometer was used to measure rubber surface temperatures.
Previous studies indicate that freshwater mollusks are more sensitive than commonly tested organisms to some chemicals, such as copper and ammonia. Nevertheless, mollusks are generally under-represented in toxicity databases. Studies are needed to generate data with which to comp...
NASA Technical Reports Server (NTRS)
Sutliff, Daniel L.; Walker, Bruce E.
2014-01-01
An Ultrasonic Configurable Fan Artificial Noise Source (UCFANS) was designed, built, and tested in support of the NASA Langley Research Center's 14x22 wind tunnel test of the Hybrid Wing Body (HWB) full 3-D 5.8% scale model. The UCFANS is a 5.8% rapid prototype scale model of a high-bypass turbofan engine that can generate the tonal signature of proposed engines using artificial sources (no flow). The purpose of the program was to provide an estimate of the acoustic shielding benefits possible from mounting an engine on the upper surface of a wing; a flat plate model was used as the shielding surface. Simple analytical simulations were used to preview the radiation patterns - Fresnel knife-edge diffraction was coupled with a dense phased array of point sources to compute shielded and unshielded sound pressure distributions for potential test geometries and excitation modes. Contour plots of sound pressure levels, and integrated power levels, from nacelle alone and shielded configurations for both the experimental measurements and the analytical predictions are presented in this paper.
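The knife-edge preview calculation can be illustrated for a single source and a single edge; the ITU-style approximation and the chosen geometry and frequency are assumptions, and the paper's coupling with a dense phased array of sources is not reproduced here.

```python
import numpy as np

# Single-edge, single-source illustration of Fresnel knife-edge diffraction loss.
def knife_edge_loss_db(h, d1, d2, wavelength):
    """Extra attenuation for an edge a height h above the source-receiver line."""
    v = h * np.sqrt(2.0 * (d1 + d2) / (wavelength * d1 * d2))   # Fresnel parameter
    if v <= -0.78:
        return 0.0
    return 6.9 + 20.0 * np.log10(np.sqrt((v - 0.1) ** 2 + 1.0) + v - 0.1)

c, f = 343.0, 2000.0                     # speed of sound (m/s) and tone frequency (Hz), assumed
lam = c / f
for h in (0.0, 0.05, 0.2, 0.5):          # edge height above the line of sight (m)
    loss = knife_edge_loss_db(h, 1.0, 3.0, lam)
    print(f"h = {h:4.2f} m  ->  extra attenuation {loss:5.1f} dB")
```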
Transfer function tests of the Joy longwall shearer
NASA Technical Reports Server (NTRS)
Fisher, P. H., Jr.
1978-01-01
A series of operational tests was performed on the Joy longwall shearer located at the Bureau of Mines in Bruceton, Pennsylvania. The purpose of these tests was to determine the transfer function and operational characteristics of the system. These characteristics will be used to generate a simulation model of the longwall shearer used in the development of the closed-loop vertical control system.
NASA Technical Reports Server (NTRS)
Carter, John F.; Nagy, Christopher J.; Barnicki, Joseph S.
1997-01-01
Forces generated by the Space Shuttle orbiter tire under varying vertical load, slip angle, speed, and surface conditions were measured using the Landing System Research Aircraft (LSRA). Resulting data were used to calculate a mathematical model for predicting tire forces in orbiter simulations. Tire side and drag forces experienced by an orbiter tire are cataloged as a function of vertical load and slip angle. The mathematical model is compared to existing tire force models for the Space Shuttle orbiter. This report describes the LSRA and a typical test sequence. Testing methods, data reduction, and error analysis are presented. The LSRA testing was conducted on concrete and lakebed runways at the Edwards Air Force Flight Test Center and on concrete runways at the Kennedy Space Center (KSC). Wet runway tire force tests were performed on test strips made at the KSC using different surfacing techniques. Data were corrected for ply steer forces and conicity.
Classification framework for partially observed dynamical systems
NASA Astrophysics Data System (ADS)
Shen, Yuan; Tino, Peter; Tsaneva-Atanasova, Krasimira
2017-04-01
We present a general framework for classifying partially observed dynamical systems based on the idea of learning in the model space. In contrast to the existing approaches using point estimates of model parameters to represent individual data items, we employ posterior distributions over model parameters, thus taking into account in a principled manner the uncertainty due to both the generative (observational and/or dynamic noise) and observation (sampling in time) processes. We evaluate the framework on two test beds: a biological pathway model and a stochastic double-well system. Crucially, we show that the classification performance is not impaired when the model structure used for inferring posterior distributions is much more simple than the observation-generating model structure, provided the reduced-complexity inferential model structure captures the essential characteristics needed for the given classification task.
The Dynamical Core Model Intercomparison Project (DCMIP-2016): Results of the Supercell Test Case
NASA Astrophysics Data System (ADS)
Zarzycki, C. M.; Reed, K. A.; Jablonowski, C.; Ullrich, P. A.; Kent, J.; Lauritzen, P. H.; Nair, R. D.
2016-12-01
The 2016 Dynamical Core Model Intercomparison Project (DCMIP-2016) assesses the modeling techniques for global climate and weather models and was recently held at the National Center for Atmospheric Research (NCAR) in conjunction with a two-week summer school. Over 12 different international modeling groups participated in DCMIP-2016 and focused on the evaluation of the newest non-hydrostatic dynamical core designs for future high-resolution weather and climate models. The paper highlights the results of the third DCMIP-2016 test case, which is an idealized supercell storm on a reduced-radius Earth. The supercell storm test permits the study of a non-hydrostatic moist flow field with strong vertical velocities and associated precipitation. This test assesses the behavior of global modeling systems at extremely high spatial resolution and is used in the development of next-generation numerical weather prediction capabilities. In this regime the effective grid spacing is very similar to the horizontal scale of convective plumes, emphasizing resolved non-hydrostatic dynamics. The supercell test case sheds light on the physics-dynamics interplay and highlights the impact of diffusion on model solutions.
Nowinski, Wieslaw L; Thirunavuukarasuu, Arumugam; Ananthasubramaniam, Anand; Chua, Beng Choon; Qian, Guoyu; Nowinska, Natalia G; Marchenko, Yevgen; Volkau, Ihar
2009-10-01
Preparation of tests and student's assessment by the instructor are time consuming. We address these two tasks in neuroanatomy education by employing a digital media application with a three-dimensional (3D), interactive, fully segmented, and labeled brain atlas. The anatomical and vascular models in the atlas are linked to Terminologia Anatomica. Because the cerebral models are fully segmented and labeled, our approach enables automatic and random atlas-derived generation of questions to test location and naming of cerebral structures. This is done in four steps: test individualization by the instructor, test taking by the students at their convenience, automatic student assessment by the application, and communication of the individual assessment to the instructor. A computer-based application with an interactive 3D atlas and a preliminary mobile-based application were developed to realize this approach. The application works in two test modes: instructor and student. In the instructor mode, the instructor customizes the test by setting the scope of testing and student performance criteria, which takes a few seconds. In the student mode, the student is tested and automatically assessed. Self-testing is also feasible at any time and pace. Our approach is automatic both with respect to test generation and student assessment. It is also objective, rapid, and customizable. We believe that this approach is novel from computer-based, mobile-based, and atlas-assisted standpoints.
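A toy sketch of atlas-derived question generation and automatic scoring is given below; the structure list and question format are hypothetical and stand in for the fully segmented, labeled 3D atlas models of the actual application.

```python
import random

# Toy sketch: randomly generate naming questions from a labeled structure list
# and score responses automatically (hypothetical data, not the atlas software).
structures = ["hippocampus", "thalamus", "putamen", "caudate nucleus", "amygdala"]

def make_question(rng, n_choices=4):
    answer = rng.choice(structures)
    distractors = rng.sample([s for s in structures if s != answer], n_choices - 1)
    choices = distractors + [answer]
    rng.shuffle(choices)
    # in the real application the structure is highlighted in the 3D view
    return {"prompt": "Name the highlighted structure.", "choices": choices, "answer": answer}

rng = random.Random(42)
test = [make_question(rng) for _ in range(5)]
responses = [q["answer"] for q in test]          # simulate a perfect student
score = sum(r == q["answer"] for r, q in zip(responses, test))
print(f"score: {score}/{len(test)}")
```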
NASA Technical Reports Server (NTRS)
Sutliff, Daniel, L.; Brown, Clifford, A.; Walker, Bruce, E.
2012-01-01
An Ultrasonic Configurable Fan Artificial Noise Source (UCFANS) was designed, built, and tested in support of the Langley Research Center's 14- by 22-Foot wind tunnel test of the Hybrid Wing Body (HWB) full three-dimensional 5.8 percent scale model. The UCFANS is a 5.8 percent rapid prototype scale model of a high-bypass turbofan engine that can generate the tonal signature of candidate engines using artificial sources (no flow). The purpose of the test was to provide an estimate of the acoustic shielding benefits possible from mounting the engine on the upper surface of an HWB aircraft and to provide a database for shielding code validation. A range of frequencies and a parametric study of modes were generated from exhaust and inlet nacelle configurations. Radiated acoustic data were acquired from a traversing linear array of 13 microphones, spanning 36 in. Two planes perpendicular to the axis of the nacelle (in its 0° orientation) and three planes parallel to it were acquired from the array sweep. In each plane the linear array traversed five sweeps, for a total acquired span of 160 in. The resolution of the sweep is variable, so that points closer to the model are taken at a higher resolution. Contour plots of Sound Pressure Level and integrated Power Levels are presented in this paper, as well as the in-duct modal structure.
R&D of high reliable refrigeration system for superconducting generators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hosoya, T.; Shindo, S.; Yaguchi, H.
1996-12-31
Super-GM carries out R&D of 70 MW class superconducting generators (model machines), refrigeration systems and superconducting wires to apply superconducting technology to electric power apparatuses. The helium refrigeration system for keeping the field windings of a superconducting generator (SCG) in a cryogenic environment must meet the requirement of high reliability for uninterrupted long-term operation of the SCG. In FY 1992, a highly reliable conventional refrigeration system for the model machines was integrated by combining components such as the compressor unit, higher-temperature cold box and lower-temperature cold box, which were manufactured utilizing various fundamental technologies developed in the early stage of the project since 1988. Since FY 1993, its performance tests have been carried out. It has been confirmed that its performance fulfilled the development targets of a liquefaction capacity of 100 L/h and impurity removal in the helium gas to < 0.1 ppm. Furthermore, its operation method and performance were clarified for all the different operating modes, such as how to control the liquefaction rate and how to supply liquid helium from a dewar to the model machine. In addition, the authors have performed performance tests and system performance analyses of oil-free screw-type and turbo-type compressors, which greatly improve the reliability of conventional refrigeration systems. The operation performance and operational control method of the compressors have been clarified through the tests and analysis.
Validated numerical simulation model of a dielectric elastomer generator
NASA Astrophysics Data System (ADS)
Foerster, Florentine; Moessinger, Holger; Schlaak, Helmut F.
2013-04-01
Dielectric elastomer generators (DEG) produce electrical energy by converting mechanical energy into electrical energy. Efficient operation requires homogeneous deformation of each single layer. However, owing to various internal and external influences, such as supports or the shape of a DEG, the deformation will be inhomogeneous and hence will negatively affect the amount of generated electrical energy. Optimization of the deformation behavior leads to improved efficiency of the DEG and consequently to higher energy gain. In this work a numerical simulation model of a multilayer dielectric elastomer generator is developed using the FEM software ANSYS. The analyzed multilayer DEG consists of 49 active dielectric layers with layer thicknesses of 50 μm. The elastomer is silicone (PDMS) while the compliant electrodes are made of graphite powder. In the simulation the real material parameters of the PDMS and the graphite electrodes need to be included. Therefore, the mechanical and electrical material parameters of the PDMS are determined by experimental investigations of test samples, while the electrode parameters are determined by numerical simulations of test samples. The numerical simulation of the DEG is carried out as a coupled electro-mechanical simulation for the constant-voltage energy harvesting cycle. Finally, the derived numerical simulation model is validated by comparison with analytical calculations and further simulated DEG configurations. The comparison of the results shows good agreement with regard to the deformation of the DEG. Based on the validated model it is now possible to optimize the DEG layout for improved deformation behavior with further simulations.
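For orientation, an idealized estimate of the electrical energy gained per constant-voltage harvesting cycle, E ≈ ½ (C_max − C_min) V², can be computed from the layer geometry; the area, stretch ratio, permittivity, and priming voltage below are assumed values, not those of the tested generator.

```python
import numpy as np

# Idealised constant-voltage harvesting cycle estimate for a multilayer DEG.
eps0, eps_r = 8.854e-12, 2.8        # vacuum permittivity; PDMS relative permittivity (assumed)
n_layers, thickness = 49, 50e-6     # from the abstract
area_relaxed, stretch = 25e-4, 1.5  # 25 cm^2 active area (assumed); area stretch ratio (assumed)

def capacitance(area, d):
    return n_layers * eps0 * eps_r * area / d

c_min = capacitance(area_relaxed, thickness)
# incompressible elastomer: area increases by `stretch`, thickness decreases by the same factor
c_max = capacitance(area_relaxed * stretch, thickness / stretch)

v_prime = 1000.0                                  # priming voltage (assumed), V
energy_per_cycle = 0.5 * (c_max - c_min) * v_prime ** 2
print(f"C_min = {c_min*1e9:.1f} nF, C_max = {c_max*1e9:.1f} nF, "
      f"E/cycle ~ {energy_per_cycle*1e3:.1f} mJ")
```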
Overview of Heat Addition and Efficiency Predictions for an Advanced Stirling Convertor
NASA Technical Reports Server (NTRS)
Wilson, Scott D.; Reid, Terry V.; Schifer, Nicholas A.; Briggs, Maxwell H.
2012-01-01
The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot end and cold end temperatures, and specified electrical power output for a given net heat input. Microporous bulk insulation is used in the ground support test hardware to minimize the loss of thermal energy from the electric heat source to the environment. The insulation package is characterized before operation to predict how much heat will be absorbed by the convertor and how much will be lost to the environment during operation. In an effort to validate these predictions, numerous tasks have been performed, which provided a more accurate value for net heat input into the ASCs. This test and modeling effort included: (a) making thermophysical property measurements of test setup materials to provide inputs to the numerical models, (b) acquiring additional test data that was collected during convertor tests to provide numerical models with temperature profiles of the test setup via thermocouple and infrared measurements, (c) using multidimensional numerical models (computational fluid dynamics code) to predict net heat input of an operating convertor, and (d) using validation test hardware to provide direct comparison of numerical results and validate the multidimensional numerical models used to predict convertor net heat input. This effort produced high fidelity ASC net heat input predictions, which were successfully validated using specially designed test hardware enabling measurement of heat transferred through a simulated Stirling cycle. The overall effort and results are discussed.
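A one-dimensional conduction estimate conveys the heat-balance idea behind the insulation characterization (the actual work used multidimensional computational fluid dynamics models); all values below are assumed.

```python
# Simple one-dimensional conduction estimate of insulation heat loss; the net
# heat into the convertor is the heater input minus the loss to the environment.
k_ins = 0.03          # W/(m*K), effective conductivity of microporous insulation (assumed)
area = 0.05           # m^2, effective loss area (assumed)
thickness = 0.05      # m, insulation thickness (assumed)
t_hot, t_amb = 650.0, 25.0     # deg C, heater-side and ambient temperatures (assumed)

q_gross = 250.0                                   # W, electrical heat source input (assumed)
q_loss = k_ins * area * (t_hot - t_amb) / thickness
q_net = q_gross - q_loss
print(f"estimated loss {q_loss:.1f} W -> net heat into convertor ~ {q_net:.1f} W")
```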
Khansai, Manatsanan; Boonmaleerat, Kanchanit; Pothacharoen, Peraphan; Phitak, Thanyaluck; Kongtawelert, Prachya
2016-07-11
Rheumatoid arthritis (RA) is an autoimmune disease associated with chronic inflammatory arthritis. TNF-α and OSM are pro-inflammatory cytokines that play a key role in RA progression. Thus, reducing the effects of both cytokines is a practical way to slow the progression of the disease. The current study focuses on sesamin, an active compound in sesame seeds. Sesamin has been shown to be a chondroprotective agent in osteoarthritis models. Here, we evaluated a porcine cartilage explant as a cartilage degradation model related to RA induced by TNF-α and/or OSM, in order to investigate the effects of sesamin on TNF-α and OSM in the cartilage degradation model. A porcine cartilage explant was induced with a combination of TNF-α and OSM (test group) or IL-1β and OSM (control group), followed by co-treatment with sesamin over a long-term period (35 days), after which the tested explants were analyzed for both the remaining and the degraded matrix, using glycosaminoglycan and collagen as indicators. The combination of TNF-α and OSM promoted cartilage degradation more than either TNF-α or OSM alone and was comparable with the combination of IL-1β and OSM. Sesamin appeared to offer protection against cartilage degradation by reducing GAG and collagen turnover in the generated model. Sesamin might be a promising agent as an alternative treatment for RA patients. Furthermore, the generated explant system proved to be a useful test model for analyzing phytochemical substances against RA-related cartilage degradation. The model could also be used to test other biological agents for the prevention of cartilage degradation induced by TNF-α and OSM.
Analysis of Direct Solar Illumination on the Backside of Space Station Solar Cells
NASA Technical Reports Server (NTRS)
Delleur, Ann M.; Kerslake, Thomas W.; Scheiman, David A.
1999-01-01
The International Space Station (ISS) is a complex spacecraft that will take several years to assemble in orbit. During many of the assembly and maintenance procedures, the space station's large solar arrays must be locked, which can significantly reduce power generation. To date, power generation analyses have not included power generation from the backside of the solar cells, in a desire to produce a conservative analysis. This paper describes the testing of ISS solar cell backside power generation, analytical modeling, and analysis results for an ISS assembly mission.
Combustion Stability Analyses for J-2X Gas Generator Development
NASA Technical Reports Server (NTRS)
Hulka, J. R.; Protz, C. S.; Casiano, M. J.; Kenny, R. J.
2010-01-01
The National Aeronautics and Space Administration (NASA) is developing a liquid oxygen/liquid hydrogen rocket engine for upper stage and trans-lunar applications of the Ares vehicles for the Constellation program. This engine, designated the J-2X, is a higher pressure, higher thrust variant of the Apollo-era J-2 engine. Development was contracted to Pratt & Whitney Rocketdyne in 2006. Over the past several years, development of the gas generator for the J-2X engine has progressed through a variety of workhorse injector, chamber, and feed system configurations. Several of these configurations have resulted in injection-coupled combustion instability of the gas generator assembly at the first longitudinal mode of the combustion chamber. In this paper, the longitudinal mode combustion instabilities observed on the workhorse test stand are discussed in detail. Aspects of this combustion instability have been modeled at the NASA Marshall Space Flight Center with several codes, including the Rocket Combustor Interaction Design and Analysis (ROCCID) code and a new lumped-parameter MatLab model. To accurately predict the instability characteristics of all the chamber and injector geometries and test conditions, several features of the submodels in the ROCCID suite of calculations required modification. Finite-element analyses were conducted of several complicated combustion chamber geometries to determine how to model and anchor the chamber response in ROCCID. A large suite of sensitivity calculations were conducted to determine how to model and anchor the injector response in ROCCID. These modifications and their ramification for future stability analyses of this type are discussed in detail. The lumped-parameter MatLab model of the gas generator assembly was created as an alternative calculation to the ROCCID methodology. This paper also describes this model and the stability calculations.
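For orientation, the first longitudinal (1L) acoustic mode of a combustion chamber idealized as a closed-closed duct is f_1L = c / (2L); the gas properties and chamber length below are assumed values, not the J-2X gas generator data.

```python
import numpy as np

# Back-of-the-envelope first longitudinal (1L) acoustic mode estimate.
gamma, r_gas, temp = 1.25, 480.0, 1000.0   # ratio of specific heats, J/(kg*K), K (assumed)
length = 0.5                               # chamber acoustic length, m (assumed)

c = np.sqrt(gamma * r_gas * temp)          # speed of sound in the hot gas
f_1l = c / (2.0 * length)
print(f"c ~ {c:.0f} m/s, first longitudinal mode ~ {f_1l:.0f} Hz")
```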
Statistically generated weighted curve fit of residual functions for modal analysis of structures
NASA Technical Reports Server (NTRS)
Bookout, P. S.
1995-01-01
A statistically generated weighting function for a second-order polynomial curve fit of residual functions has been developed. The residual flexibility test method, from which a residual function is generated, is a procedure for modal testing large structures in an external constraint-free environment to measure the effects of higher order modes and interface stiffness. This test method is applicable to structures with distinct degree-of-freedom interfaces to other system components. A theoretical residual function in the displacement/force domain has the characteristics of a relatively flat line in the lower frequencies and a slight upward curvature in the higher frequency range. In the test residual function, the above-mentioned characteristics can be seen in the data, but due to the present limitations in the modal parameter evaluation (natural frequencies and mode shapes) of test data, the residual function has regions of ragged data. A second order polynomial curve fit is required to obtain the residual flexibility term. A weighting function of the data is generated by examining the variances between neighboring data points. From a weighted second-order polynomial curve fit, an accurate residual flexibility value can be obtained. The residual flexibility value and free-free modes from testing are used to improve a mathematical model of the structure. The residual flexibility modal test method is applied to a straight beam with a trunnion appendage and a space shuttle payload pallet simulator.
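A minimal sketch of a variance-based weighting feeding a quadratic fit is shown below, using synthetic data in place of the test-derived residual function; the neighborhood size and exact weighting rule are assumptions, and numpy's weighted polynomial fit stands in for the actual procedure.

```python
import numpy as np

# Synthetic residual function: flat at low frequency with slight upward
# curvature, plus a "ragged" noisy region, then a weighted quadratic fit.
rng = np.random.default_rng(0)
freq = np.linspace(5.0, 50.0, 60)                              # Hz
true = 1e-4 + 2e-8 * freq ** 2
resid = true + rng.normal(0.0, 1e-5, freq.size) * (1 + (freq > 30) * 3)

# weight each point by the inverse variance of its local neighbourhood
half = 2
local_var = np.array([np.var(resid[max(0, i - half): i + half + 1])
                      for i in range(freq.size)])
weights = 1.0 / (local_var + 1e-16)

coeffs = np.polyfit(freq, resid, deg=2, w=np.sqrt(weights))    # w ~ 1/sigma
print("quadratic coefficients:", coeffs)
print(f"low-frequency (constant) term ~ {coeffs[-1]:.3e}")
```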
Real time testing of intelligent relays for synchronous distributed generation islanding detection
NASA Astrophysics Data System (ADS)
Zhuang, Davy
As electric power systems continue to grow to meet ever-increasing energy demand, their security, reliability, and sustainability requirements also become more stringent. The deployment of distributed energy resources (DER), including generation and storage, in conventional passive distribution feeders, gives rise to integration problems involving protection and unintentional islanding. Distributed generators need to be islanded for safety reasons when disconnected or isolated from the main feeder as distributed generator islanding may create hazards to utility and third-party personnel, and possibly damage the distribution system infrastructure, including the distributed generators. This thesis compares several key performance indicators of a newly developed intelligent islanding detection relay, against islanding detection devices currently used by the industry. The intelligent relay employs multivariable analysis and data mining methods to arrive at decision trees that contain both the protection handles and the settings. A test methodology is developed to assess the performance of these intelligent relays on a real time simulation environment using a generic model based on a real-life distribution feeder. The methodology demonstrates the applicability and potential advantages of the intelligent relay, by running a large number of tests, reflecting a multitude of system operating conditions. The testing indicates that the intelligent relay often outperforms frequency, voltage and rate of change of frequency relays currently used for islanding detection, while respecting the islanding detection time constraints imposed by standing distributed generator interconnection guidelines.
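A toy version of the data-mined decision-tree relay logic can be sketched with synthetic frequency, voltage, and rate-of-change-of-frequency features; the feature distributions and the scikit-learn classifier are illustrative assumptions, not the thesis' relay, feeder model, or real-time test bench.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic islanding vs. normal-operation events described by three handles.
rng = np.random.default_rng(7)
n = 4000

islanded = rng.random(n) < 0.5
freq_dev = np.where(islanded, rng.normal(0.8, 0.5, n), rng.normal(0.0, 0.1, n))     # Hz
volt_dev = np.where(islanded, rng.normal(0.08, 0.05, n), rng.normal(0.0, 0.02, n))  # pu
rocof    = np.where(islanded, rng.normal(1.5, 1.0, n), rng.normal(0.0, 0.2, n))     # Hz/s

X = np.column_stack([freq_dev, volt_dev, rocof])
X_tr, X_te, y_tr, y_te = train_test_split(X, islanded, test_size=0.3, random_state=0)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print(f"hold-out accuracy: {tree.score(X_te, y_te):.3f}")
```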
Test model designs for advanced refractory ceramic materials
NASA Technical Reports Server (NTRS)
Tran, Huy Kim
1993-01-01
The next generation of space vehicles will be subjected to severe aerothermal loads and will require an improved thermal protection system (TPS) and other advanced vehicle components. In order to ensure the satisfactory performance of the TPS and other advanced vehicle materials and components, testing is to be performed in environments similar to space flight. The design and fabrication of the test models should be fairly simple but still accomplish test objectives. In the Advanced Refractory Ceramic Materials test series, the models and model holders will need to withstand the required heat fluxes of 340 to 817 W/sq cm or surface temperatures in the range of 2700 K to 3000 K. The model holders should provide one-dimensional (1-D) heat transfer to the samples and the appropriate flow field without compromising the primary test objectives. Optical properties such as the effective emissivity and catalytic efficiency coefficients, as well as thermal properties and mass loss measurements, are also taken into consideration in the design process. Therefore, it is the intent of this paper to demonstrate the design schemes for different models and model holders that would accommodate these test requirements and ensure safe operation in a typical arc jet facility.
Model-Driven Architecture for Agent-Based Systems
NASA Technical Reports Server (NTRS)
Gradanin, Denis; Singh, H. Lally; Bohner, Shawn A.; Hinchey, Michael G.
2004-01-01
The Model Driven Architecture (MDA) approach uses a platform-independent model to define system functionality, or requirements, using some specification language. The requirements are then translated to a platform-specific model for implementation. An agent architecture based on the human cognitive model of planning, the Cognitive Agent Architecture (Cougaar), is selected as the implementation platform. The resulting Cougaar MDA prescribes certain kinds of models to be used, how those models may be prepared, and the relationships of the different kinds of models. Using the existing Cougaar architecture, the level of application composition is elevated from individual components to domain-level model specifications in order to generate software artifacts. The software artifact generation is based on a metamodel. Each component maps to a UML structured component, which is then converted into multiple artifacts: Cougaar/Java code, documentation, and test cases.
Modeling OAE responses to short tones
NASA Astrophysics Data System (ADS)
Duifhuis, Hendrikus; Siegel, Jonathan
2015-12-01
In 1999 Shera and Guinan postulated that otoacoustic emissions evoked by low-level transient stimuli are generated by coherent linear reflection (CRF or CLR). This hypothesis was tested experimentally, e.g., by Siegel and Charaziak [10] by measuring emissions evoked by short (1 ms) tone pips in chinchilla. Using techniques in which supplied level and recorded spectral information were used, Siegel and Charaziak concluded that much of the emission was generated by a mechanism in a region extending basally from the peak of the traveling wave, and that the action of the suppressor is to remove emission generators evoked by the tone pip and not to generate nonlinear artifacts in regions basal to the peak region. The original formulation of the CRF theory does not account for these results. This study addresses relevant cochlear model predictions.
NASA Astrophysics Data System (ADS)
Morris, Joseph W.; Lowry, Mac; Boren, Brett; Towers, James B.; Trimble, Darian E.; Bunfield, Dennis H.
2011-06-01
The US Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) and the Redstone Test Center (RTC) has formed the Scene Generation Development Center (SGDC) to support the Department of Defense (DoD) open source EO/IR Scene Generation initiative for real-time hardware-in-the-loop and all-digital simulation. Various branches of the DoD have invested significant resources in the development of advanced scene and target signature generation codes. The SGDC goal is to maintain unlimited government rights and controlled access to government open source scene generation and signature codes. In addition, the SGDC provides development support to a multi-service community of test and evaluation (T&E) users, developers, and integrators in a collaborative environment. The SGDC has leveraged the DoD Defense Information Systems Agency (DISA) ProjectForge (https://Project.Forge.mil) which provides a collaborative development and distribution environment for the DoD community. The SGDC will develop and maintain several codes for tactical and strategic simulation, such as the Joint Signature Image Generator (JSIG), the Multi-spectral Advanced Volumetric Real-time Imaging Compositor (MAVRIC), and Office of the Secretary of Defense (OSD) Test and Evaluation Science and Technology (T&E/S&T) thermal modeling and atmospherics packages, such as EOView, CHARM, and STAR. Other utility packages included are the ContinuumCore for real-time messaging and data management and IGStudio for run-time visualization and scenario generation.
Automatic mathematical modeling for real time simulation system
NASA Technical Reports Server (NTRS)
Wang, Caroline; Purinton, Steve
1988-01-01
A methodology for automatic mathematical modeling and generation of simulation models is described. The models will be verified by running in a test environment using standard profiles, with the results compared against known results. The major objective is to create a user-friendly environment for engineers to design, maintain, and verify their models and also to automatically convert the mathematical model into conventional code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine Simulation. It is written in LISP and MACSYMA and runs on a Symbolics 3670 Lisp Machine. The program provides a very friendly and well-organized environment for engineers to build a knowledge base for base equations and general information. It contains an initial set of component process elements for the Space Shuttle Main Engine Simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. The system is then able to automatically generate the model and FORTRAN code. The future goal, which is under development, is to download the FORTRAN code to a VAX/VMS system for conventional computation. The SSME mathematical model will be verified in a test environment and the solution compared with the real data profile. The use of artificial intelligence techniques has shown that the process of simulation modeling can be simplified.
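A modern analogue of this symbolic-model-to-code workflow can be sketched with SymPy (the original system used LISP and MACSYMA on a Lisp Machine); the component relation below is hypothetical and only illustrates turning a symbolic model equation into FORTRAN source.

```python
import sympy as sp

# Build a symbolic component relation, then emit FORTRAN code from it.
mdot, a_t, p_c, c_star = sp.symbols("mdot A_t p_c c_star", positive=True)

# hypothetical relation: chamber pressure from mass flow, throat area and
# characteristic velocity (for illustration only)
model_eq = sp.Eq(p_c, mdot * c_star / a_t)

rhs = sp.solve(model_eq, p_c)[0]
print(sp.fcode(rhs, assign_to="p_c", source_format="free", standard=95))
```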
NASA Astrophysics Data System (ADS)
Liu, Dan; Li, Congsheng; Kang, Yangyang; Zhou, Zhou; Xie, Yi; Wu, Tongning
2017-09-01
In this study, the plane wave exposure of an infant to radiofrequency electromagnetic fields of 3.5 GHz was numerically analyzed to investigate the unintentional electromagnetic field (EMF) exposure from fifth generation (5G) signals during field tests. The dosimetric influence of age-dependent dielectric properties and the influence of an adult body were evaluated using a 12-month-old infant model and an adult female model. The results demonstrated that the whole-body-averaged specific absorption rate (WBASAR) was not significantly affected by age-dependent dielectric properties, and the influence of the adult body did not enhance the WBASAR. Taking the magnitude of the in situ
NASA Technical Reports Server (NTRS)
Frady, Gregory P.; Duvall, Lowery D.; Fulcher, Clay W. G.; Laverde, Bruce T.; Hunt, Ronald A.
2011-01-01
A rich body of vibroacoustic test data was recently generated at Marshall Space Flight Center for a curved orthogrid panel typical of launch vehicle skin structures. Several test article configurations were produced by adding component equipment of differing weights to the flight-like vehicle panel. The test data were used to anchor computational predictions of a variety of spatially distributed responses including acceleration, strain and component interface force. Transfer functions relating the responses to the input pressure field were generated from finite element based modal solutions and test-derived damping estimates. A diffuse acoustic field model was employed to describe the assumed correlation of phased input sound pressures across the energized panel. This application demonstrates the ability to quickly and accurately predict a variety of responses to acoustically energized skin panels with mounted components. Favorable comparisons between the measured and predicted responses were established. The validated models were used to examine vibration response sensitivities to relevant modeling parameters such as pressure patch density, mesh density, weight of the mounted component and model form. Convergence metrics include spectral densities and cumulative root-mean squared (RMS) functions for acceleration, velocity, displacement, strain and interface force. Minimum frequencies for response convergence were established as well as recommendations for modeling techniques, particularly in the early stages of a component design when accurate structural vibration requirements are needed relatively quickly. The results were compared with long-established guidelines for modeling accuracy of component-loaded panels. A theoretical basis for the Response/Pressure Transfer Function (RPTF) approach provides insight into trends observed in the response predictions and confirmed in the test data. The software modules developed for the RPTF method can be easily adapted for quick replacement of the diffuse acoustic field with other pressure field models; for example a turbulent boundary layer (TBL) model suitable for vehicle ascent. Wind tunnel tests have been proposed to anchor the predictions and provide new insight into modeling approaches for this type of environment. Finally, component vibration environments for design were developed from the measured and predicted responses and compared with those derived from traditional techniques such as Barrett scaling methods for unloaded and component-loaded panels.
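The transfer-function step can be illustrated for a single mode: the response power spectral density equals the squared magnitude of the response/pressure transfer function times the input pressure PSD, and the overall RMS follows by integration; all values below are assumed, not the test data.

```python
import numpy as np

# Single-mode illustration of the Response/Pressure Transfer Function idea.
f = np.linspace(1.0, 2000.0, 4000)               # Hz
fn, zeta, gain = 180.0, 0.03, 5.0e-3             # modal frequency, damping, static gain in g/Pa (assumed)

r = f / fn
H = gain / np.sqrt((1 - r ** 2) ** 2 + (2 * zeta * r) ** 2)   # |acceleration / pressure|

Gpp = np.full_like(f, 10.0)                      # flat diffuse-field pressure PSD, Pa^2/Hz (assumed)
Gaa = (H ** 2) * Gpp                             # acceleration response PSD, g^2/Hz

rms = np.sqrt(np.trapz(Gaa, f))                  # cumulative RMS over the band
print(f"overall acceleration response ~ {rms:.2f} g RMS")
```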
Naik, Aijaz A.; Patro, Ishan K.; Patro, Nisha
2015-01-01
Environmental stressors, including protein malnutrition (PMN) during the pre-, neo-, and post-natal periods, have been documented to affect cognitive development and cause increased susceptibility to neuropsychiatric disorders. Most studies have addressed only one of the three windows, which does not emulate the clinical conditions of intra-uterine growth restriction (IUGR). Such data fail to provide a complete picture of the behavioral alterations in the F1 generation. The present study therefore addresses the larger window from gestation to the F1 generation, a new model of intra-generational PMN. Naive Sprague Dawley (SD) dams, pre-gestationally switched to LP (8% protein) or HP (20% protein) diets for 45 days, were bred and maintained on the same diets throughout gestation. Pups born to HP/LP dams were maintained on the respective diets after weaning. The present study aimed to show sex-specific differences in the neurobehavioral development and behavioral phenotype of the HP/LP F1-generation pups. A battery of neurodevelopmental reflex tests, behavioral assays (open field and forelimb grip-strength test), and cognitive assays [elevated plus maze (EPM) and Morris water maze (MWM)] was performed. A decelerated growth curve with significantly restricted body and brain weight, delays in the appearance of neuro-reflexes, and poor performance were recorded in the LP group rats. Intra-generational PMN induced poor habituation with time to a novel environment and a low-anxiety, hyperactive-like profile in the open field test in young and adult rats. The study revealed poor forelimb neuromuscular strength in LP F1 pups through adulthood. Group occupancy plots in the MWM test revealed hyperactivity with poor learning and impaired memory retention and integration, thus modeling the signs of an early-onset Alzheimer-like phenotype. In addition, a sex-specific effect of the LP diet, with greater severity in males and relative sparing of females, was also noticed. PMID:26696810
Transgenerational Adaptation to Pollution Changes Energy Allocation in Populations of Nematodes.
Goussen, Benoit; Péry, Alexandre R R; Bonzom, Jean-Marc; Beaudouin, Rémy
2015-10-20
Assessing the evolutionary responses of long-term exposed populations requires multigeneration ecotoxicity tests. However, the analysis of the data from these tests is not straightforward. Mechanistic models allow the in-depth analysis of the variation of physiological traits over many generations, by quantifying the trend of the physiological and toxicological parameters of the model. In the present study, a bioenergetic mechanistic model has been used to assess the evolution of two populations of the nematode Caenorhabditis elegans in control conditions or exposed to uranium. This evolutionary pressure resulted in a brood size reduction of 60%. We showed an adaptation of individuals of both populations to experimental conditions (increase of maximal length, decrease of growth rate, decrease of brood size, and decrease of the elimination rate). In addition, differential evolution was also highlighted between the two populations once the maternal effects had been diminished after several generations. Thus, individuals that were greater in maximal length, but with apparently a greater sensitivity to uranium were selected in the uranium population. In this study, we showed that this bioenergetics mechanistic modeling approach provided a precise, certain, and powerful analysis of the life strategy of C. elegans populations exposed to heavy metals resulting in an evolutionary pressure across successive generations.
Identifying Metabolically Active Chemicals Using a Consensus ...
Traditional toxicity testing provides insight into the mechanisms underlying toxicological responses but requires a large investment of time and resources. The new paradigm of testing approaches involves rapid screening studies able to evaluate thousands of chemicals across hundreds of biological targets through the use of in vitro assays. Endocrine-disrupting chemicals (EDCs) are of concern because of their ability to alter neurodevelopment, behavior, and reproductive success in humans and other species. A recent integrated computational model examined results across 18 ER-related assays in the ToxCast in vitro screening program to eliminate chemicals that produce a false signal by interfering with the technological attributes of an individual assay. However, in vitro assays can also lead to false negatives when the complex metabolic processes that render a chemical bioactive in a living system cannot be replicated in an in vitro environment. In the current study, the influence of metabolism was examined for over 1,400 chemicals considered inactive by the integrated computational model. Over 2,000 first-generation and over 4,000 second-generation metabolites were generated for the inactive chemicals using in silico techniques. Next, a consensus model comprising individual structure-activity relationship (SAR) models was used to predict ER-binding activity for each of the metabolites. Binding activity was predicted for 8-10% of the meta
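A hedged sketch of the consensus-call step only: each individual SAR model votes on ER-binding activity for a parent chemical and its in silico metabolites, and the consensus is taken as the average prediction. The structure names, probabilities, and threshold below are illustrative, not the models or chemicals used in the study.

```python
# Toy consensus over individual SAR model predictions (values are made up).
from statistics import mean

def consensus_active(probabilities, threshold=0.5):
    """Call a structure active if the mean predicted probability exceeds the threshold."""
    return mean(probabilities) > threshold

# Hypothetical predictions for one parent and its first-generation metabolites.
predictions = {
    "parent":       [0.10, 0.20, 0.15],   # inactive in the parent form
    "metabolite_1": [0.70, 0.55, 0.65],   # bioactivated form flagged by consensus
    "metabolite_2": [0.30, 0.25, 0.40],
}
for structure, probs in predictions.items():
    print(structure, "active" if consensus_active(probs) else "inactive")
```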
Software Tools for Developing and Simulating the NASA LaRC CMF Motion Base
NASA Technical Reports Server (NTRS)
Bryant, Richard B., Jr.; Carrelli, David J.
2006-01-01
The NASA Langley Research Center (LaRC) Cockpit Motion Facility (CMF) motion base has provided many design and analysis challenges. In the process of addressing these challenges, a comprehensive suite of software tools was developed. The software tools development began with a detailed MATLAB/Simulink model of the motion base which was used primarily for safety loads prediction, design of the closed loop compensator and development of the motion base safety systems. A Simulink model of the digital control law, from which a portion of the embedded code is directly generated, was later added to this model to form a closed loop system model. Concurrently, software that runs on a PC was created to display and record motion base parameters. It includes a user interface for controlling time history displays, strip chart displays, data storage, and initializing of function generators used during motion base testing. Finally, a software tool was developed for kinematic analysis and prediction of mechanical clearances for the motion system. These tools work together in an integrated package to support normal operations of the motion base and to simulate the end-to-end operation of the motion base system, providing facilities for software-in-the-loop testing, mechanical geometry and sensor data visualization, and function generator setup and evaluation.
Cyberwar XXI: quantifying the unquantifiable: adaptive AI for next-generation conflict simulations
NASA Astrophysics Data System (ADS)
Miranda, Joseph; von Kleinsmid, Peter; Zalewski, Tony
2004-08-01
The era of the "Revolution in Military Affairs," "4th Generation Warfare," and "Asymmetric War" requires novel approaches to modeling warfare at the operational and strategic levels of modern conflict. For example, "What if, in response to our planned actions, the adversary reacts in such-and-such a manner? What will our response be? What are the possible unintended consequences?" Next-generation conflict simulation tools are required to help create and test novel courses of action (COAs) in support of real-world operations. Conflict simulations allow non-lethal and cost-effective exploration of the "what-if" of COA development. The challenge has been to develop an automated decision-support software tool which allows competing COAs to be compared in simulated dynamic environments. Principal Investigator Joseph Miranda's research is based on modeling an integrated political, military, economic, social, infrastructure, and information (PMESII) environment. The main effort was to develop an adaptive AI engine which models agents operating within an operational-strategic conflict environment. This was implemented in Cyberwar XXI, a simulation which models COA selection in a PMESII environment. Within this framework, agents simulate decision-making processes and provide predictive capability regarding the potential behavior of Command Entities. The 2003 Iraq conflict is the first scenario ready for V&V testing.
Conditional Monte Carlo randomization tests for regression models.
Parhat, Parwen; Rosenberger, William F; Diao, Guoqing
2014-08-15
We discuss the computation of randomization tests for clinical trials of two treatments when the primary outcome is based on a regression model. We begin by revisiting the seminal paper of Gail, Tan, and Piantadosi (1988), and then describe a method based on Monte Carlo generation of randomization sequences. The tests based on this Monte Carlo procedure are design based, in that they incorporate the particular randomization procedure used. We discuss permuted block designs, complete randomization, and biased coin designs. We also use a new technique by Plamadeala and Rosenberger (2012) for simple computation of conditional randomization tests. Like Gail, Tan, and Piantadosi, we focus on residuals from generalized linear models and martingale residuals from survival models. Such techniques do not apply to longitudinal data analysis, and we introduce a method for computation of randomization tests based on the predicted rate of change from a generalized linear mixed model when outcomes are longitudinal. We show, by simulation, that these randomization tests preserve the size and power well under model misspecification. Copyright © 2014 John Wiley & Sons, Ltd.
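A minimal sketch of a design-based Monte Carlo randomization test for a two-arm trial with a continuous outcome. The statistic here is the treatment coefficient from an OLS fit adjusting for a covariate rather than the residual-based statistics discussed in the paper, and complete randomization stands in for the permuted block and biased coin designs; all data are simulated placeholders.

```python
# Hold the observed outcomes fixed and regenerate randomization sequences to
# build the reference distribution of the test statistic.
import numpy as np

rng = np.random.default_rng(0)

def randomize(n):
    """Complete randomization: each subject assigned to treatment with probability 1/2."""
    return rng.integers(0, 2, size=n)

def treatment_effect(y, t, x):
    """OLS coefficient of treatment after adjusting for covariate x."""
    X = np.column_stack([np.ones_like(y), t, x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

def randomization_pvalue(y, t_obs, x, n_monte_carlo=2000):
    obs = treatment_effect(y, t_obs, x)
    ref = np.array([treatment_effect(y, randomize(len(y)), x)
                    for _ in range(n_monte_carlo)])
    return np.mean(np.abs(ref) >= np.abs(obs))

# Simulated trial with a modest true treatment effect of 0.5.
n = 100
x = rng.normal(size=n)
t = randomize(n)
y = 0.5 * t + 0.8 * x + rng.normal(size=n)
print("randomization p-value:", randomization_pvalue(y, t, x))
```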
NASA Researcher Examines an Aircraft Model with a Four-Fan Thrust Reverser
1972-03-21
National Aeronautics and Space Administration (NASA) researcher John Carpenter inspects an aircraft model with a four-fan thrust reverser which would be studied in the 9- by 15-Foot Low Speed Wind Tunnel at the Lewis Research Center. Thrust reversers were introduced in the 1950s as a means for slowing high-speed jet aircraft during landing. Engineers sought to apply the technology to Vertical and Short Takeoff and Landing (VSTOL) aircraft in the 1970s. The new designs would have to take into account shorter landing areas, noise levels, and decreased thrust levels. A balance was needed between the thrust reverser’s efficiency, its noise generation, and the engine’s power setting. This model underwent a series of four tests in the 9- by 15-foot tunnel during April and May 1974. The model, with a high-wing configuration and no tail, was equipped with four thrust-reverser engines. The investigations included static internal aerodynamic tests on a single fan/reverser, wind tunnel isolated fan/reverser thrust tests, installation effects on a four-fan airplane model in a wind tunnel, and single reverser acoustic tests. The 9- by 15-Foot tunnel was built inside the return leg of the 8- by 6-Foot Supersonic Wind Tunnel in 1968. The facility generates airspeeds from 0 to 175 miles per hour to evaluate the aerodynamic performance and acoustic characteristics of nozzles, inlets, and propellers, and investigate hot gas re-ingestion of advanced VSTOL concepts. John Carpenter was a technician in the Wind Tunnels Service Section of the Test Installations Division.
Feasibility of Using Neural Network Models to Accelerate the Testing of Mechanical Systems
NASA Technical Reports Server (NTRS)
Fusaro, Robert L.
1998-01-01
Verification testing is an important aspect of the design process for mechanical mechanisms, and full-scale, full-length life testing is typically used to qualify any new component for use in space. However, as the required life specification is increased, full-length life tests become more costly and lengthen the development time. At the NASA Lewis Research Center, we theorized that neural network systems may be able to model the operation of a mechanical device. If so, the resulting neural network models could simulate long-term mechanical testing with data from a short-term test. This combination of computer modeling and short-term mechanical testing could then be used to verify the reliability of mechanical systems, thereby eliminating the costs associated with long-term testing. Neural network models could also enable designers to predict the performance of mechanisms at the conceptual design stage by entering the critical parameters as input and running the model to predict performance. The purpose of this study was to assess the potential of using neural networks to predict the performance and life of mechanical systems. To do this, we generated a neural network system to model wear obtained from three accelerated testing devices: 1) A pin-on-disk tribometer; 2) A line-contact rub-shoe tribometer; 3) A four-ball tribometer.
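A hedged sketch of the idea only: fit a small feed-forward network to short-term tribometer data and query it beyond the tested range, assuming scikit-learn is available. The inputs (sliding distance, load, speed), architecture, and synthetic data are placeholders, not the study's models or measurements, and extrapolating beyond the training range is precisely what such a model would still need to be validated for.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Synthetic short-term test: wear volume vs. sliding distance, load, and speed.
distance = rng.uniform(0, 500, 200)            # km (short-term range)
load = rng.uniform(5, 50, 200)                 # N
speed = rng.uniform(0.1, 2.0, 200)             # m/s
wear = 1e-3 * load * distance * (1 + 0.1 * speed) + rng.normal(0, 0.5, 200)

X = np.column_stack([distance, load, speed])
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                     random_state=0).fit(X, wear)

# "Accelerated" prediction: query the model beyond the tested distance range.
print(model.predict([[2000.0, 25.0, 1.0]]))    # predicted wear at 2000 km
```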
An integer programming model for distal humerus fracture fixation planning.
Maratt, Joseph D; Peaks, Ya-Sin A; Doro, Lisa Case; Karunakar, Madhav A; Hughes, Richard E
2008-05-01
To demonstrate the feasibility of an integer programming model to assist in pre-operative planning for open reduction and internal fixation of a distal humerus fracture. We describe an integer programming model based on the objective of maximizing the reward for screws placed while satisfying the requirements for sound internal fixation. The model maximizes the number of bicortical screws placed while avoiding screw collision and favoring screws of greater length that cross multiple fracture planes. The model was tested on three types of total articular fractures of the distal humerus. Solutions were generated using 5, 9, 21 and 33 possible screw orientations per hole. Solutions generated using 33 possible screw orientations per hole and five screw lengths resulted in the most clinically relevant fixation plan and required the calculation of 1,191,975 pairs of screws that resulted in collision. At this level of complexity, the pre-processor took 104 seconds to generate the constraints for the solver, and a solution was generated in under one minute in all three cases. Despite the large size of this problem, it can be solved in a reasonable amount of time, making use of the model practical in pre-surgical planning.
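A minimal sketch of the fixation-planning integer program, assuming the PuLP library and a made-up candidate set: one binary variable per candidate screw, a reward that grows with screw length and fracture planes crossed, at most one screw per plate hole, and pairwise exclusion of colliding screws. The candidate geometry, reward weights, and collision pairs are illustrative, not the paper's data.

```python
import pulp

# Hypothetical candidates: name -> (hole, length_mm, fracture_planes_crossed)
candidates = {
    "s1": ("hole1", 24, 1), "s2": ("hole1", 36, 2),
    "s3": ("hole2", 28, 1), "s4": ("hole2", 40, 2),
}
collisions = [("s2", "s4")]                    # pairs that would intersect

x = {s: pulp.LpVariable(f"x_{s}", cat="Binary") for s in candidates}
prob = pulp.LpProblem("distal_humerus_fixation", pulp.LpMaximize)

# Reward: weighted sum of screw length and fracture planes crossed.
prob += pulp.lpSum(x[s] * (0.1 * c[1] + 2.0 * c[2]) for s, c in candidates.items())

# At most one screw per plate hole.
for hole in {c[0] for c in candidates.values()}:
    prob += pulp.lpSum(x[s] for s, c in candidates.items() if c[0] == hole) <= 1

# No colliding pairs.
for a, b in collisions:
    prob += x[a] + x[b] <= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("selected screws:", [s for s in candidates if x[s].value() == 1])
```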
Bromochloromethane (BCM) is a volatile compound and a by-product of the disinfection of water by chlorination. Physiologically based pharmacokinetic (PBPK) models are used in risk assessment applications. An updated PBPK model for BCM is generated and applied to hypotheses testing c...
Continuous data assimilation for downscaling large-footprint soil moisture retrievals
NASA Astrophysics Data System (ADS)
Altaf, Muhammad U.; Jana, Raghavendra B.; Hoteit, Ibrahim; McCabe, Matthew F.
2016-10-01
Soil moisture is a key component of the hydrologic cycle, influencing processes leading to runoff generation, infiltration and groundwater recharge, evaporation and transpiration. Generally, the measurement scale for soil moisture is different from the modeling scales for these processes. Reducing this mismatch between observation and model scales is necessary for improved hydrological modeling. An innovative approach to downscaling coarse resolution soil moisture data by combining continuous data assimilation and physically based modeling is presented. In this approach, we exploit the features of Continuous Data Assimilation (CDA), which was initially designed for general dissipative dynamical systems and later tested numerically on the incompressible Navier-Stokes equations and the Bénard equation. A nudging term, estimated as the misfit between interpolants of the assimilated coarse grid measurements and the fine grid model solution, is added to the model equations to constrain the model's large scale variability by available measurements. Soil moisture fields generated at a fine resolution by a physically based vadose zone model (HYDRUS) are subjected to data assimilation conditioned upon coarse resolution observations. This enables nudging of the model outputs towards values that honor the coarse resolution dynamics while still being generated at the fine scale. Results show that the approach is feasible for generating fine-scale soil moisture fields across large extents based on coarse-scale observations. A likely application of this approach is the generation of fine- and intermediate-resolution soil moisture fields conditioned on the radiometer-based, coarse-resolution products from remote sensing satellites.
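A hedged sketch of the CDA nudging idea on a toy one-dimensional problem: the fine-grid state is relaxed toward an interpolant of coarse observations with strength mu. Simple diffusion stands in for the HYDRUS vadose-zone physics, and the grids, coefficients, and "truth" field are placeholders chosen purely for illustration.

```python
import numpy as np

n_fine, n_coarse = 200, 10
x_fine = np.linspace(0.0, 1.0, n_fine)
x_coarse = np.linspace(0.0, 1.0, n_coarse)

def lift(values_coarse):
    """Interpolant I(.) mapping coarse-grid values onto the fine grid."""
    return np.interp(x_fine, x_coarse, values_coarse)

def restrict(values_fine):
    """Sample a fine-grid field at the coarse observation locations."""
    return np.interp(x_coarse, x_fine, values_fine)

def step(theta, obs_coarse, dt=1e-3, D=0.01, mu=50.0):
    """One explicit step of d(theta)/dt = D*theta_xx - mu*(I(theta) - I(obs))."""
    lap = np.gradient(np.gradient(theta, x_fine), x_fine)
    nudge = mu * (lift(restrict(theta)) - lift(obs_coarse))
    return theta + dt * (D * lap - nudge)

truth = 0.2 + 0.1 * np.sin(2.0 * np.pi * x_fine)   # "true" fine-scale moisture
obs = restrict(truth)                              # coarse-footprint retrievals
theta = np.full(n_fine, 0.25)                      # poor first guess
for _ in range(2000):
    theta = step(theta, obs)
print("max error vs. truth:", float(np.abs(theta - truth).max()))
```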
Menze, Bjoern H; Van Leemput, Koen; Lashkari, Danial; Riklin-Raviv, Tammy; Geremia, Ezequiel; Alberts, Esther; Gruber, Philipp; Wegener, Susanne; Weber, Marc-Andre; Szekely, Gabor; Ayache, Nicholas; Golland, Polina
2016-04-01
We introduce a generative probabilistic model for segmentation of brain lesions in multi-dimensional images that generalizes the EM segmenter, a common approach for modelling brain images using Gaussian mixtures and a probabilistic tissue atlas that employs expectation-maximization (EM), to estimate the label map for a new image. Our model augments the probabilistic atlas of the healthy tissues with a latent atlas of the lesion. We derive an estimation algorithm with closed-form EM update equations. The method extracts a latent atlas prior distribution and the lesion posterior distributions jointly from the image data. It delineates lesion areas individually in each channel, allowing for differences in lesion appearance across modalities, an important feature of many brain tumor imaging sequences. We also propose discriminative model extensions to map the output of the generative model to arbitrary labels with semantic and biological meaning, such as "tumor core" or "fluid-filled structure", but without a one-to-one correspondence to the hypo- or hyper-intense lesion areas identified by the generative model. We test the approach in two image sets: the publicly available BRATS set of glioma patient scans, and multimodal brain images of patients with acute and subacute ischemic stroke. We find the generative model that has been designed for tumor lesions to generalize well to stroke images, and the extended discriminative model to be one of the top ranking methods in the BRATS evaluation.
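A toy sketch of the EM machinery underlying such a generative segmenter: a two-component Gaussian mixture (healthy tissue vs. lesion) fit to one-dimensional voxel intensities with closed-form EM updates. The actual model couples a probabilistic tissue atlas, a latent lesion atlas, and multiple channels; none of that structure, nor real image data, is reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic intensities: 90% healthy tissue, 10% hyper-intense lesion.
voxels = np.concatenate([rng.normal(100, 10, 9000), rng.normal(160, 15, 1000)])

pi, mu, sigma = 0.5, np.array([90.0, 150.0]), np.array([20.0, 20.0])
for _ in range(50):
    # E-step: posterior probability that each voxel belongs to the lesion class
    # (the common 1/sqrt(2*pi) factor cancels in the ratio).
    p_healthy = (1 - pi) * np.exp(-0.5 * ((voxels - mu[0]) / sigma[0]) ** 2) / sigma[0]
    p_lesion = pi * np.exp(-0.5 * ((voxels - mu[1]) / sigma[1]) ** 2) / sigma[1]
    gamma = p_lesion / (p_healthy + p_lesion)
    # M-step: closed-form updates of the mixture weight, means, and variances.
    pi = gamma.mean()
    mu = np.array([np.average(voxels, weights=1 - gamma),
                   np.average(voxels, weights=gamma)])
    sigma = np.sqrt(np.array([np.average((voxels - mu[0]) ** 2, weights=1 - gamma),
                              np.average((voxels - mu[1]) ** 2, weights=gamma)]))
print("lesion fraction:", round(pi, 3), "class means:", mu.round(1))
```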
A primary goal of computational toxicology is to generate predictive models of toxicity. An elusive target of alternative test methods and models has been the accurate prediction of systemic toxicity points of departure (PoD). We aim not only to provide a large and valuable resou...
Advanced Stirling Convertor Testing at NASA Glenn Research Center
NASA Technical Reports Server (NTRS)
Wilson, Scott D.; Poriti, Sal
2010-01-01
The NASA Glenn Research Center (GRC) has been testing high-efficiency free-piston Stirling convertors for potential use in radioisotope power systems (RPSs) since 1999. The current effort is in support of the Advanced Stirling Radioisotope Generator (ASRG), which is being developed by the U.S. Department of Energy (DOE), Lockheed Martin Space Systems Company (LMSSC), Sunpower, Inc., and the NASA GRC. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs) to convert thermal energy from a radioisotope heat source into electricity. As reliability is paramount to a RPS capable of providing spacecraft power for potential multi-year missions, GRC provides direct technology support to the ASRG flight project in the areas of reliability, convertor and generator testing, high-temperature materials, structures, modeling and analysis, organics, structural dynamics, electromagnetic interference (EMI), and permanent magnets to reduce risk and enhance reliability of the convertor as this technology transitions toward flight status. Convertor and generator testing is carried out in short- and long-duration tests designed to characterize convertor performance when subjected to environments intended to simulate launch and space conditions. Long duration testing is intended to baseline performance and observe any performance degradation over the life of the test. Testing involves developing support hardware that enables 24/7 unattended operation and data collection. GRC currently has 14 Stirling convertors under unattended extended operation testing, including two operating in the ASRG Engineering Unit (ASRG-EU). Test data and high-temperature support hardware are discussed for ongoing and future ASC tests with emphasis on the ASC-E and ASC-E2.
Models of germ cell development and their application for toxicity studies
Ferreira, Daniel W.; Allard, Patrick
2015-01-01
Germ cells are unique in their ability to transfer genetic information and traits from generation to generation. As such, the proper development of germ cells and the integrity of their genome are paramount to the health of organisms and the survival of species. Germ cells are also exquisitely sensitive to environmental influences although the testing of germ cell toxicity, especially in females, has proven particularly challenging. In this review, we first describe the remarkable odyssey of germ cells in mammals, with an emphasis on the female germline, from their initial specification during embryogenesis to the generation of mature gametes in adults. We also describe the current methods used in germ cell toxicity testing and their limitations in examining the complex features of mammalian germ cell development. To bypass these challenges, we propose the use of alternative model systems such as Saccharomyces cerevisiae, Drosophila melanogaster, Caenorhabditis elegans and in vitro germ cell methods that have distinct advantages over traditional toxicity models. We discuss the benefits and limitations of each approach, their application to germ cell toxicity studies, and the need for computational approaches to maximize the usefulness of these models. Together, the inclusion of these alternative germ cell toxicity models will be invaluable for the examination of stages not easily accessible in mammals as well as the large scale, high-throughput investigation of germ cell toxicity. PMID:25821157
NASA Technical Reports Server (NTRS)
Annett, Martin S.; Polanco, Michael A.
2010-01-01
A full-scale crash test of an MD-500 helicopter was conducted in December 2009 at NASA Langley's Landing and Impact Research facility (LandIR). The MD-500 helicopter was fitted with a composite honeycomb Deployable Energy Absorber (DEA) and tested under vertical and horizontal impact velocities of 26-ft/sec and 40-ft/sec, respectively. The objectives of the test were to evaluate the performance of the DEA concept under realistic crash conditions and to generate test data for validation of a system integrated finite element model. In preparation for the full-scale crash test, a series of sub-scale and MD-500 mass simulator tests was conducted to evaluate the impact performances of various components, including a new crush tube and the DEA blocks. Parameters defined within the system integrated finite element model were determined from these tests. The objective of this paper is to summarize the finite element models developed and analyses performed, beginning with pre-test predictions and continuing through post-test validation.
LS-DYNA Analysis of a Full-Scale Helicopter Crash Test
NASA Technical Reports Server (NTRS)
Annett, Martin S.
2010-01-01
A full-scale crash test of an MD-500 helicopter was conducted in December 2009 at NASA Langley's Landing and Impact Research facility (LandIR). The MD-500 helicopter was fitted with a composite honeycomb Deployable Energy Absorber (DEA) and tested under vertical and horizontal impact velocities of 26 ft/sec and 40 ft/sec, respectively. The objectives of the test were to evaluate the performance of the DEA concept under realistic crash conditions and to generate test data for validation of a system integrated LS-DYNA finite element model. In preparation for the full-scale crash test, a series of sub-scale and MD-500 mass simulator tests was conducted to evaluate the impact performances of various components, including a new crush tube and the DEA blocks. Parameters defined within the system integrated finite element model were determined from these tests. The objective of this paper is to summarize the finite element models developed and analyses performed, beginning with pre-test and continuing through post test validation.
Detecting drawdowns masked by environmental stresses with water-level models
Garcia, C.A.; Halford, K.J.; Fenelon, J.M.
2013-01-01
Detecting and quantifying small drawdown at observation wells distant from the pumping well greatly expands the characterized aquifer volume. However, this detection is often obscured by water level fluctuations such as barometric and tidal effects. A reliable analytical approach for distinguishing drawdown from nonpumping water-level fluctuations is presented and tested here. Drawdown is distinguished by analytically simulating all pumping and nonpumping water-level stresses simultaneously during the period of record. Pumping signals are generated with Theis models, where the pumping schedule is translated into water-level change with the Theis solution. This approach closely matched drawdowns simulated with a complex three-dimensional, hypothetical model and reasonably estimated drawdowns from an aquifer test conducted in a complex hydrogeologic system. Pumping-induced changes generated with a numerical model and analytical Theis model agreed (RMS as low as 0.007 m) in cases where pumping signals traveled more than 1 km across confining units and fault structures. Maximum drawdowns of about 0.05 m were analytically estimated from field investigations where environmental fluctuations approached 0.2 m during the analysis period.
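A hedged sketch of the Theis-model building block used to generate pumping signals: drawdown s(r, t) = Q/(4*pi*T) * W(u) with u = r^2*S/(4*T*t), where the well function W(u) equals the exponential integral exp1(u), and a stepped pumping schedule is handled by superposition in time. The transmissivity, storativity, rates, and distances below are placeholders.

```python
import numpy as np
from scipy.special import exp1   # well function W(u) = exp1(u)

def theis_drawdown(t, r, Q, T, S):
    """Drawdown (m) at radius r (m) and time t (d) for constant pumping Q (m^3/d)."""
    t = np.asarray(t, dtype=float)
    s = np.zeros_like(t)
    on = t > 0
    u = r**2 * S / (4.0 * T * t[on])
    s[on] = Q / (4.0 * np.pi * T) * exp1(u)
    return s

def scheduled_drawdown(t, r, schedule, T, S):
    """Superpose rate changes (t_start, delta_Q) from a pumping schedule."""
    return sum(theis_drawdown(t - t0, r, dQ, T, S) for t0, dQ in schedule)

t = np.linspace(0.0, 30.0, 301)                    # days
schedule = [(1.0, 500.0), (10.0, -500.0)]          # pump on at day 1, off at day 10
s = scheduled_drawdown(t, r=1000.0, schedule=schedule, T=300.0, S=1e-4)
print("peak modeled drawdown at 1 km: %.3f m" % s.max())
```

In the approach described above, a signal of this form would be fit simultaneously with barometric and tidal regressors so that centimeter-scale drawdown can be separated from decimeter-scale environmental fluctuations.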
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, Michael K.; Davidson, Megan
As part of Sandia’s nuclear deterrence mission, the B61-12 Life Extension Program (LEP) aims to modernize the aging weapon system. Modernization requires requalification and Sandia is using high performance computing to perform advanced computational simulations to better understand, evaluate, and verify weapon system performance in conjunction with limited physical testing. The Nose Bomb Subassembly (NBSA) of the B61-12 is responsible for producing a fuzing signal upon ground impact. The fuzing signal is dependent upon electromechanical impact sensors producing valid electrical fuzing signals at impact. Computer generated models were used to assess the timing between the impact sensor’s response to the deceleration of impact and damage to major components and system subassemblies. The modeling and simulation team worked alongside the physical test team to design a large-scale reverse ballistic test to not only assess system performance, but to also validate their computational models. The reverse ballistic test conducted at Sandia’s sled test facility sent a rocket sled with a representative target into a stationary B61-12 (NBSA) to characterize the nose crush and functional response of NBSA components. Data obtained from data recorders and high-speed photometrics were integrated with previously generated computer models in order to refine and validate the model’s ability to reliably simulate real-world effects. Large-scale tests are impractical to conduct for every single impact scenario. By creating reliable computer models, we can perform simulations that identify trends and produce estimates of outcomes over the entire range of required impact conditions. Sandia’s HPCs enable geometric resolution that was unachievable before, allowing for more fidelity and detail, and creating simulations that can provide insight to support evaluation of requirements and performance margins. As computing resources continue to improve, researchers at Sandia are hoping to improve these simulations so they provide increasingly credible analysis of the system response and performance over the full range of conditions.
Cao, Hongrui; Niu, Linkai; He, Zhengjia
2012-01-01
Bearing defects are one of the most important mechanical sources for vibration and noise generation in machine tool spindles. In this study, an integrated finite element (FE) model is proposed to predict the vibration responses of a spindle bearing system with localized bearing defects, and the sensor placement for better detection of bearing faults is then optimized. A nonlinear bearing model is developed based on Jones' bearing theory, while the drawbar, shaft and housing are modeled as Timoshenko beams. The bearing model is then integrated into the FE model of the drawbar/shaft/housing by assembling the equations of motion. The Newmark time integration method is used to solve the vibration responses numerically. The FE model of the spindle-bearing system was verified by conducting dynamic tests. Then, the localized bearing defects were modeled and vibration responses generated by an outer ring defect were simulated as an illustration. The optimization scheme of the sensor placement was carried out on the test spindle. The results showed that the optimal sensor placement depends on the vibration modes under different boundary conditions and the transfer path between the excitation and the response. PMID:23012514
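A minimal sketch of Newmark time integration for M*x'' + C*x' + K*x = f(t), here on a single-degree-of-freedom stand-in for the assembled spindle-bearing model (average acceleration scheme, beta = 1/4, gamma = 1/2). The mass, damping, stiffness, and the impulse train mimicking a defect-passing excitation are placeholders, not the paper's model parameters.

```python
import numpy as np

def newmark(M, C, K, f, dt, n_steps, beta=0.25, gamma=0.5):
    """Implicit Newmark integration of a single-DOF system; returns displacements."""
    x, v = 0.0, 0.0
    a = (f(0.0) - C * v - K * x) / M
    K_eff = K + gamma / (beta * dt) * C + M / (beta * dt**2)
    history = []
    for i in range(1, n_steps + 1):
        t = i * dt
        rhs = (f(t)
               + M * (x / (beta * dt**2) + v / (beta * dt) + (0.5 / beta - 1.0) * a)
               + C * (gamma / (beta * dt) * x + (gamma / beta - 1.0) * v
                      + dt * (gamma / (2.0 * beta) - 1.0) * a))
        x_new = rhs / K_eff
        a_new = (x_new - x) / (beta * dt**2) - v / (beta * dt) - (0.5 / beta - 1.0) * a
        v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        x, v, a = x_new, v_new, a_new
        history.append(x)
    return np.array(history)

# 1 kHz impulse train loosely mimicking an outer-ring defect passing frequency.
force = lambda t: 100.0 if (t % 1e-3) < 2e-5 else 0.0
resp = newmark(M=2.0, C=40.0, K=4.0e7, f=force, dt=1e-5, n_steps=5000)
print("peak displacement:", resp.max())
```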
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler
Battery Life estimation is one of the key inputs required for Hybrid applications for all GM Hybrid/EV/EREV/PHEV programs. For each Hybrid vehicle program, GM has instituted multi-parameter Design of Experiments generating test data at Cell level and also Pack level on a reduced basis. Based on experience, generating test data on a pack level is found to be very expensive, resource intensive and sometimes less reliable. The proposed collaborative project will focus on a methodology to estimate Battery life based on cell degradation data combined with pack thermal modeling. NREL has previously developed cell-level battery aging models and pack-level thermal/electrical network models, though these models are currently not integrated. When coupled together, the models are expected to describe pack-level thermal and aging response of individual cells. GM and NREL will use data collected for GM's Bas+ battery system for evaluation of the proposed methodology and assess to what degree these models can replace pack-level aging experiments in the future.
NASA Astrophysics Data System (ADS)
Son, Yurak; Kamano, Takuya; Yasuno, Takashi; Suzuki, Takayuki; Harada, Hironobu
This paper describes the generation of adaptive gait patterns using new Central Pattern Generators (CPGs), including motor dynamic models, for a quadruped robot under various environments. The CPGs act as flexible oscillators for the joints and generate the desired joint angles. The CPGs are mutually connected to each other, and the sets of their coupling parameters are adjusted by a genetic algorithm so that the quadruped robot can realize stable and adequate gait patterns. As a result, suitable CPG networks are obtained not only for a straight walking gait pattern but also for rotation gait patterns. Experimental results demonstrate that the proposed CPG networks are effective in automatically adjusting adaptive gait patterns for the tested quadruped robot under various environments. Furthermore, target tracking control based on image processing is achieved by combining the generated gait patterns.
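A hedged sketch of a CPG network as mutually coupled phase oscillators, one per leg, whose coupling strength and phase offsets are the kind of parameters a genetic algorithm would tune; the motor dynamic models and the GA itself are omitted, and the frequency, offsets, and gains below are placeholders rather than the paper's values.

```python
import numpy as np

n_joints = 4                                   # one hip oscillator per leg
omega = 2.0 * np.pi * 1.5                      # intrinsic frequency, 1.5 Hz
# Desired phase offsets for a trot-like pattern (diagonal legs in phase).
phase_offsets = np.array([0.0, np.pi, np.pi, 0.0])
coupling = 4.0                                 # coupling strength (GA-tuned in practice)

def step(phases, dt=0.005):
    """One Euler step of the coupled phase-oscillator CPG."""
    dphi = np.full(n_joints, omega)
    for i in range(n_joints):
        for j in range(n_joints):
            desired = phase_offsets[i] - phase_offsets[j]
            dphi[i] += coupling * np.sin(phases[j] - phases[i] + desired)
    return phases + dt * dphi

phases = np.random.default_rng(3).uniform(0, 2 * np.pi, n_joints)
for _ in range(2000):
    phases = step(phases)
angles = 20.0 * np.sin(phases)                 # desired joint angles (degrees)
print("relative phases:", np.round((phases - phases[0]) % (2 * np.pi), 2))
```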
Improved Ant Algorithms for Software Testing Cases Generation
Yang, Shunkun; Xu, Jiaqi
2014-01-01
Existing ant colony optimization (ACO) for software test case generation is a very popular domain in software testing engineering. However, traditional ACO has flaws: pheromone is relatively scarce early in the search, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces stagnation and premature convergence. This paper introduces improved ACO methods for software test case generation: an improved local pheromone update strategy for ant colony optimization, an improved pheromone volatilization coefficient for ant colony optimization (IPVACO), and an improved global path pheromone update strategy for ant colony optimization (IGPACO). Finally, we put forward a comprehensively improved ant colony optimization (ACIACO), which is based on all three methods. The proposed technique is compared with a random algorithm (RND) and a genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved methods can effectively improve search efficiency, suppress premature convergence, promote case coverage, and reduce the number of iterations. PMID:24883391
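A toy sketch of ACO-style test data generation for branch coverage: each ant samples an input value proportionally to pheromone, pheromone is deposited on inputs whose execution covers previously uncovered branches, and evaporation keeps the search from stagnating. This illustrates only a plain ACO baseline, not the improved update strategies (IPVACO, IGPACO, ACIACO) of the paper; the program under test and all constants are made up.

```python
import random

def branches(x):
    """Program under test: return the set of branch ids executed for input x."""
    covered = {"b0" if x > 50 else "b1"}
    covered.add("b2" if x % 7 == 0 else "b3")
    covered.add("b4" if 20 <= x <= 30 else "b5")
    return covered

domain = list(range(101))
pheromone = {x: 1.0 for x in domain}
rho, q = 0.1, 5.0                        # evaporation rate, deposit amount
covered, suite = set(), []

random.seed(0)
for _ in range(30):                      # iterations (generations of ants)
    weights = [pheromone[x] for x in domain]
    ants = random.choices(domain, weights=weights, k=10)
    for x in ants:
        new = branches(x) - covered
        if new:                          # reward inputs that add coverage
            suite.append(x)
            covered |= new
            pheromone[x] += q * len(new)
    for x in domain:                     # evaporation
        pheromone[x] *= (1.0 - rho)

print("test suite:", suite, "| branches covered:", len(covered), "of 6")
```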
A wave model test bed study for wave energy resource characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Zhaoqing; Neary, Vincent S.; Wang, Taiping
This paper presents a test bed study conducted to evaluate best practices in wave modeling to characterize energy resources. The model test bed off the central Oregon Coast was selected because of the high wave energy and available measured data at the site. Two third-generation spectral wave models, SWAN and WWIII, were evaluated. A four-level nested-grid approach, from global to test bed scale, was employed. Model skills were assessed using a set of model performance metrics based on comparing six simulated wave resource parameters to observations from a wave buoy inside the test bed. Both WWIII and SWAN performed well at the test bed site and exhibited similar modeling skill. The ST4 package with WWIII, which represents better physics for wave growth and dissipation, outperformed ST2 physics and improved wave power density and significant wave height predictions. However, ST4 physics tended to overpredict the wave energy period. The newly developed ST6 physics did not improve the overall model skill for predicting the six wave resource parameters. Sensitivity analysis using different wave frequency and direction resolutions indicated that the model results were not sensitive to spectral resolution at the test bed site, likely due to the absence of complex bathymetric and geometric features.
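A hedged sketch of the kind of skill metrics used to compare simulated and buoy-observed wave parameters, together with the standard deep-water omnidirectional wave power density estimate J = rho*g^2/(64*pi) * Hs^2 * Te. The exact metric set and parameter definitions used in the study may differ, and the model/buoy series below are placeholders.

```python
import numpy as np

def skill(model, obs):
    """Basic skill metrics: bias, RMSE, scatter index, and linear correlation."""
    model, obs = np.asarray(model), np.asarray(obs)
    bias = np.mean(model - obs)
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    return {"bias": bias,
            "rmse": rmse,
            "scatter_index": rmse / np.mean(obs),
            "corr": np.corrcoef(model, obs)[0, 1]}

def wave_power_density(hs, te, rho=1025.0, g=9.81):
    """Deep-water wave power per unit crest length (W/m) from Hs (m) and Te (s)."""
    return rho * g**2 / (64.0 * np.pi) * hs**2 * te

# Placeholder model and buoy series of significant wave height (m).
hs_model = np.array([2.1, 3.0, 2.6, 4.2, 1.8])
hs_buoy = np.array([2.0, 3.2, 2.4, 4.0, 1.9])
print(skill(hs_model, hs_buoy))
print("J for Hs=3 m, Te=10 s: %.1f kW/m" % (wave_power_density(3.0, 10.0) / 1e3))
```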
Random Testing and Model Checking: Building a Common Framework for Nondeterministic Exploration
NASA Technical Reports Server (NTRS)
Groce, Alex; Joshi, Rajeev
2008-01-01
Two popular forms of dynamic analysis, random testing and explicit-state software model checking, are perhaps best viewed as search strategies for exploring the state spaces introduced by nondeterminism in program inputs. We present an approach that enables this nondeterminism to be expressed in the SPIN model checker's PROMELA language, and then lets users generate either model checkers or random testers from a single harness for a tested C program. Our approach makes it easy to compare model checking and random testing for models with precisely the same input ranges and probabilities and allows us to mix random testing with model checking's exhaustive exploration of non-determinism. The PROMELA language, as intended in its design, serves as a convenient notation for expressing nondeterminism and mixing random choices with nondeterministic choices. We present and discuss a comparison of random testing and model checking. The results derive from using our framework to test a C program with an effectively infinite state space, a module in JPL's next Mars rover mission. More generally, we show how the ability of the SPIN model checker to call C code can be used to extend SPIN's features, and hope to inspire others to use the same methods to implement dynamic analyses that can make use of efficient state storage, matching, and backtracking.
NASA Astrophysics Data System (ADS)
Xie, Longhan; Li, Jiehong; Li, Xiaodong; Huang, Ledeng; Cai, Siqi
2018-01-01
Hydraulic dampers are used to decrease the vibration of a vehicle, with vibration energy dissipated as heat. Besides wasting energy, hydraulic dampers have a damping coefficient that cannot be changed during operation. In this paper, an energy-harvesting vehicle damper is proposed to replace traditional hydraulic dampers. The goal is not only to recover kinetic energy from suspension vibration but also to change the damping coefficient during operation according to road conditions. The energy-harvesting damper consists of multiple generators that are independently controlled by switches. One of these generators connects to a tunable resistor for fine-tuning the damping coefficient, while the other generators are connected to a control and rectifying circuit, each of which both regenerates electricity and provides a constant damping coefficient. A mathematical model was built to investigate the performance of the energy-harvesting damper. By controlling the number of switched-on generators and adjusting the value of the external tunable resistor, the damping can be fine-tuned as required. In addition to the capability of damping tuning, the multiple controlled generators can output a significant amount of electricity. A prototype was built to test the energy-harvesting damper design. Experiments on an MTS testing system were conducted, with results that validated the theoretical analysis. The experiments show that changing the number of switched-on generators effectively tunes the damping coefficient of the damper while simultaneously producing considerable electricity.
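A hedged sketch of how the equivalent damping coefficient of such an electromagnetic damper scales with the number of switched-on generators and the external load resistance, assuming each generator is driven through a transmission of lead `lead` (m/rad) and loaded by a simple resistive circuit; all constants are placeholders, not the prototype's measured parameters or its actual rectifier circuit.

```python
def damping_coefficient(n_fixed_on, r_tune, kt=0.5, ke=0.5, r_int=2.0,
                        r_fixed=5.0, lead=0.01):
    """
    Equivalent linear damping c (N*s/m): n_fixed_on switched generators each
    loaded by r_fixed (ohm), plus one tunable generator loaded by r_tune (ohm).
    Each generator has torque constant kt (N*m/A), back-EMF constant ke
    (V*s/rad), and internal resistance r_int (ohm); force = kt*ke*v/(R*lead^2).
    """
    c_unit = lambda r: kt * ke / ((r_int + r) * lead**2)
    return n_fixed_on * c_unit(r_fixed) + c_unit(r_tune)

# Coarse tuning by switching generators, fine-tuning by the external resistor.
for n in (1, 2, 3):
    print(n, "fixed generators on:", round(damping_coefficient(n, r_tune=5.0)), "N*s/m")
print("fine-tuned:", round(damping_coefficient(3, r_tune=3.0)), "N*s/m")
```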