BEAT: A Web-Based Boolean Expression Fault-Based Test Case Generation Tool
ERIC Educational Resources Information Center
Chen, T. Y.; Grant, D. D.; Lau, M. F.; Ng, S. P.; Vasa, V. R.
2006-01-01
BEAT is a Web-based system that generates fault-based test cases from Boolean expressions. It is based on the integration of our several fault-based test case selection strategies. The generated test cases are considered to be fault-based, because they are aiming at the detection of particular faults. For example, when the Boolean expression is in…
A rule-based software test data generator
NASA Technical Reports Server (NTRS)
Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II
1991-01-01
Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests are performed, showing that even the primitive rule-based test data generation prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.
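To illustrate the contrast this abstract describes, here is a minimal, hypothetical Python sketch (the paper itself targeted Ada, and the unit under test below is invented): a random generator is measured by which branch outcomes it happens to reach, while a rule-based generator encodes knowledge of the program's predicates and covers every outcome with only a handful of cases.

```python
import random

def classify_triangle(a, b, c):
    # Unit under test: four distinct branch outcomes to cover.
    if a + b <= c or b + c <= a or a + c <= b:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

def random_coverage(n_cases, seed=0):
    # Record which outcomes purely random inputs manage to reach.
    rng = random.Random(seed)
    outcomes = set()
    for _ in range(n_cases):
        a, b, c = (rng.randint(1, 100) for _ in range(3))
        outcomes.add(classify_triangle(a, b, c))
    return outcomes

# A rule-based generator instead derives one input per predicate:
rule_based = [(3, 4, 5), (2, 2, 2), (2, 2, 3), (1, 1, 5)]
rule_outcomes = {classify_triangle(*t) for t in rule_based}
```

Four targeted cases reach all four outcomes; random inputs rarely hit the equality-dependent branches (e.g. "equilateral") even after hundreds of trials.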
Model Based Analysis and Test Generation for Flight Software
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep
2009-01-01
We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines Math- Works and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.
Automated unit-level testing with heuristic rules
NASA Technical Reports Server (NTRS)
Carlisle, W. Homer; Chang, Kai-Hsiung; Cross, James H.; Keleher, William; Shackelford, Keith
1990-01-01
Software testing plays a significant role in the development of complex software systems. Current testing methods generally require significant effort to generate meaningful test cases. The QUEST/Ada system is a prototype system designed using CLIPS to experiment with expert system based test case generation. The prototype is designed to test for condition coverage, and attempts to generate test cases to cover all feasible branches contained in an Ada program. This paper reports on heuristics used by the system. These heuristics vary according to the amount of knowledge obtained by preprocessing and execution of the Boolean conditions in the program.
Coverage criteria for test case generation using UML state chart diagram
NASA Astrophysics Data System (ADS)
Salman, Yasir Dawood; Hashim, Nor Laily; Rejab, Mawarny Md; Romli, Rohaida; Mohd, Haslina
2017-10-01
To improve the effectiveness of test data generation during software testing, many studies have focused on automating test data generation from UML diagrams. One of these diagrams is the UML state chart diagram. Test cases are generally evaluated according to coverage criteria. However, combinations of multiple criteria are required to achieve better coverage. Different studies used various numbers and types of coverage criteria in their methods and approaches. The objective of this paper is to propose suitable coverage criteria for test case generation using UML state chart diagrams, especially in handling loops. To achieve this objective, this work reviewed previous studies to present the most practical coverage criteria combinations, including all-states, all-transitions, all-transition-pairs, and all-loop-free-paths coverage. Calculations to determine the coverage percentage of the proposed coverage criteria are presented, together with an example of how they are applied to a UML state chart diagram. This finding would be beneficial in the area of test case generation, especially in handling loops in UML state chart diagrams.
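As a small illustration of how such a coverage percentage can be computed (the state chart and test paths below are invented for the example, not taken from the paper), all-transitions coverage is simply the fraction of the model's transitions exercised by the test paths:

```python
# Hypothetical state chart, modeled as (source, event, target) transitions.
TRANSITIONS = {
    ("Idle", "start", "Running"),
    ("Running", "pause", "Paused"),
    ("Paused", "resume", "Running"),
    ("Running", "stop", "Idle"),
}

def transition_coverage(test_paths):
    """Percentage of model transitions exercised by the test paths,
    where each path is a list of (source, event, target) steps."""
    covered = {step for path in test_paths for step in path} & TRANSITIONS
    return 100.0 * len(covered) / len(TRANSITIONS)

paths = [
    [("Idle", "start", "Running"), ("Running", "stop", "Idle")],
    [("Idle", "start", "Running"), ("Running", "pause", "Paused")],
]
# These two paths exercise 3 of the 4 transitions: 75% coverage.
```

All-states and all-transition-pairs coverage follow the same pattern, with the covered set drawn from visited states or from consecutive transition pairs instead.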
Test-Case Generation using an Explicit State Model Checker Final Report
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Gao, Jimin
2003-01-01
In the project 'Test-Case Generation using an Explicit State Model Checker' we have extended an existing tools infrastructure for formal modeling to export Java code so that we can use the NASA Ames tool Java Pathfinder (JPF) for test case generation. We have completed a translator from our source language RSML(exp -e) to Java and conducted initial studies of how JPF can be used as a testing tool. In this final report, we provide a detailed description of the translation approach as implemented in our tools.
Optimal Test Design with Rule-Based Item Generation
ERIC Educational Resources Information Center
Geerlings, Hanneke; van der Linden, Wim J.; Glas, Cees A. W.
2013-01-01
Optimal test-design methods are applied to rule-based item generation. Three different cases of automated test design are presented: (a) test assembly from a pool of pregenerated, calibrated items; (b) test generation on the fly from a pool of calibrated item families; and (c) test generation on the fly directly from calibrated features defining…
Formal methods for test case generation
NASA Technical Reports Server (NTRS)
Rushby, John (Inventor); De Moura, Leonardo Mendonga (Inventor); Hamon, Gregoire (Inventor)
2011-01-01
The invention relates to the use of model checkers to generate efficient test sets for hardware and software systems. The method provides for extending existing tests to reach new coverage targets; searching *to* some or all of the uncovered targets in parallel; searching in parallel *from* some or all of the states reached in previous tests; and slicing the model relative to the current set of coverage targets. The invention provides efficient test case generation and test set formation. Deep regions of the state space can be reached within allotted time and memory. The approach has been applied to use of the model checkers of SRI's SAL system and to model-based designs developed in Stateflow. Stateflow models achieving complete state and transition coverage in a single test case are reported.
Automated Generation and Assessment of Autonomous Systems Test Cases
NASA Technical Reports Server (NTRS)
Barltrop, Kevin J.; Friberg, Kenneth H.; Horvath, Gregory A.
2008-01-01
This slide presentation reviews issues in the verification and validation testing of autonomous spacecraft, which routinely culminates in the exploration of anomalous or faulted mission-like scenarios, using the work involved in the Dawn mission's tests as examples. Prioritizing which scenarios to develop usually comes down to focusing on the most vulnerable areas and ensuring the best return on investment of test time. Rules-of-thumb strategies often come into play, such as injecting applicable anomalies prior to, during, and after system state changes, or creating cases that ensure good safety-net algorithm coverage. Although experience and judgment in test selection can lead to high levels of confidence about the majority of a system's autonomy, it is likely that important test cases are overlooked. One method to fill in potential test coverage gaps is to automatically generate and execute test cases using algorithms that ensure desirable properties about the coverage: for example, generating cases for all possible fault monitors and across all state change boundaries. Of course, the scope of coverage is determined by the test environment capabilities, where a faster-than-real-time, high-fidelity, software-only simulation would allow the broadest coverage. Even real-time systems that can be replicated and run in parallel, and that have reliable set-up and operations features, provide an excellent resource for automated testing. Making detailed predictions for the outcome of such tests can be difficult, and when algorithmic means are employed to produce hundreds or even thousands of cases, generating predictions individually is impractical, while generating predictions with tools requires executable models of the design and environment that themselves require a complete test program. Therefore, evaluating the results of a large number of mission scenario tests poses special challenges.
A good approach to address this problem is to automatically score the results based on a range of metrics. Although the specific means of scoring depends highly on the application, the use of formal scoring metrics has high value in identifying and prioritizing anomalies, and in presenting an overall picture of the state of the test program. In this paper we present a case study based on automatic generation and assessment of faulted test runs for the Dawn mission, and discuss its role in optimizing the allocation of resources for completing the test program.
Path generation algorithm for UML graphic modeling of aerospace test software
NASA Astrophysics Data System (ADS)
Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao
2018-03-01
Traditionally, aerospace software testing engineers rely on their own work experience and on communication with software developers to describe the software under test and to write test cases by hand, which is time-consuming, inefficient, and prone to gaps. With the high-reliability model-based testing (MBT) tool developed by our company, a single modeling pass can automatically generate test case documents, efficiently and accurately. Accurately expressing requirements with a UML model depends on the paths that can be reached, but existing path generation algorithms are either too simple, unable to combine branch paths with loop paths, or so cumbersome that they generate meaningless path permutations, which are superfluous for aerospace software testing. Drawing on our aerospace engineering experience, we developed a tailored path generation algorithm for UML graphic models of aerospace test software.
NASA Technical Reports Server (NTRS)
Generazio, Edward R. (Inventor)
2012-01-01
A method of validating a probability of detection (POD) testing system using directed design of experiments (DOE) includes recording an input data set of observed hit and miss or analog data for sample components as a function of size of a flaw in the components. The method also includes processing the input data set to generate an output data set having an optimal class width, assigning a case number to the output data set, and generating validation instructions based on the assigned case number. An apparatus includes a host machine for receiving the input data set from the testing system and an algorithm for executing DOE to validate the test system. The algorithm applies DOE to the input data set to determine a data set having an optimal class width, assigns a case number to that data set, and generates validation instructions based on the case number.
Chen, Derrick J; Yao, Joseph D
2017-06-01
Updated recommendations for HIV diagnostic laboratory testing published by the Centers for Disease Control and Prevention and the Association of Public Health Laboratories incorporate 4th generation HIV immunoassays, which are capable of identifying HIV infection prior to seroconversion. The purpose of this study was to compare turnaround time and cost between 3rd and 4th generation HIV immunoassay-based testing algorithms for initially reactive results. The clinical microbiology laboratory database at Mayo Clinic, Rochester, MN was queried for 3rd generation (from November 2012 to May 2014) and 4th generation (from May 2014 to November 2015) HIV immunoassay results. All results from downstream supplemental testing were recorded. Turnaround time (defined as the time of initial sample receipt in the laboratory to the time the final supplemental test in the algorithm was resulted) and cost (based on 2016 Medicare reimbursement rates) were assessed. A total of 76,454 and 78,998 initial tests were performed during the study period using the 3rd generation and 4th generation HIV immunoassays, respectively. There were 516 (0.7%) and 581 (0.7%) total initially reactive results, respectively. Of these, 304 (58.9%) and 457 (78.7%) were positive by supplemental testing. There were 10 (0.01%) cases of acute HIV infection identified with the 4th generation algorithm. The most frequent tests performed to confirm an HIV-positive case using the 3rd generation algorithm, which were reactive initial immunoassay and positive HIV-1 Western blot, took a median time of 1.1 days to complete at a cost of $45.00. In contrast, the most frequent tests performed to confirm an HIV-positive case using the 4th generation algorithm, which included a reactive initial immunoassay and positive HIV-1/-2 antibody differentiation immunoassay for HIV-1, took a median time of 0.4 days and cost $63.25. 
Overall median turnaround time was 2.2 and 1.5 days, and overall median cost was $63.90 and $72.50 for 3rd and 4th generation algorithms, respectively. Both 3rd and 4th generation HIV immunoassays had similar total numbers of tests performed and positivity rates during the study period. A greater proportion of reactive 4th generation immunoassays were confirmed to be positive, and the 4th generation algorithm identified several cases of acute HIV infection that would have been missed by the 3rd generation algorithm. The 4th generation algorithm had a more rapid turnaround time but higher cost for confirmed positive HIV infections and overall, compared to the 3rd generation algorithm.
Improved Ant Algorithms for Software Testing Cases Generation
Yang, Shunkun; Xu, Jiaqi
2014-01-01
Ant colony optimization (ACO) for software test case generation is a popular domain in software testing engineering. However, traditional ACO has flaws: pheromone is relatively scarce early in the search, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces stagnation and precocity. This paper introduces improved ACO variants for software test case generation: an improved local pheromone update strategy for ant colony optimization, an improved pheromone volatilization coefficient for ant colony optimization (IPVACO), and an improved global path pheromone update strategy for ant colony optimization (IGPACO). Finally, we put forward a comprehensive improved ant colony optimization (ACIACO), which combines all three methods. The proposed technique is compared with a random algorithm (RND) and a genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved method can effectively improve search efficiency, restrain precocity, promote case coverage, and reduce the number of iterations. PMID:24883391
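The paper's specific update strategies (IPVACO, IGPACO, ACIACO) are not reproduced here; the toy Python sketch below only illustrates the general ACO machinery they build on: roulette-wheel selection driven by pheromone levels plus an evaporation-based update, here deliberately biased so that already-covered branches lose pheromone and later ants drift toward uncovered coverage targets. All names and parameters are illustrative assumptions.

```python
import random

def aco_select(candidates, pheromone, rng):
    # Roulette-wheel selection proportional to pheromone levels.
    total = sum(pheromone[c] for c in candidates)
    r = rng.uniform(0, total)
    acc = 0.0
    for c in candidates:
        acc += pheromone[c]
        if acc >= r:
            return c
    return candidates[-1]

def aco_cover(branches, n_ants=50, rho=0.5, seed=1):
    """Toy ACO for coverage: each ant picks a branch to target; the
    chosen branch's pheromone evaporates by factor (1 - rho), biasing
    later ants toward branches not yet covered."""
    rng = random.Random(seed)
    pheromone = {b: 1.0 for b in branches}
    covered = set()
    for _ in range(n_ants):
        choice = aco_select(list(branches), pheromone, rng)
        covered.add(choice)
        pheromone[choice] *= (1.0 - rho)  # local evaporation update
    return covered

branches = [f"b{i}" for i in range(8)]
covered = aco_cover(branches)
```

Real ACO test generators score whole execution paths against a fitness function rather than single branches, but the select/evaporate/deposit loop has this same shape.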
Karpušenkaitė, Aistė; Ruzgas, Tomas; Denafas, Gintaras
2018-05-01
The aim of the study was to create a hybrid forecasting method that could produce higher accuracy forecasts than previously used 'pure' time series methods. These methods were already tested with total automotive waste, hazardous automotive waste, and total medical waste generation, but demonstrated at least a 6% error rate in different cases, and efforts were made to decrease it further. The newly developed hybrid models used a random start generation method to incorporate the advantages of different time-series methods, which helped to increase forecast accuracy by 3%-4% in the hazardous automotive waste and total medical waste generation cases; the new model did not increase the accuracy of total automotive waste generation forecasts. The developed models' abilities to produce short- and mid-term forecasts were tested using different prediction horizons.
Method of Generating Transient Equivalent Sink and Test Target Temperatures for Swift BAT
NASA Technical Reports Server (NTRS)
Choi, Michael K.
2004-01-01
The NASA Swift mission has a 600-km altitude and a 22 degrees maximum inclination. The sun angle varies from 45 degrees to 180 degrees in normal operation. As a result, environmental heat fluxes absorbed by the Burst Alert Telescope (BAT) radiator and loop heat pipe (LHP) compensation chambers (CCs) vary transiently. Therefore the equivalent sink temperatures for the radiator and CCs vary transiently. In thermal performance verification testing in vacuum, the radiator and CCs radiated heat to sink targets. This paper presents an analytical technique for generating orbit transient equivalent sink temperatures and a technique for generating transient sink target temperatures for the radiator and LHP CCs. Using these techniques, transient target temperatures for the radiator and LHP CCs were generated for three thermal environmental cases: worst hot case, worst cold case, and cooldown and warmup between the worst hot case in sunlight and the worst cold case in eclipse, and for three different heat transport values: 128 W, 255 W, and 382 W. The 128 W case assumed that the two LHPs transport 255 W equally to the radiator (about 128 W each). The 255 W case assumed that one LHP fails, so that the remaining LHP transports all the waste heat from the detector array to the radiator. The 382 W case assumed that one LHP fails, so that the remaining LHP transports all the waste heat from the detector array to the radiator, with a 50% design margin. All these transient target temperatures were successfully implemented in the engineering test unit (ETU) LHP and flight LHP thermal performance verification tests in vacuum.
MATTS- A Step Towards Model Based Testing
NASA Astrophysics Data System (ADS)
Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.
2016-08-01
In this paper we describe a model-based approach to testing of on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious application robustness; in other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve these problems, the software engineering process has to be improved in at least two respects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time- and resource-consuming, activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can also be partially automated. The basic idea behind the presented study was to start from a formal model (e.g. state machines), generate abstract test cases, and then convert them to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to an on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
Determination of HIV Status in African Adults With Discordant HIV Rapid Tests.
Fogel, Jessica M; Piwowar-Manning, Estelle; Donohue, Kelsey; Cummings, Vanessa; Marzinke, Mark A; Clarke, William; Breaud, Autumn; Fiamma, Agnès; Donnell, Deborah; Kulich, Michal; Mbwambo, Jessie K K; Richter, Linda; Gray, Glenda; Sweat, Michael; Coates, Thomas J; Eshleman, Susan H
2015-08-01
In resource-limited settings, HIV infection is often diagnosed using 2 rapid tests. If the results are discordant, a third tie-breaker test is often used to determine HIV status. This study characterized samples with discordant rapid tests and compared different testing strategies for determining HIV status in these cases. Samples were previously collected from 173 African adults in a population-based survey who had discordant rapid test results. Samples were classified as HIV positive or HIV negative using a rigorous testing algorithm that included two fourth-generation tests, a discriminatory test, and 2 HIV RNA tests. Tie-breaker tests were evaluated, including rapid tests (1 performed in-country), a third-generation enzyme immunoassay, and two fourth-generation tests. Selected samples were further characterized using additional assays. Twenty-nine samples (16.8%) were classified as HIV positive and 24 of those samples (82.8%) had undetectable HIV RNA. Antiretroviral drugs were detected in 1 sample. Sensitivity was 8.3%-43% for the rapid tests; 24.1% for the third-generation enzyme immunoassay; 95.8% and 96.6% for the fourth-generation tests. Specificity was lower for the fourth-generation tests than the other tests. Accuracy ranged from 79.5% to 91.3%. In this population-based survey, most HIV-infected adults with discordant rapid tests were virally suppressed without antiretroviral drugs. Use of individual assays as tie-breaker tests was not a reliable method for determining HIV status in these individuals. More extensive testing algorithms that use a fourth-generation screening test with a discriminatory test and HIV RNA test are preferable for determining HIV status in these cases.
A Micro-Computer Model for Army Air Defense Training.
1985-03-01
Describes the model's random number generator. The period is 32763 numbers generated before a repetitive sequence is encountered on the development system. Chi-squared tests for frequency and periodicity were performed; this was done with several different random number seeds, and in each case 32763 random numbers were generated before a repetitive sequence was encountered.
The Application of Surface Potential Test on Hand-making Insulation for Generator Stator End-winding
NASA Astrophysics Data System (ADS)
Lu, Zhu-mao; Liu, Qing; Wang, Tian-zheng; Bai, Lu; Li, Yan-peng
2017-05-01
This paper presents the advantages of the surface potential test on hand-made insulation for generator stator end-winding insulation detection, compared with DC or AC withstand voltage tests, and details the test principle, connection method, and test notes. Through a case study, the surface potential test on hand-made insulation proved effective for insulation quality detection after generator stator end-winding maintenance, and the experimental data are useful and reliable for electrical equipment operation and maintenance in the power plant.
Utility Bill Calibration Test Cases | Buildings | NREL
Illustrates the utility bill calibration test cases in BESTEST-EX. In these cases, participants are given data from which software results have been generated. A diagram provides an overview of the BESTEST-EX utility bill calibration case process; on the left side of the diagram is a box labeled "BESTEST-EX Document".
Creation and Delivery of New Superpixelized DIRBE Map Products
NASA Technical Reports Server (NTRS)
Weiland, J.
1998-01-01
Phase 1 called for the following tasks: (1) completion of code to generate intermediate files containing the individual DIRBE observations which would be used to make the superpixelized maps; (2) completion of code necessary to generate the maps themselves; and (3) quality control on test-case maps in the form of point-source extraction and photometry. Items 1 and 2 are well in hand and the tested code is nearly complete. A few test maps have been generated for the tests mentioned in item 3. Map generation is not in production mode yet.
Experiments with Test Case Generation and Runtime Analysis
NASA Technical Reports Server (NTRS)
Artho, Cyrille; Drusinsky, Doron; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Rosu, Grigore; Visser, Willem; Koga, Dennis (Technical Monitor)
2003-01-01
Software testing is typically an ad hoc process where human testers manually write many test inputs and expected test results, perhaps automating their execution in a regression suite. This process is cumbersome and costly. This paper reports preliminary results on an approach to further automate this process. The approach consists of combining automated test case generation, based on systematically exploring the program's input domain, with runtime analysis, where execution traces are monitored and verified against temporal logic specifications, or analyzed using advanced algorithms for detecting concurrency errors such as data races and deadlocks. The approach suggests generating specifications dynamically per input instance rather than statically once and for all. The paper describes experiments with variants of this approach in the context of two examples: a planetary rover controller and a spacecraft fault protection system.
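A minimal sketch of the runtime-analysis half of such an approach, under assumed event names (the events and property below are illustrative, not the paper's actual specifications): a monitor for the response property G(request → F ack), which checks a finite execution trace for a request that is never followed by an acknowledgment.

```python
def check_response(trace, trigger="request", response="ack"):
    """Runtime monitor for the temporal property G(trigger -> F response)
    over a finite trace: the property fails if, at the end of the trace,
    some trigger event has not yet been followed by a response event."""
    pending = False  # True while the most recent trigger awaits a response
    for event in trace:
        if event == trigger:
            pending = True
        elif event == response:
            pending = False
    return not pending

# A trace where every request is eventually acknowledged passes:
ok = check_response(["request", "work", "ack", "request", "ack"])
# A trace ending with an unanswered request fails:
bad = check_response(["request", "ack", "request"])
```

In a combined pipeline, a test case generator would produce the traces by driving the program over its input domain, and monitors like this one would verify each resulting execution.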
Test Generation Algorithm for Fault Detection of Analog Circuits Based on Extreme Learning Machine
Zhou, Jingyu; Tian, Shulin; Yang, Chenglin; Ren, Xuelong
2014-01-01
This paper proposes a novel test generation algorithm based on extreme learning machine (ELM); the algorithm is cost-effective and low-risk for an analog device under test (DUT). The method uses test patterns derived from the test generation algorithm to stimulate the DUT, and then samples the DUT's output responses for fault classification and detection. The ELM-based test generation algorithm proposed in this paper contains three main innovations. Firstly, the algorithm saves time by classifying the response space with ELM. Secondly, the algorithm avoids a loss of test precision when the number of impulse-response samples is reduced. Thirdly, a new test signal generator process and a test structure for the test generation algorithm are presented, both of which are very simple. Finally, these improvements are confirmed in experiments. PMID:25610458
Predicate Argument Structure Analysis for Use Case Description Modeling
NASA Astrophysics Data System (ADS)
Takeuchi, Hironori; Nakamura, Taiga; Yamaguchi, Takahira
In a large software system development project, many documents are prepared and updated frequently. In such a situation, support is needed for looking through these documents easily to identify inconsistencies and to maintain traceability. In this research, we focus on the requirements documents such as use cases and consider how to create models from the use case descriptions in unformatted text. In the model construction, we propose a few semantic constraints based on the features of the use cases and use them for a predicate argument structure analysis to assign semantic labels to actors and actions. With this approach, we show that we can assign semantic labels without enhancing any existing general lexical resources such as case frame dictionaries and design a less language-dependent model construction architecture. By using the constructed model, we consider a system for quality analysis of the use cases and automated test case generation to keep the traceability between document sets. We evaluated the reuse of the existing use cases and generated test case steps automatically with the proposed prototype system from real-world use cases in the development of a system using a packaged application. Based on the evaluation, we show how to construct models with high precision from English and Japanese use case data. Also, we could generate good test cases for about 90% of the real use cases through the manual improvement of the descriptions based on the feedback from the quality analysis system.
Model-Based GUI Testing Using Uppaal at Novo Nordisk
NASA Astrophysics Data System (ADS)
Hjort, Ulrik H.; Illum, Jacob; Larsen, Kim G.; Petersen, Michael A.; Skou, Arne
This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML state machine model, generates a test suite satisfying some testing criterion, such as edge or state coverage, and converts the individual test cases into a scripting language that can be automatically executed against the target. The tool has significantly reduced the time required for test construction and generation, and has reduced the number of test scripts while increasing coverage.
Potential generated inner and outside a circular wire in its plane. Application to Saturn's ring
NASA Astrophysics Data System (ADS)
Najid, N.-E.; Zegoumou, M.; El Ourabi, E. H.
2012-12-01
In this article we derive the development of the potential generated by a homogeneous wire bent into a circular shape (Najid, Jammari & Zegoumou, 2005). We develop the potential as a power series of the distance from an appropriate origin to the test particle, expressed in terms of Legendre polynomials. We study both the case where the test particle is inside the circular wire and the case where it is outside. By a Lagrangian formulation, we establish the differential equation of motion. Numerical resolution leads to different orbits: outside the wire we obtain a case where the test particle is confined between a maximum and a minimum of the radial position, while inside the wire the test particle can escape, depending on the integration time.
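For reference, a standard form such an in-plane expansion takes (a sketch consistent with classical potential theory, not necessarily the paper's exact notation or normalization), for a homogeneous ring of mass M and radius a, evaluated in the plane of the ring:

```latex
% Gravitational potential of a homogeneous circular ring of mass M
% and radius a, evaluated in the plane of the ring (\theta = \pi/2).
U_{\mathrm{in}}(r)  = -\frac{GM}{a}\sum_{n=0}^{\infty}
    \left[P_{2n}(0)\right]^{2}\left(\frac{r}{a}\right)^{2n},
    \qquad r < a,
\qquad
U_{\mathrm{out}}(r) = -\frac{GM}{r}\sum_{n=0}^{\infty}
    \left[P_{2n}(0)\right]^{2}\left(\frac{a}{r}\right)^{2n},
    \qquad r > a,
% with the Legendre polynomial values at the origin
P_{2n}(0) = \frac{(-1)^{n}\,(2n)!}{2^{2n}\,(n!)^{2}}.
```

Only even-order terms survive because the odd Legendre polynomials vanish at zero, reflecting the symmetry of the ring about its plane.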
Highly Efficient Design-of-Experiments Methods for Combining CFD Analysis and Experimental Data
NASA Technical Reports Server (NTRS)
Anderson, Bernhard H.; Haller, Harold S.
2009-01-01
It is the purpose of this study to examine the impact of "highly efficient" Design-of-Experiments (DOE) methods for combining sets of CFD-generated analysis data with smaller sets of experimental test data in order to accurately predict performance results where experimental test data were not obtained. The study examines the impact of micro-ramp flow control on the shock wave boundary layer (SWBL) interaction, where a complete paired set of data exists from both CFD analysis and experimental measurements. By combining the complete set of CFD analysis data, composed of fifteen (15) cases, with a smaller subset of experimental test data containing four or five (4/5) cases, compound data sets (CFD/EXP) were generated which allow the prediction of the complete set of experimental results. No statistical differences were found to exist between the combined (CFD/EXP) generated data sets and the complete experimental data set composed of fifteen (15) cases. The same optimal micro-ramp configuration was obtained using the (CFD/EXP) generated data as with the complete set of experimental data, and the DOE response surfaces generated by the two data sets were also not statistically different.
Shock Generation and Control Using DBD Plasma Actuators
NASA Technical Reports Server (NTRS)
Patel, Mehul P.; Cain, Alan B.; Nelson, Christopher C.; Corke, Thomas C.; Matlis, Eric H.
2012-01-01
This report is the final report of a NASA Phase I SBIR contract, with some revisions to remove company proprietary data. The Shock Boundary Layer Interaction (SBLI) phenomena in a supersonic inlet involve mutual interaction of oblique shocks with boundary layers, forcing the boundary layer to separate from the inlet wall. To improve the inlet efficiency, it is desired to prevent or delay shock-induced boundary layer separation. In this effort, Innovative Technology Applications Company (ITAC), LLC and the University of Notre Dame (UND) jointly investigated the use of dielectric-barrier-discharge (DBD) plasma actuators for control of SBLI in a supersonic inlet. The research investigated the potential for DBD plasma actuators to suppress flow separation caused by a shock in a turbulent boundary layer. The research involved both numerical and experimental investigations of plasma flow control for a few different SBLI configurations: (a) a 12-degree wedge flow test case at Mach 1.5 (numerical and experimental), (b) an impinging shock test case at Mach 1.5 using an airfoil as a shock generator (numerical and experimental), and (c) a Mach 2.0 nozzle flow case in a simulated 15 x 15 cm wind tunnel with a shock generator (numerical). Numerical studies were performed for all three test cases to examine the feasibility of plasma flow control concepts. These results were used to guide the wind tunnel experiments conducted on the Mach 1.5 12-degree wedge flow (case a) and the Mach 1.5 impinging shock test case (case b), which were at similar flow conditions as the corresponding numerical studies, to obtain experimental evidence of plasma control effects for SBLI control. The experiments also generated data that were used in validating the numerical studies for the baseline cases (without plasma actuators). The experiments were conducted in a Mach 1.5 test section in the University of Notre Dame Hessert Laboratory.
The simulation results from cases a and b indicated that multiple spanwise actuators in series, at a voltage of 75 kVp-p, could fully suppress the flow separation downstream of the shock. The simulation results from case c showed that the streamwise plasma actuators are highly effective in creating pairs of counter-rotating vortices, much like mechanical vortex generators, and could also potentially have beneficial effects for SBLI control. However, to achieve these effects, the positioning and the quantity of the DBD actuators used must be optimized. The wind tunnel experiments mapped the baseline flow with good agreement with the numerical simulations. The experiments with spanwise actuators for cases a and b were limited by the inability to generate a sufficiently high voltage due to arcing in the wind-tunnel test section. The static pressure in the tunnel was lower than the static pressure in an inlet at flight conditions, promoting arcing and degrading the actuator performance.
Nemesis Autonomous Test System
NASA Technical Reports Server (NTRS)
Barltrop, Kevin J.; Lee, Cin-Young; Horvath, Gregory A.; Clement, Bradley J.
2012-01-01
A generalized framework has been developed for systems validation that can be applied to both traditional and autonomous systems. The framework consists of an automated test case generation and execution system called Nemesis that rapidly and thoroughly identifies flaws or vulnerabilities within a system. By applying genetic optimization and goal-seeking algorithms on the test equipment side, a "war game" is conducted between a system and its complementary nemesis. The end result of the war games is a collection of scenarios that reveals any undesirable behaviors of the system under test. The software provides a reusable framework to evolve test scenarios using genetic algorithms and an operational model of the system under test. It can automatically generate and execute test cases that reveal flaws in behaviorally complex systems. Genetic algorithms focus the exploration of tests on the set of test cases that most effectively reveals the flaws and vulnerabilities of the system under test. It leverages advances in state- and model-based engineering, which are essential in defining the behavior of autonomous systems. It also uses goal networks to describe test scenarios.
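The evolutionary "war game" loop described above can be sketched with a toy genetic algorithm. The scenario encoding (a sequence of integer commands), the flawed system stub, and the fitness measure below are illustrative assumptions for the sketch, not Nemesis internals:

```python
import random

def system_under_test(scenario):
    # Toy stand-in for the system under test: it misbehaves whenever two
    # consecutive commands both exceed a threshold (a hidden flaw).
    return sum(1 for a, b in zip(scenario, scenario[1:]) if a > 7 and b > 7)

def fitness(scenario):
    # A scenario is fitter the more misbehaviors it reveals.
    return system_under_test(scenario)

def evolve(pop_size=40, length=12, generations=60, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 9) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, length)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:                # point mutation
                child[rng.randrange(length)] = rng.randint(0, 9)
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
```

Selection pressure steers the population toward scenarios that trigger the flaw, which is the sense in which the genetic search "focuses exploration" on flaw-revealing test cases.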
Incompleteness of Bluetooth protocol conformance test cases
NASA Astrophysics Data System (ADS)
Wu, Peng; Gao, Qiang
2001-10-01
This paper describes a formal method to verify the completeness of conformance testing, in which not only the Implementation Under Test (IUT) but also the conformance tester is formalized in SDL, so that conformance testing can be performed in a simulator provided with a CASE tool. The protocol suite considered is Bluetooth, an open wireless communication technology. Our results show that the Bluetooth conformance test specification is incomplete: it has only limited coverage, and many important capabilities defined in the Bluetooth core specification are not tested. We also give a detailed report on the missing test cases against the Bluetooth core specification and provide a guide for further test case generation.
NASA Astrophysics Data System (ADS)
Součková, Natálie; Kuklová, Jana; Popelka, Lukáš; Matějka, Milan
2012-04-01
This paper focuses on the suppression of flow separation, which occurs on a deflected flap, by means of vortex generators (VGs). An airfoil NACA 63A421 with a simple flap and vane-type vortex generators were used. The investigation was carried out using experimental and numerical methods. The data from the numerical simulation of the flapped airfoil without VG control were used for the vortex generator design. Two sizes, two different shapes, and various spacings of the vortex generators were tested. The flow past the airfoil was visualized with three methods, namely the tuft filament technique, oil visualization, and thermo-camera visualization. The experiments were performed in closed-circuit wind tunnels with closed and open test sections. The lift curves for the cases without and with vortex generators were acquired to determine the lift coefficient improvement. The improvement was confirmed for several cases by all of the applied methods.
Automated Test Case Generation for an Autopilot Requirement Prototype
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Rungta, Neha; Feary, Michael
2011-01-01
Designing safety-critical automation with robust human interaction is a difficult task that is susceptible to a number of known Human-Automation Interaction (HAI) vulnerabilities. It is therefore essential to develop automated tools that provide support both in the design and rapid evaluation of such automation. The Automation Design and Evaluation Prototyping Toolset (ADEPT) enables the rapid development of an executable specification for automation behavior and user interaction. ADEPT supports a number of analysis capabilities, thus enabling the detection of HAI vulnerabilities early in the design process, when modifications are less costly. In this paper, we advocate the introduction of a new capability to model-based prototyping tools such as ADEPT. The new capability is based on symbolic execution that allows us to automatically generate quality test suites based on the system design. Symbolic execution is used to generate both user input and test oracles: user input drives the testing of the system implementation, and test oracles ensure that the system behaves as designed. We present early results in the context of a component in the Autopilot system modeled in ADEPT, and discuss the challenges of test case generation in the HAI domain.
Deblauwe, Vincent; Kennel, Pol; Couteron, Pierre
2012-01-01
Background Independence between observations is a standard prerequisite of traditional statistical tests of association. This condition is, however, violated when autocorrelation is present within the data. In the case of variables that are regularly sampled in space (i.e. lattice data or images), such as those provided by remote-sensing or geographical databases, this problem is particularly acute. Because analytic derivation of the null probability distribution of the test statistic (e.g. Pearson's r) is not always possible when autocorrelation is present, we propose instead the use of a Monte Carlo simulation with surrogate data. Methodology/Principal Findings The null hypothesis that two observed mapped variables are the result of independent pattern generating processes is tested here by generating sets of random image data while preserving the autocorrelation function of the original images. Surrogates are generated by matching the dual-tree complex wavelet spectra (and hence the autocorrelation functions) of white noise images with the spectra of the original images. The generated images can then be used to build the probability distribution function of any statistic of association under the null hypothesis. We demonstrate the validity of a statistical test of association based on these surrogates with both actual and synthetic data and compare it with a corrected parametric test and three existing methods that generate surrogates (randomization, random rotations and shifts, and iterative amplitude adjusted Fourier transform). Type I error control was excellent, even with strong and long-range autocorrelation, which is not the case for alternative methods. Conclusions/Significance The wavelet-based surrogates are particularly appropriate in cases where autocorrelation appears at all scales or is direction-dependent (anisotropy). 
We explore the potential of the method for association tests involving a lattice of binary data and discuss its potential for validation of species distribution models. An implementation of the method in Java for the generation of wavelet-based surrogates is available online as supporting material. PMID:23144961
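The structure of the surrogate-based Monte Carlo test can be sketched as follows. For brevity the sketch uses plain permutation surrogates, which are the naive baseline the paper improves upon; the wavelet-based method would instead generate surrogates whose autocorrelation function matches that of the original images. The data and statistic here are illustrative:

```python
import math
import random

def pearson_r(x, y):
    # Pearson's correlation coefficient, the statistic of association.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def monte_carlo_test(x, y, n_surrogates=999, seed=0):
    """Monte Carlo p-value for the null hypothesis that x and y are
    generated by independent processes. Surrogates here are plain
    permutations of x; the wavelet-based method would instead build
    surrogates that preserve the autocorrelation of x under the null."""
    rng = random.Random(seed)
    observed = abs(pearson_r(x, y))
    exceed = 0
    for _ in range(n_surrogates):
        xs = x[:]
        rng.shuffle(xs)
        if abs(pearson_r(xs, y)) >= observed:
            exceed += 1
    return (exceed + 1) / (n_surrogates + 1)

x = list(range(30))
y = [2 * i + 1 for i in range(30)]    # perfectly correlated with x
p_value = monte_carlo_test(x, y)
```

Whatever surrogate generator is plugged in, the test logic is the same: the surrogates build the null distribution of the statistic, and the p-value is the fraction of surrogates at least as extreme as the observed value.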
Generating Test Templates via Automated Theorem Proving
NASA Technical Reports Server (NTRS)
Kancherla, Mani Prasad
1997-01-01
Testing can be used during the software development process to maintain fidelity between evolving specifications, program designs, and code implementations. We use a form of specification-based testing that employs an automated theorem prover to generate test templates. A similar approach was developed using a model checker on state-intensive systems. This method applies to systems with functional rather than state-based behaviors. This approach allows for the use of incomplete specifications to aid in generation of tests for potential failure cases. We illustrate the technique on the canonical triangle testing problem and discuss its use in the analysis of a spacecraft scheduling system.
Universal Verification Methodology Based Register Test Automation Flow.
Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu
2016-05-01
In today's SoC designs, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we propose a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models into a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe a register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. We also automate all the steps involved in integrating the test-bench and generating test cases, so that designers may use register models without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor) and writing a test sequence for each type of register test case. With the proposed flow, designers can save a considerable amount of time when verifying the functionality of registers.
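The spreadsheet-to-IP-XACT translation step can be sketched as below. This is a minimal illustration under stated assumptions: the row layout is an invented example rather than the paper's template, and the element names follow the spirit of IP-XACT (IEEE 1685) while omitting the `ipxact:` namespace and many mandatory elements:

```python
import xml.etree.ElementTree as ET

def register_to_xml(row):
    """Translate one spreadsheet-style register row into a simplified
    IP-XACT-like register description. A real flow would emit the full
    IEEE 1685 schema for consumption by register-model generators."""
    reg = ET.Element("register")
    ET.SubElement(reg, "name").text = row["name"]
    ET.SubElement(reg, "addressOffset").text = row["offset"]
    ET.SubElement(reg, "size").text = str(row["size"])
    ET.SubElement(reg, "access").text = row["access"]
    for f in row["fields"]:
        fld = ET.SubElement(reg, "field")
        ET.SubElement(fld, "name").text = f["name"]
        ET.SubElement(fld, "bitOffset").text = str(f["lsb"])
        ET.SubElement(fld, "bitWidth").text = str(f["width"])
    return reg

# Hypothetical spreadsheet row (the column layout is an assumption).
row = {"name": "CTRL", "offset": "0x04", "size": 32, "access": "read-write",
       "fields": [{"name": "EN", "lsb": 0, "width": 1},
                  {"name": "MODE", "lsb": 1, "width": 3}]}
xml_text = ET.tostring(register_to_xml(row), encoding="unicode")
```

The point of the template is exactly this kind of mechanical mapping: designers fill in rows, and the flow derives the verbose XML that the commercial register-model generators expect.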
An assessment of unstructured grid technology for timely CFD analysis
NASA Technical Reports Server (NTRS)
Kinard, Tom A.; Schabowski, Deanne M.
1995-01-01
An assessment of two unstructured methods is presented in this paper. A tetrahedral unstructured method, USM3D, developed at NASA Langley Research Center, is compared to a Cartesian unstructured method, SPLITFLOW, developed at Lockheed Fort Worth Company. USM3D is an upwind finite volume solver that accepts grids generated primarily by the Vgrid grid generator. SPLITFLOW combines an unstructured grid generator with an implicit flow solver in one package. Both methods are exercised on three test cases: a wing, a wing-body, and a fully expanded nozzle. The results for the first two runs are included here and compared to the structured grid method TEAM and to available test data. For each test case, the setup procedure is described, including any difficulties that were encountered. Detailed descriptions of the solvers are not included in this paper.
A Test Generation Framework for Distributed Fault-Tolerant Algorithms
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn; Bushnell, David; Miner, Paul; Pasareanu, Corina S.
2009-01-01
Heavyweight formal methods such as theorem proving have been successfully applied to the analysis of safety critical fault-tolerant systems. Typically, the models and proofs performed during such analysis do not inform the testing process of actual implementations. We propose a framework for generating test vectors from specifications written in the Prototype Verification System (PVS). The methodology uses a translator to produce a Java prototype from a PVS specification. Symbolic (Java) PathFinder is then employed to generate a collection of test cases. A small example is employed to illustrate how the framework can be used in practice.
Real-Time Extended Interface Automata for Software Testing Cases Generation
Yang, Shunkun; Xu, Jiaqi; Man, Tianlong; Liu, Bin
2014-01-01
Testing and verification of the interfaces between software components are particularly important due to the large number of complex interactions, which requires traditional modeling languages to overcome their shortcomings in describing temporal information and controlling software testing inputs. This paper presents the real-time extended interface automata (RTEIA), which add clearer and more detailed temporal information by the application of time words. We also establish an input interface automaton for every input in order to solve the problems of input control and interface coverage flexibly when applied in the software testing field. Detailed definitions of the RTEIA and the test case generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of a real aircraft braking system. PMID:24892080
Validating an artificial intelligence human proximity operations system with test cases
NASA Astrophysics Data System (ADS)
Huber, Justin; Straub, Jeremy
2013-05-01
An artificial intelligence-controlled robot (AICR) operating in close proximity to humans poses risk to those humans. Validating the performance of an AICR is an ill-posed problem, due to the complexity introduced by erratic (non-computer) actors. In order to prove the AICR's usefulness, test cases must be generated to simulate the actions of these actors. This paper discusses AICR performance validation in the context of a common human activity, moving through a crowded corridor, using test cases created by an AI use case producer. This test is a two-dimensional simplification relevant to autonomous UAV navigation in the national airspace.
Embedded object concept with a telepresence robot system
NASA Astrophysics Data System (ADS)
Vallius, Tero; Röning, Juha
2005-10-01
This paper presents the Embedded Object Concept (EOC) and a telepresence robot system which is a test case for the EOC. The EOC utilizes common object-oriented methods used in software by applying them to combined Lego-like software-hardware entities. These entities represent objects in object-oriented design methods, and they are the building blocks of embedded systems. The goal of the EOC is to make the design of embedded systems faster and easier. This concept enables people without comprehensive knowledge of electronics design to create new embedded systems, and for experts it shortens the design time of new embedded systems. We present the current status of the EOC, including two generations of embedded objects named Atomi objects. The first generation of the Atomi objects has been tested with different applications and found to be functional, but not optimal. The second generation aims to correct the issues found with the first generation, and it is being tested in a relatively complex test case. The test case is a telepresence robot consisting of a two-wheeled, human-height robot and its computer counterpart. The robot has been constructed using incremental device development, which is made possible by the architecture of the EOC. The robot contains video and audio exchange capability, and a controlling and balancing system for driving with two wheels. The robot is built in two versions, the first consisting of a PDA device and Atomi objects, and the second consisting of only Atomi objects. The robot is currently incomplete, but for the most part it has been successfully tested.
Specifications for a coupled neutronics thermal-hydraulics SFR test case
NASA Astrophysics Data System (ADS)
Tassone, A.; Smirnov, A. D.; Tikhomirov, G. V.
2017-01-01
Coupled neutronics/thermal-hydraulics calculations for the design of nuclear reactors are a growing trend in the scientific community. This approach properly represents the mutual feedback between the neutronic distribution and the thermal-hydraulic properties of the materials composing the reactor, details which are often lost when separate analyses are performed. In this work, a test case for a Generation IV sodium-cooled fast reactor (SFR), based on the ASTRID concept developed by CEA, is proposed. Two sub-assemblies (SA) characterized by different fuel enrichment and layout are considered. Specifications for the test case are provided, including geometrical data, material compositions, thermo-physical properties, and coupling scheme details. Serpent and ANSYS-CFX are used as references in the description of suitable inputs for performing the benchmark, but the use of other code combinations for the purpose of validating the results is encouraged. The expected outcomes of the test case are the axial distributions of the volumetric power generation term (q‴), density, and temperature for the fuel, the cladding, and the coolant.
Ten Broek, Roel W; Bekers, Elise M; de Leng, Wendy W J; Strengman, Eric; Tops, Bastiaan B J; Kutzner, Heinz; Leeuwis, Jan Willem; van Gorp, Joost M; Creytens, David H; Mentzel, Thomas; van Diest, Paul J; Eijkelenboom, Astrid; Flucke, Uta
2017-12-01
Spindle cell hemangioma (SCH) is a distinct vascular soft-tissue lesion characterized by cavernous blood vessels and a spindle cell component mainly occurring in the distal extremities of young adults. The majority of cases harbor heterozygous mutations in IDH1/2 sporadically or rarely in association with Maffucci syndrome. However, based on mosaicism and accordingly a low percentage of lesional cells harboring a mutant allele, detection can be challenging. We tested 19 sporadic SCHs by Sanger sequencing, multiplex ligation-dependent probe amplification (MLPA), conventional next generation sequencing (NGS), and NGS using a single molecule molecular inversion probes (smMIP)-based library preparation to compare their diagnostic value. Out of 10 cases tested by Sanger sequencing and 2 analyzed using MLPA, 4 and 1, respectively, revealed a mutation in IDH1 (p.R132C). The 7 remaining negative cases and additional 6 cases were investigated using smMIP/NGS, showing hot spot mutations in IDH1 (p.R132C) (8 cases) and IDH2 (3 cases; twice p.R172S and once p.R172G, respectively). One case was negative. Owing to insufficient DNA quality and insufficient coverage, 2 cases were excluded. In total, in 16 out of 17 cases successfully tested, an IDH1/2 mutation was found. Given that IDH1/2 mutations were absent in 161 other vascular lesions tested by smMIP/NGS, the mutation can be considered as highly specific for SCH. © 2017 Wiley Periodicals, Inc.
Analysis of subsonic wind tunnel with variation shape rectangular and octagonal on test section
NASA Astrophysics Data System (ADS)
Rhakasywi, D.; Ismail; Suwandi, A.; Fadhli, A.
2018-02-01
Good design in the aerodynamics field requires a wind tunnel, and the wind tunnel design required in this case must be capable of generating laminar flow. This research investigated wind tunnel models with rectangular and octagonal test-section variations, with the objective of generating laminar flow in the test section. The research method used a numerical CFD (Computational Fluid Dynamics) approach together with manual analysis to analyze the internal flow in the test section. The CFD simulation results and the manual analysis indicate that the optimal design for generating laminar flow in the test section is an octagonal shape without fillets.
The risk of a second diagnostic window with 4th generation HIV assays: Two cases.
Niederhauser, C; Ströhle, A; Stolz, M; Müller, F; Tinguely, C
2009-08-01
Despite the improved sensitivity of the 4th-generation combined antigen/antibody HIV assays, detection of HIV in the early phase of an infection may still be ineffective. We describe two cases that highlight the existence of the "second diagnostic window phase" observed with commonly used, sensitive 4th-generation HIV assays. Samples were screened with different 4th-generation HIV assays. HIV infection was confirmed with an HIV I/II antibody assay, an HIV-1 p24 antigen assay, the INNO-LIA HIV I/II Score Line immunoassay, and HIV-1 PCR. In both investigated cases, the limitations of the 4th-generation HIV assays within the second diagnostic window were apparent. The overall sensitivity of the commercial 4th-generation HIV assays is currently higher than that of the 3rd-generation HIV assays. Nevertheless, the rare occurrence of a second diagnostic window with 4th-generation HIV assays strongly suggests that follow-up testing algorithms need to be adjusted accordingly.
Maltese, Paolo E; Iarossi, Giancarlo; Ziccardi, Lucia; Colombo, Leonardo; Buzzonetti, Luca; Crinò, Antonino; Tezzele, Silvia; Bertelli, Matteo
2018-02-01
The obesity phenotype can be manifested as an isolated trait or accompanied by multisystem disorders as part of a syndromic picture. In both situations, the same molecular pathways may be involved to different degrees. This evidence is stronger in syndromic obesity, in which the phenotypes of different syndromes may overlap. In these cases, genetic testing can unequivocally provide a final diagnosis. Here we describe a patient who met the diagnostic criteria for Alström syndrome only during adolescence. Genetic testing was requested at 25 years of age for a final confirmation of the diagnosis. The genetic diagnosis of Alström syndrome was obtained through a Next Generation Sequencing genetic test using a custom-designed panel of 47 genes associated with syndromic and non-syndromic obesity. Genetic analysis revealed a novel homozygous frameshift variant p.(Arg1550Lysfs*10) on exon 8 of the ALMS1 gene. This case shows the need for a revision of the diagnostic criteria guidelines as a consequence of the recent advent of massively parallel sequencing technology. The indications for genetic testing in the currently accepted diagnostic criteria for Alström syndrome were drafted when sequencing was expensive and time-consuming. Nowadays, Next Generation Sequencing testing could be considered a first-line diagnostic tool not only for Alström syndrome but, more generally, for all those atypical or not clearly distinguishable cases of syndromic obesity, thus avoiding delayed diagnosis and treatment. Early diagnosis permits better follow-up and pre-symptomatic interventions. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Modal Survey of ETM-3, A 5-Segment Derivative of the Space Shuttle Solid Rocket Booster
NASA Technical Reports Server (NTRS)
Nielsen, D.; Townsend, J.; Kappus, K.; Driskill, T.; Torres, I.; Parks, R.
2005-01-01
The complex interactions between internal motor generated pressure oscillations and motor structural vibration modes associated with the static test configuration of a Reusable Solid Rocket Motor have potential to generate significant dynamic thrust loads in the 5-segment configuration (Engineering Test Motor 3). Finite element model load predictions for worst-case conditions were generated based on extrapolation of a previously correlated 4-segment motor model. A modal survey was performed on the largest rocket motor to date, Engineering Test Motor #3 (ETM-3), to provide data for finite element model correlation and validation of model generated design loads. The modal survey preparation included pretest analyses to determine an efficient analysis set selection using the Effective Independence Method and test simulations to assure critical test stand component loads did not exceed design limits. Historical Reusable Solid Rocket Motor modal testing, ETM-3 test analysis model development and pre-test loads analyses, as well as test execution, and a comparison of results to pre-test predictions are discussed.
Dynamic test input generation for multiple-fault isolation
NASA Technical Reports Server (NTRS)
Schaefer, Phil
1990-01-01
Recent work in Causal Reasoning has provided practical techniques for multiple-fault diagnosis. These techniques provide a hypothesis/measurement diagnosis cycle. Using probabilistic methods, they choose the best measurements to make, then update fault hypotheses in response. For many applications, such as computers and spacecraft, few measurement points may be accessible, or values may change quickly as the system under diagnosis operates. In these cases, a hypothesis/measurement cycle is insufficient. A technique is presented for a hypothesis/test-input/measurement diagnosis cycle. In contrast to generating tests a priori for determining device functionality, it dynamically generates tests in response to current knowledge about fault probabilities. It is shown how the mathematics previously used for measurement specification can be applied to the test input generation process. An example from an efficient implementation called Multi-Purpose Causal (MPC) is presented.
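The idea of choosing a test input from current fault probabilities can be sketched as an expected-entropy-reduction criterion. The fault hypotheses, outcome likelihoods, and numbers below are illustrative assumptions for the sketch, not the MPC implementation:

```python
import math

def entropy(p):
    # Shannon entropy (bits) of a probability distribution.
    return -sum(q * math.log2(q) for q in p if q > 0)

def posterior(prior, likelihoods, outcome):
    # Bayes: P(fault | outcome) ∝ P(outcome | fault) * P(fault)
    post = [l[outcome] * p for l, p in zip(likelihoods, prior)]
    z = sum(post)
    return [q / z for q in post]

def expected_entropy(prior, likelihoods, outcomes):
    # Entropy of the fault distribution expected after running the test.
    total = 0.0
    for o in outcomes:
        p_o = sum(l[o] * p for l, p in zip(likelihoods, prior))
        if p_o > 0:
            total += p_o * entropy(posterior(prior, likelihoods, o))
    return total

def best_test(prior, tests, outcomes):
    # Pick the test input that most reduces uncertainty about the fault.
    return min(tests, key=lambda t: expected_entropy(prior, tests[t], outcomes))

# Two equally likely fault hypotheses; test "A" discriminates between
# them, test "B" does not (the numbers are illustrative).
prior = [0.5, 0.5]
tests = {
    "A": [{"pass": 0.9, "fail": 0.1}, {"pass": 0.1, "fail": 0.9}],
    "B": [{"pass": 0.5, "fail": 0.5}, {"pass": 0.5, "fail": 0.5}],
}
choice = best_test(prior, tests, ["pass", "fail"])
```

After each observed outcome the prior is replaced by the posterior and the selection repeats, giving the hypothesis/test-input/measurement cycle described above.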
Symbolic PathFinder: Symbolic Execution of Java Bytecode
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Rungta, Neha
2010-01-01
Symbolic Pathfinder (SPF) combines symbolic execution with model checking and constraint solving for automated test case generation and error detection in Java programs with unspecified inputs. In this tool, programs are executed on symbolic inputs representing multiple concrete inputs. Values of variables are represented as constraints generated from the analysis of Java bytecode. The constraints are solved using off-the-shelf solvers to generate test inputs guaranteed to achieve complex coverage criteria. SPF has been used successfully at NASA, in academia, and in industry.
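The effect of per-path test generation can be illustrated with a toy sketch. SPF derives path constraints from Java bytecode and hands them to a constraint solver; the sketch below substitutes a brute-force search over a small finite domain for the solver, and the example program is an invented illustration:

```python
from itertools import product

def program(x, y):
    # Example program with three feasible paths, labeled by return value.
    if x > y:
        if x + y > 10:
            return "A"
        return "B"
    return "C"

def generate_tests(domain=range(-5, 15)):
    """Find one concrete input per distinct program path, identified
    here by the returned label. Real symbolic execution derives the
    path constraints (e.g. x > y and x + y > 10) symbolically and
    solves them; this sketch brute-forces a small finite domain."""
    tests = {}
    for x, y in product(domain, repeat=2):
        label = program(x, y)
        if label not in tests:
            tests[label] = (x, y)
        if len(tests) == 3:      # all paths covered
            break
    return tests

suite = generate_tests()
```

Each entry of `suite` is a concrete input that drives execution down one path, which is how a solver-backed tool guarantees path (and hence branch) coverage by construction.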
NASA's Spaceliner 100 Investment Area Technology Activities
NASA Technical Reports Server (NTRS)
Hueter, Uwe; Lyles, Garry M. (Technical Monitor)
2001-01-01
NASA has established long-term goals for access to space. The third-generation launch systems are to be fully reusable and operational around 2025. The goals for the third-generation launch system are to reduce cost by a factor of 100 and improve safety by a factor of 10,000 over current conditions. The Advanced Space Transportation Program Office (ASTP) at NASA's Marshall Space Flight Center in Huntsville, AL has the agency lead to develop space transportation technologies. Within ASTP, under the Spaceliner100 Investment Area, third-generation technologies are being pursued in the areas of propulsion, airframes, integrated vehicle health management (IVHM), launch systems, and operations and range. The ASTP program will mature these technologies through ground system testing. Flight testing, where required, will be advocated on a case-by-case basis.
Wu, Abraham J; Bosch, Walter R; Chang, Daniel T; Hong, Theodore S; Jabbour, Salma K; Kleinberg, Lawrence R; Mamon, Harvey J; Thomas, Charles R; Goodman, Karyn A
2015-07-15
Current guidelines for esophageal cancer contouring are derived from traditional 2-dimensional fields based on bony landmarks, and they do not provide sufficient anatomic detail to ensure consistent contouring for more conformal radiation therapy techniques such as intensity modulated radiation therapy (IMRT). Therefore, we convened an expert panel with the specific aim to derive contouring guidelines and generate an atlas for the clinical target volume (CTV) in esophageal or gastroesophageal junction (GEJ) cancer. Eight expert academically based gastrointestinal radiation oncologists participated. Three sample cases were chosen: a GEJ cancer, a distal esophageal cancer, and a mid-upper esophageal cancer. Uniform computed tomographic (CT) simulation datasets and accompanying diagnostic positron emission tomographic/CT images were distributed to each expert, and the expert was instructed to generate gross tumor volume (GTV) and CTV contours for each case. All contours were aggregated and subjected to quantitative analysis to assess the degree of concordance between experts and to generate draft consensus contours. The panel then refined these contours to generate the contouring atlas. The κ statistics indicated substantial agreement between panelists for each of the 3 test cases. A consensus CTV atlas was generated for the 3 test cases, each representing common anatomic presentations of esophageal cancer. The panel agreed on guidelines and principles to facilitate the generalizability of the atlas to individual cases. This expert panel successfully reached agreement on contouring guidelines for esophageal and GEJ IMRT and generated a reference CTV atlas. This atlas will serve as a reference for IMRT contours for clinical practice and prospective trial design. Subsequent patterns of failure analyses of clinical datasets using these guidelines may require modification in the future. Copyright © 2015 Elsevier Inc. All rights reserved.
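Inter-rater agreement of the kind the panel reports can be quantified with Cohen's kappa; the minimal sketch below handles two raters over binary in/out-of-contour labels and is illustrative only (the study's κ analysis over multiple experts and volumetric contours is more involved, and the voxel labels here are invented):

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical labels, e.g. whether
    each voxel lies inside or outside a contour. Corrects the raw
    agreement rate for the agreement expected by chance."""
    assert len(a) == len(b)
    n = len(a)
    categories = set(a) | set(b)
    p_obs = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    p_exp = sum((a.count(c) / n) * (b.count(c) / n)        # chance agreement
                for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical voxel labels from two raters (1 = inside contour).
rater1 = [1, 1, 1, 0, 0, 0, 1, 0, 1, 1]
rater2 = [1, 1, 0, 0, 0, 0, 1, 0, 1, 1]
kappa = cohens_kappa(rater1, rater2)
```

On the usual interpretation scale, values above roughly 0.6 indicate substantial agreement, the level the panel reports for all three test cases.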
Expert consensus contouring guidelines for IMRT in esophageal and gastroesophageal junction cancer
Wu, Abraham J.; Bosch, Walter R.; Chang, Daniel T.; Hong, Theodore S.; Jabbour, Salma K.; Kleinberg, Lawrence R.; Mamon, Harvey J.; Thomas, Charles R.; Goodman, Karyn A.
2015-01-01
Purpose/Objective(s) Current guidelines for esophageal cancer contouring are derived from traditional two-dimensional fields based on bony landmarks, and do not provide sufficient anatomical detail to ensure consistent contouring for more conformal radiotherapy techniques such as intensity-modulated radiation therapy (IMRT). Therefore, we convened an expert panel with the specific aim to derive contouring guidelines and generate an atlas for the clinical target volume (CTV) in esophageal or gastroesophageal junction (GEJ) cancer. Methods and Materials Eight expert academically-based gastrointestinal radiation oncologists participated. Three sample cases were chosen: a GEJ cancer, a distal esophageal cancer, and a mid-upper esophageal cancer. Uniform CT simulation datasets and an accompanying diagnostic PET-CT were distributed to each expert, and he/she was instructed to generate gross tumor volume (GTV) and CTV contours for each case. All contours were aggregated and subjected to quantitative analysis to assess the degree of concordance between experts and generate draft consensus contours. The panel then refined these contours to generate the contouring atlas. Results Kappa statistics indicated substantial agreement between panelists for each of the three test cases. A consensus CTV atlas was generated for the three test cases, each representing common anatomic presentations of esophageal cancer. The panel agreed on guidelines and principles to facilitate the generalizability of the atlas to individual cases. Conclusions This expert panel successfully reached agreement on contouring guidelines for esophageal and GEJ IMRT and generated a reference CTV atlas. This atlas will serve as a reference for IMRT contours for clinical practice and prospective trial design. Subsequent patterns of failure analyses of clinical datasets utilizing these guidelines may require modification in the future. PMID:26104943
IGB grid: User's manual (A turbomachinery grid generation code)
NASA Technical Reports Server (NTRS)
Beach, T. A.; Hoffman, G.
1992-01-01
A grid generation code called IGB is presented for use in computational investigations of turbomachinery flowfields. It contains a combination of algebraic and elliptic techniques coded for use on an interactive graphics workstation. The instructions for use and a test case are included.
Electrostatic Hazard Considerations for ODC Solvent Replacement Selection Testing
NASA Technical Reports Server (NTRS)
Fairbourn, Brad
1999-01-01
ODC solvents are used to clean many critical substrates during solid rocket motor production operations. Electrostatic charge generation incidental to these cleaning operations can pose a major safety issue. Therefore, while determining the acceptability of various ODC replacement cleaners, one aspect of the selection criteria included determining the extent of electric charge generation during a typical solvent cleaning operation. A total of six candidate replacement cleaners, sixteen critical substrates, and two types of cleaning swatch materials were studied in simulated cleaning operations. Charge generation and accumulation effects were investigated by measuring the peak voltage and brush discharging effects associated with each cleaning process combination. In some cases, charge generation was found to be very severe. Using the conductivity information for each cleaner, the peak voltage data could, in some cases, be qualitatively predicted. Test results indicated that severe charging effects could result in brush discharges that could potentially result in flash fire hazards when occurring in close proximity to flammable vapor/air mixtures. Process controls to effectively mitigate these hazards are discussed.
Kremser, Andreas; Dressig, Julia; Grabrucker, Christine; Liepert, Anja; Kroell, Tanja; Scholl, Nina; Schmid, Christoph; Tischer, Johanna; Kufner, Stefanie; Salih, Helmut; Kolb, Hans Jochem; Schmetzer, Helga
2010-01-01
Myeloid-leukemic cells (AML, MDS, CML) can be differentiated to leukemia-derived dendritic cells [DC (DCleu)], potentially presenting the whole leukemic antigen repertoire without knowledge of distinct leukemia antigens, and are regarded as promising candidates for a vaccination strategy. We studied the capability of 6 serum-free DC culture methods, chosen according to different mechanisms, to induce DC differentiation in 137 cases of AML and 52 cases of MDS. DC-stimulating substances were cytokines ("standard-medium", "MCM-Mimic", "cytokine-method"), bacterial lysates ("Picibanil"), double-stranded RNA ["Poly (I:C)"], or a cytokine bypass method ("Ca-ionophore"). The quality/quantity of DC generated was estimated by flow cytometry studying (co)expressions of "DC" antigens, costimulatory, maturation, and blast antigens. Comparing these methods, on average 15% to 32% DC, depending on the method used, could be obtained from blast-containing mononuclear cells (MNC) in AML/MDS cases, with a DC viability of more than 60%. In all, 39% to 64% of these DC were mature; 31% to 52% of leukemic blasts could be converted to DCleu, and DCleu proportions in the suspension were 2% to 70% (13%). Average results of all culture methods tested were comparable; however, not every given case of AML could be differentiated to DC with 1 selected method. However, performing a pre-analysis with 3 DC-generating methods (MCM-Mimic, Picibanil, Ca-ionophore), we could generate DC in any given case. Functional analyses provided proof that DC primed T cells to antileukemia-directed cytotoxic cells, although an anti-leukemic reaction was not achieved in every case. In summary, our data show that a successful, quantitative DC/DCleu generation is possible with the best of 3 previously tested methods in any given case. Reasons for different functional behaviors of DC-primed T cells must be evaluated to design a practicable DC-based vaccination strategy.
Verification and Validation in a Rapid Software Development Process
NASA Technical Reports Server (NTRS)
Callahan, John R.; Easterbrook, Steve M.
1997-01-01
The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.
Test Bench for Coupling and Shielding Magnetic Fields
NASA Astrophysics Data System (ADS)
Jordan, J.; Esteve, V.; Dede, E.; Sanchis, E.; Maset, E.; Ferreres, A.; Ejea, J. B.; Cases, C.
2016-05-01
This paper describes a test bench for training purposes, which uses a magnetic field generator to couple a magnetic field to a victim circuit. It can be very useful for magnetic susceptibility testing as well. The magnetic field generator consists of a board which generates a variable current that flows into a printed circuit board with spiral tracks (noise generator). The victim circuit consists of a coaxial cable concentric with the spiral tracks and their generated magnetic field. The coaxial cable is part of a circuit which conducts a signal produced by a signal generator to a resistive load. In the paper, three cases are studied. First, the transmitted signal from the signal generator uses the central conductor of the coaxial cable and the shield is left floating. Second, the shield is short-circuited at its ends (thus forming a loop). Third, the shield is connected in series with the inner conductor, so that the current flows into the coax via the inner conductor and returns via the shield.
2016-10-04
analysis (due to site-level evaluations), but could be added in the future, include: wind turbines (the installations we visited were not interested due...procurement, operation, maintenance, testing, and fueling of the generators, detailed inventory and cost data is difficult to obtain. The DPW is often...understaffed, leading to uneven testing and maintenance of the equipment despite their best efforts. The reliability of these generators is typically
Lessons Learned During Instrument Testing for the Thermal Infrared Sensor (TIRS)
NASA Technical Reports Server (NTRS)
Peabody, Hume L.; Otero, Veronica; Neuberger, David
2013-01-01
The Thermal InfraRed Sensor (TIRS) instrument, set to launch on the Landsat Data Continuity Mission in 2013, features a passively cooled telescope and IR detectors which are actively cooled by a two-stage cryocooler. In order to proceed to the instrument level test campaign, at least one full functional test was required, necessitating a thermal vacuum test to sufficiently cool the detectors and demonstrate performance. This was fairly unique in that this test occurred before the Pre-Environmental Review (PER), but yielded significant knowledge gains before the planned instrument level test. During the pre-PER test, numerous discrepancies were found between the model and the actual hardware, which were revealed by poor correlation between model predictions and test data. With the inclusion of pseudo-balance points, the test also provided an opportunity to perform a pre-correlation to test data prior to the instrument level test campaign. Various lessons were learned during this test related to modeling and design of both the flight hardware and the Ground Support Equipment and test setup. The lessons learned in the pre-PER test resulted in a better test setup for the instrument level test and the completion of the final instrument model correlation in a shorter period of time. Upon completion of the correlation, the flight predictions were generated including the full suite of off-nominal cases, including some new cases defined by the spacecraft. For some of these new cases, some components now revealed limit exceedances, in particular for a portion of the hardware that could not be tested due to its size and chamber limitations. Further lessons were learned during the completion of flight predictions. With a correlated detailed instrument model, significant efforts were made to generate a reduced model suitable for observatory level analyses.
This proved a major effort, both to generate an appropriate network and to convert the final model to the required format, and it yielded additional lessons learned. In spite of all the challenges encountered by TIRS, the instrument was successfully delivered to the spacecraft and will soon be tested at observatory level in preparation for a successful mission launch.
An Extended IEEE 118-Bus Test System With High Renewable Penetration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pena, Ivonne; Martinez-Anido, Carlo Brancucci; Hodge, Bri-Mathias
This article describes a new publicly available version of the IEEE 118-bus test system, named NREL-118. The database is based on the transmission representation (buses and lines) of the IEEE 118-bus test system, with a reconfigured generation representation using three regions of the US Western Interconnection from the latest Western Electricity Coordinating Council (WECC) 2024 Common Case [1]. Time-synchronous hourly load, wind, and solar time series are provided for over one year (8784 hours). The public database presented and described in this manuscript will allow researchers to model a test power system using detailed transmission, generation, load, wind, and solar data. This database includes key additional features that add to the current IEEE 118-bus test model, such as: the inclusion of 10 generation technologies with different heat rate functions, minimum stable levels and ramping rates, GHG emissions rates, regulation and contingency reserves, and hourly time series data for one full year for load, wind and solar generation.
Michiels, Bart; Heyvaert, Mieke; Onghena, Patrick
2018-04-01
The conditional power (CP) of the randomization test (RT) was investigated in a simulation study in which three different single-case effect size (ES) measures were used as the test statistics: the mean difference (MD), the percentage of nonoverlapping data (PND), and the nonoverlap of all pairs (NAP). Furthermore, we studied the effect of the experimental design on the RT's CP for three different single-case designs with rapid treatment alternation: the completely randomized design (CRD), the randomized block design (RBD), and the restricted randomized alternation design (RRAD). As a third goal, we evaluated the CP of the RT for three types of simulated data: data generated from a standard normal distribution, data generated from a uniform distribution, and data generated from a first-order autoregressive Gaussian process. The results showed that the MD and NAP perform very similarly in terms of CP, whereas the PND performs substantially worse. Furthermore, the RRAD yielded marginally higher power in the RT, followed by the CRD and then the RBD. Finally, the power of the RT was almost unaffected by the type of the simulated data. On the basis of the results of the simulation study, we recommend at least 20 measurement occasions for single-case designs with a randomized treatment order that are to be evaluated with an RT using a 5% significance level. Furthermore, we do not recommend use of the PND, because of its low power in the RT.
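The CRD variant of the randomization test with the MD statistic is compact enough to sketch directly. The toy implementation below (function and variable names are ours, not the authors') exhaustively enumerates every admissible assignment of measurement occasions to conditions A and B and reports the proportion whose absolute mean difference is at least as extreme as the observed one:

```python
from itertools import combinations

def crd_randomization_test(values, labels):
    """Exhaustive randomization test for a completely randomized design
    (CRD), using the absolute mean difference (MD) as the test statistic."""
    n = len(values)
    k = labels.count("A")
    observed_a = tuple(i for i, lab in enumerate(labels) if lab == "A")

    def md(a_idx):
        a = [values[i] for i in a_idx]
        b = [values[i] for i in range(n) if i not in a_idx]
        return abs(sum(a) / len(a) - sum(b) / len(b))

    observed = md(observed_a)
    # All possible assignments of k occasions to condition A
    assignments = list(combinations(range(n), k))
    extreme = sum(1 for a in assignments if md(a) >= observed - 1e-12)
    return extreme / len(assignments)
```

With only six measurement occasions and a 3/3 split, the smallest attainable two-sided p-value is 2/20 = 0.1, which illustrates why the abstract recommends at least 20 measurement occasions when testing at the 5% significance level.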
NASA Technical Reports Server (NTRS)
Heady, Joel; Pereira, J. Michael; Ruggeri, Charles R.; Bobula, George A.
2009-01-01
A test methodology currently employed for large engines was extended to quantify the ballistic containment capability of a small turboshaft engine compressor case. The approach involved impacting the inside of a compressor case with a compressor blade. A gas gun propelled the blade into the case at energy levels representative of failed compressor blades. The test target was a full compressor case. The aft flange was rigidly attached to a test stand and the forward flange was attached to a main frame to provide accurate boundary conditions. A window machined in the case allowed the projectile to pass through and impact the case wall from the inside with the orientation, direction, and speed that would occur in a blade-out event. High-speed digital video cameras provided accurate velocity and orientation data. Calibrated cameras and digital image correlation software generated full-field displacement and strain information at the back side of the impact point.
Prediction of Acoustic Loads Generated by Propulsion Systems
NASA Technical Reports Server (NTRS)
Perez, Linamaria; Allgood, Daniel C.
2011-01-01
NASA Stennis Space Center is one of the nation's premier facilities for conducting large-scale rocket engine testing. As liquid rocket engines vary in size, so do the acoustic loads that they produce. When these acoustic loads reach very high levels they may cause damage both to humans and to structures surrounding the testing area. To prevent this damage, prediction tools are used to estimate the spectral content and levels of the acoustics being generated by the rocket engine plumes and to model their propagation through the surrounding atmosphere. Prior to the current work, two different acoustic prediction tools were being implemented at Stennis Space Center, each having its own advantages and disadvantages depending on the application. Therefore, a new prediction tool was created, using the NASA SP-8072 handbook as a guide, which would replicate the same prediction methods as the previous codes but eliminate the drawbacks the individual codes had. Aside from replicating the previous modeling capability in a single framework, additional modeling functions were added, thereby expanding the current modeling capability. To verify that the new code could reproduce the same predictions as the previous codes, two verification test cases were defined. These verification test cases also served as validation cases, as the predicted results were compared to actual test data.
On-line determination of transient stability status using multilayer perceptron neural network
NASA Astrophysics Data System (ADS)
Frimpong, Emmanuel Asuming; Okyere, Philip Yaw; Asumadu, Johnson
2018-01-01
A scheme to predict transient stability status following a disturbance is presented. The scheme is activated upon the tripping of a line or bus and operates as follows: Two samples of frequency deviation values at all generator buses are obtained. At each generator bus, the maximum frequency deviation within the two samples is extracted. A vector is then constructed from the extracted maximum frequency deviations. The Euclidean norm of the constructed vector is calculated and then fed as input to a trained multilayer perceptron neural network which predicts the stability status of the system. The scheme was tested using data generated from the New England test system. The scheme successfully predicted the stability status of all two hundred and five disturbance test cases.
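The input construction described above is simple enough to sketch in a few lines. In this illustrative Python fragment (function and variable names are ours), the per-bus maximum absolute frequency deviation over the two samples is collected into a vector whose Euclidean norm becomes the single feature fed to the trained MLP:

```python
import math

def mlp_input(freq_dev_samples):
    """freq_dev_samples: one (sample1, sample2) pair of frequency-deviation
    values per generator bus, taken after the trip event."""
    # Maximum absolute frequency deviation within the two samples, per bus
    max_devs = [max(abs(s1), abs(s2)) for s1, s2 in freq_dev_samples]
    # Euclidean norm of the vector of maxima -> scalar input to the MLP
    return math.sqrt(sum(d * d for d in max_devs))
```

The scalar returned here would then be passed to the trained multilayer perceptron, which outputs the predicted stability status.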
Muthukumar, Alagarraju; Alatoom, Adnan; Burns, Susan; Ashmore, Jerry; Kim, Anne; Emerson, Brian; Bannister, Edward; Ansari, M Qasim
2015-01-01
To assess the false-positive and false-negative rates of a 4th-generation human immunodeficiency virus (HIV) assay, the Abbott ARCHITECT, vs 2 HIV 3rd-generation assays, the Siemens Centaur and the Ortho-Clinical Diagnostics Vitros. We examined 123 patient specimens. In the first phase of the study, we compared 99 specimens that had a positive screening result via the 3rd-generation Vitros assay (10 positive, 82 negative, and 7 indeterminate via confirmatory immunofluorescent assay [IFA]/Western blot [WB] testing). In the second phase, we assessed 24 HIV-1 RNA-positive (positive result via the nucleic acid amplification test [NAAT] and negative/indeterminate results via the WB test) specimens harboring acute HIV infection. The 4th-generation ARCHITECT assay yielded fewer false-positive results (n = 2) than the 3rd-generation Centaur (n = 9; P = .02) and Vitros (n = 82; P < .001) assays. One confirmed positive case had a false-negative result via the Centaur assay. When specimens from the 24 patients with acute HIV-1 infection were tested, the ARCHITECT assay yielded fewer false-negative results (n = 5) than the Centaur (n = 10) (P = .13) and the other 3rd-generation tests (n = 16) (P = .002). This study indicates that the 4th-generation ARCHITECT HIV assay yields fewer false-positive and false-negative results than the 3rd-generation HIV assays we tested. Copyright © by the American Society for Clinical Pathology (ASCP).
HAL/S-360 compiler test activity report
NASA Technical Reports Server (NTRS)
Helmers, C. T.
1974-01-01
The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.
ERIC Educational Resources Information Center
Palka, Sean
2015-01-01
This research details a methodology designed for creating content in support of various phishing prevention tasks including live exercises and detection algorithm research. Our system uses probabilistic context-free grammars (PCFG) and variable interpolation as part of a multi-pass method to create diverse and consistent phishing email content on…
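A minimal sketch of the PCFG-plus-interpolation idea is shown below. The grammar, symbols, and function names are invented for illustration only; the methodology's actual grammars target realistic, campaign-consistent phishing text. Pass 1 expands nonterminals by sampling weighted productions; pass 2 interpolates campaign variables into the expanded template:

```python
import random

# Toy probabilistic grammar (hypothetical): each nonterminal maps to a list
# of (probability, production) pairs.
GRAMMAR = {
    "EMAIL": [(1.0, ["GREETING", " ", "BODY"])],
    "GREETING": [(0.7, ["Dear {name},"]), (0.3, ["Hello {name},"])],
    "BODY": [(0.5, ["Your {service} account requires verification."]),
             (0.5, ["Please review the attached invoice."])],
}

def expand(symbol, rng):
    """Pass 1: recursively expand nonterminals, sampling productions
    according to their probabilities."""
    if symbol not in GRAMMAR:
        return symbol  # terminal string
    weights, productions = zip(*GRAMMAR[symbol])
    chosen = rng.choices(productions, weights=weights)[0]
    return "".join(expand(part, rng) for part in chosen)

def generate_email(variables, seed=None):
    """Pass 2: interpolate per-campaign variables into the template."""
    template = expand("EMAIL", random.Random(seed))
    return template.format(**variables)
```

Because productions are sampled, repeated calls yield diverse emails, while the shared grammar and variable set keep them consistent with one another.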
Design and analysis of a novel doubly salient permanent- magnet generator
NASA Astrophysics Data System (ADS)
Sarlioglu, Bulent
Improvements in permanent magnets and power electronics technologies have made it possible to devise different configurations of electrical machines which were not previously possible to implement. In this dissertation, a novel Doubly Salient Permanent Magnet (DSPM) generator has been designed, analyzed, and tested. The DSPM generator has four stator poles and six rotor poles. Two high density permanent magnets are located in the stator yoke. Since there are no windings or permanent magnets in the rotor, the DSPM generator has several advantages: the rotor has low inertia, no copper loss, no PM attachments, no brushes, and no slip rings. This type of rotor can be manufactured easily, and can be run at very high speeds as in the case of a switched reluctance machine. Compared to induction and switched reluctance machines, the DSPM generator can produce more power from the same geometry. Moreover, the efficiency of the DSPM generator is higher, since there is no copper loss associated with excitation of the machine. Another advantage of the DSPM generator is that the output AC voltage can easily be rectified by a diode bridge rectifier, while in the case of the switched reluctance machine one needs to use active semiconductor switches for power generation. If greater utilization and control of power production capability are desired, the AC output of the DSPM generator can be rectified using an active converter. In this dissertation, a novel doubly salient permanent magnet generator is introduced. First, the theory of the DSPM generator is given. Later, this novel generator is investigated using conventional magnetic circuits, nonlinear finite element analysis, and simulations with first order approximations and nonlinear modeling. It is compared with other generators. Static and no-load testing of the prototype DSPM generator are presented, and generator performance is evaluated with various power electronic circuits.
A prevalence-based association test for case-control studies.
Ryckman, Kelli K; Jiang, Lan; Li, Chun; Bartlett, Jacquelaine; Haines, Jonathan L; Williams, Scott M
2008-11-01
Genetic association is often determined in case-control studies by the differential distribution of alleles or genotypes. Recent work has demonstrated that association can also be assessed by deviations from the expected distributions of alleles or genotypes. Specifically, multiple methods motivated by the principles of Hardy-Weinberg equilibrium (HWE) have been developed. However, these methods do not take into account many of the assumptions of HWE. Therefore, we have developed a prevalence-based association test (PRAT) as an alternative method for detecting association in case-control studies. This method, also motivated by the principles of HWE, uses an estimated population allele frequency to generate expected genotype frequencies instead of using the case and control frequencies separately. Our method often has greater power, under a wide variety of genetic models, to detect association than genotypic, allelic or Cochran-Armitage trend association tests. Therefore, we propose PRAT as a powerful alternative method of testing for association.
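The HWE-motivated core of the idea can be illustrated as follows. This sketch is not the authors' exact PRAT statistic; it simply derives expected genotype frequencies from a pooled allele-frequency estimate and measures a group's deviation with a chi-square-style quantity, all names being ours:

```python
def expected_genotype_freqs(p):
    """Hardy-Weinberg expected frequencies for genotypes (AA, Aa, aa),
    given an estimated population frequency p of allele A."""
    q = 1.0 - p
    return (p * p, 2.0 * p * q, q * q)

def pooled_allele_freq(case_counts, control_counts):
    """Estimate the allele-A frequency from genotype counts (AA, Aa, aa)
    pooled over cases and controls."""
    totals = [c + k for c, k in zip(case_counts, control_counts)]
    n = sum(totals)
    return (2 * totals[0] + totals[1]) / (2 * n)

def deviation_statistic(observed, p):
    """Chi-square-style deviation of one group's observed genotype counts
    from the HWE expectation at allele frequency p."""
    n = sum(observed)
    return sum((o - e * n) ** 2 / (e * n)
               for o, e in zip(observed, expected_genotype_freqs(p)))
```

Large deviations in cases (or controls) relative to the expectation built from the pooled frequency suggest association, which is the prevalence-based twist on HWE-style testing.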
NASA Astrophysics Data System (ADS)
Faria, J. M.; Mahomad, S.; Silva, N.
2009-05-01
The deployment of complex safety-critical applications requires rigorous techniques and powerful tools for both the development and V&V stages. Model-based technologies are increasingly being used to develop safety-critical software, and arguably, turning to them can bring significant benefits to such processes, along with new challenges. This paper presents the results of a research project in which we extended current V&V methodologies to UML/SysML models, aiming to answer the demands related to validation issues. Two quite different but complementary approaches were investigated: (i) model checking and (ii) the extraction of robustness test cases from the same models. These two approaches do not overlap, and when combined they provide a wider-reaching model/design validation ability than either one alone, thus offering improved safety assurance. Results are very encouraging, even though they fell short of the desired outcome in the case of model checking and are not yet fully mature in the case of robustness test case extraction. For model checking, we verified that the automatic model validation process can become fully operational, and even expanded in scope, once tool vendors (inevitably) help to improve the XMI standard interoperability situation. The robustness test case extraction methodology produced interesting early results but needs further systematisation and consolidation to produce results in a more predictable fashion and to reduce reliance on experts' heuristics. Finally, further improvement and innovation opportunities were immediately apparent for both approaches: circumventing current limitations in XMI interoperability on the one hand, and, on the other, bringing test case specification onto the same graphical level as the models themselves and then automating the generation of executable test cases from standard UML notation.
Test Cases for the Benchmark Active Controls: Spoiler and Control Surface Oscillations and Flutter
NASA Technical Reports Server (NTRS)
Bennett, Robert M.; Scott, Robert C.; Wieseman, Carol D.
2000-01-01
As a portion of the Benchmark Models Program at NASA Langley, a simple generic model was developed for active controls research and was called BACT, for Benchmark Active Controls Technology model. This model was based on the previously tested Benchmark Models rectangular wing with the NACA 0012 airfoil section that was mounted on the Pitch and Plunge Apparatus (PAPA) for flutter testing. The BACT model had an upper surface spoiler, a lower surface spoiler, and a trailing edge control surface for use in flutter suppression and dynamic response excitation. Previous experience with flutter suppression indicated a need for measured control surface aerodynamics for accurate control law design. Three different types of flutter instability boundaries had also been determined for the NACA 0012/PAPA model: a classical flutter boundary, a transonic stall flutter boundary at angle of attack, and a plunge instability near M = 0.9. Therefore an extensive set of steady and control surface oscillation data was generated spanning the range of the three types of instabilities. This information was subsequently used to design control laws to suppress each flutter instability. There have been three tests of the BACT model. The objective of the first test, TDT Test 485, was to generate a data set of steady and unsteady control surface effectiveness data, and to determine the open loop dynamic characteristics of the control systems including the actuators. Unsteady pressures, loads, and transfer functions were measured. The other two tests, TDT Test 502 and TDT Test 518, were primarily oriented towards active controls research, but some data supplementary to the first test were obtained. Dynamic response of the flexible system to control surface excitation and open loop flutter characteristics were determined during Test 502. Loads were not measured during the last two tests. During these tests, a database of over 3000 data sets was obtained.
A reasonably extensive subset of the data sets from the first two tests has been chosen for Test Cases for computational comparisons, concentrating on static conditions and cases with harmonically oscillating control surfaces. Several flutter Test Cases from both tests have also been included. Some aerodynamic comparisons with the BACT data have been made using computational fluid dynamics codes at the Navier-Stokes level (and in the accompanying chapter SC). Some mechanical and active control studies have been presented. In this report several Test Cases are selected to illustrate trends for a variety of different conditions with emphasis on transonic flow effects. Cases for static angles of attack and static trailing-edge and upper-surface spoiler deflections are included for a range of conditions near those for the oscillation cases. Cases for trailing-edge control and upper-surface spoiler oscillations for a range of Mach numbers, angles of attack, and static control deflections are included. Cases for all three types of flutter instability are selected. In addition, some cases are included for dynamic response measurements during forced oscillations of the controls on the flexible mount. An overview of the model and tests is given, and the standard formulary for these data is listed. Some sample data and sample results of calculations are presented. Only the static pressures and the first harmonic real and imaginary parts of the pressures are included in the data for the Test Cases, but digitized time histories have been archived. The data for the Test Cases are also available as separate electronic files.
McDermott, K B; Roediger, H L
1996-03-01
Three experiments examined whether a conceptual implicit memory test (specifically, category instance generation) would exhibit repetition effects similar to those found in free recall. The transfer appropriate processing account of dissociations among memory tests led us to predict that the tests would show parallel effects; this prediction was based upon the theory's assumption that conceptual tests will behave similarly as a function of various independent variables. In Experiment 1, conceptual repetition (i.e., following a target word [e.g., puzzles] with an associate [e.g., jigsaw]) did not enhance priming on the instance generation test relative to the condition of simply presenting the target word once, although this manipulation did affect free recall. In Experiment 2, conceptual repetition was achieved by following a picture with its corresponding word (or vice versa). In this case, there was an effect of conceptual repetition on free recall but no reliable effect on category instance generation or category cued recall. In addition, we obtained a picture superiority effect in free recall but not in category instance generation. In the third experiment, when the same study sequence was used as in Experiment 1, but with instructions that encouraged relational processing, priming on the category instance generation task was enhanced by conceptual repetition. Results demonstrate that conceptual memory tests can be dissociated and present problems for Roediger's (1990) transfer appropriate processing account of dissociations between explicit and implicit tests.
PE Workshop II. Proceedings of the Second Parabolic Equation Workshop
1993-01-01
13) The values for the Padé coefficients tabulated in [8] were generated using the same accuracy and stability constraints used to generate the Padé...singly to that subject. With the expectation that such a workshop would soon occur, the general topic of underwater acoustic scattering was minimized...were not generated for all of the test cases. The available reference solutions were forwarded to the participants on 23 April 1991. Since the workshop
NASA Astrophysics Data System (ADS)
Glumac, Nick; Clemenson, Michael; Guadarrama, Jose; Krier, Herman
2015-06-01
Aluminum-cased warheads have been observed to generate enhanced blast and target damage due to reactivity of the aluminum fragments with ambient air. This effect can more than double the output of a conventional warhead. The mechanism by which the aluminum reacts under these conditions remains poorly understood. We undertake a highly controlled experimental study to investigate the phenomenon of aluminum reaction under explosive loading. Experiments are conducted with Al 6061 casings and PBXN-9 explosive with a fixed charge-to-case mass ratio of 1:2. Results are compared to inert casings (steel), as well as to tests performed in nitrogen environments to isolate aerobic and anaerobic effects. Padded walls are used in some tests to isolate the effects of impact-induced reactions, which are found to be non-negligible. Finally, blast wave measurements and quasi-static pressure measurements are used to separate the fraction of case reaction that is fast enough to drive the primary blast wave from the later-time reaction that generates temperature and overpressure only in the late-time fireball. Fragment size distributions, including those in the micron-scale range, are collected and quantified.
NASA Astrophysics Data System (ADS)
Moneta, Diana; Mora, Paolo; Viganò, Giacomo; Alimonti, Gianluca
2014-12-01
The diffusion of Distributed Generation (DG) based on Renewable Energy Sources (RES) requires new strategies to ensure reliable and economic operation of the distribution networks and to support the diffusion of DG itself. An advanced algorithm (DISCoVER - DIStribution Company VoltagE Regulator) is being developed to optimize the operation of active networks by means of an advanced voltage control based on several regulations. Starting from forecasted load and generation, real on-field measurements, technical constraints, and costs for each resource, the algorithm generates for each time period a set of commands for controllable resources that achieves the technical goals while minimizing the overall cost. Before integrating the controller into the telecontrol system of the real networks, and in order to validate the proper behaviour of the algorithm and to identify possible critical conditions, a complete simulation phase has started. The first step concerns the definition of a wide range of "case studies": combinations of network topology, technical constraints and targets, load and generation profiles, and resource "costs" that define a valid context in which to test the algorithm, with particular focus on battery and RES management. First results achieved from simulation activity on test networks (based on real MV grids) and actual battery characteristics are given, together with prospective performance in real-case applications.
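As a toy illustration of per-period command selection, the greedy sketch below (ours, and far simpler than the actual optimizer, which enforces voltage constraints across a full network) picks the cheapest combination of controllable resources that covers the correction a period requires:

```python
def dispatch(required_correction, resources):
    """Greedy per-period command selection: activate the cheapest
    resources first until the required correction is covered.

    resources: list of (name, capability, unit_cost) tuples; all names
    and numbers here are hypothetical."""
    plan, remaining = [], required_correction
    for name, capability, unit_cost in sorted(resources, key=lambda r: r[2]):
        if remaining <= 0:
            break
        use = min(capability, remaining)   # command issued to this resource
        plan.append((name, use))
        remaining -= use
    return plan, remaining  # remaining > 0 means the goal is unreachable
```

For example, dispatch(0.8, [("battery", 0.5, 1.0), ("RES_curtailment", 1.0, 5.0)]) exhausts the cheap battery before paying for curtailment, mirroring the cost-minimizing intent described above.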
Black-Box System Testing of Real-Time Embedded Systems Using Random and Search-Based Testing
NASA Astrophysics Data System (ADS)
Arcuri, Andrea; Iqbal, Muhammad Zohaib; Briand, Lionel
Testing real-time embedded systems (RTES) is in many ways challenging. Thousands of test cases can potentially be executed on an industrial RTES. Given the magnitude of testing at the system level, only a fully automated approach can really scale up to test industrial RTES. In this paper we take a black-box approach and model the RTES environment using the UML/MARTE international standard. Our main motivation is to provide a more practical approach to the model-based testing of RTES by allowing system testers, who are often not familiar with the system design but know the application domain well enough, to model the environment to enable test automation. Environment models can support the automation of three tasks: the code generation of an environment simulator, the selection of test cases, and the evaluation of their expected results (oracles). In this paper, we focus on the second task (test case selection) and investigate three test automation strategies using inputs from UML/MARTE environment models: Random Testing (baseline), Adaptive Random Testing, and Search-Based Testing (using Genetic Algorithms). Based on one industrial case study and three artificial systems, we show that, in general, no single technique is better than the others. Which test selection technique to use is determined by the failure rate (testing stage) and the execution time of test cases. Finally, we propose a practical process to combine the use of all three test strategies.
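The Adaptive Random Testing strategy compared in the paper can be sketched in one dimension: each new test is the random candidate farthest from all previously executed tests, which spreads tests more evenly than pure random sampling. The domain, candidate count and distance metric here are assumptions for illustration, not the authors' implementation.

```python
import random

def adaptive_random_tests(n_tests, n_candidates=10, seed=0):
    """Adaptive random testing over a 1-D input domain [0, 1]: draw a pool
    of random candidates and keep the one whose nearest already-executed
    test is farthest away."""
    rng = random.Random(seed)
    executed = [rng.random()]          # the first test is purely random
    while len(executed) < n_tests:
        candidates = [rng.random() for _ in range(n_candidates)]
        # pick the candidate maximizing the distance to its nearest neighbor
        best = max(candidates, key=lambda c: min(abs(c - e) for e in executed))
        executed.append(best)
    return executed

tests = adaptive_random_tests(20)
```

The same idea extends to multi-dimensional inputs by swapping in a suitable distance function.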
Paternal age related schizophrenia (PARS): Latent subgroups detected by k-means clustering analysis.
Lee, Hyejoo; Malaspina, Dolores; Ahn, Hongshik; Perrin, Mary; Opler, Mark G; Kleinhaus, Karine; Harlap, Susan; Goetz, Raymond; Antonius, Daniel
2011-05-01
Paternal age related schizophrenia (PARS) has been proposed as a subgroup of schizophrenia with distinct etiology, pathophysiology and symptoms. This study uses a k-means clustering analysis approach to generate hypotheses about differences between PARS and other cases of schizophrenia. We studied PARS (operationally defined as not having any family history of schizophrenia among first and second-degree relatives and fathers' age at birth ≥ 35 years) in a series of schizophrenia cases recruited from a research unit. Data were available on demographic variables, symptoms (Positive and Negative Syndrome Scale; PANSS), cognitive tests (Wechsler Adult Intelligence Scale-Revised; WAIS-R) and olfaction (University of Pennsylvania Smell Identification Test; UPSIT). We conducted a series of k-means clustering analyses to identify clusters of cases containing high concentrations of PARS. Two analyses generated clusters with high concentrations of PARS cases. The first analysis (N=136; PARS=34) revealed a cluster containing 83% PARS cases, in which the patients showed a significant discrepancy between verbal and performance intelligence. The mean paternal and maternal ages were 41 and 33, respectively. The second analysis (N=123; PARS=30) revealed a cluster containing 71% PARS cases, of which 93% were females; the mean age of onset of psychosis (17.2 years) was significantly earlier. These results strengthen the evidence that PARS cases differ from other patients with schizophrenia. Hypothesis-generating findings suggest that features of PARS may include a discrepancy between verbal and performance intelligence and, in females, an early age of onset. These findings provide a rationale for separating these phenotypes from others in future clinical, genetic and pathophysiologic studies of schizophrenia and in considering responses to treatment. Copyright © 2011 Elsevier B.V. All rights reserved.
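As a rough illustration of the clustering approach (not the study's actual analysis, which clustered on PANSS, WAIS-R and UPSIT variables), a plain Lloyd's k-means on hypothetical paternal ages shows how a cluster enriched in PARS cases could be detected:

```python
def kmeans(points, k, iters=50):
    """Plain Lloyd's algorithm on 1-D data (here, paternal age at birth).
    Returns the centroids and the cluster label of each point."""
    centroids = sorted(points)[:: max(1, len(points) // k)][:k]
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: abs(p - centroids[j]))
                  for p in points]
        for j in range(k):
            members = [p for p, l in zip(points, labels) if l == j]
            if members:
                centroids[j] = sum(members) / len(members)
    return centroids, labels

# Hypothetical ages, with PARS operationalized here as paternal age >= 35.
ages = [22, 24, 25, 27, 28, 30, 38, 40, 41, 43, 45, 47]
cents, labels = kmeans(ages, 2)
# Share of PARS cases inside each cluster.
pars_share = [sum(1 for a, l in zip(ages, labels) if l == j and a >= 35) /
              max(1, labels.count(j)) for j in range(2)]
```

On this toy data one cluster collects all the older-father cases, mirroring the "high concentration of PARS" clusters reported above.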
NASA Technical Reports Server (NTRS)
Brown, David B.
1990-01-01
The results of research and development efforts are described for Task one, Phase two of a general project entitled The Development of a Program Analysis Environment for Ada. The scope of this task includes the design and development of a prototype system for testing Ada software modules at the unit level. The system is called Query Utility Environment for Software Testing of Ada (QUEST/Ada). The prototype for condition coverage provides a platform that implements expert system interaction with program testing. The expert system can modify data in the instrumented source code in order to achieve coverage goals. Given this initial prototype, it is possible to evaluate the rule base in order to develop improved rules for test case generation. The goals of Phase two are the following: (1) to continue to develop and improve the current user interface to support the other goals of this research effort (i.e., those related to improved testing efficiency and increased code reliability); (2) to develop and empirically evaluate a succession of alternative rule bases for the test case generator such that the expert system achieves coverage in a more efficient manner; and (3) to extend the concepts of the current test environment to address the issues of Ada concurrency.
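A rule-based test data generator in miniature might look like the following sketch, where each rule maps a parameter specification to characteristic boundary values rather than sampling the domain at random. The parameter specs and rules are hypothetical, not QUEST/Ada's rule base.

```python
from itertools import product

def rule_based_tests(params):
    """Tiny rule-based test data generator. Parameter specs are
    hypothetical: ('int', lo, hi) or ('bool',). Integer rules pick the
    boundaries, their neighbors and the midpoint; boolean rules pick both
    truth values. The cross product of the per-parameter values forms the
    test cases."""
    def values(spec):
        if spec[0] == 'int':
            _, lo, hi = spec
            return [lo, lo + 1, (lo + hi) // 2, hi - 1, hi]   # boundary rules
        if spec[0] == 'bool':
            return [True, False]
        raise ValueError(spec)

    return list(product(*[values(s) for s in params]))

cases = rule_based_tests([('int', 0, 10), ('bool',)])
```

This is the sense in which a rule base "targets" characteristic inputs, in contrast to the 100,000 random cases mentioned in the header record.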
van Dam, Alje P; van Ogtrop, Marc L; Golparian, Daniel; Mehrtens, Jan; de Vries, Henry J C; Unemo, Magnus
2014-11-01
We describe the first case in the Netherlands of treatment failure of gonorrhoea with a third-generation cephalosporin, cefotaxime 1 g intramuscularly. The case was from a high-frequency transmitting population (men who have sex with men) and was caused by the internationally spreading multidrug-resistant gonococcal NG-MAST ST1407 clone. The patient was clinically cured after treatment with ceftriaxone 500 mg intramuscularly; ceftriaxone is the only third-generation cephalosporin that should be used for first-line empiric treatment of gonorrhoea. Increased awareness of failures with third-generation cephalosporins, enhanced monitoring and appropriate verification of treatment failures, including more frequent test-of-cures, and strict adherence to regularly updated treatment guidelines are essential globally.
Laidler, Matthew R; Tourdjman, Mathieu; Buser, Genevieve L; Hostetler, Trevor; Repp, Kimberly K; Leman, Richard; Samadpour, Mansour; Keene, William E
2013-10-01
An outbreak of Escherichia coli O157:H7 was identified in Oregon through an increase in Shiga toxin-producing E. coli cases with an indistinguishable, novel pulsed-field gel electrophoresis (PFGE) subtyping pattern. We defined confirmed cases as persons from whom E. coli O157:H7 with the outbreak PFGE pattern was cultured during July-August 2011, and presumptive cases as persons having a household relationship with a case testing positive for E. coli O157:H7 and coincident diarrheal illness. We conducted an investigation that included structured hypothesis-generating interviews, a matched case-control study, and environmental and traceback investigations. We identified 15 cases. Six cases were hospitalized, including 4 with hemolytic uremic syndrome (HUS). Two cases with HUS died. Illness was significantly associated with strawberry consumption from roadside stands or farmers' markets (matched odds ratio, 19.6; 95% confidence interval, 2.9-∞). A single farm was identified as the source of contaminated strawberries. Ten of 111 (9%) initial environmental samples from farm A were positive for E. coli O157:H7. All samples testing positive for E. coli O157:H7 contained deer feces, and 5 tested farm fields had ≥ 1 sample positive with the outbreak PFGE pattern. The investigation identified fresh strawberries as a novel vehicle for E. coli O157:H7 infection, implicated deer feces as the source of contamination, and highlights problems concerning produce contamination by wildlife and regulatory exemptions for locally grown produce. A comprehensive hypothesis-generating questionnaire enabled rapid identification of the implicated product. Good agricultural practices are key barriers to wildlife fecal contamination of produce.
Complementing in vitro screening assays with in silico ...
High-throughput in vitro assays offer a rapid, cost-efficient means to screen thousands of chemicals across hundreds of pathway-based toxicity endpoints. However, one main concern with the use of in vitro assays is the erroneous omission of chemicals that are inactive under assay conditions but that can generate active metabolites under in vivo conditions. To address this potential issue, a case study will be presented to demonstrate the use of in silico tools to identify inactive parents with the ability to generate active metabolites. This case study used the results from an orthogonal assay designed to improve confidence in the identification of active chemicals tested across eighteen estrogen receptor (ER)-related in vitro assays by accounting for technological limitations inherent within each individual assay. From the 1,812 chemicals tested within the orthogonal assay, 1,398 were considered inactive. These inactive chemicals were analyzed using Chemaxon Metabolizer software to predict the first- and second-generation metabolites. From the nearly 1,400 inactive chemicals, over 2,200 first-generation (i.e., primary) metabolites and over 5,500 second-generation (i.e., secondary) metabolites were predicted. Nearly 70% of primary metabolites were immediately detoxified or converted to other metabolites, while over 70% of secondary metabolites remained stable. Among these predicted metabolites, those that are most likely to be produced and remain
Hypothesis test for synchronization: twin surrogates revisited.
Romano, M Carmen; Thiel, Marco; Kurths, Jürgen; Mergenthaler, Konstantin; Engbert, Ralf
2009-03-01
The method of twin surrogates has been introduced to test for phase synchronization of complex systems in the case of passive experiments. In this paper we derive new analytical expressions for the number of twins depending on the size of the neighborhood, as well as on the length of the trajectory. This allows us to determine the optimal parameters for the generation of twin surrogates. Furthermore, we determine the quality of the twin surrogates with respect to several linear and nonlinear statistics depending on the parameters of the method. In the second part of the paper we perform a hypothesis test for phase synchronization in the case of experimental data from fixational eye movements. These miniature eye movements have been shown to play a central role in neural information processing underlying the perception of static visual scenes. The high number of data sets (21 subjects and 30 trials per person) allows us to compare the generated twin surrogates with the "natural" surrogates that correspond to the different trials. We show that the generated twin surrogates reproduce very well all linear and nonlinear characteristics of the underlying experimental system. The synchronization analysis of fixational eye movements by means of twin surrogates reveals that the synchronization between the left and right eye is significant, indicating either that the centers in the brain stem generating fixational eye movements are closely linked, or, alternatively, that there is only one center controlling both eyes.
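The surrogate-based hypothesis-testing logic can be sketched with simple shuffle surrogates standing in for twin surrogates: compute a statistic on the observed pair of signals, recompute it on many surrogates that destroy the structure under test, and read the p-value off the rank of the observed value. The statistic, data and surrogate construction here are illustrative assumptions, not the paper's method.

```python
import random

def surrogate_test(x, y, n_surrogates=99, seed=1):
    """Rank-based surrogate hypothesis test: is the observed coupling
    between x and y larger than expected after destroying the temporal
    structure of y by shuffling? Returns (observed statistic, p-value)."""
    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
        da = sum((ai - ma) ** 2 for ai in a) ** 0.5
        db = sum((bi - mb) ** 2 for bi in b) ** 0.5
        return num / (da * db)

    rng = random.Random(seed)
    observed = abs(corr(x, y))
    exceed = 0
    for _ in range(n_surrogates):
        ys = y[:]
        rng.shuffle(ys)                 # surrogate: same values, no structure
        if abs(corr(x, ys)) >= observed:
            exceed += 1
    return observed, (exceed + 1) / (n_surrogates + 1)   # rank p-value

x = [i % 10 for i in range(100)]
y = [(i % 10) + 0.1 for i in range(100)]   # strongly coupled to x
obs, p = surrogate_test(x, y)
```

Twin surrogates replace the shuffle step with trajectories reconstructed from recurrence twins, preserving linear properties while breaking the coupling under test.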
NASA Astrophysics Data System (ADS)
Shao, Hongbing
Software testing of scientific software systems often suffers from the test oracle problem, i.e., the lack of test oracles. The Amsterdam discrete dipole approximation code (ADDA) is a scientific software system that can be used to simulate light scattering by scatterers of various types. Testing of ADDA suffers from this "test oracle problem". In this thesis work, I established a testing framework for scientific software systems and evaluated this framework using ADDA as a case study. To test ADDA, I first used the CMMIE code as a pseudo-oracle to test ADDA in simulating light scattering by a homogeneous sphere scatterer. Comparable results were obtained between ADDA and the CMMIE code. This validated ADDA for use with homogeneous sphere scatterers. Then I used an experimental result obtained for light scattering by a homogeneous sphere to validate the use of ADDA with sphere scatterers. ADDA produced a light scattering simulation comparable to the experimentally measured result. This further validated the use of ADDA for simulating light scattering by sphere scatterers. Then I used metamorphic testing to generate test cases covering scatterers of various geometries, orientations, and degrees of homogeneity or non-homogeneity. ADDA was tested under each of these test cases and all tests passed. The use of statistical analysis together with metamorphic testing is discussed as a future direction. In short, using ADDA as a case study, I established a testing framework, including the use of pseudo-oracles, experimental results and metamorphic testing techniques, to test scientific software systems that suffer from test oracle problems. Each of these techniques is necessary and contributes to the testing of the software under test.
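A metamorphic test sidesteps the oracle problem by checking a relation between two runs of the program instead of an exact expected output. A minimal sketch, using a hypothetical Rayleigh-like stand-in for the scattering code (not ADDA itself):

```python
import math

def scattering_intensity(radius, wavelength):
    """Stand-in for a light-scattering code (hypothetical model, used only
    to illustrate metamorphic testing): Rayleigh-like intensity scaling as
    radius^6 / wavelength^4."""
    return radius ** 6 / wavelength ** 4

def metamorphic_scale_test(f, radius, wavelength, s=2.0):
    """Metamorphic relation: scaling the scatterer and the wavelength by the
    same factor s must scale the output by s^2 for this model. No exact
    oracle value is needed, only the relation between the two runs."""
    base = f(radius, wavelength)
    follow_up = f(s * radius, s * wavelength)
    return math.isclose(follow_up, (s ** 2) * base, rel_tol=1e-9)

ok = metamorphic_scale_test(scattering_intensity, 0.5, 0.65)
```

Geometry rotations and homogeneity transformations play the same role in the thesis: each transformation of the input comes with a predicted transformation of the output.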
A Solution Adaptive Technique Using Tetrahedral Unstructured Grids
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.
2000-01-01
An adaptive unstructured grid refinement technique has been developed and successfully applied to several three-dimensional inviscid flow test cases. The method is based on a combination of surface mesh subdivision and local remeshing of the volume grid. Simple functions of flow quantities are employed to detect dominant features of the flowfield. The method is designed for modular coupling with various error/feature analyzers and flow solvers. Several steady-state, inviscid flow test cases are presented to demonstrate the applicability of the method for solving practical three-dimensional problems. In all cases, accurate solutions featuring complex, nonlinear flow phenomena such as shock waves and vortices have been generated automatically and efficiently.
IQ Scores Should Be Corrected for the Flynn Effect in High-Stakes Decisions
ERIC Educational Resources Information Center
Fletcher, Jack M.; Stuebing, Karla K.; Hughes, Lisa C.
2010-01-01
IQ test scores should be corrected for high stakes decisions that employ these assessments, including capital offense cases. If scores are not corrected, then diagnostic standards must change with each generation. Arguments against corrections, based on standards of practice, information present and absent in test manuals, and related issues,…
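The correction at issue can be sketched numerically, assuming the commonly cited upward drift of about 0.3 IQ points per year since test norming (the exact rate is debated, which is part of the controversy the abstract alludes to):

```python
def flynn_corrected_iq(observed_iq, test_norm_year, administration_year,
                       points_per_year=0.3):
    """Flynn-effect correction sketch: population scores drift upward
    roughly 0.3 IQ points per year after a test is normed, so an observed
    score on an aging test is adjusted downward by the elapsed years."""
    years = administration_year - test_norm_year
    return observed_iq - points_per_year * years

# A score of 75 on a test normed 20 years before administration
# corrects to 69 -- potentially decisive in a capital-offense evaluation.
corrected = flynn_corrected_iq(75, 1990, 2010)
```

The example shows why the stakes are high: the correction can move a score across a diagnostic cutoff such as 70.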
Orbit attitude processor. STS-1 bench program verification test plan
NASA Technical Reports Server (NTRS)
Mcclain, C. R.
1980-01-01
A plan for the static verification of the STS-1 ATT PROC ORBIT software requirements is presented. The orbit version of the SAPIENS bench program is used to generate the verification data. A brief discussion of the simulation software and flight software modules is presented along with a description of the test cases.
Assessment of microwave-based clinical waste decontamination unit.
Hoffman, P N; Hanley, M J
1994-12-01
A clinical waste decontamination unit that used microwave-generated heat was assessed for operator safety and efficacy. Tests with loads artificially contaminated with aerosol-forming particles showed that no particles were detected outside the machine provided the seals and covers were correctly seated. Thermometric measurement of a self-generated steam decontamination cycle was used to determine the parameters needed to ensure heat disinfection of the waste reception hopper, prior to entry for maintenance or repair. Bacterial and thermometric test pieces were passed through the machine within a full load of clinical waste. These test pieces, designed to represent a worst case situation, were enclosed in aluminium foil to shield them from direct microwave energy. None of the 100 bacterial test pieces yielded growth on culture and all 100 thermal test pieces achieved temperatures in excess of 99 degrees C during their passage through the decontamination unit. It was concluded that this method may be used to render safe the bulk of ward-generated clinical waste.
Electrical Aspects of Flames in Microgravity Combustion
NASA Technical Reports Server (NTRS)
Dunn-Rankin, D.; Strayer, B.; Weinberg, F.; Carleton, F.
1999-01-01
A principal characteristic of combustion in microgravity is the absence of buoyancy-driven flows. In some cases, such as for spherically symmetrical droplet burning, the absence of buoyancy is desirable for matching analytical treatments with experiments. In other cases, however, it can be more valuable to arbitrarily control the flame's convective environment independent of the environmental gravitational condition. To accomplish this, we propose the use of ion-generated winds driven by electric fields to control the local convection of flames. Such control can produce reduced-buoyancy (effectively zero-buoyancy) conditions in the laboratory in 1-g, facilitating a wide range of laser diagnostics that can probe the system without the special packaging required for drop tower or flight tests. In addition, the ionic winds generated by electric fields allow varying gravitational convection equivalents even if the test occurs in reduced gravity environments.
NASA Astrophysics Data System (ADS)
Othman, Rozmie R.; Ahmad, Mohd Zamri Zahir; Ali, Mohd Shaiful Aziz Rashid; Zakaria, Hasneeza Liza; Rahman, Md. Mostafijur
2015-05-01
Consuming 40 to 50 percent of software development cost, software testing is one of the most resource-consuming activities in the software development lifecycle. To ensure an acceptable level of quality and reliability of a typical software product, it is desirable to test every possible combination of input data under various configurations. Due to the combinatorial explosion problem, considering exhaustive testing is practically impossible. Resource constraints, costing factors as well as strict time-to-market deadlines are amongst the main factors that inhibit such consideration. Earlier work suggests that a sampling strategy (i.e. one based on t-way parameter interaction, known as t-way testing) can be effective in reducing the number of test cases without affecting the fault detection capability. However, for a very large system, even a t-way strategy will produce a large test suite that needs to be executed. In the end, only part of the planned test suite can be executed in order to meet the aforementioned constraints. Here, there is a need for test engineers to measure the effectiveness of a partially executed test suite in order for them to assess the risk they have to take. Motivated by the abovementioned problem, this paper presents an effectiveness comparison of partially executed t-way test suites generated by existing strategies using the tuples coverage method. Here, test engineers can predict the effectiveness of the testing process if only part of the original test cases is executed.
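The tuples coverage measure for a partially executed suite can be sketched for the 2-way (pairwise) case as follows; the parameter domains and test suite are hypothetical:

```python
from itertools import combinations, product

def pairwise_coverage(test_suite, domains):
    """Fraction of all 2-way (pairwise) parameter-value tuples covered by a
    (possibly partially executed) test suite -- a tuples-coverage measure
    for t = 2. 'domains' lists the allowed values of each parameter;
    each test is one value per parameter."""
    all_pairs = set()
    for i, j in combinations(range(len(domains)), 2):
        for vi, vj in product(domains[i], domains[j]):
            all_pairs.add((i, vi, j, vj))
    covered = set()
    for test in test_suite:
        for i, j in combinations(range(len(test)), 2):
            covered.add((i, test[i], j, test[j]))
    return len(covered & all_pairs) / len(all_pairs)

# Three binary parameters: 3 parameter pairs x 4 value combinations = 12 tuples.
domains = [(0, 1), (0, 1), (0, 1)]
suite = [(0, 0, 0), (1, 1, 1), (0, 1, 1), (1, 0, 0)]
cov = pairwise_coverage(suite, domains)
```

Running the measure after each executed test gives exactly the kind of partial-suite effectiveness curve the paper compares across generation strategies; higher interaction strengths t generalize the tuple construction.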
Applying a CAD-generated imaging marker to assess short-term breast cancer risk
NASA Astrophysics Data System (ADS)
Mirniaharikandehei, Seyedehnafiseh; Zarafshani, Ali; Heidari, Morteza; Wang, Yunzhi; Aghaei, Faranak; Zheng, Bin
2018-02-01
Although it remains controversial whether using computer-aided detection (CAD) helps improve radiologists' performance in reading and interpreting mammograms, due to higher false-positive detection rates, the objective of this study is to investigate and test a new hypothesis: that CAD-generated false-positives, in particular the bilateral summation of false-positives, constitute a potential imaging marker associated with short-term breast cancer risk. An image dataset involving negative screening mammograms acquired from 1,044 women was retrospectively assembled. Each case involves 4 images: the craniocaudal (CC) and mediolateral oblique (MLO) views of the left and right breasts. In the next subsequent mammography screening, 402 cases were positive for cancer and 642 remained negative. A CAD scheme was applied to process all "prior" negative mammograms. Several features were extracted from the CAD scheme, including detection seeds, the total number of false-positive regions, the average detection score, and the sum of detection scores in CC and MLO view images. The features computed from the two bilateral images of the left and right breasts from either the CC or MLO view were then combined. In order to predict the likelihood of each testing case being positive in the next subsequent screening, two logistic regression models were trained and tested using a leave-one-case-out based cross-validation method. Data analysis demonstrated a maximum prediction accuracy with an area under the ROC curve of AUC=0.65+/-0.017 and a maximum adjusted odds ratio of 4.49 with a 95% confidence interval of [2.95, 6.83]. The results also illustrated an increasing trend in the adjusted odds ratio and risk prediction scores (p<0.01). Thus, the study showed that CAD-generated false-positives might provide a new quantitative imaging marker to help assess short-term breast cancer risk.
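The reported AUC can be understood through the Mann-Whitney formulation of the ROC area: the probability that a randomly chosen positive case receives a higher risk score than a randomly chosen negative case. A minimal sketch with hypothetical risk scores (not the study's data or CAD features):

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    fraction of (positive, negative) pairs in which the positive case
    scores higher, counting ties as one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical risk scores produced by a trained model.
auc = roc_auc([0.9, 0.8, 0.6, 0.55], [0.7, 0.5, 0.4, 0.3])
```

An AUC of 0.5 corresponds to chance ranking; the study's 0.65 indicates modest but real separation between cases that later turned positive and those that stayed negative.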
Automated Item Generation with Recurrent Neural Networks.
von Davier, Matthias
2018-03-12
Utilizing technology for automated item generation is not a new idea. However, test items used in commercial testing programs or in research are still predominantly written by humans, in most cases by content experts or professional item writers. Human experts are a limited resource and testing agencies incur high costs in the process of continuous renewal of item banks to sustain testing programs. Using algorithms instead holds the promise of providing unlimited resources for this crucial part of assessment development. The approach presented here deviates in several ways from previous attempts to solve this problem. In the past, automatic item generation relied either on generating clones of narrowly defined item types such as those found in language-free intelligence tests (e.g., Raven's progressive matrices) or on an extensive analysis of task components and derivation of schemata to produce items with pre-specified variability that are hoped to have predictable levels of difficulty. It is somewhat unlikely that researchers utilizing these previous approaches would look at the proposed approach with favor; however, recent applications of machine learning show success in solving tasks that seemed impossible for machines not too long ago. The proposed approach uses deep learning to implement probabilistic language models, not unlike what Google Brain and Amazon Alexa use for language processing and generation.
Valid statistical inference methods for a case-control study with missing data.
Tian, Guo-Liang; Zhang, Chi; Jiang, Xuejun
2018-04-01
The main objective of this paper is to derive the valid sampling distribution of the observed counts in a case-control study with missing data under the assumption of missing at random, by employing the conditional sampling method and the mechanism augmentation method. The proposed sampling distribution, called the case-control sampling distribution, can be used to calculate the standard errors of the maximum likelihood estimates of parameters via the Fisher information matrix and to generate independent samples for constructing small-sample bootstrap confidence intervals. Theoretical comparisons of the new case-control sampling distribution with two existing sampling distributions exhibit a large difference. Simulations are conducted to investigate the influence of the three different sampling distributions on statistical inferences. One finding is that the conclusion of the Wald test for testing independence under the two existing sampling distributions could be completely different (even contradictory) from that of the Wald test for testing the equality of the success probabilities in the control/case groups under the proposed distribution. A real cervical cancer data set is used to illustrate the proposed statistical methods.
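The generic idea of using a sampling distribution to build bootstrap confidence intervals can be illustrated with a naive percentile bootstrap for the odds ratio of a complete 2x2 case-control table. This simple resampling does not implement the paper's case-control sampling distribution or handle missing data; the counts are hypothetical.

```python
import random

def bootstrap_or_ci(a, b, c, d, n_boot=2000, seed=7):
    """Percentile bootstrap confidence interval for the odds ratio of a 2x2
    case-control table (a, b = exposed/unexposed cases; c, d = exposed/
    unexposed controls), resampling cases and controls independently."""
    rng = random.Random(seed)
    cases = [1] * a + [0] * b
    controls = [1] * c + [0] * d
    ors = []
    for _ in range(n_boot):
        ca = sum(rng.choice(cases) for _ in cases)
        co = sum(rng.choice(controls) for _ in controls)
        cb, cd = len(cases) - ca, len(controls) - co
        if ca and cb and co and cd:            # skip degenerate resamples
            ors.append((ca * cd) / (cb * co))
    ors.sort()
    return ors[int(0.025 * len(ors))], ors[int(0.975 * len(ors))]

# Hypothetical table: 30/70 exposed cases, 15/85 exposed controls (OR ~ 2.43).
lo, hi = bootstrap_or_ci(30, 70, 15, 85)
```

The paper's point is precisely that when data are missing, the distribution one resamples from matters, and the wrong choice can flip the conclusion of a Wald test.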
Pitting and Bending Fatigue Evaluations of a New Case-Carburized Gear Steel
NASA Technical Reports Server (NTRS)
Krantz, Timothy; Tufts, Brian
2007-01-01
The power density of a gearbox is an important consideration for many applications and is especially important for gearboxes used on aircraft. One approach to improving power density of gearing is to improve the steel properties by design of the alloy. The alloy tested in this work was designed to be case-carburized with surface hardness of Rockwell C66 after hardening. Test gear performance was evaluated using surface fatigue tests and single-tooth bending fatigue tests. The performance of gears made from the new alloy was compared to the performance of gears made from two alloys currently used for aviation gearing. The new alloy exhibited significantly better performance in surface fatigue testing, demonstrating the value of the improved properties in the case layer. However, the alloy exhibited lesser performance in single-tooth bending fatigue testing. The fracture toughness of the tested gears was insufficient for use in aircraft applications as judged by the behavior exhibited during the single tooth bending tests. This study quantified the performance of the new alloy and has provided guidance for the design and development of next generation gear steels.
Dynamic Test Generation for Large Binary Programs
2009-11-12
the fuzzing@whitestar.linuxbox.org mailing list, including Jared DeMott, Disco Jonny, and Ari Takanen, for discussions on fuzzing tradeoffs. Martin ... as is the case for large applications where exercising all execution paths is virtually hopeless anyway. This point will be further discussed in ... consumes trace files generated by iDNA and virtually re-executes the recorded runs. TruScan offers several features that substantially simplify symbolic
A Procedure for Testing the Difference between Effect Sizes.
ERIC Educational Resources Information Center
Lambert, Richard G.; Flowers, Claudia
A special case of the homogeneity of effect size test, as applied to pairwise comparisons of standardized mean differences, was evaluated. Procedures for comparing pairs of pretest to posttest effect sizes, as well as pairs of treatment versus control group effect sizes, were examined. Monte Carlo simulation was used to generate Type I error rates…
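A Monte Carlo estimate of the Type I error rate for a pairwise effect-size comparison can be sketched as follows, using a z-test on two independent Cohen's d values and the common large-sample variance approximation Var(d) ≈ (n1+n2)/(n1·n2) + d²/(2(n1+n2)). The simulation settings are assumptions for illustration, not the study's design.

```python
import random

def type1_error_rate(n_per_group=30, n_sims=2000, alpha_z=1.96, seed=3):
    """Monte Carlo Type I error of a z-test for the difference between two
    independent standardized mean differences, simulated under the null of
    zero population effect in both studies."""
    rng = random.Random(seed)
    n = n_per_group

    def cohens_d():
        g1 = [rng.gauss(0, 1) for _ in range(n)]
        g2 = [rng.gauss(0, 1) for _ in range(n)]
        m1, m2 = sum(g1) / n, sum(g2) / n
        pooled_var = (sum((x - m1) ** 2 for x in g1) +
                      sum((x - m2) ** 2 for x in g2)) / (2 * n - 2)
        return (m1 - m2) / pooled_var ** 0.5

    def var_d(d):                       # large-sample variance approximation
        return (n + n) / (n * n) + d * d / (2 * (n + n))

    rejections = 0
    for _ in range(n_sims):
        d1, d2 = cohens_d(), cohens_d()
        z = (d1 - d2) / (var_d(d1) + var_d(d2)) ** 0.5
        if abs(z) > alpha_z:
            rejections += 1
    return rejections / n_sims

rate = type1_error_rate()
```

A well-calibrated procedure should return a rate near the nominal 0.05; systematic departures are exactly what the Monte Carlo study above was designed to detect.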
Docherty, Paul D; Schranz, Christoph; Chase, J Geoffrey; Chiew, Yeong Shiong; Möller, Knut
2014-05-01
Accurate model parameter identification relies on accurate forward model simulations to guide convergence. However, some forward simulation methodologies lack the precision required to properly define the local objective surface and can cause failed parameter identification. The role of objective surface smoothness in the identification of a pulmonary mechanics model was assessed using forward simulation from a novel error-stepping method and a proprietary Runge-Kutta method. The objective surfaces were compared via the identified parameter discrepancy generated in a Monte Carlo simulation and the local smoothness of the objective surfaces they generate. The error-stepping method generated significantly smoother error surfaces in each of the cases tested (p<0.0001) and more accurate model parameter estimates than the Runge-Kutta method in three of the four cases tested (p<0.0001), despite a 75% reduction in computational cost. Of note, parameter discrepancy in most cases was limited to a particular oblique plane, indicating that a non-intuitive multi-parameter trade-off was occurring. The error-stepping method consistently improved or equalled the outcomes of the Runge-Kutta time-integration method for forward simulations of the pulmonary mechanics model. This study indicates that accurate parameter identification relies on accurate definition of the local objective function, and that parameter trade-off can occur on oblique planes, resulting in prematurely halted parameter convergence. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Moles: Tool-Assisted Environment Isolation with Closures
NASA Astrophysics Data System (ADS)
de Halleux, Jonathan; Tillmann, Nikolai
Isolating test cases from environment dependencies is often desirable, as it increases test reliability and reduces test execution time. However, code that calls non-virtual methods or consumes sealed classes is often impossible to test in isolation. Moles is a new lightweight framework which addresses this problem. For any .NET method, Moles allows test-code to provide alternative implementations, given as .NET delegates, for which C# provides very concise syntax while capturing local variables in a closure object. Using code instrumentation, the Moles framework will redirect calls to provided delegates instead of the original methods. The Moles framework is designed to work together with the dynamic symbolic execution tool Pex to enable automated test generation. In a case study, testing code programmed against the Microsoft SharePoint Foundation API, we achieved full code coverage while running tests in isolation without an actual SharePoint server. The Moles framework integrates with .NET and Visual Studio.
NASA Technical Reports Server (NTRS)
Clune, E.; Segall, Z.; Siewiorek, D.
1984-01-01
A program of experiments has been conducted at NASA-Langley to test the fault-free performance of a Fault-Tolerant Multiprocessor (FTMP) avionics system for next-generation aircraft. Baseline measurements of an operating FTMP system were obtained with respect to the following parameters: instruction execution time, frame size, and the variation of clock ticks. The mechanisms of frame stretching were also investigated. The experimental results are summarized in a table. Areas of interest for future tests are identified, with emphasis given to the implementation of a synthetic workload generation mechanism on FTMP.
Digital test assembly of truck parts with the IMMA-tool--an illustrative case.
Hanson, L; Högberg, D; Söderholm, M
2012-01-01
Several digital human modelling (DHM) tools have been developed for simulation and visualisation of human postures and motions. In 2010 the DHM tool IMMA (Intelligently Moving Manikins) was introduced; it uses advanced path-planning techniques to generate collision-free and biomechanically acceptable motions for digital human models (as well as parts) in complex assembly situations. The aim of the paper is to illustrate how the IPS/IMMA tool is used at Scania CV AB in a digital test assembly process, and to compare the tool with other DHM tools on the market. The illustrated case of using the IMMA tool, here combined with the path planner tool IPS, indicates that the tool is promising. The major strengths of the tool are its user-friendly interface, the motion generation algorithms, the batch simulation of manikins and the ergonomics assessment methods that consider time.
HOMAR: A computer code for generating homotopic grids using algebraic relations: User's manual
NASA Technical Reports Server (NTRS)
Moitra, Anutosh
1989-01-01
A computer code for fast automatic generation of quasi-three-dimensional grid systems for aerospace configurations is described. The code employs a homotopic method to algebraically generate two-dimensional grids in cross-sectional planes, which are stacked to produce a three-dimensional grid system. Implementation of the algebraic equivalents of the homotopic relations for generating body geometries and grids are explained. Procedures for controlling grid orthogonality and distortion are described. Test cases with description and specification of inputs are presented in detail. The FORTRAN computer program and notes on implementation and use are included.
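The homotopic idea, blending an inner body curve into an outer boundary curve to fill a cross-sectional plane with grid layers, can be sketched with a plain linear homotopy. HOMAR's actual algebraic relations, which also control grid orthogonality and distortion, are more sophisticated than this.

```python
import math

def homotopic_grid(inner, outer, n_layers):
    """Algebraic homotopy between two matched closed curves: grid layer k
    is the pointwise linear blend (1-t)*inner + t*outer with
    t = k/(n_layers-1). 'inner' and 'outer' are matched lists of (x, y)."""
    grid = []
    for k in range(n_layers):
        t = k / (n_layers - 1)
        grid.append([((1 - t) * xi + t * xo, (1 - t) * yi + t * yo)
                     for (xi, yi), (xo, yo) in zip(inner, outer)])
    return grid

# Body section: unit circle; outer boundary: circle of radius 4 (8 points each).
pts = [(math.cos(2 * math.pi * i / 8), math.sin(2 * math.pi * i / 8))
       for i in range(8)]
inner = pts
outer = [(4 * x, 4 * y) for x, y in pts]
grid = homotopic_grid(inner, outer, 5)
```

Stacking such cross-sectional grids along the body axis yields the quasi-three-dimensional system the code produces.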
Tarkan, Sureyya; Plaisant, Catherine; Shneiderman, Ben; Hettinger, A. Zachary
2011-01-01
Researchers have conducted numerous case studies reporting the details on how laboratory test results of patients were missed by the ordering medical providers. Given the importance of timely test results in an outpatient setting, there is limited discussion of electronic versions of test result management tools to help clinicians and medical staff with this complex process. This paper presents three ideas to reduce missed results with a system that facilitates tracking laboratory tests from order to completion as well as during follow-up: (1) define a workflow management model that clarifies responsible agents and associated time frame, (2) generate a user interface for tracking that could eventually be integrated into current electronic health record (EHR) systems, (3) help identify common problems in past orders through retrospective analyses. PMID:22195201
Requirements-Driven Log Analysis Extended Abstract
NASA Technical Reports Server (NTRS)
Havelund, Klaus
2012-01-01
Imagine that you are tasked to help a project improve its testing effort. In a realistic scenario it will quickly become clear that having an impact is difficult. First of all, it will likely be a challenge to suggest an alternative approach which is significantly more automated and/or more effective than current practice. The reality is that an average software system has a complex input/output behavior. An automated testing approach will have to auto-generate test cases, each being a pair (i, o) consisting of a test input i and an oracle o. The test input i has to be somewhat meaningful, and the oracle o can be very complicated to compute. Second, even in cases where some testing technology has been developed that might improve current practice, it is likely difficult to completely change the current behavior of the testing team unless the technique is obviously superior and does everything already done by existing technology. So is there an easier way to incorporate formal methods-based approaches than a full-fledged test revolution? Fortunately, the answer is affirmative. A relatively simple approach is to benefit from possibly already existing logging infrastructure, which after all is part of most systems put in production. A log is a sequence of events, generated by special log recording statements, most often manually inserted in the code by the programmers. An event can be considered as a data record: a mapping from field names to values. We can analyze such a log using formal methods, for example checking it against a formal specification. This separates running the system from analyzing its behavior. It is not meant as an alternative to testing, since it does not address the important input generation problem. However, it offers a solution which testing teams might accept, since it has low impact on the existing process. A single person might be assigned to perform such log analysis, compared to the entire testing team changing behavior.
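The log-analysis idea can be sketched in a few lines. This is a hypothetical monitor (the event names and fields are invented, and it is not the paper's actual framework): each event is a record mapping field names to values, and we check the log against a simple response property.

```python
# Minimal log-analysis sketch: check that every `trigger` event is
# eventually followed by a matching `response` event with the same
# value for `key`. Events are plain dicts (field name -> value).
def check_response(log, trigger, response, key):
    pending = set()
    for event in log:
        if event["type"] == trigger:
            pending.add(event[key])
        elif event["type"] == response:
            pending.discard(event[key])
    return sorted(pending)  # unmatched trigger values are violations

# Hypothetical log: command 2 was sent but never acknowledged.
log = [
    {"type": "cmd_sent", "id": 1},
    {"type": "cmd_sent", "id": 2},
    {"type": "cmd_ack",  "id": 1},
]
print(check_response(log, "cmd_sent", "cmd_ack", "id"))  # [2]
```

A real specification language would support richer temporal patterns, but even this shape shows why a single analyst can run such checks offline without changing how the system is tested.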
Myer, Emily N B; Petrikovets, Andrey; Slocum, Paul D; Lee, Toy Gee; Carter-Brooks, Charelle M; Noor, Nabila; Carlos, Daniela M; Wu, Emily; Van Eck, Kathryn; Fashokun, Tola B; Yurteri-Kaplan, Ladin; Chen, Chi Chiung Grace
2018-04-07
Sacral neuromodulation is an effective therapy for overactive bladder, urinary retention, and fecal incontinence. Infection after sacral neurostimulation is costly and burdensome. Determining optimal perioperative management strategies to reduce the risk of infection is important to reduce this burden. We sought to identify risk factors associated with sacral neurostimulator infection requiring explantation, to estimate the incidence of infection requiring explantation, and identify associated microbial pathogens. This is a multicenter retrospective case-control study of sacral neuromodulation procedures completed from Jan. 1, 2004, through Dec. 31, 2014. We identified all sacral neuromodulation implantable pulse generator implants as well as explants due to infection at 8 participating institutions. Cases were patients who required implantable pulse generator explantation for infection during the review period. Cases were included if age ≥18 years old, follow-up data were available ≥30 days after implantable pulse generator implant, and the implant was performed at the institution performing the explant. Two controls were matched to each case. These controls were the patients who had an implantable pulse generator implanted by the same surgeon immediately preceding and immediately following the identified case who met inclusion criteria. Controls were included if age ≥18 years old, no infection after implantable pulse generator implant, follow-up data were available ≥180 days after implant, and no explant for any reason <180 days from implant. Controls may have had an explant for reasons other than infection at >180 days after implant. Fisher exact test (for categorical variables) and Student t test (for continuous variables) were used to test the strength of the association between infection and patient and surgery characteristics. 
Significant variables were then considered in a multivariable logistic regression model to determine risk factors independently associated with infection. Over a 10-year period at 8 academic institutions, 1930 sacral neuromodulator implants were performed by 17 surgeons. In all, 38 cases requiring device explant for infection and 72 corresponding controls were identified. The incidence of infection requiring explant was 1.97%. Hematoma formation (13% cases, 0% controls; P = .004) and pocket depth of ≥3 cm (21% cases, 0% controls; P = .031) were independently associated with an increased risk of infection requiring explant. On multivariable regression analysis controlling for significant variables, both hematoma formation (P = .006) and pocket depth ≥3 cm (P = .020; odds ratio, 3.26; 95% confidence interval, 1.20-8.89) remained significantly associated with infection requiring explant. Of the 38 cases requiring explant, 32 had cultures collected and 24 had positive cultures. All 5 cases with a hematoma had a positive culture (100%). Of the 4 cases with a pocket depth ≥3 cm, 2 had positive cultures, 1 had negative cultures, and 1 had a missing culture result. The most common organism identified was methicillin-resistant Staphylococcus aureus (38%). The incidence of infection after sacral neuromodulation requiring device explant is low. The most common infectious pathogen identified was methicillin-resistant S aureus. Demographic and health characteristics did not predict risk of explant due to infection; however, having a postoperative hematoma or a deep pocket ≥3 cm significantly increased the risk of explant due to infection. These findings highlight the importance of meticulous hemostasis as well as ensuring that the pocket depth is <3 cm at the time of device implant. Copyright © 2018 Elsevier Inc. All rights reserved.
Gallio, Elena; Giglioli, Francesca Romana; Girardi, Andrea; Guarneri, Alessia; Ricardi, Umberto; Ropolo, Roberto; Ragona, Riccardo; Fiandra, Christian
2018-02-01
Automated treatment planning is a new frontier in radiotherapy. The Auto-Planning module of the Pinnacle 3 treatment planning system (TPS) was evaluated for liver stereotactic body radiation therapy treatments. Ten cases were included in the study. Six plans were generated for each case by four medical physics experts. The first two planned with the Pinnacle TPS, using both the manual module (MP) and the Auto-Planning one (AP). The other two physicists generated two plans with the Monaco TPS (VM). Treatment plan comparisons were then carried out on the various dosimetric parameters of target and organs at risk, monitor units, number of segments, plan complexity metrics, and human planning time. The user dependency of Auto-Planning was also tested, and the plans were evaluated by a trained physician. Statistically significant differences (ANOVA test) were observed for spinal cord doses, plan average beam irregularity, number of segments, monitor units, and human planning time. The Fisher-Hayter test applied to these parameters showed statistically significant differences between AP and MP for spinal cord doses and human planning time; between MP and VM for monitor units, number of segments, and plan irregularity; and, for all of these parameters, between AP and VM. The two plans created by different planners with AP were similar to each other. The plans created with Auto-Planning were comparable to the manually generated plans. The time saved in planning enables the planner to commit more resources to more complex cases, and the planner independence makes it possible to standardize plan quality. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Jaworska, Joanna; Harol, Artsiom; Kern, Petra S; Gerberick, G Frank
2011-01-01
There is an urgent need to develop data integration and testing strategy frameworks allowing interpretation of results from animal alternative test batteries. To this end, we developed a Bayesian Network Integrated Testing Strategy (BN ITS) with the goal to estimate skin sensitization hazard as a test case of previously developed concepts (Jaworska et al., 2010). The BN ITS combines in silico, in chemico, and in vitro data related to skin penetration, peptide reactivity, and dendritic cell activation, and guides testing strategy by Value of Information (VoI). The approach offers novel insights into testing strategies: there is no one best testing strategy, but the optimal sequence of tests depends on information at hand, and is chemical-specific. Thus, a single generic set of tests as a replacement strategy is unlikely to be most effective. BN ITS offers the possibility of evaluating the impact of generating additional data on the target information uncertainty reduction before testing is commenced.
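A toy Value of Information computation can make the idea concrete. All numbers below are hypothetical, and expected entropy reduction of the hazard variable is used as a simple stand-in for the paper's VoI measure:

```python
import math

# Toy VoI sketch: how much does running a test T reduce uncertainty
# about a binary hazard variable H (sensitizer vs. not)?
def H(p):  # binary entropy in bits
    return 0.0 if p in (0.0, 1.0) else -(p*math.log2(p) + (1-p)*math.log2(1-p))

p_h = 0.3                        # hypothetical prior P(sensitizer)
sens, spec = 0.8, 0.9            # hypothetical test sensitivity/specificity
p_pos = sens*p_h + (1-spec)*(1-p_h)        # P(test positive) = 0.31
p_h_pos = sens*p_h / p_pos                 # posterior after a positive
p_h_neg = (1-sens)*p_h / (1-p_pos)         # posterior after a negative
voi = H(p_h) - (p_pos*H(p_h_pos) + (1-p_pos)*H(p_h_neg))
print(round(voi, 3))             # expected uncertainty reduction, ~0.35 bits
```

Because the prior p_h is updated as evidence accumulates, the test with the highest VoI changes from chemical to chemical, which is exactly why no single fixed test sequence is optimal.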
Buttitta, Fiamma; Felicioni, Lara; Del Grammastro, Maela; Filice, Giampaolo; Di Lorito, Alessia; Malatesta, Sara; Viola, Patrizia; Centi, Irene; D'Antuono, Tommaso; Zappacosta, Roberta; Rosini, Sandra; Cuccurullo, Franco; Marchetti, Antonio
2013-02-01
The therapeutic choice for patients with lung adenocarcinoma depends on the presence of EGF receptor (EGFR) mutations. In many cases, only cytologic samples are available for molecular diagnosis. Bronchoalveolar lavage (BAL) and pleural fluid, which represent a considerable proportion of cytologic specimens, cannot always be used for molecular testing because of a low rate of tumor cells. We tested the feasibility of EGFR mutation analysis on BAL and pleural fluid samples by next-generation sequencing (NGS), an innovative and extremely sensitive platform. The study was devised to extend the EGFR test to those patients who could not otherwise receive it because of the paucity of biologic material. A series of 830 lung cytology specimens was used to select 48 samples (BAL and pleural fluid) from patients with EGFR mutations in resected tumors. These samples included 36 cases with 0.3% to 9% of neoplastic cells (series A) and 12 cases without evidence of tumor (series B). All samples were analyzed by Sanger sequencing and by NGS on the Roche 454 platform. A mean of 21,130 ± 2,370 sequences per sample was obtained by NGS. In series A, EGFR mutations were detected in 16% of cases by Sanger sequencing and in 81% of cases by NGS. Seventy-seven percent of cases found to be negative by Sanger sequencing showed mutations by NGS. In series B, all samples were negative for EGFR mutation by Sanger sequencing, whereas 42% of them were positive by NGS. The very sensitive EGFR-NGS assay may open up the possibility of specific treatments for patients otherwise doomed to re-biopsies or nontargeted therapies.
Somasekar, Sneha; Lee, Deanna; Rule, Jody; Naccache, Samia N; Stone, Mars; Busch, Michael P; Sanders, Corron; Lee, William M; Chiu, Charles Y
2017-10-16
Twelve percent of all acute liver failure (ALF) cases are of unknown origin, often termed indeterminate. A previously unrecognized hepatotropic virus has been suspected as a potential etiologic agent. We compared the performance of metagenomic next-generation sequencing (mNGS) with confirmatory nucleic acid testing (NAT) to routine clinical diagnostic testing in detection of known or novel viruses associated with ALF. Serum samples from 204 adult ALF patients collected from 1998 to 2010 as part of a nationwide registry were analyzed. One hundred eighty-seven patients (92%) were classified as indeterminate, while the remaining 17 patients (8%) served as controls, with infections by either hepatitis A virus or hepatitis B virus (HBV), or a noninfectious cause for their ALF. Eight cases of infection from previously unrecognized viral pathogens were detected by mNGS (4 cases of herpes simplex virus type 1, including 1 case of coinfection with HBV, and 1 case each of HBV, parvovirus B19, cytomegalovirus, and human herpesvirus 7). Several missed dual or triple infections were also identified, and assembled viral genomes provided additional information on genotyping and drug resistance mutations. Importantly, no sequences corresponding to novel viruses were detected. These results suggest that ALF patients should be screened for the presence of uncommon viruses and coinfections, and that most cases of indeterminate ALF in the United States do not appear to be caused by novel viral pathogens. In the future, mNGS testing may be useful for comprehensive diagnosis of viruses associated with ALF, or to exclude infectious etiologies. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
Evaluation of Critical Bandwidth Using Digitally Processed Speech.
1982-05-12
observed after repeating the two tests on persons with confirmed cases of sensorineural hearing impairment. Again, the plotted speech discrimination... quantifying the critical bandwidth of persons on a clinical or pre-employment level. The complex portion of the test design (the computer generation of... "super" normal hearing individuals (i.e., those persons with narrower-than-normal critical bands). This ability of the test shows promise as a valuable
Impact of Virtual Patients as Optional Learning Material in Veterinary Biochemistry Education.
Kleinsorgen, Christin; von Köckritz-Blickwede, Maren; Naim, Hassan Y; Branitzki-Heinemann, Katja; Kankofer, Marta; Mándoki, Míra; Adler, Martin; Tipold, Andrea; Ehlers, Jan P
2018-01-01
Biochemistry and physiology teachers from veterinary faculties in Hannover, Budapest, and Lublin prepared innovative, computer-based, integrative clinical case scenarios as optional learning materials for teaching and learning in basic sciences. These learning materials were designed to enhance attention and increase interest and intrinsic motivation for learning, thus strengthening autonomous, active, and self-directed learning. We investigated learning progress and success by administering a pre-test before exposure to the virtual patient (vetVIP) cases, offering the vetVIP cases alongside regular biochemistry courses, and then administering a complementary post-test. We analyzed improvement in cohort performance and level of confidence in rating questions. Results of the performance in biochemistry examinations in 2014, 2015, and 2016 were correlated with the use of and performance in vetVIP cases throughout biochemistry courses in Hannover. Surveys of students reflected that interactive cases helped them understand the relevance of basic sciences in veterinary education. Differences between identical pre- and post-tests revealed knowledge improvement (correct answers: +28% in Hannover, +9% in Lublin) and enhanced confidence in decision making ("I don't know" answers: -20% in Hannover, -7.5% in Lublin). High case usage and voluntary participation (use of vetVIP cases in Hannover and Lublin >70%, Budapest <1%; response rates in pre-test 72% and post-test 48%) indicated a clear increase in motivation for the subject of biochemistry. Despite increased motivation, there was only a weak correlation between performance in final exams and performance in the vetVIP cases. Case-based e-learning could be extended, and generated cases should be shared across veterinary faculties.
From empirical data to time-inhomogeneous continuous Markov processes.
Lencastre, Pedro; Raischel, Frank; Rogers, Tim; Lind, Pedro G
2016-03-01
We present an approach for testing for the existence of continuous generators of discrete stochastic transition matrices. Typically, existing methods to ascertain the existence of continuous Markov processes are based on the assumption that only time-homogeneous generators exist. Here a systematic extension to time inhomogeneity is presented, based on new mathematical propositions incorporating necessary and sufficient conditions, which are then implemented computationally and applied to numerical data. A discussion of the bridge between rigorous mathematical results on the existence of generators and their computational implementation is presented. Our detection algorithm proves effective in more than 60% of tested matrices, typically 80% to 90%, and for those an estimate of the (nonhomogeneous) generator matrix follows. We also solve the embedding problem analytically for the particular case of three-dimensional circulant matrices. Finally, possible applications of our framework to problems in different fields are briefly addressed.
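A minimal sketch of the time-homogeneous special case (an assumption for illustration; the paper's contribution is the extension to time-inhomogeneous generators): take the principal matrix logarithm of the transition matrix and check the two generator conditions, zero row sums and nonnegative off-diagonal entries.

```python
import numpy as np

def generator_estimate(P, tol=1e-8):
    # Principal matrix log via eigendecomposition (a simplification:
    # assumes P is diagonalizable with positive real eigenvalues).
    w, V = np.linalg.eig(P)
    Q = np.real(V @ np.diag(np.log(w)) @ np.linalg.inv(V))
    rows_ok = np.allclose(Q.sum(axis=1), 0.0, atol=1e-6)
    off = Q - np.diag(np.diag(Q))            # off-diagonal part
    return Q, bool(rows_ok and (off >= -tol).all())

# A 2-state stochastic matrix that is embeddable in a continuous chain.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
Q, embeddable = generator_estimate(P)
print(embeddable)  # True: zero row sums, nonnegative off-diagonals
```

When the conditions fail, no time-homogeneous generator exists, which is the point at which the time-inhomogeneous machinery of the paper becomes necessary.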
Evaluation of steam generator WWER 440 tube integrity criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Splichal, K.; Otruba, J.; Burda, J.
1997-02-01
The main corrosion damage in WWER steam generators under operating conditions has been observed on the outer surface of these tubes. An essential operational requirement is to assure a low probability of radioactive primary water leakage, unstable defect development and rupture of tubes. In the case of WWER 440 steam generators the above requirements led to the development of permissible limits for data evaluation of the primary-to-secondary leak measurements and determination of acceptable values for plugging of heat exchange tubes based on eddy current test (ECT) inspections.
Ice Particle Analysis of the Honeywell ALF502 Engine Booster
NASA Technical Reports Server (NTRS)
Bidwell, Colin S.; Rigby, David L.
2015-01-01
A flow and ice particle trajectory analysis was performed for the booster of the Honeywell ALF502 engine. The analysis focused on two closely related conditions from testing of the ALF502 engine in the Propulsion Systems Lab (PSL) at NASA Glenn Research Center, one of which produced an icing event and one of which did not. The flow analysis was generated using the NASA Glenn GlennHT flow solver, and the particle analysis was generated using the NASA Glenn LEWICE3D v3.63 ice accretion software. The inflow conditions for the two cases were similar, the main differences being that the condition that produced the icing event was 6.8 K colder than the non-icing event case and that the inflow ice water content (IWC) for the non-icing event case was 50% less than for the icing event case. The particle analysis, which considered sublimation, evaporation, and phase change, was generated for a 5 micron ice particle with a sticky impact model and for a 24 micron median volume diameter (MVD), 7 bin ice particle distribution with a supercooled large droplet (SLD) splash model used to simulate ice particle breakup. The particle analysis did not consider the effect of the runback and re-impingement of water resulting from the heated spinner and anti-icing system. The results from the analysis showed that the amount of impingement for the components was similar for the same particle size and impact model for the icing and non-icing event conditions. This was attributed to the similar aerodynamic conditions in the booster for the two cases. The particle temperature and melt fraction were higher at the same location and particle size for the non-icing event case than for the icing event case due to the higher incoming inflow temperature for the non-icing event case.
The 5 micron ice particle case produced higher impact temperatures and higher melt fractions on the components downstream of the fan than the 24 micron MVD case, because the average particle size generated by the particle breakup was larger than 5 microns, which yielded less warming and melting. The analysis also showed that the melt fraction and wet bulb temperature icing criteria developed during tests in the Research Altitude Test Facility (RATFac) at the National Research Council (NRC) of Canada were useful in predicting icing events in the ALF502 engine. The development of an ice particle impact model that includes the effects of particle breakup, phase change, and surface state is necessary to further improve the prediction of ice particle transport with phase change through turbomachinery.
Hegarty, Peter
2017-01-01
Drawing together social psychologists' concerns with equality and cognitive psychologists' concerns with scientific inference, 6 studies (N = 841) showed how implicit category norms make the generation and test of hypothesis about race highly asymmetric. Having shown that Whiteness is the default race of celebrity actors (Study 1), Study 2 used a variant of Wason's (1960) rule discovery task to demonstrate greater difficulty in discovering rules that require specifying that race is shared by White celebrity actors than by Black celebrity actors. Clues to the Whiteness of White actors from analogous problems had little effect on hypothesis formation or rule discovery (Studies 3 and 4). Rather, across Studies 2 and 4 feedback about negative cases-non-White celebrities-facilitated the discovery that White actors shared a race, whether participants or experimenters generated the negative cases. These category norms were little affected by making White actors' Whiteness more informative (Study 5). Although participants understood that discovering that White actors are White would be harder than discovering that Black actors are Black, they showed limited insight into the information contained in negative cases (Study 6). Category norms render some identities as implicit defaults, making hypothesis formation and generalization about real social groups asymmetric in ways that have implications for scientific reasoning and social equality. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
A Human Proximity Operations System test case validation approach
NASA Astrophysics Data System (ADS)
Huber, Justin; Straub, Jeremy
A Human Proximity Operations System (HPOS) poses numerous risks in a real-world environment. These risks range from mundane tasks, such as avoiding walls and fixed obstacles, to the critical need to keep people and processes safe in the context of the HPOS's situation-specific decision making. Validating the performance of an HPOS, which must operate in a real-world environment, is an ill-posed problem due to the complexity introduced by erratic (non-computer) actors. In order to prove the HPOS's usefulness, test cases must be generated to simulate possible actions of these actors, so that the HPOS can be shown to perform safely in the environments where it will be operated. The HPOS must demonstrate its ability to be as safe as a human across a wide range of foreseeable circumstances. This paper evaluates the use of test cases to validate HPOS performance and utility. It considers an HPOS's safe performance in the context of a common human activity, moving through a crowded corridor, and extrapolates from this to the suitability of using test cases for AI validation in other areas of prospective application.
NASA Astrophysics Data System (ADS)
Kim, Duk-hyun; Lee, Hyoung-Jin
2018-04-01
A study of an efficient aerodynamic database modeling method was conducted. Creating the database by exploiting the periodicity and symmetry characteristics of missile aerodynamic coefficients was investigated in order to minimize the number of wind tunnel test cases. In addition, studies of how to generate the aerodynamic database when the periodicity changes due to the installation of a protuberance, and of how to conduct a zero calibration, were carried out. The required number of test cases changes with missile configuration, and some tests can be omitted. A database of aerodynamic coefficients over control surface deflection angles can be constructed using phase shifts. The validity of the modeling method was demonstrated by confirming that the aerodynamic coefficients calculated with it agreed with wind tunnel test results.
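The phase-shift idea can be sketched as follows. Everything here is hypothetical: the 90-degree fin spacing, the coefficient values, and the nearest-point lookup (a real database would interpolate):

```python
# Sketch: for a cruciform (4-fin) missile the aerodynamic coefficients
# repeat with the fin spacing, so only one period needs wind tunnel
# tests; the rest of the database is filled by phase-shifting.
PERIOD = 90.0  # degrees between fins (hypothetical 4-fin layout)

# Hypothetical measured coefficient Cn over one period (roll angle -> Cn).
measured = {0.0: 0.00, 22.5: 0.12, 45.0: 0.20, 67.5: 0.12}

def lookup(phi):
    base = phi % PERIOD                              # phase shift into one period
    key = min(measured, key=lambda k: abs(k - base)) # nearest tested point
    return measured[key]

print(lookup(112.5))  # 0.12, the phase-shifted copy of the 22.5 deg test
```

Every roll angle outside the tested period is served by an existing test point, which is exactly how such symmetry cuts the number of wind tunnel cases.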
A Model Independent S/W Framework for Search-Based Software Testing
Baik, Jongmoon
2014-01-01
In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of the model changes from one to another, all functions of a search technique must be reimplemented, because the model types differ even when the same search technique is applied. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce this redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns for finding test cases for a target model but also reduces development time by means of the common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when the type of the model is changed. PMID:25302314
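A model-independent search loop of the kind described might be sketched as follows. The search algorithm sees only a candidate encoding, a neighborhood function, and a fitness function, so changing the model type means swapping those plug-ins rather than reimplementing the search; the branch-distance example is invented for illustration:

```python
import random

# Model-independent hill climber: the model type is hidden behind the
# three plug-in callables, so the loop itself never changes.
def hill_climb(random_candidate, neighbors, fitness, iters=500, seed=0):
    rng = random.Random(seed)
    best = random_candidate(rng)
    for _ in range(iters):
        cand = min(neighbors(best, rng), key=fitness, default=best)
        if fitness(cand) < fitness(best):
            best = cand  # accept the improving neighbor
    return best

# Hypothetical plug-ins: find an input hitting the branch `x == 42`.
fitness = lambda x: abs(x - 42)            # branch distance to minimize
rand    = lambda rng: rng.randint(-100, 100)
nbrs    = lambda x, rng: [x - 1, x + 1]
print(hill_climb(rand, nbrs, fitness))     # 42
```

Swapping in, say, a state-machine model only means re-encoding candidates and fitness; the `hill_climb` function, and any other search technique behind the same interface, is reused unchanged.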
A 400-kWe high-efficiency steam turbine for industrial cogeneration
NASA Technical Reports Server (NTRS)
Leibowitz, H. M.
1982-01-01
An advanced state-of-the-art steam turbine-generator developed to serve as the power conversion subsystem for the Department of Energy's Sandia National Laboratories' Solar Total-Energy Project (STEP) is described. The turbine-generator, which is designed to provide 400 kW of net electrical power, represents the largest turbine-generator built specifically for commercial solar-powered cogeneration. The controls for the turbine-generator incorporate multiple partial-arc entry to provide efficient off-design performance, as well as an extraction control scheme to permit extraction flow regulation while maintaining 110-psig pressure. Normal turbine operation is achieved both while synchronized to a local utility and in a stand-alone mode. In both cases, the turbine-generator features automatic load control as well as remote start-up and shutdown capability. Tests totaling 200 hours were conducted to confirm the integrity of the turbine's mechanical structure and control function. Performance tests resulted in a measured inlet throttle flow of 8,450 pounds per hour, which was near design conditions.
Yucel, Deniz Sanliyuksel; Baba, Alper
2016-08-01
The Etili neighborhood in Can County (northwestern Turkey) has large reserves of coal and has been the site of many small- to medium-scale mining operations since the 1980s. Some of these have ceased working while others continue to operate. Once activities cease, the mining facilities and fields are usually abandoned without rehabilitation. The most significant environmental problem is acid mine drainage (AMD). This study was carried out to determine the acid generation potential of various lithological units in the Etili coal mine using static test methods. Seventeen samples were selected from areas with high acidic water concentrations: from different alteration zones belonging to volcanic rocks, from sedimentary rocks, and from coals and mine wastes. Static tests (paste pH, standard acid-base accounting, and net acid generation tests) were performed on these samples. The consistency of the static test results showed that oxidation of sulfide minerals, especially pyrite, which is widely found not only in the alteration zones of volcanic rocks but also in the coals and mine wastes, is the main factor controlling the generation of AMD in this mine. The lack of carbonate minerals in the region also increases the occurrence of AMD.
Xenograft model for therapeutic drug testing in recurrent respiratory papillomatosis.
Ahn, Julie; Bishop, Justin A; Akpeng, Belinda; Pai, Sara I; Best, Simon R A
2015-02-01
Identifying effective treatment for papillomatosis is limited by a lack of animal models, and there is currently no preclinical model for testing potential therapeutic agents. We hypothesized that xenografting of papilloma may facilitate in vivo drug testing to identify novel treatment options. A biopsy of fresh tracheal papilloma was xenografted into a NOD-scid-IL2Rgamma(null) (NSG) mouse. The xenograft began growing after 5 weeks and was serially passaged over multiple generations. Each generation showed a consistent log-growth pattern, and in all xenografts, the presence of the human papillomavirus (HPV) genome was confirmed by polymerase chain reaction (PCR). Histopathologic analysis demonstrated that the squamous architecture of the original papilloma was maintained in each generation. In vivo drug testing with bevacizumab (5 mg/kg i.p. twice weekly for 3 weeks) showed a dramatic therapeutic response compared to saline control. We report here the first successful case of serial xenografting of a tracheal papilloma in vivo with a therapeutic response observed with drug testing. In severely immunocompromised mice, the HPV genome and squamous differentiation of the papilloma can be maintained for multiple generations. This is a feasible approach to identify therapeutic agents in the treatment of recurrent respiratory papillomatosis. © The Author(s) 2014.
LogiKit - assisting complex logic specification and implementation for embedded control systems
NASA Astrophysics Data System (ADS)
Diglio, A.; Nicolodi, B.
2002-07-01
LogiKit provides an overall lifecycle solution. LogiKit is a powerful software engineering CASE toolkit for requirements specification, simulation, and documentation. LogiKit also provides an automatic Ada software design, code, and unit test generator.
Supersonic CO electric-discharge lasers
NASA Technical Reports Server (NTRS)
Hason, R. K.; Mitchner, M.; Stanton, A.
1975-01-01
Laser modeling activity is described which involved addition of an option allowing N2 as a second diatomic gas. This option is now operational and a few test cases involving N2/CO mixtures were run. Results from these initial test cases are summarized. In the laboratory, a CW double-discharge test facility was constructed and tested. Features include: water-cooled removable electrodes, O-ring construction to facilitate cleaning and design modifications, increased discharge length, and addition of a post-discharge observation section. Preliminary tests with this facility using N2 yielded higher power loadings than obtained in the first-generation facility. Another test-section modification, recently made and as yet untested, will permit injection of secondary gases into the cathode boundary layer. The objective will be to vary and enhance the UV emission spectrum from the auxiliary discharge, thereby influencing the level of photoionization in the main discharge region.
Walitt, Brian; Mackey, Rachel; Kuller, Lewis; Deane, Kevin D; Robinson, William; Holers, V Michael; Chang, Yue-Fang; Moreland, Larry
2013-05-01
Rheumatoid arthritis (RA) research using large databases is limited by insufficient case validity. Of 161,808 postmenopausal women in the Women's Health Initiative, 15,691 (10.2%) reported having RA, far higher than the expected 1% population prevalence. Since chart review for confirmation of an RA diagnosis is impractical in large cohort studies, the current study (2009-2011) tested the ability of baseline serum measurements of rheumatoid factor and anti-cyclic citrullinated peptide antibodies, second-generation assay (anti-CCP2), to identify physician-validated RA among the chart-review study participants with self-reported RA (n = 286). Anti-CCP2 positivity had the highest positive predictive value (PPV) (80.0%), and rheumatoid factor positivity the lowest (44.6%). Together, use of disease-modifying antirheumatic drugs and anti-CCP2 positivity increased PPV to 100% but excluded all seronegative cases (approximately 15% of all RA cases). Case definitions inclusive of seronegative cases had PPVs between 59.6% and 63.6%. False-negative results were minimized in these test definitions, as evidenced by negative predictive values of approximately 90%. Serological measurements, particularly measurement of anti-CCP2, improved the test characteristics of RA case definitions in the Women's Health Initiative.
Sociopathic Knowledge Bases: Correct Knowledge Can Be Harmful Even Given Unlimited Computation
1989-08-01
positive, as false positives generated by a medical program can often be caught by a physician upon further testing. False negatives, however, may be... improvement over the knowledge base tested is obtained. Although our work is primarily theoretical, one example of experiments is... knowledge base, improves the performance by about 10%. ...of tests. First, we divide the cases into a training set and a validation set with a 70%/30% split.
Second-Generation Large Civil Tiltrotor 7- by 10-Foot Wind Tunnel Test Data Report
NASA Technical Reports Server (NTRS)
Theodore, Colin R.; Russell, Carl R.; Willink, Gina C.; Pete, Ashley E.; Adibi, Sierra A.; Ewert, Adam; Theuns, Lieselotte; Beierle, Connor
2016-01-01
An approximately 6-percent scale model of the NASA Second-Generation Large Civil Tiltrotor (LCTR2) Aircraft was tested in the U.S. Army 7- by 10-Foot Wind Tunnel at NASA Ames Research Center January 4 to April 19, 2012, and September 18 to November 1, 2013. The full model was tested, along with modified versions in order to determine the effects of the wing tip extensions and nacelles; the wing was also tested separately in the various configurations. In both cases, the wing and nacelles used were adopted from the U.S. Army High Efficiency Tilt Rotor (HETR) aircraft, in order to limit the cost of the experiment. The full airframe was tested in high-speed cruise and low-speed hover flight conditions, while the wing was tested only in cruise conditions, with Reynolds numbers ranging from 0 to 1.4 million. In all cases, the external scale system of the wind tunnel was used to collect data. Both models were mounted to the scale using two support struts attached underneath the wing; the full airframe model also used a third strut attached at the tail. The collected data provides insight into the performance of the preliminary design of the LCTR2 and will be used for computational fluid dynamics (CFD) validation and the development of flight dynamics simulation models.
NASA Astrophysics Data System (ADS)
Yoshikawa, Joe; Nishio, Yu; Izawa, Seiichiro; Fukunishi, Yu
2018-01-01
Numerical simulations are carried out to discover the flow structure that plays an important role in the laminar-turbulent transition process of a boundary layer on a flat plate. The boundary layer is destabilized by ejecting a short-duration jet from a hole in the surface. When the jet velocity is set to 20% of the uniform-flow velocity, a laminar-turbulent transition takes place, whereas in the 18% case, the disturbances created by the jet decay downstream. It is found that in both cases, hairpin vortices are generated; however, these first-generation hairpins do not directly cause the transition. Only in the 20% case does a new hairpin vortex of a different shape, with a wider distance between the legs, appear. The new hairpin grows with time and evokes the generation of vortical structures one after another around it, turning the flow turbulent. It is found that the difference between the two cases is whether or not one of the first-generation hairpin vortices gets connected with the nearby longitudinal vortices. Only when the connection is successful is the new hairpin vortex with a wider distance between the legs created. Across the several cases tested with varying jet-ejection conditions, the hairpin structure played an equally important role. Therefore, we conclude that the hairpin vortex with widespread legs is a key structure in the transition to turbulence.
Diagnosis and Management of Hereditary Phaeochromocytoma and Paraganglioma.
Lalloo, Fiona
2016-01-01
About 30% of phaeochromocytomas or paragangliomas are genetic. Whilst some individuals will have clinical features or a family history of an inherited cancer syndrome such as neurofibromatosis type 1 (NF1) or multiple endocrine neoplasia 2 (MEN2), the majority will present as an isolated case. To date, 14 genes have been described in which pathogenic mutations have been demonstrated to cause paraganglioma or phaeochromocytoma. Many individuals with a pathogenic mutation are at risk of developing further tumours. Therefore, identification of genetic cases is important in the long-term management of these individuals, ensuring that they are entered into a surveillance programme. Mutation testing also facilitates cascade testing within the family, allowing identification of other at-risk individuals. Many algorithms have been described to facilitate cost-effective sequential genetic testing of these genes, with phenotypically driven pathways. New genetic technologies, including next-generation sequencing and whole-exome sequencing, will allow much quicker, cheaper and more extensive testing of individuals in whom a genetic aetiology is suspected.
Automating Initial Guess Generation for High Fidelity Trajectory Optimization Tools
NASA Technical Reports Server (NTRS)
Villa, Benjamin; Lantoine, Gregory; Sims, Jon; Whiffen, Gregory
2013-01-01
Many academic studies in spaceflight dynamics rely on simplified dynamical models, such as restricted three-body models or averaged forms of the equations of motion of an orbiter. In practice, the end result of these preliminary orbit studies needs to be transformed into more realistic models, in particular to generate good initial guesses for high-fidelity trajectory optimization tools like Mystic. This paper reviews and extends some of the approaches used in the literature to perform such a task, and explores the inherent trade-offs of such a transformation with a view toward automating it for the case of ballistic arcs. Sample test cases in the libration point regimes and small body orbiter transfers are presented.
NASA Astrophysics Data System (ADS)
Udomsungworagul, A.; Charnsethikul, P.
2018-03-01
This article introduces a methodology for solving large-scale two-phase linear programming, with a case study of multi-period animal diet problems under uncertainty in both raw-material nutrients and finished-product demand. The formulation allows multiple product formulas to be manufactured in the same time period and allows raw-material and finished-product inventory to be held. Dantzig-Wolfe decomposition, Benders decomposition and column generation techniques have been combined and applied to solve the problem. The proposed procedure was programmed using VBA and the Solver tool in Microsoft Excel. A case study was used and tested in terms of efficiency and effectiveness trade-offs.
Health-Care Referrals from Direct-to-Consumer Genetic Testing
Giovanni, Monica A.; Fickie, Matthew R.; Lehmann, Lisa S.; Green, Robert C.; Meckley, Lisa M.; Veenstra, David
2010-01-01
Background: Direct-to-consumer genetic testing (DTC-GT) provides personalized genetic risk information directly to consumers. Little is known about how and why consumers then communicate the results of this testing to health-care professionals. Aim: To query specialists in clinical genetics about their experience with individuals who consulted them after DTC-GT. Methods: Invitations to participate in a questionnaire were sent to three different groups of genetic professionals, totaling 4047 invitations, asking questions about individuals who consulted them after DTC-GT. For each case reported, respondents were asked to describe how the case was referred to them, the patient's rationale for DTC-GT, and the type of DTC-GT performed. Respondents were also queried about the consequences of the consultations in terms of additional testing ordered. The costs associated with each consultation were estimated. A clinical case series was compiled based upon clinician responses. Results: The invitation resulted in 133 responses describing 22 cases of clinical interactions following DTC-GT. Most consultations (59.1%) were self-referred to genetics professionals, but 31.8% were physician referred. Among respondents, 52.3% deemed the DTC-GT to be “clinically useful.” BRCA1/2 testing was considered clinically useful in 85.7% of cases; 35.7% of other tests were considered clinically useful. Subsequent referrals from genetics professionals to specialists and/or additional diagnostic testing were common, generating individual downstream costs estimated to range from $40 to $20,600. Conclusions: This clinical case series suggests that approximately half of clinical geneticists who saw patients after DTC-GT judged that testing was clinically useful, especially the BRCA1/2 testing. Further studies are needed in larger and more diverse populations to better understand the interactions between DTC-GT and the health-care system. PMID:20979566
Schroeder, Lee F; Robilotti, Elizabeth; Peterson, Lance R; Banaei, Niaz; Dowdy, David W
2014-02-01
Clostridium difficile infection (CDI) is the most common cause of infectious diarrhea in health care settings, and for patients presumed to have CDI, their isolation while awaiting laboratory results is costly. Newer rapid tests for CDI may reduce this burden, but the economic consequences of different testing algorithms remain unexplored. We used decision analysis from the hospital perspective to compare multiple CDI testing algorithms for adult inpatients with suspected CDI, assuming patient management according to laboratory results. CDI testing strategies included combinations of on-demand PCR (odPCR), batch PCR, lateral-flow diagnostics, plate-reader enzyme immunoassay, and direct tissue culture cytotoxicity. In the reference scenario, algorithms incorporating rapid testing were cost-effective relative to nonrapid algorithms. For every 10,000 symptomatic adults, relative to a strategy of treating nobody, lateral-flow glutamate dehydrogenase (GDH)/odPCR generated 831 true-positive results and cost $1,600 per additional true-positive case treated. Stand-alone odPCR was more effective and more expensive, identifying 174 additional true-positive cases at $6,900 per additional case treated. All other testing strategies were dominated by (i.e., more costly and less effective than) stand-alone odPCR or odPCR preceded by lateral-flow screening. A cost-benefit analysis (including estimated costs of missed cases) favored stand-alone odPCR in most settings but favored odPCR preceded by lateral-flow testing if a missed CDI case resulted in less than $5,000 of extended hospital stay costs and <2 transmissions, if lateral-flow GDH diagnostic sensitivity was >93%, or if the symptomatic carrier proportion among the toxigenic culture-positive cases was >80%. These results can aid guideline developers and laboratory directors who are considering rapid testing algorithms for diagnosing CDI.
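The dollar-per-additional-case figures quoted above are incremental cost-effectiveness ratios. A sketch of the arithmetic, with illustrative totals (the study's decision model is far more detailed) scaled to match the 174 cases at roughly $6,900 each:

```python
# Incremental cost-effectiveness ratio (ICER) between two testing
# strategies.  Inputs are illustrative, not the study's model outputs.
def icer(cost_a, effect_a, cost_b, effect_b):
    """Extra cost per additional true-positive case treated, A vs B."""
    return (cost_a - cost_b) / (effect_a - effect_b)

# Strategy A finds 174 more true positives per 10,000 patients than
# strategy B at about $1.2M extra cost -> roughly $6,900 per case.
print(round(icer(cost_a=2_530_000, effect_a=1005,
                 cost_b=1_330_000, effect_b=831)))
```

A strategy is "dominated" when another strategy has both a lower cost and a higher effect, which is how the abstract rules out the remaining algorithms.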
Composite Overwrap Fragmentation Observations, Concerns, and Recommendations
NASA Technical Reports Server (NTRS)
Bangham, Mike; Hovater, Mary
2017-01-01
A series of test activities has raised some concerns about the generation of orbital debris caused by failures of composite overwrapped pressure vessels (COPVs). These tests have indicated that a large number of composite fragments can be produced by either pressure-burst failures or high-speed impacts. A review of prior high-speed tests with COPVs indicates that other tests have also produced large numbers of composite fragments. As was the case with the tests referenced here, they tended to produce a large number of small composite fragments with relatively low velocities induced by the impact and/or gas expansion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, H; Chen, J; Pouliot, J
2015-06-15
Purpose: Deformable image registration (DIR) is a powerful tool with the potential to deformably map dose from one computed-tomography (CT) image to another. Errors in the DIR, however, will produce errors in the transferred dose distribution. We have proposed a software tool, called AUTODIRECT (automated DIR evaluation of confidence tool), which predicts voxel-specific dose mapping errors on a patient-by-patient basis. This work validates the effectiveness of AUTODIRECT to predict dose mapping errors with virtual and physical phantom datasets. Methods: AUTODIRECT requires 4 inputs: moving and fixed CT images and two noise scans of a water phantom (for noise characterization). Then, AUTODIRECT uses algorithms to generate test deformations and applies them to the moving and fixed images (along with processing) to digitally create sets of test images, with known ground-truth deformations that are similar to the actual one. The clinical DIR algorithm is then applied to these test image sets (currently 4). From these tests, AUTODIRECT generates spatial and dose uncertainty estimates for each image voxel based on a Student's t distribution. This work compares these uncertainty estimates to the actual errors made by the Velocity Deformable Multi Pass algorithm on 11 virtual and 1 physical phantom datasets. Results: For 11 of the 12 tests, the predicted dose error distributions from AUTODIRECT are well matched to the actual error distributions, within 1-6% for 10 virtual phantoms and 9% for the physical phantom. For one of the cases, though, the predictions underestimated the errors in the tail of the distribution. Conclusion: Overall, the AUTODIRECT algorithm performed well on the 12 phantom cases for Velocity and was shown to generate accurate estimates of dose warping uncertainty. AUTODIRECT is able to automatically generate patient-, organ-, and voxel-specific DIR uncertainty estimates. This ability would be useful for patient-specific DIR quality assurance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quaglioni, S.; Beck, B. R.
The Monte Carlo All Particle Method generator and collision physics library features two models that allow a particle to either up- or down-scatter due to collisions with material at finite temperature. The two models are presented and compared. Neutron interaction with matter through elastic collisions is used as the test case.
Mody, R K; Meyer, S; Trees, E; White, P L; Nguyen, T; Sowadsky, R; Henao, O L; Lafon, P C; Austin, J; Azzam, I; Griffin, P M; Tauxe, R V; Smith, K; Williams, I T
2014-05-01
We investigated an outbreak of 396 Salmonella enterica serotype I 4,5,12:i:- infections to determine the source. After 7 weeks of extensive hypothesis-generation interviews, no refined hypothesis was formed. Nevertheless, a case-control study was initiated. Subsequently, an iterative hypothesis-generation approach used by a single interviewing team identified brand A not-ready-to-eat frozen pot pies as a likely vehicle. The case-control study, modified to assess this new hypothesis, along with product testing indicated that the turkey variety of pot pies was responsible. Review of product labels identified inconsistent language regarding preparation, and the cooking instructions included undefined microwave wattage categories. Surveys found that most patients did not follow the product's cooking instructions and did not know their oven's wattage. The manufacturer voluntarily recalled pot pies and improved the product's cooking instructions. This investigation highlights the value of careful hypothesis-generation and the risks posed by frozen not-ready-to-eat microwavable foods.
[Methodology of psychiatric case histories].
Scherbaum, N; Mirzaian, E
1999-05-01
This paper deals with the methodology of psychiatric case histories. Three types of case histories are differentiated. The didactic case history teaches the typical aspects of a psychiatric disorder or treatment by using an individual patient as an example. In the heuristic case history, the individual case gives rise to challenges to established concepts or to new hypotheses. Such hypotheses, drawn from inductive reasoning, then have to be tested using representative samples. The focus of hermeneutic case histories is the significance of pathological behaviour and experience in the context of the biography of an individual patient. So-called psychopathographies of important historical figures can also be differentiated according to these types. Based on these methodological considerations, quality standards for the named types of case histories are stated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sepehri, Aliasghar; Loeffler, Troy D.; Chen, Bin, E-mail: binchen@lsu.edu
2014-08-21
A new method has been developed to generate bending angle trials to improve the acceptance rate and the speed of configurational-bias Monte Carlo. Whereas traditionally the trial geometries are generated from a uniform distribution, in this method we attempt to use the exact probability density function so that each geometry generated is likely to be accepted. In actual practice, due to the complexity of this probability density function, a numerical representation of this distribution function would be required. This numerical table can be generated a priori from the distribution function. This method has been tested on a united-atom model of alkanes including propane, 2-methylpropane, and 2,2-dimethylpropane, which are good representatives of both linear and branched molecules. It has been shown from these test cases that reasonable approximations can be made, especially for the highly branched molecules, to reduce drastically the dimensionality and correspondingly the amount of tabulated data that needs to be stored. Despite these approximations, the dependencies between the various geometrical variables can still be well considered, as evident from a nearly perfect acceptance rate achieved. For all cases, the bending angles were shown to be sampled correctly by this method, with an acceptance rate of at least 96% for 2,2-dimethylpropane and more than 99% for propane. Since only one trial is required to be generated for each bending angle (instead of thousands of trials required by the conventional algorithm), this method can dramatically reduce the simulation time. The profiling results of our Monte Carlo simulation code show that trial generation, which used to be the most time-consuming process, is no longer the time-dominating component of the simulation.
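The tabulated-distribution idea described above can be sketched as inverse-CDF sampling from a precomputed table. The bending-angle density below is a placeholder Gaussian around a typical alkane angle, not the paper's united-atom force field:

```python
import numpy as np

# Pre-tabulate the CDF of a bending-angle probability density once,
# then draw near-ideal trials by inverse-CDF lookup.  The density here
# (Gaussian around 112 degrees, sigma 6 degrees) is illustrative only.
theta = np.linspace(0.0, np.pi, 1000)
pdf = np.exp(-0.5 * ((theta - np.radians(112.0)) / np.radians(6.0)) ** 2)
pdf *= np.sin(theta)               # Jacobian for a bending angle in 3D
cdf = np.cumsum(pdf)
cdf /= cdf[-1]                     # normalize the table to [0, 1]

def draw_bending_angle(rng):
    """One trial angle; each draw lands in a high-probability region."""
    return np.interp(rng.random(), cdf, theta)

rng = np.random.default_rng(0)
samples = np.array([draw_bending_angle(rng) for _ in range(2000)])
print(np.degrees(samples.mean()))  # close to 112 degrees
```

Because every trial is drawn from (a table of) the target density itself, essentially every trial is acceptable, which is the source of the >96% acceptance rates reported above.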
Measurements and modeling of flow structure in the wake of a low profile wishbone vortex generator
NASA Technical Reports Server (NTRS)
Wendt, B. J.; Hingst, W. R.
1994-01-01
The results of an experimental examination of the vortex structures shed from a low-profile 'wishbone' generator are presented. The vortex generator height relative to the turbulent boundary layer was varied by testing two differently sized models. Measurements of the mean three-dimensional velocity field were conducted in cross-stream planes downstream of the vortex generators. In all cases, a counter-rotating vortex pair was observed. Individual vortices were characterized by three descriptors derived from the velocity data: circulation, peak vorticity, and cross-stream location of peak vorticity. Measurements in the cross plane at two axial locations behind the smaller wishbone characterize the downstream development of the vortex pairs. A single region of streamwise velocity deficit is shared by both vortex cores. This is in contrast to conventional generators, where each core coincides with a region of velocity deficit. The measured cross-stream velocities for each case are compared to an Oseen model with matching descriptors. The best comparison occurs with the data from the larger wishbone.
Smart pitch control strategy for wind generation system using doubly fed induction generator
NASA Astrophysics Data System (ADS)
Raza, Syed Ahmed
A smart pitch control strategy for a variable-speed doubly fed wind generation system is presented in this thesis. A complete dynamic model of the DFIG system is developed, consisting of the generator, wind turbine, aerodynamics, and the converter system. The proposed strategy uses an adaptive neural network to generate optimized controller gains for pitch control, with the pitch controller parameters tuned by a differential evolution intelligent technique. A back-propagation neural network has been trained to serve as the adaptive network, tuning its weights according to the system states in a variable wind-speed environment. Four cases are used to test the pitch controller, including step and sinusoidal changes in wind speed. The step change comprises both step-up and step-down changes in wind speed. The last case makes use of scaled wind data collected from the wind turbine installed at the King Fahd University beachfront. Simulation studies show that the differential-evolution-based adaptive neural network is capable of generating the appropriate control to deliver the maximum possible aerodynamic power available from the wind to the generator in an efficient manner by minimizing transients.
Thermal analysis of insulated north-wall greenhouse with solar collector under passive mode
NASA Astrophysics Data System (ADS)
Chauhan, Prashant Singh; Kumar, Anil
2018-04-01
An insulated north-wall greenhouse dryer has been fabricated and tested under no-load conditions in passive mode. Testing was conducted for two different cases: in Case-I the solar collector is kept inside the dryer, and in Case-II the dryer operates without the solar collector. The convective heat transfer coefficient and various dimensionless heat transfer numbers have been calculated for the thermal analysis. The maximum convective heat transfer coefficient, 52.18 W/m²°C, was found at 14 h during the first day for Case-I. The difference between the highest convective heat transfer coefficients of the two cases was 8.34 W/m²°C. The net heat gain curves inside the room are uniform and smooth for Case-I, indicating a steady heat generation process due to the presence of the solar collector inside the dryer. These results demonstrate the effectiveness of the solar collector and the insulated north wall. A suitable crop for drying can be selected by analysing the article's results.
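In analyses like the one above, the convective coefficient is typically recovered from a Nusselt-number correlation via h = Nu·k/L. A minimal sketch; the correlation constants and air properties below are generic illustrative values, not those fitted in the article:

```python
# Convective heat transfer coefficient from a natural-convection
# Nusselt correlation, h = Nu * k / L with Nu = C * (Gr * Pr)**n.
# C, n, and the property values are illustrative, not the article's.
def convective_h(gr, pr, k, L, C=0.54, n=0.25):
    """h in W/m^2 K for Grashof gr, Prandtl pr, conductivity k, length L."""
    nu = C * (gr * pr) ** n   # Nusselt number from the correlation
    return nu * k / L

# Plausible air properties for a warm enclosed space.
h = convective_h(gr=2.0e8, pr=0.7, k=0.028, L=1.0)
print(round(h, 2))
```

The Grashof-Prandtl product (the Rayleigh number) is the dimensionless group that, together with Nu, drives such calculations; forced or mixed convection inside the dryer would use different correlations and give much larger h values.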
Zanini, Surama F; Silva-Angulo, Angela B; Rosenthal, Amauri; Aliaga, Dolores Rodrigo; Martínez, Antonio
2014-04-01
The main goal of this work was to study the bacterial adaptive responses to antibiotics induced by a sublethal concentration of citral on first- and second-generation cells of Listeria monocytogenes serovar 4b (CECT 4032) and Salmonella enterica serovar Typhimurium (CECT 443). The first-generation cells were not pretreated with citral, while the second-generation cells were obtained from cells previously exposed to citral for 5 h. The trials were conducted at 37°C. The presence of citral in the culture medium and the antibiotic strips resulted in a reduced minimum inhibitory concentration (MIC) for the first-generation cells of Listeria monocytogenes serovar 4b and Salmonella Typhimurium. This result was observed for almost all the antibiotics, compared with the same microorganisms in the control group (without citral), which could represent an additive effect. For Listeria serovar 4b, the second-generation cells of the test group maintained the same susceptibility to antibiotics as cells in the control group and in the first-generation test group. The second-generation cells of the control group indicated that Salmonella Typhimurium maintained the same sensitivity to the antibiotics tested as the first generation of this group, except in the case of erythromycin, which exhibited an increased MIC value. With respect to the second-generation cells of Salmonella Typhimurium, the presence of citral decreased the antibiotic susceptibility for almost all of the antibiotics, except colistin, compared with the first generation of the test group, as seen in the increased MIC values. In conclusion, the presence of citral in the culture medium of Listeria 4b and Salmonella Typhimurium increased the antibiotic susceptibility of the first generations, while an increase in antibiotic resistance was observed in the second generation of Salmonella Typhimurium.
Bertona, E; Radice, M; Rodríguez, C H; Barberis, C; Vay, C; Famiglietti, A; Gutkind, G
2005-01-01
Enterobacter spp. are becoming increasingly frequent nosocomial pathogens with multiple resistance mechanisms to beta-lactam antibiotics. We carried out the phenotypic and genotypic characterization of beta-lactamases in 27 Enterobacter spp. (25 Enterobacter cloacae and 2 Enterobacter aerogenes), as well as an assessment of different extended-spectrum beta-lactamase (ESBL) screening methods. Resistance to third-generation cephalosporins was observed in 15/27 (63%) isolates. Twelve resistant isolates produced high-level chromosomally encoded AmpC beta-lactamase; 6 of them were also producers of PER-2. Resistance to third-generation cephalosporins in the remaining 3 isolates was due to the presence of ESBLs: PER-2 in 2 cases and CTX-M-2 in the other. Only CTX-M-2 production was detected with all tested cephalosporins using diffusion synergy tests, while cefepime improved ESBL detection in 7/8 PER-2 producers: 4/8 in the inhibitor approximation test and 7/8 with the double disk test using cefepime-containing disks with and without clavulanic acid. The dilution method, including cephalosporins with and without the inhibitor, detected 1/9 ESBL producers.
The Dynamical Core Model Intercomparison Project (DCMIP-2016): Results of the Supercell Test Case
NASA Astrophysics Data System (ADS)
Zarzycki, C. M.; Reed, K. A.; Jablonowski, C.; Ullrich, P. A.; Kent, J.; Lauritzen, P. H.; Nair, R. D.
2016-12-01
The 2016 Dynamical Core Model Intercomparison Project (DCMIP-2016) assesses the modeling techniques for global climate and weather models and was recently held at the National Center for Atmospheric Research (NCAR) in conjunction with a two-week summer school. Over 12 different international modeling groups participated in DCMIP-2016 and focused on the evaluation of the newest non-hydrostatic dynamical core designs for future high-resolution weather and climate models. The paper highlights the results of the third DCMIP-2016 test case, which is an idealized supercell storm on a reduced-radius Earth. The supercell storm test permits the study of a non-hydrostatic moist flow field with strong vertical velocities and associated precipitation. This test assesses the behavior of global modeling systems at extremely high spatial resolution and is used in the development of next-generation numerical weather prediction capabilities. In this regime the effective grid spacing is very similar to the horizontal scale of convective plumes, emphasizing resolved non-hydrostatic dynamics. The supercell test case sheds light on the physics-dynamics interplay and highlights the impact of diffusion on model solutions.
Generating and using truly random quantum states in Mathematica
NASA Astrophysics Data System (ADS)
Miszczak, Jarosław Adam
2012-01-01
The problem of generating random quantum states is of great interest from the quantum information theory point of view. In this paper we present a package for the Mathematica computing system harnessing a specific piece of hardware, namely the Quantis quantum random number generator (QRNG), for investigating statistical properties of quantum states. The described package implements a number of functions for generating random states, which use the Quantis QRNG as a source of randomness. It also provides procedures which can be used in simulations not related directly to quantum information processing.
Program summary:
Program title: TRQS
Catalogue identifier: AEKA_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKA_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 7924
No. of bytes in distributed program, including test data, etc.: 88 651
Distribution format: tar.gz
Programming language: Mathematica, C
Computer: Requires a Quantis quantum random number generator (QRNG, http://www.idquantique.com/true-random-number-generator/products-overview.html) and a recent version of Mathematica
Operating system: Any platform supporting Mathematica; tested with GNU/Linux (32 and 64 bit)
RAM: Case dependent
Classification: 4.15
Nature of problem: Generation of random density matrices.
Solution method: Use of a physical quantum random number generator.
Running time: Generating 100 random numbers takes about 1 second; generating 1000 random density matrices takes more than a minute.
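The package's core task, generating random density matrices, follows a standard construction (a Ginibre matrix normalized to unit trace, giving the Hilbert-Schmidt ensemble). A sketch using NumPy's software PRNG where the package would use the Quantis hardware QRNG:

```python
import numpy as np

def random_density_matrix(dim, rng):
    """Random density matrix from the Hilbert-Schmidt ensemble.

    A Ginibre matrix G (i.i.d. complex Gaussian entries) yields
    rho = G G† / tr(G G†), which is Hermitian, positive semidefinite,
    and has unit trace.  The paper's package draws its randomness from
    the Quantis hardware QRNG; a software PRNG stands in here.
    """
    g = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(42)
rho = random_density_matrix(4, rng)
print(np.allclose(np.trace(rho).real, 1.0))  # True: unit trace
print(np.allclose(rho, rho.conj().T))        # True: Hermitian
```

Swapping the randomness source changes only where `rng` gets its bits; the statistical construction of the state is identical.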
NASA Astrophysics Data System (ADS)
Wu, N.; Wang, J. H.; Shen, L.
2017-03-01
This paper presents a numerical investigation of the three-dimensional interaction between two bow shock waves in two environments, i.e. a ground high-enthalpy wind tunnel test and real space flight, using Fluent 15.0. The first bow shock wave, also called the induced shock wave, is generated by the leading edge of a hypersonic vehicle. The other bow shock wave, termed the objective shock wave, is generated by the cowl lip of the hypersonic inlet; in this paper the inlet is represented by a wedge-shaped nose cone. The interaction characteristics, including flow field structures, aerodynamic pressure and heating, are analyzed and compared between the ground test and real space flight. The analysis and comparison reveal the following important phenomena: 1) three-dimensional complicated flow structures appear in both cases, but only in the real space flight condition does a local two-dimensional type IV interaction appear; 2) the heat flux and pressure in the interaction region are much larger than those in the no-interaction region in both cases, but the peak values of the heat flux and pressure in real space flight are smaller than those in the ground test; 3) the interaction regions on the objective surface differ between the two cases, and there is a peak value displacement of 3 mm along the stagnation line.
A Model Based Security Testing Method for Protocol Implementation
Fu, Yu Long; Xin, Xiao Long
2014-01-01
The security of protocol implementations is important and difficult to verify. Since penetration testing usually relies on the experience of the security tester and the specific protocol specifications, a formal and automatic verification method is always desirable. In this paper, we propose an extended model of IOLTS to describe the legal roles and intruders of security protocol implementations, and then combine them to generate suitable test cases for verifying the security of a protocol implementation. PMID:25105163
Orbit Estimation of Non-Cooperative Maneuvering Spacecraft
2015-06-01
…only take on values that generate real sigma points; therefore, λ > −n. The additional weighting scheme is outlined in the following equations: κ = α² … orbit shapes resulted in a similar model weighting. Additional cases of this orbit type also resulted in heavily weighting smaller η-value models. It is … determined using both the symmetric and additional-parameters UTs. The best values for the weighting parameters are then compared for each test case…
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
As fracture mechanics material testing evolves, the governing test standards continue to be refined to better reflect the latest understanding of the physics of the fracture processes involved. The traditional format of ASTM fracture testing standards, utilizing equations expressed directly in the text of the standard to assess the experimental result, is self-limiting in the complexity that can be reasonably captured. The use of automated analysis techniques to draw upon a rich, detailed solution database for assessing fracture mechanics tests provides a foundation for a new approach to testing standards that enables routine users to obtain highly reliable assessments of tests involving complex, non-linear fracture behavior. Herein, the case for automating the analysis of tests of surface cracks in tension in the elastic-plastic regime is utilized as an example of how such a database can be generated and implemented for use in the ASTM standards framework. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sweetser, John David
2013-10-01
This report details Sculpt's implementation from a user's perspective. Sculpt is an automatic hexahedral mesh generation tool developed at Sandia National Labs by Steve Owen. 54 predetermined test cases are studied while varying the input parameters (Laplace iterations, optimization iterations, optimization threshold, number of processors) and measuring the quality of the resultant mesh. This information is used to determine the optimal input parameters to use for an unknown input geometry. The overall characteristics are covered in Chapter 1. The specific details of every case are then given in Appendix A. Finally, example Sculpt inputs are given in B.1 and B.2.
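The parameter study described above amounts to a sweep over the four input parameters with a quality score per run. A toy sketch of that loop (the quality model below is entirely made up; a real sweep would invoke the Sculpt executable and parse its reported metrics, e.g. minimum scaled Jacobian):

```python
from itertools import product

def mesh_quality(laplace_iters: int, opt_iters: int, threshold: float) -> float:
    """Hypothetical stand-in for Sculpt's mesh-quality metric, used only
    to make the sweep runnable; the coefficients are invented."""
    return min(1.0, 0.5 + 0.02 * laplace_iters + 0.03 * opt_iters - 0.1 * threshold)

# Sweep a small grid of (Laplace iterations, optimization iterations,
# optimization threshold) and keep the best-scoring combination.
grid = product([0, 5, 10], [0, 5, 10], [0.2, 0.4])
best = max(grid, key=lambda params: mesh_quality(*params))
print(best)  # (10, 10, 0.2)
```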
High-Speed Device-Independent Quantum Random Number Generation without a Detection Loophole
NASA Astrophysics Data System (ADS)
Liu, Yang; Yuan, Xiao; Li, Ming-Han; Zhang, Weijun; Zhao, Qi; Zhong, Jiaqiang; Cao, Yuan; Li, Yu-Huai; Chen, Luo-Kan; Li, Hao; Peng, Tianyi; Chen, Yu-Ao; Peng, Cheng-Zhi; Shi, Sheng-Cai; Wang, Zhen; You, Lixing; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei
2018-01-01
Quantum mechanics provides the means of generating genuine randomness that is impossible with deterministic classical processes. Remarkably, the unpredictability of randomness can be certified in a manner that is independent of the implementation devices. Here, we present an experimental study of device-independent quantum random number generation based on a detection-loophole-free Bell test with entangled photons. In the randomness analysis, without the independent identical distribution assumption, we consider the worst-case scenario in which the adversary launches the most powerful attacks against the devices. After accounting for statistical fluctuations and applying an 80 Gb × 45.6 Mb Toeplitz-matrix hashing, we achieve a final random bit rate of 114 bits/s, with a failure probability less than 10⁻⁵. This marks a critical step towards realistic applications in cryptography and fundamental physics tests.
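The Toeplitz-matrix hashing mentioned above is a standard two-universal randomness extractor: the output is the raw bit vector multiplied by a binary Toeplitz matrix modulo 2. A small self-contained sketch (with toy sizes; the experiment used an 80 Gb × 45.6 Mb matrix) might look like this:

```python
import numpy as np

def toeplitz_extract(raw_bits, seed_bits, n_out: int) -> np.ndarray:
    """Two-universal hashing with a binary Toeplitz matrix: out = T @ raw (mod 2).
    An n_out x n_in Toeplitz matrix is fully determined by its first column
    and first row, i.e. by n_out + n_in - 1 public seed bits."""
    n_in = len(raw_bits)
    assert len(seed_bits) == n_out + n_in - 1
    T = np.empty((n_out, n_in), dtype=np.uint8)
    for i in range(n_out):
        for j in range(n_in):
            # Constant along diagonals: entry depends only on i - j.
            T[i, j] = seed_bits[i - j + n_in - 1]
    return T.dot(np.asarray(raw_bits, dtype=np.uint8)) % 2

rng = np.random.default_rng(1)
raw = rng.integers(0, 2, size=64)            # raw, partially random bits
seed = rng.integers(0, 2, size=16 + 64 - 1)  # public uniform seed
out = toeplitz_extract(raw, seed, 16)        # 16 nearly uniform output bits
assert out.shape == (16,)
```

In practice the multiplication is done with FFT-based tricks rather than an explicit matrix, since the matrices involved are enormous.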
Molecular Diagnostics in Pathology: Time for a Next-Generation Pathologist?
Fassan, Matteo
2018-03-01
- Comprehensive molecular investigations of mainstream carcinogenic processes have led to the use of effective molecular targeted agents in most cases of solid tumors in clinical settings. - To update readers regarding the evolving role of the pathologist in the therapeutic decision-making process and the introduction of next-generation technologies into pathology practice. - Current literature on the topic, primarily sourced from the PubMed (National Center for Biotechnology Information, Bethesda, Maryland) database, was reviewed. - Adequate evaluation of cytologic-based and tissue-based predictive diagnostic biomarkers largely depends on both proper pathologic characterization and customized processing of biospecimens. Moreover, increased requests for molecular testing have paralleled the recent, sharp decrease in the tumor material available for analysis, material that currently comprises cytology specimens or, at minimum, small biopsies in most cases of metastatic/advanced disease. Traditional diagnostic pathology has been completely revolutionized by the introduction of next-generation technologies, which provide multigene, targeted mutational profiling, even in the most complex of clinical cases. Combining traditional and molecular knowledge, pathologists integrate the morphological, clinical, and molecular dimensions of a disease, leading to a proper diagnosis and, therefore, the most appropriate tailored therapy.
Subjective evaluation of next-generation video compression algorithms: a case study
NASA Astrophysics Data System (ADS)
De Simone, Francesca; Goldmann, Lutz; Lee, Jong-Seok; Ebrahimi, Touradj; Baroncini, Vittorio
2010-08-01
This paper describes the details and the results of the subjective quality evaluation performed at EPFL, as a contribution to the effort of the Joint Collaborative Team on Video Coding (JCT-VC) for the definition of the next-generation video coding standard. The performance of 27 coding technologies has been evaluated with respect to two H.264/MPEG-4 AVC anchors, considering high-definition (HD) test material. The test campaign involved a total of 494 naive observers and took place over a period of four weeks. While similar tests have been conducted as part of the standardization process of previous video coding technologies, the test campaign described in this paper is by far the most extensive in the history of video coding standardization. The obtained subjective quality scores show high consistency and support an accurate comparison of the performance of the different coding solutions.
Multi-Layer Artificial Neural Networks Based MPPT-Pitch Angle Control of a Tidal Stream Generator
Ghefiri, Khaoula; Bouallègue, Soufiene; Garrido, Izaskun; Garrido, Aitor J.; Haggège, Joseph
2018-01-01
Artificial intelligence technologies are widely investigated as a promising means of tackling complex and ill-defined problems. In this context, artificial neural network methodology has been considered an effective tool for handling renewable energy systems. Tidal Stream Generator (TSG) systems aim to provide clean and reliable electrical power, but the power captured from tidal currents is highly disturbed due to the swell effect and the periodicity of the tidal current phenomenon. In order to improve the quality of the generated power, this paper focuses on power smoothing control. For this purpose, a novel Artificial Neural Network (ANN) is investigated and implemented to provide the proper rotational speed reference and the blade pitch angle. The ANN supervisor adequately switches the system between variable speed and power limitation modes. In order to recover the maximum power from the tides, a rotational speed control is applied to the rotor side converter following the Maximum Power Point Tracking (MPPT) reference generated from the ANN block. In the case of strong tidal currents, a pitch angle control based on the ANN approach keeps the system operating within safe limits. Two study cases were performed to test the performance of the output power. Simulation results demonstrate that the implemented control strategies achieve a smoothed generated power in the case of swell disturbances. PMID:29695127
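The supervisor's two operating modes can be sketched with a simple rule-based stand-in for the paper's ANN (all names, gains, and numbers below are illustrative assumptions, not the authors' controller): below rated tidal speed the rotor-speed reference tracks the optimal tip-speed ratio (MPPT mode); above it, rotor speed is capped and the blades are pitched to limit power.

```python
def supervisor(tidal_speed: float, rated_speed: float,
               radius: float, lambda_opt: float) -> tuple:
    """Rule-based sketch of the ANN supervisor's mode switching.
    Returns (rotor speed reference in rad/s, pitch reference in deg)."""
    if tidal_speed <= rated_speed:
        omega_ref = lambda_opt * tidal_speed / radius   # MPPT (variable speed) mode
        pitch_ref = 0.0
    else:
        omega_ref = lambda_opt * rated_speed / radius   # power-limitation mode
        pitch_ref = 5.0 * (tidal_speed - rated_speed)   # illustrative pitch gain
    return omega_ref, pitch_ref

# Below rated: MPPT reference, zero pitch.  Above rated: capped speed, positive pitch.
print(supervisor(2.0, 3.0, radius=8.0, lambda_opt=6.0))  # (1.5, 0.0)
print(supervisor(3.5, 3.0, radius=8.0, lambda_opt=6.0))  # (2.25, 2.5)
```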
Czaplik, M; Bergrath, S; Rossaint, R; Thelen, S; Brodziak, T; Valentin, B; Hirsch, F; Beckers, S K; Brokmann, J C
2014-01-01
Demographic change, rising co-morbidity and an increasing number of emergencies are the main challenges that emergency medical services (EMS) in several countries worldwide are facing. In order to improve quality in EMS, highly trained personnel and well-equipped ambulances are essential. However several studies have shown a deficiency in qualified EMS physicians. Telemedicine emerges as a complementary system in EMS that may provide expertise and improve quality of medical treatment on the scene. Hence our aim is to develop and test a specific teleconsultation system. During the development process several use cases were defined and technically specified by medical experts and engineers in the areas of: system administration, start-up of EMS assistance systems, audio communication, data transfer, routine tele-EMS physician activities and research capabilities. Upon completion, technical field tests were performed under realistic conditions to test system properties such as robustness, feasibility and usability, providing end-to-end measurements. Six ambulances were equipped with telemedical facilities based on the results of the requirement analysis and 55 scenarios were tested under realistic conditions in one month. The results indicate that the developed system performed well in terms of usability and robustness. The major challenges were, as expected, mobile communication and data network availability. Third generation networks were only available in 76.4% of the cases. Although 3G (third generation), such as Universal Mobile Telecommunications System (UMTS), provides beneficial conditions for higher bandwidth, system performance for most features was also acceptable under adequate 2G (second generation) test conditions. An innovative concept for the use of telemedicine for medical consultations in EMS was developed. Organisational and technical aspects were considered and practical requirements specified. 
Since technical feasibility was demonstrated in these technical field tests, the next step would be to prove medical usefulness and technical robustness under real conditions in a clinical trial.
NASA Astrophysics Data System (ADS)
Pairan, M. Rasidi; Asmuin, Norzelawati; Isa, Nurasikin Mat; Sies, Farid
2017-04-01
Water mist sprays are used in a wide range of applications; however, suitability depends on the spray characteristics required by the particular application. This project studies the water droplet velocity and penetration angle generated by a newly developed mist spray with a flat spray pattern. The research comprised two parts, experiment and simulation. The experiment used the particle image velocimetry (PIV) method, ANSYS software was used for the simulation, and ImageJ software was used to measure the penetration angle. Three combinations of air and water pressure were tested: 1 bar (case A), 2 bar (case B) and 3 bar (case C). The flat spray generated by the newly developed nozzle was examined along a 9 cm vertical line at 8 cm from the nozzle orifice. The detailed analysis shows that the trend of the velocity-versus-distance graph gives good agreement between simulation and experiment for all pressure combinations. As the water and air pressure increased from 1 bar to 2 bar, the velocity and penetration angle also increased; however, for case C, run at 3 bar, the water droplet velocity increased but the penetration angle decreased. All data were then validated by calculating the error between experiment and simulation. Comparing the simulation data to the experimental data for all cases, the standard deviations for cases A, B and C are relatively small: 5.444, 0.8242 and 6.4023, respectively.
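The per-case validation metric, the spread of the pointwise experiment-versus-simulation error, can be computed as below (the velocity profiles are invented placeholders; the paper's actual data yield the deviations 5.444, 0.8242 and 6.4023 quoted above):

```python
import statistics

def deviation(experiment, simulation) -> float:
    """Population standard deviation of the pointwise
    experiment-minus-simulation error, one score per pressure case."""
    errors = [e - s for e, s in zip(experiment, simulation)]
    return statistics.pstdev(errors)

# Hypothetical velocity profiles (m/s) along the measurement line.
exp_b = [10.0, 11.0, 12.0]
sim_b = [9.5, 11.2, 11.8]
print(deviation(exp_b, sim_b))
```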
Ohara, Nobumasa; Kaneko, Masanori; Kitazawa, Masaru; Uemura, Yasuyuki; Minagawa, Shinichi; Miyakoshi, Masashi; Kaneko, Kenzo; Kamoi, Kyuzi
2017-02-06
Graves' disease is an autoimmune thyroid disorder characterized by hyperthyroidism, and patients exhibit thyroid-stimulating hormone receptor antibody. The major methods of measuring circulating thyroid-stimulating hormone receptor antibody include the thyroid-stimulating hormone-binding inhibitory immunoglobulin assays. Although the diagnostic accuracy of these assays has been improved, a minority of patients with Graves' disease test negative even on second-generation and third-generation thyroid-stimulating hormone-binding inhibitory immunoglobulins. We report a rare case of a thyroid-stimulating hormone-binding inhibitory immunoglobulin-positive patient with Graves' disease who showed rapid lowering of thyroid-stimulating hormone-binding inhibitory immunoglobulin levels following administration of the anti-thyroid drug thiamazole, but still experienced Graves' hyperthyroidism. A 45-year-old Japanese man presented with severe hyperthyroidism (serum free triiodothyronine >25.0 pg/mL; reference range 1.7 to 3.7 pg/mL) and tested weakly positive for thyroid-stimulating hormone-binding inhibitory immunoglobulins on second-generation tests (2.1 IU/L; reference range <1.0 IU/L). Within 9 months of treatment with oral thiamazole (30 mg/day), his thyroid-stimulating hormone-binding inhibitory immunoglobulin titers had normalized, but he experienced sustained hyperthyroidism for more than 8 years, requiring 15 mg/day of thiamazole to correct. During that period, he tested negative on all first-generation, second-generation, and third-generation thyroid-stimulating hormone-binding inhibitory immunoglobulin assays, but thyroid scintigraphy revealed diffuse and increased uptake, and thyroid ultrasound and color flow Doppler imaging showed typical findings of Graves' hyperthyroidism. 
The possible explanations for serial changes in the thyroid-stimulating hormone-binding inhibitory immunoglobulin results in our patient include the presence of thyroid-stimulating hormone receptor antibody, which is bioactive but less reactive on thyroid-stimulating hormone-binding inhibitory immunoglobulin assays, or the effect of reduced levels of circulating thyroid-stimulating hormone receptor antibody upon improvement of thyroid autoimmunity with thiamazole treatment. Physicians should keep in mind that patients with Graves' disease may show thyroid-stimulating hormone-binding inhibitory immunoglobulin assay results that do not reflect the severity of Graves' disease or indicate the outcome of the disease, and that active Graves' disease may persist even after negative results on thyroid-stimulating hormone-binding inhibitory immunoglobulin assays. Timely performance of thyroid function tests in combination with sensitive imaging tests, including thyroid ultrasound and scintigraphy, are necessary to evaluate the severity of Graves' disease and treatment efficacy.
Deformation of Cases in High Capacitance Value Wet Tantalum Capacitors under Environmental Stresses
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander
2016-01-01
Internal gas pressure in hermetic wet tantalum capacitors is created by air, electrolyte vapor, and gas generated by electrochemical reactions at the electrodes. This pressure increases substantially with temperature and time of operation due to excessive leakage currents. Deformation of the case occurs when the internal pressure exceeds the pressure of the environment, and can rise significantly when a part operates in space. Unlike cylindrical-case wet tantalum capacitors, which have external sealing by welding and internal sealing provided by the Teflon bushing and crimping of the case, button-case capacitors have no reliable internal sealing. Single-seal designs are used for the high-capacitance-value wet tantalum capacitors manufactured per DLA L&M drawings #04003, 04005, and 10011, and require additional analysis to assure their reliable application in space systems. In this work, leakage currents and case deformation of button-case capacitors were measured under different environmental test conditions. Recommendations for derating, screening and qualification testing are given. This work is a continuation of a series of NEPP reports related to the quality and reliability of wet tantalum capacitors.
Developing High PV Penetration Cases for Frequency Response Study of U.S. Western Interconnection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Jin; Zhang, Yingchen; Veda, Santosh
2017-04-11
Recent large penetrations of solar photovoltaic (PV) generation and the inertial characteristics of inverter-based generation technologies have caught the attention of the electric power industry in the United States. This paper presents a systematic approach to developing test cases of high penetrations of PV for the Western Interconnection. First, to examine the accuracy of the base case model, the Western Electricity Coordinating Council (WECC) model is validated using measurement data from synchronized phasor measurement units. Based on the 2022 Light Spring case, we developed four high PV penetration cases for the WECC system that are of interest to the industry: 5% PV + 15% wind, 25% PV + 15% wind, 45% PV + 15% wind, and 65% PV + 15% wind. Additionally, a method to project PV is proposed that is based on collected, realistic PV distribution information, including the current and future PV power plant locations and penetrations in the WECC system. Both utility-scale PV plants and residential rooftop PV are included in this study.
The Nature of Mathematics Anxiety.
ERIC Educational Resources Information Center
Cemen, Pamala Byrd
This paper attempts to generate a comprehensive description of the nature of mathematics anxiety through a synthesis of: (1) the general and test anxiety literatures applied to mathematics anxiety; (2) the mathematics anxiety literature; and (3) case studies developed through in-depth interviews. The in-depth interviews were conducted with seven…
Moorchung, Nikhil; Phillip, Joseph; Sarkar, Ravi Shankar; Prasad, Rupesh; Dutta, Vibha
2013-01-01
Hemoglobinopathies comprise entities that are generated by either abnormal hemoglobins or thalassemias. High-pressure liquid chromatography (HPLC) is one of the best methods for screening and detection of various hemoglobinopathies, but it has intrinsic interpretive problems. The study was designed to evaluate the different mutations seen in cases of hemoglobinopathies and to compare them with screening tests. Sixty-eight patients with hemoglobinopathies were screened by HPLC. Mutation studies of the beta globin gene were performed using the polymerase chain reaction (PCR)-based allele-specific Amplification Refractory Mutation System (ARMS). Molecular analysis for the sickle cell mutation was done by standard methods. The IVS 1/5 mutation was the commonest, seen in 26 (38.23%) of the cases. This was followed by the IVS 1/1, codon 41/42, codon 8/9, del 22, and codon 15 mutations and the -619 bp deletion. No mutation was seen in eight cases. There was 100% concordance between the sickle cell trait as diagnosed by HPLC and genetic testing. Our study underlines the importance of molecular testing in all cases of hemoglobinopathies. Although HPLC is a useful screening tool, molecular testing is very useful in accurately diagnosing the mutations. Molecular testing is especially applicable in cases with an abnormal hemoglobin (HbD, HbE and HbS) because there may be concomitant inheritance of a beta thalassemia mutation. Molecular testing is the gold standard when it comes to the diagnosis of hemoglobinopathies.
Clinical Validation of Targeted Next Generation Sequencing for Colon and Lung Cancers
D’Haene, Nicky; Le Mercier, Marie; De Nève, Nancy; Blanchard, Oriane; Delaunoy, Mélanie; El Housni, Hakim; Dessars, Barbara; Heimann, Pierre; Remmelink, Myriam; Demetter, Pieter; Tejpar, Sabine; Salmon, Isabelle
2015-01-01
Objective: Recently, Next Generation Sequencing (NGS) has begun to supplant other technologies for the gene mutation testing that is now required for targeted therapies. However, transfer of NGS technology to daily clinical practice requires validation. Methods: We validated the Ion Torrent AmpliSeq Colon and Lung cancer panel, interrogating 1850 hotspots in 22 genes, using the Ion Torrent Personal Genome Machine. First, we used commercial reference standards that carry mutations at defined allelic frequency (AF). Then, 51 colorectal adenocarcinomas (CRC) and 39 non-small cell lung carcinomas (NSCLC) were retrospectively analyzed. Results: Sensitivity and accuracy for detecting variants at an AF >4% were 100% for the commercial reference standards. Among the 90 cases, 89 (98.9%) were successfully sequenced. Among the 86 samples for which NGS and the reference test were both informative, 83 showed concordant results between NGS and the reference test, i.e. KRAS and BRAF for CRC and EGFR for NSCLC, with the 3 discordant cases each characterized by an AF <10%. Conclusions: Overall, the AmpliSeq colon/lung cancer panel was specific and sensitive for mutation analysis of gene panels and can be incorporated into daily clinical practice. PMID:26366557
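The concordance check described in the Results can be sketched as a comparison of variant sets after applying the validated allele-frequency threshold (the variant names and frequencies below are hypothetical examples, not the study's data):

```python
def concordance(ngs_calls: dict, reference_calls, af_threshold: float = 0.04):
    """Compare NGS variant calls (variant -> allele frequency) against a
    reference assay's calls, counting only NGS variants above the
    validated AF threshold.  Returns (concordant?, discordant variants)."""
    ngs = {variant for variant, af in ngs_calls.items() if af > af_threshold}
    ref = set(reference_calls)
    return ngs == ref, ngs ^ ref

# Hypothetical sample: a KRAS call at 6% AF matches the reference assay,
# while a 3% BRAF call falls below the validated detection threshold.
ok, discordant = concordance({"KRAS G12D": 0.06, "BRAF V600E": 0.03}, ["KRAS G12D"])
print(ok, discordant)  # True set()
```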
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferraioli, Luigi; Hueller, Mauro; Vitale, Stefano
The scientific objectives of the LISA Technology Package experiment on board the LISA Pathfinder mission demand accurate calibration and validation of the data analysis tools in advance of the mission launch. The level of confidence required in the mission outcomes can be reached only by intensively testing the tools on synthetically generated data. A flexible procedure allowing the generation of cross-correlated stationary noise time series was set up. A multichannel time series with the desired cross-correlation behavior can be generated once a model for the multichannel cross-spectral matrix is provided. The core of the procedure comprises a noise-coloring multichannel filter, designed via a frequency-by-frequency eigendecomposition of the model cross-spectral matrix and a subsequent fit in the Z domain. The common problem of initial transients in a filtered time series is solved with a proper initialization of the filter recursion equations. The noise generator performance was tested in a two-dimensional case study of the closed-loop LISA Technology Package dynamics along the two principal degrees of freedom.
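The eigendecomposition step can be illustrated in a degenerate case: for a frequency-independent (white) cross-spectral matrix, decomposing M = V diag(w) Vᵀ and mixing independent white channels through V diag(√w) yields noise with the target channel correlations. This sketch (ours, not the paper's code, which does this frequency by frequency and then fits a recursive filter) shows the idea:

```python
import numpy as np

def correlated_noise(csm: np.ndarray, n_samples: int,
                     rng: np.random.Generator) -> np.ndarray:
    """Generate stationary multichannel noise whose channel covariance
    matches `csm`, by mixing independent white channels through the
    eigendecomposition  M = V diag(w) V^T  ->  L = V diag(sqrt(w))."""
    w, v = np.linalg.eigh(csm)
    mixing = v @ np.diag(np.sqrt(np.clip(w, 0.0, None)))
    white = rng.standard_normal((csm.shape[0], n_samples))
    return mixing @ white

target = np.array([[1.0, 0.6],
                   [0.6, 1.0]])
x = correlated_noise(target, 200_000, np.random.default_rng(2))
print(np.round(np.cov(x), 2))  # sample covariance close to the target
```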
Structural integrity of power generating speed bumps made of concrete foam composite
NASA Astrophysics Data System (ADS)
Syam, B.; Muttaqin, M.; Hastrino, D.; Sebayang, A.; Basuki, W. S.; Sabri, M.; Abda, S.
2018-02-01
In this paper, concrete foam composite speed bumps were designed to generate electrical power by utilizing the movements of commuting vehicles on highways, streets, parking gates, and the drive-thru stations of fast food restaurants. The speed bumps are subjected to the loads generated by vehicles passing over the power-generating mechanical system. This paper focuses on the structural integrity of the speed bumps; the electrical power generation itself is discussed in a companion paper. Structural integrity concerns a structure's ability to support its design loads without breaking, and includes the study of past structural failures in order to prevent failures in future designs. Concrete foam composites were used for the speed bumps, with reinforcement material selected from the empty fruit bunches of oil palm. In this study, the speed bump material and structure were subjected to various tests to obtain their physical and mechanical properties. To analyze the structural stability of the speed bumps, several models were produced and tested in our speed bump test station. We also conducted an FEM-based computer simulation to analyze the stress responses of the speed bump structures. It was found that speed bump type 1 significantly reduced the radial voltage. In addition, a speed bump equipped with a steel casing is also suitable for use as a component in generating electrical energy.
Is receptor oligomerization causally linked to activation of the EGF receptor kinase?
NASA Technical Reports Server (NTRS)
Rintoul, D. A.; Spooner, B. S. (Principal Investigator)
1992-01-01
Transduction of a signal from an extracellular peptide hormone to produce an intracellular response is often mediated by a cell surface receptor, which is usually a glycoprotein. The secondary intracellular signal(s) generated after hormone binding to the receptor have been intensively studied. The nature of the primary signal generated by ligand binding to the receptor is understood less well in most cases. The particular case of the epidermal growth factor (EGF) receptor is analyzed, and evidence for or against two dissimilar models of primary signal transduction is reviewed. Evidence for the most widely accepted current model is found to be unconvincing. Evidence for the other model is substantial but indirect; a direct test of this model remains to be done.
Dispersed storage and generation case studies
NASA Technical Reports Server (NTRS)
Bahrami, K.; Stallkamp, J. A.; Walton, A.
1980-01-01
Three installations utilizing separate dispersed storage and generation (DSG) technologies were investigated. Each system is described in terms of its costs and controls. Selected institutional and environmental issues are discussed, including life-cycle costs. No unresolved technical, environmental, or institutional problems were encountered in the installations. The wind and solar photovoltaic DSG systems were installed for test purposes and appear to be presently uneconomical. However, a number of factors are decreasing the cost of DSG relative to conventional alternatives, and an increased DSG penetration level may be expected in the future.
A Different Approach to the Generation of Patient Management Problems from a Knowledge-Based System
Barriga, Rosa Maria
1988-01-01
Several strategies are proposed for generating Patient Management Problems from a Knowledge Base while avoiding inconsistencies in the results. These strategies are based on a different Knowledge Base structure and on the use of case introductions that describe the patient attributes which are not disease-dependent. This methodology proved effective in a recent pilot test and is on its way to implementation as part of an educational program at the CWRU School of Medicine.
Design Flexibility for Uncertain Distributed Generation from Photovoltaics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Krishnamurthy, Dheepak; Wu, Hongyu
2016-12-12
Uncertainty in the future adoption patterns for distributed energy resources (DERs) introduces a challenge for electric distribution system planning. This paper explores the potential for flexibility in design - also known as real options - to identify design solutions that may never emerge when future DER patterns are treated as deterministic. A test case for storage system design with uncertain distributed generation for solar photovoltaics (DGPV) demonstrates this approach and is used to study sensitivities to a range of techno-economic assumptions.
Adaption of unstructured meshes using node movement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carpenter, J.G.; McRae, V.D.S.
1996-12-31
The adaption algorithm of Benson and McRae is modified for application to unstructured grids: the weight-function generation was adapted accordingly, and node movement was limited to prevent crossover. A NACA 0012 airfoil is used as a test case to evaluate the modified algorithm on unstructured grids, with results compared to those obtained by Warren. An adaptive mesh solution for the Suddhoo and Hall four-element airfoil is included as a demonstration case.
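A one-dimensional cartoon of the idea (illustrative only, not the Benson-McRae algorithm itself) treats edge weights like spring stiffnesses, so heavily weighted edges end up with finer spacing, and caps each node's move to prevent crossover:

```python
# 1-D sketch of weight-driven node movement with a movement limiter.
# The weighted-average form and the limiter fraction are assumptions.

def adapt_nodes(x, w):
    """Move interior nodes toward the weighted average of their
    neighbors; cap each move at a fraction of the local cell size."""
    new_x = list(x)
    for i in range(1, len(x) - 1):
        wl, wr = w[i - 1], w[i]               # weights of the two edges
        target = (wl * x[i - 1] + wr * x[i + 1]) / (wl + wr)
        move = target - x[i]
        limit = 0.45 * min(x[i] - x[i - 1], x[i + 1] - x[i])
        move = max(-limit, min(limit, move))  # prevent crossover
        new_x[i] = x[i] + move
    return new_x

x = [0.0, 0.25, 0.5, 0.75, 1.0]
w = [1.0, 4.0, 4.0, 1.0]   # high weight on middle edges -> fine spacing
for _ in range(20):
    x = adapt_nodes(x, w)
assert all(a < b for a, b in zip(x, x[1:]))  # ordering preserved
```

Because each node moves at most 45% of a gap and its neighbor at most 45% of the same gap, the combined motion can never close a cell completely.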
A novel torsional exciter for modal vibration testing of large rotating machinery
NASA Astrophysics Data System (ADS)
Sihler, Christof
2006-10-01
A novel exciter for applying a dynamic torsional force to a rotating structure is presented in this paper. It has been developed at IPP in order to perform vibration tests with shaft assemblies of large flywheel generators (synchronous machines). The electromagnetic exciter (shaker) needs no fixture to the rotating shaft because the torque is applied by means of the stator winding of an electrical machine. Therefore, the exciter can most easily be applied in cases where a three-phase electrical machine (a motor or generator) is part of the shaft assembly. The oscillating power for the shaker is generated in a separate current-controlled DC circuit with an inductor acting as a buffer storage of magnetic energy. An AC component with adjustable frequency is superimposed on the inductor current in order to generate pulsating torques acting on the rotating shaft with the desired waveform and frequency. Since this torsional exciter does not require an external power source, can easily be installed (without contact to the rotating structure) and provides dynamic torsional forces which are sufficient for multi-megawatt applications, it is best suited for on-site tests of large rotating machinery.
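The excitation principle described above, a DC inductor current with a superimposed AC component at the desired torsional frequency, can be sketched numerically; the amplitudes and frequency below are illustrative assumptions, not values from the paper.

```python
import math

# Illustrative waveform for the exciter principle: a DC inductor
# current with superimposed AC ripple at the torsional excitation
# frequency. Amplitudes and frequency are assumed values.

def exciter_current(t, i_dc=1000.0, i_ac=100.0, f_hz=24.0):
    """Current (A) in the DC circuit with superimposed AC component."""
    return i_dc + i_ac * math.sin(2.0 * math.pi * f_hz * t)

samples = [exciter_current(t / 1000.0) for t in range(100)]  # 0..99 ms
# The current stays positive (unidirectional DC link) while oscillating.
assert min(samples) > 0.0
```

The pulsating torque on the shaft then follows the AC component of this current through the machine's stator winding.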
Suh, James H; Schrock, Alexa B; Johnson, Adrienne; Lipson, Doron; Gay, Laurie M; Ramkissoon, Shakti; Vergilio, Jo-Anne; Elvin, Julia A; Shakir, Abdur; Ruehlman, Peter; Reckamp, Karen L; Ou, Sai-Hong Ignatius; Ross, Jeffrey S; Stephens, Philip J; Miller, Vincent A; Ali, Siraj M
2018-03-14
In our recent study of cases positive for epidermal growth factor receptor (EGFR) exon 19 deletions using comprehensive genomic profiling (CGP), 17/77 (22%) patients with prior standard of care (SOC) EGFR testing results available were previously negative for exon 19 deletion. Our aim was to compare the detection rates of CGP versus SOC testing for well-characterized sensitizing EGFR point mutations (pm) in our 6,832-patient cohort. DNA was extracted from 40 microns of formalin-fixed paraffin-embedded sections from 6,832 consecutive cases of non-small cell lung cancer (NSCLC) of various histologies (2012-2015). CGP was performed using a hybrid capture, adaptor ligation-based next-generation sequencing assay to a mean coverage depth of 576×. Genomic alterations (pm, small indels, copy number changes and rearrangements) involving EGFR were recorded for each case and compared with prior testing results if available. Overall, there were 482 instances of EGFR exon 21 L858R (359) and L861Q (20), exon 18 G719X (73) and exon 20 S768I (30) pm, of which 103 unique cases had prior EGFR testing results that were available for review. Of these 103 cases, CGP identified 22 patients (21%) with sensitizing EGFR pm that were not detected by SOC testing, including 9/75 (12%) patients with L858R, 4/7 (57%) patients with L861Q, 8/20 (40%) patients with G719X, and 4/7 (57%) patients with S768I pm (some patients had multiple EGFR pm). In cases with available clinical data, benefit from small-molecule inhibitor therapy was observed. CGP, even when applied to low tumor purity clinical-grade specimens, can detect well-known EGFR pm in NSCLC patients that would otherwise not be detected by SOC testing. Taken together with EGFR exon 19 deletions, over 20% of patients who are positive for EGFR-activating mutations using CGP are previously negative by SOC EGFR mutation testing, suggesting that thousands of such patients per year in the U.S.
alone could experience improved clinical outcomes when hybrid capture-based CGP is used to inform therapeutic decisions. This study points out that genomic profiling, as based on hybrid capture next-generation sequencing, can identify lung cancer patients with point mutation in epidermal growth factor receptor (EGFR) missed by standard molecular testing who can likely benefit from anti-EGFR targeted therapy. Beyond the specific findings regarding false-negative point mutation testing for EGFR, this study highlights the need for oncologists and pathologists to be cognizant of the performance characteristics of testing deployed and the importance of clinical intuition in questioning the results of laboratory testing. © AlphaMed Press 2018.
2016-09-01
Some technologies that were not included in the analysis (due to site-level evaluations), but could be added in the future, include: wind turbines … number of entities involved in the procurement, operation, maintenance, testing, and fueling of the generators, detailed inventory and cost data is … difficult to obtain. The DPW is often understaffed, leading to uneven testing and maintenance of the equipment despite their best efforts. The
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patton, A.D.; Ayoub, A.K.; Singh, C.
1982-07-01
This report describes the structure and operation of prototype computer programs developed for a Monte Carlo simulation model, GENESIS, and for two analytical models, OPCON and OPPLAN. It includes input data requirements and sample test cases.
Consumer Online Search and New-Product Marketing
ERIC Educational Resources Information Center
Kim, Ho
2013-01-01
This dissertation contains three essays that study the implications of online search activity for new-product marketing. Using the U.S. motion picture industry as a test case, the first essay examines the dynamic causal relationship between traditional media, consumers' media generation activity, media consumption activity, and market demand…
Fassan, Matteo; Rachiglio, Anna Maria; Cappellesso, Rocco; Antonello, Davide; Amato, Eliana; Mafficini, Andrea; Lambiase, Matilde; Esposito, Claudia; Bria, Emilio; Simonato, Francesca; Scardoni, Maria; Turri, Giona; Chilosi, Marco; Tortora, Giampaolo; Fassina, Ambrogio; Normanno, Nicola
2013-01-01
Identification of driver mutations in lung adenocarcinoma has led to development of targeted agents that are already approved for clinical use or are in clinical trials. Therefore, the number of biomarkers that will be needed to assess is expected to rapidly increase. This calls for the implementation of methods probing the mutational status of multiple genes for inoperable cases, for which limited cytological or bioptic material is available. Cytology specimens from 38 lung adenocarcinomas were subjected to the simultaneous assessment of 504 mutational hotspots of 22 lung cancer-associated genes using 10 nanograms of DNA and Ion Torrent PGM next-generation sequencing. Thirty-six cases were successfully sequenced (95%). In 24/36 cases (67%) at least one mutated gene was observed, including EGFR, KRAS, PIK3CA, BRAF, TP53, PTEN, MET, SMAD4, FGFR3, STK11, MAP2K1. EGFR and KRAS mutations, respectively found in 6/36 (16%) and 10/36 (28%) cases, were mutually exclusive. Nine samples (25%) showed concurrent alterations in different genes. The next-generation sequencing test used is superior to current standard methodologies, as it interrogates multiple genes and requires limited amounts of DNA. Its applicability to routine cytology samples might allow a significant increase in the fraction of lung cancer patients eligible for personalized therapy. PMID:24236184
Transient and Steady-state Tests of the Space Power Research Engine with Resistive and Motor Loads
NASA Technical Reports Server (NTRS)
Rauch, Jeffrey S.; Kankam, M. David
1995-01-01
The NASA Lewis Research Center (LeRC) has been testing free-piston Stirling engine/linear alternators (FPSE/LA) to develop advanced power convertors for space-based electrical power generation. Tests reported herein were performed to evaluate the interaction and transient behavior of FPSE/LA-based power systems with typical user loads. Both resistive and small induction motor loads were tested with the space power research engine (SPRE) power system. Tests showed that the control system could maintain constant long term voltage and stable periodic operation over a large range of engine operating parameters and loads. Modest resistive load changes were shown to cause relatively large voltage and, therefore, piston and displacer amplitude excursions. Starting a typical small induction motor was shown to cause large and, in some cases, deleterious voltage transients. The tests identified the need for more effective controls, if FPSE/LAs are to be used for stand-alone power systems. The tests also generated a large body of transient dynamic data useful for analysis code validation.
Transient and steady-state tests of the space power research engine with resistive and motor loads
NASA Astrophysics Data System (ADS)
Rauch, Jeffrey S.; Kankam, M. David
1995-01-01
The NASA Lewis Research Center (LeRC) has been testing free-piston Stirling engine/linear alternators (FPSE/LA) to develop advanced power convertors for space-based electrical power generation. Tests reported herein were performed to evaluate the interaction and transient behavior of FPSE/LA-based power systems with typical user loads. Both resistive and small induction motor loads were tested with the space power research engine (SPRE) power system. Tests showed that the control system could maintain constant long term voltage and stable periodic operation over a large range of engine operating parameters and loads. Modest resistive load changes were shown to cause relatively large voltage and, therefore, piston and displacer amplitude excursions. Starting a typical small induction motor was shown to cause large and, in some cases, deleterious voltage transients. The tests identified the need for more effective controls, if FPSE/LAs are to be used for stand-alone power systems. The tests also generated a large body of transient dynamic data useful for analysis code validation.
High-Speed Device-Independent Quantum Random Number Generation without a Detection Loophole.
Liu, Yang; Yuan, Xiao; Li, Ming-Han; Zhang, Weijun; Zhao, Qi; Zhong, Jiaqiang; Cao, Yuan; Li, Yu-Huai; Chen, Luo-Kan; Li, Hao; Peng, Tianyi; Chen, Yu-Ao; Peng, Cheng-Zhi; Shi, Sheng-Cai; Wang, Zhen; You, Lixing; Ma, Xiongfeng; Fan, Jingyun; Zhang, Qiang; Pan, Jian-Wei
2018-01-05
Quantum mechanics provides the means of generating genuine randomness that is impossible with deterministic classical processes. Remarkably, the unpredictability of randomness can be certified in a manner that is independent of the implementation devices. Here, we present an experimental study of device-independent quantum random number generation based on a detection-loophole-free Bell test with entangled photons. In the randomness analysis, without the independent identical distribution assumption, we consider the worst-case scenario in which the adversary launches the most powerful quantum attacks. After considering statistical fluctuations and applying an 80 Gb×45.6 Mb Toeplitz matrix hashing, we achieve a final random bit rate of 114 bits/s, with a failure probability less than 10^{-5}. This marks a critical step towards realistic applications in cryptography and fundamental physics tests.
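Toeplitz-matrix hashing, the post-processing family used above, multiplies the raw bit string by a random Toeplitz matrix over GF(2). A toy extractor is sketched below; the dimensions are tiny for illustration, unlike the 80 Gb × 45.6 Mb matrix of the experiment.

```python
import random

# Toy Toeplitz-hashing randomness extractor: 16 raw bits -> 4 output
# bits. An m x n Toeplitz matrix is fully defined by n + m - 1 seed
# bits (its first row and first column).

def toeplitz_extract(raw_bits, seed_bits, m):
    """Multiply an m x n Toeplitz matrix by the raw bit vector mod 2."""
    n = len(raw_bits)
    assert len(seed_bits) == n + m - 1
    out = []
    for i in range(m):
        # Row i of the Toeplitz matrix: T[i][j] = seed[i - j + n - 1]
        acc = 0
        for j in range(n):
            acc ^= seed_bits[i - j + n - 1] & raw_bits[j]
        out.append(acc)
    return out

rng = random.Random(0)           # deterministic toy data
raw = [rng.randint(0, 1) for _ in range(16)]
seed = [rng.randint(0, 1) for _ in range(16 + 4 - 1)]
key = toeplitz_extract(raw, seed, m=4)
```

In a real extractor the compression ratio m/n is chosen from the certified min-entropy of the raw data and the target failure probability.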
NASA Astrophysics Data System (ADS)
Cordes, V.
1983-09-01
Three solar generator projects in developing countries are discussed. A brackish-water desalination unit was developed and built: a 2.4 kW solar generator supplies the desalination unit, which produces 1.5 cu m of drinking water per day, with the pump installed for a hoisting depth of 20 m. The unit switches on when the solar energy is sufficiently high; a 160 Ah battery guarantees constant operation. A photovoltaically supplied UHF transmitter was installed: a 3.6 kW solar generator and a battery make the transmitter and air-traffic warning illumination self-sufficient, and a small diesel generator is installed for emergencies. Experience shows that the solar generator (4 kW) and battery have to be enlarged. An emergency solar energy generator was installed in a hospital. That solar generator has a maximum power of 150 W; together with a battery, it can deliver the energy for a minimum of 5 hr of emergency illumination.
The IDEA model: A single equation approach to the Ebola forecasting challenge.
Tuite, Ashleigh R; Fisman, David N
2018-03-01
Mathematical modeling is increasingly accepted as a tool that can inform disease control policy in the face of emerging infectious diseases, such as the 2014-2015 West African Ebola epidemic, but little is known about the relative performance of alternate forecasting approaches. The RAPIDD Ebola Forecasting Challenge (REFC) tested the ability of eight mathematical models to generate useful forecasts in the face of simulated Ebola outbreaks. We used a simple, phenomenological single-equation model (the "IDEA" model), which relies only on case counts, in the REFC. Model fits were performed using a maximum likelihood approach. We found that the model performed reasonably well relative to other more complex approaches, with performance metrics ranked on average 4th or 5th among participating models. IDEA appeared better suited to long- than short-term forecasts, and could be fit using nothing but reported case counts. Several limitations were identified, including difficulty in identifying epidemic peak (even retrospectively), unrealistically precise confidence intervals, and difficulty interpolating daily case counts when using a model scaled to epidemic generation time. More realistic confidence intervals were generated when case counts were assumed to follow a negative binomial, rather than Poisson, distribution. Nonetheless, IDEA represents a simple phenomenological model, easily implemented in widely available software packages that could be used by frontline public health personnel to generate forecasts with accuracy that approximates that which is achieved using more complex methodologies. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.
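The IDEA (Incidence Decay and Exponential Adjustment) model referred to above is indeed a single equation: incident cases in epidemic generation t are I(t) = (R0 / (1 + d)^t)^t, where d is a discounting factor capturing control. The sketch below uses illustrative parameter values, not fitted REFC values.

```python
# The IDEA model in one line: I(t) = (R0 / (1 + d)**t)**t, where t is
# measured in epidemic generations and d discounts transmission over
# time. R0 and d here are illustrative, not fitted values.

def idea_incidence(t, r0=2.0, d=0.05):
    return (r0 / (1.0 + d) ** t) ** t

series = [idea_incidence(t) for t in range(30)]
peak = max(range(30), key=lambda t: series[t])
# With d > 0 the simulated epidemic grows, peaks, and dies out.
assert 0 < peak < 29
assert series[29] < series[peak]
```

Taking logarithms shows why: ln I(t) = t ln R0 - t^2 ln(1 + d), a downward parabola in t, so the model always produces a single rise-and-fall wave.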
Difference between the vocalizations of two sister species of pigeons explained in dynamical terms.
Alonso, R Gogui; Kopuchian, Cecilia; Amador, Ana; Suarez, Maria de Los Angeles; Tubaro, Pablo L; Mindlin, Gabriel B
2016-05-01
Vocal communication is a unique example where the nonlinear nature of the periphery can give rise to complex sounds even when driven by simple neural instructions. In this work we studied the case of two closely related bird species, Patagioenas maculosa and Patagioenas picazuro, whose vocalizations differ only in timbre. The temporal modulation of the fundamental frequency is similar in both cases, differing only in the existence of sidebands around the fundamental frequency in P. maculosa. We tested the hypothesis that the qualitative difference between these vocalizations lies in the nonlinear nature of the syrinx. In particular, we propose that the roughness of maculosa's vocalizations is due to an asymmetry between the right and left vibratory membranes, whose nonlinear dynamics generate the sound. To test the hypothesis, we generated a biomechanical model for vocal production with an asymmetry parameter Q with which we can control the level of asymmetry between these membranes. Using this model we generated synthetic vocalizations with the principal acoustic features of both species. In addition, we confirmed the anatomical predictions by making post mortem inspection of the syrinxes, showing that the species with tonal song (picazuro) has a more symmetrical pair of membranes compared to maculosa.
Difference between the vocalizations of two sister species of pigeons explained in dynamical terms
Alonso, R. Gogui; Kopuchian, Cecilia; Amador, Ana; de los Angeles Suarez, Maria; Tubaro, Pablo L.; Mindlin, Gabriel B.
2016-01-01
Vocal communication is a unique example where the nonlinear nature of the periphery can give rise to complex sounds even when driven by simple neural instructions. In this work we studied the case of two closely related bird species, Patagioenas maculosa and Patagioenas picazuro, whose vocalizations differ only in timbre. The temporal modulation of the fundamental frequency is similar in both cases, differing only in the existence of sidebands around the fundamental frequency in Patagioenas maculosa. We tested the hypothesis that the qualitative difference between these vocalizations lies in the nonlinear nature of the syrinx. In particular, we propose that the roughness of maculosa's vocalizations is due to an asymmetry between the right and left vibratory membranes, whose nonlinear dynamics generate the sound. To test the hypothesis, we generated a biomechanical model for vocal production with an asymmetry parameter Q with which we can control the level of asymmetry between these membranes. Using this model we generated synthetic vocalizations with the principal acoustic features of both species. In addition, we confirmed the anatomical predictions by making post-mortem inspection of the syrinxes, showing that the species with tonal song (picazuro) has a more symmetrical pair of membranes compared to maculosa. PMID:27033354
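As a toy numerical illustration (not the authors' biomechanical syrinx model), two slightly detuned oscillations, standing in for asymmetric left and right membranes, sum to an amplitude-modulated signal, i.e. sidebands around the mean frequency, perceived as roughness. The frequencies below are arbitrary assumptions.

```python
import math

# Two detuned sinusoids (a crude stand-in for asymmetric membranes)
# sum to a carrier at the mean frequency modulated at the difference
# frequency -- sidebands, heard as "roughness". Frequencies assumed.

f1, f2 = 440.0, 460.0      # assumed membrane frequencies (Hz)
dt, n = 1e-4, 2000         # 0.2 s sampled at 10 kHz

signal = [math.sin(2 * math.pi * f1 * k * dt)
          + math.sin(2 * math.pi * f2 * k * dt)
          for k in range(n)]

# Product identity: the sum equals a 450 Hz carrier modulated at 10 Hz:
# 2 * cos(pi*(f2-f1)*t) * sin(pi*(f1+f2)*t).
for k in (0, 500, 1234):
    t = k * dt
    product = (2 * math.cos(math.pi * (f2 - f1) * t)
               * math.sin(math.pi * (f1 + f2) * t))
    assert abs(signal[k] - product) < 1e-9
```

When the two membranes are symmetric (f1 = f2) the modulation term is constant and the sound is tonal, matching the picazuro case.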
Hybrid EEG-fNIRS-Based Eight-Command Decoding for BCI: Application to Quadcopter Control.
Khan, Muhammad Jawad; Hong, Keum-Shik
2017-01-01
In this paper, a hybrid electroencephalography-functional near-infrared spectroscopy (EEG-fNIRS) scheme to decode eight active brain commands from the frontal brain region for a brain-computer interface is presented. A total of eight commands are decoded: by fNIRS, positioned over the prefrontal cortex, and by EEG, around the frontal, parietal, and visual cortices. Mental arithmetic, mental counting, mental rotation, and word formation tasks are decoded with fNIRS, in which the selected features for classification and command generation are the peak, minimum, and mean ΔHbO values within a 2-s moving window. In the case of EEG, two eyeblinks, three eyeblinks, and eye movement in the up/down and left/right directions are used for four-command generation. The features in this case are the number of peaks and the mean of the EEG signal during a 1-s window. We tested the generated commands on a quadcopter in an open space. An average accuracy of 75.6% was achieved with fNIRS for four-command decoding and 86% with EEG for the other four-command decoding. The testing results show the possibility of controlling a quadcopter online and in real time using eight commands from the prefrontal and frontal cortices via the proposed hybrid EEG-fNIRS interface.
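The window-based feature extraction described above (peak, minimum, and mean within a moving window) can be sketched as follows; the window width and the toy signal are illustrative, not the paper's data.

```python
# Sketch of sliding-window feature extraction: (max, min, mean) per
# full non-overlapping window. Window width and signal are toy values.

def window_features(signal, width):
    """Return (max, min, mean) for each full window of `width` samples."""
    feats = []
    for start in range(0, len(signal) - width + 1, width):
        w = signal[start:start + width]
        feats.append((max(w), min(w), sum(w) / width))
    return feats

hbo = [0, 2, 5, 4, 6, 9, 8, 7]        # toy delta-HbO-like trace
feats = window_features(hbo, width=4)  # two 4-sample windows
assert len(feats) == 2
```

Each feature tuple would then be fed to a classifier that maps it to one of the decoded commands.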
Tool Support for Parametric Analysis of Large Software Simulation Systems
NASA Technical Reports Server (NTRS)
Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony
2008-01-01
The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
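The n-factor combinatorial idea above can be sketched in its simplest (2-factor, "pairwise") form: cover every pair of parameter values with far fewer runs than the full factorial. The parameters and the greedy selection below are illustrative, not the tool's actual generator.

```python
from itertools import combinations, product

# Greedy pairwise (2-factor) test-case selection: keep a candidate run
# only if it covers at least one not-yet-covered parameter-value pair.
# Parameter names and values are illustrative.

params = {"mass": [1, 2, 3], "gain": ["lo", "hi"], "mode": ["A", "B"]}

def pairwise_cases(params):
    names = list(params)
    uncovered = {(a, va, b, vb)
                 for a, b in combinations(names, 2)
                 for va in params[a] for vb in params[b]}
    cases = []
    for combo in product(*params.values()):   # full factorial as pool
        assign = dict(zip(names, combo))
        new = {(a, assign[a], b, assign[b])
               for a, b in combinations(names, 2)} & uncovered
        if new:                               # contributes new pairs
            cases.append(assign)
            uncovered -= new
    return cases

cases = pairwise_cases(params)
assert len(cases) < 3 * 2 * 2   # fewer runs than the full factorial
```

Higher-order (n-factor) coverage follows the same pattern with n-tuples instead of pairs, trading more runs for stronger interaction coverage.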
NASA Astrophysics Data System (ADS)
Rai, Aakash C.; Lin, Chao-Hsin; Chen, Qingyan
2015-02-01
Ozone-terpene reactions are important sources of indoor ultrafine particles (UFPs), a potential health hazard for human beings. Humans themselves act as possible sites for ozone-initiated particle generation through reactions with squalene (a terpene) that is present in their skin, hair, and clothing. This investigation developed a numerical model to probe particle generation from ozone reactions with clothing worn by humans. The model was based on particle generation measured in an environmental chamber as well as physical formulations of particle nucleation, condensational growth, and deposition. In five out of the six test cases, the model was able to predict particle size distributions reasonably well. The failure in the remaining case demonstrated the fundamental limitations of nucleation models. The model that was developed was used to predict particle generation under various building and airliner cabin conditions. These predictions indicate that ozone reactions with human-worn clothing could be an important source of UFPs in densely occupied classrooms and airliner cabins. Those reactions could account for about 40% of the total UFPs measured on a Boeing 737-700 flight. The model predictions at this stage are indicative and should be improved further.
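A much-simplified stand-in for the model above (which resolves nucleation, condensational growth, and deposition) is a well-mixed particle number balance dN/dt = S - L*N, where S is the generation rate and L lumps ventilation and deposition losses. All rates below are assumed values for illustration only.

```python
# Minimal well-mixed particle balance: dN/dt = source - loss * N.
# Source and loss rates are assumed, illustrative values.

def simulate(n0=0.0, source=100.0, loss=2.0, dt=0.001, steps=5000):
    """Forward-Euler integration of dN/dt = source - loss * N."""
    n = n0
    for _ in range(steps):
        n += dt * (source - loss * n)
    return n

n_final = simulate()        # 5 s of simulated time
steady = 100.0 / 2.0        # analytic steady state, S / L
assert abs(n_final - steady) < 0.5
```

The steady-state form N* = S/L is what makes occupant density matter: more clothed occupants raise S, directly raising the UFP concentration.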
Nonlinear dynamic simulation of single- and multi-spool core engines
NASA Technical Reports Server (NTRS)
Schobeiri, T.; Lippke, C.; Abouelkheir, M.
1993-01-01
In this paper a new computational method for accurate simulation of the nonlinear dynamic behavior of single- and multi-spool core engines, turbofan engines, and power generation gas turbine engines is presented. In order to perform the simulation, a modularly structured computer code has been developed which includes individual mathematical modules representing various engine components. The generic structure of the code enables the dynamic simulation of arbitrary engine configurations ranging from single-spool thrust generation to multi-spool thrust/power generation engines under adverse dynamic operating conditions. For precise simulation of turbine and compressor components, row-by-row calculation procedures were implemented that account for the specific turbine and compressor cascade and blade geometry and characteristics. The dynamic behavior of the subject engine is calculated by solving a number of systems of partial differential equations, which describe the unsteady behavior of the individual components. In order to ensure the capability, accuracy, robustness, and reliability of the code, comprehensive critical performance assessment and validation tests were performed. As representatives, three different transient cases with single- and multi-spool thrust and power generation engines were simulated. The transient cases range from operating with a prescribed fuel schedule, to extreme load changes, to generator and turbine shut down.
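The modular code integrates systems of differential equations per component; a toy single-spool torque balance, with assumed values rather than engine data, illustrates the kind of transient computed.

```python
# Toy single-spool transient: I * domega/dt = tau_turbine - k * omega**2,
# where the load torque grows with the square of shaft speed. Inertia,
# torque, and load coefficient are assumed values, not engine data.

def spool_transient(omega0=50.0, inertia=10.0, tau=400.0, k=0.04,
                    dt=0.001, steps=20000):
    omega = omega0
    for _ in range(steps):
        omega += dt * (tau - k * omega**2) / inertia
    return omega

omega_final = spool_transient()          # 20 s of simulated time
omega_steady = (400.0 / 0.04) ** 0.5     # equilibrium: 100 rad/s
assert abs(omega_final - omega_steady) < 0.1
```

The real simulation couples many such component equations (row-by-row compressor and turbine models, volumes, shafts) and solves them simultaneously.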
Generalized functional linear models for gene-based case-control association studies.
Fan, Ruzong; Wang, Yifan; Mills, James L; Carter, Tonia C; Lobach, Iryna; Wilson, Alexander F; Bailey-Wilson, Joan E; Weeks, Daniel E; Xiong, Momiao
2014-11-01
By using functional data analysis techniques, we developed generalized functional linear models for testing association between a dichotomous trait and multiple genetic variants in a genetic region while adjusting for covariates. Both fixed and mixed effect models are developed and compared. Extensive simulations show that Rao's efficient score tests of the fixed effect models are very conservative since they generate lower type I errors than nominal levels, and global tests of the mixed effect models generate accurate type I errors. Furthermore, we found that the Rao's efficient score test statistics of the fixed effect models have higher power than the sequence kernel association test (SKAT) and its optimal unified version (SKAT-O) in most cases when the causal variants are both rare and common. When the causal variants are all rare (i.e., minor allele frequencies less than 0.03), the Rao's efficient score test statistics and the global tests have similar or slightly lower power than SKAT and SKAT-O. In practice, it is not known whether rare variants or common variants in a gene region are disease related. All we can assume is that a combination of rare and common variants influences disease susceptibility. Thus, the improved performance of our models when the causal variants are both rare and common shows that the proposed models can be very useful in dissecting complex traits. We compare the performance of our methods with SKAT and SKAT-O on real neural tube defects and Hirschsprung's disease datasets. The Rao's efficient score test statistics and the global tests are more sensitive than SKAT and SKAT-O in the real data analysis. Our methods can be used in either gene-disease genome-wide/exome-wide association studies or candidate gene analyses. © 2014 WILEY PERIODICALS, INC.
Generalized Functional Linear Models for Gene-based Case-Control Association Studies
Mills, James L.; Carter, Tonia C.; Lobach, Iryna; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Weeks, Daniel E.; Xiong, Momiao
2014-01-01
By using functional data analysis techniques, we developed generalized functional linear models for testing association between a dichotomous trait and multiple genetic variants in a genetic region while adjusting for covariates. Both fixed and mixed effect models are developed and compared. Extensive simulations show that Rao's efficient score tests of the fixed effect models are very conservative since they generate lower type I errors than nominal levels, and global tests of the mixed effect models generate accurate type I errors. Furthermore, we found that the Rao's efficient score test statistics of the fixed effect models have higher power than the sequence kernel association test (SKAT) and its optimal unified version (SKAT-O) in most cases when the causal variants are both rare and common. When the causal variants are all rare (i.e., minor allele frequencies less than 0.03), the Rao's efficient score test statistics and the global tests have similar or slightly lower power than SKAT and SKAT-O. In practice, it is not known whether rare variants or common variants in a gene are disease-related. All we can assume is that a combination of rare and common variants influences disease susceptibility. Thus, the improved performance of our models when the causal variants are both rare and common shows that the proposed models can be very useful in dissecting complex traits. We compare the performance of our methods with SKAT and SKAT-O on real neural tube defects and Hirschsprung's disease data sets. The Rao's efficient score test statistics and the global tests are more sensitive than SKAT and SKAT-O in the real data analysis. Our methods can be used in either gene-disease genome-wide/exome-wide association studies or candidate gene analyses. PMID:25203683
Space shuttle main engine fault detection using neural networks
NASA Technical Reports Server (NTRS)
Bishop, Thomas; Greenwood, Dan; Shew, Kenneth; Stevenson, Fareed
1991-01-01
A method for on-line Space Shuttle Main Engine (SSME) anomaly detection and fault typing using a feedback neural network is described. The method involves the computation of features representing time-variance of SSME sensor parameters, using historical test case data. The network is trained, using backpropagation, to recognize a set of fault cases. The network is then able to diagnose new fault cases correctly. An essential element of the training technique is the inclusion of randomly generated data along with the real data, in order to span the entire input space of potential non-nominal data.
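The training-set idea above, mixing randomly generated data with real data so the network spans the input space of non-nominal conditions, can be sketched as simple noise augmentation; the nominal sensor vector and noise level are illustrative assumptions.

```python
import random

# Sketch of training-set augmentation: add randomly perturbed copies of
# a nominal sensor vector so off-nominal inputs are represented.
# The nominal values and the +/-5% noise level are assumptions.

def augment(nominal, copies=5, noise=0.05, seed=1):
    rng = random.Random(seed)
    out = [list(nominal)]                      # keep the real case
    for _ in range(copies):
        out.append([x * (1.0 + rng.uniform(-noise, noise))
                    for x in nominal])
    return out

nominal = [3000.0, 95.0, 1.2]                  # e.g. rpm, temp, ratio
data = augment(nominal)
assert len(data) == 6
```

Each augmented vector would be labeled with the fault class of its parent case before backpropagation training.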
Pre-Test Assessment of the Use Envelope of the Normal Force of a Wind Tunnel Strain-Gage Balance
NASA Technical Reports Server (NTRS)
Ulbrich, N.
2016-01-01
The relationship between the aerodynamic lift force generated by a wind tunnel model, the model weight, and the measured normal force of a strain-gage balance is investigated to better understand the expected use envelope of the normal force during a wind tunnel test. First, the fundamental relationship between normal force, model weight, lift curve slope, model reference area, dynamic pressure, and angle of attack is derived. Then, based on this fundamental relationship, the use envelope of a balance is examined for four typical wind tunnel test cases. The first case looks at the use envelope of the normal force during the test of a light wind tunnel model at high subsonic Mach numbers. The second case examines the use envelope of the normal force during the test of a heavy wind tunnel model in an atmospheric low-speed facility. The third case reviews the use envelope of the normal force during the test of a floor-mounted semi-span model. The fourth case discusses the normal force characteristics during the test of a rotated full-span model. The wind tunnel model's lift-to-weight ratio is introduced as a new parameter that may be used for a quick pre-test assessment of the use envelope of the normal force of a balance. The parameter is derived as a function of the lift coefficient, the dimensionless dynamic pressure, and the dimensionless model weight. Lower and upper bounds of the use envelope of a balance are defined using the model's lift-to-weight ratio. Finally, data from a pressurized wind tunnel is used to illustrate both application and interpretation of the model's lift-to-weight ratio.
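In dimensional form, the pre-test parameter derived above is simply the ratio of aerodynamic lift to model weight, L/W = C_L * q * S / W. The sketch below uses illustrative numbers, not data from the paper.

```python
# The model's lift-to-weight ratio as a function of lift coefficient,
# dynamic pressure, reference area, and model weight. Numbers are
# illustrative, not values from the study.

def lift_to_weight(c_lift, q_pa, area_m2, weight_n):
    """L / W = C_L * q * S / W."""
    return c_lift * q_pa * area_m2 / weight_n

# A light model at high dynamic pressure: lift dominates weight, so the
# balance normal force is driven mostly by the aerodynamic load.
r = lift_to_weight(c_lift=0.5, q_pa=20000.0, area_m2=0.3, weight_n=500.0)
assert abs(r - 6.0) < 1e-9
```

A ratio well above one corresponds to the light-model/high-subsonic case, while a ratio near or below one flags tests where the model weight sets the lower bound of the normal-force use envelope.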
Low Speed and High Speed Correlation of SMART Active Flap Rotor Loads
NASA Technical Reports Server (NTRS)
Kottapalli, Sesi B. R.
2010-01-01
Measured, open loop and closed loop data from the SMART rotor test in the NASA Ames 40- by 80- Foot Wind Tunnel are compared with CAMRAD II calculations. One open loop high-speed case and four closed loop cases are considered. The closed loop cases include three high-speed cases and one low-speed case. Two of these high-speed cases include a 2 deg flap deflection at 5P case and a test maximum-airspeed case. This study follows a recent, open loop correlation effort that used a simple correction factor for the airfoil pitching moment Mach number. Compared to the earlier effort, the current open loop study considers more fundamental corrections based on advancing blade aerodynamic conditions. The airfoil tables themselves have been studied. Selected modifications to the HH-06 section flap airfoil pitching moment table are implemented. For the closed loop condition, the effect of the flap actuator is modeled by increased flap hinge stiffness. Overall, the open loop correlation is reasonable, thus confirming the basic correctness of the current semi-empirical modifications; the closed loop correlation is also reasonable considering that the current flap model is a first generation model. Detailed correlation results are given in the paper.
NASA Technical Reports Server (NTRS)
Dor, J. B.; Mignosi, A.; Plazanet, M.
1984-01-01
The T2 wind tunnel is described, and the process of generating a cryogenic gust is presented using the example of a test made at very low temperature. Detailed results of tests on temperatures for flow in the settling chamber, the interior walls of the system, and the metal casing are given. The transverse temperature distribution in the settling chamber and working section, and the thermal gradients in the walls, are given as a function of the temperature level of the test.
Validating a UAV artificial intelligence control system using an autonomous test case generator
NASA Astrophysics Data System (ADS)
Straub, Jeremy; Huber, Justin
2013-05-01
The validation of safety-critical applications, such as autonomous UAV operations in an environment which may include human actors, is an ill-posed problem. To build confidence in the autonomous control technology, numerous scenarios must be considered. This paper expands upon previous work, related to autonomous testing of robotic control algorithms in a two-dimensional plane, to evaluate the suitability of similar techniques for validating artificial intelligence control in three dimensions, where a minimum level of airspeed must be maintained. The results of human-conducted testing are compared to this automated testing in terms of error detection, speed, and testing cost.
Singh, Jaya; Mishra, Avshesh; Pandian, Arunachalam Jayamuruga; Mallipatna, Ashwin C.; Khetan, Vikas; Sripriya, S.; Kapoor, Suman; Agarwal, Smita; Sankaran, Satish; Katragadda, Shanmukh; Veeramachaneni, Vamsi; Hariharan, Ramesh; Subramanian, Kalyanasundaram
2016-01-01
Purpose Retinoblastoma (Rb) is the most common primary intraocular cancer of childhood and one of the major causes of blindness in children. India has the highest number of patients with Rb in the world. Mutations in the RB1 gene are the primary cause of Rb, and heterogeneous mutations are distributed throughout the entire length of the gene. Therefore, genetic testing requires screening of the entire gene, which by conventional sequencing is time consuming and expensive. Methods In this study, we screened the RB1 gene in the DNA isolated from blood or saliva samples of 50 unrelated patients with Rb using the TruSight Cancer panel. Next-generation sequencing (NGS) was done on the Illumina MiSeq platform. Genetic variations were identified using the Strand NGS software and interpreted using the StrandOmics platform. Results We were able to detect germline pathogenic mutations in 66% (33/50) of the cases, 12 of which were novel. We were able to detect all types of mutations, including missense, nonsense, splice site, indel, and structural variants. When we considered bilateral Rb cases only, the mutation detection rate increased to 100% (22/22). In unilateral Rb cases, the mutation detection rate was 30% (6/20). Conclusions Our study suggests that NGS-based approaches increase the sensitivity of mutation detection in the RB1 gene, making it fast and cost-effective compared to the conventional tests performed in a reflex-testing mode. PMID:27582626
Moving Towards a Science-Driven Workbench for Earth Science Solutions
NASA Astrophysics Data System (ADS)
Graves, S. J.; Djorgovski, S. G.; Law, E.; Yang, C. P.; Keiser, K.
2017-12-01
The NSF-funded EarthCube Integration and Test Environment (ECITE) prototype was proposed as a 2015 Integrated Activities project and resulted in the prototyping of an EarthCube federated cloud environment and the Integration and Testing Framework. The ECITE team has worked with EarthCube science and technology governance committees to define the types of integration, testing and evaluation necessary to achieve and demonstrate interoperability and functionality that benefit and support the objectives of the EarthCube cyber-infrastructure. The scope of ECITE also includes reaching beyond NSF and EarthCube to work with the broader Earth science community, such as the Earth Science Information Partners (ESIP) to incorporate lessons learned from other testbed activities, and ultimately provide broader community benefits. This presentation will discuss evolving ECITE ideas for a science-driven workbench that will start with documented science use cases, map the use cases to solution scenarios that identify the available technology and data resources that match the use case, the generation of solution workflows and test plans, the testing and evaluation of the solutions in a cloud environment, and finally the documentation of identified technology and data gaps that will assist with driving the development of additional EarthCube resources.
Mathematical model of snake-type multi-directional wave generation
NASA Astrophysics Data System (ADS)
Muarif; Halfiani, Vera; Rusdiana, Siti; Munzir, Said; Ramli, Marwan
2018-01-01
Extreme wave generation is an intensive area of water wave research because the occurrence of such waves in the ocean can cause serious damage to ships and offshore structures. One method used to generate these waves is the self-correcting method, which controls the signal of the wavemakers in a wave tank. Some studies also consider nonlinear wave generation in a wave tank using a numerical approach. The study of wave generation is essential for effective and efficient model testing of offshore structures before they can be operated in the ocean. Generally, there are two types of wavemakers implemented in hydrodynamic laboratories: piston-type and flap-type. The flap-type is preferred for testing a ship in deep water. The single-flap wavemaker has been explained in many studies, yet the snake-type wavemaker (which has more than one flap) still needs to be examined. Hence, the formulation controlling the wavemaker needs to be precisely analyzed so that the given input can generate the desired wave in the space-limited wave tank. Applying the same analogy and methodology as the previous study, this article presents multi-directional wave generation implemented with snake-type wavemakers.
NASA Astrophysics Data System (ADS)
Ispas, N.; Năstăsoiu, M.
2016-08-01
Reducing occupant injuries in cars involved in traffic accidents is a main target of today's car designers. Many technological solutions, known as active or passive safety, have been developed over time to improve the safety of a car's occupants. In the real world, traffic accidents often involve cars from different generations with different historical safety solutions. The main aim of this paper is to quantify the influences on driver chest loads when cars of the same or different generations are involved in side crashes. Both same-generation and different-generation car pairs were used for the study. Another goal of the paper was to study the time histories of chest loads for the drivers of both cars involved in each crash. The experimental results were obtained with the support of DSD, Dr. Steffan Datentechnik GmbH - Linz, Austria; the described tests were performed in the full test facility of DSD Linz during the "Easter 2015 PC-Crash Seminar". In all crashes, results were obtained from the dummies placed in both the impacted and the impacting car. The novelty of the paper lies in the comparison of the data sets from the drivers (dummies) of the two cars involved in each of six experimental crashes, and in the possibility of analysing the influence of historical structural solutions on deformation and loads in traffic accidents. The paper's conclusions can be used in the future to improve car passive safety.
1990-09-29
for generating narrowband and broadband shock-associated noise. Shock-associated noise was first investigated by Powell, who studied choked jets...for one configuration and one operating condition, but that one mode was always dominant over the others. For the axisymmetric case, two different...an NPR setting of 3.1 is shown in Figure 12. This data plot shows 14 narrowband peaks, which is an extreme case. However, if on the average each data
NASA Technical Reports Server (NTRS)
Biringen, S. H.; Mcmillan, O. J.
1980-01-01
The use of a computer code for the calculation of two-dimensional inlet flow fields in a supersonic free stream, together with a nonorthogonal mesh-generation code, is illustrated by specific examples. Input, output, and program operation and use are given and explained for the case of supercritical inlet operation at a sub-design Mach number (free-stream Mach number M = 2.09) for an isentropic-compression, drooped-cowl inlet. Source listings of the computer codes are also provided.
[Cost-effectiveness ratio of using rapid tests for malaria diagnosis in the Peruvian Amazon].
Rosas Aguirre, Angel Martín; Llanos Zavalaga, Luis Fernando; Trelles de Belaunde, Miguel
2009-05-01
To determine the cost-effectiveness ratios of three options for diagnosing malaria at the local health provider level in 50 communities in the Peruvian Amazon. Calculation of the incremental cost-effectiveness ratios of three options for diagnosing malaria (not using rapid tests, using rapid tests, and accessing microscopy) in patients presenting with fever in 50 communities near Iquitos in the Peruvian Amazon, communities with limited access to microscopy that depend on a network of local health providers. The incremental costs and effects of the latter two options were calculated and compared with the first option (currently in use). By dividing the incremental costs by the incremental effects, the incremental cost-effectiveness ratio was calculated. Using rapid tests would save the Ministry of Health of Peru: US$191 for each new case of Plasmodium falciparum malaria treated promptly and appropriately; US$31 per new case of P. vivax malaria treated promptly and appropriately; US$1,051 per case of acute malaria averted; and US$17,655 for each death avoided. Access to microscopy by all the communities would generate an additional cost of: US$198 per new case of P. falciparum malaria treated promptly and appropriately; US$31 per new case of P. vivax malaria treated promptly and appropriately; US$1,086 per case of acute malaria averted; and US$18,255 for each death avoided. The use of rapid tests by local health providers can improve the effectiveness of malaria diagnosis in patients with fever in the 50 communities studied, at a cost lower than the current method. The recommendation is to expand the use of rapid tests among the health providers in communities similar to those studied.
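The incremental cost-effectiveness calculation the authors describe (incremental costs divided by incremental effects) is a one-line formula; the figures below are hypothetical, not the study's data:

```python
def icer(cost_new, cost_base, effect_new, effect_base):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    d_cost = cost_new - cost_base
    d_effect = effect_new - effect_base
    if d_effect == 0:
        raise ValueError("no incremental effect; ICER is undefined")
    return d_cost / d_effect

# Hypothetical figures: the new option costs less and averts more cases,
# so the ICER is negative (a net saving per additional case averted).
print(icer(cost_new=8000, cost_base=10000, effect_new=120, effect_base=100))  # -100.0
```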
Optimizing DER Participation in Inertial and Primary-Frequency Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dall'Anese, Emiliano; Zhao, Changhong; Guggilam, Swaroop
This paper develops an approach to enable the optimal participation of distributed energy resources (DERs) in inertial and primary-frequency response alongside conventional synchronous generators. Leveraging a reduced-order model description of frequency dynamics, DERs' synthetic inertias and droop coefficients are designed to meet time-domain performance objectives of frequency overshoot and steady-state regulation. Furthermore, an optimization-based method centered around classical economic dispatch is developed to ensure that DERs share the power injections for inertial- and primary-frequency response in proportion to their power ratings. Simulations for a modified New England test-case system composed of ten synchronous generators and six instances of the IEEE 37-node test feeder with frequency-responsive DERs validate the design strategy.
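A minimal sketch of the proportional-sharing objective: if droop gains are chosen in proportion to power ratings, the steady-state injections split in the same proportion. This illustrates the sharing goal only, not the paper's optimization-based design; all numbers are invented:

```python
def droop_gains(ratings, total_gain):
    """Split a desired aggregate frequency-response gain across DERs
    in proportion to their power ratings."""
    total = sum(ratings)
    return [total_gain * r / total for r in ratings]

def injections(gains, delta_f):
    """Steady-state primary-frequency injections for a frequency deviation
    delta_f, using a linear droop law P_i = -k_i * delta_f."""
    return [-g * delta_f for g in gains]

# Three DERs rated 10, 20, 30 (arbitrary units), aggregate gain 600 per Hz.
gains = droop_gains(ratings=[10.0, 20.0, 30.0], total_gain=600.0)
p = injections(gains, delta_f=-0.05)  # under-frequency event of 0.05 Hz
print(gains)  # [100.0, 200.0, 300.0]
print(p)
```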
Preliminary design of mesoscale turbocompressor and rotordynamics tests of rotor bearing system
NASA Astrophysics Data System (ADS)
Hossain, Md Saddam
2011-12-01
A mesoscale turbocompressor spinning above 500,000 RPM is an evolutionary technology for micro turbochargers, turbo blowers, turbo compressors, micro gas turbines, auxiliary power units, etc., for the automotive, aerospace, and fuel cell industries. The objectives of this work are: (1) to evaluate different air foil bearings designed for the intended applications, and (2) to design and perform CFD analysis of a micro-compressor. CFD analysis of a shrouded 3-D micro compressor was conducted using Ansys BladeGen as the blade generation tool, ICEM CFD as the mesh generation tool, and CFX as the main solver, for different design and off-design cases and for different numbers of blades. Comprehensive experimental facilities for testing the turbocompressor system have also been designed and proposed for future work.
NASA Astrophysics Data System (ADS)
Lazzari, R.; Parma, C.; De Marco, A.; Bittanti, S.
2015-07-01
In this paper, we describe a control strategy for a photovoltaic (PV) power plant equipped with an energy storage system (ESS) based on a lithium-ion battery. The plant consists of the following units: the PV generator, the energy storage system, the DC bus and the inverter. The control, organised in a hierarchical manner, maximises the self-consumption of the local load unit. In particular, the ESS performs power balancing in case of low solar radiation or surplus PV generation, thus managing the variability of the power exchanged between the plant and the grid. The implemented control strategy is under testing in the RSE pilot test facility in Milan, Italy.
Minagawa, Hiroko; Yasui, Yoshihiro; Adachi, Hirokazu; Ito, Miyabi; Hirose, Emi; Nakamura, Noriko; Hata, Mami; Kobayashi, Shinichi; Yamashita, Teruo
2015-11-09
Japan was verified as having achieved measles elimination by the Measles Regional Verification Commission in the Western Pacific Region in March 2015. Verification of measles elimination implies the absence of continuous endemic transmission. After the last epidemic in 2007, with an estimated 18,000 cases, Japan introduced nationwide case-based measles surveillance in January 2008. Laboratory diagnosis of all suspected measles cases is essentially required by law, and virus detection tests are mostly performed by municipal public health institutes. Despite relatively high vaccination coverage and vigorous response to every case by the local health center staff, outbreaks of measles are repeatedly observed in Aichi Prefecture, Japan. Measles virus N and H gene detection by nested double RT-PCR was performed on all specimens collected from suspected cases and transferred to our institute. Genotyping and further molecular epidemiological analyses were performed with the direct nucleotide sequence data of appropriate PCR products. Between 2010 and 2014, specimens from 389 patients suspected of measles were tested in our institute. Genotypes D9, D8, H1 and B3 were detected. Further molecular epidemiological analyses were helpful to establish links between patients, and sometimes useful to discriminate one outbreak from another. All virus-positive cases, including 49 cases involved in three outbreaks without any obvious epidemiological link with importation, were considered to be import-related based on the nucleotide sequence information. The chain of transmission in the latest outbreak in 2014 terminated after the third generation, much earlier than in the 2010-11 outbreak (six generations). Since 2010, almost all measles cases reported in Aichi Prefecture have been either imported or import-related, based primarily on the genotypes and nucleotide sequences of the measles viruses detected.
In addition, genotyping and molecular epidemiological analyses are indispensable to prove the interruption of endemic transmission when the importations of measles are repeatedly observed. Copyright © 2015 Elsevier Ltd. All rights reserved.
Constraining torsion with Gravity Probe B
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mao Yi; Guth, Alan H.; Cabi, Serkan
2007-11-15
It is well-entrenched folklore that all torsion gravity theories predict observationally negligible torsion in the solar system, since torsion (if it exists) couples only to the intrinsic spin of elementary particles, not to rotational angular momentum. We argue that this assumption has a logical loophole which can and should be tested experimentally, and consider nonstandard torsion theories in which torsion can be generated by macroscopic rotating objects. In the spirit of action=reaction, if a rotating mass like a planet can generate torsion, then a gyroscope would be expected to feel torsion. An experiment with a gyroscope (without nuclear spin) such as Gravity Probe B (GPB) can test theories where this is the case. Using symmetry arguments, we show that to lowest order, any torsion field around a uniformly rotating spherical mass is determined by seven dimensionless parameters. These parameters effectively generalize the parametrized post-Newtonian formalism and provide a concrete framework for further testing Einstein's general theory of relativity (GR). We construct a parametrized Lagrangian that includes both standard torsion-free GR and Hayashi-Shirafuji maximal torsion gravity as special cases. We demonstrate that classic solar system tests rule out the latter and constrain two observable parameters. We show that Gravity Probe B is an ideal experiment for further constraining nonstandard torsion theories, and work out the most general torsion-induced precession of its gyroscope in terms of our torsion parameters.
Analysis of messy data with heteroscedastic in mean models
NASA Astrophysics Data System (ADS)
Trianasari, Nurvita; Sumarni, Cucu
2016-02-01
In data analysis, we are often faced with data that do not meet some assumptions; such data are often called messy data. This problem is a consequence of outliers that bias estimation or inflate error. To analyze messy data, there are three approaches: standard analysis, data transformation, and non-standard analysis methods. Simulations were conducted to compare the performance of three procedures for testing means when the model variance is not homogeneous. Each data scenario was simulated 500 times. The comparison of means was then analyzed using three methods: the Welch test, mixed models, and the Welch-r test. Data generation was done with R version 3.1.2. Based on the simulation results, all three methods can be used in both the normal and the non-normal (homoscedastic) case. All three methods work very well on balanced or unbalanced data when there is no violation of the homogeneity-of-variance assumption. For balanced data, the three methods still showed excellent performance despite violation of the homogeneity assumption, even when the degree of heterogeneity is high: the power of the tests was above 90 percent, with the Welch method (98.4%) and the Welch-r method (97.8%) performing best. For unbalanced data, the Welch method performs very well in the case of positively paired heterogeneity, with 98.2% power. The mixed-models method performs very well in the case of highly negative-negative paired heterogeneity. The Welch-r method works very well in both cases. However, if the level of heterogeneity of variance is very high, the power of all methods decreases, especially for the mixed-models method; the methods that still work well enough (power above 50%) are the Welch-r method (62.6%) and the Welch method (58.6%) in the case of balanced data. If the data are unbalanced, the Welch-r method works well enough in the case of highly heterogeneous positive-positive or negative-negative pairs, with power of 68.8% and 51%, respectively. The Welch method performs well enough only in the case of highly heterogeneous positive-positive pairs, with 64.8% power, while the mixed-models method is good in the case of very heterogeneous negative pairs, with 54.6% power. In general, when the variance is not homogeneous, the Welch method applied to ranked data (Welch-r) performs better than the other methods.
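The Welch test at the core of this comparison replaces the pooled-variance t statistic with per-sample variances and the Welch-Satterthwaite degrees of freedom. A minimal pure-Python sketch, with sample data invented for illustration:

```python
import math

def welch_t(x, y):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom
    for two samples with possibly unequal variances."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)   # sample variances
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    se2 = vx / nx + vy / ny                          # squared standard error
    t = (mx - my) / math.sqrt(se2)
    df = se2 ** 2 / ((vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    return t, df

# Invented unbalanced samples with unequal variances.
t, df = welch_t([4.1, 5.0, 4.6, 4.4], [6.1, 7.3, 6.8, 7.0, 6.5])
print(round(t, 2), round(df, 1))
```

The Welch-r variant studied in the paper applies the same statistic to the ranks of the pooled data rather than to the raw values.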
"Clustering" Documents Automatically to Support Scoping Reviews of Research: A Case Study
ERIC Educational Resources Information Center
Stansfield, Claire; Thomas, James; Kavanagh, Josephine
2013-01-01
Background: Scoping reviews of research help determine the feasibility and the resource requirements of conducting a systematic review, and the potential to generate a description of the literature quickly is attractive. Aims: To test the utility and applicability of an automated clustering tool to describe and group research studies to improve…
Interdigital erosions - tinea pedis?
Orgaz-Molina, Jacinto; Orgaz-Molina, Maria Carmen; Cutugno, Marilena; Arias-Santiago, Salvador
2012-10-01
Interdigital erosions are frequently due to tinea pedis. However, other infectious conditions, such as candidiasis, erythrasma or bacterial infections, can generate lesions that cannot be differentiated at the clinical level; microbiological tests are therefore necessary. This clinical case describes a man with interdigital lesions of 10 months' duration that were not responding to antifungal treatment.
The Experimental State of Mind in Elicitation: Illustrations from Tonal Fieldwork
ERIC Educational Resources Information Center
Yu, Kristine M.
2014-01-01
This paper illustrates how an "experimental state of mind", i.e. principles of experimental design, can inform hypothesis generation and testing in structured fieldwork elicitation. The application of these principles is demonstrated with case studies in toneme discovery. Pike's classic toneme discovery procedure is shown to be a special…
Mattesini, Alessio; Dall'Ara, Gianni; Mario, Carlo Di
2014-01-01
Fully bioresorbable vascular scaffolds (BVS) are a new approach to the percutaneous treatment of coronary artery disease. The BVS have not yet been fully tested in complex lesions, including chronic total occlusion (CTO). We report a CTO case successfully treated with a second-generation bioabsorbable drug-eluting scaffold. PMID:25061461
A genetic-algorithm approach for assessing the liquefaction potential of sandy soils
NASA Astrophysics Data System (ADS)
Sen, G.; Akyol, E.
2010-04-01
The determination of liquefaction potential requires taking into account a large number of parameters, which creates a complex nonlinear structure for the liquefaction phenomenon. The conventional methods rely on simple statistical and empirical relations or charts, but they cannot characterise these complexities. Genetic algorithms are suited to solving these types of problems. A genetic algorithm-based model has been developed to determine liquefaction potential, confirmed against Cone Penetration Test datasets derived from case studies of sandy soils. Software has been developed that uses genetic algorithms for parameter selection and assessment of liquefaction potential. Several estimation functions for the assessment of a Liquefaction Index were then generated from the dataset and evaluated by assessing the training and test data. The suggested formulation estimates liquefaction occurrence with significant accuracy. In addition, the parametric study of the Liquefaction Index curves shows good agreement with the physical behaviour. The proportion of misestimated cases was only 7.8% for the proposed method, which is quite low compared to another commonly used method.
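A minimal sketch of the genetic-algorithm machinery (tournament selection, one-point crossover, bit-flip mutation) on a toy objective. This is not the paper's liquefaction formulation, whose fitness function and encoding are specific to the CPT dataset:

```python
import random

def genetic_search(fitness, n_bits=16, pop_size=30, generations=60, seed=1):
    """Minimal genetic algorithm maximizing `fitness` over bitstrings:
    tournament selection, one-point crossover, bit-flip mutation,
    with the best individual kept across generations (elitism)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        def pick():
            # Binary tournament: the fitter of two random individuals wins.
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.1:              # bit-flip mutation
                i = rng.randrange(n_bits)
                child[i] ^= 1
            children.append(child)
        pop = children
        best = max(pop + [best], key=fitness)
    return best

# Toy objective ("OneMax"): maximize the number of ones in the bitstring.
best = genetic_search(fitness=sum)
print(sum(best))
```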
Second-generation microstimulator.
Arcos, Isabel; Davis, R; Fey, K; Mishler, D; Sanderson, D; Tanacs, C; Vogel, M J; Wolf, R; Zilberman, Y; Schulman, J
2002-03-01
The first-generation injectable microstimulator was glass encased with an external tantalum capacitor electrode. This second-generation device uses a hermetically sealed ceramic case with platinum electrodes. Zener diodes protect the electronics from defibrillation shocks and from electrostatic discharge. The capacitor is sealed inside the case so that it cannot be inadvertently damaged by surgical instruments. This microstimulator, referred to as BION, is the main component of a 255-channel wireless stimulating system. BION devices have been implanted in rats for periods of up to 5 months. Results show benign tissue reactions resulting in identical encapsulation around BIONs and controls. Stimulation threshold levels did not change significantly over time and ranged between 0.81 and 1.35 mA for all the animals at a 60 µs pulse width. All of the tests performed to date indicate that the BION is safe and effective for long-term human implantation. We have elected to develop BION applications by seeking collaboration with the research community through our BION Technology Partnership.
Ontology to relational database transformation for web application development and maintenance
NASA Astrophysics Data System (ADS)
Mahmudi, Kamal; Inggriani Liem, M. M.; Akbar, Saiful
2018-03-01
Ontology is used as a knowledge representation while a database is used as a fact recorder in a KMS (Knowledge Management System). In most applications, data are managed in a database system and updated through the application, and they are then transformed into knowledge as needed. Once a domain conceptor defines the knowledge in the ontology, the application and database can be generated from the ontology. Most existing frameworks generate the application from its database; in this research, the ontology is used to generate the application. As the data are updated through the application, a mechanism is designed to trigger an update to the ontology so that the application can be rebuilt based on the newest ontology. With this approach, a knowledge engineer has full flexibility to renew the application based on the latest ontology without depending on a software developer. In many cases, the concept needs to be updated when the data change. The framework was built and tested in a Spring Java environment, and a case study was conducted to prove the concept.
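A toy illustration of one common ontology-to-relational mapping: datatype properties become columns and object properties become foreign keys. Class and property names here are invented, and the paper's actual transformation rules may differ:

```python
def class_to_ddl(name, datatype_properties, object_properties=()):
    """Map an ontology class to a relational table: datatype properties
    become columns, object properties become foreign-key columns."""
    cols = ["id INTEGER PRIMARY KEY"]
    cols += [f"{prop} {sql_type}" for prop, sql_type in datatype_properties]
    cols += [f"{prop}_id INTEGER REFERENCES {target}(id)"
             for prop, target in object_properties]
    return f"CREATE TABLE {name} ({', '.join(cols)});"

# Hypothetical class Employee with a worksFor relation to Department.
ddl = class_to_ddl(
    "Employee",
    datatype_properties=[("name", "TEXT"), ("hired", "DATE")],
    object_properties=[("worksFor", "Department")],
)
print(ddl)
```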
A Statistical Method to Distinguish Functional Brain Networks
Fujita, André; Vidal, Maciel C.; Takahashi, Daniel Y.
2017-01-01
One major problem in neuroscience is the comparison of functional brain networks of different populations, e.g., distinguishing the networks of controls and patients. Traditional algorithms are based on search for isomorphism between networks, assuming that they are deterministic. However, biological networks present randomness that cannot be well modeled by those algorithms. For instance, functional brain networks of distinct subjects of the same population can be different due to individual characteristics. Moreover, networks of subjects from different populations can be generated through the same stochastic process. Thus, a better hypothesis is that networks are generated by random processes. In this case, subjects from the same group are samples from the same random process, whereas subjects from different groups are generated by distinct processes. Using this idea, we developed a statistical test called ANOGVA to test whether two or more populations of graphs are generated by the same random graph model. Our simulations' results demonstrate that we can precisely control the rate of false positives and that the test is powerful to discriminate random graphs generated by different models and parameters. The method also showed to be robust for unbalanced data. As an example, we applied ANOGVA to an fMRI dataset composed of controls and patients diagnosed with autism or Asperger. ANOGVA identified the cerebellar functional sub-network as statistically different between controls and autism (p < 0.001). PMID:28261045
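ANOGVA's question, whether two populations of graphs come from the same random-graph model, can be illustrated with a much simpler stand-in: a permutation test on a single graph statistic between two groups of Erdos-Renyi graphs. This sketch is not the ANOGVA procedure itself, and the group sizes and edge probabilities are invented:

```python
import random

def avg_degree(adj):
    """Average degree of a graph given as an adjacency matrix."""
    return sum(sum(row) for row in adj) / len(adj)

def er_graph(n, edge_prob, rng):
    """Random Erdos-Renyi adjacency matrix on n nodes."""
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < edge_prob:
                adj[i][j] = adj[j][i] = 1
    return adj

def permutation_test(group_a, group_b, stat, n_perm=2000, seed=0):
    """Permutation p-value for a difference in mean graph statistic."""
    rng = random.Random(seed)
    sa = [stat(g) for g in group_a]
    sb = [stat(g) for g in group_b]
    observed = abs(sum(sa) / len(sa) - sum(sb) / len(sb))
    pooled = sa + sb
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(sa)], pooled[len(sa):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

rng = random.Random(42)
controls = [er_graph(30, 0.2, rng) for _ in range(10)]  # sparser model
patients = [er_graph(30, 0.4, rng) for _ in range(10)]  # denser model
p_val = permutation_test(controls, patients, avg_degree)
print(p_val < 0.05)  # True
```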
Shapiro, Adam J; Leigh, Margaret W
2017-01-01
Primary ciliary dyskinesia (PCD) is a genetic disorder causing chronic oto-sino-pulmonary disease. No single diagnostic test will detect all PCD cases. Transmission electron microscopy (TEM) of respiratory cilia was previously considered the gold standard diagnostic test for PCD, but 30% of all PCD cases have either normal ciliary ultrastructure or subtle changes which are non-diagnostic. These cases are identified through alternate diagnostic tests, including nasal nitric oxide measurement, high-speed videomicroscopy analysis, immunofluorescent staining of axonemal proteins, and/or mutation analysis of various PCD causing genes. Autosomal recessive mutations in DNAH11 and HYDIN produce normal TEM ciliary ultrastructure, while mutations in genes encoding for radial spoke head proteins result in some cross-sections with non-diagnostic alterations in the central apparatus interspersed with normal ciliary cross-sections. Mutations in nexin link and dynein regulatory complex genes lead to a collection of different ciliary ultrastructures; mutations in CCDC65, CCDC164, and GAS8 produce normal ciliary ultrastructure, while mutations in CCDC39 and CCDC40 cause absent inner dynein arms and microtubule disorganization in some ciliary cross-sections. Mutations in CCNO and MCIDAS cause near complete absence of respiratory cilia due to defects in generation of multiple cellular basal bodies; however, the scant cilia generated may have normal ultrastructure. Lastly, a syndromic form of PCD with retinal degeneration results in normal ciliary ultrastructure through mutations in the RPGR gene. Clinicians must be aware of these genetic causes of PCD resulting in non-diagnostic TEM ciliary ultrastructure and refrain from using TEM of respiratory cilia as a test to rule out PCD.
Diagnostic and prognostic value of human prion detection in cerebrospinal fluid.
Foutz, Aaron; Appleby, Brian S; Hamlin, Clive; Liu, Xiaoqin; Yang, Sheng; Cohen, Yvonne; Chen, Wei; Blevins, Janis; Fausett, Cameron; Wang, Han; Gambetti, Pierluigi; Zhang, Shulin; Hughson, Andrew; Tatsuoka, Curtis; Schonberger, Lawrence B; Cohen, Mark L; Caughey, Byron; Safar, Jiri G
2017-01-01
Several prion amplification systems have been proposed for detection of prions in cerebrospinal fluid (CSF), most recently, the measurements of prion seeding activity with second-generation real-time quaking-induced conversion (RT-QuIC). The objective of this study was to investigate the diagnostic performance of the RT-QuIC prion test in the broad phenotypic spectrum of prion diseases. We performed CSF RT-QuIC testing in 2,141 patients who had rapidly progressive neurological disorders, determined diagnostic sensitivity and specificity in 272 cases that were autopsied, and evaluated the impact of mutations and polymorphisms in the PRNP gene, and type 1 or type 2 human prions on diagnostic performance. The 98.5% diagnostic specificity and 92% sensitivity of CSF RT-QuIC in a blinded retrospective analysis matched the 100% specificity and 95% sensitivity of a blind prospective study. The CSF RT-QuIC differentiated 94% of cases of sporadic Creutzfeldt-Jakob disease (sCJD) MM1 from the sCJD MM2 phenotype, and 80% of sCJD VV2 from sCJD VV1. The mixed prion type 1-2 and cases heterozygous for codon 129 generated intermediate CSF RT-QuIC patterns, whereas genetic prion diseases revealed distinct profiles for each PRNP gene mutation. The diagnostic performance of the improved CSF RT-QuIC is superior to surrogate marker tests for prion diseases such as 14-3-3 and tau proteins, and together with PRNP gene sequencing the test allows the major prion subtypes to be differentiated in vivo. This differentiation facilitates prediction of the clinicopathological phenotype and duration of the disease-two important considerations for envisioned therapeutic interventions. ANN NEUROL 2017;81:79-92. © 2016 American Neurological Association.
Diagnostic and Prognostic Value of Human Prion Detection in Cerebrospinal Fluid
Foutz, Aaron; Appleby, Brian S.; Hamlin, Clive; Liu, Xiaoqin; Yang, Sheng; Cohen, Yvonne; Chen, Wei; Blevins, Janis; Fausett, Cameron; Wang, Han; Gambetti, Pierluigi; Zhang, Shulin; Hughson, Andrew; Tatsuoka, Curtis; Schonberger, Lawrence B.; Cohen, Mark L.; Caughey, Byron; Safar, Jiri G.
2016-01-01
Objective Several prion amplification systems have been proposed for detection of prions in cerebrospinal fluid (CSF), most recently, the measurements of prion seeding activity with second-generation real-time quaking-induced conversion (RT-QuIC). The objective of this study was to investigate the diagnostic performance of the RT-QuIC prion test in the broad phenotypic spectrum of prion diseases. Methods We performed CSF RT-QuIC testing in 2,141 patients who had rapidly progressive neurological disorders, determined diagnostic sensitivity and specificity in 272 cases which were autopsied, and evaluated the impact of mutations and polymorphisms in the PRNP gene, and Type 1 or Type 2 of human prions on diagnostic performance. Results The 98.5% diagnostic specificity and 92% sensitivity of CSF RT-QuIC in a blinded retrospective analysis matched the 100% specificity and 95% sensitivity of a blind prospective study. The CSF RT-QuIC differentiated 94% of cases of sporadic Creutzfeldt-Jakob disease (sCJD) MM1 from the sCJD MM2 phenotype, and 80% of sCJD VV2 from sCJD VV1. The mixed prion type 1–2 and cases heterozygous for codon 129 generated intermediate CSF RT-QuIC patterns, while genetic prion diseases revealed distinct profiles for each PRNP gene mutation. Interpretation The diagnostic performance of the improved CSF RT-QuIC is superior to surrogate marker tests for prion diseases such as 14-3-3 and Tau proteins and together with PRNP gene sequencing, the test allows the major prion subtypes to be differentiated in vivo. This differentiation facilitates prediction of the clinicopathological phenotype and duration of the disease—two important considerations for envisioned therapeutic interventions. PMID:27893164
Pretest analysis of natural circulation on the PWR model PACTEL with horizontal steam generators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kervinen, T.; Riikonen, V.; Ritonummi, T.
A new test facility, the parallel channel test loop (PACTEL), has been designed and built to simulate the major components and system behavior of pressurized water reactors (PWRs) during postulated small- and medium-break loss-of-coolant accidents. Pretest calculations have been performed for the first test series, and the results of these calculations are being used for planning experiments, for adjusting the data acquisition system, and for choosing the optimal position and type of instrumentation. PACTEL is a volumetrically scaled (1:305) model of the VVER-440 PWR. In all the calculated cases, natural circulation was found to be effective in removing the heat from the core to the steam generator. The loop mass flow rate peaked at 60% mass inventory. The straightening of the loop seals increased the mass flow rate significantly.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1982-01-01
Foam spray equipment and materials for dust suppression on longwall double-drum shearer faces have been procured. This equipment includes metering pumps, foam generators and mounting brackets, foam solutions, flow meters, real-time and gravimetric sampling equipment, hoses, and valve banks. Initial tests have been conducted in the laboratory with three types of generators and five types of foam solutions. Based on these tests, Senior Conflow's cluster spray and Onyx Chemical Company's millifoam solution have been selected. For pumping foam solution to the shearer, Jon Bean's 2 hp, 120 VAC single-phase ceramic-lined piston pump has been selected. For field tests, equipment has been installed underground in the Dobbin mine, in the Upper Freeport seam, on an Eickhoff EDW 300 double-drum shearer. Foam spray tests have been conducted, and real-time and gravimetric dust samples have been collected. Real-time sampling results indicate a dust level reduction of up to 37 percent with foam spray compared to the base case of water sprays.
NASA Astrophysics Data System (ADS)
Kopp, G.; Brückmann, S.; Kriescher, M.; Friedrich, H. E.
In times of climate change, vehicle emissions must be reduced significantly. One possibility is to reduce the mass of the body-in-white using lightweight sandwich structures. The department 'Lightweight and Hybrid Design Methods' of the Institute of Vehicle Concepts is developing a vehicle body structure using sandwiches with aluminum top layers and a polyurethane foam core. To this end, the foam and the sandwiches were investigated under different load cases, e.g., pressure loading and in-plane tests. Component tests demonstrated the high potential of the sandwich materials. On the institute's dynamic component test facility, vehicle front structures were tested successfully. The results of all investigations regarding sandwich materials, the integration of functions (e.g., crash, thermal) into vehicle structures, and the LUV concept are being developed under the DLR's Next Generation Car research program. We present the development and results of the LUV.
Analyzing the test process using structural coverage
NASA Technical Reports Server (NTRS)
Ramsey, James; Basili, Victor R.
1985-01-01
A large, commercially developed FORTRAN program was modified to produce structural coverage metrics. The modified program was executed on a set of functionally generated acceptance tests and a large sample of operational usage cases. The resulting structural coverage metrics were combined with fault and error data to evaluate structural coverage. In this software environment, the functionally generated tests were shown to be a good approximation of operational use. The relative proportions of the exercised statement subclasses change as the structural coverage of the program increases. A method was also proposed for evaluating whether two sets of input data exercise a program in a similar manner. Evidence was provided implying that, in this environment, faults revealed in a procedure are independent of the number of times the procedure is executed, and that it may be reasonable to use procedure coverage in software models that use statement coverage. Finally, the evidence suggests that it may be possible to use structural coverage to aid in the management of the acceptance test process.
Comparison of Control Group Generating Methods.
Szekér, Szabolcs; Fogarassy, György; Vathy-Fogarassy, Ágnes
2017-01-01
Retrospective studies suffer from drawbacks such as selection bias. As the selection of the control group has a significant impact on the evaluation of the results, it is very important to find the proper method to generate the most appropriate control group. In this paper we suggest two nearest-neighbor-based control group selection methods that aim to achieve good matching between the individuals of the case and control groups. The effectiveness of the proposed methods is evaluated by runtime and accuracy tests, and the results are compared to the classical stratified sampling method.
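The paper's two methods are not specified in this abstract; as a hedged illustration of the general nearest-neighbor matching idea only, the following sketch greedily pairs each case with its closest still-unmatched candidate control by Euclidean distance over covariate vectors (the covariates and distance choice are assumptions for the example):

```python
import math

def nearest_neighbor_controls(cases, pool):
    """Greedy 1:1 nearest-neighbor selection of a control group.

    `cases` and `pool` are lists of covariate vectors (e.g. (age, BMI)).
    Each case is matched, in order, to the closest candidate control
    that has not been taken yet; returns the chosen pool indices.
    """
    available = list(range(len(pool)))
    matches = []
    for case in cases:
        best = min(available, key=lambda i: math.dist(case, pool[i]))
        matches.append(best)
        available.remove(best)  # matching without replacement
    return matches

cases = [(60.0, 27.5), (45.0, 22.0)]
pool = [(44.0, 21.5), (70.0, 30.0), (61.0, 27.0), (50.0, 25.0)]
print(nearest_neighbor_controls(cases, pool))  # [2, 0]
```

Greedy matching without replacement is order-dependent; optimal-matching variants trade runtime for balance, which is presumably why the paper evaluates both runtime and accuracy.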
Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Waithe, Kenrick A.
2005-01-01
A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet; the model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparison with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well compared to the two-dimensional flat plate case, in which a steady mass flow boundary condition was used to simulate the micro jet. The model was also compared to two three-dimensional flat plate cases that used a steady mass flow boundary condition to simulate a steady micro jet: one with a grid generated to capture the circular shape of the jet, and one without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of velocity distribution were made before and after the jet, and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or several steady micro jets, enabling a preliminary investigation with minimal grid generation and computational time.
Reich, Christian G; Ryan, Patrick B; Schuemie, Martijn J
2013-10-01
A systematic risk identification system has the potential to test marketed drugs for important Health Outcomes of Interest or HOI. For each HOI, multiple definitions are used in the literature, and some of them are validated for certain databases. However, little is known about the effect of different definitions on the ability of methods to estimate their association with medical products. Alternative definitions of HOI were studied for their effect on the performance of analytical methods in observational outcome studies. A set of alternative definitions for three HOI was defined based on literature review and clinical diagnosis guidelines: acute kidney injury, acute liver injury and acute myocardial infarction. The definitions varied by the choice of diagnostic codes and the inclusion of procedure codes and lab values. They were then used to empirically study an array of analytical methods with various analytical choices in four observational healthcare databases. The methods were executed against predefined drug-HOI pairs to generate an effect estimate and standard error for each pair. These test cases included positive controls (active ingredients with evidence to suspect a positive association with the outcome) and negative controls (active ingredients with no evidence to expect an effect on the outcome). Three different performance metrics were used: (i) area under the receiver operating characteristic (ROC) curve (AUC) as a measure of a method's ability to distinguish between positive and negative test cases; (ii) a measure of bias, estimated from the distribution of observed effect estimates for the negative test pairs, where the true effect can be assumed to be one (no relative risk); and (iii) minimal detectable relative risk (MDRR) as a measure of whether there is sufficient power to generate effect estimates.
In the three outcomes studied, different definitions of outcomes show comparable ability to differentiate true from false control cases (AUC) and a similar bias estimation. However, broader definitions generating larger outcome cohorts allowed more drugs to be studied with sufficient statistical power. Broader definitions are preferred since they allow studying drugs with lower prevalence than the more precise or narrow definitions while showing comparable performance characteristics in differentiation of signal vs. no signal as well as effect size estimation.
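Of the three metrics above, the AUC has a compact interpretation: it is the Mann-Whitney probability that a positive control's effect estimate ranks above a negative control's. A minimal sketch, with hypothetical effect estimates standing in for real method output:

```python
def auc(positive_scores, negative_scores):
    """AUC computed as the normalized Mann-Whitney U statistic: the
    fraction of (positive, negative) pairs where the positive control's
    effect estimate is larger (ties count half)."""
    wins = 0.0
    for p in positive_scores:
        for n in negative_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(positive_scores) * len(negative_scores))

# Hypothetical log effect estimates produced by a method run against
# positive and negative drug-outcome control pairs (illustrative only).
positives = [0.9, 1.4, 0.3, 1.1]
negatives = [0.1, -0.2, 0.4, 0.0]
print(auc(positives, negatives))  # 0.9375
```

An AUC of 0.5 means the method cannot separate true from false control cases at all; 1.0 means perfect separation.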
Acharya, Abhinav P; Theisen, Kathryn M; Correa, Andres; Meyyappan, Thiagarajan; Apfel, Abraham; Sun, Tao; Tarin, Tatum V; Little, Steven R
2017-11-01
Although hematuria (blood in urine) is the most common symptom of bladder cancer, 70-98% of hematuria cases are benign. These hematuria patients unnecessarily undergo a costly and invasive evaluation for bladder cancer. Therefore, there remains a need for noninvasive, office-based tests that can rapidly and reliably rule out bladder cancer in patients undergoing hematuria evaluation. Herein, a clinical assay for matrix metalloproteinases ("Ammps") is presented, which generates a visual signal based on the collagenase activity (in the urine of patients) on the Ammps substrates. Ammps substrates are generated by crosslinking gelatin with Fe(II)-chelated alginate nanoparticles, which precipitate in urine samples. The cleavage of the gelatin-conjugated alginate (Fe(II)) nanoparticles by collagenases generates free-floating alginate (Fe(II)) nanoparticles that participate in Fenton's reaction to generate a visual signal. In a pilot study of 88 patients, Ammps had 100% sensitivity, 85% specificity, and a negative predictive value (NPV) of 100% for diagnosing bladder cancer. This high NPV can be useful in ruling out bladder cancer in patients referred for hematuria evaluation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Broadband Fan Noise Prediction System for Turbofan Engines. Volume 3; Validation and Test Cases
NASA Technical Reports Server (NTRS)
Morin, Bruce L.
2010-01-01
Pratt & Whitney has developed a Broadband Fan Noise Prediction System (BFaNS) for turbofan engines. This system computes the noise generated by turbulence impinging on the leading edges of the fan and fan exit guide vane, and noise generated by boundary-layer turbulence passing over the fan trailing edge. BFaNS has been validated on three fan rigs that were tested during the NASA Advanced Subsonic Technology Program (AST). The predicted noise spectra agreed well with measured data. The predicted effects of fan speed, vane count, and vane sweep also agreed well with measurements. The noise prediction system consists of two computer programs: Setup_BFaNS and BFaNS. Setup_BFaNS converts user-specified geometry and flow-field information into a BFaNS input file. From this input file, BFaNS computes the inlet and aft broadband sound power spectra generated by the fan and FEGV. The output file from BFaNS contains the inlet, aft and total sound power spectra from each noise source. This report is the third volume of a three-volume set documenting the Broadband Fan Noise Prediction System: Volume 1: Setup_BFaNS User's Manual and Developer's Guide; Volume 2: BFaNS User's Manual and Developer's Guide; and Volume 3: Validation and Test Cases. The present volume begins with an overview of the Broadband Fan Noise Prediction System, followed by validation studies that were done on three fan rigs. It concludes with recommended improvements and additional studies for BFaNS.
NASA Technical Reports Server (NTRS)
Cook, M.
1990-01-01
Qualification testing of Combustion Engineering's AMDATA Intraspect/98 Data Acquisition and Imaging System, as applied to the redesigned solid rocket motor field joint capture feature case-to-insulation bondline inspection, was performed. Testing was performed at M-111, the Thiokol Corp. Inert Parts Preparation Building. The purpose of the inspection was to verify the integrity of the capture feature area case-to-insulation bondline. The capture feature scanner was calibrated over an intentional 1.0 by 1.0 in. case-to-insulation unbond. The capture feature scanner was then used to scan 60 deg of a capture feature field joint. Calibration of the capture feature scanner was then rechecked over the intentional unbond to ensure that the calibration settings did not change during the case scan. This procedure was successfully performed five times to qualify the unbond detection capability of the capture feature scanner. The capture feature scanner qualified in this test contains many points of mechanical instability that can affect the overall ultrasonic signal response. A new-generation scanner, designated the sigma scanner, should be implemented to replace the current configuration scanner. The sigma scanner eliminates the unstable connection points of the current scanner and has additional inspection capabilities.
Keen, P; Conway, D P; Cunningham, P; McNulty, A; Couldwell, D L; Davies, S C; Smith, D E; Gray, J; Holt, M; O'Connor, C C; Read, P; Callander, D; Prestage, G; Guy, R
2017-01-01
The Trinity Biotech Uni-Gold HIV test (Uni-Gold) is often used as a supplementary rapid test in testing algorithms. To evaluate the operational performance of the Uni-Gold as a first-line screening test among gay and bisexual men (GBM) in a setting where 4th generation HIV laboratory assays are routinely used, we compared the performance of Uni-Gold with conventional HIV serology conducted in parallel among GBM attending 22 testing sites. Sensitivity was calculated separately for acute and established infection, defined using 4th generation screening Ag/Ab immunoassay (EIA) and Western blot results. Previous HIV testing history and results of supplementary 3rd generation HIV Ab EIA and p24 antigen EIA were used to further characterise cases of acute infection. Of 10,793 specimens tested with Uni-Gold and conventional serology, 94 (0.90%, 95%CI:0.70-1.07) were confirmed as HIV-positive by conventional serology, and 37 (39.4%) were classified as acute infection. Uni-Gold sensitivity was 81.9% overall (77/94, 95%CI:72.6-89.1); 56.8% for acute infection (21/37, 95%CI:39.5-72.9) and 98.2% for established infection (56/57, 95%CI:90.6-100.0). Of 17 false non-reactive Uni-Gold results, 16 were acute infections, and of these, seven were p24 antigen reactive but antibody negative. Uni-Gold specificity was 99.9% (10,692/10,699, 95%CI:99.9-100.0), PPV was 91.7% (95%CI:83.6-96.6) and NPV was 99.8% (95%CI:99.7-99.9). In this population, Uni-Gold had good specificity, and sensitivity was high for established infections when compared to 4th generation laboratory assays; however, sensitivity was lower in acute infections. Where rapid tests are used in populations with a high proportion of acute infections, additional testing strategies are needed to detect acute infections. Copyright © 2016 Elsevier B.V. All rights reserved.
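The operating characteristics quoted above follow directly from the 2x2 confusion table; a small sketch, with the counts reconstructed from the abstract's own figures (77 of 94 true infections reactive, 7 of 10,699 negatives falsely reactive):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),  # reactive among true infections
        "specificity": tn / (tn + fp),  # non-reactive among true negatives
        "ppv": tp / (tp + fp),          # true infections among reactives
        "npv": tn / (tn + fn),          # true negatives among non-reactives
    }

# Counts reconstructed from the abstract above.
m = diagnostic_metrics(tp=77, fp=7, fn=17, tn=10692)
print(round(m["sensitivity"] * 100, 1))  # 81.9
print(round(m["specificity"] * 100, 1))  # 99.9
print(round(m["ppv"] * 100, 1))          # 91.7
print(round(m["npv"] * 100, 1))          # 99.8
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on the 0.9% prevalence in this population and would shift in a population with a different infection rate.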
[Effect of maternal death on family dynamics and infant survival].
Reyes Frausto, S; Bobadilla Fernández, J L; Karchmer Krivitzky, S; Martínez González, L
1998-10-01
Family adjustments generated by a maternal death have previously been analysed in Mexico using a reduced number of cases in rural areas. This study was designed to establish the changes in family dynamics generated by a maternal death and to analyse child survival one year after birth. Family members of maternal death cases that occurred during 1988-89 in the Federal District were interviewed a first time to collect information on family dynamics and the women's characteristics. A second interview was conducted one year after birth for cases in which the newborn survived hospital discharge. Simple frequencies were calculated, and groups were compared using the chi-squared test. The main consequences were family disintegration, children acquiring new roles, and economic problems when the woman was the main or only support of the family. Child survival was higher than expected considering other national and international reports. Children were mainly integrated into their grandparents' families.
A coccidioidomycosis outbreak following the Northridge, Calif, earthquake
Schneider, E.; Hajjeh, R.A.; Spiegel, R.A.; Jibson, R.W.; Harp, E.L.; Marshall, G.A.; Gunn, R.A.; McNeil, M.M.; Pinner, R.W.; Baron, R.C.; Burger, R.C.; Hutwagner, L.C.; Crump, C.; Kaufman, L.; Reef, S.E.; Feldman, G.M.; Pappagianis, D.; Werner, S.B.
1997-01-01
Objective. - To describe a coccidioidomycosis outbreak in Ventura County following the January 1994 earthquake, centered in Northridge, Calif, and to identify factors that increased the risk for acquiring acute coccidioidomycosis infection. Design. - Epidemic investigation, population-based skin test survey, and case-control study. Setting. - Ventura County, California. Results. - In Ventura County, between January 24 and March 15, 1994, 203 outbreak-associated coccidioidomycosis cases, including 3 fatalities, were identified (attack rate [AR], 30 cases per 100 000 population). The majority of cases (56%) and the highest AR (114 per 100 000 population) occurred in the town of Simi Valley, a community located at the base of a mountain range that experienced numerous landslides associated with the earthquake. Disease onset for cases peaked 2 weeks after the earthquake. The AR was 2.8 times greater for persons 40 years of age and older than for younger persons (relative risk, 2.8; 95% confidence interval [CI], 2.1-3.7; P<.001). Environmental data indicated that large dust clouds, generated by landslides following the earthquake and strong aftershocks in the Santa Susana Mountains north of Simi Valley, were dispersed into nearby valleys by northeast winds. Simi Valley case-control study data indicated that physically being in a dust cloud (odds ratio, 3.0; 95% CI, 1.6-5.4; P<.001) and time spent in a dust cloud (P<.001) significantly increased the risk for being diagnosed with acute coccidioidomycosis. Conclusions. - Both the location and timing of cases strongly suggest that the coccidioidomycosis outbreak in Ventura County was caused when arthrospores were spread in dust clouds generated by the earthquake. This is the first report of a coccidioidomycosis outbreak following an earthquake.
Public and physician awareness, especially in endemic areas following similar dust cloud-generating events, may result in prevention and early recognition of acute coccidioidomycosis.
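The case-control analysis above reports odds ratios with 95% confidence intervals; a minimal sketch of the standard Woolf (log-normal) interval for a 2x2 exposure table, using hypothetical counts, since the abstract reports the OR of 3.0 but not the raw table:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% confidence interval for a 2x2 table:
    a = exposed cases,   b = exposed controls,
    c = unexposed cases, d = unexposed controls.
    The CI is computed on the log scale, where log(OR) is roughly normal
    with standard error sqrt(1/a + 1/b + 1/c + 1/d)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts chosen only to illustrate the calculation.
or_, lo, hi = odds_ratio_ci(a=60, b=40, c=30, d=60)
print(round(or_, 1))  # 3.0
```

An interval whose lower bound stays above 1 (as in the study's 1.6-5.4) is what supports the conclusion that dust-cloud exposure significantly increased risk.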
NASA Technical Reports Server (NTRS)
Brown, David B.
1988-01-01
A history of the Query Utility Environment for Software Testing (QUEST)/Ada is presented. A fairly comprehensive literature review targeted at issues of Ada testing is given. The definition of the system structure and the high-level interfaces are then presented, and the design of the three major components is described. The QUEST/Ada IORL system specifications to date are included in the appendix, along with a paper that gives statistical evidence of the validity of the test case generation approach being integrated into QUEST/Ada.
The mixed-mode bending method for delamination testing
NASA Technical Reports Server (NTRS)
Reeder, James R.; Crews, John H., Jr.
1989-01-01
A mixed-mode bending (MMB) test procedure is presented which combines double cantilever beam mode I loading and end-notched flexure mode II loading on a split, unidirectional laminate. The MMB test has been analyzed by the finite element method and by beam theory to ascertain the respective strain energy release rates of the mode I and mode II components, G(I) and G(II); these analyses indicate that a wide range of G(I)/G(II) ratios can be generated by varying the applied load's position on the loading lever. The MMB specimen analysis and test procedures are demonstrated for the case of AS4/PEEK unidirectional laminates.
Allergy to dexchlorpheniramine. Study of a case.
Cáceres Calle, O; Fernández-Benítez, M
2004-01-01
Dexchlorpheniramine (DH) is a classical or first generation antihistamine belonging to the ethanolamine group. Adverse effects related to these antihistamines are frequent, but the hypersensitivity reactions described in the literature since 1940 are exceptional. We report the case of a 32-year-old woman who experienced two episodes of akathisia secondary to intravenous (i.v.) dexchlorpheniramine administration for a possible hypersensitivity reaction to local anesthetics. The allergological study consisted of the following tests: skin prick tests with routine allergens, with a negative result; skin prick and intradermal tests with local anesthetics and DH, with a positive result to DH in the intradermal skin test (++); serum specific IgE, which was within normal levels; a histamine release test with DH, with a negative result; and the basophil activation test (BAT) with local anesthetics and DH, which was positive for DH and weakly positive to lidocaine. BAT is proving to be a highly useful tool in the field of drug allergy, with a higher sensitivity and specificity than other in vitro tests. Because it avoids the need for provocation tests, this is especially important in drug-induced allergic reactions in which in vivo tests are repeatedly negative despite a clear clinical history.
Reconstruction of multiple cracks from experimental electrostatic boundary measurements
NASA Technical Reports Server (NTRS)
Bryan, Kurt; Liepa, Valdis; Vogelius, Michael
1993-01-01
An algorithm for recovering a collection of linear cracks in a homogeneous electrical conductor from boundary measurements of voltages induced by specified current fluxes is described. The technique is a variation of Newton's method and is based on taking weighted averages of the boundary data. An apparatus that was constructed specifically for generating laboratory data on which to test the algorithm is also described. The algorithm is applied to a number of different test cases and the results are discussed.
Frommeyer, Gerrit; Zumhagen, Sven; Dechering, Dirk G; Larbig, Robert; Bettin, Markus; Löher, Andreas; Köbe, Julia; Reinke, Florian; Eckardt, Lars
2016-03-15
The results of the recently published randomized SIMPLE trial question the role of routine intraoperative defibrillation testing. However, testing is still recommended during implantation of the entirely subcutaneous implantable cardioverter-defibrillator (S-ICD) system. To address the question of whether defibrillation testing in S-ICD systems is still necessary, we analyzed the data of a large, standard-of-care prospective single-center S-ICD registry. In the present study, 102 consecutive patients received an S-ICD for primary (n=50) or secondary prevention (n=52). Defibrillation testing was performed in all except 4 patients. In 74 (75%; 95% CI 0.66-0.83) of 98 patients, ventricular fibrillation was effectively terminated by the first programmed internal shock. In 24 (25%; 95% CI 0.22-0.44) of 98 patients, the first internal shock was ineffective and further internal or external shock deliveries were required. In these patients, programming to reversed shock polarity (n=14) or repositioning of the sensing lead (n=1) or the pulse generator (n=5) led to successful defibrillation. In 4 patients, a safety margin of <10 J was not attained. Nevertheless, in these 4 patients, ventricular arrhythmias were effectively terminated with an internal 80-J shock. Although it has been shown that defibrillation testing is not necessary in transvenous ICD systems, it seems particularly important for S-ICD systems, because in nearly 25% of the cases the primary intraoperative test was not successful. In most cases, a successful defibrillation could be achieved by changing shock polarity or by optimizing the shock vector through repositioning of the pulse generator or lead. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
Software engineering techniques and CASE tools in RD13
NASA Astrophysics Data System (ADS)
Buono, S.; Gaponenko, I.; Jones, R.; Khodabandeh, A.; Mapelli, L.; Mornacchi, G.; Prigent, D.; Sanchez-Corral, E.; Skiadelli, M.; Toppers, A.; Duval, P. Y.; Ferrato, D.; Le Van Suu, A.; Qian, Z.; Rondot, C.; Ambrosini, G.; Fumagalli, G.; Polesello, G.; Aguer, M.; Huet, M.
1994-12-01
The RD13 project was approved in April 1991 for the development of a scalable data-taking system suitable for hosting various LHC studies. One of its goals is the exploitation of software engineering techniques, in order to indicate their overall suitability for data acquisition (DAQ), software design and implementation. This paper describes how such techniques have been applied to the development of components of the RD13 DAQ used in test-beam runs at CERN. We describe our experience with the Artifex CASE tool and its associated methodology. The issues raised when code generated by a CASE tool has to be integrated into an existing environment are also discussed.
Outage maintenance checks on large generator windings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nindra, B.; Jeney, S.I.; Slobodinsky, Y.
In the present days of austerity, more constraints and pressures are being brought on maintenance engineers to certify generators for reliability and life extension. Outages are shorter, and the intervals between outages are becoming longer. Annual outages were very common when utilities had no regulatory constraints and also had standby capacity. Furthermore, due to lean budgets, outage maintenance programs are being pursued more aggressively, so that longer-interval outages can be achieved while ensuring peak generator performance. This paper will discuss various visual checks, electrical tests, and recommended fixes to achieve the above-mentioned objectives, in case any deficiencies are found.
The role of high-risk HPV-DNA testing in the male sexual partners of women with HPV-induced lesions.
Giraldo, Paulo C; Eleutério, Jose; Cavalcante, Diane Isabelle M; Gonçalves, Ana Katherine S; Romão, Juliana A A; Eleutério, Renata M N
2008-03-01
The objectives were to assess the prevalence of high-risk HPV in the male sexual partners of women with HPV-induced lesions and to correlate it with biopsies guided by peniscopy. Fifty-four asymptomatic male sexual partners of women with low-grade squamous intra-epithelial lesions (LSIL) associated with high-risk HPV were examined between April 2003 and June 2005. HPV DNA was tested using a second-generation hybrid capture technique on scraped penile samples. Peniscopy identified acetowhite lesions leading to biopsy. High-risk HPV was present in 25.9% (14 out of 54) of the cases. Peniscopy led to 13 biopsies (24.07%), which resulted in two cases of condyloma, two cases of penile intra-epithelial neoplasia (PIN) I, one case of PIN II, and eight cases of normal tissue. The high-risk HPV test demonstrated 80% sensitivity, 100% specificity, a 100% positive predictive value, and an 88.9% negative predictive value for the identification of penile lesions. There was a greater chance of finding HPV lesions in the biopsy in cases positive for high-risk HPV with abnormal peniscopy (p=0.007); OR=51 (CI 1.7-1527.1). Among asymptomatic male sexual partners of women with low-grade intra-epithelial squamous lesions, those infected by high-risk HPV have a higher chance of having abnormal penile tissue compared with male partners without that infection.
Trends in liability affecting technical writers
NASA Technical Reports Server (NTRS)
Driskill, L. P.
1981-01-01
Liability of technical writers for defective products is explored. Documents generated during a product's life cycle (including design memos, design tests, clinical trials, trial use reports, letters, and proposals) are relevant because they are likely to be the only available means of showing that the product was not defectively designed. These documents become the evidence that the product underwent balanced and well-considered planning, development, testing, quality control, and field testing. The predicted increase in the involvement of technical writers in the prevention and defense of product liability claims is discussed in view of the growing number of cases turning on "failure to warn".
Ventrella, Emanuela; Adamski, Zbigniew; Chudzińska, Ewa; Miądowicz-Kobielska, Mariola; Marciniak, Paweł; Büyükgüzel, Ender; Büyükgüzel, Kemal; Erdem, Meltem; Falabella, Patrizia; Scrano, Laura; Bufo, Sabino Aurelio
2016-01-01
Glycoalkaloids are secondary metabolites commonly found in Solanaceae plants. They have anti-bacterial, anti-fungal and insecticidal activities. In the present study we examined the effects of potato and tomato leaf extracts and their main components, the glycoalkaloids α-solanine, α-chaconine and α-tomatine, on the development and reproduction of Drosophila melanogaster wild-type flies at different stages. The parental generation was exposed to five different concentrations of the tested substances. The effects were also examined in the next, non-exposed generation. In the first (exposed) generation, the addition of each extract reduced the number of organisms reaching the pupal and imaginal stages. Parent insects exposed to extracts and to individually applied metabolites showed faster development; however, the effect was weaker for single metabolites than for exposure to extracts. An increase in developmental rate was also observed in the next, non-exposed generation. The imagoes of both generations exposed to extracts and pure metabolites showed anomalies in body size and malformations, such as deformed wings and abdomens and a smaller black abdominal zone. Our results further support the current idea that Solanaceae can be an impressive source of molecules that could be used efficaciously in crop protection, as natural extracts or in formulations of single pure metabolites, in sustainable agriculture.
Postbuckling and Growth of Delaminations in Composite Plates Subjected to Axial Compression
NASA Technical Reports Server (NTRS)
Reeder, James R.; Chunchu, Prasad B.; Song, Kyongchan; Ambur, Damodar R.
2002-01-01
The postbuckling response and growth of circular delaminations in flat and curved plates are investigated as part of a study to identify the criticality of delamination locations through the laminate thickness. The experimental results from tests on delaminated plates are compared with finite element analysis results generated using shell models. The analytical prediction of delamination growth is obtained by assessing the strain energy release rate results from the finite element model and comparing them to a mixed-mode fracture toughness failure criterion. The analytical results for onset of delamination growth compare well with experimental results generated using a 3-dimensional displacement visualization system. The record of delamination progression measured in this study has resulted in a fully 3-dimensional test case with which progressive failure models can be validated.
NASA Astrophysics Data System (ADS)
Guerra, J. E.; Ullrich, P. A.
2015-12-01
Tempest is a next-generation global climate and weather simulation platform designed to allow experimentation with numerical methods at very high spatial resolutions. The atmospheric fluid equations are discretized by continuous / discontinuous finite elements in the horizontal and by a staggered nodal finite element method (SNFEM) in the vertical, coupled with implicit/explicit time integration. At global horizontal resolutions below 10km, many important questions remain on optimal techniques for solving the fluid equations. We present results from a suite of meso-scale test cases to validate the performance of the SNFEM applied in the vertical. Internal gravity wave, mountain wave, convective, and Cartesian baroclinic instability tests will be shown at various vertical orders of accuracy and compared with known results.
Transire, a Program for Generating Solid-State Interface Structures
2017-09-14
function-based electron transport property calculator. Three test cases are presented to demonstrate the usage of Transire: the misorientation of the... graphene bilayer, the interface energy as a function of misorientation of copper grain boundaries, and electron transport transmission across the... gallium nitride/silicon carbide interface. Keywords: crystalline interface, electron transport, python, computational chemistry, grain boundary.
Generational Change: Closing the Test Score Gap
ERIC Educational Resources Information Center
Peterson, Paul E., Ed.
2006-01-01
In the 2003 Grutter v. Bollinger University of Michigan Law School affirmative action case, Sandra Day O'Connor declared on behalf of the majority of justices that, "We expect that 25 years from now, the use of racial preferences will no longer be necessary." As this amounts to no less than a mandate that affirmative action for college…
Transitioning from Software Requirements Models to Design Models
NASA Technical Reports Server (NTRS)
Lowry, Michael (Technical Monitor); Whittle, Jon
2003-01-01
Summary: 1. Proof-of-concept of state machine synthesis from scenarios (CTAS case study). 2. The CTAS team wants to use the synthesis algorithm to validate trajectory generation. 3. Extending the synthesis algorithm towards requirements validation: (a) scenario relationships; (b) a methodology for generalizing/refining scenarios; and (c) interaction patterns to control synthesis. 4. Initial ideas tested on conflict detection scenarios.
Recycled fiber quality from a laboratory-scale blade separator/blender
Bei-Hong Liang; Stephen M. Shaler; Laurence Mott; Leslie Groom
1994-01-01
A simple and inexpensive fiber separator/blender was developed to generate useful secondary fibers from hydropulped waste paper. Processing wet hydropulped fiber resulted in a furnish with no change in average fiber length in three out of four types of recycled fibers tested. In all cases, the Canadian Standard freeness increased after processing compared to...
Fuller, Maren Y; Mody, Dina; Hull, April; Pepper, Kristi; Hendrickson, Heather; Olsen, Randall
2018-02-01
Thyroid nodules have a prevalence of approximately 70% in adults. Fine-needle aspiration (FNA) is a minimally invasive, cost-effective, standard method to collect tissue from thyroid nodules for cytologic examination. However, approximately 15% of thyroid FNA specimens cannot be unambiguously diagnosed as benign or malignant. The objective was to investigate whether clinically actionable data can be obtained using next-generation sequencing of residual needle rinse material. A total of 24 residual needle rinse specimens with malignant (n = 6), indeterminate (n = 9), or benign (n = 9) thyroid FNA diagnoses were analyzed in our clinical molecular diagnostics laboratory using next-generation sequencing assays designed to detect gene mutations and translocations that commonly occur in thyroid cancer. Results were correlated with surgical diagnoses and clinical outcomes. Interpretable data were generated from 23 of 24 residual needle rinse specimens. Consistent with its well-known role in thyroid malignancy, the BRAF V600E mutation was detected in 4 malignant cases. An NRAS mutation was detected in 1 benign case. No mutations were detected in specimens with indeterminate diagnoses. Our data demonstrate that residual thyroid FNA needle rinses are an adequate source of material for molecular diagnostic testing. Importantly, detection of a mutation implicated in thyroid malignancy was predictive of the final surgical diagnosis and clinical outcome. Our strategy to triage thyroid nodules with indeterminate cytology using molecular testing eliminates the need to perform additional FNA passes into dedicated media or to schedule additional invasive procedures. Further investigation with a larger sample size to confirm the clinical utility of our proposed strategy is underway.
A Numerical Simulation of a Normal Sonic Jet into a Hypersonic Cross-Flow
NASA Technical Reports Server (NTRS)
Jeffries, Damon K.; Krishnamurthy, Ramesh; Chandra, Suresh
1997-01-01
This study involves numerical modeling of a normal sonic jet injection into a hypersonic cross-flow. The numerical code used for simulation is GASP (General Aerodynamic Simulation Program). First, the numerical predictions are compared with well-established solutions for compressible laminar flow. Then comparisons are made with non-injection test case measurements of surface pressure distributions, and good agreement with the measurements is observed. Comparisons with the injection case are currently underway. All the experimental data were generated at the Southampton University Light Piston Isentropic Compression Tube.
TSA - A Two Scale Approximation for Wind-Generated Ocean Surface Waves
2012-09-30
broad-scale version of TSA, or 'dTSA'. In this manner dTSA is able to respond to changing wind situations. Results were shown to compare well with 'exact... We also implemented the revised version of TSA, denoted 'dTSA', in WW3 for tests with a storm case, hurricane Juan, which made landfall as a... manner in which the broad scale of TSA was defined, developing 'dTSA' as described above, so that in complicated rapidly changing wave spectra cases, a
Implementation and utilization of the molecular tumor board to guide precision medicine.
Harada, Shuko; Arend, Rebecca; Dai, Qian; Levesque, Jessica A; Winokur, Thomas S; Guo, Rongjun; Heslin, Martin J; Nabell, Lisle; Nabors, L Burt; Limdi, Nita A; Roth, Kevin A; Partridge, Edward E; Siegal, Gene P; Yang, Eddy S
2017-08-22
With rapid advances in genomic medicine, the complexity of delivering precision medicine to oncology patients across a university health system demanded the creation of a Molecular Tumor Board (MTB) for patient selection and assessment of treatment options. The objective of this report is to analyze our progress to date and discuss the importance of the MTB in the implementation of personalized medicine. Patients were reviewed in the MTB for appropriateness for comprehensive next-generation sequencing (NGS) cancer gene set testing based on set criteria that were in place. Because profiling of stage IV lung cancer, colon cancer, and melanoma was standard of care, these cancer types were excluded from this process. We subsequently analyzed the types of cases referred for testing and approved, with regard to their results. 191 cases were discussed at the MTB and 132 cases were approved for testing. Forty-six cases (34.8%) had driver mutations that were associated with an active targeted therapeutic agent, including BRAF, PIK3CA, IDH1, KRAS, and BRCA1. An additional 56 cases (42.4%) had driver mutations previously reported in some type of cancer. Twenty-two cases (16.7%) did not have any clinically significant mutations. Eight cases did not yield adequate DNA. 15 cases were considered for targeted therapy, 13 of which received targeted therapy. One patient experienced a near complete response. Seven of 13 had stable disease or a partial response. The MTB at the University of Alabama at Birmingham is unique because it reviews the appropriateness of NGS testing for patients with recurrent cancer and serves as a forum to educate our physicians about the pathways of precision medicine. Our results suggest that our detection of actionable mutations may be higher due to our careful selection. The application of precision medicine and molecular genetic testing for cancer patients remains a continuous educational process for physicians.
McInnes, L Alison; González, Patricia Jiménez; Manghi, Elina R; Esquivel, Marcela; Monge, Silvia; Delgado, Marietha Fallas; Fournier, Eduardo; Bondy, Pamela; Castelle, Kathryn
2005-03-21
Autism is a heritable developmental disorder of communication and socialization that has not been well studied in Hispanic populations. Therefore, we are collecting and evaluating all possible cases of autism from a population isolate in the Central Valley of Costa Rica (CVCR) for a clinical and genetic study. We are assessing all subjects and parents, as appropriate, using the newly translated Spanish versions of the Autism Diagnostic Interview-Revised (ADI-R) and the Autism Diagnostic Observation Schedule (ADOS), as well as tests of intelligence and adaptive behavior. Detailed obstetric and family medical/psychiatric histories are taken. All cases are tested for Fragile X and will be extensively evaluated for cytogenetic abnormalities. To date we have obtained clinical evaluations on over 76 cases of possible autism referred to our study and report data for the initial 35 complete cases. The mean age of the probands is 6.7 years, and 31 of the 35 cases are male. Twenty-one of the cases have IQs < 50 and only 6 cases have IQs ≥ 70. Over half of the mothers had complications during pregnancy and/or delivery. No cases have tested positively for Fragile X or PKU. Chromosomal G-banding is not yet complete for all cases. Diagnostic data gathered on cases of autism in the CVCR using Spanish versions of the ADI-R and ADOS look similar to those generated by studies of English-speaking cases. However, only 17% of our cases have IQs within the normal range, compared with the figure of 25% seen in most studies. This result reflects an ascertainment bias: only severe cases of autism come to treatment in the CVCR because there are no government-sponsored support programs or early intervention programs providing an incentive to diagnose autism. The severity of mental retardation seen in most of our cases may also be exaggerated by the lack of early intervention programs and the use of IQ tests without Costa Rican norms.
Still, we must formally train healthcare providers and teachers to recognize and refer autistic cases with normal or near normal IQs that are not seen in treatment.
True random bit generators based on current time series of contact glow discharge electrolysis
NASA Astrophysics Data System (ADS)
Rojas, Andrea Espinel; Allagui, Anis; Elwakil, Ahmed S.; Alawadhi, Hussain
2018-05-01
Random bit generators (RBGs) in today's digital information and communication systems employ high-rate physical entropy sources such as electronic, photonic, or thermal time series signals. However, the proper functioning of such physical systems is bound by specific constraints that make them in some cases weak and susceptible to external attacks. In this study, we show that the electrical current time series of contact glow discharge electrolysis, a dc voltage-powered micro-plasma in liquids, can be used for generating random bit sequences over a wide range of high dc voltages. The current signal is quantized into a binary stream by first applying a simple moving average function, which centers the distribution around zero, and then applying logical operations that enable the binarized data to pass all tests in the industry-standard randomness test suite of the National Institute of Standards and Technology. Furthermore, the robustness of this RBG against power supply attacks has been examined and verified.
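The quantization pipeline described (moving-average centering followed by logical operations) can be sketched generically. The signal below is a synthetic Gaussian series standing in for the measured discharge current, and the XOR whitening step is an illustrative choice, not the paper's exact operations:

```python
import random

def binarize(signal, window=8):
    """Compare each sample to a trailing moving average (centering the
    distribution around zero), then XOR adjacent raw bits as a simple
    whitening step to reduce residual bias."""
    bits = []
    for i in range(window, len(signal)):
        local_mean = sum(signal[i - window:i]) / window
        bits.append(1 if signal[i] > local_mean else 0)
    return [bits[j] ^ bits[j + 1] for j in range(0, len(bits) - 1, 2)]

random.seed(0)
current = [random.gauss(0.0, 1.0) for _ in range(1000)]  # synthetic stand-in
stream = binarize(current)
print(len(stream), sum(stream) / len(stream))  # a roughly balanced bit stream
```

A real deployment would feed the digitized discharge current in place of the synthetic series and validate the output with the NIST statistical test suite.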
The Use of Profanity During Letter Fluency Tasks in Frontotemporal Dementia and Alzheimer's Disease
Ringman, John M.; Kwon, Eunice; Flores, Deborah L.; Rotko, Carol; Mendez, Mario F.; Lu, Po
2012-01-01
Objective To assess whether the production of profanity during letter fluency testing distinguishes frontotemporal dementia (FTD) and Alzheimer's disease (AD) patients. Background Alterations in language and social behavior typify FTD spectrum disorders. Nonetheless, it can be difficult to distinguish pathologically-defined frontotemporal lobar degeneration (FTLD) from AD clinically. Assessing verbal fluency by having patients generate as many words as they can beginning with specific letters in a given period of time can yield diverse information of diagnostic utility. Method Words produced during FAS letter fluency testing were reviewed, and instances of the use of "f*ck", "*ss", "sh*t" and other words felt to be inappropriate were sought. The frequency of these words was compared between clinically diagnosed FTD and AD patients using chi-square tests. Results We found that 6/32 (18.8%) patients with FTD generated the word "f*ck" during the "F" trial, as opposed to none of 38 patients with AD (p = 0.007). Patients who said "f*ck" had diagnoses of behavioral variant FTD (3/15), progressive non-fluent aphasia (2/8), or semantic dementia (1/3). Conclusions Though the specific neuropathology in these cases is uncertain, generation of "f*ck" during letter fluency testing appears to have utility in differentiating FTD from AD. PMID:20829665
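The key contrast (6 of 32 FTD vs 0 of 38 AD patients) can also be checked with a one-sided Fisher exact test, which for this table yields a p-value of the same magnitude as the reported chi-square result. A stdlib-only sketch:

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    probability of observing a or more successes in row 1 under the null."""
    n, k, row1 = a + b + c + d, a + c, a + b
    return sum(comb(row1, x) * comb(c + d, k - x)
               for x in range(a, min(k, row1) + 1)) / comb(n, k)

# Row 1: FTD patients (6 said the word, 26 did not); row 2: AD patients.
p = fisher_one_sided(6, 26, 0, 38)
print(round(p, 3))  # 0.007, in line with the reported p = 0.007
```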
USM3D Simulations for Second Sonic Boom Workshop
NASA Technical Reports Server (NTRS)
Elmiligui, Alaa; Carter, Melissa B.; Nayani, Sudheer N.; Cliff, Susan; Pearl, Jason M.
2017-01-01
The NASA Tetrahedral Unstructured Software System with the USM3D flow solver was used to compute test cases for the Second AIAA Sonic Boom Prediction Workshop (SBPW2). The intent of this report is to document the USM3D results for the SBPW2 test cases. The test cases included an axisymmetric equivalent area body, a JAXA wing body, and a NASA low-boom supersonic configuration modeled with flow-through nacelles and with engine boundary conditions. All simulations were conducted for a free-stream Mach number of 1.6, zero degrees angle of attack, and a Reynolds number of 5.7 million per meter. Simulations were conducted on tetrahedral grids provided by the workshop committee, as well as on a family of grids generated by an in-house approach for sonic boom analyses known as BoomGrid using current best practices. The near-field pressure signatures were extracted and propagated to the ground with the atmospheric propagation code sBOOM. The USM3D near-field pressure signatures, corresponding sBOOM ground signatures, and loudness levels on the ground are compared with mean values from other workshop participants.
Ghose, Soumya; Greer, Peter B; Sun, Jidi; Pichler, Peter; Rivest-Henault, David; Mitra, Jhimli; Richardson, Haylea; Wratten, Chris; Martin, Jarad; Arm, Jameen; Best, Leah; Dowling, Jason A
2017-10-27
In MR only radiation therapy planning, generation of the tissue specific HU map directly from the MRI would eliminate the need for CT image acquisition and may improve radiation therapy planning. The aim of this work is to generate and validate substitute CT (sCT) scans generated from standard T2 weighted MR pelvic scans in prostate radiation therapy dose planning. A Siemens Skyra 3T MRI scanner with laser bridge, flat couch and pelvic coil mounts was used to scan 39 patients scheduled for external beam radiation therapy for localized prostate cancer. For sCT generation a whole pelvis MRI (1.6 mm 3D isotropic T2w SPACE sequence) was acquired. Patients received a routine planning CT scan. Co-registered whole pelvis CT and T2w MRI pairs were used as training images. Advanced tissue specific non-linear regression models to predict HU for fat, muscle, bladder and air were created from co-registered CT-MRI image pairs. On a test case T2w MRI, the bones and bladder were automatically segmented using a novel statistical shape and appearance model, while other soft tissues were separated using an Expectation-Maximization based clustering model. The CT bone in the training database that was most 'similar' to the segmented bone was then transformed with deformable registration to create the sCT component of the test case T2w MRI bone tissue. Predictions for the bone, air and soft tissue from the separate regression models were successively combined to generate a whole pelvis sCT. The change in monitor units between the sCT-based plans relative to the gold standard CT plan for the same IMRT dose plan was found to be 0.3% ± 0.9% (mean ± standard deviation) for 39 patients. The 3D Gamma pass rate was 99.8% ± 0.0% (2 mm/2%). The novel hybrid model is computationally efficient, generating an sCT in 20 min from standard T2w images for prostate cancer radiation therapy dose planning and DRR generation.
Fast solver for large scale eddy current non-destructive evaluation problems
NASA Astrophysics Data System (ADS)
Lei, Naiguang
Eddy current testing plays a very important role in the non-destructive evaluation of conducting test samples. Based on Faraday's law, an alternating magnetic field source generates induced currents, called eddy currents, in an electrically conducting test specimen. The eddy currents generate induced magnetic fields that oppose the direction of the inducing magnetic field in accordance with Lenz's law. In the presence of discontinuities in material property or defects in the test specimen, the induced eddy current paths are perturbed and the associated magnetic fields can be detected by coils or magnetic field sensors, such as Hall elements or magneto-resistance sensors. Due to the complexity of test specimens and inspection environments, theoretical simulation models are extremely valuable for studying basic field/flaw interactions in order to obtain a fuller understanding of non-destructive testing phenomena. Theoretical models of the forward problem are also useful for training and validation of automated defect detection systems, since they generate defect signatures that are expensive to replicate experimentally. In general, modelling methods can be classified into two categories: analytical and numerical. Although analytical approaches offer closed-form solutions, these are generally not possible to obtain, largely due to complex sample and defect geometries, especially in three-dimensional space. Numerical modelling has become popular with advances in computer technology and computational methods. However, due to the huge time consumption in the case of large scale problems, accelerations/fast solvers are needed to enhance numerical models. This dissertation describes a numerical simulation model for eddy current problems using finite element analysis. Validation of the accuracy of this model is demonstrated via comparison with experimental measurements of steam generator tube wall defects. These simulations, which generate two-dimensional raster scan data, typically take one to two days on a dedicated eight-core PC. A novel direct integral solver for eddy current problems, with a GPU-based implementation, is also investigated in this research to reduce the computational time.
NASA Astrophysics Data System (ADS)
Szurgacz, Dawid; Brodny, Jaroław
2018-01-01
A powered roof support is a machine responsible for protecting an underground excavation against deformation generated by the rock mass. In the case of dynamic impact of the rock mass, the proper level of protection is hard to achieve. Therefore, the units of the roof support and its components are subject to detailed tests aimed at achieving greater reliability, efficiency and efficacy. In the course of such tests, however, it is not always possible to foresee the values of load that may occur in actual conditions. The article presents a case of a dynamic load impacting a powered roof support during a high-energy tremor in an underground hard coal mine. The authors discuss a method for selecting powered roof support units appropriate for specific forecasted load conditions. The method takes into account the construction of the support and the mining and geological conditions of an excavation. Moreover, the paper includes tests carried out on the hydraulic legs and yield valves which were responsible for additional yielding of the support. Real loads impacting the support unit during tremors are analysed. The results indicated that the real registered values of the load were significantly greater than the forecasted values. The analysis of roof support operation during dynamic impact generated by the rock mass (real life conditions) prompted the authors to develop a set of recommendations for manufacturers and users of powered roof supports. These include, inter alia, the need for innovative solutions for testing hydraulic section systems.
A monitoring tool for performance improvement in plastic surgery at the individual level.
Maruthappu, Mahiben; Duclos, Antoine; Orgill, Dennis; Carty, Matthew J
2013-05-01
The assessment of performance in surgery is expanding significantly. Application of relevant frameworks to plastic surgery, however, has been limited. In this article, the authors present two robust graphic tools commonly used in other industries that may serve to monitor individual surgeon operative time while factoring in patient- and surgeon-specific elements. The authors reviewed performance data from all bilateral reduction mammaplasties performed at their institution by eight surgeons between 1995 and 2010. Operative time was used as a proxy for performance. Cumulative sum charts and exponentially weighted moving average charts were generated using a train-test analytic approach, and used to monitor surgical performance. Charts mapped crude, patient case-mix-adjusted, and case-mix and surgical-experience-adjusted performance. Operative time was found to decline from 182 minutes to 118 minutes with surgical experience (p < 0.001). Cumulative sum and exponentially weighted moving average charts were generated using 1995 to 2007 data (1053 procedures) and tested on 2008 to 2010 data (246 procedures). The sensitivity and accuracy of these charts were significantly improved by adjustment for case mix and surgeon experience. The consideration of patient- and surgeon-specific factors is essential for correct interpretation of performance in plastic surgery at the individual surgeon level. Cumulative sum and exponentially weighted moving average charts represent accurate methods of monitoring operative time to control and potentially improve surgeon performance over the course of a career.
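CUSUM and EWMA charts are standard control-chart statistics and are easy to sketch. The operative times below are hypothetical, chosen only to echo the reported decline from 182 to 118 minutes; the paper's charts additionally adjust for case mix and surgeon experience:

```python
def ewma(series, lam=0.2):
    """Exponentially weighted moving average chart statistic."""
    z, out = series[0], []
    for x in series:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out

def cusum_upper(series, target, allowance=0.5):
    """One-sided upper CUSUM: accumulates excursions above target + allowance."""
    s, out = 0.0, []
    for x in series:
        s = max(0.0, s + (x - target - allowance))
        out.append(s)
    return out

# Hypothetical operative times (minutes) drifting down with experience.
times = [182, 175, 169, 160, 151, 140, 133, 125, 118]
print(ewma(times))
print(cusum_upper(times, target=150))
```

In a monitoring application, a point crossing a control limit (e.g. the target plus a multiple of the process standard deviation) would flag a performance shift.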
NASA Astrophysics Data System (ADS)
Srinivas, Kadivendi; Vundavilli, Pandu R.; Manzoor Hussain, M.; Saiteja, M.
2016-09-01
Welding input parameters such as current, gas flow rate and torch angle play a significant role in determining the qualitative mechanical properties of a weld joint. Traditionally, it is necessary to determine the weld input parameters for every new welded product to obtain a quality weld joint, which is time consuming. In the present work, the effect of plasma arc welding parameters on mild steel was studied using a neural network approach. To obtain a response equation that governs the input-output relationships, conventional regression analysis was also performed. The experimental data were constructed based on a Taguchi design, and the training data required for the neural networks were randomly generated by varying the input variables within their respective ranges. The responses were calculated for each combination of input variables by using the response equations obtained through the conventional regression analysis. The performances of a Levenberg-Marquardt back-propagation neural network and a radial basis neural network (RBNN) were compared on various randomly generated test cases, which are different from the training cases. From the results, it is interesting to note that for these test cases RBNN analysis gave improved results compared to the feed-forward back-propagation neural network analysis. Also, RBNN analysis showed a pattern of increasing performance as the data points moved away from the initial input values.
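With fixed centers, an RBNN is linear in its output weights, so training reduces to least squares. A minimal sketch on a toy one-dimensional response surface (the data, centers and basis width are illustrative, not the study's weld parameters):

```python
import math

def rbf_features(x, centers, gamma=10.0):
    """Gaussian radial basis features for a scalar input."""
    return [math.exp(-gamma * (x - c) ** 2) for c in centers]

def solve(a, b):
    """Solve a small linear system a w = b by Gaussian elimination."""
    n = len(b)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(m[r][i]))  # partial pivoting
        m[i], m[p] = m[p], m[i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n + 1):
                m[r][c] -= f * m[i][c]
    w = [0.0] * n
    for i in range(n - 1, -1, -1):
        w[i] = (m[i][n] - sum(m[i][j] * w[j] for j in range(i + 1, n))) / m[i][i]
    return w

def fit_rbnn(xs, ys, centers):
    """Least-squares output weights of an RBF network (normal equations)."""
    phi = [rbf_features(x, centers) for x in xs]
    k = len(centers)
    a = [[sum(p[i] * p[j] for p in phi) for j in range(k)] for i in range(k)]
    b = [sum(p[i] * y for p, y in zip(phi, ys)) for i in range(k)]
    return solve(a, b)

def predict(x, w, centers):
    return sum(wi * p for wi, p in zip(w, rbf_features(x, centers)))

# Toy response surface standing in for the weld-quality regression model.
xs = [i / 10 for i in range(11)]
ys = [math.sin(3 * x) for x in xs]
centers = [0.0, 0.25, 0.5, 0.75, 1.0]
w = fit_rbnn(xs, ys, centers)
print(predict(0.5, w, centers))
```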
NASA Astrophysics Data System (ADS)
Tolen, J.; Kodra, E. A.; Ganguly, A. R.
2011-12-01
The assertion that higher-resolution experiments or more sophisticated process models within the IPCC AR5 CMIP5 suite of global climate model ensembles improve precipitation projections over the IPCC AR4 CMIP3 suite remains a hypothesis that needs to be rigorously tested. The questions are particularly important for local to regional assessments at scales relevant to the management of critical infrastructures and key resources, particularly for the attributes of severe precipitation events, for example the intensity, frequency and duration of extreme precipitation. Our case study is South America, where precipitation and its extremes play a central role in sustaining natural, built and human systems. To test the hypothesis that CMIP5 improves over CMIP3 in this regard, spatial and temporal measures of prediction skill are constructed and computed by comparing climate model hindcasts with the NCEP-II reanalysis data, considered here as surrogate observations, for the entire globe and for South America. In addition, gridded precipitation observations over South America based on rain gage measurements are considered. The results suggest that the utility of the next generation of global climate models over the current generation needs to be carefully evaluated on a case-by-case basis before communicating to resource managers and policy makers.
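One concrete way to frame the CMIP5 versus CMIP3 comparison is a skill score based on RMSE against the surrogate observations. A sketch with made-up numbers (the metric choice and all values are illustrative; the study's actual skill measures are not specified here):

```python
import math

def rmse(pred, obs):
    """Root mean square error between a prediction series and observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def skill_score(model, reference, obs):
    """1 - RMSE(model)/RMSE(reference); positive values mean the model
    improves on the reference against the same observations."""
    return 1.0 - rmse(model, obs) / rmse(reference, obs)

obs = [3.1, 2.8, 4.0, 3.5, 2.9]    # surrogate observations (e.g. reanalysis)
cmip5 = [3.0, 2.9, 3.8, 3.6, 3.0]  # hypothetical hindcast values
cmip3 = [2.6, 3.3, 3.4, 3.0, 3.4]
print(skill_score(cmip5, cmip3, obs))  # positive: CMIP5 beats CMIP3 here
```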
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, J; Shi, F; Hrycushko, B
2015-06-15
Purpose: For tandem and ovoid (T&O) HDR brachytherapy in our clinic, the planning physicist is required to manually capture ∼10 images during planning, perform a secondary dose calculation and generate a report, combine them into a single PDF document, and upload it to a record-and-verify system to prove to an independent plan checker that the case was planned correctly. Not only does this slow down the already time-consuming clinical workflow, the PDF document also limits the number of parameters that can be checked. To solve these problems, we have developed a web-based automatic quality assurance (QA) program. Methods: We set up a QA server accessible through a web interface. A T&O plan and CT images are exported as DICOM-RT files and uploaded to the server. The software checks 13 geometric features, e.g. whether the dwell positions are reasonable, and 10 dosimetric features, e.g. secondary dose calculations via the TG-43 formalism and D2cc to critical structures. A PDF report is automatically generated with errors and potential issues highlighted. It also contains images showing important geometric and dosimetric aspects to prove the plan was created following standard guidelines. Results: The program has been clinically implemented in our clinic. In each of the 58 T&O plans we tested, a 14-page QA report was automatically generated. It took ∼45 sec to export the plan and CT images and ∼30 sec to perform the QA tests and generate the report. In contrast, our manual QA document preparation took on average ∼7 minutes under optimal conditions and up to 20 minutes when mistakes were made during document assembly. Conclusion: We have tested the efficiency and effectiveness of an automated process for treatment plan QA of HDR T&O cases. This software was shown to improve the workflow compared to our conventional manual approach.
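The secondary dose calculation via the TG-43 formalism can be illustrated with the point-source approximation. This is a sketch only: the air-kerma strength, dose-rate constant, radial dose function, and anisotropy values below are illustrative placeholders, not commissioned source data or the QA program's actual implementation.

```python
# TG-43 point-source approximation:
#   D(r) = Sk * Lambda * (G(r)/G(r0)) * g(r) * phi_an(r)
# with the inverse-square geometry function G(r) = 1/r^2.
Sk = 40000.0        # air-kerma strength, U (placeholder)
Lambda = 1.108      # dose-rate constant, cGy/(h*U) (typical Ir-192 order)
r0 = 1.0            # reference distance, cm

def g(r):           # radial dose function (placeholder linear fit)
    return max(0.0, 1.0 - 0.005 * (r - r0))

def phi_an(r):      # 1D anisotropy function (placeholder constant)
    return 0.98

def dose_rate(r_cm):
    geometry = (r0 / r_cm) ** 2  # point-source G(r)/G(r0)
    return Sk * Lambda * geometry * g(r_cm) * phi_an(r_cm)

for r in (1.0, 2.0, 5.0):
    print(f"r={r:.0f} cm: {dose_rate(r):.1f} cGy/h")
```

A full check would sum contributions over all dwell positions and dwell times, which is how a D2cc comparison against the treatment planning system would be built up.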
Assuring consumer safety without animal testing: a feasibility case study for skin sensitisation.
Maxwell, Gavin; Aleksic, Maja; Aptula, Aynur; Carmichael, Paul; Fentem, Julia; Gilmour, Nicola; Mackay, Cameron; Pease, Camilla; Pendlington, Ruth; Reynolds, Fiona; Scott, Daniel; Warner, Guy; Westmoreland, Carl
2008-11-01
Allergic Contact Dermatitis (ACD; chemical-induced skin sensitisation) represents a key consumer safety endpoint for the cosmetics industry. At present, animal tests (predominantly the mouse Local Lymph Node Assay) are used to generate skin sensitisation hazard data for use in consumer safety risk assessments. An animal testing ban on chemicals to be used in cosmetics will come into effect in the European Union (EU) from March 2009. This animal testing ban is also linked to an EU marketing ban on products containing any ingredients that have been subsequently tested in animals, from March 2009 or March 2013, depending on the toxicological endpoint of concern. Consequently, the testing of cosmetic ingredients in animals for their potential to induce skin sensitisation will be subject to an EU marketing ban, from March 2013 onwards. Our conceptual framework and strategy to deliver a non-animal approach to consumer safety risk assessment can be summarised as an evaluation of new technologies (e.g. 'omics', informatics), leading to the development of new non-animal (in silico and in vitro) predictive models for the generation and interpretation of new forms of hazard characterisation data, followed by the development of new risk assessment approaches to integrate these new forms of data and information in the context of human exposure. Following the principles of the conceptual framework, we have been investigating existing and developing new technologies, models and approaches, in order to explore the feasibility of delivering consumer safety risk assessment decisions in the absence of new animal data. We present here our progress in implementing this conceptual framework, with the skin sensitisation endpoint used as a case study. 2008 FRAME.
Numerical Modelling of Solitary Wave Experiments on Rubble Mound Breakwaters
NASA Astrophysics Data System (ADS)
Guler, H. G.; Arikawa, T.; Baykal, C.; Yalciner, A. C.
2016-12-01
Performance of a rubble mound breakwater protecting Haydarpasa Port, Turkey, was tested under tsunami attack in physical model tests conducted at the Port and Airport Research Institute (Guler et al., 2015). Solitary wave tests were conducted to understand the dynamic force of the tsunami (Arikawa, 2015). In this study, the main objective is to model the solitary wave tests numerically in order to verify the accuracy of the CFD model IHFOAM, developed in the OpenFOAM environment (Higuera et al., 2013), by comparing the numerical computations with the experimental results. IHFOAM is a numerical modelling tool based on the VARANS equations with a k-ω SST turbulence model, including realistic wave generation and active wave absorption. The experiments were performed at a Froude scale of 1/30, measuring surface elevation and flow velocity at several locations in the wave channel, and wave pressure around the crown wall of the breakwater. Solitary wave tests with wave heights of H=7.5 cm and H=10 cm are selected as representative of the experimental results. The first test (H=7.5 cm) resulted in no damage, whereas the second (H=10 cm) resulted in total damage due to sliding of the crown wall. Comparison of the preliminary numerical simulations with the experimental data for both cases shows that the solitary wave experiments can be accurately modelled using IHFOAM with respect to water surface elevations, flow velocities, and wave pressures on the crown wall of the breakwater (Figure, result of sim. at t=29.6 sec).
ACKNOWLEDGEMENTS: The authors acknowledge the developers of IHFOAM and the partial support of the research projects MarDiM, ASTARTE, RAPSODI, and TUBITAK 213M534.
REFERENCES: Arikawa (2015) "Consideration of Characteristics of Pressure on Seawall by Solitary Waves Based on Hydraulic Experiments", J. Japan Soc. Civ. Eng. Ser. B2 (Coast. Eng.), Vol. 71, pp. I889-I894. Guler, Arikawa, Oei, Yalciner (2015) "Performance of Rubble Mound Breakwaters under Tsunami Attack, A Case Study: Haydarpasa Port, Istanbul, Turkey", Coast. Eng. 104, 43-53. Higuera, Lara, Losada (2013) "Realistic Wave Generation and Active Wave Absorption for Navier-Stokes Models, Application to OpenFOAM", Coast. Eng. 71, 102-118.
Experimental impact testing and analysis of composite fan cases
NASA Astrophysics Data System (ADS)
Vander Klok, Andrew Joe
For aircraft engine certification, one of the requirements is to demonstrate the ability of the engine to withstand a fan blade-out (FBO) event. A FBO event may be caused by fatigue failure of the fan blade itself or by impact damage from foreign objects such as bird strike. An uncontained blade can damage flight-critical engine components or even the fuselage. The design of a containment structure is related to numerous parameters such as the blade tip speed; blade material, size and shape; hub/tip diameter; and fan case material, configuration and rigidity. To investigate all parameters by spin experiments with a full-size rotor assembly can be prohibitively expensive. Gas gun experiments can generate useful data for the design of engine containment cases at much lower cost. To replicate damage modes similar to those on a fan case in FBO testing, the gas gun experiment must be carefully designed. To investigate the experimental procedure and data acquisition techniques for FBO testing, a low-cost, small spin rig was first constructed. FBO tests were carried out with the small rig. The observed blade-to-fan-case interactions were similar to those reported using larger spin rigs. The small rig has potential for a variety of applications, from investigating FBO events and verifying concept designs of rotors to developing spin testing techniques. This rig was used in the development of the notched-blade releasing mechanism, a wire-trigger method for synchronized data acquisition, and high-speed video imaging. A relationship between the notch depth and the release speed was developed and verified. Next, an original custom-designed spin testing facility was constructed. Driven by a 40 HP, 40,000 rpm air turbine, the spin rig is housed in a vacuum chamber 72 in in diameter by 40 in long (1829 mm x 1016 mm). The heavily armored chamber is furnished with 9 viewports. This facility enables unprecedented investigations of FBO events.
In parallel, a 15.4 ft (4.7 m) long, 4.1 in (105 mm) diameter single-stage gas gun was developed. A thermodynamics-based relationship between the required gas pressure and the targeted velocity was proposed; the predicted velocity was within +/-7%. Quantitative measurements of force and displacement were attempted. The transmitted impact force was measured with load cells. The out-of-plane deformation was measured with a projection grating profilometry method. The composite panels and fan cases used in this work were made of S2-glass plain weave fabrics with API SC-15 toughened epoxy resin using the vacuum assisted resin transfer molding (VARTM) method. Using the gas gun, the impact behavior of the composite was investigated at velocities ranging from 984 ft/s to 1502 ft/s (300 m/s to 458 m/s) following a draft ASTM testing standard. To compare the ballistic protection capability of different materials, a new parameter, EBL, the projectile kinetic energy at the target ballistic limit normalized by the contact area of the projectile, was proposed. The S2-glass/epoxy composite ranks very high in EBL per areal weight. Finally, a testing method for replicating spin pit testing with a gas gun test was developed. The major differences between the two tests are the initial conditions of the blade upon contact with the target: in spin testing, the released blade has two velocity components, rotational and translational, whereas in gas gun testing the projectile has only the translational velocity. To account for the influence of the rotational velocity, three projectile designs were experimentally investigated. The results show that to generate similar damage modes in gas gun testing, it is critical to ensure that the deformation of the projectile before testing is similar to that of a released blade. With the pre-bent blade, the gas gun experiment was able to replicate the damage modes of the fan case in FBO testing on flat composite panels.
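The proposed EBL parameter is simple to compute: the projectile kinetic energy at the target's ballistic limit divided by the projectile contact area, optionally normalized by the target's areal weight for material ranking. The numbers below are hypothetical, chosen only to illustrate the comparison, not values from the thesis.

```python
import math

# EBL: projectile kinetic energy at the ballistic limit, normalized by
# the projectile contact area (units: J/m^2).
def ebl(mass_kg, v_bl_m_s, contact_area_m2):
    return 0.5 * mass_kg * v_bl_m_s ** 2 / contact_area_m2

# Hypothetical candidate panels: ballistic-limit velocity (m/s) and
# areal weight (kg/m^2). Values are illustrative placeholders.
panels = {
    "S2-glass/epoxy": {"v_bl": 400.0, "areal_weight": 9.8},
    "aluminum":       {"v_bl": 320.0, "areal_weight": 13.5},
}
mass = 0.05                              # 50 g projectile (assumed)
area = math.pi * (0.105 / 2.0) ** 2      # contact area for the 105 mm bore

# Ranking criterion: EBL per unit areal weight.
scores = {name: ebl(mass, p["v_bl"], area) / p["areal_weight"]
          for name, p in panels.items()}
for name, score in scores.items():
    print(f"{name}: EBL per areal weight = {score:.0f} (J/m^2)/(kg/m^2)")
```

With these placeholder inputs the glass/epoxy panel ranks higher per areal weight, matching the qualitative conclusion in the abstract.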
A New Method for Incremental Testing of Finite State Machines
NASA Technical Reports Server (NTRS)
Pedrosa, Lehilton Lelis Chaves; Moura, Arnaldo Vieira
2010-01-01
The automatic generation of test cases is an important issue for conformance testing of several critical systems. We present a new method for the derivation of test suites when the specification is modeled as a combined Finite State Machine (FSM). A combined FSM is obtained conjoining previously tested submachines with newly added states. This new concept is used to describe a fault model suitable for incremental testing of new systems, or for retesting modified implementations. For this fault model, only the newly added or modified states need to be tested, thereby considerably reducing the size of the test suites. The new method is a generalization of the well-known W-method and the G-method, but is scalable, and so it can be used to test FSMs with an arbitrarily large number of states.
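The flavor of W-method-style test-suite construction, which the paper generalizes to combined FSMs, can be sketched as follows: test sequences are formed by concatenating a transition cover P with a characterization set W. The FSM and the set W here are a small hypothetical example, not one from the paper.

```python
from collections import deque

# state -> {input: (next_state, output)}; a tiny illustrative FSM.
fsm = {
    "s0": {"a": ("s1", 0), "b": ("s0", 1)},
    "s1": {"a": ("s2", 0), "b": ("s0", 0)},
    "s2": {"a": ("s2", 1), "b": ("s1", 0)},
}
W = ["aa", "b"]  # assumed to distinguish every pair of states

def transition_cover(fsm, start="s0"):
    # BFS gives a shortest access sequence to each state; the cover is
    # the empty sequence plus every access sequence extended by one input.
    access, frontier = {start: ""}, deque([start])
    while frontier:
        s = frontier.popleft()
        for inp, (nxt, _) in fsm[s].items():
            if nxt not in access:
                access[nxt] = access[s] + inp
                frontier.append(nxt)
    cover = {""}
    for s, seq in access.items():
        for inp in fsm[s]:
            cover.add(seq + inp)
    return sorted(cover)

# W-method test suite: every cover sequence followed by every W sequence.
suite = sorted({p + w for p in transition_cover(fsm) for w in W})
print(len(suite), "test sequences, e.g.", suite[:4])
```

The incremental method in the paper reduces this suite by restricting attention to the newly added or modified states of the combined FSM.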
Clinical and Genetic Diagnosis of Nonischemic Sudden Cardiac Death.
Jiménez-Jáimez, Juan; Alcalde Martínez, Vicente; Jiménez Fernández, Miriam; Bermúdez Jiménez, Francisco; Rodríguez Vázquez Del Rey, María Del Mar; Perin, Francesca; Oyonarte Ramírez, José Manuel; López Fernández, Silvia; de la Torre, Inmaculada; García Orta, Rocío; González Molina, Mercedes; Cabrerizo, Elisa María; Álvarez Abril, Beatriz; Álvarez, Miguel; Macías Ruiz, Rosa; Correa, Concepción; Tercedor, Luis
2017-10-01
Nonischemic sudden cardiac death (SCD) is predominantly caused by cardiomyopathies and channelopathies. There are many diagnostic tests, including some complex techniques. Our aim was to analyze the diagnostic yield of a systematic diagnostic protocol in a specialized unit. The study included 56 families with at least 1 index case of SCD (resuscitated or not). Survivors were studied with electrocardiogram, advanced cardiac imaging, exercise testing, familial study, genetic testing and, in some cases, pharmacological testing. Families with deceased probands were studied using the postmortem findings, familial evaluation, and molecular autopsy with next-generation sequencing (NGS). A positive diagnosis was obtained in 80.4% of the cases, with no differences between survivors and nonsurvivors (P=.53). Cardiac channelopathies were more prevalent among survivors than nonsurvivors (66.6% vs 40%, P=.03). Among the 30 deceased probands, the definitive diagnosis was given by autopsy in 7. A diagnosis of cardiomyopathy tended to be associated with a higher event rate in the family. Genetic testing with NGS was performed in 42 index cases, with a positive result in 28 (66.6%), with no differences between survivors and nonsurvivors (P=.21). There is a strong likelihood of reaching a diagnosis in SCD after a rigorous protocol, with a more prevalent diagnosis of channelopathy among survivors and a worse familial prognosis in cardiomyopathies. Genetic testing with NGS is useful and its value is increasing with respect to the Sanger method. Copyright © 2017 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.
NASA Astrophysics Data System (ADS)
Goncharov, German A.
2005-11-01
On 22 November 1955, the Semipalatinsk test site saw the test of the first domestic two-stage thermonuclear RDS-37 charge. The charge's operation was based on the principle of radiation implosion: radiation generated by a primary A-bomb explosion, confined by a radiation-opaque casing, propagates throughout the interior casing volume and flows around the secondary thermonuclear unit. The secondary unit undergoes strong compression under this irradiation, resulting in a nuclear and thermonuclear explosion. The RDS-37 explosion was the strongest ever realized at the Semipalatinsk test site, and it produced an indelible impression on the participants in the test. This document-based paper describes the genesis of the ideas underlying the RDS-37 design and reflects the critical moments in its development. The advent of RDS-37 was an outstanding accomplishment of the scientists and engineers of our country.
Metrology laboratory requirements for third-generation synchrotron radiation sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takacs, P.Z.; Quian, Shinan
1997-11-01
New third-generation synchrotron radiation sources that have now come, or will soon come, on line will need to decide how to handle the testing of optical components delivered for use in their beam lines. In many cases it is desirable to establish an in-house metrology laboratory to do the work. We review the history behind the formation of the Optical Metrology Laboratory at Brookhaven National Laboratory and the rationale for its continued existence. We offer suggestions, based on our experiences over the past two decades, to those who may be contemplating setting up similar facilities.
Economic environmental dispatch using BSA algorithm
NASA Astrophysics Data System (ADS)
Jihane, Kartite; Mohamed, Cherkaoui
2018-05-01
The economic environmental dispatch (EED) problem is an important issue, especially for fossil-fuel power plant systems. It allows the network manager to choose, among different units, the most optimized in terms of fuel cost and emission level. The objective of this paper is to minimize the fuel cost under an emissions constraint; the test is conducted for two cases, a six-generator system and a ten-generator system, for the same power demand of 1200 MW. The simulation has been computed in MATLAB, and the results show the robustness of the Backtracking Search optimization Algorithm (BSA) and the impact of the load demand on the emissions.
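The underlying dispatch problem can be illustrated without the BSA metaheuristic itself: for quadratic fuel costs and a power-balance constraint (losses, unit limits, and the emission term ignored), the classical equal-incremental-cost condition gives a closed-form solution. The six-generator cost coefficients below are illustrative, not those used in the paper.

```python
import numpy as np

# Illustrative quadratic fuel-cost coefficients, cost_i = a + b*P + c*P^2
# ($/h with P in MW), for a hypothetical six-generator system.
a = np.array([240.0, 200.0, 220.0, 200.0, 220.0, 190.0])
b = np.array([7.0, 10.0, 8.5, 11.0, 10.5, 12.0])
c = np.array([0.0070, 0.0095, 0.0090, 0.0090, 0.0080, 0.0075])
demand = 1200.0  # MW, as in the paper's test cases

# At the optimum every unit runs at the same incremental cost lambda,
# with P_i = (lambda - b_i) / (2 c_i); lambda follows from sum(P_i) = D.
lam = (demand + np.sum(b / (2.0 * c))) / np.sum(1.0 / (2.0 * c))
P = (lam - b) / (2.0 * c)

cost = float(np.sum(a + b * P + c * P ** 2))
print(f"lambda = {lam:.3f} $/MWh, total cost = {cost:.1f} $/h")
print("dispatch (MW):", np.round(P, 1))
```

A BSA (or any metaheuristic) treatment would search over the P vector directly, which is what makes it practical once emission terms, losses, and unit limits make the closed form unavailable.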
Heat Melt Compactor Development Progress
NASA Technical Reports Server (NTRS)
Lee, Jeffrey M.; Fisher, John W.; Pace, Gregory
2017-01-01
The status of the Heat Melt Compactor (HMC) development project is reported. HMC Generation 2 (Gen 2) has been assembled and initial testing has begun. A baseline mission use case for trash volume reduction, water recovery, trash sterilization, and the venting of effluent gases and water vapor to space has been conceptualized. A test campaign to reduce technical risks is underway. This risk reduction testing examines the many varied operating scenarios and conditions needed for processing trash during a space mission. The test results along with performance characterization of HMC Gen 2 will be used to prescribe requirements and specifications for a future ISS flight Technology Demonstration. We report on the current status, technical risks, and test results in the context of an ISS vent-to-space Technology Demonstration.
Chitty, Lyn S; Mason, Sarah; Barrett, Angela N; McKay, Fiona; Lench, Nicholas; Daley, Rebecca; Jenkins, Lucy A
2015-07-01
Accurate prenatal diagnosis of genetic conditions can be challenging and usually requires invasive testing. Here, we demonstrate the potential of next-generation sequencing (NGS) for the analysis of cell-free DNA in maternal blood to transform prenatal diagnosis of monogenic disorders. Analysis of cell-free DNA using a PCR and restriction enzyme digest (PCR-RED) was compared with a novel NGS assay in pregnancies at risk of achondroplasia and thanatophoric dysplasia. PCR-RED was performed in 72 cases and was correct in 88.6%, inconclusive in 7% with one false negative. NGS was performed in 47 cases and was accurate in 96.2% with no inconclusives. Both approaches were used in 27 cases, with NGS giving the correct result in the two cases inconclusive with PCR-RED. NGS provides an accurate, flexible approach to non-invasive prenatal diagnosis of de novo and paternally inherited mutations. It is more sensitive than PCR-RED and is ideal when screening a gene with multiple potential pathogenic mutations. These findings highlight the value of NGS in the development of non-invasive prenatal diagnosis for other monogenic disorders. © 2015 John Wiley & Sons, Ltd.
Development of Yield and Tensile Strength Design Curves for Alloy 617
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nancy Lybeck; T. -L. Sham
2013-10-01
The U.S. Department of Energy Very High Temperature Reactor Program is acquiring data in preparation for developing an Alloy 617 Code Case for inclusion in the nuclear section of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel (B&PV) Code. A draft code case was previously developed, but the effort was suspended before acceptance by ASME. As part of the draft code case effort, a database was compiled of yield and tensile strength data from tests performed in air. Yield strength and tensile strength at temperature are used to set the time-independent allowable stress for construction materials in B&PV Code, Section III, Subsection NH. The yield and tensile strength data used for the draft code case have been augmented with additional data generated by Idaho National Laboratory and Oak Ridge National Laboratory in the U.S. and CEA in France. The standard ASME Section II procedure for generating yield and tensile strength at temperature is presented, along with alternate methods that accommodate the change in temperature trends seen at high temperatures, resulting in a more consistent design margin over the temperature range of interest.
González-García, Estefanía; Maly, Marek; de la Mata, Francisco Javier; Gómez, Rafael; Marina, María Luisa; García, María Concepción
2017-01-01
This work presents an in-depth study of the interactions between sulphonate-terminated carbosilane dendrimers and proteins. Three proteins with different molecular weights and isoelectric points were employed, and different pHs, dendrimer concentrations and dendrimer generations were tested. Variations in fluorescence intensity and emission wavelength were used as probes of protein-dendrimer interaction. Interaction between dendrimers and proteins depended strongly on the protein itself and on pH; dendrimer concentration and generation were also important. Protein-dendrimer interactions were favored under acidic working conditions, when the proteins were positively charged, and, in general, high dendrimer generations promoted these interactions. Modeling of the protein-dendrimer interactions made it possible to understand the different behaviors observed for each protein. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Dywer, T. A. W., III; Lee, G. K. F.
1984-01-01
In connection with the current interest in agile spacecraft maneuvers, it has become necessary to consider the nonlinear coupling effects of multiaxial rotation in the treatment of command generation and tracking problems. Multiaxial maneuvers will be required in military missions involving a fast acquisition of moving targets in space. In addition, such maneuvers are also needed for the efficient operation of robot manipulators. Attention is given to details regarding the direct nonlinear command generation and tracking, an approach which has been successfully applied to the design of control systems for V/STOL aircraft, linearizing transformations for spacecraft controlled with external thrusters, the case of flexible spacecraft dynamics, examples from robot dynamics, and problems of implementation and testing.
Harnessing Next-Generation Sequencing Capabilities for Microbial Forensics
2014-07-15
content typing, rep-PCR, pulsed-field gel electrophoresis, optical mapping, and antimicrobial susceptibility testing (G. Gault et al., 2011; P...Tremlett, G, Pidd, 2011). This case demonstrates the vulnerability of our food supply and why unusual outbreaks involving endemic microbes must be taken as... food products to malevolent tampering, and the widespread international economic consequences that can occur even from limited product contamination
Use of Item Models in a Large-Scale Admissions Test: A Case Study
ERIC Educational Resources Information Center
Sinharay, Sandip; Johnson, Matthew S.
2008-01-01
"Item models" (LaDuca, Staples, Templeton, & Holzman, 1986) are classes from which it is possible to generate items that are equivalent/isomorphic to other items from the same model (e.g., Bejar, 1996, 2002). They have the potential to produce large numbers of high-quality items at reduced cost. This article introduces data from an…
ERIC Educational Resources Information Center
Jolivette, Kristine; Stichter, Janine P.; Houchins, David E.; Kennedy, Christina
2007-01-01
Functional analysis is used to generate and test hypotheses, specific to an individual's appropriate and inappropriate behaviors, by directly manipulating antecedent and consequent events within natural or analog environments. In the case that a function(s) was not determined or the behavior has multiple motivations during the functional analysis,…
ERIC Educational Resources Information Center
Dansby, Jacqueline O.; Dansby-Giles, Gloria
2011-01-01
Educational reform in the United States has focused on several factors such as academic achievement, performance on standardized test scores, dropout rates, the mandate of the No Child Left Behind (NCLB) Act of 2001 (Dee and Jacob, 2010) and other changes. A new call for a broader and bolder strategy for educational reform that focused on…
A review of the use of vortex generators for mitigating shock-induced separation
NASA Astrophysics Data System (ADS)
Titchener, Neil; Babinsky, Holger
2015-09-01
This article reviews research into the potential of vortex generators to mitigate shock-induced separation. Studies ranging from those conducted in the early post-war era to those performed recently are discussed. On the basis of the investigations described in this report, it is clear that vortex generators can alleviate shock-induced boundary layer separation. Yet, it will be shown that their potential and efficiency varies considerably in practical applications. Much more success is reported in transonic test cases compared to separation induced in purely supersonic interactions. Under a variety of flow conditions, the best performance is achieved with vortex generators with a height of roughly half the boundary layer thickness and a shape similar to a swept vane. Notwithstanding this, vortex generator performance is not as consistent as it is in low-speed applications. Further work is required before vortex generators can be implemented into the design process for eliminating shock-induced separation on transonic wings and in supersonic inlets.
Analysis of cash flow in academic medical centers in the United States.
McCue, Michael J; Thompson, Jon M
2011-09-01
To examine cash flow margins in academic medical centers (AMCs; i.e., teaching hospitals) in an effort both to determine any significant differences in a set of operational and financial factors known to influence cash flow for high- and low-cash-flow AMCs and to discuss how these findings affect AMC operations. The authors sampled the Medicare cost report data of 103 AMCs for fiscal years 2005, 2006, and 2007, and then they applied the t test to test for significant mean differences between the two cash flow groups across operational and financial variables (e.g., case mix, operating margin). Compared with low-cash-flow AMCs, high-cash-flow AMCs were larger-bed-size facilities, treated cases of greater complexity, generated higher net patient revenue per adjusted discharge, served a significantly lower percentage of Medicaid patients, had significantly higher average operating profit margins and cash flow margin ratios, possessed a higher number of days of cash on hand, and collected their receivables more quickly. Study findings imply that high-cash-flow AMCs were earning higher cash flow returns than low-cash-flow AMCs, which may be because high-cash-flow AMCs generate higher patient revenues while serving fewer lower-paying Medicaid patients.
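The two-group mean comparison described above can be sketched with Welch's t test (which does not assume equal variances). The data here are synthetic operating margins for the two cash-flow groups, not the study's Medicare cost report sample.

```python
import numpy as np

# Synthetic operating margins for high- and low-cash-flow groups
# (illustrative means and spread, not the study's data).
rng = np.random.default_rng(1)
high = rng.normal(0.06, 0.03, 50)
low = rng.normal(0.02, 0.03, 53)

# Welch's t statistic and Welch-Satterthwaite degrees of freedom.
m1, m2 = high.mean(), low.mean()
v1, v2 = high.var(ddof=1), low.var(ddof=1)
n1, n2 = len(high), len(low)
t = (m1 - m2) / np.sqrt(v1 / n1 + v2 / n2)
df = (v1 / n1 + v2 / n2) ** 2 / (
    (v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1)
)
print(f"t = {t:.2f}, df = {df:.1f}")
```

The same test would be applied variable by variable (case mix, operating margin, days of cash on hand, and so on) to identify significant mean differences between the groups.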
Simic, Vladimir; Dimitrijevic, Branka
2015-02-01
An interval linear programming approach is used to formulate and comprehensively test a model for optimal long-term planning of vehicle recycling in the Republic of Serbia. The proposed model is applied to a numerical case study: a 4-year planning horizon (2013-2016) is considered, three legislative cases and three scrap metal price trends are analysed, and the availability of final destinations for sorted waste flows is explored. The potential and applicability of the developed model are fully illustrated, and detailed insights into the profitability and eco-efficiency of the projected, contemporarily equipped vehicle recycling factory are presented. The influences of the ordinance on the management of end-of-life vehicles in the Republic of Serbia on decisions about procuring vehicle hulks, sorting the generated material fractions, allocating sorted waste and allocating sorted metals are thoroughly examined, and the validity of the waste management strategy for the period 2010-2019 is tested. The formulated model can create optimal plans for procuring vehicle hulks, sorting generated material fractions, allocating sorted waste flows and allocating sorted metals. The obtained results are valuable for supporting the construction and/or modernisation of a vehicle recycling system in the Republic of Serbia. © The Author(s) 2015.
Church, Alanna J; Calicchio, Monica L; Nardi, Valentina; Skalova, Alena; Pinto, Andre; Dillon, Deborah A; Gomez-Fernandez, Carmen R; Manoj, Namitha; Haimes, Josh D; Stahl, Joshua A; Dela Cruz, Filemon S; Tannenbaum-Dvir, Sarah; Glade-Bender, Julia L; Kung, Andrew L; DuBois, Steven G; Kozakewich, Harry P; Janeway, Katherine A; Perez-Atayde, Antonio R; Harris, Marian H
2018-03-01
Infantile fibrosarcoma and congenital mesoblastic nephroma are tumors of infancy traditionally associated with the ETV6-NTRK3 gene fusion. However, a number of case reports have identified variant fusions in these tumors. In order to assess the frequency of variant NTRK3 fusions, and in particular whether the recently identified EML4-NTRK3 fusion is recurrent, 63 archival cases of infantile fibrosarcoma, congenital mesoblastic nephroma, mammary analog secretory carcinoma and secretory breast carcinoma (tumor types that are known to carry recurrent ETV6-NTRK3 fusions) were tested with NTRK3 break-apart FISH, EML4-NTRK3 dual fusion FISH, and targeted RNA sequencing. The EML4-NTRK3 fusion was identified in two cases of infantile fibrosarcoma (one of which was previously described), and in one case of congenital mesoblastic nephroma, demonstrating that the EML4-NTRK3 fusion is a recurrent genetic event in these related tumors. The growing spectrum of gene fusions associated with infantile fibrosarcoma and congenital mesoblastic nephroma along with the recent availability of targeted therapies directed toward inhibition of NTRK signaling argue for alternate testing strategies beyond ETV6 break-apart FISH. The use of either NTRK3 FISH or next-generation sequencing will expand the number of cases in which an oncogenic fusion is identified and facilitate optimal diagnosis and treatment for patients.
Five years of full-scale utility demonstration of pulsed energization of electric precipitators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz, S.A.; Jacobus, P.L.; Casey, P.J.
1996-11-01
In a conventional electrostatic precipitator (ESP), the applied dc voltage fulfills three functions: (1) generation of negative ions, (2) charging of particles, and (3) transport of the charged particles to the collecting plates. In the case of high-resistivity fly ash (often associated with the burning of low-sulfur coal), the dc voltage is limited by repeated electrical discharges and, in extreme cases, by back-corona. Lowering the applied dc voltage reduces sparking and back-corona, but also reduces the field on the discharge wires and leads to poorly distributed ion generation as well as reduced charging and particle transport forces. Pulsed energization, which consists of superimposing high-voltage pulses of short duration onto the existing base dc voltage, offers an attractive way to improve the collection efficiency of ESPs suffering from poor energization. The superimposed pulses become responsible for uniform ion generation, while the underlying dc field continues to fulfill the function of particle charging and transport. This paper describes the five-year test of the ESP at Madison Gas and Electric's Blount Station.
Predictive tests to evaluate oxidative potential of engineered nanomaterials
NASA Astrophysics Data System (ADS)
Ghiazza, Mara; Carella, Emanuele; Oliaro-Bosso, Simonetta; Corazzari, Ingrid; Viola, Franca; Fenoglio, Ivana
2013-04-01
Oxidative stress constitutes one of the principal injury mechanisms through which particulate toxicants (asbestos, crystalline silica, hard metals) and engineered nanomaterials can induce adverse health effects. ROS may be generated indirectly by activated cells and/or directly at the surface of the material; which of these processes occurs depends on the type of material. Many authors have recently demonstrated that metal oxide and carbon-based nanoparticles may influence (increasing or decreasing) the generation of oxygen radicals in a cell environment. Metal oxides such as iron oxides, crystalline silica, and titanium dioxide are able to generate free radicals via different mechanisms, causing an imbalance of oxidant species; the resulting increase in ROS may lead to inflammatory responses and, in some cases, to the development of cancer. On the other hand, carbon-based nanomaterials such as fullerene, carbon nanotubes and carbon black, as well as cerium dioxide, are able to scavenge the free radicals generated, acting as antioxidants. The number of newly engineered nanomaterials introduced into the market is increasing exponentially, so the definition of toxicological screening strategies is urgently needed. The development of acellular screening tests will make it possible to reduce the number of in vitro and in vivo tests to be performed. An integrated protocol that may be used to predict the oxidant/antioxidant potential of engineered nanoparticles is presented here.
NASA Astrophysics Data System (ADS)
Liu, Dan; Li, Congsheng; Kang, Yangyang; Zhou, Zhou; Xie, Yi; Wu, Tongning
2017-09-01
In this study, the plane-wave exposure of an infant to radiofrequency electromagnetic fields of 3.5 GHz was numerically analyzed to investigate the unintentional electromagnetic field (EMF) exposure from fifth-generation (5G) signals during field tests. The dosimetric influence of age-dependent dielectric properties and the influence of a nearby adult body were evaluated using a 12-month-old infant model and an adult female model. The results demonstrated that the whole-body-averaged specific absorption rate (WBASAR) was not significantly affected by age-dependent dielectric properties, and the presence of the adult body did not enhance the WBASAR. Taking the magnitude of the in situ
Computer generated maps from digital satellite data - A case study in Florida
NASA Technical Reports Server (NTRS)
Arvanitis, L. G.; Reich, R. M.; Newburne, R.
1981-01-01
Ground cover maps are important tools to a wide array of users. Over the past three decades, much progress has been made in supplementing planimetric and topographic maps with ground cover details obtained from aerial photographs. The present investigation evaluates the feasibility of using computer maps of ground cover from satellite input tapes. Attention is given to the selection of test sites, a satellite data processing system, a multispectral image analyzer, general purpose computer-generated maps, the preliminary evaluation of computer maps, a test for areal correspondence, the preparation of overlays and acreage estimation of land cover types on the Landsat computer maps. There is every indication to suggest that digital multispectral image processing systems based on Landsat input data will play an increasingly important role in pattern recognition and mapping land cover in the years to come.
Pareto versus lognormal: A maximum entropy test
NASA Astrophysics Data System (ADS)
Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano
2011-08-01
It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating process even when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with lognormal body and Pareto tail can be generated as mixtures of lognormally distributed units.
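To illustrate why discriminating the two tail models is delicate, the following sketch (which is not the authors' maximum entropy procedure, just a plain maximum likelihood comparison) fits a Pareto to the upper 5% of a sample and a tail-conditioned lognormal to the same points. The sample, threshold, and sizes are arbitrary demo choices.

```python
# Illustrative sketch: fit a Pareto and a tail-conditioned lognormal to the
# upper 5% of a synthetic lognormal sample and compare their mean log-likelihoods.
import math
import random

random.seed(42)
data = sorted(random.lognormvariate(0.0, 1.0) for _ in range(5000))
tail = data[int(0.95 * len(data)):]   # upper 5% as the candidate power-law tail
xm = tail[0]                          # tail threshold

# Pareto maximum likelihood estimate on the tail: alpha = n / sum(log(x / xm)).
alpha = len(tail) / sum(math.log(x / xm) for x in tail)

# Lognormal maximum likelihood estimate on the full sample.
logs = [math.log(x) for x in data]
mu = sum(logs) / len(logs)
sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / len(logs))

def pareto_loglik(x):
    """Log density of the fitted Pareto at x >= xm."""
    return math.log(alpha) + alpha * math.log(xm) - (alpha + 1) * math.log(x)

def lognorm_tail_loglik(x):
    """Log density of the fitted lognormal, conditioned on x >= xm."""
    z = (math.log(x) - mu) / sigma
    log_f = -math.log(x * sigma * math.sqrt(2 * math.pi)) - 0.5 * z * z
    survival = 0.5 * math.erfc((math.log(xm) - mu) / (sigma * math.sqrt(2)))
    return log_f - math.log(survival)

mean_pareto = sum(pareto_loglik(x) for x in tail) / len(tail)
mean_lognorm = sum(lognorm_tail_loglik(x) for x in tail) / len(tail)
```

Because a lognormal's upper tail is locally close to a power law, the two mean log-likelihoods come out nearly equal on such data, which is the identification problem the maximum entropy test is designed to address.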
NASA Astrophysics Data System (ADS)
Guerra, Jorge; Ullrich, Paul
2016-04-01
Tempest is a next-generation global climate and weather simulation platform designed to allow experimentation with numerical methods for a wide range of spatial resolutions. The atmospheric fluid equations are discretized by continuous/discontinuous finite elements in the horizontal and by a staggered nodal finite element method (SNFEM) in the vertical, coupled with implicit/explicit time integration. At horizontal resolutions below 10 km, many important questions remain on optimal techniques for solving the fluid equations. We present results from a suite of idealized test cases to validate the performance of the SNFEM applied in the vertical with an emphasis on flow features and dynamic behavior. Internal gravity wave, mountain wave, convective bubble, and Cartesian baroclinic instability tests will be shown at various vertical orders of accuracy and compared with known results.
Numerical and experimental investigation of VG flow control for a low-boom inlet
NASA Astrophysics Data System (ADS)
Rybalko, Michael
The application of vortex generators (VGs) for shock/boundary layer interaction flow control in a novel external compression, axisymmetric, low-boom concept inlet was studied using numerical and experimental methods. The low-boom inlet design features a zero-angle cowl and relaxed isentropic compression centerbody spike, resulting in defocused oblique shocks and a weak terminating normal shock. This allows reduced external gas dynamic waves at high mass flow rates but suffers from flow separation near the throat and a large hub-side boundary layer at the Aerodynamic Interface Plane (AIP), which marks the inflow to the jet engine turbo-machinery. Supersonic VGs were investigated to reduce the shock-induced flow separation near the throat while subsonic VGs were investigated to reduce boundary layer radial distortion at the AIP. To guide large-scale inlet experiments, Reynolds-Averaged Navier-Stokes (RANS) simulations using three-dimensional, structured, chimera (overset) grids and the WIND-US code were conducted. Flow control cases included conventional and novel types of vortex generators at positions both upstream of the terminating normal shock (supersonic VGs) and downstream (subsonic VGs). The performance parameters included incompressible axisymmetric shape factor, post-shock separation area, inlet pressure recovery, and mass flow ratio. The design of experiments (DOE) methodology was used to select device size and location, analyze the resulting data, and determine the optimal choice of device geometry. Based on the above studies, a test matrix of supersonic and subsonic VGs was adapted for a large-scale inlet test to be conducted at the 8'x6' supersonic wind tunnel at NASA Glenn Research Center (GRC). Comparisons of RANS simulations with data from the Fall 2010 8'x6' inlet test showed that predicted VG performance trends and case rankings for both supersonic and subsonic devices were consistent with experimental results. 
For example, experimental surface oil flow visualization revealed a significant post-shock separation bubble with flow recirculation for the baseline (no VG) case that was substantially broken up in the micro-ramp VG case, consistent with simulations. Furthermore, the predicted subsonic VG performance with respect to a reduction in radial distortion (quantified in terms of axisymmetric incompressible shape factor) was found to be consistent with boundary layer rake measurements. To investigate the unsteady turbulent flow features associated with the shock-induced flow separation and the hub-side boundary layer, a detached eddy simulation (DES) approach using the WIND-US code was employed to model the baseline inlet flow field. This approach yielded improved agreement with experimental data for time-averaged diffuser stagnation pressure profiles and allowed insight into the pressure fluctuations and turbulent kinetic energy distributions which may be present at the AIP. In addition, streamwise shock position statistics were obtained and compared with experimental Schlieren results. The predicted shock oscillations were much weaker than those seen experimentally (by a factor of four), which indicates that the mechanism for the experimental shock oscillations was not captured. In addition, the novel supersonic vortex generator geometries were investigated experimentally (prior to the large-scale inlet 8'x6' wind tunnel tests) in an inlet-relevant flow field containing a Mach 1.4 normal shock wave followed by a subsonic diffuser. A parametric study of device height and distance upstream of the normal shock was undertaken for split-ramp and ramped-vane geometries. Flow field diagnostics included high-speed Schlieren, oil flow visualization, and Pitot-static pressure measurements. Parameters including flow separation, pressure recovery, centerline incompressible boundary layer shape factor, and shock stability were analyzed and compared to the baseline uncontrolled case. 
While all vortex generators tested eliminated centerline flow separation, the presence of VGs also significantly increased the three-dimensionality of the flow via increased side-wall interaction. The stronger streamwise vorticity generated by ramped-vanes also yielded improved pressure recovery and fuller boundary layer velocity profiles within the subsonic diffuser. (Abstract shortened by UMI.)
Twist effects in quantum vortices and phase defects
NASA Astrophysics Data System (ADS)
Zuccher, Simone; Ricca, Renzo L.
2018-02-01
In this paper we show that twist, defined in terms of rotation of the phase associated with quantum vortices and other physical defects effectively deprived of internal structure, is a property that has observable effects in terms of induced axial flow. For this we consider quantum vortices governed by the Gross-Pitaevskii equation (GPE) and perform a number of test cases to investigate and compare the effects of twist in two different contexts: (i) when this is artificially superimposed on an initially untwisted vortex ring; (ii) when it is naturally produced on the ring by the simultaneous presence of a central straight vortex. In the first case large amplitude perturbations quickly develop, generated by the unnatural setting of the initial condition that is not an analytical solution of the GPE. In the second case much milder perturbations emerge, signature of a genuine physical process. This scenario is confirmed by other test cases performed at higher twist values. Since the second setting corresponds to essential linking, these results provide new evidence of the influence of topology on physics.
NASA Technical Reports Server (NTRS)
Kazin, S. B.; Minzner, W. R.; Paas, J. E.
1971-01-01
A scale model of the bypass flow region of a 1.5 pressure ratio, single stage, low tip speed fan was tested with a rotor tip casing bleed slot to determine its effects on noise generation. The bleed slot was located 1/2 inch (1.3 cm) upstream of the rotor leading edge and was configured as a continuous opening around the circumference. The bleed manifold system was operated over a range of bleed rates corresponding to as much as 6% of the fan flow at approach thrust and 4.25% of the fan flow at takeoff thrust. Acoustic results indicate that a bleed rate of 4% of the fan flow reduces the fan's maximum approach-thrust 200-foot (61.0 m) sideline PNL by 0.5 PNdB, and the corresponding takeoff-thrust noise by 1.1 PNdB, below the levels with zero bleed. However, comparison of the standard casing (no bleed slot) and the slotted bleed casing with zero bleed shows that the bleed slot itself caused a noise increase.
Bakas, Spyridon; Zeng, Ke; Sotiras, Aristeidis; Rathore, Saima; Akbari, Hamed; Gaonkar, Bilwaj; Rozycki, Martin; Pati, Sarthak; Davatzikos, Christos
2016-01-01
We present an approach for segmenting low- and high-grade gliomas in multimodal magnetic resonance imaging volumes. The proposed approach is based on a hybrid generative-discriminative model. Firstly, a generative approach based on an Expectation-Maximization framework that incorporates a glioma growth model is used to segment the brain scans into tumor, as well as healthy tissue labels. Secondly, a gradient boosting multi-class classification scheme is used to refine tumor labels based on information from multiple patients. Lastly, a probabilistic Bayesian strategy is employed to further refine and finalize the tumor segmentation based on patient-specific intensity statistics from the multiple modalities. We evaluated our approach in 186 cases during the training phase of the BRAin Tumor Segmentation (BRATS) 2015 challenge and report promising results. During the testing phase, the algorithm was additionally evaluated in 53 unseen cases, achieving the best performance among the competing methods.
NASA Technical Reports Server (NTRS)
Blelly, Pierre-Louis; Barakat, Abdullah R.; Fontanari, Jean; Alcayde, Denis; Blanc, Michel; Wu, Jian; Lathuillere, C.
1992-01-01
A method presented by Wu et al. (1992) for computing the H(+) vertical velocity from the main ionospheric parameters measured by the EISCAT VHF radar is tested in a fully controlled sequence which consists of generating an ideal ionospheric model by solving the coupled continuity and momentum equations for a two-ion plasma (O(+) and H(+)). Synthetic autocorrelation functions are generated from this model with the radar characteristics and used as actual measurements to compute the H(+) vertical velocities. Results of these simulations are shown and discussed for three cases of typical and low SNR and for low and increased mixing ratios. In most cases general agreement is found between the computed H(+) velocities and the generic ones within the altitude range considered, i.e., 200-1000 km. The method is shown to be reliable.
Computer simulations of planetary accretion dynamics: Sensitivity to initial conditions
NASA Technical Reports Server (NTRS)
Isaacman, R.; Sagan, C.
1976-01-01
The implications and limitations of program ACRETE were tested. The program is a scheme based on Newtonian physics and accretion with unit sticking efficiency, devised to simulate the origin of the planets. The dependence of the results on a variety of radial and vertical density distribution laws, the ratio of gas to dust in the solar nebula, the total nebular mass, and the orbital eccentricity of the accreting grains was explored. Only for a small subset of conceivable cases are planetary systems closely like our own generated. Many models have tendencies towards one of two preferred configurations: multiple star systems, or planetary systems in which Jovian planets either have substantially smaller masses than in our system or are absent altogether. But for a wide range of cases recognizable planetary systems are generated - ranging from multiple star systems with accompanying planets, to systems with Jovian planets at several hundred AU, to single stars surrounded only by asteroids.
Statistical analysis of multivariate atmospheric variables. [cloud cover
NASA Technical Reports Server (NTRS)
Tubbs, J. D.
1979-01-01
Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate to near-normal; (5) test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) test of fit for continuous distributions based upon the generalized minimum chi-square; (7) effect of correlated observations on confidence sets based upon chi-square statistics; and (8) generation of random variates from specified distributions.
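Item (8) above, generation of random variates from specified distributions, can be sketched with inverse-transform sampling; the exponential case is shown because its inverse CDF has a closed form. The function name is illustrative, not from the report.

```python
# Minimal inverse-transform sampling sketch: draw X ~ Exp(rate) by inverting
# the CDF F(x) = 1 - exp(-rate * x), so X = -ln(1 - U) / rate for U ~ Uniform(0,1).
import math
import random

def exponential_variate(rate, u=None):
    """Draw an Exp(rate) variate by inverting the exponential CDF."""
    if u is None:
        u = random.random()
    return -math.log(1.0 - u) / rate

random.seed(0)
sample = [exponential_variate(2.0) for _ in range(100000)]
mean = sum(sample) / len(sample)  # should be close to 1/rate = 0.5
```

The same recipe works for any distribution whose inverse CDF is available, either in closed form or numerically.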
Carter, Jamal; Miller, James Adam; Feller-Kopman, David; Ettinger, David; Sidransky, David; Maleki, Zahra
2017-07-01
Non-small-cell lung cancer (NSCLC)-associated malignant pleural effusions (MPEs) are sometimes the only available specimens for molecular analysis. This study evaluates the diagnostic yield of NSCLC-associated MPE, its adequacy for molecular profiling, and the potential influence of MPE volume/cellularity on the analytic sensitivity of our assays. Molecular results of 50 NSCLC-associated MPE cases during a 5-year period were evaluated. Molecular profiling was performed on cell blocks and consisted of fluorescent in situ hybridization (FISH) for ALK gene rearrangements and the following sequencing platforms: Sanger sequencing (for EGFR) and high-throughput pyrosequencing (for KRAS and BRAF) during the first 4 years of the study period, and targeted next-generation sequencing performed thereafter. A total of 50 NSCLC-associated MPE cases were identified where molecular testing was requested. Of these, 17 cases were excluded: 14 cases (28%) due to inadequate tumor cellularity and 3 cases due to unavailability of the slides for review. A total of 27 out of 50 MPE cases (54%) underwent at least EGFR and KRAS sequencing and FISH for ALK rearrangement. Of the 27 cases with molecular testing results available, a genetic abnormality was detected in 16 cases (59%). The most common genetic aberrations identified involved EGFR (9) and KRAS (7). Six cases had ALK FISH only, of which one showed rearrangement. MPE volume was not associated with overall cellularity or tumor cellularity (P = 0.360). Molecular profiling of MPE is a viable alternative to testing solid tissue in NSCLC. This study shows successful detection of genetic aberrations in 59% of samples with minimal risk of false negatives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartholomew, Rachel A.; Ozanich, Richard M.; Arce, Jennifer S.
2017-02-01
The goal of this testing was to evaluate the ability of currently available commercial off-the-shelf (COTS) biological indicator tests and immunoassays to detect Bacillus anthracis (Ba) spores and ricin. In general, immunoassays provide more specific identification of biological threats as compared to indicator tests [3]. Many of these detection products are widely used by first responders and other end users. In most cases, performance data for these instruments are supplied directly from the manufacturer, but have not been verified by an external, independent assessment [1]. Our test plan modules included assessments of inclusivity (ability to generate true positive results), commonly encountered hoax powders (which can cause potential interferences or false positives), and estimation of limit of detection (LOD) (sensitivity) testing.
ERIC Educational Resources Information Center
Cournoyer, Amy Beth
2014-01-01
This case study investigated case-based pedagogy using student-teacher-generated cases as an instructional tool in the preparation of 12 pre-service ESL, Bilingual, and Modern Foreign Language teachers enrolled in a Student Teaching Seminar at a post-secondary institution. In the fall methods course, each participant generated a case study based…
Dynamic Analysis and Test Results for an STC Stirling Generator
NASA Astrophysics Data System (ADS)
Qiu, Songgang; Peterson, Allen A.
2004-02-01
Long-life, high-efficiency generators based on free-piston Stirling machines are a future energy-conversion solution for both space and commercial applications. To aid in design and system integration efforts, Stirling Technology Company (STC) has developed dynamic simulation models for the internal moving subassemblies and for complete Stirling convertor assemblies. These dynamic models have been validated using test data from operating prototypes. Simplified versions of these models are presented to help explain the operating characteristics of the Stirling convertor. Power spectrum analysis is presented for the test data for casing acceleration, piston motion, displacer motion, and controller current/voltage during full power operation. The harmonics of a Stirling convertor and its moving components are identified for the STC zener-diode control scheme. The dynamic behavior of each moving component and its contribution to the system dynamics and resultant vibration forces are discussed. Additionally, the effects of a passive balancer and external suspension are predicted by another simplified system model.
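The power spectrum analysis described above can be sketched as follows: given a sampled casing-acceleration-like signal, an FFT-based spectrum locates the fundamental operating frequency and its harmonics. The signal, sampling rate, and 60 Hz operating frequency are assumptions for the demo, not STC test parameters.

```python
# Hedged sketch of harmonic identification via an FFT power spectrum.
import numpy as np

fs = 4096.0                       # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)   # one second of data -> 1 Hz bin spacing
f0 = 60.0                         # assumed convertor operating frequency, Hz

# Synthetic stand-in for a measured signal: fundamental plus two harmonics.
signal = (1.0 * np.sin(2 * np.pi * f0 * t)
          + 0.3 * np.sin(2 * np.pi * 2 * f0 * t)
          + 0.1 * np.sin(2 * np.pi * 3 * f0 * t))

spectrum = np.abs(np.fft.rfft(signal)) ** 2       # one-sided power spectrum
freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)    # matching frequency axis
fundamental = freqs[np.argmax(spectrum)]          # strongest spectral line
```

In practice the same spectrum, computed for each measured channel (casing acceleration, piston and displacer motion, controller current/voltage), is what reveals the harmonic content discussed in the paper.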
Marchetti, Antonio; Pace, Maria Vittoria; Di Lorito, Alessia; Canarecci, Sara; Felicioni, Lara; D'Antuono, Tommaso; Liberatore, Marcella; Filice, Giampaolo; Guetti, Luigi; Mucilli, Felice; Buttitta, Fiamma
2016-09-01
Anaplastic Lymphoma Kinase (ALK) gene rearrangements have been described in 3-5% of lung adenocarcinomas (ADC) and their identification is essential to select patients for treatment with ALK tyrosine kinase inhibitors. For several years, fluorescent in situ hybridization (FISH) has been considered the only validated diagnostic assay. Currently, alternative methods are commercially available as diagnostic tests. A series of 217 ADC comprising 196 consecutive resected tumors and 21 ALK FISH-positive cases from an independent series of 702 ADC was investigated. All specimens were screened by IHC (ALK-D5F3-CDx-Ventana), FISH (Vysis ALK Break-Apart-Abbott) and RT-PCR (ALK RGQ RT-PCR-Qiagen). Results were compared and discordant cases subjected to Next Generation Sequencing. Thirty-nine of 217 samples were positive by the ALK RGQ RT-PCR assay, using a threshold cycle (Ct) cut-off ≤35.9, as recommended. Of these positive samples, 14 were negative by IHC and 12 by FISH. ALK RGQ RT-PCR/FISH discordant cases were analyzed by the NGS assay with results concordant with FISH data. In order to obtain the maximum level of agreement between FISH and ALK RGQ RT-PCR data, we introduced a new scoring algorithm based on the ΔCt value. A ΔCt cut-off level ≤3.5 was used in a pilot series. The algorithm was then tested on a completely independent validation series. By using the new scoring algorithm and FISH as the reference standard, the sensitivity and the specificity of the ALK RGQ RT-PCR(ΔCt) assay were 100% and 100%, respectively. Our results suggest that the ALK RGQ RT-PCR test could be useful in clinical practice as a complementary assay in multi-test diagnostic algorithms or even, if our data are confirmed in independent studies, as a standalone or screening test for the selection of patients to be treated with ALK inhibitors. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
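The two-step scoring described above reduces to a simple rule: a sample is called positive if its Ct is at or below the recommended 35.9 cut-off and its ΔCt is at or below 3.5. The function name and input layout below are illustrative, not from the assay documentation.

```python
# Sketch of the refined ALK RGQ RT-PCR scoring rule (Ct and delta-Ct cut-offs
# taken from the abstract; everything else is illustrative).
def alk_rtpcr_call(ct, delta_ct, ct_cutoff=35.9, delta_ct_cutoff=3.5):
    """Return True if the sample is scored ALK-positive under the refined algorithm."""
    return ct <= ct_cutoff and delta_ct <= delta_ct_cutoff

calls = [alk_rtpcr_call(34.0, 2.1),   # passes both criteria -> positive
         alk_rtpcr_call(34.0, 5.0),   # Ct passes, delta-Ct fails -> negative
         alk_rtpcr_call(37.0, 2.0)]   # Ct above cut-off -> negative
```

The second hypothetical sample shows the point of the ΔCt refinement: it would have been called positive under the Ct criterion alone.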
Tests of selection in pooled case-control data: an empirical study.
Udpa, Nitin; Zhou, Dan; Haddad, Gabriel G; Bafna, Vineet
2011-01-01
For smaller organisms with faster breeding cycles, artificial selection can be used to create sub-populations with different phenotypic traits. Genetic tests can be employed to identify the causal markers for the phenotypes, as a precursor to engineering strains with a combination of traits. Traditional approaches involve analyzing crosses of inbred strains to test for co-segregation with genetic markers. Here we take advantage of cheaper next generation sequencing techniques to identify genetic signatures of adaptation to the selection constraints. Obtaining individual sequencing data is often unrealistic due to cost and sample issues, so we focus on pooled genomic data. We explore a series of statistical tests for selection using pooled case (under selection) and control populations. The tests generally capture skews in the scaled frequency spectrum of alleles in a region, which are indicative of a selective sweep. Extensive simulations are used to show that these approaches work well for a wide range of population divergence times and strong selective pressures. Control vs control simulations are used to determine an empirical False Positive Rate, and regions under selection are determined using a 1% FPR level. We show that pooling does not have a significant impact on statistical power. The tests are also robust to reasonable variations in several different parameters, including window size, base-calling error rate, and sequencing coverage. We then demonstrate the viability (and the challenges) of one of these methods in two independent Drosophila populations (Drosophila melanogaster) bred under selection for hypoxia and accelerated development, respectively. Testing for extreme hypoxia tolerance showed clear signals of selection, pointing to loci that are important for hypoxia adaptation. Overall, we outline a strategy for finding regions under selection using pooled sequences, then devise optimal tests for that strategy. 
The approaches show promise for detecting selection, even several generations after fixation of the beneficial allele has occurred.
Papa, Frank J; Li, Feiming
2015-12-01
Two core dual processing theory (DPT) System I constructs (Exemplars and Prototypes) were used to: 1) formulate a training exercise designed to improve diagnostic performance in year one medical students, and 2) explore whether any observed performance improvements were associated with preferential use of exemplars or prototypes. With IRB approval, 117 year one medical students participated in an acute chest pain diagnostic training exercise. A pre- and post-training test containing the same 27 case vignettes was used to determine if the subjects' diagnostic performance improved via training in both exemplars and prototypes. Exemplar and Prototype theory was also used to generate a unique typicality estimate for each case vignette. Because these estimates produce different performance predictions, differences in the subjects' observed performance would make it possible to infer whether subjects were preferentially using Exemplars or Prototypes. Pre- vs. post-training comparison revealed a significant performance improvement; t=14.04, p<0.001, Cohen's d=1.32. Pre-training, paired t-testing demonstrated that performance against the most typical vignettes > mid-typical vignettes: t=4.94, p<0.001; and mid-typical > least typical: t=5.16, p<0.001. Post-training, paired t-testing again demonstrated that performance against the most typical vignettes > mid-typical: t=2.94, p<0.01; and mid-typical > least typical: t=6.64, p<0.001. These findings are more consistent with the performance predictions generated via Prototype theory than Exemplar theory. DPT is useful in designing and evaluating the utility of new approaches to diagnostic training, and investigating the cognitive factors driving diagnostic capabilities among early medical students.
A multimedia patient simulation for teaching and assessing endodontic diagnosis.
Littlefield, John H; Demps, Elaine L; Keiser, Karl; Chatterjee, Lipika; Yuan, Cheng H; Hargreaves, Kenneth M
2003-06-01
Teaching and assessing diagnostic skills are difficult due to relatively small numbers of total clinical experiences and a shortage of clinical faculty. Patient simulations could help teach and assess diagnosis by displaying a well-defined diagnostic task, then providing informative feedback and opportunities for repetition and correction of errors. This report describes the development and initial evaluation of SimEndo I, a multimedia patient simulation program that could be used for teaching or assessing endodontic diagnosis. Students interact with a graphical interface that has four pull-down menus and related submenus. In response to student requests, the program presents patient information. Scoring is based on diagnosis of each case by endodontists. Pilot testing with seventy-four junior dental students identified numerous needed improvements to the user interface program. A multi-school field test of the interface program using three patient cases addressed three research questions: 1) How did the field test students evaluate SimEndo I? Overall mean evaluation was 8.1 on a 0 to 10 scale; 2) How many cases are needed to generate a reproducible diagnostic proficiency score for an individual student using the Rimoldi scoring procedure? Mean diagnostic proficiency scores by case ranged from .27 to .40 on a 0 to 1 scale; five cases would produce a score with a 0.80 reliability coefficient; and 3) Did students accurately diagnose each case? Mean correct diagnosis scores by case ranged from .54 to .78 on a 0 to 1 scale. We conclude that multimedia patient simulations offer a promising alternative for teaching and assessing student diagnostic skills.
PLNoise: a package for exact numerical simulation of power-law noises
NASA Astrophysics Data System (ADS)
Milotti, Edoardo
2006-08-01
Many simulations of stochastic processes require colored noises: here I describe a small program library that generates samples with a tunable power-law spectral density; the algorithm can be modified to generate more general colored noises, and is exact for all time steps, even when they are unevenly spaced (as may often happen in the case of astronomical data, see e.g. [N.R. Lomb, Astrophys. Space Sci. 39 (1976) 447]). The method is exact in the sense that it reproduces a process that is theoretically guaranteed to produce a range-limited power-law spectrum 1/f^β with -1 < β ⩽ 1. The algorithm has a well-behaved computational complexity, it produces a nearly perfect Gaussian noise, and its computational efficiency depends on the required degree of noise Gaussianity.
Program summary
Title of program: PLNoise
Catalogue identifier: ADXV_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXV_v1_0.html
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Licensing provisions: none
Programming language used: ANSI C
Computer: Any computer with an ANSI C compiler; the package has been tested with gcc version 3.2.3 on Red Hat Linux 3.2.3-52 and gcc versions 4.0.0 and 4.0.1 on Apple Mac OS X 10.4
Operating system: All operating systems capable of running an ANSI C compiler
No. of lines in distributed program, including test data, etc.: 6238
No. of bytes in distributed program, including test data, etc.: 52 387
Distribution format: tar.gz
RAM: The code of the test program is very compact (about 50 Kbytes), but the program works with list management and allocates memory dynamically; in a typical run (like the one discussed in Section 4 in the long write-up) with average list length 2·10, the RAM taken by the list is 200 Kbytes.
External routines: The package needs external routines to generate uniform and exponential deviates. The implementation described here uses the random number generation library ranlib, freely available from Netlib [B.W. Brown, J. Lovato, K. Russell, ranlib, available from Netlib, http://www.netlib.org/random/index.html, select the C version ranlib.c], but it has also been successfully tested with the random number routines in Numerical Recipes [W.H. Press, S.A. Teukolsky, W.T. Vetterling, B.P. Flannery, Numerical Recipes in C: The Art of Scientific Computing, second ed., Cambridge Univ. Press, Cambridge, 1992, pp. 274-290]. Notice that ranlib requires a pair of routines from the linear algebra package LINPACK, and that the distribution of ranlib includes the C source of these routines, in case LINPACK is not installed on the target machine.
Nature of problem: Exact generation of different types of Gaussian colored noise.
Solution method: Random superposition of relaxation processes [E. Milotti, Phys. Rev. E 72 (2005) 056701].
Unusual features: The algorithm is theoretically guaranteed to be exact, and unlike all other existing generators it can generate samples with uneven spacing.
Additional comments: The program requires an initialization step; for some parameter sets this may become rather heavy.
Running time: Running time varies widely with different input parameters; however, in a test run like the one in Section 4 in this work, the generation routine took on average about 7 ms for each sample.
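The solution method cited above, random superposition of relaxation processes, can be illustrated with a rough sketch (this is not the PLNoise algorithm itself): summing many exponentially correlated, Ornstein-Uhlenbeck-like components with log-spaced relaxation rates approximates a 1/f-type spectrum over the range of rates used. All parameters below are arbitrary demo choices, and the evenly spaced time steps sidestep the uneven-spacing machinery that is PLNoise's distinguishing feature.

```python
# Toy superposition-of-relaxation-processes noise generator.
import math
import random

random.seed(1)
rates = [10 ** (e / 4.0) for e in range(-8, 9)]  # log-spaced relaxation rates
dt = 0.01                                        # fixed time step (demo only)
state = [0.0] * len(rates)

def step():
    """Advance each relaxation component by one exact OU update; return their sum."""
    for i, lam in enumerate(rates):
        rho = math.exp(-lam * dt)                # exact exponential decay over dt
        state[i] = rho * state[i] + math.sqrt(1.0 - rho * rho) * random.gauss(0.0, 1.0)
    return sum(state)

samples = [step() for _ in range(10000)]
```

Each component contributes a Lorentzian spectrum with corner frequency set by its relaxation rate; their superposition flattens into an approximate power law between the smallest and largest rates.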
Systems tunnel linear shaped charge lightning strike
NASA Technical Reports Server (NTRS)
Cook, M.
1989-01-01
Simulated lightning strike testing of the systems tunnel linear shaped charge (LSC) was performed at the Thiokol Lightning Test Complex in Wendover, Utah, on 23 Jun. 1989. The test article consisted of a 160-in. section of the LSC enclosed within a section of the systems tunnel. The systems tunnel was bonded to a section of a solid rocket motor case. All test article components were full scale. The systems tunnel cover of the test article was subjected to three discharges (each discharge was over a different grounding strap) from the high-current generator. The LSC did not detonate. All three grounding straps debonded and violently struck the LSC through the openings in the systems tunnel floor plates. The LSC copper surface was discolored around the areas of grounding strap impact, and arcing occurred at the LSC clamps and LSC ends. This test verified that the present flight configuration of the redesigned solid rocket motor systems tunnel, when subjected to simulated lightning strikes with peak current levels within 71 percent of the worst-case lightning strike condition of NSTS-07636, is adequate to prevent LSC ignition. It is therefore recommended that the design remain unchanged.
NASA Astrophysics Data System (ADS)
Mirniaharikandehei, Seyedehnafiseh; Hollingsworth, Alan B.; Patel, Bhavika; Heidari, Morteza; Liu, Hong; Zheng, Bin
2018-05-01
This study aims to investigate the feasibility of identifying a new quantitative imaging marker based on false-positives generated by a computer-aided detection (CAD) scheme to help predict short-term breast cancer risk. An image dataset including four-view mammograms acquired from 1044 women was retrospectively assembled. All mammograms were originally interpreted as negative by radiologists. In the next subsequent mammography screening, 402 women were diagnosed with breast cancer and 642 remained negative. An existing CAD scheme was applied ‘as is’ to process each image. From the CAD-generated results, four detection features were computed for each image: (1) the total number of initial detection seeds, (2) the total number of final detected false-positive regions, and the (3) average and (4) sum of the detection scores. Then, by combining the features computed from the two bilateral images of the left and right breasts from either the craniocaudal or the mediolateral oblique view, two logistic regression models were trained and tested using a leave-one-case-out cross-validation method to predict the likelihood of each testing case being positive in the next subsequent screening. The new prediction model yielded a maximum prediction accuracy with an area under the ROC curve of AUC = 0.65 ± 0.017 and a maximum adjusted odds ratio of 4.49 with a 95% confidence interval of (2.95, 6.83). The results also showed an increasing trend in the adjusted odds ratio and risk prediction scores (p < 0.01). Thus, this study demonstrated that CAD-generated false-positives might include valuable information, which needs to be further explored for identifying and/or developing more effective imaging markers for predicting short-term breast cancer risk.
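A minimal sketch of the leave-one-case-out scheme described above, using synthetic stand-ins for the four CAD detection features and a plain NumPy logistic regression; the feature values, labels, cohort size, and learning-rate settings are all hypothetical:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=500):
    """Plain gradient-ascent logistic regression (no regularization)."""
    Xb = np.c_[np.ones(len(X)), X]        # prepend an intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)
    return w

def predict_proba(X, w):
    Xb = np.c_[np.ones(len(X)), X]
    return 1.0 / (1.0 + np.exp(-Xb @ w))

rng = np.random.default_rng(0)
n = 80
# Four hypothetical per-case CAD features (seed count, false-positive region
# count, mean and summed detection scores), already combined across views.
X = rng.normal(size=(n, 4))
y = (X[:, 1] + 0.5 * rng.normal(size=n) > 0).astype(float)  # synthetic labels

# Leave-one-case-out: refit on n-1 cases, score the single held-out case.
scores = np.array([
    predict_proba(X[i:i + 1],
                  fit_logistic(np.delete(X, i, 0), np.delete(y, i)))[0]
    for i in range(n)
])

# AUC by rank comparison of held-out scores for positive vs negative cases.
pos, neg = scores[y == 1], scores[y == 0]
auc = (pos[:, None] > neg[None, :]).mean()
```

Leave-one-case-out keeps every case out of its own training fold, which matters for the small, imbalanced datasets typical of risk-prediction studies like this one.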
Empirical Analysis and Refinement of Expert System Knowledge Bases
1988-08-31
refinement. Both a simulated case generation program and a random rule basher were developed to enhance rule refinement experimentation. *Substantial...the second fiscal year 88 objective was fully met. Rule Refinement System Simulated Rule Basher Case Generator Stored Cases Expert System Knowledge...generated until the rule is satisfied. Cases may be randomly generated for a given rule or hypothesis. Rule Basher Given that one has a correct
Carbon-monoxide poisoning in young drug addicts due to indoor use of a gasoline-powered generator.
Marc, B; Bouchez-Buvry, A; Wepierre, J L; Boniol, L; Vaquero, P; Garnier, M
2001-06-01
We report six fatal cases of unintentional carbon-monoxide poisoning which occurred in a house occupied by young people. The source of carbon monoxide was a gasoline-powered generator. For all victims, an external body examination was carried out and blood and urine samples were collected. Blood carboxyhaemoglobin (COHb) was measured using an automated visible spectrophotometric analysis. Blood-alcohol levels were quantified using gas chromatography, and drug screening in urine was performed by a one-step manual qualitative immunochromatography (Syva Rapid Test, Behring Diagnostics Inc.) for benzoylecgonine (the main metabolite of cocaine in urine), morphine, 11-nor-Delta(9)-THC-9-COOH (cannabinoids) and d-methamphetamine. In all victims the COHb value was 65% or higher. No alcohol was found in blood samples, but urine samples were positive for methamphetamine, cocaine and cannabis in five cases and for opiates in one case. In four victims, the urine sample was positive for at least three drugs. The availability and accuracy of rapid toxicological screening make it an important tool for the medical examiner at the scene of a clinical forensic examination.
Anonymity and Electronics: Adapting Preparation for Radiology Resident Examination.
Chapman, Teresa; Reid, Janet R; O'Conner, Erin E
2017-06-01
Diagnostic radiology resident assessment has evolved from a traditional oral examination to computerized testing. Teaching faculty struggle to reconcile the differences between traditional teaching methods and residents' new preferences for computerized testing models generated by new examination styles. We aim to summarize the collective experiences of senior residents at three different teaching hospitals who participated in case review sessions using a computer-based, interactive, anonymous teaching tool, rather than the Socratic method. Feedback was collected from radiology residents following participation in a senior resident case review session using Nearpod, which allows residents to anonymously respond to the teaching material. Subjective resident feedback was uniformly enthusiastic. Ninety percent of residents favor a case-based board review incorporating multiple-choice questions, and 94% favor an anonymous response system. Nearpod allows for inclusion of multiple-choice questions while also providing direct feedback to the teaching faculty, helping to direct the instruction and clarify residents' gaps in knowledge before the Core Examination. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Generation of "virtual" control groups for single arm prostate cancer adjuvant trials.
Jia, Zhenyu; Lilly, Michael B; Koziol, James A; Chen, Xin; Xia, Xiao-Qin; Wang, Yipeng; Skarecky, Douglas; Sutton, Manuel; Sawyers, Anne; Ruckle, Herbert; Carpenter, Philip M; Wang-Rodriguez, Jessica; Jiang, Jun; Deng, Mingsen; Pan, Cong; Zhu, Jian-Guo; McLaren, Christine E; Gurley, Michael J; Lee, Chung; McClelland, Michael; Ahlering, Thomas; Kattan, Michael W; Mercola, Dan
2014-01-01
It is difficult to construct a control group for trials of adjuvant therapy (Rx) of prostate cancer after radical prostatectomy (RP) due to ethical issues and patient acceptance. We utilized 8 curve-fitting models to estimate the time to 60%, 65%, … 95% chance of progression free survival (PFS) based on the data derived from Kattan post-RP nomogram. The 8 models were systematically applied to a training set of 153 post-RP cases without adjuvant Rx to develop 8 subsets of cases (reference case sets) whose observed PFS times were most accurately predicted by each model. To prepare a virtual control group for a single-arm adjuvant Rx trial, we first select the optimal model for the trial cases based on the minimum weighted Euclidean distance between the trial case set and the reference case set in terms of clinical features, and then compare the virtual PFS times calculated by the optimum model with the observed PFSs of the trial cases by the logrank test. The method was validated using an independent dataset of 155 post-RP patients without adjuvant Rx. We then applied the method to patients on a Phase II trial of adjuvant chemo-hormonal Rx post RP, which indicated that the adjuvant Rx is highly effective in prolonging PFS after RP in patients at high risk for prostate cancer recurrence. The method can accurately generate control groups for single-arm, post-RP adjuvant Rx trials for prostate cancer, facilitating development of new therapeutic strategies.
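The model-selection step described above (picking the curve-fitting model whose reference case set lies closest to the trial cohort in weighted Euclidean distance over clinical features) can be sketched as follows; the feature weights, reference means, and cohort sizes are placeholders, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(7)
n_models, n_features = 8, 5

# Mean clinical-feature vector of each model's reference case set, and
# assumed per-feature importance weights (all values hypothetical).
reference_means = rng.normal(size=(n_models, n_features))
feature_weights = np.array([2.0, 1.0, 1.0, 0.5, 0.5])

# Clinical features of the cases enrolled in the single-arm trial.
trial_cases = rng.normal(size=(30, n_features))
trial_mean = trial_cases.mean(axis=0)

# Weighted Euclidean distance from the trial cohort to each reference set;
# the closest model would supply the virtual PFS times for the control arm.
dist = np.sqrt((feature_weights * (reference_means - trial_mean) ** 2).sum(axis=1))
best_model = int(np.argmin(dist))
```

Once the closest model is chosen, the study compares its predicted ("virtual") PFS times with the observed PFS of the trial cases by the logrank test; that survival comparison is omitted here.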
Huang, Qing; Fu, Wei-Ling; You, Jian-Ping; Mao, Qing
2016-10-01
Ebola virus disease (EVD), caused by Ebola virus (EBOV), is a severe acute infectious disease with a high case-fatality rate. Etiological and serological EBOV detection methods, including techniques that involve the detection of the viral genome, virus-specific antigens and anti-virus antibodies, are standard laboratory diagnostic tests that facilitate confirmation or exclusion of EBOV infection. In addition, routine blood tests, liver and kidney function tests, electrolytes and coagulation tests and other diagnostic examinations are important for the clinical diagnosis and treatment of EVD. Because of the high viral load in body fluids and secretions from EVD patients, all body fluids are highly contagious. As a result, biosafety control measures during the collection, transport and testing of clinical specimens obtained from individuals scheduled to undergo EBOV infection testing (including suspected, probable and confirmed cases) are crucial. This report has been generated following extensive work experience in the China Ebola Treatment Center (ETC) in Liberia and incorporates important information pertaining to relevant diagnostic standards, clinical significance, operational procedures, safety controls and other issues related to laboratory testing of EVD. Relevant opinions and suggestions are presented in this report to provide contextual awareness associated with the development of standards and/or guidelines related to EVD laboratory testing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soria-Lara, Julio A., E-mail: julio.soria-lara@ouce.ox.ac.uk; Bertolini, Luca, E-mail: l.bertolini@uva.nl; Brömmelstroet, Marco te, E-mail: M.C.G.teBrommelstroet@uva.nl
The integration of knowledge from stakeholders and the public at large is seen as one of the biggest process-related barriers during the scoping phase of EIA application in transport planning. While the academic literature offers abundant analyses, discussions and suggestions on how to overcome this problem, the proposed solutions are yet to be adequately tested in practice. In order to address this gap, we test the effectiveness of a set of interventions and trigger mechanisms for improving different aspects of knowledge integration. The interventions are tested in an experiential study with two sequential cases, representing “close-to-real-life” conditions, in the context of two cities in Andalusia, Spain. In general terms, the participants perceived that the integration of knowledge improved during the simulation of the EIA scoping phase. Certain shortcomings were also discussed, fundamentally related to how the time spent during the scoping phase was crucial to leading an effective learning process among the people involved. The study concludes with a reflection on the effectiveness of the tested interventions according to similarities and differences obtained from the two experiential case studies, as well as with a discussion of the potential to generate new knowledge through the use of experiential studies in EIA practice. - Highlights: • It tests a set of interventions and mechanisms to improve the integration of knowledge. • The scoping phase of EIA is simulated to assess the effectiveness of interventions. • Two sequential case studies are used.
A Testing Framework for Critical Space SW
NASA Astrophysics Data System (ADS)
Fernandez, Ignacio; Di Cerbo, Antonio; Dehnhardt, Erik; Massimo, Tipaldi; Brünjes, Bernhard
2015-09-01
This paper describes a testing framework for critical space SW named the Technical Specification Validation Framework (TSVF). It provides a powerful and flexible means of validation and can be used throughout the SW test activities (test case specification and implementation, test execution, and test artifact analysis). In particular, tests can be run in an automated and/or step-by-step mode. The TSVF framework is currently used for the validation of the Satellite Control Software (SCSW), which runs on the Meteosat Third Generation (MTG) satellite on-board computer. The main purpose of the SCSW is to control the spacecraft along with its various subsystems (AOCS, Payload, Electrical Power, Telemetry Tracking & Command, etc.) in a way that guarantees a high degree of flexibility and autonomy. The TSVF framework serves the challenging needs of the SCSW project, where a plan-driven approach has been combined with an agile process in order to produce preliminary SW versions (with a reduced scope of implemented functionality) that fulfill the stakeholders' needs [1]. The paper is organised as follows. Section 2 gives an overview of the TSVF architecture and its interfaces to the test bench, along with the technology used for its implementation. Section 3 describes the key elements of the XML-based language for test case implementation. Section 4 highlights the benefits compared to conventional test environments requiring manual test script development, and section 5 concludes the paper.
Tri-Service Corrosion Conference
2002-01-18
PREVENTION / CASE STUDIES: Issues in the Measurement of Volatile Organic Compounds (VOCs) in New-Generation Low-VOC Marine Coatings for... Bell Lab's Corrosion Preventive Compound (MIL-L-87177A Grade B), David H. Horne, ChE., P.E.; The Operational Testing of the CPC ACF-50 on the... A. Matzdorf; Low Volatile Organic Compound (VOC) Chemical Agent Resistant Coating (CARC) Application Demonstration/Validation, Lisa Weiser
2015-01-31
from a wireless joystick console broadcasting at 2.4 GHz. Figure 6. GTRI Airborne Unmanned Sensor System. As shown in Figure 7, the autopilot has a...generating wind turbines, and video reconnaissance systems on unmanned aerial vehicles (UAVs). The most basic decision problem in designing a...chosen test UAV case was the GTRI Aerial Unmanned Sensor System (GAUSS) aircraft. The GAUSS platform is a small research UAV with a widely used
2016-04-01
environment. Modeling is suitable for well-characterized parts, and stochastic modeling techniques can be used for sensitivity analysis and generating a...large cohort of trials to spot unusual cases. However, deployment repeatability is inherently a nonlinear phenomenon, which makes modeling difficult...recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the U.S. Air Force. 1. Test the flight model
Liu, Chang; Dobson, Jacob; Cawley, Peter
2017-03-01
Permanently installed guided wave monitoring systems are attractive for monitoring large structures. By frequently interrogating the test structure over a long period of time, such systems have the potential to detect defects much earlier than with conventional one-off inspection, and reduce the time and labour cost involved. However, for the systems to be accepted under real operational conditions, their damage detection performance needs to be evaluated in these practical settings. The receiver operating characteristic (ROC) is an established performance metric for one-off inspections, but the generation of the ROC requires many test structures with realistic damage growth at different locations and different environmental conditions, and this is often impractical. In this paper, we propose an evaluation framework using experimental data collected over multiple environmental cycles on an undamaged structure with synthetic damage signatures added by superposition. Recent advances in computation power enable examples covering a wide range of practical scenarios to be generated, and for multiple cases of each scenario to be tested so that the statistics of the performance can be evaluated. The proposed methodology has been demonstrated using data collected from a laboratory pipe specimen over many temperature cycles, superposed with damage signatures predicted for a flat-bottom hole growing at different rates at various locations. Three damage detection schemes, conventional baseline subtraction, singular value decomposition (SVD) and independent component analysis (ICA), have been evaluated. It has been shown that in all cases, the component methods perform significantly better than the residual method, with ICA generally the better of the two. The results have been validated using experimental data monitoring a pipe in which a flat-bottom hole was drilled and enlarged over successive temperature cycles. 
The methodology can be used to evaluate the performance of an installed monitoring system and to show whether it is capable of detecting particular damage growth at any given location. It will enable monitoring results to be evaluated rigorously and will be valuable in the development of safety cases.
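The evaluation framework above (undamaged records collected over temperature cycles, synthetic damage signatures added by superposition, and a detection metric scored by ROC) can be illustrated with the conventional baseline-subtraction scheme on simulated data; all waveform parameters, noise levels, and signature amplitudes below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_records, n_samples = 200, 256
t = np.linspace(0, 1, n_samples)

def trace(shift):
    """Undamaged guided-wave record: a wave packet whose arrival time drifts
    slightly with temperature (shift), plus measurement noise."""
    envelope = np.exp(-((t - 0.5 - shift) / 0.1) ** 2)
    return np.sin(40 * (t - shift)) * envelope \
        + 0.02 * rng.standard_normal(n_samples)

# Temperature cycling over the monitoring period (hypothetical magnitude).
shifts = 0.001 * np.sin(np.linspace(0, 8 * np.pi, n_records))
records = np.array([trace(s) for s in shifts])

# Superpose a synthetic damage signature on every second record.
signature = 0.2 * np.exp(-((t - 0.75) / 0.02) ** 2)
damaged = np.arange(n_records) % 2 == 1
records[damaged] += signature

# Conventional baseline subtraction: residual energy relative to record 0.
residual = np.linalg.norm(records - records[0], axis=1)

# ROC summary by rank comparison: the probability that a damaged record
# out-scores an undamaged one (the area under the ROC curve).
auc = (residual[damaged][:, None] > residual[~damaged][None, :]).mean()
```

The paper's point is visible in this setup: the residual metric responds to temperature drift as well as to damage, which is why the component methods (SVD, ICA) that separate environmental variation perform better; those are not sketched here.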
Jurkowska, Monika; Gos, Aleksandra; Ptaszyński, Konrad; Michej, Wanda; Tysarowski, Andrzej; Zub, Renata; Siedlecki, Janusz A; Rutkowski, Piotr
2015-01-01
The study compares detection rates of oncogenic BRAF mutations in a homogenous group of 236 FFPE cutaneous melanoma lymph node metastases, collected in one cancer center. BRAF mutational status was verified by two independent in-house PCR/Sanger sequencing tests and the Cobas® 4800 BRAF V600 Mutation Test. The better of the two sequencing approaches returned results for 230/236 samples. In 140 (60.9%), a mutation in codon 600 of BRAF was found; 91.4% of all mutated cases (128 samples) represented p.V600E. Both Sanger-based tests gave reproducible results, although they differed significantly in the proportion of amplifiable samples: 230/236 vs. 109/143. Cobas generated results in all 236 cases; mutations changing codon V600 were detected in 144 of them (61.0%), including 5 that were not amplifiable and 5 that were negative in the standard sequencing. However, 6 cases positive in sequencing turned out to be negative in Cobas. Both tests provided the same BRAF V600 mutational status in 219 out of 230 cases with valid results (95.2%). The total BRAF V600 mutation detection rate did not differ significantly between the two methodological approaches (60.9% vs. 61.0%). Sequencing was a reproducible method of V600 mutation detection and more powerful at detecting mutations other than p.V600E, while the Cobas test proved less susceptible to poor DNA quality or investigator bias. The study underlined the important role of pathologists in the quality assurance of molecular diagnostics.
An extension of the directed search domain algorithm to bilevel optimization
NASA Astrophysics Data System (ADS)
Wang, Kaiqiang; Utyuzhnikov, Sergey V.
2017-08-01
A method is developed for generating a well-distributed Pareto set for the upper level in bilevel multiobjective optimization. The approach is based on the Directed Search Domain (DSD) algorithm, which is a classical approach for generation of a quasi-evenly distributed Pareto set in multiobjective optimization. The approach contains a double-layer optimizer designed in a specific way under the framework of the DSD method. The double-layer optimizer is based on bilevel single-objective optimization and aims to find a unique optimal Pareto solution rather than generate the whole Pareto frontier on the lower level in order to improve the optimization efficiency. The proposed bilevel DSD approach is verified on several test cases, and a relevant comparison against another classical approach is made. It is shown that the approach can generate a quasi-evenly distributed Pareto set for the upper level with relatively low time consumption.
Stanley, Pamela; Sundaram, Subha
2014-06-03
Glycosylation engineering is used to generate glycoproteins, glycolipids, or proteoglycans with a more defined complement of glycans on their glycoconjugates. For example, a mammalian cell glycosylation mutant lacking a specific glycosyltransferase generates glycoproteins, and/or glycolipids, and/or proteoglycans with truncated glycans missing the sugar transferred by that glycosyltransferase, as well as those sugars that would be added subsequently. In some cases, an alternative glycosyltransferase may then use the truncated glycans as acceptors, thereby generating a new or different glycan subset in the mutant cell. Another type of glycosylation mutant arises from gain-of-function mutations that, for example, activate a silent glycosyltransferase gene. In this case, glycoconjugates will have glycans with additional sugar(s) that are more elaborate than the glycans of wild type cells. Mutations in other genes that affect glycosylation, such as nucleotide sugar synthases or transporters, will alter the glycan complement in more general ways that usually affect several types of glycoconjugates. There are now many strategies for generating a precise mutation in a glycosylation gene in a mammalian cell. Large-volume cultures of mammalian cells may also generate spontaneous mutants in glycosylation pathways. This article will focus on how to rapidly characterize mammalian cells with an altered glycosylation activity. The key reagents for the protocols described are plant lectins that bind mammalian glycans with varying avidities, depending on the specific structure of those glycans. Cells with altered glycosylation generally become resistant or hypersensitive to lectin toxicity, and have reduced or increased lectin or antibody binding. Here we describe rapid assays to compare the cytotoxicity of lectins in a lectin resistance test, and the binding of lectins or antibodies by flow cytometry in a glycan-binding assay. 
Based on these tests, glycosylation changes expressed by a cell can be revealed, and glycosylation mutants classified into phenotypic groups that may reflect a loss-of-function or gain-of-function mutation in a specific gene involved in glycan synthesis. Copyright © 2014 John Wiley & Sons, Inc.
Ultrafast treatment plan optimization for volumetric modulated arc therapy (VMAT).
Men, Chunhua; Romeijn, H Edwin; Jia, Xun; Jiang, Steve B
2010-11-01
To develop a novel aperture-based algorithm for volumetric modulated arc therapy (VMAT) treatment plan optimization with high quality and high efficiency. The VMAT optimization problem is formulated as a large-scale convex programming problem solved by a column generation approach. The authors consider a cost function consisting of two terms, the first enforcing a desired dose distribution and the second guaranteeing a smooth dose rate variation between successive gantry angles. A gantry rotation is discretized into 180 beam angles and for each beam angle, only one MLC aperture is allowed. The apertures are generated one by one in a sequential way. At each iteration of the column generation method, a deliverable MLC aperture is generated for one of the unoccupied beam angles by solving a subproblem with the consideration of MLC mechanic constraints. A subsequent master problem is then solved to determine the dose rate at all currently generated apertures by minimizing the cost function. When all 180 beam angles are occupied, the optimization completes, yielding a set of deliverable apertures and associated dose rates that produce a high quality plan. The algorithm was preliminarily tested on five prostate and five head-and-neck clinical cases, each with one full gantry rotation without any couch/collimator rotations. High quality VMAT plans have been generated for all ten cases with extremely high efficiency. It takes only 5-8 min on CPU (MATLAB code on an Intel Xeon 2.27 GHz CPU) and 18-31 s on GPU (CUDA code on an NVIDIA Tesla C1060 GPU card) to generate such plans. The authors have developed an aperture-based VMAT optimization algorithm which can generate clinically deliverable high quality treatment plans at very high efficiency.
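A toy sketch of the column-generation loop described in the abstract: a pricing step opens the beamlets with negative reduced cost as a new aperture, and a master step re-optimizes the weights (dose rates) of all apertures generated so far. MLC mechanical constraints, the dose-rate smoothness term, and the authors' convex solver are omitted; the matrix sizes and the least-squares-with-clipping master step are simplifications, not the published method.

```python
import numpy as np

rng = np.random.default_rng(3)
n_voxels, n_beamlets = 60, 40
D = rng.random((n_voxels, n_beamlets))   # toy dose-deposition matrix
target = np.ones(n_voxels)               # prescribed dose, one unit per voxel

apertures, weights, errs = [], np.zeros(0), []

def dose():
    if not apertures:
        return np.zeros(n_voxels)
    A = np.column_stack([D[:, ap].sum(axis=1) for ap in apertures])
    return A @ weights

for _ in range(8):
    # Pricing problem: gradient of 0.5*||dose - target||^2 with respect to
    # each beamlet intensity; beamlets with negative price would reduce the
    # objective, so they form the next aperture (MLC constraints omitted).
    price = D.T @ (dose() - target)
    new_aperture = np.flatnonzero(price < 0)
    if new_aperture.size == 0:
        break                            # no improving column remains
    apertures.append(new_aperture)
    # Master problem: re-optimize the weights (dose rates) of all apertures
    # generated so far; least squares with clipping to nonnegative values
    # stands in for the convex solver used in the paper.
    A = np.column_stack([D[:, ap].sum(axis=1) for ap in apertures])
    w, *_ = np.linalg.lstsq(A, target, rcond=None)
    weights = np.clip(w, 0.0, None)
    errs.append(np.linalg.norm(dose() - target))
```

The appeal of column generation here is that only a handful of aperture "columns" ever exist explicitly, even though the space of deliverable apertures is combinatorially large.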
Lee, Q U
2014-10-01
A 10% cross-reactivity rate is commonly cited between penicillins and cephalosporins. However, this figure originated from studies in the 1960s and 1970s which included first-generation cephalosporins with similar side-chains to penicillins. Cephalosporins were frequently contaminated by trace amount of penicillins at that time. The side-chain hypothesis for beta-lactam hypersensitivity is supported by abundant scientific evidence. Newer generations of cephalosporins possess side-chains that are dissimilar to those of penicillins, leading to low cross-reactivity. In the assessment of cross-reactivity between penicillins and cephalosporins, one has to take into account the background beta-lactam hypersensitivity, which occurs in up to 10% of patients. Cross-reactivity based on skin testing or in-vitro test occurs in up to 50% and 69% of cases, respectively. Clinical reactivity and drug challenge test suggest an average cross-reactivity rate of only 4.3%. For third- and fourth-generation cephalosporins, the rate is probably less than 1%. Recent international guidelines are in keeping with a low cross-reactivity rate. Despite that, the medical community in Hong Kong remains unnecessarily skeptical. Use of cephalosporins in patients with penicillin hypersensitivity begins with detailed history and physical examination. Clinicians can choose a cephalosporin with a different side-chain. Skin test for penicillin is not predictive of cephalosporin hypersensitivity, while cephalosporin skin test is not sensitive. Drug provocation test by experienced personnel remains the best way to exclude or confirm the diagnosis of drug hypersensitivity and to find a safe alternative for future use. A personalised approach to cross-reactivity is advocated.
Bastida, Jose Maria; González-Porras, Jose Ramon; Jiménez, Cristina; Benito, Rocio; Ordoñez, Gonzalo R; Álvarez-Román, Maria Teresa; Fontecha, M Elena; Janusz, Kamila; Castillo, David; Fisac, Rosa María; García-Frade, Luis Javier; Aguilar, Carlos; Martínez, María Paz; Bermejo, Nuria; Herrero, Sonia; Balanzategui, Ana; Martin-Antorán, Jose Manuel; Ramos, Rafael; Cebeiro, Maria Jose; Pardal, Emilia; Aguilera, Carmen; Pérez-Gutierrez, Belen; Prieto, Manuel; Riesco, Susana; Mendoza, Maria Carmen; Benito, Ana; Hortal Benito-Sendin, Ana; Jiménez-Yuste, Víctor; Hernández-Rivas, Jesus Maria; García-Sanz, Ramon; González-Díaz, Marcos; Sarasquete, Maria Eugenia
2017-01-05
Currently, molecular diagnosis of haemophilia A and B (HA and HB) highlights the excess risk of inhibitor development associated with specific mutations, and enables carrier testing of female relatives and prenatal or preimplantation genetic diagnosis. Molecular testing for HA also helps distinguish it from von Willebrand disease (VWD). Next-generation sequencing (NGS) allows simultaneous investigation of several complete genes, even though they may span very extensive regions. This study aimed to evaluate the usefulness of a molecular algorithm employing an NGS approach for sequencing the complete F8, F9 and VWF genes. The proposed algorithm includes the detection of inversions of introns 1 and 22, an NGS custom panel (the entire F8, F9 and VWF genes), and multiplex ligation-dependent probe amplification (MLPA) analysis. A total of 102 samples (97 FVIII- and FIX-deficient patients, and five female carriers) were studied. IVS-22 screening identified 11 out of 20 severe HA patients and one female carrier. IVS-1 analysis did not reveal any alterations. The NGS approach gave positive results in 88 cases, allowing the differential diagnosis of mild/moderate HA and VWD in eight cases. MLPA confirmed one large exon deletion. Only one case had no pathogenic variants. The proposed algorithm had an overall success rate of 99%. In conclusion, our evaluation demonstrates that this algorithm can reliably identify pathogenic variants and diagnose patients with HA, HB or VWD.
Aeroacoustic simulation of a linear cascade by a prefactored compact scheme
NASA Astrophysics Data System (ADS)
Ghillani, Pietro
This work documents the development of a three-dimensional high-order prefactored compact finite-difference solver for computational aeroacoustics (CAA) based on the inviscid Euler equations. This time-explicit scheme is applied to representative problems of sound generation by flow interacting with solid boundaries. Four aeroacoustic problems are explored and the results validated against available reference analytical solutions. Selected mesh convergence studies are conducted to determine the effective order of accuracy of the complete scheme. The first test case simulates the noise emitted by a still cylinder in an oscillating field. It provides a simple validation for the CAA-compatible solid wall condition used in the remainder of the work. The following test cases are increasingly complex versions of the turbomachinery rotor-stator interaction problem taken from the NASA CAA workshops. In all cases the results are compared against the available literature. The numerical method features some appreciable contributions to computational aeroacoustics. A reduced data exchange technique for parallel computations is implemented, which requires the exchange of just two values for each boundary node, independently of the size of the zone overlap. A modified version of the non-reflecting buffer layer by Chen is used to allow aerodynamic perturbations at the through-flow boundaries. The Giles subsonic boundary conditions are extended to three-dimensional curvilinear coordinates. These advances have made it possible to resolve the aerodynamic noise generation and near-field propagation on a representative cascade geometry with a time-marching scheme, with accuracy similar to spectral methods.
Multi-GPU implementation of a VMAT treatment plan optimization algorithm.
Tian, Zhen; Peng, Fei; Folkerts, Michael; Tan, Jun; Jia, Xun; Jiang, Steve B
2015-06-01
Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, a GPU's relatively small memory cannot handle cases with a large dose-deposition coefficient (DDC) matrix, e.g., those with a large target size, multiple targets, multiple arcs, and/or a small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors' group, on a multi-GPU platform to solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported. Hence, another purpose is to present detailed techniques employed for GPU implementation. The authors also would like to utilize this particular problem as an example to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors' method, the sparse DDC matrix is first stored on a CPU in coordinate list (COO) format. On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row format. Computation of beamlet price, the first step in PP, is accomplished using multiple GPUs. A fast inter-GPU data transfer scheme is accomplished using peer-to-peer access. The remaining steps of the PP and MP are implemented on the CPU or a single GPU due to their modest problem scale and computational loads. The Barzilai-Borwein algorithm with a subspace step scheme is adopted here to solve the MP.
A head and neck (H&N) cancer case is then used to validate the authors' method. The authors also compare their multi-GPU implementation with three different single-GPU implementation strategies, i.e., truncating the DDC matrix (S1), repeatedly transferring the DDC matrix between CPU and GPU (S2), and porting computations involving the DDC matrix to the CPU (S3), in terms of both plan quality and computational efficiency. Two more H&N patient cases and three prostate cases are used to demonstrate the advantages of the authors' method. The authors' multi-GPU implementation can finish the optimization process within ∼1 min for the H&N patient case. S1 leads to inferior plan quality, although its total time was 10 s shorter than the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23-46 s. By contrast, to obtain clinically comparable or acceptable plans for all six of these VMAT cases, the optimization time needed in a commercial TPS system on CPU was found to be on the order of several minutes. The results demonstrate that the multi-GPU implementation of the authors' column-generation-based VMAT optimization can handle the large-scale VMAT optimization problem efficiently without sacrificing plan quality. The authors' study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.
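The DDC partitioning step described above (host-side COO matrix split into per-beam-angle CSR submatrices, one per GPU) can be sketched in Python, with SciPy standing in for the GPU-side sparse storage. The function name, the column-wise grouping of beamlets by angle, and `angle_of_col` are illustrative assumptions, not the authors' code:

```python
import numpy as np
from scipy import sparse

def split_ddc_by_angle(ddc_coo, angle_of_col, n_gpus):
    """Split a sparse DDC matrix (COO format on the host) into per-angle
    CSR submatrices, one per GPU. Columns are assumed to index beamlets,
    and angle_of_col assigns each beamlet column to a beam-angle group."""
    csr = ddc_coo.tocsr()
    parts = []
    for g in range(n_gpus):
        cols = np.flatnonzero(angle_of_col == g)       # beamlets of this group
        parts.append(sparse.csr_matrix(csr[:, cols]))  # CSR slice destined for one GPU
    return parts
```

In the real implementation each submatrix would then be copied to its own device, and peer-to-peer transfers would gather the per-GPU beamlet prices.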
Notes on the diversity of the properties of radio bursts observed on the nightside of Venus
NASA Technical Reports Server (NTRS)
Sonwalkar, Vikas S.; Carpenter, D. L.
1995-01-01
We report on further studies of radio wave bursts detected by the Orbiting Electric Field Detector (OEFD) on the Pioneer Venus Orbiter (PVO) in the nightside ionosphere of Venus. We have tested a total of 25 cases of wave burst activity for evidence of whistler-mode propagation to the spacecraft from impulsive subionospheric sources. As in a previous study of 11 of these cases (Sonwalkar et al., 1991), we find at least two distinct classes of events: one, mostly involving bursts at 100 Hz only, that passes certain tests for whistler-mode propagation, and another, mostly involving bursts in two or more of the four PVO narrowband channels (at 100 Hz, 730 Hz, 5.4 kHz, and 30 kHz), that fails to pass the tests. The subionospheric lightning hypothesis continues to be tenable as a candidate explanation for many of the 100 Hz-only events, but its generality is limited by the number of 100 Hz-only cases that do not pass all the applicable whistler-mode tests, as well as by the existence, at a wide range of altitudes, of multichannel cases that are clearly not propagating whistler-mode waves. The wideband bursts are often observed at altitudes above 1000 km and frequently occur in regions of locally reduced electron density. Those observed at high altitude (and possibly low altitude as well) are believed to be generated near the spacecraft, possibly by an as yet unknown mechanism responsible for similar burst observations made near Earth and other planets.
A Preliminary Study of Building a Transmission Overlay for Regional US Power Grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lei, Yin; Li, Yalong; Liu, Yilu
2015-01-01
Many European countries have taken steps toward a Supergrid in order to transmit large amounts of intermittent and remote renewable energy over long distances to load centers. In the US, with the expected increase in renewable generation and electricity demand, a similar problem arises. A potential solution is to upgrade the transmission system at a higher voltage by constructing a new overlay grid. This paper will first address basic requirements for such an overlay grid. Potential transmission technologies will also be discussed. A multi-terminal VSC HVDC model is developed in DSATools to implement the overlay grid, and a test case on a regional NPCC system will be simulated. Another test system of the entire US power grid, with three different interconnections tied together using back-to-back HVDC, is also introduced in this paper. Building an overlay system on top of this test case is ongoing, and will be discussed in future work.
NGS testing for cardiomyopathy: Utility of adding RASopathy-associated genes.
Ceyhan-Birsoy, Ozge; Miatkowski, Maya M; Hynes, Elizabeth; Funke, Birgit H; Mason-Suares, Heather
2018-04-25
RASopathies include a group of syndromes caused by pathogenic germline variants in RAS-MAPK pathway genes and typically present with facial dysmorphology, cardiovascular disease, and musculoskeletal anomalies. Recently, variants in RASopathy-associated genes have been reported in individuals with apparently nonsyndromic cardiomyopathy, suggesting that subtle features may be overlooked. To determine the utility and burden of adding RASopathy-associated genes to cardiomyopathy panels, we tested 11 RASopathy-associated genes by next-generation sequencing (NGS), including NGS-based copy number variant assessment, in 1,111 individuals referred for genetic testing for hypertrophic cardiomyopathy (HCM) or dilated cardiomyopathy (DCM). Disease-causing variants were identified in 0.6% (four of 692) of individuals with HCM, including three missense variants in the PTPN11, SOS1, and BRAF genes. Overall, 36 variants of uncertain significance (VUSs) were identified, averaging ∼3 VUSs per 100 cases. This study demonstrates that adding a subset of the RASopathy-associated genes to cardiomyopathy panels will increase clinical diagnoses without significantly increasing the number of VUSs per case. © 2018 Wiley Periodicals, Inc.
Critical joints in large composite aircraft structure
NASA Technical Reports Server (NTRS)
Nelson, W. D.; Bunin, B. L.; Hart-Smith, L. J.
1983-01-01
A program was conducted at Douglas Aircraft Company to develop the technology for critical structural joints of composite wing structure that meets design requirements for a 1990 commercial transport aircraft. The prime objective of the program was to demonstrate the ability to reliably predict the strength of large bolted composite joints. Ancillary testing of 180 specimens generated data on strength and load-deflection characteristics which provided input to the joint analysis. Load-sharing between fasteners in multirow bolted joints was computed by the nonlinear analysis program A4EJ. This program was used to predict the strengths of 20 additional large subcomponents representing strips from a wing root chordwise splice. In most cases, the predictions were accurate to within a few percent of the test results. In some cases, the observed mode of failure was different from that anticipated. The highlight of the subcomponent testing was the consistent ability to achieve gross-section failure strains close to 0.005. That represents a considerable improvement over the state of the art.
Influence of the sex of the transmitting grandparent in congenital myotonic dystrophy.
López de Munain, A; Cobo, A M; Poza, J J; Navarrete, D; Martorell, L; Palau, F; Emparanza, J I; Baiget, M
1995-09-01
To analyse the influence of the sex of the transmitting grandparents on the occurrence of the congenital form of myotonic dystrophy (CDM), we have studied complete three-generation pedigrees of 49 CDM cases, analysing: (1) the sex distribution in the grandparents' generation, and (2) the intergenerational amplification of the CTG repeat, measured in its absolute and relative values, between grandparents and the mothers of CDM patients and between the latter and their CDM children. The mean relative intergenerational increase in the 32 grandparent-mother pairs was significantly greater than in the 56 mother-CDM pairs (Mann-Whitney U test, p < 0.001). The mean expansion of the grandfathers (103 CTG repeats) was also significantly different from that seen in the grandmothers' group (154 CTG repeats) (Mann-Whitney U test, p < 0.01). This excess of non-manifesting males in the grandparents' generation of CDM patients, carrying smaller CTG expansions than the grandmothers, could suggest that the premutation has to be transmitted by a male to reach the degree of instability responsible for the subsequent unconstrained intergenerational CTG expansions characteristic of the CDM range.
NASA Technical Reports Server (NTRS)
Xiang, Xuwu; Smith, Eric A.; Tripoli, Gregory J.
1992-01-01
A hybrid statistical-physical retrieval scheme is explored which combines a statistical approach with an approach based on the development of cloud-radiation models designed to simulate precipitating atmospheres. The algorithm employs the detailed microphysical information from a cloud model as input to a radiative transfer model which generates a cloud-radiation model database. Statistical procedures are then invoked to objectively generate an initial guess composite profile data set from the database. The retrieval algorithm has been tested for a tropical typhoon case using Special Sensor Microwave/Imager (SSM/I) data and has shown satisfactory results.
voom: Precision weights unlock linear model analysis tools for RNA-seq read counts.
Law, Charity W; Chen, Yunshun; Shi, Wei; Smyth, Gordon K
2014-02-03
New normal linear modeling strategies are presented for analyzing read counts from RNA-seq experiments. The voom method estimates the mean-variance relationship of the log-counts, generates a precision weight for each observation and enters these into the limma empirical Bayes analysis pipeline. This opens access for RNA-seq analysts to a large body of methodology developed for microarrays. Simulation studies show that voom performs as well or better than count-based RNA-seq methods even when the data are generated according to the assumptions of the earlier methods. Two case studies illustrate the use of linear modeling and gene set testing methods. PMID:24485249
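The core voom idea — estimate a mean-variance trend on the log-counts and convert it into per-observation precision weights — can be illustrated with a much-simplified sketch. This is not the limma implementation: the polynomial trend below stands in for voom's lowess curve, and all names are illustrative:

```python
import numpy as np

def voom_style_weights(counts, lib_sizes=None, degree=2):
    """Simplified sketch of voom-style precision weights.

    counts: genes x samples matrix of raw read counts.
    Computes log2-CPM, fits a polynomial trend (standing in for voom's
    lowess curve) of sqrt(standard deviation) against mean log-count,
    and returns per-observation weights = predicted_sd ** -4."""
    counts = np.asarray(counts, dtype=float)
    if lib_sizes is None:
        lib_sizes = counts.sum(axis=0)
    logcpm = np.log2((counts + 0.5) / (lib_sizes + 1.0) * 1e6)
    gene_mean = logcpm.mean(axis=1)
    gene_sqrt_sd = np.sqrt(logcpm.std(axis=1, ddof=1))
    trend = np.polynomial.Polynomial.fit(gene_mean, gene_sqrt_sd, degree)
    pred = trend(logcpm)                      # fitted sqrt-sd per observation
    return np.clip(pred, 1e-3, None) ** -4    # larger weight = more precise
```

The resulting weight matrix is what voom passes into the weighted linear-model fit of the limma pipeline.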
NASA Technical Reports Server (NTRS)
Snyder, Gregory A.; Taylor, Lawrence A.; Neal, Clive R.
1992-01-01
A chemical model for simulating the sources of the lunar mare basalts was developed by considering a modified mafic cumulate source formed during the combined equilibrium and fractional crystallization of a lunar magma ocean (LMO). The parameters which influence the initial LMO and its subsequent crystallization are examined, and both trace and major elements are modeled. It is shown that major elements tightly constrain the composition of mare basalt sources and the pathways to their creation. The ability of this LMO model to generate viable mare basalt source regions was tested through a case study involving the high-Ti basalts.
Feasibility of potable water generators to meet vessel numeric ballast water discharge limits.
Albert, Ryan J; Viveiros, Edward; Falatko, Debra S; Tamburri, Mario N
2017-07-15
Ballast water is taken on-board vessels into ballast water tanks to maintain vessel draft, buoyancy, and stability. Unmanaged ballast water contains aquatic organisms that, when transported and discharged to non-native waters, may establish as invasive species. Technologies capable of achieving regulatory limits designed to decrease the likelihood of invasion include onboard ballast water management systems. However, to date, the treatment development and manufacturing marketplace is limited to large vessels with substantial ballast requirements. For smaller vessels or vessels with reduced ballast requirements, we evaluated the feasibility of meeting the discharge limits by generating ballast water using onboard potable water generators. Case studies and parametric analyses demonstrated the architectural feasibility of installing potable water generators onboard actual vessels with minimal impacts for most vessel types evaluated. Furthermore, land-based testing of a potable water generator demonstrated capability to meet current numeric discharge limits for living organisms in all size classes. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Zorila, Alexandru; Stratan, Aurel; Nemes, George
2018-01-01
We compare the ISO-recommended (the standard) data-reduction algorithm used to determine the surface laser-induced damage threshold of optical materials by the S-on-1 test with two newly suggested algorithms, both named "cumulative" algorithms/methods, a regular one and a limit-case one, intended to perform in some respects better than the standard one. To avoid the additional errors of real experiments, a simulated test, named the reverse approach, is performed. This approach simulates the real damage experiments by generating artificial test data of damaged and non-damaged sites, based on an assumed, known damage threshold fluence of the target and on a given probability distribution function for inducing the damage. In this work, a database of 12 sets of test data containing both damaged and non-damaged sites was generated by using four different reverse techniques and by assuming three specific damage probability distribution functions. The same value for the threshold fluence was assumed, and a Gaussian fluence distribution on each irradiated site was considered, as is usual for the S-on-1 test. Each set of test data was independently processed by the standard and by the two cumulative data-reduction algorithms, the resulting fitted probability distributions were compared with the initially assumed probability distribution functions, and the quantities used to compare these algorithms were determined. These quantities characterize the accuracy and the precision in determining the damage threshold and the goodness of fit of the damage probability curves. The results indicate that the accuracy in determining the absolute damage threshold is best for the ISO-recommended method, the precision is best for the limit-case of the cumulative method, and the goodness-of-fit estimator (adjusted R-squared) is almost the same for all three algorithms.
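The reverse approach can be illustrated with a minimal sketch: generate artificial damaged/non-damaged site data from a known threshold, then run a simple data reduction on it. The logistic damage-probability curve and the linear-extrapolation reduction below are illustrative stand-ins for the distributions and the ISO procedure actually used in the paper:

```python
import numpy as np

def simulate_s_on_1(n_sites, f_threshold, width, f_max, rng):
    """Reverse-approach sketch: generate artificial S-on-1 test data.
    Each site is irradiated at a random peak fluence in (0, f_max); the
    probability that the site damages follows an assumed logistic curve
    centred on the known threshold f_threshold with spread width."""
    fluences = rng.uniform(0.0, f_max, n_sites)
    p_damage = 1.0 / (1.0 + np.exp(-(fluences - f_threshold) / width))
    damaged = rng.random(n_sites) < p_damage
    return fluences, damaged

def linear_extrapolation_threshold(fluences, damaged, n_bins=10):
    """Standard-style data reduction (a simplification of the ISO method):
    bin the sites by fluence, compute the damage frequency per bin, fit a
    line, and report the zero-probability intercept as the threshold."""
    edges = np.linspace(fluences.min(), fluences.max(), n_bins + 1)
    centres, freqs = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (fluences >= lo) & (fluences < hi)
        if m.sum() >= 3:                      # skip sparsely populated bins
            centres.append(0.5 * (lo + hi))
            freqs.append(damaged[m].mean())
    slope, intercept = np.polyfit(centres, freqs, 1)
    return -intercept / slope                 # fluence at zero damage probability
```

Running several data-reduction algorithms on the same artificial data set, as the paper does, then allows their accuracy and precision to be compared against the known input threshold.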
Thermal Analysis of the Mound One Kilowatt Package
DOE Office of Scientific and Technical Information (OSTI.GOV)
Or, Chuen T.
The Mound One Kilowatt (1 kW) package was designed for the shipment of plutonium (Pu-238) with not more than 1 kW total heat dissipation. To comply with regulations, the Mound 1 kW package has to pass all the requirements under Normal Conditions of Transport (NCT; 38 degrees C ambient temperature) and Hypothetical Accident Conditions (HAC; package engulfed in fire for 30 minutes). Analytical and test results were presented in the Safety Analysis Report for Packaging (SARP) for the Mound 1 kW package, revision 1, April 1991. Some issues remained unresolved in that revision. In March 1992, Fairchild Space and Defense Corporation was commissioned by the Department of Energy to perform the thermal analyses. 3-D thermal models were created to perform the NCT and HAC analyses. Four shipping configurations in SARP revision 3 were analyzed: (1) the GPHS graphite impact shell (GIS) in the threaded product can (1000 W total heat generation); (2) the fueled clads in the welded product can (1000 W total heat generation); (3) the General Purpose Heat Source (GPHS) module (750 W total heat generation); and (4) the Multi-Hundred Watt (MHW) spheres (810 W total heat generation). Results from the four cases show that the GIS or fuel clad in the product can is the worst case. The temperatures predicted under NCT and HAC in all four cases are within the design limits. The use of helium instead of argon as cover gas provides a bigger safety margin.
Statechart Analysis with Symbolic PathFinder
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.
2012-01-01
We report here on our on-going work that addresses the automated analysis and test case generation for software systems modeled using multiple Statechart formalisms. The work is motivated by large programs such as NASA Exploration, that involve multiple systems that interact via safety-critical protocols and are designed with different Statechart variants. To verify these safety-critical systems, we have developed Polyglot, a framework for modeling and analysis of model-based software written using different Statechart formalisms. Polyglot uses a common intermediate representation with customizable Statechart semantics and leverages the analysis and test generation capabilities of the Symbolic PathFinder tool. Polyglot is used as follows: First, the structure of the Statechart model (expressed in Matlab Stateflow or Rational Rhapsody) is translated into a common intermediate representation (IR). The IR is then translated into Java code that represents the structure of the model. The semantics are provided as "pluggable" modules.
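The pluggable-semantics design described above — a structure-only intermediate representation combined with interchangeable execution-rule modules — can be sketched as follows (in Python rather than the Java code Polyglot actually generates; the class names and the single "first enabled transition" variant are illustrative assumptions, not Polyglot's API):

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Transition:
    source: str
    target: str
    event: str
    guard: Optional[Callable] = None   # optional predicate over the environment

@dataclass
class StatechartIR:
    """Structure-only intermediate representation: states and transitions
    are recorded, but execution semantics are deliberately left out."""
    states: list
    transitions: list
    initial: str

class Semantics:
    """Pluggable semantics module: each subclass encodes one Statechart
    variant's rules for taking a step."""
    def step(self, ir, current, event, env):
        raise NotImplementedError

class FirstEnabledSemantics(Semantics):
    """Illustrative variant: fire the first enabled transition in document
    order (real Stateflow/Rhapsody orderings are richer than this)."""
    def step(self, ir, current, event, env):
        for t in ir.transitions:
            if t.source == current and t.event == event and \
                    (t.guard is None or t.guard(env)):
                return t.target
        return current                 # no enabled transition: stay in place
```

Swapping in a different `Semantics` subclass changes how the same IR executes, which is the essence of analyzing one model under multiple Statechart variants.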
NASA Astrophysics Data System (ADS)
Charles, Christine; Boswell, Roderick; Bish, Andrew; Khayms, Vadim; Scholz, Edwin
2016-05-01
Gas flow heating using radio frequency plasmas offers the possibility of depositing power in the centre of the flow rather than on the outside, as is the case with electro-thermal systems where thermal wall losses lower efficiency. Improved systems for space propulsion are one possible application, and we have tested a prototype micro-thruster on a thrust balance in vacuum. For these initial tests, a fixed-component radio frequency matching network weighing 90 grams was closely attached to the thruster in vacuum, with the frequency-agile radio frequency generator power being delivered via a 50 Ohm cable. Without accounting for system losses (estimated at around 50%), for a few tens of watts from the radio frequency generator the specific impulse was tripled to ∼48 seconds and the thrust tripled from 0.8 to 2.4 milli-Newtons.
A reference protocol for comparing the biocidal properties of gas plasma generating devices
NASA Astrophysics Data System (ADS)
Shaw, A.; Seri, P.; Borghi, C. A.; Shama, G.; Iza, F.
2015-12-01
Growing interest in the use of non-thermal, atmospheric pressure gas plasmas for decontamination purposes has resulted in a multiplicity of plasma-generating devices. There is currently no universally approved method of comparing the biocidal performance of such devices and in the work described here spores of the Gram positive bacterium Bacillus subtilis (ATCC 6633) are proposed as a suitable reference biological agent. In order to achieve consistency in the form in which the biological agent in question is presented to the plasma, a polycarbonate membrane loaded with a monolayer of spores is proposed. The advantages of the proposed protocol are evaluated by comparing inactivation tests in which an alternative microorganism (methicillin resistant Staphylococcus aureus—MRSA) and the widely-used sample preparation technique of directly pipetting cell suspensions onto membranes are employed. In all cases, inactivation tests with either UV irradiation or plasma exposure were more reproducible when the proposed protocol was followed.
NASA Technical Reports Server (NTRS)
Antle, John M.; Basso, Bruno; Conant, Richard T.; Godfray, H. Charles J.; Jones, James W.; Herrero, Mario; Howitt, Richard E.; Keating, Brian A.; Munoz-Carpena, Rafael; Rosenzweig, Cynthia; Tittonell, Pablo; Wheeler, Tim R.
2016-01-01
This paper presents ideas for a new generation of agricultural system models that could meet the needs of a growing community of end-users exemplified by a set of Use Cases. We envision new data, models and knowledge products that could accelerate the innovation process needed to achieve the goal of sustainable local, regional and global food security. We identify desirable features for models, and describe some of the potential advances that we envisage for model components and their integration. We propose an implementation strategy that would link a "pre-competitive" space for model development to a "competitive space" for knowledge product development, and through private-public partnerships for new data infrastructure. Specific model improvements would be based on further testing and evaluation of existing models, the development and testing of modular model components and integration, and linkages of model integration platforms to new data management and visualization tools.
A novel tablet computer platform for advanced language mapping during awake craniotomy procedures.
Morrison, Melanie A; Tam, Fred; Garavaglia, Marco M; Golestanirad, Laleh; Hare, Gregory M T; Cusimano, Michael D; Schweizer, Tom A; Das, Sunit; Graham, Simon J
2016-04-01
A computerized platform has been developed to enhance behavioral testing during intraoperative language mapping in awake craniotomy procedures. The system is uniquely compatible with the environmental demands of both the operating room and preoperative functional MRI (fMRI), thus providing standardized testing toward improving spatial agreement between the 2 brain mapping techniques. Details of the platform architecture, its advantages over traditional testing methods, and its use for language mapping are described. Four illustrative cases demonstrate the efficacy of using the testing platform to administer sophisticated language paradigms, and the spatial agreement between intraoperative mapping and preoperative fMRI results. The testing platform substantially improved the ability of the surgeon to detect and characterize language deficits. Use of a written word generation task to assess language production helped confirm areas of speech apraxia and speech arrest that were inadequately characterized or missed with the use of traditional paradigms, respectively. Preoperative fMRI of the analogous writing task was also assistive, displaying excellent spatial agreement with intraoperative mapping in all 4 cases. Sole use of traditional testing paradigms can be limiting during awake craniotomy procedures. Comprehensive assessment of language function will require additional use of more sophisticated and ecologically valid testing paradigms. The platform presented here provides a means to do so.
The clinical implications of mixed lymphocyte reaction with leukemic cells.
Kim, Hee-Je; Kim, Tai-Gyu; Cho, Hyun Il; Han, Hoon; Min, Woo-Sung; Kim, Chun-Choo
2002-11-01
To evaluate the clinical implications of a mixed lymphocyte reaction between leukemic cells and lymphocytes from HLA-matched sibling donors, we attempted to generate donor-derived, graft-versus-leukemia-effective cells and to define their characteristics. We studied 8 patients with chronic myelogenous leukemia (CML), comprising 5 patients in the chronic phase (CP) and 3 patients in the accelerated phase (AP), and 2 patients with acute myelogenous leukemia (AML) in their first complete remission. Cells from these patients were used as stimulators in a mixed lymphocyte reaction. The effects of natural killer (NK) cells and cytotoxic T-lymphocytes (CTLs) were distinguished by cytotoxicity tests against target cells, including K562 cells, the patient's leukemic cells, and phytohemagglutinin (PHA) blasts. Donor-derived antileukemic CTLs against the patient's own leukemic cells could be generated in vitro. The efficacy of generating CTLs against leukemic target cells was (in decreasing order) AML, CML-CP, and CML-AP. Cytotoxic activity against leukemic targets was prominent in 4 cases: 2 CML-CP and the 2 AML cases. By contrast, the 3 CML-AP cases showed low CTL activity. In cases showing 1 positive result among the 3 targets (K562 cells, the patient's leukemic cells, and PHA blasts), the relapse rate was significantly lower (P = .022) on follow-up (median, 33 months; range, 7-40 months) after hematopoietic stem cell transplantation. By a combined analysis of the cytotoxicity effects for all 3 target cells, we were able to demonstrate a correlation between leukemic relapse and the varying degrees of cytotoxicity in the test results. Although the total sample number for this study was low, we speculate that these results may stem from differences in the individual characteristics of the leukemic cells that are in line with their clinical disease status.
Challenges with controlling varicella in prison settings: Experience of California, 2010–2011
Leung, Jessica; Lopez, Adriana S.; Tootell, Elena; Baumrind, Nikki; Mohle-Boetani, Janet; Leistikow, Bruce; Harriman, Kathleen H.; Preas, Christopher P.; Cosentino, Giorgio; Bialek, Stephanie R.; Marin, Mona
2015-01-01
We describe the epidemiology of varicella in one state prison in California during 2010–2011, control measures implemented, and associated costs. Eleven varicella cases were reported, 9 associated with 2 outbreaks. One outbreak consisted of 3 cases and the second consisted of 6 cases with 2 generations of spread. Among exposed inmates serologically tested, 98% (643/656) were VZV sero-positive. The outbreaks resulted in >1,000 inmates exposed, 444 staff exposures, and >$160,000 in costs. We documented the challenges and costs associated with controlling and managing varicella in a prison setting. A screening policy for evidence of varicella immunity for incoming inmates and staff and vaccination of susceptible persons has the potential to mitigate the impact of future outbreaks and reduce resources necessary for managing cases and outbreaks. PMID:25201912
Promoting Creativity in International Business Education: A Protocol for Student-Constructed Cases
ERIC Educational Resources Information Center
Riordan, Diane A.; Sullivan, M. Cathy; Fink, Danny
2003-01-01
Case studies, including "archival cases," "documentary cases," "living cases," and "learner-generated cases," are popular teaching methods in the international business curriculum. In this paper we present a protocol for student-constructed cases, an extension of the learner-generated case, and provide an example using foreign currency exchange…
76 FR 6381 - Fee-Generating Cases
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-04
... LEGAL SERVICES CORPORATION 45 CFR Part 1609 Fee-Generating Cases AGENCY: Legal Services...) proposes to amend the Legal Services Corporation's regulation on fee-generating cases to clarify that it... Counsel, Office of Legal Affairs, Legal Services Corporation, 3333 K Street, NW., Washington, DC 20007...
Parametric vs. non-parametric daily weather generator: validation and comparison
NASA Astrophysics Data System (ADS)
Dubrovsky, Martin
2016-04-01
As climate models (GCMs and RCMs) fail to reproduce the real-world surface weather regime satisfactorily, various statistical methods are applied to downscale GCM/RCM outputs into site-specific weather series. Stochastic weather generators are among the most popular downscaling methods, capable of producing realistic (observation-like) meteorological inputs for agricultural, hydrological, and other impact models used in assessing the sensitivity of various ecosystems to climate change/variability. Among their advantages, the generators may (i) produce arbitrarily long multivariate synthetic weather series representing both present and changed climates (in the latter case, the generators are commonly modified by GCM/RCM-based climate change scenarios), (ii) be run at various time steps and for multiple weather variables (the generators reproduce the correlations among variables), and (iii) be interpolated (and thus run also for sites where no weather data are available to calibrate the generator). This contribution will compare two stochastic daily weather generators in terms of their ability to reproduce various features of daily weather series. M&Rfi is a parametric generator: a Markov chain model is used for precipitation occurrence, precipitation amount is modelled by the Gamma distribution, and a 1st-order autoregressive model is used to generate the non-precipitation surface weather variables. The non-parametric GoMeZ generator is based on the nearest-neighbours resampling technique, making no assumption about the distribution of the variables being generated. Various settings of both weather generators will be assumed in the present validation tests. The generators will be validated in terms of (a) extreme temperature and precipitation characteristics (annual and 30-year extremes and maxima of the duration of hot/cold/dry/wet spells); and (b) selected validation statistics developed within the frame of the VALUE project.
The tests will be based on observational weather series from several European stations available from the ECA&D database.
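The parametric structure described for M&Rfi (Markov-chain precipitation occurrence, Gamma-distributed amounts, AR(1) for the other variables) can be sketched in a few lines of Python; every parameter value below is an illustrative placeholder, not a calibrated M&Rfi coefficient:

```python
import random

# Illustrative parameters (hypothetical, not the paper's calibration)
P_WET_GIVEN_DRY = 0.25   # Markov chain transition probabilities
P_WET_GIVEN_WET = 0.65
GAMMA_SHAPE, GAMMA_SCALE = 0.8, 6.0     # precipitation amount (mm)
AR1_PHI, T_MEAN, T_SD = 0.7, 10.0, 4.0  # AR(1) daily temperature model

def generate(days, seed=42):
    rng = random.Random(seed)
    wet, t_anom = False, 0.0
    series = []
    for _ in range(days):
        # two-state Markov chain for wet/dry occurrence
        p_wet = P_WET_GIVEN_WET if wet else P_WET_GIVEN_DRY
        wet = rng.random() < p_wet
        precip = rng.gammavariate(GAMMA_SHAPE, GAMMA_SCALE) if wet else 0.0
        # AR(1) update; noise scaled so the stationary std dev is T_SD
        t_anom = AR1_PHI * t_anom + rng.gauss(0.0, T_SD * (1 - AR1_PHI**2) ** 0.5)
        series.append((precip, T_MEAN + t_anom))
    return series

series = generate(365)
print(sum(1 for p, _ in series if p > 0), "wet days of", len(series))
```

A real generator would add seasonality and cross-correlations between variables; the sketch only shows the chain-plus-distribution skeleton.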
Teaching clinical reasoning: case-based and coached.
Kassirer, Jerome P
2010-07-01
Optimal medical care is critically dependent on clinicians' skills to make the right diagnosis and to recommend the most appropriate therapy, and acquiring such reasoning skills is a key requirement at every level of medical education. Teaching clinical reasoning is grounded in several fundamental principles of educational theory. Adult learning theory posits that learning is best accomplished by repeated, deliberate exposure to real cases, that case examples should be selected for their reflection of multiple aspects of clinical reasoning, and that the participation of a coach augments the value of an educational experience. The theory proposes that memory of clinical medicine and clinical reasoning strategies is enhanced when errors in information, judgment, and reasoning are immediately pointed out and discussed. Rather than using cases artificially constructed from memory, real cases are greatly preferred because they often reflect the false leads, the polymorphisms of actual clinical material, and the misleading test results encountered in everyday practice. These concepts foster the teaching and learning of the diagnostic process, the complex trade-offs between the benefits and risks of diagnostic tests and treatments, and cognitive errors in clinical reasoning. The teaching of clinical reasoning need not and should not be delayed until students gain a full understanding of anatomy and pathophysiology. Concepts such as hypothesis generation, pattern recognition, context formulation, diagnostic test interpretation, differential diagnosis, and diagnostic verification provide both the language and the methods of clinical problem solving. Expertise is attainable even though the precise mechanisms of achieving it are not known.
NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-01-01
This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.
Rapidly shifting environmental baselines among fishers of the Gulf of California
Sáenz-Arroyo, Andrea; Roberts, Callum M; Torre, Jorge; Cariño-Olvera, Micheline; Enríquez-Andrade, Roberto R
2005-01-01
Shifting environmental baselines are inter-generational changes in perception of the state of the environment. As one generation replaces another, people's perceptions of what is natural change even to the extent that they no longer believe historical anecdotes of past abundance or size of species. Although widely accepted, this phenomenon has yet to be quantitatively tested. Here we survey three generations of fishers from Mexico's Gulf of California (N=108), where fish populations have declined steeply over the last 60 years, to investigate how far and fast their environmental baselines are shifting. Compared to young fishers, old fishers named five times as many species and four times as many fishing sites as once being abundant/productive but now depleted (Kruskal–Wallis tests, both p<0.001) with no evidence of a slowdown in rates of loss experienced by younger compared to older generations (Kruskal–Wallis test, n.s. in both cases). Old fishers caught up to 25 times as many Gulf grouper Mycteroperca jordani as young fishers on their best ever fishing day (regression r2=0.62, p<0.001). Despite times of plentiful large fish still being within living memory, few young fishers appreciated that large species had ever been common or nearshore sites productive. Such rapid shifts in perception of what is natural help explain why society is tolerant of the creeping loss of biodiversity. They imply a large educational hurdle in efforts to reset expectations and targets for conservation. PMID:16191603
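The generational comparisons above rest on Kruskal–Wallis tests. As a rough illustration of the statistic involved, here is a stdlib-only sketch (no tie-correction factor; the survey counts are hypothetical, not the study's data):

```python
from itertools import chain

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction), stdlib only."""
    data = sorted(chain.from_iterable(groups))
    # assign each distinct value the average of the ranks it occupies
    ranks, i = {}, 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        ranks[data[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    n = len(data)
    s = sum(sum(ranks[x] for x in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * s - 3 * (n + 1)

# Hypothetical counts of depleted species named per fisher, by generation
young = [1, 2, 2, 3, 1]
middle = [4, 5, 3, 6, 4]
old = [9, 12, 10, 15, 11]
h = kruskal_h(young, middle, old)
print(round(h, 2))
```

A large H (compared against a chi-squared distribution with groups−1 degrees of freedom) rejects the hypothesis that the generations share the same distribution.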
PM, carbon, and PAH emissions from a diesel generator fuelled with soy-biodiesel blends.
Tsai, Jen-Hsiung; Chen, Shui-Jen; Huang, Kuo-Lin; Lin, Yuan-Chung; Lee, Wen-Jhy; Lin, Chih-Chung; Lin, Wen-Yinn
2010-07-15
Biodiesels have received increasing attention as alternative fuels for diesel engines and generators. This study investigates the emissions of particulate matter (PM), total carbon (TC), i.e., organic/elemental carbon, and polycyclic aromatic hydrocarbons (PAHs) from a diesel generator fuelled with soy-biodiesel blends. Among the tested diesel blends (B0, B10 (10 vol% soy-biodiesel), B20, and B50), B20 exhibited the lowest PM emission concentration regardless of load (except the 5 kW case), whereas B10 displayed lower PM emission factors than the other fuel blends when operating at 0 and 10 kW. The emission concentrations and factors of EC, OC, and TC were lowest when B10 or B20 was used, regardless of the load. Under all tested loads, the average concentrations of total PAHs emitted from the generator using B10 and B20 were lower (by 38% and 28%, respectively) than those using pure petroleum diesel fuel (B0), while the emission factors of total PAHs decreased with an increasing ratio of biodiesel to premium diesel. With increasing load, although the brake-specific fuel consumption decreased, the energy efficiency increased regardless of the bio/petroleum diesel ratio. Therefore, soy-biodiesel is promising for use as an alternative fuel for diesel generators to increase energy efficiency and reduce PM, carbon, and PAH emissions. 2010 Elsevier B.V. All rights reserved.
Lange, M; Siemen, H; Blome, S; Thulke, H-H
2014-11-15
African swine fever (ASF) is a highly lethal viral disease of domestic pigs and wild boar. ASF was introduced into the southern Russian Federation in 2007 and is now reported to be spreading in populations of wild and domestic suids. An endemic situation in the local wild boar population would significantly complicate management of the disease in the livestock population. To date no sound method exists for identifying the characteristic pattern of an endemic situation, which describes infection persisting from generation to generation in the same population. To support urgent management decisions at the wildlife-livestock interface, a new algorithm was constructed to test the hypothesis of an endemic disease situation in wildlife on the basis of case reports. The approach described here uses spatial and temporal associations between observed diagnostic data to discriminate between endemic and non-endemic patterns of case occurrence. The algorithm was validated with data from an epidemiological simulation model and applied to ASF case data from southern Russia. Based on the algorithm and the diagnostic data available, the null hypothesis of an endemic situation of ASF in wild boar of the region was rejected. Copyright © 2014 Elsevier B.V. All rights reserved.
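One simple way to operationalize "spatial and temporal associations between observed diagnostic data" is to link case reports that fall within an assumed distance radius and generation interval and look for long chains of linked cases. This toy sketch is not the authors' algorithm; all thresholds and coordinates below are hypothetical:

```python
from math import hypot

# Hypothetical case reports: (x_km, y_km, day)
cases = [(0, 0, 0), (3, 1, 40), (5, 2, 95), (6, 1, 150), (40, 40, 10)]

MAX_DIST_KM = 10.0   # assumed spatial association radius
MAX_GAP_DAYS = 90    # assumed generation interval

def longest_chain(cases):
    """Length of the longest spatio-temporally linked chain of cases."""
    cases = sorted(cases, key=lambda c: c[2])  # order by report day
    best = [1] * len(cases)
    for i, (xi, yi, ti) in enumerate(cases):
        for j in range(i):
            xj, yj, tj = cases[j]
            if 0 < ti - tj <= MAX_GAP_DAYS and hypot(xi - xj, yi - yj) <= MAX_DIST_KM:
                best[i] = max(best[i], best[j] + 1)
    return max(best)

print(longest_chain(cases))  # long chains suggest generation-to-generation persistence
```

The isolated report at (40, 40) never links, while the four nearby reports form a four-case chain, the kind of pattern an endemicity test would weigh against sporadic introductions.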
Compost in plant microbial fuel cell for bioelectricity generation.
Moqsud, M A; Yoshitake, J; Bushra, Q S; Hyodo, M; Omine, K; Strik, David
2015-02-01
Recycling of organic waste is an important topic in developing as well as developed countries. Compost from organic waste has been used as a soil conditioner. In this study, an experiment was carried out to produce green energy (bioelectricity) using paddy plant microbial fuel cells (PMFCs) in soil mixed with compost. Six buckets filled with the same soil were used, with carbon fiber as the electrodes. Rice plants were planted in five of the buckets, the sixth bucket contained only soil, and an external resistance of 100 ohm was used in all cases. The cells with rice plants and compost showed higher voltage and power density over time. The highest voltage, around 700 mV, was observed when a rice plant with 1% compost-mixed soil was used; it was more than 95% lower in the case with no rice plant and no compost. Comparing cases with and without compost but with the same number of rice plants, the cases with compost showed up to 2 times higher voltage. The power density was also 3 times higher when compost was used in the paddy PMFCs, indicating the influence of compost on bioelectricity generation. Copyright © 2014 Elsevier Ltd. All rights reserved.
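Since the voltage was measured across a fixed 100-ohm external resistance, power density follows from P = V²/R divided by the electrode area. A minimal sketch; the electrode area is an assumed placeholder, as the abstract does not give it:

```python
# Power density from the cell voltage measured across the external resistor.
R_EXT = 100.0          # external resistance, ohm (as in the experiment)
AREA_M2 = 0.01         # assumed projected electrode area, m^2 (placeholder)

def power_density_mw_m2(voltage_mv):
    v = voltage_mv / 1000.0          # mV -> V
    p_w = v * v / R_EXT              # P = V^2 / R
    return p_w / AREA_M2 * 1000.0    # W -> mW, normalized per m^2

high = power_density_mw_m2(700)  # compost + rice plant case
low = power_density_mw_m2(35)    # ~95% lower voltage, no plant/compost
print(high, low)
```

Because power scales with the square of voltage, the roughly 20-fold voltage gap corresponds to a several-hundred-fold gap in power density.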
LEWICE3D/GlennHT Particle Analysis of the Honeywell Al502 Low Pressure Compressor
NASA Technical Reports Server (NTRS)
Bidwell, Colin S.; Rigby, David L.
2015-01-01
A flow and ice particle trajectory analysis was performed for the booster of the Honeywell AL502 engine. The analysis focused on two closely related conditions, one of which produced a rollback during testing in the Propulsion Systems Lab at NASA Glenn Research Center and one of which did not. The flow analysis was generated using the NASA Glenn GlennHT flow solver, and the particle analysis was generated using the NASA Glenn LEWICE3D v3.56 ice accretion software. The flow and particle analysis used a 3D steady-flow, mixing-plane approach to model the transport of flow and particles through the engine. The inflow conditions for the rollback case were: airspeed, 145 m/s; static pressure, 33,373 Pa; static temperature, 253.3 K. The inflow conditions for the non-rollback case were: airspeed, 153 m/s; static pressure, 34,252 Pa; static temperature, 260.1 K. Both cases were subjected to an ice particle cloud with a median volume diameter of 24 microns, an ice water content of 2.0 g/m3, and a relative humidity of 100 percent. The most significant difference between the rollback and non-rollback conditions was the inflow static temperature, which was 6.8 K higher for the non-rollback case.
A random approach of test macro generation for early detection of hotspots
NASA Astrophysics Data System (ADS)
Lee, Jong-hyun; Kim, Chin; Kang, Minsoo; Hwang, Sungwook; Yang, Jae-seok; Harb, Mohammed; Al-Imam, Mohamed; Madkour, Kareem; ElManhawy, Wael; Kwan, Joe
2016-03-01
Multiple-Patterning Technology (MPT) is still the preferred choice over EUV for the advanced technology nodes, starting at the 20 nm node. On the way down to the 7 nm and 5 nm nodes, Self-Aligned Multiple Patterning (SAMP) appears to be one of the more effective multiple-patterning techniques in terms of achieving a small pitch of printed lines on wafer, yet its yield is in question. Predicting and enhancing the yield in the early stages of technology development are among the main objectives for creating test macros on test masks. While conventional yield-ramp techniques for a new technology node have relied on using designs from previous technology nodes as a starting point to identify patterns for Design of Experiment (DoE) creation, these techniques are challenging to apply when introducing an MPT technique like SAMP that did not exist in previous nodes. This paper presents a new strategy for generating test structures based on random placement of unit patterns that can construct more meaningful, bigger patterns. Specifications governing the relationships between those unit patterns can be adjusted to generate layout clips that look like realistic SAMP designs. A via chain can be constructed to connect the random DoE of SAMP structures through a routing layer to external pads for electrical measurement. These clips are decomposed according to the decomposition rules of the technology into the appropriate mandrel and cut masks. The decomposed clips can be tested through simulations, or electrically on silicon, to discover hotspots. The hotspots can be used in optimizing the fabrication process and models to fix them. They can also be used as learning patterns for DFM deck development. By expanding the size of the randomly generated test structures, more hotspots can be detected. This should provide a faster way to enhance the yield of a new technology node.
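A toy version of the random test-macro idea: place unit patterns drawn at random along a track, subject to a minimum-spacing specification, the way the flow composes bigger test structures from unit patterns. Pattern names, widths, and spacing rules below are all hypothetical:

```python
import random

# Hypothetical unit-pattern library (widths in nm) and spacing rule
UNIT_PATTERNS = {"line": 20, "hammerhead": 36, "jog": 28}
MIN_SPACE = 24
TRACK_LEN = 400

def random_macro(seed):
    """Randomly tile a 1D track with unit patterns, honoring MIN_SPACE."""
    rng = random.Random(seed)
    x, placed = 0, []
    while True:
        name = rng.choice(list(UNIT_PATTERNS))
        w = UNIT_PATTERNS[name]
        if x + w > TRACK_LEN:   # next pattern would overrun the track
            break
        placed.append((name, x))
        x += w + MIN_SPACE
    return placed

macro = random_macro(7)
print(len(macro), macro[:3])
```

A real flow would work in 2D, check decomposition rules for the mandrel and cut masks, and thread a via chain through the placed structures; the sketch only shows the constrained random placement step.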
A moist aquaplanet variant of the Held–Suarez test for atmospheric model dynamical cores
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thatcher, Diana R.; Jablonowski, Christiane
A moist idealized test case (MITC) for atmospheric model dynamical cores is presented. The MITC is based on the Held–Suarez (HS) test that was developed for dry simulations on “a flat Earth” and replaces the full physical parameterization package with a Newtonian temperature relaxation and Rayleigh damping of the low-level winds. This new variant of the HS test includes moisture and thereby sheds light on the nonlinear dynamics–physics moisture feedbacks without the complexity of full-physics parameterization packages. In particular, it adds simplified moist processes to the HS forcing to model large-scale condensation, boundary-layer mixing, and the exchange of latent and sensible heat between the atmospheric surface and an ocean-covered planet. Using a variety of dynamical cores of the National Center for Atmospheric Research (NCAR)'s Community Atmosphere Model (CAM), this paper demonstrates that the inclusion of the moist idealized physics package leads to climatic states that closely resemble aquaplanet simulations with complex physical parameterizations. This establishes that the MITC approach generates reasonable atmospheric circulations and can be used for a broad range of scientific investigations. This paper provides examples of two application areas. First, the test case reveals the characteristics of the physics–dynamics coupling technique and reproduces coupling issues seen in full-physics simulations. In particular, it is shown that sudden adjustments of the prognostic fields due to moist physics tendencies can trigger undesirable large-scale gravity waves, which can be remedied by a more gradual application of the physical forcing. Second, the moist idealized test case can be used to intercompare dynamical cores. These examples demonstrate the versatility of the MITC approach and suggestions are made for further application areas. Furthermore, the new moist variant of the HS test can be considered a test case of intermediate complexity.
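The two idealized forcings retained from the HS test, Newtonian temperature relaxation and Rayleigh wind damping, can be written as dT/dt = -(T - T_eq)/tau and dv/dt = -k*v. A forward-Euler sketch with illustrative coefficients (not the paper's values):

```python
# Forward-Euler sketch of the two HS-style idealized forcings:
# Newtonian relaxation of temperature and Rayleigh damping of low-level wind.
# tau/k values below are illustrative, not the paper's coefficients.
TAU_DAYS = 40.0      # temperature relaxation time scale
K_RAYLEIGH = 1.0     # wind damping rate near the surface, 1/day
DT = 0.05            # time step, days

def step(temp_k, t_eq_k, wind_ms):
    temp_k += -DT * (temp_k - t_eq_k) / TAU_DAYS   # dT/dt = -(T - T_eq)/tau
    wind_ms += -DT * K_RAYLEIGH * wind_ms          # dv/dt = -k v
    return temp_k, wind_ms

temp, wind = 280.0, 20.0
for _ in range(200):                               # 10 simulated days
    temp, wind = step(temp, 255.0, wind)
print(round(temp, 1), round(wind, 4))
```

Both prognostic fields relax exponentially toward their targets; the MITC keeps this skeleton and adds simplified moist processes (condensation, boundary-layer mixing, surface fluxes) on top of it.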
Springer, Jan; White, P Lewis; Hamilton, Shanna; Michel, Denise; Barnes, Rosemary A; Einsele, Hermann; Löffler, Juergen
2016-03-01
Standardized methodologies for the molecular detection of invasive aspergillosis (IA) have been established by the European Aspergillus PCR Initiative for the testing of whole blood, serum, and plasma. While some comparison of the performance of Aspergillus PCR when testing these different sample types has been performed, no single study has evaluated all three using the recommended protocols. Standardized Aspergillus PCR was performed on 423 whole-blood pellets (WBP), 583 plasma samples, and 419 serum samples obtained from hematology patients according to the recommendations. This analysis formed a bicenter retrospective anonymous case-control study, with diagnosis according to the revised European Organization for Research and Treatment of Cancer/Invasive Fungal Infections Cooperative Group and National Institute of Allergy and Infectious Diseases Mycoses Study Group (EORTC/MSG) consensus definitions (11 probable cases and 36 controls). Values for clinical performance using individual and combined samples were calculated. For all samples, PCR positivity was significantly associated with cases of IA (for plasma, P = 0.0019; for serum, P = 0.0049; and for WBP, P = 0.0089). Plasma PCR generated the highest sensitivity (91%); the sensitivities for serum and WBP PCR were 80% and 55%, respectively. The highest specificity was achieved when testing WBP (96%), which was significantly superior to the specificities achieved when testing serum (69%, P = 0.0238) and plasma (53%, P = 0.0002). No cases were PCR negative in all specimen types, and no controls were PCR positive in all specimens. This study confirms that Aspergillus PCR testing of plasma provides robust performance while utilizing commercial automated DNA extraction processes. Combining PCR testing of different blood fractions allows IA to be both confidently diagnosed and excluded. A requirement for multiple PCR-positive plasma samples provides similar diagnostic utility and is technically less demanding. 
Time to diagnosis may be enhanced by testing multiple contemporaneously obtained sample types. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
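Sensitivity and specificity in such a case-control design are simple ratios of the 2x2 counts. The worked example below back-calculates plausible plasma counts from the reported 91%/53% figures and the 11-case/36-control totals; the exact cell counts are an inference, not taken from the paper:

```python
# Sensitivity/specificity from case-control counts. The plasma cell counts
# are inferred from the reported percentages (91% of 11 cases ~ 10 positive;
# 53% of 36 controls ~ 19 negative), not quoted from the study.
def sens_spec(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sens_spec(tp=10, fn=1, tn=19, fp=17)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")
```

The same two-line calculation applies to the serum and whole-blood-pellet fractions, which is how the abstract's per-sample-type figures compare.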
Verifying Digital Components of Physical Systems: Experimental Evaluation of Test Quality
NASA Astrophysics Data System (ADS)
Laputenko, A. V.; López, J. E.; Yevtushenko, N. V.
2018-03-01
This paper continues the study of high-quality test derivation for verifying digital components used in various physical systems, such as sensors and data transfer components. For the experimental evaluation we used logic circuits b01-b10 of the ITC'99 benchmark package (Second Release), which, as stated before, describe digital components of physical systems designed for various applications. Test sequences are derived for detecting the best-known faults of the reference logic circuit using three different approaches to test derivation. Three widely used fault types, namely stuck-at faults, bridges, and faults which slightly modify the behavior of one gate, are considered as possible faults of the reference behavior. The most interesting test sequences are short sequences that can provide appropriate guarantees after testing; thus, we experimentally study various approaches to the derivation of so-called complete test suites, which detect all fault types. In the first series of experiments, we compare two approaches for deriving complete test suites. In the first approach, a shortest test sequence is derived for testing each fault. In the second approach, a test sequence is pseudo-randomly generated using appropriate software for logic synthesis and verification (the ABC system in our study) and thus can be longer. However, after deleting sequences that detect the same set of faults, the test suite returned by the second approach is shorter. The latter underlines the fact that in many cases it is useless to spend time and effort deriving a shortest distinguishing sequence; it is better to apply test minimization afterwards. The performed experiments also show that using only randomly generated test sequences is not very efficient, since such sequences do not detect all the faults of any type. After reaching a fault coverage of around 70%, saturation is observed, and the fault coverage cannot be increased any more.
For deriving high-quality short test suites, the approach combining randomly generated sequences with sequences aimed at detecting the faults not detected by the random tests allows good fault coverage to be reached using the shortest test sequences.
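The post-hoc minimization step, "deleting sequences that detect the same set of faults", is essentially a greedy set cover over per-sequence fault sets. A small sketch with hypothetical sequence names and fault identifiers:

```python
# Greedy minimization: keep a sequence only if it detects at least one fault
# not covered by the sequences kept so far. Fault sets are hypothetical.
def minimize(suite):
    covered, kept = set(), []
    # visit larger fault sets first so greedy keeps the strongest sequences
    for name, faults in sorted(suite.items(), key=lambda kv: -len(kv[1])):
        if not faults <= covered:        # contributes at least one new fault
            kept.append(name)
            covered |= faults
    return kept, covered

suite = {
    "t1": {"sa0_g1", "bridge_g2"},
    "t2": {"sa0_g1"},                    # subsumed by t1
    "t3": {"gate_mod_g3", "sa1_g4"},
    "t4": {"bridge_g2", "sa1_g4"},       # subsumed by t1 and t3 together
}
kept, covered = minimize(suite)
print(sorted(kept), len(covered))
```

Greedy cover is not guaranteed minimal (set cover is NP-hard), but it captures why a longer randomly generated suite can shrink below a suite of individually shortest sequences after redundant members are dropped.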
Simultaneous Detection of Multiple Disease States.
1990-02-14
analytes to be assayed in a panel format. And due to its simplicity, OIA has been demonstrated to be generally applicable to a wide range of testing...into two distinct formats on the basis of signal generation: visual and instrumented. In both cases monocrystalline silicon wafers are employed as...Due to the limited surface area available on the monocrystalline silicon wafers, attention must be paid to efficient immobilization to ensure
Satellite angular velocity estimation based on star images and optical flow techniques.
Fasano, Giancarmine; Rufino, Giancarlo; Accardo, Domenico; Grassi, Michele
2013-09-25
An optical flow-based technique is proposed to estimate spacecraft angular velocity based on sequences of star-field images. It does not require star identification and can thus be used to deliver angular rate information even when attitude determination is not possible, such as during platform detumbling or slewing. Region-based optical flow calculation is carried out on successive star images preprocessed to remove background. Sensor calibration parameters, the Poisson equation, and a least-squares method are then used to estimate the angular velocity vector components in the sensor rotating frame. A theoretical error budget is developed to estimate the expected angular rate accuracy as a function of camera parameters and star distribution in the field of view. The effectiveness of the proposed technique is tested using star-field scenes generated by a hardware-in-the-loop testing facility and acquired by a commercial off-the-shelf camera sensor. Simulated cases comprise rotations at different rates. Experimental results are presented which are consistent with theoretical estimates. In particular, very accurate angular velocity estimates are generated at lower slew rates, while in all cases the achievable accuracy in the estimation of the angular velocity component along the boresight is about one order of magnitude worse than for the other two components.
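The least-squares step can be illustrated as follows: in the rotating sensor frame an inertially fixed star direction s satisfies ds/dt = -omega x s = skew(s) @ omega, which is linear in omega, so stacking several stars gives an overdetermined system solvable via normal equations. A self-contained sketch on synthetic, noise-free data (real processing would start from the measured optical-flow vectors, not exact direction rates):

```python
def cross(a, b):
    return [a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0]]

def skew(s):
    """Matrix S with S @ w == s x w."""
    return [[0, -s[2], s[1]], [s[2], 0, -s[0]], [-s[1], s[0], 0]]

def solve3(m, v):
    """Solve a 3x3 system by Gauss-Jordan elimination with partial pivoting."""
    a = [row[:] + [v[i]] for i, row in enumerate(m)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(a[r][c]))
        a[c], a[p] = a[p], a[c]
        for r in range(3):
            if r != c:
                f = a[r][c] / a[c][c]
                a[r] = [x - f * y for x, y in zip(a[r], a[c])]
    return [a[i][3] / a[i][i] for i in range(3)]

omega_true = [0.01, -0.02, 0.005]                 # rad/s, synthetic truth
stars = [[1, 0, 0], [0, 1, 0], [0, 0, 1], [0.6, 0.8, 0]]
rates = [cross(s, omega_true) for s in stars]     # ds/dt = s x omega

# normal equations: (sum S_i^T S_i) omega = sum S_i^T d_i, with S_i = skew(s_i)
ata = [[0.0] * 3 for _ in range(3)]
atb = [0.0] * 3
for s, d in zip(stars, rates):
    S = skew(s)
    for i in range(3):
        atb[i] += sum(S[r][i] * d[r] for r in range(3))
        for j in range(3):
            ata[i][j] += sum(S[r][i] * S[r][j] for r in range(3))

omega_est = solve3(ata, atb)
print([round(w, 4) for w in omega_est])
```

With noise-free rates the estimate recovers the true vector to machine precision; with real measurements, the residual of the same least-squares fit feeds the error budget.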
Goddard, Katrina A B; Whitlock, Evelyn P; Berg, Jonathan S; Williams, Marc S; Webber, Elizabeth M; Webster, Jennifer A; Lin, Jennifer S; Schrader, Kasmintan A; Campos-Outcalt, Doug; Offit, Kenneth; Feigelson, Heather Spencer; Hollombe, Celine
2013-09-01
The aim of this study was to develop, operationalize, and pilot test a transparent, reproducible, and evidence-informed method to determine when to report incidental findings from next-generation sequencing technologies. Using evidence-based principles, we proposed a three-stage process. Stage I "rules out" incidental findings below a minimal threshold of evidence and is evaluated using inter-rater agreement and comparison with an expert-based approach. Stage II documents criteria for clinical actionability using a standardized approach to allow experts to consistently consider and recommend whether results should be routinely reported (stage III). We used expert opinion to determine the face validity of stages II and III using three case studies. We evaluated the time and effort for stages I and II. For stage I, we assessed 99 conditions and found high inter-rater agreement (89%), and strong agreement with a separate expert-based method. Case studies for familial adenomatous polyposis, hereditary hemochromatosis, and α1-antitrypsin deficiency were all recommended for routine reporting as incidental findings. The method requires <3 days per topic. We establish an operational definition of clinically actionable incidental findings and provide documentation and pilot testing of a feasible method that is scalable to the whole genome.
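Stage I's inter-rater agreement can be summarized by raw percent agreement (the 89% the study reports) and optionally adjusted for chance via Cohen's kappa. A sketch with hypothetical rule-in/rule-out calls from two raters (1 = above the minimal evidence threshold):

```python
# Percent agreement and Cohen's kappa for two raters' binary calls.
# The ratings below are hypothetical, not the study's data.
def agreement_and_kappa(r1, r2):
    n = len(r1)
    agree = sum(a == b for a, b in zip(r1, r2)) / n
    # chance agreement from each rater's marginal "rule-in" proportion
    p1, p2 = sum(r1) / n, sum(r2) / n
    p_e = p1 * p2 + (1 - p1) * (1 - p2)
    return agree, (agree - p_e) / (1 - p_e)

rater1 = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
rater2 = [1, 1, 0, 0, 1, 0, 0, 0, 0, 1]
obs, kappa = agreement_and_kappa(rater1, rater2)
print(obs, round(kappa, 2))
```

Kappa discounts the agreement two raters would reach by chance alone, which matters when most conditions fall on one side of the threshold.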
Liu, Xiaofeng; Bai, Fang; Ouyang, Sisheng; Wang, Xicheng; Li, Honglin; Jiang, Hualiang
2009-03-31
Conformation generation is a ubiquitous problem in molecular modelling. Many applications require sampling the broad molecular conformational space or perceiving the bioactive conformers to ensure success. Numerous in silico methods have been proposed in an attempt to resolve the problem, ranging from deterministic to non-deterministic and from systematic to stochastic ones. In this work, we describe an efficient conformation sampling method named Cyndi, which is based on a multi-objective evolutionary algorithm (MOEA). The conformational perturbation is subjected to evolutionary operations on the genome encoded with dihedral torsions. Various objectives are designated to render the generated Pareto-optimal conformers energy-favoured as well as evenly scattered across the conformational space. An optional objective concerning the degree of molecular extension is added to achieve geometrically extended or compact conformations, which have been observed to impact molecular bioactivity (J Comput-Aided Mol Des 2002, 16: 105-112). Testing the performance of Cyndi against a test set consisting of 329 small molecules reveals an average minimum RMSD of 0.864 Å to the corresponding bioactive conformations, indicating Cyndi is highly competitive against other conformation generation methods. Meanwhile, the high-speed performance (0.49 +/- 0.18 seconds per molecule) renders Cyndi a practical toolkit for conformational database preparation and facilitates subsequent pharmacophore mapping or rigid docking. A copy of the precompiled executable of Cyndi and the test set molecules in mol2 format are accessible in Additional file 1. On the basis of the MOEA algorithm, we present a new, highly efficient conformation generation method, Cyndi, and report the results of validation and performance studies comparing it with four other methods.
The results reveal that Cyndi is capable of generating geometrically diverse conformers and outperforms the four other multiple-conformer generators in reproducing the bioactive conformations of the 329 structures. The speed advantage indicates Cyndi is a powerful alternative method for extensive conformational sampling and large-scale conformer database preparation.
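The Pareto-optimality notion at the heart of an MOEA reduces to a dominance test over objective vectors. A minimal sketch with two minimized objectives and hypothetical conformer scores (Cyndi's actual objectives and energy values are not reproduced here):

```python
# Pareto filtering over two minimized objectives, e.g. relative energy and a
# redundancy score penalizing conformers close to already-sampled geometry.
# All scores are hypothetical.
def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

candidates = [(0.0, 0.9), (1.2, 0.1), (0.5, 0.5), (2.0, 0.8), (1.3, 0.2)]
front = sorted(pareto_front(candidates))
print(front)
```

An MOEA evolves the torsion genome so that the surviving population approximates this front, yielding conformers that are simultaneously low in energy and well spread over conformational space.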
NASA Astrophysics Data System (ADS)
Béguin, A.; Nicolet, C.; Hell, J.; Moreira, C.
2017-04-01
The paper explores the improvement in ancillary services that variable-speed technologies can provide, for the case of an existing 2x210 MVA pumped storage power plant whose conversion from fixed speed to variable speed is investigated with a focus on the power step performance of the units. First, two variable-speed motor-generator technologies are introduced, namely the Doubly Fed Induction Machine (DFIM) and the Full Scale Frequency Converter (FSFC). Then a detailed numerical simulation model of the investigated power plant, comprising the waterways, the pump-turbine unit, the motor-generator, the grid connection and the control systems, is presented and used to simulate the power step response. Hydroelectric system time-domain simulations are performed to determine the shortest achievable response time, taking into account the constraints from the maximum penstock pressure and from the rotational speed limits. It is shown that the maximum instantaneous power step response, up and down, depends on the hydro-mechanical characteristics of the pump-turbine unit and on the motor-generator speed limits. As a result, for the investigated test case, the FSFC solution offers the best power step response performance.
Grid generation and adaptation via Monge-Kantorovich optimization in 2D and 3D
NASA Astrophysics Data System (ADS)
Delzanno, Gian Luca; Chacon, Luis; Finn, John M.
2008-11-01
In a recent paper [1], Monge-Kantorovich (MK) optimization was proposed as a method of grid generation/adaptation in two dimensions (2D). The method is based on the minimization of the L2 norm of grid point displacement, constrained to producing a given positive-definite cell volume distribution (equidistribution constraint). The procedure gives rise to the Monge-Ampère (MA) equation: a single, non-linear scalar equation with no free parameters. The MA equation was solved in Ref. [1] with the Jacobian-Free Newton-Krylov technique and several challenging test cases were presented in square domains in 2D. Here, we extend the work of Ref. [1]. We first formulate the MK approach in physical domains with curved boundary elements and in 3D. We then show the results of applying it to these more general cases. We show that MK optimization produces optimal grids in which the constraint is satisfied numerically to truncation error. [1] G.L. Delzanno, L. Chacón, J.M. Finn, Y. Chung, G. Lapenta, A new, robust equidistribution method for two-dimensional grid generation, submitted to Journal of Computational Physics (2008).
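The equidistribution constraint behind the MK formulation is easiest to see in one dimension, where it reduces to placing nodes so that every cell carries equal mass of a prescribed density. The sketch below illustrates only that 1-D special case, with an invented density and resolution; it is not the paper's MA solver.

```python
# 1-D illustration of the equidistribution constraint underlying the
# Monge-Kantorovich approach: place grid nodes so each cell holds equal
# "mass" of a prescribed density rho(x) on [0, 1]. Pure-Python trapezoid
# version; density and resolution here are illustrative assumptions.

def equidistribute(rho, n_cells, n_fine=10001):
    # Cumulative mass of rho on [0,1] via the trapezoid rule on a fine grid
    h = 1.0 / (n_fine - 1)
    xs = [i * h for i in range(n_fine)]
    cum = [0.0]
    for i in range(1, n_fine):
        cum.append(cum[-1] + 0.5 * (rho(xs[i - 1]) + rho(xs[i])) * h)
    total = cum[-1]
    # Invert the cumulative map: node k sits where cum(x) = (k/n) * total
    nodes, j = [0.0], 0
    for k in range(1, n_cells):
        target = total * k / n_cells
        while cum[j + 1] < target:
            j += 1
        frac = (target - cum[j]) / (cum[j + 1] - cum[j])  # linear interp
        nodes.append(xs[j] + frac * h)
    nodes.append(1.0)
    return nodes

if __name__ == "__main__":
    # Density increasing toward x = 1 pulls the nodes toward x = 1
    grid = equidistribute(lambda x: 1.0 + 9.0 * x, 4)
    print([round(g, 3) for g in grid])
```

In 2-D and 3-D the same constraint cannot be satisfied by 1-D inversion, which is what leads to the Monge-Ampère equation.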
Evaluation of the entropy consistent euler flux on 1D and 2D test problems
NASA Astrophysics Data System (ADS)
Roslan, Nur Khairunnisa Hanisah; Ismail, Farzad
2012-06-01
Most CFD simulations yield good predictions of pressure and velocity when compared to experimental data. Unfortunately, these results will most likely not adhere to the second law of thermodynamics, thus compromising the authenticity of the predicted data. Currently, the test of a good CFD code is to check how much entropy is generated in a smooth flow and to hope that the numerical entropy produced is of the correct sign when a shock is encountered. Herein, a shock-capturing code written in C++ based on a recent entropy-consistent Euler flux is developed to simulate 1D and 2D flows. Unlike other finite volume schemes in commercial CFD codes, this entropy-consistent (EC) flux function precisely satisfies the discrete second law of thermodynamics. The EC flux has an entropy-conserving part, preserving entropy for smooth flows, and a numerical diffusion part that produces the proper amount of entropy, consistent with the second law. Several numerical simulations with the entropy-consistent flux have been run on two-dimensional test cases. The first case is a Mach 3 flow over a forward-facing step. The second case is a flow over a NACA 0012 airfoil, while the third is a hypersonic flow past a 2D cylinder. Local flow quantities such as velocity and pressure are analyzed and compared mainly with the Roe flux. The results show that the EC flux does not capture the unphysical rarefaction shock, unlike the Roe flux, and does not easily succumb to the carbuncle phenomenon. In addition, the EC flux maintains good performance in cases where the Roe flux is known to be superior.
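The entropy-conservative-plus-dissipation structure described above can be illustrated on the scalar Burgers equation rather than the full Euler system. The two-point flux below uses Tadmor's entropy-conservative form for the entropy u^2/2 with Rusanov-type dissipation added; it is a hedged scalar analogue, not the paper's Euler flux.

```python
# Scalar sketch of the entropy-consistent flux idea, using Burgers'
# equation u_t + (u^2/2)_x = 0 instead of the full Euler system.
# The entropy-conservative two-point flux for the entropy u^2/2 is
# f*(uL, uR) = (uL^2 + uL*uR + uR^2)/6 (Tadmor); adding upwind-type
# dissipation makes it entropy-stable. The coefficients are standard,
# but the overall setup is illustrative, not the paper's Euler scheme.

def flux_ec(uL, uR):
    """Entropy-conservative Burgers flux."""
    return (uL * uL + uL * uR + uR * uR) / 6.0

def flux_es(uL, uR):
    """Entropy-stable flux: EC part plus Rusanov-type dissipation."""
    a = max(abs(uL), abs(uR))          # local wave-speed estimate
    return flux_ec(uL, uR) - 0.5 * a * (uR - uL)

if __name__ == "__main__":
    # Consistency check: for uL == uR == u both fluxes reduce to u^2/2
    print(flux_ec(2.0, 2.0), flux_es(2.0, 2.0))  # -> 2.0 2.0
```

For smooth data the dissipation term vanishes to leading order, mirroring the entropy-preserving behaviour the abstract describes.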
Yan, Song; Li, Yun
2014-02-15
Despite its great capability to detect rare variant associations, next-generation sequencing is still prohibitively expensive when applied to large samples. In case-control studies, it is thus appealing to sequence only a subset of cases to discover variants and to genotype the identified variants in controls and the remaining cases, under the reasonable assumption that causal variants are usually enriched among cases. However, this approach leads to inflated type I error if analyzed naively for rare variant association. Several methods have been proposed in the recent literature to control type I error, at the cost of either excluding some sequenced cases or correcting the genotypes of discovered rare variants. All of these approaches thus suffer some degree of information loss and are therefore underpowered. We propose a novel method (BETASEQ), which corrects the inflation of type I error by supplementing pseudo-variants while keeping the original sequence and genotype data intact. Extensive simulations and real data analysis demonstrate that, in most practical situations, BETASEQ achieves higher testing power than existing approaches with guaranteed (controlled or conservative) type I error. BETASEQ and associated R files, including documentation and examples, are available at http://www.unc.edu/~yunmli/betaseq
NASA Technical Reports Server (NTRS)
Retallick, F. D.
1980-01-01
Directly-fired, separately-fired, and oxygen-augmented MHD power plants incorporating a disk geometry for the MHD generator were studied. The base parameters defined for four near-optimum-performance MHD steam power systems of various types are presented. The final selected systems consisted of (1) two directly-fired cases, one at 1920 K (2996 F) preheat and the other at 1650 K (2500 F) preheat, (2) a separately-fired case where the air is preheated to the same level as the higher-temperature directly-fired case, and (3) an oxygen-augmented case with the same generator inlet temperature of 2839 K (4650 F) as the high-temperature directly-fired and separately-fired cases. Supersonic Mach numbers at the generator inlet, gas inlet swirl, and constant Hall field operation were specified based on disk generator optimization. System pressures were based on optimization of MHD net power. Supercritical reheat steam plants were used in all cases. Open and closed cycle component costs are summarized and compared.
Effect of geometrical parameters on pressure distributions of impulse manufacturing technologies
NASA Astrophysics Data System (ADS)
Brune, Ryan Carl
Impulse manufacturing techniques constitute a growing field of methods that utilize high-intensity pressure events to conduct useful mechanical operations. As interest in applying this technology continues to grow, greater understanding must be achieved of the output pressure events, in both magnitude and distribution. To address this need, a novel pressure measurement technique, the Profile Indentation Pressure Evaluation (PIPE) method, has been developed that systematically analyzes indentation patterns created by impulse events. Correlation with quasi-static test data and software-assisted analysis allow colorized pressure maps to be generated for both electromagnetic and vaporizing foil actuator (VFA) impulse forming events. This technique aided the introduction of a design method for electromagnetic path actuator systems, in which key geometrical variables are considered using a newly developed analysis method called the Path Actuator Proximal Array (PAPA) pressure model. The model captures key current distribution and proximity effects and interprets generated pressure by treating the adjacent conductor surfaces as proximal arrays of individual conductors. According to PIPE output pressure analysis, the PAPA model provides a reliable prediction of generated pressure for path actuator systems as local geometry is changed. Associated mechanical calculations allow pressure requirements to be determined for shearing, flanging, and hemming operations, providing a design process for such cases. Additionally, the effect of geometry is investigated through a formability enhancement study using VFA metalworking techniques. A conical die assembly is utilized with both VFA high-velocity and traditional quasi-static test methods on varied Hasek-type sample geometries to elicit strain states consistent with different locations on a forming limit diagram.
Digital image correlation techniques are utilized to measure major and minor strains for each sample type to compare limit strain results. Overall, testing indicated decreased formability at high velocity for 304 DDQ stainless steel and increased formability at high velocity for 3003-H14 aluminum. Microstructural and fractographic analysis helped explain the observed differences in these cases. Together, these studies comprehensively explore the effects of geometrical parameters on the magnitude and distribution of impulse-manufacturing-generated pressure, establishing key guidelines and models for continued development and implementation in commercial applications.
Using cytology to increase small animal practice revenue.
Hodges, Joanne
2013-11-01
Diagnostic cytology is a useful, noninvasive test with practical foundations in high-quality medicine and applications to practice building. Cytology will generate practice revenue whether assessed in-house or sent to a clinical pathologist. Thorough in-house evaluation is adequate in some cases, but expert opinion is important in many cases. Specimen slides should at least be reviewed in-house for assessment of cellularity and potential artifacts before submission to a reference laboratory. Reference laboratories also provide special stains and advanced molecular diagnostics to help further characterize many neoplastic processes, search for organisms, identify pigments, and address other important aspects of the lesion. Copyright © 2013 Elsevier Inc. All rights reserved.
Yeo, Lami; Romero, Roberto; Jodicke, Cristiano; Kim, Sun Kwon; Gonzalez, Juan M.; Oggè, Giovanna; Lee, Wesley; Kusanovic, Juan Pedro; Vaisbuch, Edi; Hassan, Sonia S.
2010-01-01
Objective To describe a novel and simple technique (STAR: Simple Targeted Arterial Rendering) to visualize the fetal cardiac outflow tracts from dataset volumes obtained with spatiotemporal image correlation (STIC) and applying a new display technology (OmniView). Methods We developed a technique to image the outflow tracts by drawing three dissecting lines through the four-chamber view of the heart contained in a STIC volume dataset. Each line generated the following plane: 1) Line 1: ventricular septum “en face” with both great vessels (pulmonary artery anterior to the aorta); 2) Line 2: pulmonary artery with continuation into the longitudinal view of the ductal arch; and 3) Line 3: long axis view of the aorta arising from the left ventricle. The pattern formed by all 3 lines intersecting approximately through the crux of the heart resembles a “star”. The technique was then tested in 50 normal hearts (15.3 – 40.4 weeks of gestation). To determine if the technique could identify planes that departed from the normal images, we tested the technique in 4 cases with proven congenital heart defects (ventricular septal defect, transposition of great vessels, tetralogy of Fallot, and pulmonary atresia with intact ventricular septum). Results The STAR technique was able to generate the intended planes in all 50 normal cases. In the abnormal cases, the STAR technique allowed identification of the ventricular septal defect, demonstrated great vessel anomalies, and displayed views that deviated from what was expected from the examination of normal hearts. Conclusions This novel and simple technique can be used to visualize the outflow tracts and ventricular septum “en face” in normal fetal hearts. The inability to obtain expected views or the appearance of abnormal views in the generated planes should raise the index of suspicion for congenital heart disease involving the great vessels and/or the ventricular septum. 
The STAR technique may simplify examination of the fetal heart and could reduce operator dependency. PMID:20878672
Woon, See-Tarn; Ameratunga, Rohan
2016-01-01
New Zealand is a developed, geographically isolated country in the South Pacific with a population of 4.4 million. Genetic diagnosis is the standard of care for most patients with primary immunodeficiency disorders (PIDs). Since 2005, we have offered a comprehensive genetic testing service for PIDs and other immune-related disorders with a published sequence. Here we present the results of this program over its first decade, 2005 to 2014. We undertook testing in 228 index cases and 32 carriers during this time. The three most common test requests were for X-linked lymphoproliferative disease (XLP), tumour necrosis factor receptor associated periodic syndrome (TRAPS) and haemophagocytic lymphohistiocytosis (HLH). Of the 32 suspected XLP cases, positive diagnoses were established in only 2 patients. In contrast, genetic defects were confirmed in 8 of 11 patients with suspected X-linked agammaglobulinemia (XLA). Most XLA patients were initially identified from the absence of B cells. Overall, positive diagnoses were made in about 23% of all tests requested. The diagnostic rate was lowest for conditions with locus heterogeneity. Thorough clinical characterisation of patients can assist in prioritising which genes should be tested. The clinician-driven, customised comprehensive genetic service has worked effectively for New Zealand. Next-generation sequencing will play an increasing role in disorders with locus heterogeneity.
Ecosystem Services Insights into Water Resources Management in China: A Case of Xi'an City.
Liu, Jingya; Li, Jing; Gao, Ziyi; Yang, Min; Qin, Keyu; Yang, Xiaonan
2016-11-24
Global climate and environmental changes are endangering global water resources, and several approaches have been tested to manage and reduce the pressure on these decreasing resources. This study uses the case of Xi'an City in China to test reasonable and effective methods to address water resource shortages. We developed a framework combining ecosystem services and water resource management. Seven ecosystem indicators were classified as supply services, regulating services, or cultural services. Index values for each indicator were calculated and, based on questionnaire results, each index was assigned a weight. Using the Likert method, we calculated the ecosystem service supply in every region of the city. We found that the ecosystem's service capability is closely related to water resources, providing a method for managing them. Using Xi'an City as an example, we apply the ecosystem services concept to water resources management, providing a method for decision makers.
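The weighted aggregation of indicator indices described above can be sketched as a simple weighted sum per region. The indicator names, index values, and weights below are invented for illustration; they are not the study's data.

```python
# Sketch of the index aggregation described above: each ecosystem-service
# indicator gets a normalized index value and a questionnaire-derived
# weight, and a weighted sum scores each region. Indicator names,
# values, and weights are invented for illustration.

def service_score(index_values, weights):
    """Weighted sum of normalized indicator indices (weights sum to 1)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(index_values[k] * weights[k] for k in index_values)

if __name__ == "__main__":
    weights = {"water_supply": 0.4, "regulation": 0.35, "cultural": 0.25}
    region_a = {"water_supply": 0.8, "regulation": 0.5, "cultural": 0.6}
    region_b = {"water_supply": 0.3, "regulation": 0.9, "cultural": 0.7}
    print(round(service_score(region_a, weights), 3))  # -> 0.645
    print(round(service_score(region_b, weights), 3))  # -> 0.61
```

Ranking regions by such scores is what lets the framework feed into water-management decisions.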
Evaluation of third generation anti-HCV enzyme immunoassays.
Panigrahi, A K; Nayak, B; Dixit, R; Acharya, S K; Panda, S K
1998-01-01
The hepatitis C virus (HCV) is a major cause of post-transfusion hepatitis. The introduction of HCV antibody screening has reduced the risk of post-transfusion hepatitis significantly. However, the test is yet to be used routinely in blood banks of several developing countries with limited resources. We have developed an enzyme immunoassay (EIA) using synthetic peptides. The test was compared to seven commercial tests available in the Indian market. It was evaluated using a panel of 90 sera chosen from an earlier panel based on detection of HCV RNA by reverse transcription polymerase chain reaction (RT-PCR). In case of any discrepancy, the sera were further analysed by line immunoassay (LIA). The sensitivity of the in-house EIA was 90%. The specificity of the commercial EIAs varied.
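The evaluation arithmetic behind such assay comparisons, sensitivity and specificity against a reference panel, can be sketched briefly. The counts below are illustrative, not the study's data (only the 90% sensitivity figure echoes the abstract).

```python
# Sketch of the evaluation arithmetic behind assay comparisons like the
# one above: sensitivity and specificity of a test against a reference
# panel (here, RT-PCR status taken as truth). Counts are illustrative.

def sens_spec(results):
    """results: list of (test_positive, truth_positive) booleans."""
    tp = sum(1 for t, ref in results if t and ref)
    fn = sum(1 for t, ref in results if not t and ref)
    tn = sum(1 for t, ref in results if not t and not ref)
    fp = sum(1 for t, ref in results if t and not ref)
    return tp / (tp + fn), tn / (tn + fp)

if __name__ == "__main__":
    # 50 true positives, 45 detected; 40 true negatives, 38 correct
    panel = [(True, True)] * 45 + [(False, True)] * 5 \
          + [(False, False)] * 38 + [(True, False)] * 2
    se, sp = sens_spec(panel)
    print(f"sensitivity {se:.0%}, specificity {sp:.0%}")  # -> sensitivity 90%, specificity 95%
```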
System-Level Testing of the Advanced Stirling Radioisotope Generator Engineering Hardware
NASA Technical Reports Server (NTRS)
Chan, Jack; Wiser, Jack; Brown, Greg; Florin, Dominic; Oriti, Salvatore M.
2014-01-01
To support future NASA deep space missions, a radioisotope power system utilizing Stirling power conversion technology was under development. This development effort was performed under the joint sponsorship of the Department of Energy and NASA, until its termination at the end of 2013 due to budget constraints. The higher conversion efficiency of the Stirling cycle compared with that of the Radioisotope Thermoelectric Generators (RTGs) used in previous missions (Viking, Pioneer, Voyager, Galileo, Ulysses, Cassini, Pluto New Horizons and Mars Science Laboratory) offers the advantage of a four-fold reduction in Pu-238 fuel, thereby extending its limited domestic supply. As part of closeout activities, system-level testing of flight-like Advanced Stirling Convertors (ASCs) with a flight-like ASC Controller Unit (ACU) was performed in February 2014. This hardware is the most representative of the flight design tested to date. The test fully demonstrated the following ACU and system functionality: system startup; ASC control and operation at nominal and worst-case operating conditions; power rectification; DC output power management throughout nominal and out-of-range host voltage levels; ACU fault management; and system command/telemetry via the MIL-STD-1553 bus. This testing shows the viability of such a system for future deep space missions and bolsters confidence in the maturity of the flight design.
Investigation, Analysis, and Testing of Self-contained Oxygen Generators
NASA Technical Reports Server (NTRS)
Keddy, Christopher P.; Haas, Jon P.; Starritt, Larry
2008-01-01
Self-Contained Oxygen Generators (SCOGs) are widely used to provide emergency breathing oxygen in a variety of environments, including mines, submarines, spacecraft, and aircraft. These devices have definite advantages over storage of gaseous or liquid oxygen. Oxygen is not generated until a chemical briquette containing a chlorate or perchlorate oxidizer and a solid metallic fuel such as iron is ignited, starting a thermal decomposition process that produces gaseous oxygen. These devices are typically very safe to store, easy to operate, and pose primarily a thermal hazard to the operator that can be controlled by barriers or furnaces. Tens of thousands of these devices are operated worldwide every year without major incident. This report examines the rare case of a SCOG whose behavior was both abnormal and lethal. The particular type of SCOG reviewed is nearly identical to a flight-qualified version slated for use on manned space vehicles. This investigative report is a compilation of a NASA effort, in conjunction with other interested parties including military and aerospace organizations, to understand the causes of the particular SCOG accident and what preventative measures can be taken to ensure the incident is not repeated. The report details the incident and examines the root causes of the observed SCOG behavior from forensic evidence. A summary of chemical and numerical analysis is provided as background to physical testing of identical SCOG devices. The results and findings of both small-scale and full-scale testing are documented on a test-by-test basis, along with observations and summaries. Finally, conclusions are presented on the findings of this investigation, analysis, and testing, along with suggestions on preventative measures for any entity interested in the safe use of these devices.
Model-Based Development of Automotive Electronic Climate Control Software
NASA Astrophysics Data System (ADS)
Kakade, Rupesh; Murugesan, Mohan; Perugu, Bhupal; Nair, Mohanan
With the increasing complexity of software in today's products, writing and maintaining thousands of lines of code is a tedious task, and an alternative methodology must be employed. Model-based development is one candidate that offers several benefits and allows engineers to focus on the domain of their expertise rather than writing large volumes of code. In this paper, we discuss the application of model-based development to the electronic climate control software of vehicles. A back-to-back testing approach is presented that ensures a flawless and smooth transition from legacy designs to model-based development. The Simulink Report Generator, used to create design documents from the models, is presented along with its use in running the simulation model and capturing the results into the test report. Test automation using model-based development tools that support a single set of test cases across several testing levels, with a test procedure independent of the software and hardware platform, is also presented.
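The back-to-back approach mentioned above can be sketched as: feed identical test vectors to the reference model and to the production implementation, then compare outputs within a tolerance. The climate-control function below (blower duty from temperature error) is hypothetical, not the actual ECC software.

```python
# Minimal sketch of back-to-back testing: the same test vectors are fed
# to a reference model and to the (ported) production implementation,
# and outputs are compared within a tolerance. The climate-control
# function (blower duty from temperature error) is hypothetical.

def blower_duty_model(temp_error_c):
    """Reference model: proportional response saturated to [0, 100] %."""
    return max(0.0, min(100.0, 20.0 * temp_error_c))

def blower_duty_impl(temp_error_c):
    """'Generated code' under test; here equivalent by construction."""
    duty = 20.0 * temp_error_c
    if duty < 0.0:
        duty = 0.0
    elif duty > 100.0:
        duty = 100.0
    return duty

def back_to_back(test_vectors, tol=1e-9):
    """Return the vectors (with both outputs) where the two disagree."""
    return [(v, blower_duty_model(v), blower_duty_impl(v))
            for v in test_vectors
            if abs(blower_duty_model(v) - blower_duty_impl(v)) > tol]

if __name__ == "__main__":
    vectors = [-5.0, 0.0, 1.5, 3.0, 10.0]      # reused across test levels
    print("failures:", back_to_back(vectors))  # -> failures: []
```

Reusing the same vectors at model-in-the-loop, software-in-the-loop, and hardware-in-the-loop stages is what makes the test cases a "unique set" across testing levels.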
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, Bradley, E-mail: brma7253@colorado.edu; Fornberg, Bengt, E-mail: Fornberg@colorado.edu
In a previous study of seismic modeling with radial basis function-generated finite differences (RBF-FD), we outlined a numerical method for solving 2-D wave equations in domains with material interfaces between different regions. The method was applicable on a mesh-free set of data nodes. It included all information about interfaces within the weights of the stencils (allowing the use of traditional time integrators), and was shown to solve problems of the 2-D elastic wave equation to 3rd-order accuracy. In the present paper, we discuss a refinement of that method that makes it simpler to implement. It can also improve accuracy for the case of smoothly-variable model parameter values near interfaces. We give several test cases that demonstrate the method solving 2-D elastic wave equation problems to 4th-order accuracy, even in the presence of smoothly-curved interfaces with jump discontinuities in the model parameters.
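The core RBF-FD construction referenced above, computing differentiation weights by solving a small linear system built from a radial basis function, can be sketched in one dimension. The Gaussian basis, node spacing, and shape parameter below are illustrative choices; the paper's interface treatment and time stepping are not reproduced.

```python
# Sketch of the core RBF-FD step: differentiation weights at a stencil
# centre come from solving a small linear system built from a radial
# basis function (Gaussian here). Nodes and the shape parameter are
# illustrative; the paper's interface handling is not reproduced.
import math

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def rbf_fd_weights_d2(nodes, xc, eps):
    """Weights w with sum(w_j * f(x_j)) ~ f''(xc), Gaussian RBF basis."""
    phi = lambda r: math.exp(-(eps * r) ** 2)
    d2phi = lambda d: (4 * eps**4 * d * d - 2 * eps**2) * math.exp(-(eps * d) ** 2)
    A = [[phi(xi - xj) for xj in nodes] for xi in nodes]   # interpolation matrix
    b = [d2phi(xc - xi) for xi in nodes]                    # d2 of each basis at xc
    return gauss_solve(A, b)

if __name__ == "__main__":
    h = 0.1
    w = rbf_fd_weights_d2([-h, 0.0, h], 0.0, eps=0.5)
    # Applied to f(x) = x^2 the weights should give f'' = 2, approximately
    approx = sum(wi * x * x for wi, x in zip(w, [-h, 0.0, h]))
    print(round(approx, 2))
```

In the flat-RBF limit such weights approach the classical finite-difference stencil; on scattered 2-D nodes the same linear-system construction applies, with interface conditions folded into the system in the paper's method.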
Quadrupedal Robot Locomotion: A Biologically Inspired Approach and Its Hardware Implementation
Espinal, A.; Rostro-Gonzalez, H.; Carpio, M.; Guerra-Hernandez, E. I.; Ornelas-Rodriguez, M.; Puga-Soberanes, H. J.; Sotelo-Figueroa, M. A.; Melin, P.
2016-01-01
A bioinspired locomotion system for a quadruped robot is presented. Locomotion is achieved by a spiking neural network (SNN) that acts as a Central Pattern Generator (CPG), producing different locomotion patterns represented by their raster plots. To generate these patterns, the SNN is configured with specific parameters (synaptic weights and topologies), which were estimated by a metaheuristic method based on Christiansen Grammar Evolution (CGE). The system has been implemented and validated on two robot platforms: first on a quadruped robot and then on a hexapod. In the latter, we simulated the case in which two legs of the hexapod were amputated and its locomotion mechanism had to change. For the quadruped robot, control is performed by the spiking neural network implemented on an Arduino board with 35% resource usage. For the hexapod robot, we used a Spartan-6 FPGA board with only 3% resource usage. Numerical results show the effectiveness of the proposed system in both cases. PMID:27436997
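A central pattern generator can be illustrated far more simply than with an evolved spiking network: coupled phase oscillators with prescribed phase offsets lock into a gait-like pattern on their own. The sketch below is such a stand-in, a trot-like pattern with invented parameters, not the paper's SNN/CGE system.

```python
# The paper evolves a spiking network; as a much simpler stand-in, this
# sketch uses coupled phase oscillators to generate a trot-like gait:
# diagonal leg pairs locked in phase, the two pairs in anti-phase.
# All parameters (frequency, gain, step size) are illustrative.
import math

def cpg_step(phases, dt, freq=1.0, k=2.0,
             offsets=(0.0, math.pi, math.pi, 0.0)):
    """Advance 4 leg oscillators one Euler step toward desired offsets."""
    new = []
    for i, p in enumerate(phases):
        coupling = sum(math.sin((phases[j] - offsets[j]) - (p - offsets[i]))
                       for j in range(4) if j != i)
        new.append(p + dt * (2 * math.pi * freq + k * coupling))
    return new

if __name__ == "__main__":
    phases = [0.0, 0.3, 2.9, 0.2]        # arbitrary initial phases
    for _ in range(5000):
        phases = cpg_step(phases, dt=0.001)
    # legs 0 and 3 should now be synchronized (diagonal pair)
    diff = (phases[0] - phases[3]) % (2 * math.pi)
    print(round(min(diff, 2 * math.pi - diff), 2))  # -> 0.0 (phase-locked)
```

Each leg's joint trajectory would then be driven by a function of its oscillator phase, which is the role the raster-plot patterns play for the SNN.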
Flexible, fast and accurate sequence alignment profiling on GPGPU with PaSWAS.
Warris, Sven; Yalcin, Feyruz; Jackson, Katherine J L; Nap, Jan Peter
2015-01-01
Obtaining large-scale sequence alignments in a fast and flexible way is an important step in the analysis of next-generation sequencing data. Applications based on the Smith-Waterman (SW) algorithm are often either not fast enough, limited to dedicated tasks, or not sufficiently accurate due to statistical issues. Current SW implementations that run on graphics hardware do not report the alignment details necessary for further analysis. With the Parallel SW Alignment Software (PaSWAS) it is possible (a) to have easy access to the computational power of NVIDIA-based general purpose graphics processing units (GPGPUs) to perform high-speed sequence alignments, and (b) to retrieve relevant information such as score, number of gaps and mismatches. The software reports multiple hits per alignment. The added value of the new SW implementation is demonstrated with two test cases: (1) tag recovery in next-generation sequence data and (2) isotype assignment within an immunoglobulin 454 sequence data set. Both cases show the usability and versatility of the new parallel Smith-Waterman implementation.
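The recurrence that PaSWAS parallelizes is the classic Smith-Waterman local-alignment dynamic program. A minimal CPU sketch with a linear gap penalty follows; the scoring values are common textbook defaults, not necessarily those used by PaSWAS.

```python
# Minimal CPU sketch of the Smith-Waterman local-alignment recurrence
# that PaSWAS parallelizes on GPGPUs. Linear gap penalty; the scoring
# values are common defaults, not necessarily PaSWAS's.

def smith_waterman_score(a, b, match=3, mismatch=-3, gap=-2):
    """Return the best local alignment score between strings a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            # local alignment: scores are floored at zero
            H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
            best = max(best, H[i][j])
    return best

if __name__ == "__main__":
    # Best local alignment is GTT-AC / GTTGAC: 5 matches, 1 gap
    print(smith_waterman_score("TGTTACGG", "GGTTGACTA"))  # -> 13
```

A GPU version evaluates anti-diagonals of H in parallel, since each cell depends only on its left, upper, and upper-left neighbours; PaSWAS additionally performs traceback to report gaps and mismatches per hit.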
The role of vocal individuality in conservation
Terry, Andrew MR; Peake, Tom M; McGregor, Peter K
2005-01-01
Identifying the individuals within a population can generate information on life history parameters, provide input data for conservation models, and highlight behavioural traits that may affect management decisions and error or bias within census methods. Individual animals can be discriminated by features of their vocalisations. This vocal individuality can be utilised as an alternative marking technique in situations where marks are difficult to detect or animals are sensitive to disturbance. Vocal individuality can also be used in cases where the capture and handling of an animal is either logistically or ethically problematic. Many studies have suggested that vocal individuality can be used to count and monitor populations over time; however, few have explicitly tested the method in this role. In this review we discuss methods for extracting individuality information from vocalisations and techniques for using it to count and monitor populations over time. We present case studies in birds where vocal individuality has been applied to conservation, and we discuss its role in mammals. PMID:15960848
Chitty, Lyn S; Mason, Sarah; Barrett, Angela N; McKay, Fiona; Lench, Nicholas; Daley, Rebecca; Jenkins, Lucy A
2015-01-01
Abstract Objective Accurate prenatal diagnosis of genetic conditions can be challenging and usually requires invasive testing. Here, we demonstrate the potential of next-generation sequencing (NGS) for the analysis of cell-free DNA in maternal blood to transform prenatal diagnosis of monogenic disorders. Methods Analysis of cell-free DNA using PCR and restriction enzyme digest (PCR-RED) was compared with a novel NGS assay in pregnancies at risk of achondroplasia and thanatophoric dysplasia. Results PCR-RED was performed in 72 cases and was correct in 88.6% and inconclusive in 7%, with one false negative. NGS was performed in 47 cases and was accurate in 96.2%, with no inconclusive results. Both approaches were used in 27 cases, with NGS giving the correct result in the two cases inconclusive with PCR-RED. Conclusion NGS provides an accurate, flexible approach to non-invasive prenatal diagnosis of de novo and paternally inherited mutations. It is more sensitive than PCR-RED and is ideal when screening a gene with multiple potential pathogenic mutations. These findings highlight the value of NGS in the development of non-invasive prenatal diagnosis for other monogenic disorders. © 2015 The Authors. Prenatal Diagnosis published by John Wiley & Sons, Ltd. What's already known about this topic? Non-invasive prenatal diagnosis (NIPD) using PCR-based methods has been reported for the detection or exclusion of individual paternally inherited or de novo alleles in maternal plasma. What does this study add? NIPD using next-generation sequencing provides an accurate, more sensitive approach which can be used to detect multiple mutations in a single assay and so is ideal when screening a gene with multiple potential pathogenic mutations. Next-generation sequencing thus provides a flexible approach to non-invasive prenatal diagnosis, ideal for use in a busy service laboratory. PMID:25728633
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hui, C; Suh, Y; Robertson, D
Purpose: To develop a novel algorithm to generate internal respiratory signals for sorting of four-dimensional (4D) computed tomography (CT) images. Methods: The proposed algorithm extracted multiple time-resolved features as potential respiratory signals. These features were taken from the 4D CT images and their Fourier-transformed space. Several low-frequency locations in the Fourier space and selected anatomical features from the images were used as potential respiratory signals. A clustering algorithm was then used to search for the group of appropriate potential respiratory signals. The chosen signals were then normalized and averaged to form the final internal respiratory signal. Performance of the algorithm was tested on 50 4D CT data sets and results were compared with external signals from the real-time position management (RPM) system. Results: In almost all cases, the proposed algorithm generated internal respiratory signals that visibly matched the external respiratory signals from the RPM system. On average, the end-inspiration times calculated by the proposed algorithm were within 0.1 s of those given by the RPM system. Less than 3% of the calculated end-inspiration times were more than one time frame away from those given by the RPM system. In 3 of the 50 cases, the proposed algorithm generated internal respiratory signals that were significantly smoother than the RPM signals. In these cases, images sorted using the internal respiratory signals showed fewer artifacts in locations corresponding to the discrepancy between the internal and external respiratory signals. Conclusion: We developed a robust algorithm that generates internal respiratory signals from 4D CT images. In some cases, it even showed the potential to outperform the RPM system. The proposed algorithm is completely automatic and generally takes less than 2 min to process. It can be easily implemented in the clinic and can potentially replace the use of external surrogates.
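The final combination step described above, normalizing the selected candidate signals and averaging them, can be sketched directly. The candidate traces below are synthetic sinusoids; the clustering and feature-extraction stages are not shown.

```python
# Sketch of the final combination step described above: candidate
# respiratory signals are normalized (to unit range here) and averaged
# into one internal respiratory trace. The candidate traces are
# synthetic; clustering and feature extraction are not shown.
import math

def normalize(sig):
    """Rescale a signal to the range [0, 1]."""
    lo, hi = min(sig), max(sig)
    span = (hi - lo) or 1.0
    return [(s - lo) / span for s in sig]

def combine(signals):
    """Average a list of equal-length candidate signals after
    normalization, yielding the final internal respiratory signal."""
    norm = [normalize(s) for s in signals]
    n = len(norm)
    return [sum(vals) / n for vals in zip(*norm)]

if __name__ == "__main__":
    t = [i * 0.1 for i in range(50)]
    cand1 = [2.0 * math.sin(x) + 5.0 for x in t]   # scaled, offset trace
    cand2 = [0.5 * math.sin(x) - 1.0 for x in t]   # same shape, other units
    sig = combine([cand1, cand2])
    print(round(max(sig) - min(sig), 3))  # -> 1.0 (full normalized range)
```

Because normalization removes the per-feature scale and offset, features with very different raw units (image intensities, Fourier coefficients) can be averaged coherently.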
Vangheluwe, Marnix L. U.; Verdonck, Frederik A. M.; Besser, John M.; Brumbaugh, William G.; Ingersoll, Christopher G.; Schlekat, Christan E.; Rogevich Garman, Emily
2013-01-01
Within the framework of European Union chemical legislation, an extensive data set on the chronic toxicity of sediment nickel has been generated. In the initial phase of testing, tests were conducted with 8 taxa of benthic invertebrates in 2 nickel-spiked sediments, including 1 reasonable worst-case sediment with low concentrations of acid-volatile sulfide (AVS) and total organic carbon. The following species were tested: amphipods (Hyalella azteca, Gammarus pseudolimnaeus), mayflies (Hexagenia sp.), oligochaetes (Tubifex tubifex, Lumbriculus variegatus), mussels (Lampsilis siliquoidea), and midges (Chironomus dilutus, Chironomus riparius). In the second phase, tests were conducted with the most sensitive species in 6 additional spiked sediments, thus generating chronic toxicity data for a total of 8 nickel-spiked sediments. A species sensitivity distribution was constructed from 10% effective concentrations, yielding a threshold value of 94 mg Ni/kg dry weight under reasonable worst-case conditions. Data from all sediments were used to model predictive bioavailability relationships between chronic toxicity thresholds (20% effective concentrations) and AVS and Fe, and these models were used to derive site-specific sediment-quality criteria. Normalization of toxicity values significantly reduced the intersediment variability in toxicity for the amphipods Hyalella azteca and G. pseudolimnaeus, but these relationships were less clearly defined for the mayfly Hexagenia sp. Application of the models to prevailing local conditions resulted in threshold values ranging from 126 mg to 281 mg Ni/kg dry weight based on the AVS model, and from 143 mg to 265 mg Ni/kg dry weight based on the Fe model.
Sato, Yosuke; Uzuka, Takeo; Aoki, Hiroshi; Natsumeda, Manabu; Oishi, Makoto; Fukuda, Masafumi; Fujii, Yukihiko
2012-02-29
Near-infrared spectroscopy (NIRS) has proven to be useful for the evaluation of language lateralization in healthy subjects, infants, and epileptic patients. This study for the first time investigated the expressive and receptive language functions separately, using NIRS in presurgical glioma patients. We also describe a special case with dissociated pattern of language functions. Ten glioma patients were examined. Using NIRS, the hemodynamic changes during a verb generation task or story listening task were measured in the cerebral hemisphere on either side covering the language areas. Following the NIRS study, the Wada test was performed in all the patients. The NIRS study revealed increases of oxyhemoglobin and decreases of deoxyhemoglobin in the language areas elicited by both tasks. In 9 patients, who were all right-handed, the expressive and receptive language functions were lateralized to the left hemisphere. The results of the NIRS study were completely consistent with those of the Wada test. In the remaining 1 patient with a right sided insular glioma, who was right-handed, the NIRS study revealed stronger activation of the right inferior frontal region during the verb generation task, and stronger activation of the left superior temporal region during the story listening task. This dissociated language function was validated by the Wada test and the postoperative neurological course. These results demonstrate that a NIRS study using our technique is extremely valuable for preoperative assessment of the language functions and exemplifies how a preoperative NIRS study can allow detection of unforeseen language lateralization. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
A novel compact low impedance Marx generator with quasi-rectangular pulse output
NASA Astrophysics Data System (ADS)
Liu, Hongwei; Jiang, Ping; Yuan, Jianqiang; Wang, Lingyun; Ma, Xun; Xie, Weiping
2018-04-01
In this paper, a novel compact low-impedance Marx generator with quasi-rectangular pulse output, designed on the basis of Fourier theory, is developed. In contrast to a traditional Marx generator, capacitors of different capacitances are used. The generator can deliver a high-voltage quasi-rectangular pulse with a width of 100 ns into a low-impedance load, and it also has high energy density and power density. It consists of 16 modules. Each module comprises an integrative single-ended plastic-case capacitor with a nominal value of 54 nF, four ceramic capacitors with a nominal value of 1.5 nF, a gas switch, a charging inductor, a grounding inductor, and insulators that provide mechanical support for all elements. Within a module, discharge components with different periods from the different capacitors superpose in the main circuit to form a quasi-rectangular pulse. The design process of the generator is analyzed, and test results are provided. The generator achieved a pulse output with a rise time of 32 ns, pulse width of 120 ns, flat-top width (95%-95%) of 50 ns, voltage of 550 kV, and power of 20 GW.
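The operating principle, summing discharge components with different periods so that their superposition flattens the pulse top, mirrors a truncated Fourier series of a square wave. A minimal numerical sketch of that idea, not a circuit simulation of this generator:

```python
import numpy as np

def square_wave_fourier(t, period, n_odd_harmonics):
    """Truncated Fourier series of a unit square wave: odd harmonics
    with 1/k amplitudes. Adding terms flattens the pulse top, the same
    idea the generator exploits by mixing capacitor discharge periods."""
    s = np.zeros_like(t, dtype=float)
    for k in range(1, 2 * n_odd_harmonics, 2):  # k = 1, 3, 5, ...
        s += (4.0 / (np.pi * k)) * np.sin(2.0 * np.pi * k * t / period)
    return s
```

Even a handful of harmonics already yields a nearly flat top, which is consistent with a few extra ceramic capacitors per module being sufficient.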
Acoustic wave generation by microwaves and applications to nondestructive evaluation.
Hosten, Bernard; Bacon, Christophe; Guilliorit, Emmanuel
2002-05-01
Although acoustic wave generation by electromagnetic waves has been widely studied in the case of laser-generated ultrasound, the literature on acoustic wave generation by thermal effects due to electromagnetic microwaves is very sparse. Several mechanisms have been suggested to explain microwave-induced acoustic generation, namely radiation pressure, electrostriction, and thermal expansion. It is now known that the main cause is thermal expansion due to microwave absorption. This paper reviews recent advances in the theory and experiments that introduce a new way to generate ultrasonic waves without contact for the purpose of nondestructive evaluation and control. The unidirectional theory, based on Maxwell's equations, the heat equation, and thermoviscoelasticity, predicts the generation of acoustic waves at interfaces and inside stratified materials. Acoustic waves are generated by a pulsed electromagnetic wave or a burst at a chosen frequency, so that materials can be excited over a broad or narrow frequency range. Experiments show the generation of acoustic waves in water, viscoelastic polymers, and composite materials shaped as rods and plates. From the computed and measured accelerations at interfaces, the viscoelastic and electromagnetic properties of materials such as polymers and composites can be evaluated (NDE). Preliminary examples of nondestructive testing applications are presented.
Lorenzo, Laura; Rogel, Ramon; Sanchez-Gonzalez, Jose V; Perez-Ardavin, Javier; Moreno, Elena; Lujan, Saturnino; Broseta, Enrique; Boronat, Francisco
2016-08-01
To evaluate the clinical characteristics, diagnosis, management, and costs of the adult acute scrotum in the emergency room (ER). Acute scrotum is a syndrome characterized by intense, acute scrotal pain that may be accompanied by other symptoms. It is common in children and also found in adults, with different causal pathologies between these groups. Between November 2013 and September 2014, 669 cases of adult acute scrotum presenting to our ER were prospectively analyzed. Patients under 15 years of age were excluded. Patient age, reason for consultation, investigations performed, final diagnosis, management, and costs were evaluated. For the statistical analysis, the Mann-Whitney U, Kruskal-Wallis, and chi-square tests were used. A total of 669 cases of acute scrotum were analyzed. The mean age at presentation was 40.2 ± 17.3 years. The most frequent diagnoses were orchiepididymitis (28.7%), epididymitis (28.4%), symptoms of uncertain etiology (25.1%), and orchitis (10.3%). Diagnostic tests were carried out in 57.8% of cases. Most cases were treated as outpatients (94.2%), with 5.83% admitted and 1% undergoing surgical treatment. Overall, 13.3% of patients returned to the ER. Abnormal results in blood and urine tests were more common among older patients and infectious pathologies. The average cost generated by an acute scrotum ER consult was €195.03. Infectious pathologies are the most common causes of acute scrotum in the ER. Abnormal blood and urine tests are unusual and are more common in older patients and infectious pathologies. Copyright © 2016 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Chen, Mo; Hyppa-Martin, Jolene K.; Reichle, Joe E.; Symons, Frank J.
2016-01-01
Meaningfully synthesizing single case experimental data from intervention studies comprised of individuals with low incidence conditions and generating effect size estimates remains challenging. Seven effect size metrics were compared for single case design (SCD) data focused on teaching speech generating device use to individuals with…
Challenges with controlling varicella in prison settings: experience of California, 2010 to 2011.
Leung, Jessica; Lopez, Adriana S; Tootell, Elena; Baumrind, Nikki; Mohle-Boetani, Janet; Leistikow, Bruce; Harriman, Kathleen H; Preas, Christopher P; Cosentino, Giorgio; Bialek, Stephanie R; Marin, Mona
2014-10-01
This article describes the epidemiology of varicella in one state prison in California during 2010 and 2011, control measures implemented, and associated costs. Eleven varicella cases were reported, of which nine were associated with two outbreaks. One outbreak consisted of three cases and the second consisted of six cases with two generations of spread. Among exposed inmates serologically tested, 98% (643/656) were varicella-zoster virus seropositive. The outbreaks resulted in > 1,000 inmates exposed, 444 staff exposures, and > $160,000 in costs. The authors documented the challenges and costs associated with controlling and managing varicella in a prison setting. A screening policy for evidence of varicella immunity for incoming inmates and staff and vaccination of susceptible persons has the potential to mitigate the impact of future outbreaks and reduce resources necessary to manage cases and outbreaks. © The Author(s) 2014.
Mexican-American Males Providing Personal Care for their Mothers
Evans, Bronwynne C.; Belyea, Michael J.; Ume, Ebere
2011-01-01
We know little about Mexican-American (MA) family adaptation to critical events in the informal caregiving experience but, in these days of economic and social turmoil, sons must sometimes step up to provide personal care for their aging mothers. This article compares two empirically real cases of MA males who provided such care, in lieu of a female relative. The cases are selected from a federally-funded, descriptive, longitudinal, mixed methods study of 110 MA caregivers and their care recipients. In case-oriented research, investigators can generate propositions (connected sets of statements) that reflect their findings and conclusions, and can be tested against subsequent cases: Caregiving strain and burden in MA males may have more to do with physical and emotional costs than financial ones; MA males providing personal care for their mothers adopt a matter-of-fact approach as they act “against taboo”; and this approach is a new way to fulfill family obligations. PMID:21643486
[Non-X-linked juvenile foveal retinoschisis].
Pérez Alvarez, M J; Clement Fernández, F
2002-08-01
To describe the clinical characteristics of two cases of juvenile foveal retinoschisis in women with an atypical, non-X-linked hereditary pattern. An autosomal recessive inheritance is proposed. Two generations of a family (5 members) were studied, of whom only two sisters were evaluated. The complete examination of these two cases included retinography, fluorescein angiography, automated perimetry, color vision testing, electroretinogram, electrooculogram, and visually evoked potentials. Compared with the classic X-linked form of juvenile retinoschisis, our cases are less severely affected; the better visual acuity and the less disturbed, or even normal, electroretinogram confirm this. We emphasise the existence, in one patient, of isolated plaques of retinal pigment epithelium atrophy with perivascular pigment clumps without foveal schisis, which could represent an evolved form of this entity. Hereditary juvenile foveal retinoschisis in women suggests autosomal inheritance (autosomal recessive in our cases) and presents with less severe involvement (Arch Soc Esp Oftalmol 2002; 77: 443-448).
Cundy, Thomas P; Gattas, Nicholas E; White, Alan D; Najmaldin, Azad S
2015-08-01
The cumulative summation (CUSUM) method for learning curve analysis remains under-utilized in the surgical literature in general and is described in only a small number of publications within the field of pediatric surgery. This study introduces the CUSUM analysis technique and applies it to evaluate the learning curve for pediatric robot-assisted laparoscopic pyeloplasty (RP). Clinical data were prospectively recorded for consecutive pediatric RP cases performed by a single surgeon. CUSUM charts and tests were generated for set-up time, docking time, console time, operating time, total operating room time, and postoperative complications. Conversions and avoidable operating room delay were evaluated separately with respect to case experience. Comparisons between case experience and time-based outcomes were assessed using Student's t-test and ANOVA for bi-phasic and multi-phasic learning curves, respectively. The comparison between case experience and complication frequency was assessed using the Kruskal-Wallis test. A total of 90 RP cases were evaluated. The learning curve transitioned beyond the learning phase at cases 10, 15, 42, 57, and 58 for set-up time, docking time, console time, operating time, and total operating room time, respectively. All comparisons of mean operating times between the learning phase and subsequent phases were statistically significant (P < .001 to P = .01). No significant difference was observed between case experience and frequency of postoperative complications (P = .125), although the CUSUM chart demonstrated a directional change in slope for the last 12 cases, in which there were high proportions of re-do cases and patients <6 months of age. The CUSUM method has a valuable role in learning curve evaluation and outcome quality monitoring.
In applying this statistical technique to the largest reported single surgeon series of pediatric RP, we demonstrate numerous distinctly shaped learning curves and well-defined learning phase transition points. Copyright © 2015 Elsevier Inc. All rights reserved.
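A CUSUM chart of the kind described above is simply the running sum of deviations from a target value; a sustained change in slope marks the learning-phase transition. A minimal sketch with hypothetical operating times (the target and data below are illustrative, not from the study):

```python
def cusum(values, target):
    """CUSUM chart: cumulative sum of (observation - target).
    An upward slope means observations are running above target;
    a turn to a downward slope suggests the learning phase has passed."""
    total, chart = 0.0, []
    for v in values:
        total += v - target
        chart.append(total)
    return chart

# Hypothetical console times (minutes) for successive cases.
times = [200, 195, 190, 185, 160, 150, 145, 140]
chart = cusum(times, target=170)  # slope turns downward at case 5
```

The peak of the chart, rather than any single fast case, is what identifies the transition point.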
NASA Astrophysics Data System (ADS)
Vorobyov, A. M.; Abdurashidov, T. O.; Bakulev, V. L.; But, A. B.; Kuznetsov, A. B.; Makaveev, A. T.
2015-04-01
The present work experimentally investigates the suppression, by water injection, of the acoustic fields generated by the supersonic jets of rocket-launch vehicles during the initial period of launch. Water jets are injected into the combined jet along its perimeter at angles of 0° and 60°. A solid rocket motor with a simulator case of the rocket-launch vehicle is used in the tests. The effectiveness of reducing acoustic loads on the vehicle surface by creating a water barrier was demonstrated, and an injection angle of 60° was found to be more effective at reducing pressure-pulsation levels.
NASA Astrophysics Data System (ADS)
Hasbullah Mohd Isa, Wan; Fikri Muhammad, Khairul; Mohd Khairuddin, Ismail; Ishak, Ismayuzri; Razlan Yusoff, Ahmad
2016-02-01
This paper presents a new form of coil for an electromagnetic energy harvesting system, based on a topology optimization method, which is shaped like a cap to maximize the power output. It increases the number of magnetic flux linkages intercepted from a cylindrical permanent magnet, in this case of 10 mm diameter. Several coils with different geometrical properties were built and tested on a vibration generator at a frequency of 100 Hz. The results showed that the coil with the lowest number of windings produced the highest power output, 680 μW, while the coil with the highest number of windings generated the highest voltage output, 0.16 V.
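The voltage-versus-power trade-off with winding count follows from Faraday's law: open-circuit EMF scales with the number of turns times the rate of flux change, while delivered power also depends on coil resistance. A back-of-envelope sketch (the flux amplitude and numbers are hypothetical, not this paper's coil):

```python
import math

def peak_emf(n_turns, flux_amplitude_wb, freq_hz):
    """Peak open-circuit EMF for an N-turn coil threaded by a sinusoidal
    flux phi(t) = Phi0 * sin(2*pi*f*t):  |emf|_max = N * 2*pi*f * Phi0
    (Faraday's law)."""
    return n_turns * 2.0 * math.pi * freq_hz * flux_amplitude_wb
```

More turns raise the open-circuit voltage, but they also raise winding resistance, so power into a matched load can peak at a lower winding count, which is consistent with the reported result.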
Pattacini, Laura; Baeten, Jared M.; Thomas, Katherine K.; Fluharty, Tayler R.; Murnane, Pamela M.; Donnell, Deborah; Bukusi, Elizabeth; Ronald, Allan; Mugo, Nelly; Lingappa, Jairam R.; Celum, Connie; McElrath, M. Juliana; Lund, Jennifer M.
2015-01-01
Objective Two distinct hypotheses have been proposed for T-cell involvement in protection from HIV-1 acquisition. First, HIV-1-specific memory T-cell responses generated upon HIV-1 exposure could mount an efficient response to HIV-1 and inhibit the establishment of an infection. Second, a lower level of immune activation could reduce the numbers of activated, HIV-1-susceptible CD4+ T-cells, thereby diminishing the likelihood of infection. Methods To test these hypotheses, we conducted a prospective study among high-risk heterosexual men and women, and tested peripheral blood samples from individuals who subsequently acquired HIV-1 during follow-up (cases) and from a subset of those who remained HIV-1 uninfected (controls). Results We found no difference in HIV-1-specific immune responses between cases and controls, but Treg frequency was higher in controls as compared to cases and was negatively associated with frequency of effector memory CD4+ T-cells. Conclusions Our findings support the hypothesis that low immune activation assists in protection from HIV-1 infection. PMID:26656786
Vortical Flow Prediction Using an Adaptive Unstructured Grid Method
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.
2001-01-01
A computational fluid dynamics (CFD) method has been employed to compute vortical flows around slender wing/body configurations. The emphasis of the paper is on the effectiveness of an adaptive grid procedure in "capturing" concentrated vortices generated at sharp edges or flow separation lines of lifting surfaces flying at high angles of attack. The method is based on a tetrahedral unstructured grid technology developed at the NASA Langley Research Center. Two steady-state, subsonic, inviscid and Navier-Stokes flow test cases are presented to demonstrate the applicability of the method for solving practical vortical flow problems. The first test case concerns vortex flow over a simple 65° delta wing with different values of leading-edge bluntness, and the second case is that of a more complex fighter configuration. The superiority of the adapted solutions in capturing the vortex flow structure over the conventional unadapted results is demonstrated by comparisons with the wind-tunnel experimental data. The study shows that numerical prediction of vortical flows is highly sensitive to the local grid resolution and that the implementation of grid adaptation is essential when applying CFD methods to such complicated flow problems.
Windshear database for forward-looking systems certification
NASA Technical Reports Server (NTRS)
Switzer, G. F.; Proctor, F. H.; Hinton, D. A.; Aanstoos, J. V.
1993-01-01
This document contains a description of a comprehensive database that is to be used for certification testing of airborne forward-look windshear detection systems. The database was developed by NASA Langley Research Center, at the request of the Federal Aviation Administration (FAA), to support the industry initiative to certify and produce forward-look windshear detection equipment. The database contains high resolution, three dimensional fields for meteorological variables that may be sensed by forward-looking systems. The database is made up of seven case studies which have been generated by the Terminal Area Simulation System, a state-of-the-art numerical system for the realistic modeling of windshear phenomena. The selected cases represent a wide spectrum of windshear events. General descriptions and figures from each of the case studies are included, as well as equations for F-factor, radar-reflectivity factor, and rainfall rate. The document also describes scenarios and paths through the data sets, jointly developed by NASA and the FAA, to meet FAA certification testing objectives. Instructions for reading and verifying the data from tape are included.
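The F-factor equations mentioned above are not reproduced in this excerpt. The conventional definition of the windshear hazard index (after Bowles) is F = (dWx/dt)/g − Wh/V; the sketch below assumes that standard form, with illustrative numbers rather than database values.

```python
def f_factor(dwx_dt, w_h, airspeed, g=9.81):
    """Windshear hazard F-factor, assumed standard form
    F = (dWx/dt)/g - Wh/V, where dWx/dt is the rate of change of the
    horizontal wind along the flight path (m/s^2), Wh the vertical wind
    (m/s, positive up), and V the true airspeed (m/s).
    Positive F indicates performance-decreasing shear."""
    return dwx_dt / g - w_h / airspeed
```

A microburst produces both an increasing tailwind (positive dWx/dt) and a downdraft (negative Wh), so both terms push F positive.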
NASA Astrophysics Data System (ADS)
Zamani, K.; Bombardelli, F. A.
2014-12-01
Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. When access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections of advection-diffusion-reaction (ADR) solvers, such as errors in nonlinear advection, diffusion, or source terms, as well as in non-constant-coefficient equations. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. We then use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of test complexity. The test suite includes hundreds of unit tests and system tests that check individual portions of the code. The suite starts by testing a simple case of unidirectional advection, then bidirectional advection and tidal flow, and builds up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity, and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation were also designed.
All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experience in finding several errors that were not detectable with routine verification techniques.
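A core building block of such a test suite is the mesh-convergence check: comparing errors against an exact (MES) solution on successively refined grids yields the observed order of accuracy. A minimal sketch of that calculation (generic, not code from the suite itself):

```python
import math

def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
    """Observed order of accuracy p from errors on two grids.
    Assuming err ~ C * h^p, it follows that
    p = log(err_coarse / err_fine) / log(r),
    where r = h_coarse / h_fine is the grid refinement ratio."""
    return math.log(err_coarse / err_fine) / math.log(refinement_ratio)
```

A second-order solver that halves h should roughly quarter the error; a degraded observed order on a high-Peclet run is exactly the "concealing effect of scales" the abstract warns about.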
Lee, David; Heo, Giseon; El-Bialy, Tarek; Carey, Jason P; Major, Paul W; Romanyk, Dan L
2017-07-01
To investigate initial forces acting on teeth around the arch during en masse retraction using an in vitro Orthodontic SIMulator (OSIM). The OSIM was used to represent the full maxillary arch in a case in which both first premolars had been extracted. Dental anchorage, skeletal anchorage to a posted archwire, and skeletal anchorage to a 10-mm power arm were all simulated. A 0.019 × 0.025-inch stainless steel archwire was used in all cases, and 15-mm light nickel-titanium springs were activated to approximately 150 g on both sides of the arch. A sample of n = 40 springs was tested for each of the three groups. Multivariate analysis of variance (α = 0.05) was used to determine differences between treatment groups. In the anterior segment, skeletal anchorage with power arms generated the largest retraction force (P < .001). The largest vertical forces on the anterior unit were generated using skeletal anchorage, followed by skeletal anchorage with power arms, and finally dental anchorage. Power arms generated larger intrusive forces on the lateral incisors and extrusive forces on the canines than the other groups. For the posterior anchorage unit, dental anchorage generated the largest protraction and palatal forces; negligible forces were measured for both skeletal anchorage groups. Vertical forces on the posterior unit were minimal in all cases (<0.1 N). All retraction methods produced sufficient forces to retract the anterior teeth during en masse retraction. Skeletal anchorage reduced forces on the posterior teeth but introduced greater vertical forces on the anterior teeth.
2014-07-15
content typing, rep-PCR, pulsed-field gel electrophoresis, optical mapping, and antimicrobial susceptibility testing (G. Gault et al., 2011; P...Tremlett, G, Pidd, 2011). This case demonstrates the vulnerability of our food supply and why unusual outbreaks involving endemic microbes must be taken as... food products to malevolent tampering, and the widespread international economic consequences that can occur even from limited product contamination
Autonomous satellite navigation by stellar refraction
NASA Technical Reports Server (NTRS)
Gounley, R.; White, R.; Gai, E.
1983-01-01
This paper describes an error analysis of an autonomous navigator using refraction measurements of starlight passing through the upper atmosphere. The analysis is based on a discrete linear Kalman filter. The filter generated steady-state values of navigator performance for a variety of test cases. Results of these simulations show that in low-earth orbit position-error standard deviations of less than 0.100 km may be obtained using only 40 star sightings per orbit.
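The steady-state navigator performance quoted here comes from iterating the discrete Kalman covariance (Riccati) recursion until it converges. A scalar sketch of that iteration under assumed process and measurement noises (the numbers are hypothetical, not the navigator's model):

```python
def steady_state_variance(q, r, phi=1.0, h=1.0, tol=1e-12, max_iter=100000):
    """Iterate the scalar discrete Riccati recursion until the updated
    error variance P stops changing:
        P_pred = phi^2 * P + q          (time update)
        K      = P_pred*h / (h^2*P_pred + r)   (Kalman gain)
        P      = (1 - K*h) * P_pred     (measurement update)
    q: process-noise variance, r: measurement-noise variance."""
    p = q
    for _ in range(max_iter):
        p_pred = phi * phi * p + q
        k = p_pred * h / (h * h * p_pred + r)
        p_new = (1.0 - k * h) * p_pred
        if abs(p_new - p) < tol:
            return p_new
        p = p_new
    return p
```

For phi = h = 1 the fixed point solves P² + qP − qr = 0, giving a quick closed-form check on the iteration.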
Recent Developments in OVERGRID, OVERFLOW-2 and Chimera Grid Tools Scripts
NASA Technical Reports Server (NTRS)
Chan, William M.
2004-01-01
OVERGRID and OVERFLOW-2 feature easy to use multiple-body dynamics. The new features of OVERGRID include a preliminary chemistry interface, standard atmosphere and mass properties calculators, a simple unsteady solution viewer, and a debris tracking interface. Script library development in Chimera Grid Tools has applications in turbopump grid generation. This viewgraph presentation profiles multiple component dynamics, validation test cases for a sphere, cylinder, and oscillating airfoil, and debris analysis.
Experimental Testing of a Van De Graaff Generator as an Electromagnetic Pulse Generator
2016-07-01
AFIT-ENP-MS-16-S-075
West, Allison H; Blazer, Kathleen R; Stoll, Jessica; Jones, Matthew; Weipert, Caroline M; Nielsen, Sarah M; Kupfer, Sonia S; Weitzel, Jeffrey N; Olopade, Olufunmilayo I
2018-02-14
Comprehensive genomic cancer risk assessment (GCRA) helps patients, family members, and providers make informed choices about cancer screening, surgical and chemotherapeutic risk reduction, and genetically targeted cancer therapies. The increasing availability of multigene panel tests for clinical applications allows testing of well-defined high-risk genes, as well as moderate-risk genes, for which the penetrance and spectrum of cancer risk are less well characterized. Moderate-risk genes are defined as genes that, when altered by a pathogenic variant, confer a two- to fivefold relative risk of cancer. Two such genes included on many comprehensive cancer panels are the DNA repair genes ATM and CHEK2, best known for moderately increased risk of breast cancer development. However, the impact of screening and preventative interventions and the spectrum of cancer risk beyond breast cancer associated with ATM and/or CHEK2 variants remain less well characterized. We convened a large, multidisciplinary, cross-sectional panel of GCRA clinicians to review challenging, peer-submitted cases of patients identified with ATM or CHEK2 variants. This paper summarizes the inter-professional case discussion and recommendations generated during the session, the level of concordance with respect to recommendations between the academic and community clinician participants for each case, and potential barriers to implementing recommended care in various practice settings.
Robust stochastic optimization for reservoir operation
NASA Astrophysics Data System (ADS)
Pan, Limeng; Housh, Mashor; Liu, Pan; Cai, Ximing; Chen, Xin
2015-01-01
Optimal reservoir operation under uncertainty is a challenging engineering problem. Application of classic stochastic optimization methods to large-scale problems is limited due to computational difficulty. Moreover, classic stochastic methods assume that the estimated distribution function or the sample inflow data accurately represents the true probability distribution, which may be invalid and the performance of the algorithms may be undermined. In this study, we introduce a robust optimization (RO) approach, Iterative Linear Decision Rule (ILDR), so as to provide a tractable approximation for a multiperiod hydropower generation problem. The proposed approach extends the existing LDR method by accommodating nonlinear objective functions. It also provides users with the flexibility of choosing the accuracy of ILDR approximations by assigning a desired number of piecewise linear segments to each uncertainty. The performance of the ILDR is compared with benchmark policies including the sampling stochastic dynamic programming (SSDP) policy derived from historical data. The ILDR solves both the single and multireservoir systems efficiently. The single reservoir case study results show that the RO method is as good as SSDP when implemented on the original historical inflows and it outperforms SSDP policy when tested on generated inflows with the same mean and covariance matrix as those in history. For the multireservoir case study, which considers water supply in addition to power generation, numerical results show that the proposed approach performs as well as in the single reservoir case study in terms of optimal value and distributional robustness.
Evaluation of near-critical overdamping effects in slug-test response
Weeks, Edwin P.; Clark, Arthur C.
2013-01-01
A slug test behaves as a harmonic oscillator, subject to both inertial effects and viscous damping. When viscous and inertial forces are closely balanced, the system is nearly critically damped, and water-level recovery is affected by inertial effects, but does not exhibit oscillation. These effects were investigated by use of type curves, generated both by modification of Kipp's (1985) computer program and by use of the Butler-Zhan (2004) model. Utility of the type curves was verified by re-analysis of the Regina slug test previously analyzed by Kipp. These type curves indicate that near-critical inertial effects result in early-time delayed water-level response followed by merger with, or more rapid recovery than, response for the fully damped case. Because of this early time response, slug tests in the moderately over-damped range are best analyzed using log-log type curves of (1 − H/H0) vs. Tt/. Failure to recognize inertial effects in slug test data could result in an over-estimate of transmissivity, and a too-small estimate of storage coefficient or too-large estimate of well skin. However, application of the widely used but highly empirical Hvorslev (1951) method to analyze both the Regina slug test and type-curve generated data indicate that such analyses provide T values within a factor of 2 of the true value.
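The oscillator picture behind these type curves can be illustrated with the normalized response of a damped harmonic oscillator released from rest at an initial displacement H0. This is a generic sketch of the underdamped/overdamped behavior, not Kipp's or the Butler-Zhan solution; omega_n and zeta are abstract parameters here.

```python
import math

def normalized_head(t, omega_n, zeta):
    """Normalized response H/H0 of a damped harmonic oscillator released
    from H0 at rest. zeta < 1: oscillatory recovery; zeta > 1: overdamped
    monotonic recovery; zeta = 1: critically damped boundary case."""
    if zeta < 1.0:
        wd = omega_n * math.sqrt(1.0 - zeta * zeta)   # damped frequency
        return math.exp(-zeta * omega_n * t) * (
            math.cos(wd * t) + (zeta * omega_n / wd) * math.sin(wd * t))
    if zeta == 1.0:
        return math.exp(-omega_n * t) * (1.0 + omega_n * t)
    s = omega_n * math.sqrt(zeta * zeta - 1.0)
    a = zeta * omega_n
    return math.exp(-a * t) * (math.cosh(s * t) + (a / s) * math.sinh(s * t))
```

Near zeta = 1 the branches are hard to distinguish at early time, which is one reason log-log plots of (1 − H/H0) are useful for the moderately overdamped range discussed above.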
Design of Test Support Hardware for Advanced Space Suits
NASA Technical Reports Server (NTRS)
Watters, Jeffrey A.; Rhodes, Richard
2013-01-01
As a member of the Space Suit Assembly Development Engineering Team, I designed and built test equipment systems to support the development of the next generation of advanced space suits. During space suit testing it is critical to provide the subject with two functions: (1) cooling to remove metabolic heat, and (2) breathing air to pressurize the space suit. The objective of my first project was to design, build, and certify an improved Space Suit Cooling System for manned testing in a 1-G environment. This design had to be portable and supply a minimum cooling rate of 2500 BTU/hr. The Space Suit Cooling System is a robust, portable system that supports very high metabolic rates. It has a highly adjustable cooling rate and is equipped with digital instrumentation to monitor the flowrate and critical temperatures. It can supply a variable water temperature down to 34 deg., and it can generate a maximum water flowrate of 2.5 LPM. My next project was to design and build a Breathing Air System capable of supplying facility air to subjects wearing the Z-2 space suit. The system takes in 150 PSIG breathing air and regulates it to two operating pressures: 4.3 and 8.3 PSIG. It can also provide structural capability at 1.5x operating pressure: 6.6 and 13.2 PSIG, respectively. It has instrumentation to monitor flowrate, as well as inlet and outlet pressures. The system has a series of relief valves to fully protect itself in case of regulator failure. Both projects followed a similar design methodology. The first task was to research existing concepts to develop sufficient background knowledge. Mathematical models were then developed to size components and simulate system performance. Next, mechanical and electrical schematics were generated and presented at Design Reviews. After the systems were approved by the suit team, all hardware components were specified and procured. The systems were then packaged, fabricated, and thoroughly tested.
The next step was to certify the equipment for manned use, which included generating a Hazard Analysis and giving a presentation to the Test Readiness Review Board. Both of these test support systems will perform critical roles in the development of next-generation space suits. They will be used on a regular basis to test NASA's new Z-2 Space Suit. The Space Suit Cooling System is now the primary cooling system for all advanced suit tests.
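As a rough back-of-the-envelope check on the figures quoted above (an illustration, not part of the report), the coolant temperature rise needed to carry the stated 2500 BTU/hr minimum cooling rate at the stated 2.5 LPM maximum water flowrate follows from Q = m_dot * cp * dT:

```python
# Sanity check of the quoted specs: what water temperature rise carries
# 2500 BTU/hr at 2.5 LPM? (Illustrative arithmetic, not from the report.)
BTU_PER_HR_TO_W = 0.29307107         # unit conversion factor
Q = 2500 * BTU_PER_HR_TO_W           # heat load in watts (~733 W)
m_dot = 2.5 / 60.0                   # 2.5 LPM of water in kg/s (~1 kg per litre)
cp = 4186.0                          # specific heat of water, J/(kg*K)
dT = Q / (m_dot * cp)                # required temperature rise across the loop
print(f"required rise: {dT:.1f} K")  # ~4.2 K
```

A rise of only a few kelvin at full flow is consistent with the system's emphasis on a low supply water temperature and a high, adjustable flowrate.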
Design and Manufacture of a Highly Reliable, Miniaturized and Low Mass Shutter Mechanism
NASA Technical Reports Server (NTRS)
Manhart, M.; Zeh, T.; Preibler, G.; Hurni, A.; Walter, I.; Helbert, J.; Hiesinger, H.
2010-01-01
This paper describes the development, manufacturing and testing of a lightweight shutter mechanism made of titanium for the MERTIS Instrument. MERTIS is a thermal infrared imaging spectrometer onboard ESA's future BepiColombo mission to Mercury. The mechanism is built as a parallelogram arrangement of flexible hinges, actuated by a voice coil. In a first test run, it was shown that the selected EDM processing led to the generation of titanium oxides and an oxygen-enriched surface layer on the substrate (the so-called alpha-case layer). In the revised version of the shutter, it was possible to manufacture the complex geometry by micro-milling and a subsequent pickling procedure. The adequacy of this approach was verified by lifetime and vibration testing.
Lightweight engine containment. [Kevlar shielding
NASA Technical Reports Server (NTRS)
Weaver, A. T.
1977-01-01
Kevlar fabric styles and weaves were studied, as well as methods of application for advanced gas turbine engines. The Kevlar material was subjected to high speed impacts by simple projectiles fired from a rifle, as well as more complex shapes such as fan blades released from gas turbine rotors in a spin pit. Just-contained data were developed for a variety of weave and/or application techniques, and a comparative containment weight efficiency was established for Kevlar containment applications. The data generated during these tests are being incorporated into an analytical design system so that blade containment trade-off studies between Kevlar and metal case engine structures can be made. Laboratory tests and engine environment tests were performed to determine the survivability of Kevlar in a gas turbine environment.
Primary growth hormone insensitivity (Laron syndrome) and acquired hypothyroidism: a case report
2011-01-01
Introduction Primary growth hormone resistance or growth hormone insensitivity syndrome, also known as Laron syndrome, is a hereditary disease caused by deletions or different types of mutations in the growth hormone receptor gene or by post-receptor defects. This disorder is characterized by a clinical appearance of severe growth hormone deficiency with high levels of circulating growth hormone in contrast to low serum insulin-like growth factor 1 values. Case presentation We report the case of a 15-year-old Caucasian girl who was diagnosed with Silver-Russell syndrome at the age of four and a half years. Recombinant growth hormone was administered for 18 months without an appropriate increase in growth velocity. At the age of seven years, her serum growth hormone levels were high, and an insulin-like growth factor 1 generation test did not increase insulin-like growth factor 1 levels (baseline insulin-like growth factor 1 levels, 52 μg/L; reference range, 75 μg/L to 365 μg/L; and peak, 76 μg/L and 50 μg/L after 12 and 84 hours, respectively, from baseline). The genetic analysis showed that the patient was homozygous for the R217X mutation in the growth hormone receptor gene, which is characteristic of Laron syndrome. On the basis of these results, the diagnosis of primary growth hormone insensitivity syndrome was made, and recombinant insulin-like growth factor 1 therapy was initiated. The patient's treatment was well tolerated, but unexplained central hypothyroidism occurred at the age of 12.9 years. At the age of 15 years, when the patient's sexual development was almost completed and her menstrual cycle occurred irregularly, her height was 129.8 cm, which is 4.71 standard deviations below the median for normal girls her age. Conclusion The most important functional tests for the diagnosis of growth hormone insensitivity are the insulin-like growth factor 1 generation test and genetic analysis. 
Currently, the only effective treatment is daily administration of recombinant insulin-like growth factor 1 starting from early childhood. However, these patients show a dramatically impaired final height. In our case, unexplained central hypothyroidism occurred during treatment. PMID:21745362
77 FR 11421 - Airworthiness Directives; Pratt & Whitney Canada, Auxiliary Power Units
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-27
... separation of the rear gas generator case and release of high energy debris. This proposed AD would require modifications of the rear gas generator case, exhaust duct support, and turbine exhaust duct flanges. We are proposing this AD to prevent separation of the rear gas generator case and release of high energy debris...
Chandrasekharan, Subhashini; McGuire, Amy L.; Van den Veyver, Ignatia B.
2015-01-01
Thousands of patents have been awarded that claim human gene sequences and their uses, and some have been challenged in court. In a recent high-profile case, Association for Molecular Pathology, et al. vs. Myriad Genetics, Inc., et al., the United States Supreme Court ruled that genes are naturally occurring substances and therefore not patentable through “composition of matter” claims. The consequences of this ruling will extend well beyond ending Myriad's monopoly over BRCA testing, and may affect similar monopolies of other commercial laboratories for tests involving other genes. It could also simplify intellectual property issues surrounding genome-wide clinical sequencing, which can generate results for genes covered by intellectual property. Non-invasive prenatal testing (NIPT) for common aneuploidies using cell-free fetal (cff) DNA in maternal blood is currently offered through commercial laboratories and is also the subject of ongoing patent litigation. The recent Supreme Court decision in the Myriad case has already been invoked by a lower district court in NIPT litigation and resulted in invalidation of primary claims in a patent on currently marketed cffDNA-based testing for chromosomal aneuploidies. PMID:24989832
Studies of aerothermal loads generated in regions of shock/shock interaction in hypersonic flow
NASA Technical Reports Server (NTRS)
Holden, Michael S.; Moselle, John R.; Lee, Jinho
1991-01-01
Experimental studies were conducted to examine the aerothermal characteristics of shock/shock/boundary layer interaction regions generated by single and multiple incident shocks. The experimental studies were conducted over a Mach number range from 6 to 19 for a range of Reynolds numbers to obtain both laminar and turbulent interaction regions. Detailed heat transfer and pressure measurements were made for a range of interaction types and incident shock strengths over a transverse cylinder, with emphasis on the Type III and Type IV interaction regions. The measurements were compared with the simple Edney, Keyes, and Hains models for a range of interaction configurations and freestream conditions. The complex flowfields and aerothermal loads generated by multiple-shock impingement, while not generating as large peak loads, provide important test cases for code prediction. The detailed heat transfer and pressure measurements provided a good basis for evaluating the accuracy of simple prediction methods and detailed numerical solutions for laminar and transitional regions of shock/shock interactions.
Automated, Parametric Geometry Modeling and Grid Generation for Turbomachinery Applications
NASA Technical Reports Server (NTRS)
Harrand, Vincent J.; Uchitel, Vadim G.; Whitmire, John B.
2000-01-01
The objective of this Phase I project is to develop a highly automated software system for rapid geometry modeling and grid generation for turbomachinery applications. The proposed system features a graphical user interface for interactive control, a direct interface to commercial CAD/PDM systems, support for IGES geometry output, and a scripting capability for obtaining a high level of automation and end-user customization of the tool. The developed system is fully parametric and highly automated, and, therefore, significantly reduces the turnaround time for 3D geometry modeling, grid generation and model setup. This facilitates design environments in which a large number of cases need to be generated, such as for parametric analysis and design optimization of turbomachinery equipment. In Phase I we have successfully demonstrated the feasibility of the approach. The system has been tested on a wide variety of turbomachinery geometries, including several impellers and a multi-stage rotor-stator combination. In Phase II, we plan to integrate the developed system with turbomachinery design software and with commercial CAD/PDM software.
NASA Astrophysics Data System (ADS)
Bouley, Simon; François, Benjamin; Roger, Michel; Posson, Hélène; Moreau, Stéphane
2017-09-01
The present work deals with the analytical modeling of two aspects of outlet guide vane aeroacoustics in axial-flow fan and compressor rotor-stator stages. The first addressed mechanism is the downstream transmission of rotor noise through the outlet guide vanes, the second one is the sound generation by the impingement of the rotor wakes on the vanes. The elementary prescribed excitation of the stator is an acoustic wave in the first case and a hydrodynamic gust in the second case. The solution for the response of the stator is derived using the same unified approach in both cases, within the scope of a linearized and compressible inviscid theory. It is provided by a mode-matching technique: modal expressions are written in the various sub-domains upstream and downstream of the stator as well as inside the inter-vane channels, and matched according to the conservation laws of fluid dynamics. This quite simple approach is uniformly valid in the whole range of subsonic Mach numbers and frequencies. It is presented for a two-dimensional rectilinear-cascade of zero-staggered flat-plate vanes and completed by the implementation of a Kutta condition. It is then validated in sound generation and transmission test cases by comparing with a previously reported model based on the Wiener-Hopf technique and with reference numerical simulations. Finally it is used to analyze the tonal rotor-stator interaction noise in a typical low-speed fan architecture. The interest of the mode-matching technique is that it could be easily transposed to a three-dimensional annular cascade in cylindrical coordinates in a future work. This makes it an attractive alternative to the classical strip-theory approach.
Yoon, Jung-Ro; Yang, Jae-Hyuk
2018-03-20
The purpose of this retrospective study was to analyze and compare the clinical and radiologic outcomes of fixed bearing ultracongruent (UC) insert total knee arthroplasty (TKA) and mobile bearing (MB) floating platform TKA using the navigation-assisted gap balancing technique with a minimum follow-up of five years. The study retrospectively enrolled 105 patients who received the UC type fixed bearing insert (group 1) and 95 patients who received the floating platform MB insert (group 2) during the period from August 2009 to June 2012. All surgery was performed using the navigation-assisted gap balancing technique. For strict assessment of gap measurements, the offset-type-force-controlled-spreader-system was used. Radiologic and clinical outcomes were assessed before operation and at the most recent follow-up using the Knee Society Score (KSS) and the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) score. For statistical analysis, paired sample t tests were used. A p value less than 0.05 was considered significant. Although the radiologic alignments were satisfactory for both groups (99/105 [94%] cases were neutral for group 1 and 90/95 [94%] for group 2), the functional and total WOMAC scores were inferior in group 2 (p < 0.05). There were two cases of insert breakage in group 2 that required bearing exchange. The Kaplan-Meier survivorship rates for groups 1 and 2 at 77 months were 100.0 and 97.9%, respectively. Second-generation MB floating platform TKA cases did not have satisfactory outcomes. There were two cases of insert breakage, which required bearing exchange. Other patients who underwent surgery with second-generation MB floating platform were encouraged to avoid high knee flexion activities, resulting in lower clinical performance.
Sass, Chodon; Little, Damon P.; Stevenson, Dennis Wm.; Specht, Chelsea D.
2007-01-01
Barcodes are short segments of DNA that can be used to uniquely identify an unknown specimen to species, particularly when diagnostic morphological features are absent. These sequences could offer a new forensic tool in plant and animal conservation—especially for endangered species such as members of the Cycadales. Ideally, barcodes could be used to positively identify illegally obtained material even in cases where diagnostic features have been purposefully removed or to release confiscated organisms into the proper breeding population. In order to be useful, a DNA barcode sequence must not only easily PCR amplify with universal or near-universal reaction conditions and primers, but also contain enough variation to generate unique identifiers at either the species or population levels. Chloroplast regions suggested by the Plant Working Group of the Consortium for the Barcode of Life (CBoL), and two alternatives, the chloroplast psbA-trnH intergenic spacer and the nuclear ribosomal internal transcribed spacer (nrITS), were tested for their utility in generating unique identifiers for members of the Cycadales. Ease of amplification and sequence generation with universal primers and reaction conditions was determined for each of the seven proposed markers. While none of the proposed markers provided unique identifiers for all species tested, nrITS showed the most promise in terms of variability, although sequencing difficulties remain a drawback. We suggest a workflow for DNA barcoding, including database generation and management, which will ultimately be necessary if we are to succeed in establishing a universal DNA barcode for plants. PMID:17987130
Yeo, Lami; Romero, Roberto
2013-09-01
To describe a novel method (Fetal Intelligent Navigation Echocardiography (FINE)) for visualization of standard fetal echocardiography views from volume datasets obtained with spatiotemporal image correlation (STIC) and application of 'intelligent navigation' technology. We developed a method to: 1) demonstrate nine cardiac diagnostic planes; and 2) spontaneously navigate the anatomy surrounding each of the nine cardiac diagnostic planes (Virtual Intelligent Sonographer Assistance (VIS-Assistance®)). The method consists of marking seven anatomical structures of the fetal heart. The following echocardiography views are then automatically generated: 1) four chamber; 2) five chamber; 3) left ventricular outflow tract; 4) short-axis view of great vessels/right ventricular outflow tract; 5) three vessels and trachea; 6) abdomen/stomach; 7) ductal arch; 8) aortic arch; and 9) superior and inferior vena cava. The FINE method was tested in a separate set of 50 STIC volumes of normal hearts (18.6-37.2 weeks of gestation), and visualization rates for fetal echocardiography views using diagnostic planes and/or VIS-Assistance® were calculated. To examine the feasibility of identifying abnormal cardiac anatomy, we tested the method in four cases with proven congenital heart defects (coarctation of aorta, tetralogy of Fallot, transposition of great vessels and pulmonary atresia with intact ventricular septum). In normal cases, the FINE method was able to generate nine fetal echocardiography views using: 1) diagnostic planes in 78-100% of cases; 2) VIS-Assistance® in 98-100% of cases; and 3) a combination of diagnostic planes and/or VIS-Assistance® in 98-100% of cases. In all four abnormal cases, the FINE method demonstrated evidence of abnormal fetal cardiac anatomy. The FINE method can be used to visualize nine standard fetal echocardiography views in normal hearts by applying 'intelligent navigation' technology to STIC volume datasets. 
This method can simplify examination of the fetal heart and reduce operator dependency. The observation of abnormal echocardiography views in the diagnostic planes and/or VIS-Assistance® should raise the index of suspicion for congenital heart disease. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
Smart-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnan, Venkat K; Palmintier, Bryan S; Hodge, Brian S
The National Renewable Energy Laboratory (NREL) in collaboration with Massachusetts Institute of Technology (MIT), Universidad Pontificia Comillas (Comillas-IIT, Spain) and GE Grid Solutions, is working on an ARPA-E GRID DATA project, titled Smart-DS, to create: 1) High-quality, realistic, synthetic distribution network models, and 2) Advanced tools for automated scenario generation based on high-resolution weather data and generation growth projections. Through these advancements, the Smart-DS project is envisioned to accelerate the development, testing, and adoption of advanced algorithms, approaches, and technologies for sustainable and resilient electric power systems, especially in the realm of U.S. distribution systems. This talk will present the goals and overall approach of the Smart-DS project, including the process of creating the synthetic distribution datasets using the reference network model (RNM) and the comprehensive validation process to ensure network realism, feasibility, and applicability to advanced use cases. The talk will provide demonstrations of early versions of synthetic models, along with the lessons learnt from expert engagements to enhance future iterations. Finally, the scenario generation framework, its development plans, and co-ordination with GRID DATA repository teams to house these datasets for public access will also be discussed.
Armanini, D G; Monk, W A; Carter, L; Cote, D; Baird, D J
2013-08-01
Evaluation of the ecological status of river sites in Canada is supported by building models using the reference condition approach. However, geography, data scarcity and inter-operability constraints have frustrated attempts to monitor national-scale status and trends. This problem is particularly acute in Atlantic Canada, where no ecological assessment system is currently available. Here, we present a reference condition model based on the River Invertebrate Prediction and Classification System approach with regional-scale applicability. To achieve this, we used biological monitoring data collected from wadeable streams across Atlantic Canada together with freely available, nationally consistent geographic information system (GIS) environmental data layers. For the first time, we demonstrated that it is possible to use data generated from different studies, even when collected using different sampling methods, to generate a robust predictive model. This model was successfully generated and tested using GIS-based rather than local habitat variables and showed improved performance when compared to a null model. In addition, ecological quality ratio data derived from the model responded to observed stressors in a test dataset. Implications for future large-scale implementation of river biomonitoring using a standardised approach with global application are presented.
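The reference condition approach described above scores a test site by comparing the invertebrate taxa observed there against those the predictive model expects under undisturbed conditions. A minimal sketch of that scoring step follows, with made-up taxa and probabilities; the actual model, predictors and thresholds differ:

```python
# Sketch of an O/E (observed over expected) ecological quality ratio, the
# kind of score a RIVPACS-style reference condition model produces.
# All data here are invented for illustration.
def quality_ratio(observed_taxa, expected_probs, p_threshold=0.5):
    """O/E ratio over taxa predicted with probability >= p_threshold."""
    expected = {t: p for t, p in expected_probs.items() if p >= p_threshold}
    e = sum(expected.values())                        # expected taxon count
    o = sum(1 for t in expected if t in observed_taxa)
    return o / e

# Hypothetical model output for one site, and the taxa actually sampled there.
expected_probs = {"Baetis": 0.9, "Hydropsyche": 0.8, "Gammarus": 0.7,
                  "Leuctra": 0.6, "Simulium": 0.3}
observed = {"Baetis", "Gammarus", "Simulium"}
print(round(quality_ratio(observed, expected_probs), 2))  # -> 0.67
```

Ratios near 1 indicate a reference-like community; values well below 1 suggest taxa losses consistent with stress.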
Application of homomorphic signal processing to stress wave factor analysis
NASA Technical Reports Server (NTRS)
Karagulle, H.; Williams, J. H., Jr.; Lee, S. S.
1985-01-01
The stress wave factor (SWF) signal, which is the output of an ultrasonic testing system where the transmitting and receiving transducers are coupled to the same face of the test structure, is analyzed in the frequency domain. The SWF signal generated in an isotropic elastic plate is modelled as the superposition of successive reflections. The reflection which is generated by the stress waves which travel p times as a longitudinal (P) wave and s times as a shear (S) wave through the plate while reflecting back and forth between the bottom and top faces of the plate is designated as the reflection with p, s. Short-time portions of the SWF signal are considered for obtaining spectral information on individual reflections. If the significant reflections are not overlapped, the short-time Fourier analysis is used. A summary of the relevant points of homomorphic signal processing, which is also called cepstrum analysis, is given. Homomorphic signal processing is applied to short-time SWF signals to obtain estimates of the log spectra of individual reflections for cases in which the reflections are overlapped. Two typical SWF signals generated in aluminum plates (overlapping and non-overlapping reflections) are analyzed.
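The echo-separating property that motivates cepstrum analysis can be shown with a toy signal (a sketch, not the paper's implementation): a delayed, attenuated copy of a pulse adds a ripple to the log magnitude spectrum whose "quefrency" equals the echo lag, so the real cepstrum shows a peak there even when the reflections overlap in time:

```python
import numpy as np

# Toy illustration of homomorphic (cepstrum) processing: the real cepstrum
# of a pulse-plus-echo signal peaks at the echo lag. Signal parameters are
# invented, not taken from the SWF experiments.
fs = 1000                                  # sample rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
pulse = np.exp(-200 * t) * np.sin(2 * np.pi * 50 * t)  # toy transducer pulse
delay = 100                                # echo lag, samples
x = pulse.copy()
x[delay:] += 0.5 * pulse[:-delay]          # superpose an attenuated echo

# Real cepstrum: inverse FFT of the log magnitude spectrum.
spectrum = np.fft.fft(x)
cepstrum = np.fft.ifft(np.log(np.abs(spectrum) + 1e-12)).real

# The largest cepstral peak away from the origin sits at the echo lag.
peak = np.argmax(cepstrum[10:len(x) // 2]) + 10
print(peak)  # -> 100
```

In the SWF context, such cepstral peaks let the log spectra of individual overlapped reflections be estimated, which plain short-time Fourier analysis cannot do.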
McKerr, Caoimhe; Adak, Goutam K.; Nichols, Gordon; Gorton, Russell; Chalmers, Rachel M.; Kafatos, George; Cosford, Paul; Charlett, Andre; Reacher, Mark; Pollock, Kevin G.; Alexander, Claire L.; Morton, Stephen
2015-01-01
Background We report a widespread foodborne outbreak of Cryptosporidium parvum in England and Scotland in May 2012. Cases were more common in female adults, and had no history of foreign travel. Over 300 excess cases were identified during the period of the outbreak. Speciation and microbiological typing revealed the outbreak strain to be C. parvum gp60 subtype IIaA15G2R1. Methods Hypothesis generation questionnaires were administered and an unmatched case control study was undertaken to test the hypotheses raised. Cases and controls were interviewed by telephone. Controls were selected using sequential digit dialling. Information was gathered on demographics, foods consumed and retailers where foods were purchased. Results Seventy-four laboratory confirmed cases and 74 controls were included in analyses. Infection was found to be strongly associated with the consumption of pre-cut mixed salad leaves sold by a single retailer. This is the largest documented outbreak of cryptosporidiosis attributed to a food vehicle. PMID:26017538
Cecatti, J G; Costa, M L; Haddad, S M; Parpinelli, M A; Souza, J P; Sousa, M H; Surita, F G; Pinto E Silva, J L; Pacagnella, R C; Passini, R
2016-05-01
To identify cases of severe maternal morbidity (SMM) during pregnancy and childbirth, their characteristics, and to test the feasibility of scaling up World Health Organization criteria for identifying women at risk of a worse outcome. Multicentre cross-sectional study. Twenty-seven referral maternity hospitals from all regions of Brazil. Cases of SMM identified among 82 388 delivering women over a 1-year period. Prospective surveillance using the World Health Organization's criteria for potentially life-threatening conditions (PLTC) and maternal near-miss (MNM) identified and assessed cases with severe morbidity or death. Indicators of maternal morbidity and mortality; sociodemographic, clinical and obstetric characteristics; gestational and perinatal outcomes; main causes of morbidity and delays in care. Among 9555 cases of SMM, there were 140 deaths and 770 cases of MNM. The main determining cause of maternal complication was hypertensive disease. Criteria for MNM conditions were more frequent as the severity of the outcome increased, all combined in over 75% of maternal deaths. This study identified around 9.5% of MNM or death among all cases developing any severe maternal complication. Multicentre studies on surveillance of SMM, with organised collaboration and adequate study protocols can be successfully implemented, even in low-income and middle-income settings, generating important information on maternal health and care to be used to implement appropriate health policies and interventions. Surveillance of severe maternal morbidity was proved to be possible in a hospital network in Brazil. © 2015 Royal College of Obstetricians and Gynaecologists.
Validation of two (parametric vs non-parametric) daily weather generators
NASA Astrophysics Data System (ADS)
Dubrovsky, M.; Skalak, P.
2015-12-01
As the climate models (GCMs and RCMs) fail to satisfactorily reproduce the real-world surface weather regime, various statistical methods are applied to downscale GCM/RCM outputs into site-specific weather series. The stochastic weather generators are among the most favourite downscaling methods capable of producing realistic (observed-like) meteorological inputs for agrological, hydrological and other impact models used in assessing sensitivity of various ecosystems to climate change/variability. To name their advantages, the generators may (i) produce arbitrarily long multi-variate synthetic weather series representing both present and changed climates (in the latter case, the generators are commonly modified by GCM/RCM-based climate change scenarios), (ii) be run in various time steps and for multiple weather variables (the generators reproduce the correlations among variables), (iii) be interpolated (and run also for sites where no weather data are available to calibrate the generator). This contribution will compare two stochastic daily weather generators in terms of their ability to reproduce various features of the daily weather series. M&Rfi is a parametric generator: a Markov chain model is used to model precipitation occurrence, precipitation amount is modelled by the Gamma distribution, and the 1st order autoregressive model is used to generate non-precipitation surface weather variables. The non-parametric GoMeZ generator is based on the nearest neighbours resampling technique, making no assumption on the distribution of the variables being generated. Various settings of both weather generators will be assumed in the present validation tests. The generators will be validated in terms of (a) extreme temperature and precipitation characteristics (annual and 30-year extremes and maxima of duration of hot/cold/dry/wet spells); (b) selected validation statistics developed within the frame of the VALUE project.
The tests will be based on observational weather series from several European stations available from the ECA&D database. Acknowledgements: The weather generator is developed and validated within the frame of projects WG4VALUE (sponsored by the Ministry of Education, Youth and Sports of CR), and VALUE (COST ES 1102 action).
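A minimal sketch of the parametric scheme attributed to M&Rfi above: a first-order Markov chain for wet/dry occurrence, Gamma-distributed amounts on wet days, and an AR(1) process for a non-precipitation variable such as a temperature anomaly. All parameter values are illustrative, not calibrated to any station:

```python
import numpy as np

# Sketch of a parametric daily weather generator of the kind described
# above (Markov occurrence + Gamma amounts + AR(1) temperature anomaly).
# Parameters are invented for illustration, not fitted to observations.
rng = np.random.default_rng(42)

p_wd, p_ww = 0.25, 0.60        # P(wet | dry yesterday), P(wet | wet yesterday)
shape, scale = 0.8, 6.0        # Gamma parameters for wet-day amounts (mm)
phi, sigma = 0.7, 2.0          # AR(1) persistence and noise std (deg C)

n_days = 365
wet = np.zeros(n_days, dtype=bool)
precip = np.zeros(n_days)
temp = np.zeros(n_days)        # temperature anomaly series
for day in range(1, n_days):
    p_wet = p_ww if wet[day - 1] else p_wd
    wet[day] = rng.random() < p_wet
    if wet[day]:
        precip[day] = rng.gamma(shape, scale)
    temp[day] = phi * temp[day - 1] + rng.normal(0.0, sigma)

print(f"wet-day fraction: {wet.mean():.2f}, total precip: {precip.sum():.0f} mm")
```

Validation of a generator like this compares statistics of many such synthetic years (spell lengths, extremes, inter-variable correlations) against the observed series, which is exactly the kind of test the contribution describes.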
Thermal blanket metallic film groundstrap and second surface mirror vulnerability to arc discharges
NASA Technical Reports Server (NTRS)
Inouye, G. T.; Sanders, N. L.; Komatsu, G. K.; Valles, J. R.; Sellen, J. M., Jr.
1979-01-01
Available data on the geosynchronous orbit energetic plasma environment were examined, and a crude model was generated to permit an estimation to be made of the number of arc discharges per year to which a thermal blanket groundstrap would be subjected. Laboratory experiments and a survey of the literature on arc discharge characteristics were performed to define typical and worst case arc discharge current waveforms. In-air tests of different groundstrap configurations to a standardized test pulse were performed, and a wide variability of durability values was found. A groundstrap technique, not used thus far, was found to be far superior to the others.
Machado, A A; Martinez, R; Haikal, A A; Rodrigues da Silva, M C
2001-01-01
In occupational accidents involving health professionals handling potentially contaminated material, the decision to start or to continue prophylactic medication against infection by Human Immunodeficiency Virus (HIV) has been based on the ELISA test applied to a blood sample from the source patient. In order to rationalize the prophylactic use of antiretroviral agents, a rapid serologic diagnostic test of HIV infection was tested by the enzymatic immunoabsorption method (SUDS HIV 1+2, MUREX) and compared to conventional ELISA (Abbott HIV-1/HIV-2 3rd Generation plus EIA). A total of 592 cases of occupational accidents were recorded at the University Hospital of Ribeirão Preto from July 1998 to April 1999. Of these, 109 were simultaneously evaluated by the rapid test and by ELISA HIV. The rapid test was positive in three cases, which were confirmed by ELISA; in one case the result was inconclusive and was later found to be negative by ELISA. In the 106 accidents in which the rapid test was negative, no prophylactic medication was instituted, with an estimated reduction in costs of US$ 2,889.35. In addition to this advantage, the good correlation of the rapid test with ELISA, the shorter duration of stress and the absence of exposure of the health worker to the adverse effects of antiretroviral agents suggest the adoption of this test in Programs of Attention to Accidents with Potentially Contaminated Material.
Andrade, Edson de Oliveira; Andrade, Elizabeth Nogueira de; Gallo, José Hiran
2011-01-01
To present the experience of a health plan operator (Unimed-Manaus) in Manaus, Amazonas, Brazil, with the accreditation of imaging services and the demand induced by the supply of new services (Roemer's Law). This is a retrospective work studying a time series covering the period from January 1998 to June 2004, in which the computed tomography and the magnetic resonance imaging services were implemented as part of the services offered by that health plan operator. Statistical analysis consisted of a descriptive and an inferential part, with the latter using parametric tests of means (Student's t-test and ANOVA) and the Pearson correlation test. A 5% alpha and a 95% confidence interval were adopted. At Unimed-Manaus, the supply of new imaging services, by itself, was identified as capable of generating an increased service demand, thus characterizing the phenomenon described by Roemer. The results underscore the need to be aware of the fact that the supply of new health services could bring about their increased use without a real demand.
Research on Generating Method of Embedded Software Test Document Based on Dynamic Model
NASA Astrophysics Data System (ADS)
Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying
2018-03-01
This paper provides a dynamic model-based test document generation method for embedded software that automatically generates two documents: the test requirements specification and the configuration item test document. The method allows dynamic test requirements to be expressed in dynamic models, so that traceability of dynamic test requirements can be generated easily. It automatically produces standardized test requirements and test documentation, mitigates inconsistency and incompleteness in document content, and improves documentation efficiency.
Standard Methods for Bolt-Bearing Testing of Textile Composites
NASA Technical Reports Server (NTRS)
Portanova, M. A.; Masters, J. E.
1995-01-01
The response of three 2-D braided materials to bolt bearing loading was evaluated using data generated by Boeing Defense and Space Group in Philadelphia, PA. Three test methods, stabilized single shear, unstabilized single shear, and double shear, were compared. In general, these textile composites were found to be sensitive to bolt bearing test methods. The stabilized single shear method yielded higher strengths than the unstabilized single shear method in all cases. The double shear test method always produced the highest strengths but these results may be somewhat misleading. It is therefore recommended that standard material comparisons be made using the stabilized single shear test method. The effects of two geometric parameters, W/D and e/D, were also studied. An evaluation of the effect of the specimen width (W) to hole diameter (D) ratio concluded that bolt bearing responses were consistent with open hole tension results. A W/D ratio of 6 or greater should be maintained. The proximity of the hole to the specimen edge significantly affected strength. In all cases, strength was improved by increasing the ratio of the distance from the hole center to the specimen edge (e) to the hole diameter (D) above 2. An e/D ratio of 3 or greater is recommended.
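The two recommended geometric limits (W/D of 6 or greater, e/D of 3 or greater) reduce to a simple ratio check. The sketch below uses hypothetical specimen dimensions, not data from the Boeing tests.

```python
def bearing_geometry(width_mm, edge_dist_mm, hole_dia_mm):
    """Check a bolt-bearing specimen against the minimum ratios
    recommended in the study: W/D >= 6 and e/D >= 3."""
    w_over_d = width_mm / hole_dia_mm
    e_over_d = edge_dist_mm / hole_dia_mm
    return w_over_d, e_over_d, (w_over_d >= 6.0 and e_over_d >= 3.0)

# Hypothetical specimen: 36 mm wide, 18 mm edge distance, 6 mm hole
w_over_d, e_over_d, ok = bearing_geometry(36.0, 18.0, 6.0)
```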
Reusable Solid Rocket Motor Nozzle Joint-4 Thermal Analysis
NASA Technical Reports Server (NTRS)
Clayton, J. Louie
2001-01-01
This study provides for development and test verification of a thermal model used for prediction of joint heating environments, structural temperatures and seal erosions in the Space Shuttle Reusable Solid Rocket Motor (RSRM) Nozzle Joint-4. The heating environments are a result of rapid pressurization of the joint free volume assuming a leak path has occurred in the filler material used for assembly gap close out. Combustion gases flow along the leak path from nozzle environment to joint O-ring gland resulting in local heating to the metal housing and erosion of seal materials. Analysis of this condition was based on usage of the NASA Joint Pressurization Routine (JPR) for environment determination and the Systems Improved Numerical Differencing Analyzer (SINDA) for structural temperature prediction. Model generated temperatures, pressures and seal erosions are compared to hot fire test data for several different leak path situations. Investigated in the hot fire test program were nozzle joint-4 O-ring erosion sensitivities to leak path width in both open and confined joint geometries. Model predictions were in generally good agreement with the test data for the confined leak path cases. Worst case flight predictions are provided using the test-calibrated model. Analysis issues are discussed based on model calibration procedures.
Scaled CMOS Technology Reliability Users Guide
NASA Technical Reports Server (NTRS)
White, Mark
2010-01-01
The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect to the high reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation.
A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is presented revealing a power relationship. General models describing the soft error rates across scaled product generations are presented. The analysis methodology may be applied to other scaled microelectronic products and their key parameters.
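The FIT/Gb normalization mentioned above is a standard calculation: failures per 10^9 device-hours, divided by the memory capacity per device. The sketch below uses hypothetical stress-test counts, not the study's data.

```python
def fit_per_gbit(failures, devices, hours, gbit_per_device):
    """Failures-In-Time (FIT) rate -- failures per 1e9 device-hours --
    normalized per gigabit of memory, as used to compare soft error
    rates across SDRAM generations."""
    fit = failures / (devices * hours) * 1e9
    return fit / gbit_per_device

# Hypothetical result: 3 retention soft errors in 200 one-gigabit
# devices over 1000 hours of accelerated stress testing
rate = fit_per_gbit(3, 200, 1000, 1.0)
```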
Demand side management in recycling and electricity retail pricing
NASA Astrophysics Data System (ADS)
Kazan, Osman
This dissertation addresses several problems from the recycling industry and the electricity retail market. The first paper addresses a real-life scheduling problem faced by a national industrial recycling company. Based on the company's practices, a scheduling problem is defined, modeled, analyzed, and a solution is approximated efficiently. The recommended approach is tested on real-life data and on randomly generated data. The scheduling improvements and the financial benefits are presented. The second problem comes from the electricity retail market. Daily electricity usage follows well-known hourly patterns, which change in shape and magnitude by season and day of the week. Generation costs are several times higher during the peak hours of the day, yet most consumers purchase electricity at flat rates. This work explores analytic pricing tools to reduce peak-load electricity demand for retailers. For that purpose, a nonlinear model that determines optimal hourly prices is established based on two major components: unit generation costs and consumers' utility. Both are analyzed and estimated empirically in the third paper. A pricing model is introduced to maximize the electricity retailer's profit. As a result, a closed-form expression for the optimal price vector is obtained. Possible scenarios are evaluated for the distribution of consumers' utility. For the general case, we provide a numerical solution methodology to obtain the optimal pricing scheme. The recommended models are tested under various scenarios that consider consumer segmentation and multiple pricing policies. The recommended model reduces the peak load significantly in most cases. Several utility companies offer hourly pricing to their customers, determining prices from historical data on unit electricity cost over time. In this dissertation we develop a nonlinear model that determines optimal hourly prices with parameter estimation.
The last paper includes a regression analysis of the unit generation cost function obtained from Independent Service Operators. A consumer experiment is established to replicate the peak load behavior. As a result, consumers' utility function is estimated and optimal retail electricity prices are computed.
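The abstract does not state the functional form of the pricing model, but the flavor of a closed-form optimal hourly price can be sketched under a strong simplifying assumption: linear hourly demand d(p) = a - b*p and a constant unit generation cost per hour. All parameter values below are hypothetical.

```python
def optimal_price(a, b, cost):
    """With assumed linear hourly demand d(p) = a - b*p and unit
    generation cost `cost`, retailer profit (p - cost) * d(p) is
    maximized at p* = (a/b + cost) / 2 (first-order condition)."""
    return (a / b + cost) / 2

# Hypothetical peak vs off-peak generation costs, same demand curve:
peak_price = optimal_price(100.0, 2.0, 30.0)     # costly peak hour
offpeak_price = optimal_price(100.0, 2.0, 10.0)  # cheap off-peak hour
```

The higher optimal price during the costly peak hour is what shifts consumption away from the peak, which is the demand-side-management effect the dissertation targets.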
Face and construct validation of a next generation virtual reality (Gen2-VR) surgical simulator.
Sankaranarayanan, Ganesh; Li, Baichun; Manser, Kelly; Jones, Stephanie B; Jones, Daniel B; Schwaitzberg, Steven; Cao, Caroline G L; De, Suvranu
2016-03-01
Surgical performance is affected by distractors and interruptions to surgical workflow that exist in the operating room. However, traditional surgical simulators are used to train surgeons in a skills laboratory that does not recreate these conditions. To overcome this limitation, we have developed a novel, immersive virtual reality (Gen2-VR) system to train surgeons in these environments. The aim of this study was to establish face and construct validity of our system. The study was a within-subjects design, with subjects repeating a virtual peg transfer task under three different conditions: Case I: traditional VR; Case II: Gen2-VR with no distractions and Case III: Gen2-VR with distractions and interruptions. In Case III, to simulate the effects of distractions and interruptions, music was played intermittently, the camera lens was fogged for 10 s and tools malfunctioned for 15 s at random points in time during the simulation. At the completion of the study subjects filled in a 5-point Likert scale feedback questionnaire. A total of sixteen subjects participated in this study. Friedman test showed significant difference in scores between the three conditions (p < 0.0001). Post hoc analysis using Wilcoxon signed-rank tests with Bonferroni correction further showed that all the three conditions were significantly different from each other (Case I, Case II, p < 0.0001), (Case I, Case III, p < 0.0001) and (Case II, Case III, p = 0.009). Subjects rated that fog (mean 4.18) and tool malfunction (median 4.56) significantly hindered their performance. The results showed that Gen2-VR simulator has both face and construct validity and that it can accurately and realistically present distractions and interruptions in a simulated OR, in spite of limitations of the current HMD hardware technology.
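The Bonferroni correction used in the post hoc analysis is simple to reproduce: with m pairwise comparisons, each raw p-value is compared against alpha/m. The sketch below applies it to the three reported pairwise p-values (using the stated bounds as values).

```python
def bonferroni_significant(p_values, alpha=0.05):
    """Bonferroni correction for m pairwise comparisons: each raw
    p-value must fall below alpha / m to remain significant."""
    threshold = alpha / len(p_values)
    return [p < threshold for p in p_values]

# Reported pairwise Wilcoxon p-values: I vs II, I vs III, II vs III
flags = bonferroni_significant([0.0001, 0.0001, 0.009])
```

With three comparisons the corrected threshold is 0.05/3 ≈ 0.0167, so even the largest reported value (0.009) stays significant, matching the paper's conclusion.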
Face and Construct Validation of a Next Generation Virtual Reality (Gen2-VR©) Surgical Simulator
Sankaranarayanan, Ganesh; Li, Baichun; Manser, Kelly; Jones, Stephanie B.; Jones, Daniel B.; Schwaitzberg, Steven; Cao, Caroline G. L.; De, Suvranu
2015-01-01
Introduction Surgical performance is affected by distractors and interruptions to surgical workflow that exist in the operating room. However, traditional surgical simulators are used to train surgeons in a skills lab that does not recreate these conditions. To overcome this limitation, we have developed a novel, immersive virtual reality (Gen2-VR©) system to train surgeons in these environments. The aim of this study was to establish face and construct validity of our system. Methods and Procedures The study was a within-subjects design, with subjects repeating a virtual peg transfer task under three different conditions: CASE I: traditional VR; CASE II: Gen2-VR© with no distractions and CASE III: Gen2-VR© with distractions and interruptions. In Case III, to simulate the effects of distractions and interruptions, music was played intermittently, the camera lens was fogged for 10 seconds and tools malfunctioned for 15 seconds at random points in time during the simulation. At the completion of the study subjects filled in a 5-point Likert scale feedback questionnaire. A total of sixteen subjects participated in this study. Results Friedman test showed significant difference in scores between the three conditions (p < 0.0001). Post hoc analysis using Wilcoxon Signed Rank tests with Bonferroni correction further showed that all the three conditions were significantly different from each other (Case I, Case II, p < 0.001), (Case I, Case III, p < 0.001) and (Case II, Case III, p = 0.009). Subjects rated that fog (mean = 4.18) and tool malfunction (median = 4.56) significantly hindered their performance. Conclusion The results showed that Gen2-VR© simulator has both face and construct validity and it can accurately and realistically present distractions and interruptions in a simulated OR, in spite of limitations of the current HMD hardware technology. PMID:26092010
NASA Technical Reports Server (NTRS)
Wheeler, J. T.
1990-01-01
The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1-alpha)100-percent two-sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of the three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in the goodness-of-fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.
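The two core quantities of the Weibull (power-law) process can be sketched directly: the expected cumulative failure count and the instantaneous MTBF, which is the reciprocal of the Weibull intensity. Parameter values below are illustrative, not engine data.

```python
def expected_failures(t, beta, eta):
    """Expected cumulative failures of a Weibull (power-law) process:
    N(t) = (t / eta) ** beta."""
    return (t / eta) ** beta

def current_mtbf(t, beta, eta):
    """Instantaneous MTBF: reciprocal of the Weibull intensity
    lambda(t) = (beta / eta) * (t / eta) ** (beta - 1)."""
    intensity = (beta / eta) * (t / eta) ** (beta - 1)
    return 1.0 / intensity
```

With beta = 1 the process reduces to a homogeneous Poisson process with constant MTBF eta; beta < 1 gives an MTBF that grows with accumulated test time, which is the reliability-growth case.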
NASA Astrophysics Data System (ADS)
Sandhu, Rajinder; Kaur, Jaspreet; Thapar, Vivek
2018-02-01
Dengue, also known as break-bone fever, is a tropical disease transmitted by mosquitoes. If the similarity between dengue-infected users can be identified, it can help government health agencies manage an outbreak more effectively. To find similarity between cases of dengue, a user's personal and health information are the two fundamental requirements. Identification of similar symptoms, causes, effects, predictions and treatment procedures is important. In this paper, an effective framework is proposed which finds similar patients suffering from dengue using a keyword-aware domain thesaurus and the case-based reasoning method. This paper focuses on the use of an ontology-dependent domain thesaurus technique to extract relevant keywords and then build cases with the help of the case-based reasoning method. Similar cases can be shared with users, nearby hospitals and health organizations to manage the problem more adequately. Two million case bases were generated to test the proposed similarity method. Experimental evaluations of the proposed framework resulted in high accuracy and a low error rate for finding similar cases of dengue compared to the UPCC and IPCC algorithms. The framework developed in this paper targets dengue but can easily be extended to other domains.
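A keyword-based case similarity of the kind described can be sketched as set overlap between the thesaurus keywords of two cases. Jaccard similarity here is a simple stand-in, not the paper's actual measure.

```python
def case_similarity(keywords_a, keywords_b):
    """Jaccard overlap between the extracted keyword sets of two
    dengue cases -- a simple stand-in for a keyword-aware
    case-based-reasoning similarity measure."""
    a, b = set(keywords_a), set(keywords_b)
    return len(a & b) / len(a | b)

# Two hypothetical symptom keyword sets
sim = case_similarity(["fever", "rash", "headache"],
                      ["fever", "rash", "nausea"])
```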
The CARE guidelines: consensus-based clinical case reporting guideline development
Gagnier, Joel J; Kienle, Gunver; Altman, Douglas G; Moher, David; Sox, Harold; Riley, David
2013-01-01
A case report is a narrative that describes, for medical, scientific or educational purposes, a medical problem experienced by one or more patients. Case reports written without guidance from reporting standards are insufficiently rigorous to guide clinical practice or to inform clinical study design. Our objective was to develop, disseminate and implement systematic reporting guidelines for case reports. We used a three-phase consensus process consisting of (1) premeeting literature review and interviews to generate items for the reporting guidelines, (2) a face-to-face consensus meeting to draft the reporting guidelines and (3) postmeeting feedback, review and pilot testing, followed by finalisation of the case report guidelines. This consensus process involved 27 participants and resulted in a 13-item checklist—a reporting guideline for case reports. The primary items of the checklist are title, key words, abstract, introduction, patient information, clinical findings, timeline, diagnostic assessment, therapeutic interventions, follow-up and outcomes, discussion, patient perspective and informed consent. We believe the implementation of the CARE (CAse REport) guidelines by medical journals will improve the completeness and transparency of published case reports and that the systematic aggregation of information from case reports will inform clinical study design, provide early signals of effectiveness and harms, and improve healthcare delivery. PMID:24155002
Development of the GPM Observatory Thermal Vacuum Test Model
NASA Technical Reports Server (NTRS)
Yang, Kan; Peabody, Hume
2012-01-01
A software-based thermal modeling process was documented for generating the thermal panel settings necessary to simulate worst-case on-orbit flight environments in an observatory-level thermal vacuum test setup. The method for creating such a thermal model involved four major steps: (1) determining the major thermal zones for test as indicated by the major dissipating components on the spacecraft, then mapping the major heat flows between these components; (2) finding the flight equivalent sink temperatures for these test thermal zones; (3) determining the thermal test ground support equipment (GSE) design and initial thermal panel settings based on the equivalent sink temperatures; and (4) adjusting the panel settings in the test model to match heat flows and temperatures with the flight model. The observatory test thermal model developed from this process allows quick predictions of the performance of the thermal vacuum test design. In this work, the method described above was applied to the Global Precipitation Measurement (GPM) core observatory spacecraft, a joint project between NASA and the Japanese Aerospace Exploration Agency (JAXA) which is currently being integrated at NASA Goddard Space Flight Center for launch in early 2014. From preliminary results, the thermal test model generated from this process shows that the heat flows and temperatures match fairly well with the flight thermal model, indicating that the test model can simulate the on-orbit conditions fairly accurately. However, further analysis is needed to determine the best test configuration possible to validate the GPM thermal design before the start of environmental testing later this year. Also, while this analysis method has been applied solely to GPM, it should be emphasized that the same process can be applied to any mission to develop an effective test setup and panel settings which accurately simulate on-orbit thermal environments.
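Step (2), the equivalent sink temperature, can be sketched for the radiation-only case: it is the single blackbody temperature that delivers the same net absorbed flux to a surface of given emissivity. The flux and emissivity values below are hypothetical, not GPM figures.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def equivalent_sink_temp(q_abs, emissivity):
    """Radiation-only equivalent sink temperature (K) for a surface
    absorbing net flux q_abs (W/m^2): T = (q_abs / (eps*sigma))**0.25."""
    return (q_abs / (emissivity * SIGMA)) ** 0.25

# Hypothetical zone: 200 W/m^2 net absorbed flux, emissivity 0.85
t_sink = equivalent_sink_temp(200.0, 0.85)
```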
Baum, Jordan E; Zhang, Pan; Hoda, Rana S; Geraghty, Brian; Rennert, Hanna; Narula, Navneet; Fernandes, Helen D
2017-06-01
Minimally invasive diagnostic procedures such as needle-core biopsy and fine-needle aspiration provide adequate material for molecular analyses. Advances in precision oncology are trending toward the interrogation of limited amounts of genomic material to guide clinical and therapeutic decisions. The aim of this study was to investigate the minimum cellularity needed on cytologic smears for the identification of clinically relevant variants with next-generation sequencing (NGS). Thirty cases of cytologically diagnosed, resection-proven primary lung adenocarcinoma were identified. Nineteen of the 30 cases were known to harbor actionable variants. One Diff-Quik (DQ)-stained slide and 1 Papanicolaou (Pap)-stained slide were selected from each case. Cases were categorized as containing fewer than 100 tumor cells, 100 to 500 tumor cells, or more than 500 tumor cells. NGS was performed on the Ion Torrent platform. NGS was successfully performed on all cell blocks and on 90% of the smears. Paired DQ and Pap smears showed similar cellularity, and cases that differed in cellularity were within 1 category of each other. The cases with more than 100 tumor cells had a 93% success rate; this was significantly different from the situation for cases with fewer than 100 tumor cells, which were successfully sequenced only 67% of the time. Overall, NGS was able to provide clinically relevant information for 83% of DQ smears and for 90% of Pap smears tested. The data show a significantly higher likelihood of successful NGS with cytologic smears with more than 100 tumor cells. There was a trend for a higher NGS success rate with Pap smears versus DQ smears. Cancer Cytopathol 2017;125:398-406. © 2017 American Cancer Society.
Characteristics of tuberculosis patients who generate secondary cases.
Rodrigo, T; Caylà, J A; García de Olalla, P; Galdós-Tangüis, H; Jansà, J M; Miranda, P; Brugal, T
1997-08-01
To determine the characteristics of smear positive tuberculosis (TB) patients who generate secondary TB cases. Those smear positive TB patients detected by the Barcelona Tuberculosis Program between 1990-1993, and for whom contact studies had been performed, were studied. We analyzed the predictive role of the variables: age, sex, intravenous drug use (IVDU), the presence of the acquired immune deficiency syndrome (AIDS), human immunodeficiency virus (HIV) infection, radiology pattern, district of residence, history of imprisonment, alcoholism, smoking, history of TB, treatment compliance and the number of secondary cases generated. Statistical analysis was based on the logistic regression model, calculating the odds ratios (OR) with 95% confidence intervals (CI). Of the 1079 patients studied, 78 (7.2%) had generated only one secondary case, and 30 (2.8%) two or more. The variables associated with generating two or more secondary cases were: IVDU (P < 0.001; OR = 4.06; CI: 1.80-9.15), cavitary radiology pattern (P = 0.002; OR = 3.69; CI: 1.62-8.43), and age (P = 0.016; OR = 0.98; CI: 0.96-0.99). When we examined those who had generated one or more secondary cases, the following variables were significant: IVDU (P = 0.043; OR = 1.75; CI: 1.02-3.02), cavitary radiology pattern (P < 0.001; OR = 3.07; CI: 1.98-4.77) and age (P < 0.001; OR = 0.98; CI: 0.97-0.99). The study of the contacts of smear positive TB patients allows us to detect an important number of secondary cases. Young adults, those with cavitary radiology pattern, and IVDU are more likely to generate secondary cases.
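The odds ratios with 95% confidence intervals reported above are the standard epidemiological quantities; for a single exposure they can be sketched from a 2x2 table using the Wald interval on the log-odds scale. The counts below are hypothetical, not the Barcelona cohort data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 exposure/outcome table [[a, b], [c, d]]
    with a Wald 95% confidence interval computed on the log scale --
    the same quantities (OR, 95% CI) reported in the study."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts: exposed generators, exposed non-generators,
# unexposed generators, unexposed non-generators
or_, lower, upper = odds_ratio_ci(10, 20, 5, 40)
```

Note the study's ORs come from a multivariable logistic regression, so they are adjusted; the 2x2 version above is the unadjusted analogue.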
Validation of CFD/Heat Transfer Software for Turbine Blade Analysis
NASA Technical Reports Server (NTRS)
Kiefer, Walter D.
2004-01-01
I am an intern in the Turbine Branch of the Turbomachinery and Propulsion Systems Division. The division is primarily concerned with experimental and computational methods of calculating heat transfer effects of turbine blades during operation in jet engines and land-based power systems. These include modeling flow in internal cooling passages and film cooling, as well as calculating heat flux and peak temperatures to ensure safe and efficient operation. The branch is research-oriented, emphasizing the development of tools that may be used by gas turbine designers in industry. The branch has been developing a computational fluid dynamics (CFD) and heat transfer code called GlennHT to achieve the computational end of this analysis. The code was originally written in FORTRAN 77 and run on Silicon Graphics machines. However, the code has been rewritten and compiled in FORTRAN 90 to take advantage of more modern computer memory systems. In addition, the branch has made a switch in system architectures from SGIs to Linux PCs. The newly modified code therefore needs to be tested and validated. This is the primary goal of my internship. To validate the GlennHT code, it must be run using benchmark fluid mechanics and heat transfer test cases, for which there are either analytical solutions or widely accepted experimental data. From the solutions generated by the code, comparisons can be made to the correct solutions to establish the accuracy of the code. To design and create these test cases, there are many steps and programs that must be used. Before a test case can be run, pre-processing steps must be accomplished. These include generating a grid to describe the geometry, using a software package called GridPro. Also, various files required by the GlennHT code must be created, including a boundary condition file, a file for multi-processor computing, and a file to describe problem and algorithm parameters.
A good deal of this internship will be devoted to becoming familiar with these programs and the structure of the GlennHT code. Additional information is included in the original extended abstract.
NASA Technical Reports Server (NTRS)
Tam, C. K. W.; Burton, D. E.
1984-01-01
An investigation is conducted of the phenomenon of sound generation by spatially growing instability waves in high-speed flows. It is pointed out that this process of noise generation is most effective when the flow is supersonic relative to the ambient speed of sound. The inner and outer asymptotic expansions corresponding to an excited instability wave in a two-dimensional mixing layer and its associated acoustic fields are constructed in terms of the inner and outer spatial variables. In matching the solutions, the intermediate matching principle of Van Dyke and Cole is followed. The validity of the theory is tested by applying it to an axisymmetric supersonic jet and comparing the calculated results with experimental measurements. Very favorable agreements are found both in the calculated instability-wave amplitude distribution (the inner solution) and the near pressure field level contours (the outer solution) in each case.
Virtual engine management simulator for educational purposes
NASA Astrophysics Data System (ADS)
Drosescu, R.
2017-10-01
This simulator was conceived as a software program capable of generating complex control signals identical to those in the electronic management systems of modern spark-ignition or diesel engines. Engine speed in rpm and engine load percentage, defined by the throttle opening angle, are the input variables of the simulation program and are entered graphically via two meter-style instruments on the simulator's central block diagram. The output signals are divided into four categories: synchronization and position of each cylinder, spark pulses for spark-ignition engines, injection pulses, and signals generating the knock window for each cylinder in the case of a spark-ignition engine. The simulation program runs in real time, so each signal's evolution reflects the real behavior of a physical engine. In this way, the generated signals (ignition or injection pulses) can be used with additional drivers to control an engine on the test bench.
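The timing arithmetic behind such spark/injection signals is straightforward to sketch. The firing interval follows from the four-stroke cycle (two crankshaft revolutions per cycle); the load-to-pulse-width map below is a purely illustrative linear assumption, not the simulator's calibration.

```python
def firing_interval_ms(rpm, n_cylinders=4):
    """Time between successive spark/injection events on a four-stroke
    engine: one cycle spans two crankshaft revolutions, shared evenly
    among the cylinders."""
    cycle_ms = 2 * 60000.0 / rpm  # two revolutions, in milliseconds
    return cycle_ms / n_cylinders

def injection_pulse_ms(load_pct, base_ms=2.0, span_ms=12.0):
    """Illustrative linear map from engine load (throttle %) to
    injector pulse width; base and span values are assumptions."""
    return base_ms + span_ms * load_pct / 100.0
```

At 3000 rpm a four-cylinder engine fires every 10 ms, which is why the signals must be generated in real time to drive hardware on a test bench.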
DOE Office of Scientific and Technical Information (OSTI.GOV)
van Driel, Tim Brandt; Herrmann, Sven; Carini, Gabriella
The pulsed free-electron laser light sources represent a new challenge to photon area detectors due to the intrinsic spontaneous X-ray photon generation process that makes single-pulse detection necessary. Intensity fluctuations of up to 100% between individual pulses lead to high linearity requirements in order to distinguish small signal changes. In real detectors, signal distortions as a function of the intensity distribution on the entire detector can occur. Here a robust method to correct this nonlinear response in an area detector is presented for the case of exposures to similar signals. The method is tested for the case of diffuse scattering from liquids, where relevant sub-1% signal changes appear on the same order as artifacts induced by the detector electronics.
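A nonlinearity correction of this general kind amounts to inverting a calibrated response curve. The sketch below assumes a quadratic response y = c1*x + c2*x^2 as a stand-in for the paper's calibration, and inverts it by Newton iteration; the coefficients are hypothetical.

```python
def correct_nonlinearity(measured, c1, c2, iterations=50):
    """Invert an assumed quadratic detector response
    y = c1*x + c2*x**2 by Newton iteration, recovering the true
    intensity x from the measured value y."""
    x = measured / c1  # initial guess: purely linear detector
    for _ in range(iterations):
        f = c1 * x + c2 * x * x - measured
        x -= f / (c1 + 2 * c2 * x)
    return x

# A detector that under-reports by ~1% at a true signal of 100 units
true_x = correct_nonlinearity(99.0, 1.0, -1e-4)
```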
Grazhdani, Dorina
2016-02-01
Economic development, urbanization, and improved living standards increase the quantity and complexity of generated solid waste. Comprehensive study of the variables influencing household solid waste production and recycling rate is crucial and fundamental for exploring the generation mechanism and forecasting future dynamics of household solid waste. The present study examines the case of Prespa Park. A model based on the interrelationships of economic, demographic, housing structure and waste management policy variables influencing the rate of solid waste generation and recycling is developed and employed. The empirical analysis is based on information derived from a field questionnaire survey conducted in Prespa Park villages for the year 2014. Another feature of this study is to test whether a household's waste generation can be decoupled from its population growth. Descriptive statistics, bivariate correlation analysis and F-tests are used to examine the relationships between variables. One-way and two-way fixed-effects model data analysis techniques are used to identify variables that determine the effectiveness of waste generation and recycling at the household level in the study area. The results reveal that households with heterogeneous characteristics, such as education level, mean building age and income, present different challenges for waste reduction goals. Numerically, an increase of 1% in the education level of the population corresponds to a waste reduction of 3 kg on an annual per-capita basis. A village with older buildings, in the case of one year above the median building age, corresponds to a waste generation increase of 12 kg.
Other economic and policy incentives such as the mean household income, pay-as-you-throw, the percentage of population with access to curbside recycling, the number of drop-off recycling facilities available per 1000 persons and cumulative expenditures on recycling education per capita are also found to be effective measures in waste reduction. The mean expenditure for recycling education per person for years 2010 and 2014 is 12 and 14 cents, respectively, varying from 0 to €1. For years 2010 and 2014, the mean percentage of population with access to curbside recycling services is 38.6% and 40.3%, and the mean number of drop-off recycling centers per 1000 persons in the population is 0.29 and 0.32, respectively. Empirical evidence suggests that population growth did not necessarily result in increases in waste generation. The results provided are useful when planning, changing or implementing sustainable municipal solid waste management. Copyright © 2015 Elsevier Ltd. All rights reserved.
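The two headline coefficients (-3 kg per +1 percentage point of education, +12 kg per +1 year of median building age) can be combined into a simple predicted-change sketch. This uses only the marginal effects as reported; it is not the full fixed-effects model.

```python
def predicted_waste_change(d_education_pct, d_building_age_yr,
                           coef_education=-3.0, coef_age=12.0):
    """Predicted change in annual per-capita waste (kg) from the two
    coefficients reported in the abstract: -3 kg per +1 percentage
    point of education, +12 kg per +1 year of median building age."""
    return (coef_education * d_education_pct
            + coef_age * d_building_age_yr)

# A village whose education level rises 2% while housing stock ages 1 year
change_kg = predicted_waste_change(2.0, 1.0)
```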
Integrated platform for optimized solar PV system design and engineering plan set generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adeyemo, Samuel
2015-12-30
The Aurora team has developed software that allows users to quickly generate a three-dimensional model of a building, with a corresponding irradiance map, from any two-dimensional image with associated geo-coordinates. The purpose of this project is to build upon that technology by developing and distributing to solar installers a software platform that automatically retrieves engineering, financial and geographic data for a specific site and quickly generates an optimal customer proposal and corresponding engineering plans for that site. At the end of the project, Aurora's optimization platform would have been used to make at least one thousand proposals from at least ten unique solar installation companies, two of whom would sign economically viable contracts to use the software. Furthermore, Aurora's algorithms would be tested to show that in at least seventy percent of cases, Aurora automatically generated a design equivalent to or better than what a human could have done manually. A 'better' design is one that generates more energy for the same cost, or that generates a higher return on investment, while complying with all site-specific aesthetic, electrical and spatial requirements.
Computational materials design of crystalline solids.
Butler, Keith T; Frost, Jarvist M; Skelton, Jonathan M; Svane, Katrine L; Walsh, Aron
2016-11-07
The modelling of materials properties and processes from first principles is becoming sufficiently accurate as to facilitate the design and testing of new systems in silico. Computational materials science is both valuable and increasingly necessary for developing novel functional materials and composites that meet the requirements of next-generation technology. A range of simulation techniques are being developed and applied to problems related to materials for energy generation, storage and conversion including solar cells, nuclear reactors, batteries, fuel cells, and catalytic systems. Such techniques may combine crystal-structure prediction (global optimisation), data mining (materials informatics) and high-throughput screening with elements of machine learning. We explore the development process associated with computational materials design, from setting the requirements and descriptors to the development and testing of new materials. As a case study, we critically review progress in the fields of thermoelectrics and photovoltaics, including the simulation of lattice thermal conductivity and the search for Pb-free hybrid halide perovskites. Finally, a number of universal chemical-design principles are advanced.
Integration of real-time non-surfactant emulsion fuel system on light duty lorry
NASA Astrophysics Data System (ADS)
Rashid, Muhammad Adib Abdul; Muhsin Ithnin, Ahmad; Jazair Yahya, Wira; Atiqah Ramlan, Nur; Aiyshah Mazlan, Nurul; Avianto Sugeng, Dhani
2017-10-01
Interest in water-in-diesel emulsion fuel (W/D) is growing because of its advantages in improving fuel efficiency, reducing greenhouse emissions and retaining the quality of the lubrication oil. Recently, a device called the Real-Time Non-Surfactant Emulsion Fuel System (RTES) has successfully created an emulsion without surfactant for a 5 kW single-cylinder diesel engine generator. This study integrates the RTES into a light duty lorry, and the effect of the integration is investigated. The lorry was tested on a chassis dynamometer with a controlled 16.6% water ratio. The results show that fuel consumption is reduced by 7.1% compared to neat diesel. Moreover, the exhaust emission of nitrogen oxides (NOx) is reduced by 52%, while, as observed in other works, carbon monoxide (CO) emission increased, in this case by 41.6%. The integration was concluded to retain benefits and disadvantages similar to those observed on the 5.5 kW diesel generator.
Krajden, Mel; Cook, Darrel; Mak, Annie; Chu, Ken; Chahil, Navdeep; Steinberg, Malcolm; Rekart, Michael; Gilbert, Mark
2014-09-01
We compared a 3rd generation (gen) and two 4th gen HIV enzyme immunoassays (EIA) to pooled nucleic acid testing (PNAT) for the identification of pre- and early seroconversion acute HIV infection (AHI). 9550 specimens from males >18 years from clinics attended by men who have sex with men were tested by Siemens ADVIA Centaur® HIV 1/O/2 (3rd gen) and HIV Combo (4th gen), as well as by Abbott ARCHITECT® HIV Ag/Ab Combo (4th gen). Third gen non-reactive specimens were also tested by Roche COBAS® AmpliPrep/COBAS® TaqMan HIV-1 Test v.2 in pools of 24 samples. Sensitivity and specificity of the three EIAs for AHI detection were compared. 7348 persons contributed 9435 specimens and had no evidence of HIV infection, 79 (94 specimens) had established HIV infection, 6 (9 specimens) had pre-seroconversion AHI and 9 (12 specimens) had early seroconversion AHI. Pre-seroconversion AHI cases were not detected by the 3rd gen EIA, whereas 2/6 (33.3%) were detected by the Siemens 4th gen, 4/6 (66.7%) by the Abbott 4th gen and 6/6 (100%) by PNAT. All three EIAs and PNAT detected all individuals with early seroconversion AHI. Overall sensitivity/specificity for the EIAs relative to Western blot (WB)- or NAT-resolved infection status was 93.6%/99.9% for the Siemens 3rd gen, 95.7%/99.7% for the Siemens 4th gen and 97.9%/99.2% for the Abbott 4th gen. While both 4th gen EIAs demonstrated improved sensitivity for AHI compared to the 3rd gen EIA, PNAT identified more AHI cases than either 4th gen assay. PNAT is likely to remain a useful strategy to identify AHI in high-risk populations. Copyright © 2014 Elsevier B.V. All rights reserved.
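The assay comparisons above reduce to standard 2x2 diagnostic-accuracy arithmetic. A minimal sketch; the counts below are illustrative placeholders, not taken from the study:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Standard 2x2 confusion-matrix metrics for a diagnostic assay."""
    sensitivity = tp / (tp + fn)   # true positives among all infected
    specificity = tn / (tn + fp)   # true negatives among all uninfected
    return sensitivity, specificity

# Illustrative (hypothetical) counts for an assay evaluated against a
# resolved reference standard.
sens, spec = sensitivity_specificity(tp=88, fn=6, tn=9426, fp=9)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```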
Thermal Model Predictions of Advanced Stirling Radioisotope Generator Performance
NASA Technical Reports Server (NTRS)
Wang, Xiao-Yen J.; Fabanich, William Anthony; Schmitz, Paul C.
2014-01-01
This paper presents recent thermal model results of the Advanced Stirling Radioisotope Generator (ASRG). The three-dimensional (3D) ASRG thermal power model was built using the Thermal Desktop(trademark) thermal analyzer. The model was correlated with ASRG engineering unit test data and ASRG flight unit predictions from Lockheed Martin's (LM's) I-deas(trademark) TMG thermal model. The auxiliary cooling system (ACS) of the ASRG is also included in the ASRG thermal model. The ACS is designed to remove waste heat from the ASRG so that it can be used to heat spacecraft components. The performance of the ACS is reported under nominal conditions and during a Venus flyby scenario. The results for the nominal case are validated with data from Lockheed Martin. Transient thermal analysis results of ASRG for a Venus flyby with a representative trajectory are also presented. In addition, model results of an ASRG mounted on a Cassini-like spacecraft with a sunshade are presented to show a way to mitigate the high temperatures of a Venus flyby. It was predicted that the sunshade can lower the temperature of the ASRG alternator by 20 C for the representative Venus flyby trajectory. The 3D model also was modified to predict generator performance after a single Advanced Stirling Convertor failure. The geometry of the Microtherm HT insulation block on the outboard side was modified to match deformation and shrinkage observed during testing of a prototypic ASRG test fixture by LM. Test conditions and test data were used to correlate the model by adjusting the thermal conductivity of the deformed insulation to match the post-heat-dump steady state temperatures. Results for these conditions showed that the performance of the still-functioning inboard ACS was unaffected.
NASA Astrophysics Data System (ADS)
Andersson, P.; Bjelkenstedt, T.; Sundén, E. Andersson; Sjöstrand, H.; Jacobsson-Svärd, S.
Detailed knowledge of the lateral distribution of steam (void) and water in a nuclear fuel assembly is of great value for nuclear reactor operators and fuel manufacturers, with consequences for both reactor safety and economy of operation. Therefore, nuclear-relevant two-phase flows are being studied at dedicated thermal-hydraulic test loops, using two-phase flow systems ranging from simplified geometries such as heated circular pipes to full-scale mock-ups of nuclear fuel assemblies. Neutron tomography (NT) has been suggested for assessment of the lateral distribution of steam and water in such test loops, motivated by the good ability of neutrons to penetrate the metallic structures of pipes and nuclear fuel rod mock-ups, as compared to e.g. conventional X-rays, while the liquid water simultaneously gives comparatively good contrast. However, these stationary test loops require the measurement setup to be mobile, which is often not the case for NT setups. Here, it is acknowledged that fast neutrons of 14 MeV from mobile neutron generators constitute a viable option for a mobile NT system. We present details of the development of neutron tomography for this purpose at the Division of Applied Nuclear Physics at Uppsala University. Our concept contains a portable neutron generator, exploiting the fusion reaction of deuterium and tritium, and a detector with plastic scintillator elements designed to achieve adequate spatial and energy resolution, all mounted in a light-weight frame without collimators or bulky moderation to allow for a mobile instrument that can be moved between the stationary thermal-hydraulic test sections. The detector system stores event-to-event pulse-height information to allow for discrimination based on the energy deposition in the scintillator elements.
A compendium of controlled diffusion blades generated by an automated inverse design procedure
NASA Technical Reports Server (NTRS)
Sanz, Jose M.
1989-01-01
A set of sample cases was produced to test an automated design procedure developed at the NASA Lewis Research Center for the design of controlled diffusion blades. The range of application of the automated design procedure is documented. The results presented include characteristic compressor and turbine blade sections produced with the automated design code as well as various other airfoils produced with the base design method prior to the incorporation of the automated procedure.
1991-03-01
test cases are gathered, studied, and evaluated; industry and other national European programs are studied; and experience is gained. This evolution ... application callable layer. The CGM Generator can be used to record device-independent picture descriptions, conceptually in parallel with the ...
What was your question again? Types of medical studies.
Jupiter, Daniel C
2014-01-01
Medical literature comes in all shapes and sizes, from animal studies to prospective clinical trials to retrospective chart reviews. In all of these cases, it is worth asking: What are the intellectual goals of the study: description, hypothesis testing, or hypothesis generating? This differentiation can help shape our view of the success or failure of the scientific effort. Copyright © 2014 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.
Radziuviene, Gedmante; Rasmusson, Allan; Augulis, Renaldas; Lesciute-Krilaviciene, Daiva; Laurinaviciene, Aida; Clim, Eduard; Laurinavicius, Arvydas
2017-01-01
Human epidermal growth factor receptor 2 gene- (HER2-) targeted therapy for breast cancer relies primarily on HER2 overexpression established by immunohistochemistry (IHC) with borderline cases being further tested for amplification by fluorescence in situ hybridization (FISH). Manual interpretation of HER2 FISH is based on a limited number of cells and rather complex definitions of equivocal, polysomic, and genetically heterogeneous (GH) cases. Image analysis (IA) can extract high-capacity data and potentially improve HER2 testing in borderline cases. We investigated statistically derived indicators of HER2 heterogeneity in HER2 FISH data obtained by automated IA of 50 IHC borderline (2+) cases of invasive ductal breast carcinoma. Overall, IA significantly underestimated the conventional HER2, CEP17 counts, and HER2/CEP17 ratio; however, it collected more amplified cells in some cases below the lower limit of GH definition by manual procedure. Indicators for amplification, polysomy, and bimodality were extracted by factor analysis and allowed clustering of the tumors into amplified, nonamplified, and equivocal/polysomy categories. The bimodality indicator provided independent cell diversity characteristics for all clusters. Tumors classified as bimodal only partially coincided with the conventional GH heterogeneity category. We conclude that automated high-capacity nonselective tumor cell assay can generate evidence-based HER2 intratumor heterogeneity indicators to refine GH definitions. PMID:28752092
Israeli, Eran; Ilan, Yaron; Meir, Shimon Bar; Buenavida, Claudia; Goldin, Eran
2003-08-01
The aim of this study is to determine the accuracy of a novel laptop-sized 13C-urea breath test analyzer that continuously measures expired breath, and to use its advantages to decrease testing time. One hundred and eighty-six subjects (mean age 47.8 years) were tested simultaneously by the BreathID system (Oridion, Israel) and by traditional isotope ratio mass spectrometry (IRMS). BreathID continuously measured the expired breath for the ratio of 13CO2:12CO2. This value was expressed as delta over baseline (DOB) and displayed graphically on a screen in real time. One hundred and one subjects were positive and 85 were negative for H. pylori by IRMS. The correlation for the BreathID system at 30 minutes was 100% for positive cases and 98% for negative cases. Analysis of the continuous curves generated by the BreathID for all patients permitted definition of different DOB thresholds for a positive or negative result at shorter time intervals. Thus, after 6 minutes a conclusive test result could be obtained for 64% of subjects, and after 10 minutes for 92% of subjects. The 13C-urea breath test utilizing the technology of molecular correlation spectrometry is an accurate method for determining infection by H. pylori. The advantage of continuous measurement can shorten testing time without compromising accuracy.
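The early-call logic described, in which time-specific DOB thresholds permit a conclusive result before the full 30 minutes, can be sketched as follows; the threshold values here are placeholders, not the cut-offs derived in the study:

```python
def earliest_call(dob_by_minute, thresholds):
    """Scan a continuously measured delta-over-baseline (DOB) curve and
    return the earliest conclusive call, if any.

    dob_by_minute: {minute: DOB value}
    thresholds: {minute: (negative_at_or_below, positive_at_or_above)};
    the actual study cut-offs are not reproduced here.
    """
    for minute in sorted(thresholds):
        if minute not in dob_by_minute:
            continue
        neg_below, pos_above = thresholds[minute]
        dob = dob_by_minute[minute]
        if dob >= pos_above:
            return minute, "positive"
        if dob <= neg_below:
            return minute, "negative"
    return None, "inconclusive"

# Illustrative thresholds and curve only.
thresholds = {6: (1.0, 8.0), 10: (1.5, 6.0), 30: (3.5, 3.5)}
curve = {6: 0.4, 10: 0.5, 30: 0.6}
print(earliest_call(curve, thresholds))  # (6, 'negative')
```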
Low frequency acoustic waves from explosive sources in the atmosphere
NASA Astrophysics Data System (ADS)
Millet, Christophe; Robinet, Jean-Christophe; Roblin, Camille; Gloerfelt, Xavier
2006-11-01
In this study, a perturbative formulation of the nonlinear Euler equations is used to compute the pressure variation for low frequency acoustic waves from explosive sources in real atmospheres. Based on a Dispersion-Relation-Preserving (DRP) finite difference scheme, the discretization provides good properties for both sound generation and long range sound propagation over a variety of spatial atmospheric scales. It also ensures that there is no wave mode coupling in the numerical simulation. The background flow is obtained by matching the comprehensive empirical global model of horizontal winds HWM-93 (and MSISE-90 for the temperature profile) with meteorological reanalyses of the lower atmosphere. Benchmark calculations representing cases where there is downward and upward refraction (including shadow zones), ducted propagation, and generation of acoustic waves from low speed shear layers are considered for validation. For all cases, results show very good agreement with analytical solutions, when available, and with other standard approaches, such as ray tracing and the normal mode technique. Comparison of calculations with experimental data from the high explosive ``Misty Picture'' test, which provided the scaled equivalent airblast of an 8 kt nuclear device (on May 14, 1987), is also considered. It is found that instability waves develop less than one hour after the wavefront generated by the detonation passes.
Use of statecharts in the modelling of dynamic behaviour in the ATLAS DAQ prototype-1
NASA Astrophysics Data System (ADS)
Croll, P.; Duval, P.-Y.; Jones, R.; Kolos, S.; Sari, R. F.; Wheeler, S.
1998-08-01
Many applications within the ATLAS DAQ prototype-1 system have complicated dynamic behaviour which can be successfully modelled in terms of states and transitions between states. Previously, state diagrams implemented as finite-state machines have been used. Although effective, they become ungainly as system size increases. Harel statecharts address this problem by implementing additional features such as hierarchy and concurrency. The CHSM object-oriented language system is freeware which implements Harel statecharts as concurrent, hierarchical, finite-state machines (CHSMs). An evaluation of this language system by the ATLAS DAQ group has shown it to be suitable for describing the dynamic behaviour of typical DAQ applications. The language is currently being used to model the dynamic behaviour of the prototype-1 run-control system. The design is specified by means of a CHSM description file, and C++ code is obtained by running the CHSM compiler on the file. In parallel with the modelling work, a code generator has been developed which translates statecharts, drawn using the StP CASE tool, into the CHSM language. C++ code, describing the dynamic behaviour of the run-control system, has been successfully generated directly from StP statecharts using the CHSM generator and compiler. The validity of the design was tested using the simulation features of the Statemate CASE tool.
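The states-and-transitions behaviour that CHSM compiles to C++ can be illustrated with a deliberately minimal, flat state machine; the state and event names below are hypothetical, loosely echoing a run-control cycle, and omit the hierarchy and concurrency that Harel statecharts add:

```python
class StateMachine:
    """Minimal flat finite-state machine; a stand-in for the richer
    hierarchical/concurrent statecharts the CHSM language generates."""

    # (current state, event) -> next state
    TRANSITIONS = {
        ("idle", "configure"): "configured",
        ("configured", "start"): "running",
        ("running", "stop"): "configured",
        ("configured", "reset"): "idle",
    }

    def __init__(self):
        self.state = "idle"

    def fire(self, event):
        key = (self.state, event)
        if key not in self.TRANSITIONS:
            raise ValueError(f"event {event!r} not allowed in state {self.state!r}")
        self.state = self.TRANSITIONS[key]
        return self.state

sm = StateMachine()
for ev in ("configure", "start", "stop", "reset"):
    sm.fire(ev)
print(sm.state)  # back to "idle"
```

A hierarchical implementation would replace the flat transition table with nested machines, which is precisely the ungainliness the CHSM language abstracts away.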
Umehara, Kensuke; Ota, Junko; Ishida, Takayuki
2017-10-18
In this study, the super-resolution convolutional neural network (SRCNN) scheme, which is the emerging deep-learning-based super-resolution method for enhancing image resolution in chest CT images, was applied and evaluated using the post-processing approach. For evaluation, 89 chest CT cases were sampled from The Cancer Imaging Archive. The 89 CT cases were divided randomly into 45 training cases and 44 external test cases. The SRCNN was trained using the training dataset. With the trained SRCNN, a high-resolution image was reconstructed from a low-resolution image, which was down-sampled from an original test image. For quantitative evaluation, two image quality metrics were measured and compared to those of the conventional linear interpolation methods. The image restoration quality of the SRCNN scheme was significantly higher than that of the linear interpolation methods (p < 0.001 or p < 0.05). The high-resolution image reconstructed by the SRCNN scheme was highly restored and comparable to the original reference image, in particular, for a ×2 magnification. These results indicate that the SRCNN scheme significantly outperforms the linear interpolation methods for enhancing image resolution in chest CT images. The results also suggest that SRCNN may become a potential solution for generating high-resolution CT images from standard CT images.
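Quantitative comparison of a reconstructed image against its reference typically uses metrics such as peak signal-to-noise ratio (PSNR). A self-contained sketch on toy data; the study's own metrics and images are not reproduced here:

```python
import numpy as np

def psnr(reference, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two equal-shaped images."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy example: a reference image vs. a noisy copy of it.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64)).astype(np.float64)
noisy = np.clip(ref + rng.normal(0, 5, size=ref.shape), 0, 255)
print(f"PSNR: {psnr(ref, noisy):.1f} dB")
```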
Hood, Glen R; Ott, James R
2017-04-01
Successive generations of bi- and multivoltine species encounter differing biotic and abiotic environments intra-annually. The question of whether selection can independently adjust the relationship between body size and components of reproductive effort within successive generations in response to generation-specific environmental variation is applicable to a diversity of taxa. Herein, we develop a conceptual framework that illustrates increasingly independent life history adjustments between successive generations of taxa exhibiting complex life cycles. We apply this framework to the reproductive biology of the gall-forming insect, Belonocnema treatae (Hymenoptera: Cynipidae). This bivoltine species expresses cyclical parthenogenesis in which alternating sexual and asexual generations develop in different seasons and different environments. We tested the hypotheses that ecological divergence between the alternate generations is accompanied by generational differences in body size, egg size, and egg number and by changes in the relationships between body size and these components of reproductive effort. Increased potential reproductive effort of sexual generation B. treatae is attained by increased body size and egg number (with no trade-off between egg number and egg size) and by a significant increase in the slope of the relationship between body size and potential fecundity. These generation-specific relationships, interpreted in the context of the model framework, suggest that within each generation selection has independently molded the relationships relating body size to potential fecundity and potential reproductive effort in B. treatae. The conceptual framework is broadly applicable to comparisons involving the alternating generations of bi- and multivoltine species.
Loughman, James; Davison, Peter; Flitcroft, Ian
2007-11-01
Preattentive visual search (PAVS) describes rapid and efficient retinal and neural processing capable of immediate target detection in the visual field. Damage to the nerve fibre layer or visual pathway might reduce the efficiency with which the visual system performs such analysis. The purpose of this study was to test the hypothesis that patients with glaucoma are impaired on parallel search tasks, and that this would serve to distinguish glaucoma in early cases. Three groups of observers (glaucoma patients, suspect and normal individuals) were examined, using computer-generated flicker, orientation, and vertical motion displacement targets to assess PAVS efficiency. The task required rapid and accurate localisation of a singularity embedded in a field of 119 homogeneous distractors on either the left or right-hand side of a computer monitor. All subjects also completed a choice reaction time (CRT) task. Independent-samples t tests revealed PAVS efficiency to be significantly impaired in the glaucoma group compared with both normal and suspect individuals. Performance was impaired in all types of glaucoma tested. Analysis between normal and suspect individuals revealed a significant difference only for motion displacement response times. Similar analysis using a PAVS/CRT index confirmed the glaucoma findings but also showed statistically significant differences between suspect and normal individuals across all target types. A test of PAVS efficiency appears capable of differentiating early glaucoma from both normal and suspect cases. Analysis incorporating a PAVS/CRT index enhances the diagnostic capacity to differentiate normal from suspect cases.
Development of a molecular diagnostic test for Retinitis Pigmentosa in the Japanese population.
Maeda, Akiko; Yoshida, Akiko; Kawai, Kanako; Arai, Yuki; Akiba, Ryutaro; Inaba, Akira; Takagi, Seiji; Fujiki, Ryoji; Hirami, Yasuhiko; Kurimoto, Yasuo; Ohara, Osamu; Takahashi, Masayo
2018-05-21
Retinitis Pigmentosa (RP) is the most common form of inherited retinal dystrophy and is caused by different genetic variants. More than 60 causative genes have been identified to date. The establishment of cost-effective molecular diagnostic tests with high sensitivity and specificity can be beneficial for patients and clinicians. Here, we developed a clinical diagnostic test for RP in the Japanese population. Evaluation of diagnostic technology; prospective, clinical and experimental study. A panel of 39 genes reported to cause RP in Japanese patients was established. Next-generation sequencing (NGS) technology was applied for the analysis of 94 probands with RP and RP-related diseases. After interpretation of the detected genetic variants, molecular diagnosis based on a study of the genetic variants and the clinical phenotype was made by a multidisciplinary team including clinicians, researchers and genetic counselors. NGS analyses found 14,343 variants in 94 probands. Among them, 189 variants in 83 probands (88.3% of all cases) were selected as pathogenic variants, and 64 probands (68.1%) had variants which can cause disease. After deliberation of these 64 cases, a molecular diagnosis was made in 43 probands (45.7%). The final molecular diagnostic rate with the current system, combining supplemental Sanger sequencing, was 47.9% (45 of 94 cases). The RP panel provides the significant advantage of detecting genetic variants with a high molecular diagnostic rate. This type of race-specific high-throughput genotyping allows us to conduct a cost-effective and clinically useful genetic diagnostic test.
Is there a role for a test controller in the development of new ATC equipment?
NASA Technical Reports Server (NTRS)
Westrum, Ron
1994-01-01
Earl Wiener points out that human factors problems fixed during the R & D stage are paid for once. When they are not fixed during R & D, they are then paid for every day. How users are involved in the R & D process to assist in developing equipment is a critical issue. Effective involvement can produce real improvements. Ineffective involvement can produce inefficient kludges or systems that are actually dangerous. The underlying problem is the management of information and ideas. To develop a really generative system a great deal would have to change in the way that the FAA innovates. Use of test controllers would solve only some of the problems. For instance, we have cockpit resource management now for pilots; we may have it soon for controllers. But the management of ideas in the innovation process also needs intellectual resource management. Simply involving users is not enough. Brought in at the wrong point in the development process, users can block or compromise innovation. User involvement must be carefully considered. A test controller may be one solution to this problem. It might be necessary to have several kinds of test controllers (en route versus TRACON, for instance). No doubt further problems would surface in getting test controllers into operation. I would recommend that the FAA engage in a series of case studies of controller involvement in the innovation process. A systematic comparison of effective and ineffective cases would do much to clarify what we ought to do in the future. Unfortunately, I have been unable to find any cases where test controllers have been used. Perhaps we need to create some, to see how they work.
Analytical Investigation of a Reflux Boiler
NASA Technical Reports Server (NTRS)
Simon, William E.; Young, Fred M.; Chambers, Terrence L.
1996-01-01
A thermal model of a single Ultralight Fabric Reflux Tube (UFRT) was constructed and tested against data for an array of such tubes tested in the NASA-JSC facility. Modifications to the single fin model were necessary to accommodate the change in radiation shape factors due to adjacent tubes. There was good agreement between the test data and data generated for the same cases by the thermal model. The thermal model was also used to generate single and linear array data for the lunar environment (the primary difference between the test and lunar data was due to lunar gravity). The model was also used to optimize the linear spacing of the reflux tubes in an array. The optimal spacing of the tubes was recommended to be about 5 tube diameters based on maximizing the heat transfer per unit mass. The model also showed that the thermal conductivity of the Nextel fabric was the major limitation to the heat transfer. This led to a suggestion that the feasibility of jacketing the Nextel fiber bundles with copper strands be investigated. This jacketing arrangement was estimated to be able to double the thermal conductivity of the fabric at a volume concentration of about 12-14%. Doubling the thermal conductivity of the fabric would double the amount of heat transferred at the same steam saturation temperature.
40 CFR 86.1333-2010 - Transient test cycle generation.
Code of Federal Regulations, 2010 CFR
2010-07-01
40 CFR, Protection of Environment: Emission Regulations for New Otto-Cycle and Diesel Heavy-Duty Engines; Gaseous and Particulate Exhaust Test Procedures, § 86.1333-2010 Transient test cycle generation. (a) Generating transient test ...
An abstraction layer for efficient memory management of tabulated chemistry and flamelet solutions
NASA Astrophysics Data System (ADS)
Weise, Steffen; Messig, Danny; Meyer, Bernd; Hasse, Christian
2013-06-01
A large number of methods for simulating reactive flows exist, some of them, for example, directly use detailed chemical kinetics or use precomputed and tabulated flame solutions. Both approaches couple the research fields computational fluid dynamics and chemistry tightly together using either an online or offline approach to solve the chemistry domain. The offline approach usually involves a method of generating databases or so-called Lookup-Tables (LUTs). As these LUTs are extended to not only contain material properties but interactions between chemistry and turbulent flow, the number of parameters and thus dimensions increases. Given a reasonable discretisation, file sizes can increase drastically. The main goal of this work is to provide methods that handle large database files efficiently. A Memory Abstraction Layer (MAL) has been developed that handles requested LUT entries efficiently by splitting the database file into several smaller blocks. It keeps the total memory usage at a minimum using thin allocation methods and compression to minimise filesystem operations. The MAL has been evaluated using three different test cases. The first rather generic one is a sequential reading operation on an LUT to evaluate the runtime behaviour as well as the memory consumption of the MAL. The second test case is a simulation of a non-premixed turbulent flame, the so-called HM1 flame, which is a well-known test case in the turbulent combustion community. The third test case is a simulation of a non-premixed laminar flame as described by McEnally in 1996 and Bennett in 2000. Using the previously developed solver 'flameletFoam' in conjunction with the MAL, memory consumption and the performance penalty introduced were studied. The total memory used while running a parallel simulation was reduced significantly while the CPU time overhead associated with the MAL remained low.
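The core MAL idea, splitting a large table into independently compressed blocks that are decompressed only on first access and cached thereafter, can be sketched as follows; the class and method names are illustrative, not the MAL's actual interface:

```python
import zlib

class BlockedTable:
    """Minimal sketch of block-wise lazy access to a large lookup table.

    Values are stored as independently zlib-compressed blocks; a block is
    decompressed only when one of its entries is first requested, then
    cached, so only the touched fraction of the table is ever resident.
    """

    def __init__(self, values, block_size=4):
        self.block_size = block_size
        self.blocks = []   # compressed byte strings, one per block
        self.cache = {}    # block index -> decoded list of floats
        for i in range(0, len(values), block_size):
            chunk = ",".join(repr(v) for v in values[i:i + block_size])
            self.blocks.append(zlib.compress(chunk.encode()))

    def __getitem__(self, index):
        block, offset = divmod(index, self.block_size)
        if block not in self.cache:  # decompress on first touch only
            decoded = zlib.decompress(self.blocks[block]).decode()
            self.cache[block] = [float(s) for s in decoded.split(",")]
        return self.cache[block][offset]

table = BlockedTable([float(i) for i in range(100)])
print(table[42], len(table.cache))  # only the touched block is resident
```

A production system would add block eviction and memory-mapped I/O, which is where the trade-off between CPU overhead and resident memory studied in the paper arises.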
NASA Astrophysics Data System (ADS)
Yakushev, P.; Bershtein, V.; Bukowska-Śluz, I.; Sobiesiak, M.; Gawdzik, B.
2016-05-01
Methacrylated derivatives of glucose (MGLU) and galactose (MGAL) were synthesized by the procedure described by Vogel, and their copolymers with methyl methacrylate (MMA) and a MMA/N-vinyl pyrrolidone (MMA/NVP) (1:1) mixture were obtained with the aim of modifying some properties of carbochain polymers, in particular to generate their biodegradability. These hybrids of synthetic and natural products, with 10, 20 or 30 wt.% modifier, were characterized by DMA and TGA methods and in biodegradation tests. An increase in Tg of 20-30°C was registered in all cases, whereas thermal stability was improved by the modification only for PMMA. In contrast, essential biodegradability could be generated only for hybrids based on the hygroscopic MMA/NVP copolymer.
ERIC Educational Resources Information Center
Sheppard, Michael; Vibert, Conor
2016-01-01
Case studies have been an important tool in business, legal, and medical education for generations of students. Traditional text-based cases tend to be self-contained and structured in such a way as to teach a particular concept. The multimedia cases introduced in this study feature unscripted web-hosted video interviews with business owners and…
Kumar, Atul; Samadder, S R
2017-10-01
Accurate prediction of the quantity of household solid waste generation is essential for effective management of municipal solid waste (MSW). In practice, modelling methods are often found useful for precise prediction of the MSW generation rate. In this study, two models have been proposed that establish relationships between the household solid waste generation rate and socioeconomic parameters such as household size, total family income, education, occupation and fuel used in the kitchen. Multiple linear regression was applied to develop the two models, one for the prediction of the biodegradable MSW generation rate and the other for the non-biodegradable MSW generation rate, for individual households of the city of Dhanbad, India. The results of the two models showed that the coefficients of determination (R²) were 0.782 for the biodegradable waste generation rate and 0.676 for the non-biodegradable waste generation rate using the selected independent variables. The accuracy tests of the developed models showed convincing results, as the predicted values were very close to the observed values. Validation of the developed models with a new set of data indicated a good fit for actual prediction purposes, with predicted R² values of 0.76 and 0.64 for the biodegradable and non-biodegradable MSW generation rates respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.
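The modelling approach is ordinary multiple linear regression with R² as the goodness-of-fit measure. A sketch on synthetic data; the variables and coefficients below are illustrative, not the study's:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a household survey: predictors are household size,
# income, and years of education (illustrative only).
n = 200
X = np.column_stack([
    rng.integers(1, 9, n),        # household size
    rng.uniform(5, 50, n),        # income (thousand currency units)
    rng.uniform(0, 16, n),        # education (years)
])
true_beta = np.array([0.15, 0.01, -0.02])
y = 0.3 + X @ true_beta + rng.normal(0, 0.1, n)  # waste rate, kg/capita/day

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# Coefficient of determination R^2.
resid = y - A @ beta
r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
print(f"R^2 = {r2:.3f}")
```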
Instrumentation for accelerated life tests of concentrator solar cells.
Núñez, N; Vázquez, M; González, J R; Jiménez, F J; Bautista, J
2011-02-01
Concentrator photovoltaics is an emerging technology that may be an economical and efficient alternative for the generation of electricity at a competitive cost. However, the reliability of these new solar cells and systems is still an open issue due to the high irradiation level they are subjected to, as well as the electrical and thermal stresses they are expected to endure. To evaluate reliability in a short period of time, accelerated aging tests are essential. Thermal aging tests for concentrator photovoltaic solar cells and systems under illumination have not been available because there has been no technical solution to the problem of reaching the working concentration inside a climatic chamber. This work presents an automatic instrumentation system that overcomes this limitation. Working conditions have been simulated by forward biasing the solar cells to the current they would handle at the working concentration (in this case, 700 and 1050 times the irradiance at one standard sun). The instrumentation system has been deployed for more than 10 000 h in a thermal aging test for III-V concentrator solar cells, in which the evolution of the generated power at different temperatures has been monitored. As a result of this test, the acceleration factor has been calculated, allowing the degradation evolution at any temperature, including normal working conditions, to be obtained.
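The acceleration factor mentioned above is conventionally obtained from an Arrhenius model of thermally activated degradation. A minimal sketch, assuming an illustrative activation energy rather than the value derived in the paper:

```python
import math

K_B = 8.617e-5  # Boltzmann constant (eV/K)

def acceleration_factor(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor:
    AF = exp( (Ea / k) * (1/T_use - 1/T_stress) ), temperatures in kelvin."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_B) * (1.0 / t_use - 1.0 / t_stress))

# Example: Ea = 0.7 eV (assumed placeholder), cell at 65 °C in normal
# operation vs. a 150 °C accelerated aging test.
af = acceleration_factor(0.7, 65.0, 150.0)
print(round(af, 1))
```

With an acceleration factor of this magnitude, 10 000 h of stressed aging maps to a far longer equivalent period at the use temperature, which is the point of the accelerated test.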
Young, Robin L; Weinberg, Janice; Vieira, Verónica; Ozonoff, Al; Webster, Thomas F
2010-07-19
A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power, though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three cases.
The GAM permutation testing methods provide a regression-based alternative to the spatial scan statistic. Across all hypotheses examined in this research, the GAM methods had competing or greater power estimates and sensitivities exceeding that of the spatial scan statistic.
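The permutation-testing idea above can be sketched compactly. Here a k-nearest-neighbour smoother stands in for the bivariate LOESS/GAM fit, the test statistic is a crude stand-in for the GAM deviance, and the simulated point data loosely mimic Case 1 (a circular cluster of elevated risk); all of these simplifications are this sketch's, not the study's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated case-control points with a circular cluster of elevated
# risk at the origin (loose analogue of "Case 1").
n = 400
xy = rng.uniform(-1.0, 1.0, size=(n, 2))
in_cluster = np.hypot(xy[:, 0], xy[:, 1]) < 0.4
cases = rng.random(n) < np.where(in_cluster, 0.7, 0.4)

# Precompute k-nearest-neighbour indices once; the locations stay
# fixed under permutation of the case labels.
k = 25
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
nn = np.argsort(d, axis=1)[:, :k]

def smooth_stat(labels):
    """Max deviation of the locally smoothed case proportion from the
    global case rate -- a crude stand-in for the GAM deviance statistic."""
    return np.max(np.abs(labels[nn].mean(axis=1) - labels.mean()))

obs = smooth_stat(cases)

# Permutation null: shuffle case labels over the fixed locations and
# compare the observed statistic to the permutation distribution.
n_perm = 199
null = np.array([smooth_stat(rng.permutation(cases)) for _ in range(n_perm)])
p_value = (1 + np.sum(null >= obs)) / (1 + n_perm)
print(p_value)
```

Rejecting the null here answers the "is the smoothing term necessary?" question: location carries information about case status beyond the global rate.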
2010-01-01
Background A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power, though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three cases.
Conclusions The GAM permutation testing methods provide a regression-based alternative to the spatial scan statistic. Across all hypotheses examined in this research, the GAM methods had competing or greater power estimates and sensitivities exceeding that of the spatial scan statistic. PMID:20642827
NASA Astrophysics Data System (ADS)
Živanović, Dragan; Simić, Milan; Kokolanski, Zivko; Denić, Dragan; Dimcev, Vladimir
2018-04-01
A software-supported procedure for the generation of long-duration complex test sequences, suitable for testing instruments that detect standard voltage quality (VQ) disturbances, is presented in this paper. This solution for test signal generation includes significant improvements over the computer-based signal generator presented and described in a previously published paper [1]. The generator is based on virtual instrumentation software for defining the basic signal parameters, a data acquisition card (NI 6343), and a power amplifier for amplification of the output voltage level to the nominal RMS voltage value of 230 V. Definition of the basic signal parameters in the LabVIEW application software is supported using script files, which allows simple repetition of specific test signals and the combination of several different test sequences into a complex composite test waveform. The basic advantage of this generator compared to similar signal generation solutions is the possibility of long-duration test sequence generation according to predefined complex test scenarios, including various combinations of VQ disturbances defined in accordance with the European standard EN 50160. Experimental verification of the presented signal generator's capability is performed by testing the commercial power quality analyzer Fluke 435 Series II. Some characteristic complex test signals with various disturbances, together with logged data obtained from the tested power quality analyzer, are shown in this paper.
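A sketch of composing one scripted disturbance (a voltage sag, one of the EN 50160 disturbance classes) into a 230 V / 50 Hz test waveform. The sample rate, sag profile, and parameter names are illustrative, not the generator's actual script format.

```python
import numpy as np

FS = 10_000        # sample rate (Hz), illustrative
F0 = 50            # nominal frequency (Hz)
V_RMS = 230.0      # nominal RMS voltage

def sequence(duration_s, sag_start_s, sag_end_s, sag_depth=0.3):
    """Sine test waveform whose amplitude dips by `sag_depth`
    (fraction of nominal) during the sag window."""
    t = np.arange(0.0, duration_s, 1.0 / FS)
    amplitude = np.full_like(t, V_RMS * np.sqrt(2.0))
    in_sag = (t >= sag_start_s) & (t < sag_end_s)
    amplitude[in_sag] *= (1.0 - sag_depth)
    return t, amplitude * np.sin(2.0 * np.pi * F0 * t)

# One second of signal with a 30 % sag between 0.4 s and 0.6 s.
t, v = sequence(1.0, 0.4, 0.6)

# RMS over the sag window drops from 230 V to about 161 V.
sag_rms = np.sqrt(np.mean(v[(t >= 0.4) & (t < 0.6)] ** 2))
print(round(sag_rms, 1))
```

A long-duration composite sequence, as described above, would concatenate many such scripted segments (sags, swells, interruptions, harmonics) before amplification to line level.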
Lesion Generation Through Ribs Using Histotripsy Therapy Without Aberration Correction
Kim, Yohan; Wang, Tzu-Yin; Xu, Zhen; Cain, Charles A.
2012-01-01
This study investigates the feasibility of using high-intensity pulsed therapeutic ultrasound, or histotripsy, to non-invasively generate lesions through the ribs. Histotripsy therapy mechanically ablates tissue through the generation of a cavitation bubble cloud, which occurs when the focal pressure exceeds a certain threshold. We hypothesize that histotripsy can generate precise lesions through the ribs without aberration correction if the main lobe retains its shape and exceeds the cavitation initiation threshold and the secondary lobes remain below the threshold. To test this hypothesis, a 750-kHz focused transducer was used to generate lesions in tissue-mimicking phantoms with and without the presence of rib aberrators. In all cases, 8000 pulses with 16 to 18 MPa peak rarefactional pressure at a repetition frequency of 100 Hz were applied without aberration correction. Despite the high secondary lobes introduced by the aberrators, high-speed imaging showed that bubble clouds were generated exclusively at the focus, resulting in well-confined lesions with comparable dimensions. Collateral damage from secondary lobes was negligible, caused by single bubbles that failed to form a cloud. These results support our hypothesis, suggesting that histotripsy has a high tolerance for aberrated fields and can generate confined focal lesions through rib obstacles without aberration correction. PMID:22083767
Lesion generation through ribs using histotripsy therapy without aberration correction.
Kim, Yohan; Wang, Tzu-Yin; Xu, Zhen; Cain, Charles A
2011-11-01
This study investigates the feasibility of using high-intensity pulsed therapeutic ultrasound, or histotripsy, to non-invasively generate lesions through the ribs. Histotripsy therapy mechanically ablates tissue through the generation of a cavitation bubble cloud, which occurs when the focal pressure exceeds a certain threshold. We hypothesize that histotripsy can generate precise lesions through the ribs without aberration correction if the main lobe retains its shape and exceeds the cavitation initiation threshold and the secondary lobes remain below the threshold. To test this hypothesis, a 750-kHz focused transducer was used to generate lesions in tissue-mimicking phantoms with and without the presence of rib aberrators. In all cases, 8000 pulses with 16 to 18 MPa peak rarefactional pressure at a repetition frequency of 100 Hz were applied without aberration correction. Despite the high secondary lobes introduced by the aberrators, high-speed imaging showed that bubble clouds were generated exclusively at the focus, resulting in well-confined lesions with comparable dimensions. Collateral damage from secondary lobes was negligible, caused by single bubbles that failed to form a cloud. These results support our hypothesis, suggesting that histotripsy has a high tolerance for aberrated fields and can generate confined focal lesions through rib obstacles without aberration correction.
Multi-GPU implementation of a VMAT treatment plan optimization algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tian, Zhen, E-mail: Zhen.Tian@UTSouthwestern.edu, E-mail: Xun.Jia@UTSouthwestern.edu, E-mail: Steve.Jiang@UTSouthwestern.edu; Folkerts, Michael; Tan, Jun
Purpose: Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, a GPU's relatively small memory cannot handle cases with a large dose-deposition coefficient (DDC) matrix, e.g., those with a large target size, multiple targets, multiple arcs, and/or a small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors' group, on a multi-GPU platform to solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported. Hence, another purpose is to present detailed techniques employed for GPU implementation. The authors also would like to utilize this particular problem as an example to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. Methods: The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors' method, the sparse DDC matrix is first stored on a CPU in coordinate list (COO) format. On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row format. Computation of the beamlet price, the first step in the PP, is accomplished using multiple GPUs. A fast inter-GPU data transfer scheme is implemented using peer-to-peer access. The remaining steps of the PP and the MP are implemented on the CPU or a single GPU due to their modest problem scale and computational loads. The Barzilai-Borwein algorithm with a subspace step scheme is adopted to solve the MP.
A head and neck (H&N) cancer case is then used to validate the authors' method. The authors also compare their multi-GPU implementation with three different single-GPU implementation strategies, i.e., truncating the DDC matrix (S1), repeatedly transferring the DDC matrix between CPU and GPU (S2), and porting computations involving the DDC matrix to the CPU (S3), in terms of both plan quality and computational efficiency. Two more H&N patient cases and three prostate cases are used to demonstrate the advantages of the authors' method. Results: The authors' multi-GPU implementation can finish the optimization process within ∼1 min for the H&N patient case. S1 leads to an inferior plan quality, although its total time was 10 s shorter than the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23-46 s. By comparison, to obtain clinically comparable or acceptable plans for all six of the VMAT cases tested in this paper, the optimization time needed in a commercial TPS system on CPU was found to be on the order of several minutes. Conclusions: The results demonstrate that the multi-GPU implementation of the authors' column-generation-based VMAT optimization can handle the large-scale VMAT optimization problem efficiently without sacrificing plan quality. The authors' study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.
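The COO-to-per-group-CSR split described in the Methods can be sketched with SciPy on a single machine, with separate CSR objects standing in for the four GPUs. All sizes below are toy values, and the column-wise split by beam angle is this sketch's simplification of the paper's data layout.

```python
import numpy as np
from scipy import sparse

# Toy dose-deposition coefficient (DDC) matrix: voxels x beamlets,
# assembled in coordinate list (COO) format as in the Methods.
n_voxels, n_beamlets, n_groups = 1000, 400, 4

rng = np.random.default_rng(1)
nnz = 5000
rows = rng.integers(0, n_voxels, nnz)
cols = rng.integers(0, n_beamlets, nnz)
vals = rng.random(nnz)
ddc = sparse.coo_matrix((vals, (rows, cols)), shape=(n_voxels, n_beamlets))

# Assign contiguous beamlet ranges to beam-angle groups and slice the
# matrix into one CSR submatrix per group (one per GPU in the paper).
edges = np.linspace(0, n_beamlets, n_groups + 1).astype(int)
ddc_csr = ddc.tocsr()
submatrices = [ddc_csr[:, edges[g]:edges[g + 1]] for g in range(n_groups)]

# A dose computation d = D @ x can then be done per group and summed,
# mirroring the beamlet-price step distributed across devices.
x = rng.random(n_beamlets)
dose_split = sum(sub @ x[edges[g]:edges[g + 1]]
                 for g, sub in enumerate(submatrices))
dose_full = ddc_csr @ x
print(np.allclose(dose_split, dose_full))  # True
```

On real hardware the per-group products would run on separate GPUs, with partial dose vectors combined via peer-to-peer transfers rather than a Python `sum`.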
Residual stress control and design of next-generation ultra-hard gear steels
NASA Astrophysics Data System (ADS)
Qian, Yana
In high power density transmission systems, Ni-Co secondary hardening steels have shown great potential for next-generation gear applications due to their excellent strength, toughness and superior fatigue performance. Study of residual stress generation and evolution in Ferrium C61 and C67 gear steels revealed that shot peening and laser peening processes effectively produce the desired beneficial residual stress in the steels for enhanced fatigue performance. Surface residual stress levels of -1.4 GPa and -1.5 GPa were achieved in shot-peened C61 and laser-peened C67, respectively, without introducing large surface roughness or defects. Higher compressive residual stress is expected in C67 according to a demonstrated correlation between attainable residual stress and material hardness. Due to the lack of appropriate shot media, dual laser peening is proposed for future peening optimization in C67. A novel non-destructive synchrotron radiation technique was implemented and applied for the first time for residual stress distribution analysis in gear steels with large composition and property gradients. Observed substantial residual stress redistribution and material microstructure change during the rolling contact fatigue screening test with an extremely high 5.4 GPa load indicate the unsuitability of the test as a fatigue life predictor. To exploit the benefits of higher case hardness and associated residual stress, a new material and process (CryoForm70) aiming at 70 Rc surface hardness was designed using a systems approach based on thermodynamics and secondary hardening mechanisms. The composition design was first validated by the excellent agreement between the experimental and theoretical core martensite start temperatures in the prototype. A novel cryogenic deformation process was concurrently designed to increase the case martensite volume fraction from 76% to 92% for enhanced strengthening efficiency and surface hardness.
High temperature vacuum carburizing was optimized for the desired carbon content profiles using carbon diffusion simulation in the multi-component system. After cyclic tempering with intermediate cryogenic treatment, a case hardness of 68.5 +/- 0.3 Rc at 0.72 +/- 0.2 wt.% carbon content was achieved. The design demonstrated the effectiveness of cryogenic deformation in promoting martensite transformation for high-carbon, high-alloy steels. Good agreement between achieved and predicted case and core hardness supports the effectiveness of the computational design approach.
An Efficient Functional Test Generation Method For Processors Using Genetic Algorithms
NASA Astrophysics Data System (ADS)
Hudec, Ján; Gramatová, Elena
2015-07-01
The paper presents a new functional test generation method for processor testing based on genetic algorithms and evolutionary strategies. The tests are generated over an instruction set architecture and a processor description. Such functional tests belong to software-oriented testing. The quality of the tests is evaluated by code coverage of the processor description using simulation. The presented test generation method uses VHDL models of processors and the professional simulator ModelSim. Rules, parameters and fitness functions were defined for the various genetic algorithms used in automatic test generation. Functionality and effectiveness were evaluated using the RISC-type processor DP32.
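A toy sketch of coverage-driven genetic test generation in the spirit of the method above. The instruction set and the pair-based "coverage" model are invented stand-ins for the VHDL/ModelSim simulation flow, and the GA operators are deliberately minimal.

```python
import random

random.seed(42)

INSTRUCTIONS = list(range(16))        # toy instruction opcodes
SEQ_LEN, POP, GENERATIONS = 12, 30, 40

def coverage(seq):
    """Pretend each distinct back-to-back opcode pair exercises one
    coverage point; fitness is the fraction of positions covered."""
    pairs = {(a, b) for a, b in zip(seq, seq[1:])}
    return len(pairs) / (SEQ_LEN - 1)

def mutate(seq, rate=0.1):
    return [random.choice(INSTRUCTIONS) if random.random() < rate else g
            for g in seq]

def crossover(a, b):
    cut = random.randrange(1, SEQ_LEN)
    return a[:cut] + b[cut:]

# Evolve a population of instruction sequences toward high coverage:
# keep the fitter half as elites, refill with mutated crossovers.
pop = [[random.choice(INSTRUCTIONS) for _ in range(SEQ_LEN)]
       for _ in range(POP)]
for _ in range(GENERATIONS):
    pop.sort(key=coverage, reverse=True)
    elite = pop[:POP // 2]
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(POP - len(elite))]

best = max(pop, key=coverage)
print(coverage(best))
```

In the actual method the fitness evaluation would invoke the HDL simulator and read back code-coverage figures for the processor description instead of this toy pair count.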
Development of Displacement Gages Exposed to Solid Rocket Motor Internal Environments
NASA Technical Reports Server (NTRS)
Bolton, D. E.; Cook, D. J.
2003-01-01
The Space Shuttle Reusable Solid Rocket Motor (RSRM) has three non-vented segment-to-segment case field joints. These joints use an interference fit J-joint that is bonded at assembly with a Pressure Sensitive Adhesive (PSA) inboard of redundant O-ring seals. Full-scale motor and sub-scale test article experience has shown that the ability to preclude gas leakage past the J-joint is a function of PSA type, joint moisture from pre-assembly humidity exposure, and the magnitude of joint displacement during motor operation. To more accurately determine the axial displacements at the J-joints, two thermally durable displacement gages (one mechanical and one electrical) were designed and developed. The mechanical displacement gage concept was generated first as a non-electrical, self-contained gage to capture the maximum magnitude of the J-joint motion. When it became feasible, the electrical displacement gage concept was generated second as a real-time linear displacement gage. Both of these gages were refined in development testing that included hot internal solid rocket motor environments and simulated vibration environments. As a result of this gage development effort, joint motions have been measured in static fired RSRM J-joints where intentional venting was produced (Flight Support Motor #8, FSM-8) and nominal non-vented behavior occurred (FSM-9 and FSM-10). This data gives new insight into the nominal characteristics of the three case J-joint positions (forward, center and aft) and characteristics of some case J-joints that became vented during motor operation. The data supports previous structural model predictions. These gages will also be useful in evaluating J-joint motion differences in a five-segment Space Shuttle solid rocket motor.
Parametric Analysis of a Hover Test Vehicle using Advanced Test Generation and Data Analysis
NASA Technical Reports Server (NTRS)
Gundy-Burlet, Karen; Schumann, Johann; Menzies, Tim; Barrett, Tony
2009-01-01
Large complex aerospace systems are generally validated in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. This is due to the large parameter space and the complex, highly coupled nonlinear nature of the different systems that contribute to the performance of the aerospace system. We have addressed the factors deterring such an analysis by applying a combination of technologies to the area of flight envelope assessment. We utilize n-factor (2,3) combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. The data generated are automatically analyzed through a combination of unsupervised learning using a Bayesian multivariate clustering technique (AutoBayes) and supervised learning of critical parameter ranges using the machine-learning tool TAR3, a treatment learner. Covariance analysis with scatter plots and likelihood contours is used to visualize correlations between simulation parameters and simulation results, a task that requires tool support, especially for large and complex models. We present results of simulation experiments for a cold-gas-powered hover test vehicle.
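The n-factor combinatorial idea above (here with n = 2, i.e. pairwise coverage) can be sketched as a greedy covering loop: keep picking the test case that covers the most not-yet-covered parameter-value pairs. The parameters and levels below are illustrative, not the hover vehicle's actual simulation inputs.

```python
from itertools import combinations, product

# Hypothetical simulation parameters and their levels.
params = {
    "thrust": ["low", "nominal", "high"],
    "mass":   ["light", "heavy"],
    "wind":   ["calm", "gusty"],
    "sensor_noise": ["off", "on"],
}
names = list(params)

def pairs(case):
    """All (parameter, value) pairs exercised jointly by one test case."""
    return {((names[i], case[i]), (names[j], case[j]))
            for i, j in combinations(range(len(names)), 2)}

# Every parameter-value pair that a 2-factor suite must cover.
uncovered = set()
for i, j in combinations(range(len(names)), 2):
    for vi in params[names[i]]:
        for vj in params[names[j]]:
            uncovered.add(((names[i], vi), (names[j], vj)))

# Greedy selection: far fewer cases than the exhaustive cross product,
# yet every pairwise interaction appears in at least one case.
all_cases = list(product(*params.values()))
suite = []
while uncovered:
    best = max(all_cases, key=lambda c: len(pairs(c) & uncovered))
    suite.append(best)
    uncovered -= pairs(best)

print(len(suite), "cases vs", len(all_cases), "exhaustive")
```

Extending to n = 3 means covering value triples instead of pairs; the suite grows, but still far more slowly than the full cross product as parameters are added.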
NASA Technical Reports Server (NTRS)
Hollis, Brian R.; Hollingsworth, Kevin E.
2014-01-01
Aeroheating data on mid lift-to-drag ratio entry vehicle configurations have been obtained through hypersonic wind tunnel testing. Vehicles of this class have been proposed for high-mass Mars missions, such as sample return and crewed exploration, for which the conventional sphere-cone entry vehicle geometries of previous Mars missions are insufficient. Several configurations were investigated, including elliptically-blunted cylinders with both circular and elliptical cross sections, biconic geometries based on launch vehicle dual-use shrouds, and parametrically-optimized analytic geometries. Testing was conducted at Mach 6 over a range of Reynolds numbers sufficient to generate laminar, transitional, and turbulent flow. Global aeroheating data were obtained using phosphor thermography. Both stream-wise and cross-flow transition occurred on different configurations. Comparisons were made with laminar and turbulent computational predictions generated with an algebraic turbulence model. Predictions were generally in good agreement in regions of laminar or fully-turbulent flow; however, for transitional cases, the lack of a transition onset prediction capability produced less accurate comparisons. The data obtained in this study are intended to be used for preliminary mission design studies and the development and validation of computational methods.
Anticipating the higher generations of quarks from rephasing invariance of the mixing matrix
NASA Astrophysics Data System (ADS)
Botella, F. J.; Chau, Ling-Lie
1986-02-01
We show that the number of invariant CP-violating parameters X_CP jumps from the unique universal one in three generations to nine in the four-generation case, saturating the parameter space for generation numbers higher than three. This can lead to drastically different consequences in CP-violating phenomena. We give the quark mass matrices in the three-generation case and speculate for higher generations. We also give some invariant definitions of “maximal” CP violation.
Approaches to the safety assessment of engineered nanomaterials (ENM) in food.
Cockburn, Andrew; Bradford, Roberta; Buck, Neil; Constable, Anne; Edwards, Gareth; Haber, Bernd; Hepburn, Paul; Howlett, John; Kampers, Frans; Klein, Christoph; Radomski, Marek; Stamm, Hermann; Wijnhoven, Susan; Wildemann, Tanja
2012-06-01
A systematic, tiered approach to assess the safety of engineered nanomaterials (ENMs) in foods is presented. The ENM is first compared to its non-nano form counterpart to determine if ENM-specific assessment is required. Of highest concern from a toxicological perspective are ENMs which have potential for systemic translocation, are insoluble or only partially soluble over time or are particulate and bio-persistent. Where ENM-specific assessment is triggered, Tier 1 screening considers the potential for translocation across biological barriers, cytotoxicity, generation of reactive oxygen species, inflammatory response, genotoxicity and general toxicity. In silico and in vitro studies, together with a sub-acute repeat-dose rodent study, could be considered for this phase. Tier 2 hazard characterisation is based on a sentinel 90-day rodent study with an extended range of endpoints, additional parameters being investigated case-by-case. Physicochemical characterisation should be performed in a range of food and biological matrices. A default assumption of 100% bioavailability of the ENM provides a 'worst case' exposure scenario, which could be refined as additional data become available. The safety testing strategy is considered applicable to variations in ENM size within the nanoscale and to new generations of ENM. Copyright © 2012 Elsevier Ltd. All rights reserved.