Software Architecture: Managing Design for Achieving Warfighter Capability
2007-04-30
The Government’s requirements and specifications for a new weapon...at the Preliminary Design Review (PDR) is likely to have a much higher probability of meeting the warfighters’ need for capability. Test-case...inventories of test cases are developed from the user-defined scenarios so that there are one or more test cases for every scenario. The test cases will
SMART-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Bri-Mathias; Palmintier, Bryan
This presentation provides an overview of full-scale, high-quality, synthetic distribution system data set(s) for testing distribution automation algorithms, distributed control approaches, ADMS capabilities, and other emerging distribution technologies.
NASA Technical Reports Server (NTRS)
Mohlenbrink, Christoph P.; Omar, Faisal Gamal; Homola, Jeffrey R.
2017-01-01
This is a video replay of system data that was generated from the UAS Traffic Management (UTM) Technical Capability Level (TCL) 2 flight demonstration in Nevada and rendered in Google Earth. What is depicted in the replay is a particular set of flights conducted as part of what was referred to as the Ocean scenario. The test range and surrounding area are presented followed by an overview of operational volumes. System messaging is also displayed as well as a replay of all of the five test flights as they occurred.
Russ, Alissa L; Saleem, Jason J
2018-02-01
The quality of usability testing is highly dependent upon the associated usability scenarios. To promote usability testing as part of electronic health record (EHR) certification, the Office of the National Coordinator (ONC) for Health Information Technology requires that vendors test specific capabilities of EHRs with clinical end-users and report their usability testing process - including the test scenarios used - along with the results. The ONC outlines basic expectations for usability testing, but there is little guidance in usability texts or scientific literature on how to develop usability scenarios for healthcare applications. The objective of this article is to outline key factors to consider when developing usability scenarios and tasks to evaluate computer-interface based health information technologies. To achieve this goal, we draw upon a decade of our experience conducting usability tests with a variety of healthcare applications and a wide range of end-users, to include healthcare professionals as well as patients. We discuss 10 key factors that influence scenario development: objectives of usability testing; roles of end-user(s); target performance goals; evaluation time constraints; clinical focus; fidelity; scenario-related bias and confounders; embedded probes; minimize risks to end-users; and healthcare related outcome measures. For each factor, we present an illustrative example. This article is intended to aid usability researchers and practitioners in their efforts to advance health information technologies. The article provides broad guidance on usability scenario development and can be applied to a wide range of clinical information systems and applications. Published by Elsevier Inc.
NASA Public Affairs and NUANCE Lab News Conference at Reno-Stead Airport.
2016-10-19
News Conference following the test of Unmanned Aircraft Systems Traffic Management (UTM) technical capability Level 2 (TCL2) at Reno-Stead Airport, Nevada. Joseph Rios, NASA Ames Aerospace Engineer and UTM Technical Lead, describes the purpose of the test and flight scenarios.
NASA Technical Reports Server (NTRS)
Scheuring, Richard A.; Hamilton, Doug; Jones, Jeffrey A.; Alexander, David
2009-01-01
There are currently several physiological monitoring requirements for EVA in the Human-Systems Interface Requirements (HSIR) document. There are questions as to whether the capability to monitor heart rhythm in the lunar surface space suit is a necessary capability for lunar surface operations. Similarly, there are questions as to whether the capability to monitor heart rhythm during a cabin depressurization scenario in the launch/landing space suit is necessary. This presentation seeks to inform space medicine personnel of recommendations made by an expert panel of cardiovascular medicine specialists regarding in-suit ECG heart rhythm monitoring requirements during lunar surface operations. After a review of demographic information and clinical cases and panel discussion, the panel recommended that ECG monitoring capability as a clinical tool was not essential in the lunar space suit; ECG monitoring was not essential in the launch/landing space suit for contingency scenarios; the current heart rate monitoring capability requirement for both launch/landing and lunar space suits should be maintained; lunar vehicles should be required to have ECG monitoring capability with a minimum of 5-lead ECG for IVA medical assessments; and exercise stress testing for astronaut selection and retention should be changed from the current 85% maximum heart rate limit to maximal, exhaustive 'symptom-limited' testing to maximize diagnostic utility as a screening tool for evaluating the functional capacity of astronauts and their cardiovascular health.
1992-06-01
Paper, Version 2.0, December 1989. [Woodcock90] Gary Woodcock, Automated Generation of Hypertext Documents, CIVC Technical Report (working paper...environment setup, performance testing, assessor testing, and analysis) of the ACEC. A captive scenario example could be developed that would guide the
Scenario for Hollow Cathode End-Of-Life
NASA Technical Reports Server (NTRS)
Sarver-Verhey, Timothy R.
2000-01-01
Recent successful hollow cathode life tests have demonstrated that lifetimes can meet the requirements of several space applications. However, there are no methods for assessing cathode lifetime short of demonstrating the requirement. Previous attempts to estimate or predict cathode lifetime were based on relatively simple chemical depletion models derived from the dispenser cathode community. To address this lack of predictive capability, a scenario for hollow cathode lifetime under steady-state operating conditions is proposed. This scenario has been derived primarily from the operating behavior and post-test condition of a hollow cathode that was operated for 28,000 hours. In this scenario, the insert chemistry evolves through three relatively distinct phases over the course of the cathode lifetime. These phases are believed to correspond to demonstrable changes in cathode operation. The implications for cathode lifetime limits resulting from this scenario are examined, including methods to assess cathode lifetime without operating to End-of-Life and methods to extend the cathode lifetime.
National Unmanned Aerial System Standardized Performance Testing and Rating (NUSTAR)
NASA Technical Reports Server (NTRS)
Kopardekar, Parimal
2016-01-01
The overall objective of the NUSTAR Capability is to offer standardized tests and scenario conditions to assess performance of the UAS. The following are goals of the NUSTAR: 1. Create prototype standardized tests and scenarios that vehicles can be tested against. 2. Identify key performance parameters of all UAS and their standardized measurement strategy. 3. Develop a standardized performance reporting method (e.g., consumer report style) to assist prospective buyers. 4. Identify key performance metrics that could be used to judge the overall safety of the UAS and its operations. 5. If a vehicle certification standard is established by a regulatory agency, the performance of individual UAS could be compared against the minimum requirement (e.g., sense and avoid detection time, stopping distance, kinetic energy, etc.).
Medical Simulations for Exploration Medicine
NASA Technical Reports Server (NTRS)
Reyes, David; Suresh, Rahul; Pavela, James; Urbina, Michelle; Mindock, Jennifer; Antonsen, Erik
2018-01-01
Medical simulation is a useful tool that can be used to train personnel, develop medical processes, and assist cross-disciplinary communication. Medical simulations have been used in the past at NASA for these purposes; however, they are usually created ad hoc. A stepwise approach to scenario development has not previously been used. The NASA Exploration Medical Capability (ExMC) created a medical scenario development tool to test medical procedures, technologies, and concepts of operation, and for use in systems engineering (SE) processes.
Hot Cell Installation and Demonstration of the Severe Accident Test Station
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linton, Kory D.; Burns, Zachary M.; Terrani, Kurt A.
A Severe Accident Test Station (SATS) capable of examining the oxidation kinetics and accident response of irradiated fuel and cladding materials for design basis accident (DBA) and beyond design basis accident (BDBA) scenarios has been successfully installed and demonstrated in the Irradiated Fuels Examination Laboratory (IFEL), a hot cell facility at Oak Ridge National Laboratory. The two test station modules provide various temperature profiles, steam, and the thermal shock conditions necessary for integral loss of coolant accident (LOCA) testing, defueled oxidation quench testing and high temperature BDBA testing. The installation of the SATS system restores the domestic capability to examine postulated and extended LOCA conditions on spent fuel and cladding and provides a platform for evaluation of advanced fuel and accident tolerant fuel (ATF) cladding concepts. This document reports on the successful in-cell demonstration testing of unirradiated Zircaloy-4. It also contains descriptions of the integral test facility capabilities, installation activities, and out-of-cell benchmark testing to calibrate and optimize the system.
Portability scenarios for intelligent robotic control agent software
NASA Astrophysics Data System (ADS)
Straub, Jeremy
2014-06-01
Portability scenarios are critical in ensuring that a piece of AI control software will run effectively across the collection of craft that it is required to control. This paper presents scenarios for control software that is designed to control multiple craft with heterogeneous movement and functional characteristics. For each prospective target-craft type, its capabilities, mission function, location, communications capabilities and power profile are presented and performance characteristics are reviewed. This work will inform future decision making related to software capabilities, hardware control capabilities, and processing requirements.
Fuzzy logic based sensor performance evaluation of vehicle mounted metal detector systems
NASA Astrophysics Data System (ADS)
Abeynayake, Canicious; Tran, Minh D.
2015-05-01
Vehicle Mounted Metal Detector (VMMD) systems are widely used for detection of threat objects in humanitarian demining and military route clearance scenarios. Due to the diverse nature of such operational conditions, operational use of VMMD without a proper understanding of its capability boundaries may lead to heavy casualties. Multi-criteria fitness evaluations are crucial for determining capability boundaries of any sensor-based demining equipment. Evaluation of sensor based military equipment is a multi-disciplinary topic combining the efforts of researchers, operators, managers and commanders having different professional backgrounds and knowledge profiles. Information acquired through field tests usually involves uncertainty, vagueness and imprecision due to variations in test and evaluation conditions during a single test or series of tests. This report presents a fuzzy logic based methodology for experimental data analysis and performance evaluation of VMMD. This data evaluation methodology has been developed to evaluate sensor performance by consolidating expert knowledge with experimental data. A case study is presented by implementing the proposed data analysis framework in a VMMD evaluation scenario. The results of this analysis confirm the accuracy, practicability and reliability of the fuzzy logic based sensor performance evaluation framework.
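The abstract does not give the membership functions or rule base used in the study; as a rough illustration of how field-trial metrics can be fused into a single fitness score with fuzzy logic, a minimal Python sketch with hypothetical detection-rate and false-alarm criteria might look like this:

```python
# Illustrative fuzzy evaluation of a detection sensor. The linguistic terms,
# membership functions, rules, and scores below are assumptions for the sketch,
# not those used in the cited VMMD study.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: rises on [a, b], flat on [b, c], falls on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    if x <= c:
        return 1.0
    return (d - x) / (d - c)

def evaluate(detection_rate, false_alarms_per_100m):
    """Return a 0-100 fitness score via simple Sugeno-style rule aggregation."""
    det = {  # membership of the detection rate in three linguistic terms
        "low": trapezoid(detection_rate, -0.01, 0.0, 0.6, 0.8),
        "medium": trapezoid(detection_rate, 0.6, 0.8, 0.9, 0.95),
        "high": trapezoid(detection_rate, 0.9, 0.95, 1.0, 1.01),
    }
    fa = {  # membership of the false-alarm density in two linguistic terms
        "acceptable": trapezoid(false_alarms_per_100m, -0.01, 0.0, 2.0, 5.0),
        "excessive": trapezoid(false_alarms_per_100m, 2.0, 5.0, 1e6, 1e6 + 1),
    }
    # Each rule pairs an activation strength with a crisp consequent score.
    rules = [
        (min(det["high"], fa["acceptable"]), 90.0),   # strong detector, few alarms
        (min(det["medium"], fa["acceptable"]), 60.0),
        (min(det["high"], fa["excessive"]), 50.0),    # good detection, noisy
        (max(det["low"], min(det["medium"], fa["excessive"])), 20.0),
    ]
    total = sum(w for w, _ in rules)
    return sum(w * s for w, s in rules) / total if total > 0 else 0.0

print(evaluate(0.93, 1.5))  # -> 78.0 for this hypothetical sensor
```

In practice the terms, rule weights, and consequent scores would be elicited from the subject matter experts mentioned above and tuned against the field-trial data.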
Component-Level Electronic-Assembly Repair (CLEAR) Operational Concept
NASA Technical Reports Server (NTRS)
Oeftering, Richard C.; Bradish, Martin A.; Juergens, Jeffrey R.; Lewis, Michael J.; Vrnak, Daniel R.
2011-01-01
This Component-Level Electronic-Assembly Repair (CLEAR) Operational Concept document was developed as a first step in developing the Component-Level Electronic-Assembly Repair (CLEAR) System Architecture (NASA/TM-2011-216956). The CLEAR operational concept defines how the system will be used by the Constellation Program and what needs it meets. The document creates scenarios for major elements of the CLEAR architecture. These scenarios are generic enough to apply to near-Earth, Moon, and Mars missions. The CLEAR operational concept involves basic assumptions about the overall program architecture and interactions with the CLEAR system architecture. The assumptions include spacecraft and operational constraints for near-Earth orbit, Moon, and Mars missions. This document addresses an incremental development strategy where capabilities evolve over time, but it is structured to prevent obsolescence. The approach minimizes flight hardware by exploiting Internet-like telecommunications that enables CLEAR capabilities to remain on Earth and to be uplinked as needed. To minimize crew time and operational cost, CLEAR exploits offline development and validation to support online teleoperations. Operational concept scenarios are developed for diagnostics, repair, and functional test operations. Many of the supporting functions defined in these operational scenarios are further defined as technologies in NASA/TM-2011-216956.
Orion Flight Test Architecture Benefits of MBSE Approach
NASA Technical Reports Server (NTRS)
Reed, Don; Simpson, Kim
2012-01-01
Exploration Flight Test 1 (EFT-1) is the unmanned first orbital flight test of the Multi-Purpose Crew Vehicle (MPCV). The mission's purpose is to test Orion's ascent, on-orbit, and entry capabilities; monitor critical activities; and provide ground control in support of contingency scenarios. This requires development of a large-scale, end-to-end information system network architecture. To effectively communicate the scope of the end-to-end system, a model-based systems engineering (MBSE) approach was chosen.
75 FR 16703 - Safety Zone; Wilson Bay, Jacksonville, NC
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-02
... the safety of the general public and exercise participants from potential hazards associated with low-flying helicopters and vessels participating in this multi-agency exercise. DATES: Comments and related... multi-agency exercise to test response capabilities of water rescue services in a mass casualty scenario...
Sorokine, Alexandre; Schlicher, Bob G.; Ward, Richard C.; ...
2015-05-22
This paper describes an original approach to generating scenarios for the purpose of testing the algorithms used to detect special nuclear materials (SNM) that incorporates the use of ontologies. Separating the signal of SNM from the background requires sophisticated algorithms. To assist in developing such algorithms, there is a need for scenarios that capture a very wide range of variables affecting the detection process, depending on the type of detector being used. To provide such a capability, we developed an ontology-driven information system (ODIS) for generating scenarios that can be used in testing algorithms for SNM detection. The ontology-driven scenario generator (ODSG) is an ODIS based on information supplied by subject matter experts and other documentation. The details of the creation of the ontology, the development of the ontology-driven information system, and the design of the web user interface (UI) are presented along with specific examples of scenarios generated using the ODSG. We demonstrate that the paradigm behind the ODSG is capable of addressing the problem of semantic complexity at both the user and developer levels. Compared to traditional approaches, an ODIS provides benefits such as faithful representation of the users' domain conceptualization, simplified management of very large and semantically diverse datasets, and the ability to handle frequent changes to the application and the UI. Furthermore, the approach makes possible the generation of a much larger number of specific scenarios based on limited user-supplied information.
75 FR 30706 - Safety Zone; Wilson Bay, Jacksonville, NC
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-02
... the general public and exercise participants from potential hazards associated with low-flying helicopters and vessels participating in this multi-agency exercise. DATES: This rule is effective from 6 a.m... exercise to test response capabilities of water rescue services in a mass casualty scenario on the waters...
NASA Astrophysics Data System (ADS)
Wang, Xiaohui; Couwenhoven, Mary E.; Foos, David H.; Doran, James; Yankelevitz, David F.; Henschke, Claudia I.
2008-03-01
An image-processing method has been developed to improve the visibility of tube and catheter features in portable chest x-ray (CXR) images captured in the intensive care unit (ICU). The image-processing method is based on a multi-frequency approach, wherein the input image is decomposed into different spatial frequency bands, and those bands that contain the tube and catheter signals are individually enhanced by nonlinear boosting functions. Using a random sampling strategy, 50 cases were retrospectively selected for the study from a large database of portable CXR images that had been collected from multiple institutions over a two-year period. All images used in the study were captured using photo-stimulable, storage phosphor computed radiography (CR) systems. Each image was processed two ways. The images were processed with default image processing parameters such as those used in clinical settings (control). The 50 images were then separately processed using the new tube and catheter enhancement algorithm (test). Three board-certified radiologists participated in a reader study to assess differences in both detection-confidence performance and diagnostic efficiency between the control and test images. Images were evaluated on a diagnostic-quality, 3-megapixel monochrome monitor. Two scenarios were studied: the baseline scenario, representative of today's workflow (a single-control image presented with the window/level adjustments enabled) vs. the test scenario (a control/test image pair presented with a toggle enabled and the window/level settings disabled). The radiologists were asked to read the images in each scenario as they normally would for clinical diagnosis. Trend analysis indicates that the test scenario offers improved reading efficiency while providing as good or better detection capability compared to the baseline scenario.
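The abstract describes the enhancement only at a high level. The sketch below illustrates the general multi-frequency idea (a difference-of-Gaussians band decomposition plus a saturating boost applied to bands assumed to carry thin tube and catheter detail); the band count and boosting curve are illustrative assumptions, not the parameters of the algorithm evaluated in the study.

```python
# Minimal sketch of multi-frequency enhancement: split an image into band-pass
# layers, amplify selected bands with a nonlinear (saturating) boost, recombine.
import numpy as np
from scipy.ndimage import gaussian_filter

def band_decompose(image, sigmas=(1, 2, 4, 8, 16)):
    """Return band-pass layers (difference-of-Gaussians) plus a low-pass residual."""
    blurred = [image] + [gaussian_filter(image, s) for s in sigmas]
    bands = [blurred[i] - blurred[i + 1] for i in range(len(sigmas))]
    return bands, blurred[-1]

def boost(band, gain=2.5, knee=0.02):
    """Nonlinear boost: amplify small-amplitude detail, saturate large swings."""
    return gain * knee * np.tanh(band / knee)

def enhance(image, boosted_bands=(1, 2)):
    """Recombine bands, boosting the ones assumed to carry tube/catheter detail."""
    bands, residual = band_decompose(image.astype(np.float32))
    out = residual.copy()
    for i, band in enumerate(bands):
        out += boost(band) if i in boosted_bands else band
    return np.clip(out, 0.0, 1.0)

# Example on a synthetic, normalized CXR-like image:
img = np.random.rand(512, 512).astype(np.float32) * 0.1 + 0.4
enhanced = enhance(img)
```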
2013-12-20
MORRO BAY, Calif. – The SpaceX Dragon test article tumbles over the Pacific Ocean, off the coast of Morro Bay, Calif., following its release from an Erickson Sky Crane helicopter. SpaceX engineers induced the tumble to evaluate the spacecraft's parachute deployment system in an emergency abort scenario. The test is part of a milestone under its Commercial Crew Integrated Capability agreement with NASA's Commercial Crew Program. Photo credit: NASA/Kim Shiflett
2013-12-20
MORRO BAY, Calif. – An Erickson Sky Crane helicopter releases the SpaceX Dragon test article, inducing a tumble similar to what is expected in an emergency abort scenario, over the Pacific Ocean, off the coast of Morro Bay, Calif. The test allowed engineers to better evaluate the spacecraft's parachute deployment system as part of a milestone under its Commercial Crew Integrated Capability agreement with NASA's Commercial Crew Program. Photo credit: NASA/Kim Shiflett
Integration of the SSPM and STAGE with the MPACT Virtual Facility Distributed Test Bed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cipiti, Benjamin B.; Shoman, Nathan
The Material Protection Accounting and Control Technologies (MPACT) program within DOE NE is working toward a 2020 milestone to demonstrate a Virtual Facility Distributed Test Bed. The goal of the Virtual Test Bed is to link all MPACT modeling tools, technology development, and experimental work to create a Safeguards and Security by Design capability for fuel cycle facilities. The Separation and Safeguards Performance Model (SSPM) forms the core safeguards analysis tool, and the Scenario Toolkit and Generation Environment (STAGE) code forms the core physical security tool. These models are used to design and analyze safeguards and security systems and generate performance metrics. Work over the past year has focused on how these models will integrate with the other capabilities in the MPACT program and specific model changes to enable more streamlined integration in the future. This report describes the model changes and plans for how the models will be used more collaboratively. The Virtual Facility is not designed to integrate all capabilities into one master code, but rather to maintain stand-alone capabilities that communicate results between codes more effectively.
NASA Astrophysics Data System (ADS)
Zea, Luis; Diaz, Alejandro R.; Shepherd, Charles K.; Kumar, Ranganathan
2010-07-01
Extra-vehicular activities (EVAs) are an essential part of human space exploration, but involve inherently dangerous procedures which can put crew safety at risk during a space mission. To help mitigate this risk, astronauts' training programs devote substantial attention to preparing for surface EVA emergency scenarios. With the help of two Mars Desert Research Station (MDRS) crews (61 and 65), wearing simulated spacesuits, the most important of these emergency scenarios were examined at three different types of locations that geologically and environmentally resemble lunar and Martian landscapes. These three platforms were analyzed geologically as well as topographically (utilizing a laser range finder with slope estimation capabilities and slope determination software). Emergency scenarios were separated into four main groups: (1) suit issues, (2) general physiological, (3) attacks and (4) others. Specific tools and procedures were developed to address each scenario. The tools and processes were tested in the field under Mars-analog conditions with the suited subjects for feasibility and speed of execution.
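As an illustration of the kind of slope determination mentioned above, the generic sketch below converts two laser range finder sightings (slant range plus inclination angle, assumed to lie along a common azimuth) into a percent grade; the actual instrument outputs and field procedure in the study may differ.

```python
# Generic percent-grade estimate between two sighted points on a traverse leg.
# Assumes both sightings are taken from the same observer position and azimuth.
import math

def slope_percent(range1_m, incl1_deg, range2_m, incl2_deg):
    """Grade (%) between two points, each given as slant range + inclination."""
    h1, v1 = (range1_m * math.cos(math.radians(incl1_deg)),
              range1_m * math.sin(math.radians(incl1_deg)))
    h2, v2 = (range2_m * math.cos(math.radians(incl2_deg)),
              range2_m * math.sin(math.radians(incl2_deg)))
    run = abs(h2 - h1)   # horizontal separation
    rise = v2 - v1       # vertical separation
    return 100.0 * rise / run if run > 0 else float("inf")

print(round(slope_percent(42.0, 3.5, 95.0, 6.0), 1))  # ~14.0 percent grade
```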
Simulation of Lunar Surface Communications Network Exploration Scenarios
NASA Technical Reports Server (NTRS)
Linsky, Thomas W.; Bhasin, Kul B.; White, Alex; Palangala, Srihari
2006-01-01
Simulations and modeling of surface-based communications networks provide a rapid and cost effective means of requirement analysis, protocol assessments, and tradeoff studies. Robust testing is especially important for exploration systems, where the cost of deployment is high and systems cannot be easily replaced or repaired. However, simulation of the envisioned exploration networks cannot be achieved using commercial off-the-shelf (COTS) network simulation software. Models for the nonstandard, non-COTS protocols used aboard space systems are not readily available. This paper will address the simulation of realistic scenarios representative of the activities which will take place on the surface of the Moon, including selection of candidate network architectures, and the development of an integrated simulation tool using OPNET Modeler capable of faithfully modeling those communications scenarios in the variable delay, dynamic surface environments. Scenarios for exploration missions, OPNET development, limitations, and simulation results will be provided and discussed.
Development of HWIL Testing Capabilities for Satellite Target Emulation at AEDC
NASA Astrophysics Data System (ADS)
Lowry, H.; Crider, D.; Burns, J.; Thompson, R.; Goldsmith, G., II; Sholes, W.
Programs involved in Space Situational Awareness (SSA) need the capability to test satellite sensors in a Hardware-in-the-Loop (HWIL) environment. Testing in a ground system avoids the significant cost of on-orbit test targets and the resulting issues such as debris mitigation and in-space testing implications. The space sensor test facilities at AEDC consist of cryo-vacuum chambers that have been developed to project simulated targets to air-borne, space-borne, and ballistic platforms. The 7V chamber performs calibration and characterization of surveillance and seeker systems, as well as some mission simulation. The 10V chamber is being upgraded to provide real-time target simulation during the detection, acquisition, discrimination, and terminal phases of a seeker mission. The objective of the Satellite Emulation project is to upgrade this existing capability to support the ability to discern and track other satellites and orbital debris in a HWIL environment. It would provide a baseline for realistic testing of satellite surveillance sensors, which would be operated in a controlled environment. Many sensor functions could be tested, including scene recognition and maneuvering control software, using real interceptor hardware and software. Statistically significant and repeatable datasets produced by the satellite emulation system can be acquired during such tests and saved for further analysis. In addition, the robustness of the discrimination and tracking algorithms can be investigated by a parametric analysis using slightly different scenarios; this will be used to determine critical points where a sensor system might fail. The radiometric characteristics of satellites are expected to be similar to the targets and decoys that make up a typical interceptor mission scenario, since they are near ambient temperature. Their spectral reflectivity, emissivity, and shape must also be considered, but the projection systems employed in the 7V and 10V chambers should be capable of providing the simulation of satellites as well. There may also be a need for greater radiometric intensity or shorter time response. An appropriate satellite model is integral to the scene generation process to meet the requirements of SSA programs. The Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) facility and the Guided Weapons Evaluation Facility (GWEF), both at Eglin Air Force Base, FL, are assisting in developing the scene projection hardware, based on their significant test experience using resistive emitter arrays to test interceptors in a real-time environment. Army Aviation and Missile Research & Development Command (AMRDEC) will develop the Scene Generation System for the real-time mission simulation.
The Practical Concept of an Evaluator and Its Use in the Design of Training Systems.
ERIC Educational Resources Information Center
Gibbons, Andrew S.; Rogers, Dwayne H.
1991-01-01
The evaluator is an instructional system product that provides practice, testing capability, and feedback in a way not yet seen in computer-assisted instruction. Training methods using an evaluator contain scenario-based simulation exercises, followed by a critique of performance. A focus on competency-based education and performance makes the…
Becky K. Kerns; Miles A. Hemstrom; David Conklin; Gabriel I. Yospin; Bart Johnson; Dominique Bachelet; Scott Bridgham
2012-01-01
Understanding landscape vegetation dynamics often involves the use of scientifically-based modeling tools that are capable of testing alternative management scenarios given complex ecological, management, and social conditions. State-and-transition simulation model (STSM) frameworks and software such as PATH and VDDT are commonly used tools that simulate how landscapes...
Complexity associated with the optimisation of capability options in military operations
NASA Astrophysics Data System (ADS)
Pincombe, A.; Bender, A.; Allen, G.
2005-12-01
In the context of a military operation, even if the intended actions, the geographic location, and the capabilities of the opposition are known, there are still some critical uncertainties that could have a major impact on the effectiveness of a given set of capabilities. These uncertainties include unpredictable events and the response alternatives that are available to the command and control elements of the capability set. They greatly complicate any a priori mathematical description. In a forecasting approach, the most likely future might be chosen and a solution sought that is optimal for that case. With scenario analysis, futures are proposed on the basis of critical uncertainties and the option that is most robust is chosen. We use scenario analysis but our approach is different in that we focus on the complexity and use the coupling between scenarios and options to create information on ideal options. The approach makes use of both soft and hard operations research methods, with subject matter expertise being used to define plausible responses to scenarios. In each scenario, uncertainty affects only a subset of the system-inherent variables and the variables that describe system-environment interactions. It is this scenario-specific reduction of variables that makes the problem mathematically tractable. The process we define is significantly different from existing scenario analysis processes, so we have named it adversarial scenario analysis. It can be used in conjunction with other methods, including recent improvements to the scenario analysis process. To illustrate the approach, we undertake a tactical level scenario analysis for a logistics problem that is defined by a network, expected throughputs to end users, the transport capacity available, the infrastructure at the nodes and the capacities of roads, stocks, etc. The throughput capacity, i.e., the effectiveness, of the system relies on all of these variables and on the couplings between them. The system is initially in equilibrium for a given level of demand. However, different, and simpler, solutions emerge as the balance of couplings and the importance of variables change. The scenarios describe such changes in conditions. For each scenario it was possible to define measures that describe the differences between options. As with agent-based distillations, the solution is essentially qualitative and exploratory, bringing awareness of possible future difficulties and of the capabilities that are necessary if we are to deal successfully with those difficulties.
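As a toy illustration of comparing capability options across scenarios, the sketch below models a logistics system as a capacitated network and computes its throughput to end users under a baseline and a degraded-road scenario; the network, capacities, and scenario edits are hypothetical and are not the study's model (Python with networkx assumed).

```python
# Throughput (maximum flow) of a hypothetical logistics network per scenario.
import networkx as nx

def build_network(road_capacity_ab=120):
    g = nx.DiGraph()
    g.add_edge("port", "depot_A", capacity=150)      # tonnes/day
    g.add_edge("port", "depot_B", capacity=100)
    g.add_edge("depot_A", "forward_base", capacity=road_capacity_ab)
    g.add_edge("depot_B", "forward_base", capacity=80)
    g.add_edge("forward_base", "end_users", capacity=200)
    return g

scenarios = {
    "baseline": build_network(),
    "road_A_degraded": build_network(road_capacity_ab=40),  # e.g. washed-out route
}

for name, g in scenarios.items():
    throughput, _ = nx.maximum_flow(g, "port", "end_users")
    print(f"{name}: {throughput} tonnes/day to end users")   # 200 vs. 120
```

Running each candidate option through the same set of scenario networks gives one simple, comparable measure of how robust the option is to the postulated changes in conditions.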
Adaptive Augmenting Control Flight Characterization Experiment on an F/A-18
NASA Technical Reports Server (NTRS)
VanZwieten, Tannen S.; Gilligan, Eric T.; Wall, John H.; Orr, Jeb S.; Miller, Christopher J.; Hanson, Curtis E.
2014-01-01
The NASA Marshall Space Flight Center (MSFC) Flight Mechanics and Analysis Division developed an Adaptive Augmenting Control (AAC) algorithm for launch vehicles that improves robustness and performance by adapting an otherwise well-tuned classical control algorithm to unexpected environments or variations in vehicle dynamics. This AAC algorithm is currently part of the baseline design for the SLS Flight Control System (FCS), but prior to this series of research flights it was the only component of the autopilot design that had not been flight tested. The Space Launch System (SLS) flight software prototype, including the adaptive component, was recently tested on a piloted aircraft at Dryden Flight Research Center (DFRC), which has the capability to achieve a high level of dynamic similarity to a launch vehicle. Scenarios for the flight test campaign were designed specifically to evaluate the AAC algorithm to ensure that it is able to achieve the expected performance improvements with no adverse impacts in nominal or near-nominal scenarios. Having completed the recent series of flight characterization experiments on DFRC's F/A-18, the AAC algorithm's capability, robustness, and reproducibility have been successfully demonstrated. Thus, the entire SLS control architecture has been successfully flight tested in a relevant environment. This has increased NASA's confidence that the autopilot design is ready to fly on the SLS Block I vehicle and will exceed the performance of previous architectures.
RELAP-7 Software Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.
NASA Astrophysics Data System (ADS)
Perry, S. C.; Holbrook, C. C.
2008-12-01
The ShakeOut Scenario of a magnitude 7.8 earthquake on the southern San Andreas Fault was developed to fit needs of end users, particularly emergency managers at Federal, State, and local levels. Customization has continued after initial publication. The Scenario, a collaboration among some 300 experts in physical and social sciences, engineering, and industry, was released in May, 2008, to a key planning conference for the November 2008 Golden Guardian Exercise series. According to long-standing observers, the 2008 exercise is the most ambitious of their experience. The scientific foundation has attracted a large number of participants and there are already requests to continue use of the Scenario in 2009. Successful exercises cover a limited range of capabilities, in order to test performance in measurable ways, and to train staff without overwhelming them. Any one exercise would fail if it attempted to capture the complexity of impacts from a major earthquake. Instead, exercise planners have used the Scenario like a magnifying glass to identify risk and capabilities most critical to their own jurisdictions. Presentations by Scenario scientists and a 16-page narrative provided an initial overview. However, many planners were daunted in attempts to extract details from a 300-page report, 12 supplemental studies, and 10 appendices, or in attempts to cast the reality into straightforward events to drive successful exercises. Thus we developed an evolving collection of documents, presentations, and consultations that included impacts to specific jurisdictions; distillations of damages and consequences; and annotated lists of capabilities and situations to consider. Some exercise planners needed realistic extrapolations beyond posited damages; others sought reality checks; yet others needed new formats or perspectives. Through all this, it was essential to maintain flexibility, assisting planners to adjust findings where appropriate, while indicating why some results could not be changed. The results of these efforts have been exercises that use a richer set of scientific findings; planners and participants with a broader understanding of the regional impacts of a major earthquake; and for future scenarists, increased insight into emergency management application of hazard science results, and into the value of ongoing engagement with stakeholders.
Human-Robot Planetary Exploration Teams
NASA Technical Reports Server (NTRS)
Tyree, Kimberly
2004-01-01
The EVA Robotic Assistant (ERA) project at NASA Johnson Space Center studies human-robot interaction and robotic assistance for future human planetary exploration. Over the past four years, the ERA project has been performing field tests with one or more four-wheeled robotic platforms and one or more space-suited humans. These tests have provided experience in how robots can assist humans, how robots and humans can communicate in remote environments, and what combination of humans and robots works best for different scenarios. The most efficient way to understand what tasks human explorers will actually perform, and how robots can best assist them, is to have human explorers and scientists go and explore in an outdoor, planetary-relevant environment, with robots to demonstrate what they are capable of, and roboticists to observe the results. It can be difficult to have a human expert itemize all the needed tasks required for exploration while sitting in a lab: humans do not always remember all the details, and experts in one arena may not even recognize that the lower level tasks they take for granted may be essential for a roboticist to know about. Field tests thus create conditions that more accurately reveal missing components and invalid assumptions, as well as allow tests and comparisons of new approaches and demonstrations of working systems. We have performed field tests in our local rock yard, in several locations in the Arizona desert, and in the Utah desert. We have tested multiple exploration scenarios, such as geological traverses, cable or solar panel deployments, and science instrument deployments. The configuration of our robot can be changed, based on what equipment is needed for a given scenario, and the sensor mast can even be placed on one of two robot bases, each with different motion capabilities. The software architecture of our robot is also designed to be as modular as possible, to allow for hardware and configuration changes. Two focus areas of our research are safety and crew time efficiency. For safety, our work involves enabling humans to reliably communicate with a robot while moving in the same workspace, and enabling robots to monitor and advise humans of potential problems. Voice, gesture, remote computer control, and enhanced robot intelligence are methods we are studying. For crew time efficiency, we are investigating the effects of assigning different roles to humans and robots in collaborative exploration scenarios.
NASA Technical Reports Server (NTRS)
Hanley, G. M.
1980-01-01
An evolutionary Satellite Power Systems development plan was prepared. Planning analysis was directed toward the evolution of a scenario that met the stated objectives, was technically possible and economically attractive, and took into account constraining considerations, such as requirements for very large scale end-to-end demonstration in a compressed time frame, the relative cost/technical merits of ground testing versus space testing, and the need for large mass flow capability to low Earth orbit and geosynchronous orbit at reasonable cost per pound.
NASA Technical Reports Server (NTRS)
Qualls, Garry; Cross, Charles; Mahlin, Matthew; Montague, Gilbert; Motter, Mark; Neilan, James; Rothhaar, Paul; Tran, Loc; Trujillo, Anna; Allen, B. Danette
2015-01-01
Software tools are being developed by the Autonomy Incubator at NASA's Langley Research Center that will provide an integrated and scalable capability to support research and non-research flight operations across several flight domains, including urban and mixed indoor-outdoor operations. These tools incorporate a full range of data products to support mission planning, approval, flight operations, and post-flight review. The system can support a number of different operational scenarios that can incorporate live and archived data streams for UAS operators, airspace regulators, and other important stakeholders. Example use cases are described that illustrate how the tools will benefit a variety of users in nominal and off-nominal operational scenarios. An overview is presented for the current state of the toolset, including a summary of current demonstrations that have been completed. Details of the final, fully operational capability are also presented, including the interfaces that will be supported to ensure compliance with existing and future airspace operations environments.
Expanded Capabilities for the Hydrogen Financial Analysis Scenario Tool (H2FAST)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bush, Brian; Melaina, Marc; Penev, Michael
This presentation describes how NREL expanded the capabilities for the Hydrogen Financial Analysis Scenario Tool (H2FAST) in FY16. It was presented at the U.S. Department of Energy Hydrogen and Fuel Cells Program 2016 Annual Merit Review and Peer Evaluation Meeting on June 8, 2016, in Washington, D.C.
Physical environment virtualization for human activities recognition
NASA Astrophysics Data System (ADS)
Poshtkar, Azin; Elangovan, Vinayak; Shirkhodaie, Amir; Chan, Alex; Hu, Shuowen
2015-05-01
Human activity recognition research relies heavily on extensive datasets to verify and validate performance of activity recognition algorithms. However, obtaining real datasets is expensive and highly time consuming. A physics-based virtual simulation can accelerate the development of context-based human activity recognition algorithms and techniques by generating relevant training and testing videos simulating diverse operational scenarios. In this paper, we discuss in detail the requisite capabilities of a virtual environment to serve as a test bed for evaluating and enhancing activity recognition algorithms. To demonstrate the numerous advantages of virtual environment development, a newly developed virtual environment simulation modeling (VESM) environment is presented here to generate calibrated multisource imagery datasets suitable for development and testing of recognition algorithms for context-based human activities. The VESM environment serves as a versatile test bed to generate a vast amount of realistic data for training and testing of sensor processing algorithms. To demonstrate the effectiveness of the VESM environment, we present various simulated scenarios and processed results to infer proper semantic annotations from the high fidelity imagery data for human-vehicle activity recognition under different operational contexts.
West-Coast Wide Expansion and Testing of the Geodetic Alarm System (G-larmS)
NASA Astrophysics Data System (ADS)
Ruhl, C. J.; Grapenthin, R.; Melgar, D.; Aranha, M. A.; Allen, R. M.
2016-12-01
The Geodetic Alarm System (G-larmS) was developed in collaboration between the Berkeley Seismological Laboratory (BSL) and New Mexico Tech for real-time Earthquake Early Warning (EEW). G-larmS has been in continuous operation at the BSL since 2014 using event triggers from the ShakeAlert EEW system and real-time position time series from a fully triangulated network consisting of BARD, PBO and USGS stations across northern California (CA). G-larmS has been extended to include southern CA and Cascadia, providing continuous west-coast wide coverage. G-larmS currently uses high rate (1 Hz), low latency (< 5 s), accurate positioning (cm level) time series data from a regional GPS network and P-wave event triggers from the ShakeAlert EEW system. It extracts static offsets from real-time GPS time series upon S-wave arrival and performs a least squares inversion on these offsets to determine slip on a finite fault. A key issue with geodetic EEW approaches is that unlike seismology-based algorithms that are routinely tested using frequent small-magnitude events, geodetic systems are not regularly exercised. Scenario ruptures are therefore important for testing the performance of G-larmS. We discuss results from scenario events on several large faults (capable of M>6.5) in CA and Cascadia built from realistic 3D geometries. Synthetic long-period 1Hz displacement waveforms were obtained from a new stochastic kinematic slip distribution generation method. Waveforms are validated by direct comparison to peak P-wave displacement scaling laws and to PGD GMPEs obtained from high-rate GPS observations of large events worldwide. We run the scenarios on real-time streams to systematically test the recovery of slip and magnitude by G-larmS. In addition to presenting these results, we will discuss new capabilities, such as implementing 2D geometry and the applicability of these results to GPS enhanced tsunami warning systems.
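The two numerical steps described above (static offset extraction after the S-wave has passed, followed by a least-squares slip inversion) can be sketched as follows. The windowing choices are arbitrary and the Green's function matrix is synthetic, so this is an illustration of the general approach rather than the G-larmS implementation.

```python
# Sketch: (1) estimate a static offset from a GPS displacement time series,
# (2) invert station offsets for slip on fault patches by least squares.
# G maps unit slip on each patch to station offsets (e.g. from an elastic
# dislocation model) and is assumed precomputed; values here are synthetic.
import numpy as np

def static_offset(times, disp, event_time, pre_window=30.0,
                  post_start=20.0, post_window=30.0):
    """Mean displacement well after the event minus mean displacement before it."""
    pre = disp[(times >= event_time - pre_window) & (times < event_time)]
    post = disp[(times >= event_time + post_start) &
                (times < event_time + post_start + post_window)]
    return post.mean() - pre.mean()

def invert_slip(G, offsets):
    """Unregularized least-squares slip per fault patch from static offsets."""
    slip, *_ = np.linalg.lstsq(G, offsets, rcond=None)
    return slip

rng = np.random.default_rng(0)

# Offset extraction on a synthetic 1 Hz east-component series with a 5 cm step:
t = np.arange(0.0, 200.0)
east = np.where(t > 100.0, 0.05, 0.0) + rng.normal(scale=0.005, size=t.size)
print(static_offset(t, east, event_time=100.0))        # ~0.05 m

# Slip inversion with a synthetic Green's function matrix:
# 6 offset components (3 stations x E,N) and 2 fault patches.
G = rng.normal(size=(6, 2))
true_slip = np.array([1.2, 0.4])                        # metres
offsets = G @ true_slip + rng.normal(scale=0.01, size=6)
print(invert_slip(G, offsets))                          # recovers ~[1.2, 0.4]
```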
NASA Technical Reports Server (NTRS)
Schifer, Nicholas A.; Oriti, Salvatore M.
2013-01-01
The NASA Glenn Research Center (GRC) has been testing 100 We class, free-piston Stirling convertors for potential use in Stirling Radioisotope Power Systems (RPS) for space science and exploration missions. Free-piston Stirling convertors are capable of achieving a 38% conversion efficiency, making Stirling attractive for meeting future power system needs in light of the shrinking U.S. plutonium fuel supply. Convertors currently on test include four Stirling Technology Demonstration Convertors (TDCs), manufactured by the Stirling Technology Company (STC), and six Advanced Stirling Convertors (ASCs), manufactured by Sunpower, Inc. Total operating time exceeds 514,000 hours (59 years). Several tests have been initiated to demonstrate the functionality of Stirling convertors for space applications, including in-air extended operation and thermal vacuum extended operation. Other tests have also been conducted to characterize Stirling performance in anticipated mission scenarios. Data collected during testing has been used to support life and reliability estimates, drive design changes and improve quality, and plan for expected mission scenarios. This paper will provide a summary of convertors tested at NASA GRC and discuss lessons learned through extended testing.
Future Interoperability of Camp Protection Systems (FICAPS)
NASA Astrophysics Data System (ADS)
Caron, Sylvie; Gündisch, Rainer; Marchand, Alain; Stahl, Karl-Hermann
2013-05-01
The FICAPS Project has been established as a Project of the European Defence Agency based on an initiative of Germany and France. The goal of this Project was to derive Guidelines which, if properly implemented in future developments, improve Camp Protection Systems (CPS) by enabling and improving interoperability between Camp Protection Systems and their Equipments of different Nations involved in multinational missions. These Guidelines shall allow for: • Real-time information exchange between equipments and systems of different suppliers and nations (even via SatCom); • Quick and easy replacement of equipments (even of different Nations) at run-time in the field by means of plug-and-play capability, thus lowering the operational and logistic costs and making the system highly available; • Enhancement of system capabilities (open and modular systems) by adding new equipment with new capabilities (just plug in, with automatic adjustment of the Human Machine Interface, HMI) without costly and time-consuming validation and test on system level (validation and test can be done on Equipment level). Four scenarios have been identified to summarize the interoperability requirements from an operational viewpoint. To prove the definitions given in the Guideline Document, a French and a German Demonstration System, based on existing national assets, were realized. Demonstrations, showing the capabilities given by the defined interoperability requirements with respect to the operational scenarios, were performed. Demonstrations included remote control of a CPS by another CPS, remote sensor control (Electro-Optic/InfraRed, EO/IR) and remote effector control. This capability can be applied to extend the protection area or to protect distant infrastructural assets. The required interoperability functionality was shown successfully. Although the focus of the FICAPS project was on camp protection, the solution found is also appropriate for other force protection and ISR (Intelligence, Surveillance, Reconnaissance) tasks, not only due to its flexibility but also due to the chosen interfacing.
The Ensemble Space Weather Modeling System (eSWMS): Status, Capabilities and Challenges
NASA Astrophysics Data System (ADS)
Fry, C. D.; Eccles, J. V.; Reich, J. P.
2010-12-01
Marking a milestone in space weather forecasting, the Space Weather Modeling System (SWMS) successfully completed validation testing in advance of operational testing at Air Force Weather Agency’s primary space weather production center. This is the first coupling of stand-alone, physics-based space weather models that are currently in operations at AFWA supporting the warfighter. Significant development effort went into ensuring the component models were portable and scalable while maintaining consistent results across diverse high performance computing platforms. Coupling was accomplished under the Earth System Modeling Framework (ESMF). The coupled space weather models are the Hakamada-Akasofu-Fry version 2 (HAFv2) solar wind model and GAIM1, the ionospheric forecast component of the Global Assimilation of Ionospheric Measurements (GAIM) model. The SWMS was developed by team members from AFWA, Explorations Physics International, Inc. (EXPI) and Space Environment Corporation (SEC). The successful development of the SWMS provides new capabilities beyond enabling extended lead-time, data-driven ionospheric forecasts. These include ingesting diverse data sets at higher resolution, incorporating denser computational grids at finer time steps, and performing probability-based ensemble forecasts. Work of the SWMS development team now focuses on implementing the ensemble-based probability forecast capability by feeding multiple scenarios of 5 days of solar wind forecasts to the GAIM1 model based on the variation of the input fields to the HAFv2 model. The ensemble SWMS (eSWMS) will provide the most-likely space weather scenario with uncertainty estimates for important forecast fields. The eSWMS will allow DoD mission planners to consider the effects of space weather on their systems with more advance warning than is currently possible. The payoff is enhanced, tailored support to the warfighter with improved capabilities, such as point-to-point HF propagation forecasts, single-frequency GPS error corrections, and high cadence, high-resolution Space Situational Awareness (SSA) products. We present the current status of eSWMS, its capabilities, limitations and path of transition to operational use.
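As a generic illustration of the ensemble step (not the HAFv2/GAIM1 coupling itself), the sketch below perturbs a driver quantity, runs a placeholder downstream forecast for each member, and summarizes the members as a most-likely value with an uncertainty band.

```python
# Generic ensemble-forecast sketch: perturb the driver input, run each member
# through a stand-in forecast function, and report median plus percentile bounds.
import numpy as np

def forecast_member(driver_speed_kms, rng):
    """Placeholder for a downstream forecast quantity driven by the solar wind."""
    return 0.02 * driver_speed_kms + rng.normal(scale=0.5)

rng = np.random.default_rng(1)
members = [forecast_member(450 + rng.normal(scale=60), rng) for _ in range(50)]
median = np.median(members)
lo, hi = np.percentile(members, [10, 90])
print(f"most-likely forecast {median:.1f}, 10-90% range [{lo:.1f}, {hi:.1f}]")
```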
Celeste Journey; Anne B. Hoos; David E. Ladd; John W. brakebill; Richard A. Smith
2016-01-01
The U.S. Geological Survey (USGS) National Water Quality Assessment program has developed a web-based decision support system (DSS) to provide free public access to the steady-state SPAtially Referenced Regressions On Watershed attributes (SPARROW) model simulation results on nutrient conditions in streams and rivers and to offer scenario testing capabilities for...
STEADY STATE MODELING OF THE MINIMUM CRITICAL CORE OF THE TRANSIENT REACTOR TEST FACILITY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anthony L. Alberti; Todd S. Palmer; Javier Ortensi
2016-05-01
With the advent of next generation reactor systems and new fuel designs, the U.S. Department of Energy (DOE) has identified the need for the resumption of transient testing of nuclear fuels. The DOE has decided that the Transient Reactor Test Facility (TREAT) at Idaho National Laboratory (INL) is best suited for future testing. TREAT is a thermal neutron spectrum, air-cooled, nuclear test facility that is designed to test nuclear fuels in transient scenarios. These specific scenarios range from simple temperature transients to full fuel melt accidents. DOE has expressed a desire to develop a simulation capability that will accurately model the experiments before they are irradiated at the facility. The aim is for this capability to emphasize effective and safe operation while minimizing experimental time and cost. The multiphysics platform MOOSE has been selected as the framework for this project. The goals for this work are to identify the fundamental neutronics properties of TREAT and to develop an accurate steady state model for future multiphysics transient simulations. In order to minimize computational cost, the effects of spatial homogenization and angular discretization are investigated. It was found that significant anisotropy is present in TREAT assemblies, and to capture this effect, explicit modeling of cooling channels and inter-element gaps is necessary. For this modeling scheme, single element calculations at 293 K gave power distributions with a root mean square difference of 0.076% from those of reference SERPENT calculations. The minimum critical core configuration with identical gap and channel treatment at 293 K resulted in a root mean square, total core, radial power distribution 2.423% different from those of reference SERPENT solutions.
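For reference, the root-mean-square percent difference quoted above can be computed as in the short sketch below; the arrays are placeholders and the exact normalization used in the study may differ.

```python
# RMS relative difference (in percent) between a computed power distribution
# and a reference (e.g. SERPENT) distribution. Values are placeholders.
import numpy as np

def rms_percent_difference(computed, reference):
    """RMS of the element-wise relative difference, expressed in percent."""
    rel = (np.asarray(computed) - np.asarray(reference)) / np.asarray(reference)
    return 100.0 * np.sqrt(np.mean(rel ** 2))

computed = np.array([0.98, 1.01, 1.02, 0.99])     # normalized region powers
reference = np.array([0.981, 1.009, 1.019, 0.991])
print(rms_percent_difference(computed, reference))  # ~0.1 (percent)
```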
NASA's New High Intensity Solar Environment Test Capability
NASA Technical Reports Server (NTRS)
Schneider, Todd A.; Vaughn, Jason A.; Wright, Kenneth H.
2012-01-01
Across the world, new spaceflight missions are being designed and executed that will place spacecraft and instruments into challenging environments throughout the solar system. To aid in the successful completion of these new missions, NASA has developed a new flexible space environment test platform. The High Intensity Solar Environment Test (HISET) capability located at NASA's Marshall Space Flight Center provides scientists and engineers with the means to test spacecraft materials and systems in a wide range of solar wind and solar photon environments. Featuring a solar simulator capable of delivering approximately 1 MW/m2 of broad spectrum radiation at maximum power, HISET provides a means to test systems or components that could explore the solar corona. The solar simulator consists of three high-power Xenon arc lamps that can be operated independently over a range of power to meet test requirements; i.e., the lamp power can be greatly reduced to simulate the solar intensity at several AU. Integral to the HISET capability are charged particle sources that can provide a solar wind (electron and proton) environment. Used individually or in combination, the charged particle sources can provide fluxes ranging from a few nA/cm2 to 100s of nA/cm2 over an energy range of 50 eV to 100 keV for electrons and 100 eV to 30 keV for protons. Anchored by a high vacuum facility equipped with a liquid nitrogen cold shroud for radiative cooling scenarios, HISET is able to accommodate samples as large as 1 meter in diameter. In this poster, details of the HISET capability will be presented, including the wide-ranging configurability of the system.
Unmanned and Unattended Response Capability for Homeland Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
BENNETT, PHIL C.
2002-11-01
An analysis was conducted of the potential for unmanned and unattended robotic technologies for forward-based, immediate response capabilities that enable access and controlled task performance. The authors analyze high-impact response scenarios in conjunction with homeland security organizations, such as the NNSA Office of Emergency Response, the FBI, the National Guard, and the Army Technical Escort Unit, to cover a range of radiological, chemical and biological threats. They conducted an analysis of the potential of forward-based, unmanned and unattended robotic technologies to accelerate and enhance emergency and crisis response by Homeland Defense organizations. Response systems concepts were developed utilizing new technologies supported by existing emerging threats base technologies to meet the defined response scenarios. These systems will pre-position robotic and remote sensing capabilities stationed close to multiple sites for immediate action. Analysis of assembled systems included experimental activities to determine potential efficacy in the response scenarios, and iteration on systems concepts and remote sensing and robotic technologies, creating new immediate response capabilities for Homeland Defense.
EV-Grid Integration (EVGI) Control and System Implementation - Research Overview
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kisacikoglu, Mithat; Markel, Tony; Meintz, Andrew
2016-03-23
Plug-in electric vehicles (PEVs) are being increasingly adopted in industry today. Microgrid applications of PEVs require the development of charging and discharging algorithms and individual characterization of vehicles including the on-board chargers and vehicle mobility. This study summarizes the capabilities of the Electric Vehicle Grid Integration (EVGI) Team at NREL and underlines different recent projects of the Team. Our studies include V1G, V2G, and V2H control of PEVs as well as test and analysis of stationary and dynamic wireless power transfer (WPT) systems. The presentation also includes the future scope of study, which implements real-time simulation of PEVs in a microgrid scenario. The capabilities at the Vehicle Testing and Integration Facility (VTIF) and the Energy Systems Integration Facility (ESIF) are described within the scope of the EVGI research.
1992-10-01
intelligence developed an authentic European conflict scenario based on WINTEX-CIMEX, a detailed European command post exercise. One of the primary...them. The only exercises in which we effectively train from start to finish are the large CPXs like WINTEX/CIMEX. This exercise is a procedural...general war CPX, sponsored by the Joint Chiefs of Staff. WINTEX/CIMEX exercises, tests, and evaluates command and control procedures, planning, and
A 21st Century National Public Health System
2008-09-01
Security (DHS) released fifteen national planning scenarios in 2004 and the Target Capabilities List: A Companion to the National Preparedness Goal in...no clinical samples available from the first SARS patient in China to test for the virus; however, the second identified SARS case was a chef, Huang...Xingchu, who worked at a restaurant and was reported to have atypical pneumonia. As a chef, he came into regular contact with several types of
A Ground Testbed to Advance US Capability in Autonomous Rendezvous and Docking Project
NASA Technical Reports Server (NTRS)
D'Souza, Chris
2014-01-01
This project will advance the Autonomous Rendezvous and Docking (AR&D) GNC system by testing it on hardware, particularly in a flight processor, with a goal of testing it in IPAS with the Waypoint L2 AR&D scenario. The entire Agency supports development of a Commodity for Autonomous Rendezvous and Docking (CARD) as outlined in the Agency-wide Community of Practice whitepaper entitled: "A Strategy for the U.S. to Develop and Maintain a Mainstream Capability for Automated/Autonomous Rendezvous and Docking in Low Earth Orbit and Beyond". The whitepaper establishes that 1) the US is in a continual state of AR&D point-designs and therefore there is no US "off-the-shelf" AR&D capability in existence today, 2) the US has fallen behind our foreign counterparts particularly in the autonomy of AR&D systems, 3) development of an AR&D commodity is a national need that would benefit NASA, our commercial partners, and DoD, and 4) an initial estimate indicates that the development of a standardized AR&D capability could save the US approximately $60M for each AR&D project and cut each project's AR&D flight system implementation time in half.
The Personal Satellite Assistant: An Internal Spacecraft Autonomous Mobile Monitor
NASA Technical Reports Server (NTRS)
Dorais, Gregory A.; Gawdiak, Yuri; Clancy, Daniel (Technical Monitor)
2002-01-01
This paper presents an overview of the research and development effort at the NASA Ames Research Center to create an internal spacecraft autonomous mobile monitor capable of performing intra-vehicular sensing activities by autonomously navigating onboard the International Space Station. We describe the capabilities, mission roles, rationale, high-level functional requirements, and design challenges for an autonomous mobile monitor. The rapid prototyping design methodology used, in which five prototypes of increasing fidelity are designed, is described as well as the status of these prototypes, of which two are operational and being tested, and one is actively being designed. The physical test facilities used to perform ground testing are briefly described, including a micro-gravity test facility that permits a prototype to propel itself in 3 dimensions with 6 degrees-of-freedom as if it were in a micro-gravity environment. We also provide an overview of the autonomy framework and its components including the software simulators used in the development process. Sample mission test scenarios are also described. The paper concludes with a discussion of future and related work followed by the summary.
Police officer response to the injured officer: a survey-based analysis of medical care decisions.
Sztajnkrycer, Matthew D; Callaway, David W; Baez, Amado Alejandro
2007-01-01
No widely accepted, specialized medical training exists for police officers confronted with medical emergencies while under conditions of active threat. The purpose of this study was to assess medical decision-making capabilities of law enforcement personnel under these circumstances. Web-based surveys were administered to all sworn officers within the county jurisdiction. Thirty-eight key actions were predetermined for nine injured officer scenarios, with each correct action worth one point. Descriptive statistics and t-tests were used to analyze results. Ninety-seven officers (65.1% response rate) responded to the survey. The majority of officers (68.0%) were trained to the first-responder level. Overall mean score for the scenarios was 15.5 +/- 3.6 (range 7-25). A higher level of medical training (EMT-B/P versus first responder) was associated with a higher mean score (16.6 +/- 3.4, p = 0.05 vs. 15.0 +/- 3.6, p = 0.05). Tactical unit assignment was associated with a lower score compared with non-assigned officers (13.5 +/- 2.9 vs. 16.0 +/- 3.6, p = 0.0085). No difference was noted based upon previous military experience. Ninety-two percent of respondents expressed interest in a law enforcement-oriented advanced first-aid course. Tactical medical decision-making capability, as assessed through the nine scenarios, was sub-optimal. In this post 9/11 era, development of law enforcement-specific medical training appears appropriate.
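Since the abstract reports group comparisons of mean scenario scores via t-tests, the sketch below shows how such a comparison could be run with SciPy. The per-officer scores are hypothetical placeholders, not the survey data.

```python
import numpy as np
from scipy import stats

# Hypothetical per-officer total scores (0-38 possible) for two training groups;
# the actual survey data are not reproduced here.
emt_scores = np.array([17, 15, 19, 21, 14, 16, 18, 13, 17, 16])
fr_scores  = np.array([14, 12, 16, 15, 13, 17, 15, 14, 16, 18, 13, 15])

# Welch's t-test (no equal-variance assumption) comparing the group means.
t_stat, p_value = stats.ttest_ind(emt_scores, fr_scores, equal_var=False)
print(f"EMT-B/P mean {emt_scores.mean():.1f} vs first responder {fr_scores.mean():.1f}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")
```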
Leveling the Playing Field: China’s Development of Advanced Energy Weapons
2012-05-02
Master of Military Studies Research Paper, September 2011 - April 2012. ...weapons in a surprise attack scenario to counter superior U.S. capabilities and technology. This paper will update and review current and developing...
NASA Technical Reports Server (NTRS)
Cissom, R. D.; Melton, T. L.; Schneider, M. P.; Lapenta, C. C.
1999-01-01
The objective of this paper is to provide the future ISS scientist and/or engineer a sense of what ISS payload operations are expected to be. This paper uses a real-time operations scenario to convey this message. The real-time operations scenario begins at the initiation of payload operations and runs through post run experiment analysis. In developing this scenario, it is assumed that the ISS payload operations flight and ground capabilities are fully available for use by the payload user community. Emphasis is placed on telescience operations whose main objective is to enable researchers to utilize experiment hardware onboard the International Space Station as if it were located in their terrestrial laboratory. An overview of the Payload Operations Integration Center (POIC) systems and user ground system options is included to provide an understanding of the systems and interfaces users will utilize to perform payload operations. Detailed information regarding POIC capabilities can be found in the POIC Capabilities Document, SSP 50304.
DNA testing in homicide investigations.
Prahlow, Joseph A; Cameron, Thomas; Arendt, Alexander; Cornelis, Kenneth; Bontrager, Anthony; Suth, Michael S; Black, Lisa; Tobey, Rebbecca; Pollock, Sharon; Stur, Shawn; Cotter, Kenneth; Gabrielse, Joel
2017-10-01
Objectives With the widespread use of DNA testing, police, death investigators, and attorneys need to be aware of the capabilities of this technology. This review provides an overview of scenarios where DNA evidence has played a major role in homicide investigations in order to highlight important educational issues for police, death investigators, forensic pathologists, and attorneys. Methods This was a nonrandom, observational, retrospective study. Data were obtained from the collective files of the authors from casework during a 15-year period, from 2000 through 2014. Results A series of nine scenarios, encompassing 11 deaths, is presented from the standpoint of the police and death investigation, the forensic pathology autopsy performance, the subsequent DNA testing of evidence, and, ultimately, the final adjudication of cases. Details of each case are presented, along with a discussion that focuses on important aspects of sample collection for potential DNA testing, especially at the crime scene and the autopsy. The presentation highlights the diversity of case and evidence types in which DNA testing played a valuable role in the successful prosecution of the case. Conclusions By highlighting homicides where DNA testing contributed to the successful adjudication of cases, police, death investigators, forensic pathologists, and attorneys will be better informed regarding the types of evidence and situations where such testing is of potential value.
Microtechnology management considering test and cost aspects for stacked 3D ICs with MEMS
NASA Astrophysics Data System (ADS)
Hahn, K.; Wahl, M.; Busch, R.; Grünewald, A.; Brück, R.
2018-01-01
Innovative automotive systems require complex semiconductor devices currently only available in consumer grade quality. The European project TRACE will develop and demonstrate methods, processes, and tools to facilitate usage of Consumer Electronics (CE) components to be deployable more rapidly in the life-critical automotive domain. Consumer electronics increasingly use heterogeneous system integration methods and "More than Moore" technologies, which are capable of combining different circuit domains (Analog, Digital, RF, MEMS) and which are integrated within SiP or 3D stacks. Making these technologies or at least some of the process steps available under automotive electronics requirements is an important goal to keep pace with the growing demand for information processing within cars. The approach presented in this paper aims at a technology management and recommendation system that covers technology data, functional and non-functional constraints, and application scenarios, and that will incorporate test planning and cost consideration capabilities.
Countering MANPADS: study of new concepts and applications
NASA Astrophysics Data System (ADS)
Maltese, Dominique; Robineau, Jacques; Audren, Jean-Thierry; Aragones, Julien; Sailliot, Christophe
2006-05-01
The latest events of ground-to-air Man Portable Air Defense (MANPAD) attacks against aircraft have revealed a new threat both for military and civilian aircraft. Consequently, the implementation of protecting systems (i.e. Directed InfraRed Counter Measure - DIRCM) in order to face IR guided missiles now turns out to be inevitable. In the near future, aircraft will have to possess detection, tracking, targeting and jamming capabilities to face single and multiple MANPAD threats fired in short-range scenarios from various environments (urban sites, landscape...). In this paper, a practical example of a DIRCM system under study at the SAGEM DEFENSE & SECURITY company is presented. The self-protection solution includes built-in and automatic locking-on, tracking, identification and laser jamming capabilities, including defeat assessment. Target designations are provided by a Missile Warning System. Multiple-target scenarios have been considered to design the system architecture. The article deals with current and future threats (IR seekers of different generations...), scenarios and platforms for system definition. It also stresses self-protection solutions based on laser jamming capability. Different strategies including target identification, multi-band laser, and active imagery are described. The self-protection system under study at the SAGEM DEFENSE & SECURITY company is also part of this chapter. Finally, results of self-protection scenarios are provided for different MANPAD scenarios. Data were obtained from simulation software. The results highlight how the system reacts to incoming IR-guided missiles in short time scenarios.
Origins of Life: Open Questions and Debates
NASA Astrophysics Data System (ADS)
Brack, André
2017-10-01
Stanley Miller demonstrated in 1953 that it was possible to form amino acids from methane, ammonia, and hydrogen in water, thus launching the ambitious hope that chemists would be able to shed light on the origins of life by recreating a simple life form in a test tube. However, it must be acknowledged that the dream has not yet been accomplished, despite the great volume of effort and innovation put forward by the scientific community. A minima, primitive life can be defined as an open chemical system, fed with matter and energy, capable of self-reproduction (i.e., making more of itself by itself), and also capable of evolving. The concept of evolution implies that chemical systems would transfer their information fairly faithfully but make some random errors. If we compared the components of primitive life to parts of a chemical automaton, we could conceive that, by chance, some parts self-assembled to generate an automaton capable of assembling other parts to produce a true copy. Sometimes, minor errors in the building generated a more efficient automaton, which then became the dominant species. Quite different scenarios and routes have been followed and tested in the laboratory to explain the origin of life. There are two schools of thought in proposing the prebiotic supply of organics. The proponents of a metabolism-first call for the spontaneous formation of simple molecules from carbon dioxide and water to rapidly generate life. In a second hypothesis, the primeval soup scenario, it is proposed that rather complex organic molecules accumulated in a warm little pond prior to the emergence of life. The proponents of the primeval soup or replication first approach are by far the more active. They succeeded in reconstructing small-scale versions of proteins, membranes, and RNA. Quite different scenarios have been proposed for the inception of life: the RNA world, an origin within droplets, self-organization counteracting entropy, or a stochastic approach merging chemistry and geology. Understanding the emergence of a critical feature of life, its one-handedness, is a shared preoccupation in all these approaches.
Overview of the laser activities at Rheinmetall Waffe Munition
NASA Astrophysics Data System (ADS)
Ludewigt, Klaus; Riesbeck, Thomas; Schünemann, B.; Graf, A.; Jung, Markus; Schreiber, Th.; Eberhardt, Ramona; Tünnermann, A.
2012-11-01
The paper gives an overview of the laser weapon activities at RWM (Rheinmetall Waffe Munition) over the last years, starting from the current scenarios for laser weapon applications: CRAM (Counter Rocket Artillery Mortar), air defence, and UXO (unexploded ordnance) clearing. The basic requirements of a future laser weapon, such as beam diameter, beam quality, tracking capability, and adaptive optics, are deduced. For the UXO scenario, a mobile directed energy laser demonstrator for humanitarian mine and UXO clearing based on fiber lasers is presented. Based on these parameters, the system concept, including the cooling system, power supply, and the integration into the armoured vehicle TM 170, is explained. The contribution shows first experiments of UXO and IED clearing. Different technical approaches to achieve laser power in the 100 kW regime combined with very good beam quality are discussed to fulfil the requirements of the CRAM and air defence scenarios. Both spectral coupling and beam superimposing are pursued by Rheinmetall Waffe Munition. For spectral coupling, the basic technology parameters for the fiber laser and the dielectric grating, as well as the latest results, are put into context with the power levels reached by other groups. For the beam superimposing technology, the basic experiments regarding tracking capability and compensation of the atmosphere on the test range at Unterlüß are explained. A generic 10 kW Laser Weapon Demonstrator, based on two Laser Weapon Modules (LWM) from RWM, each a 5 kW fiber laser with beam forming and tracking, integrated by the team of RWM and RAD (Rheinmetall Air Defense) into a ground-based air defence system consisting of the Skyguard and Millennium turret, is presented. The flight path of the UAV within the valley of the live firing range at Ochsenboden, Switzerland, is shown. Selected results of the successful tests against UAVs are presented, showing the capability of the generic 10 kW Laser Weapon Demonstrator to track and destroy the target. From these results, the next steps of Rheinmetall Waffe Munition towards a 100 kW class laser weapon are explained.
NASA Astrophysics Data System (ADS)
Reid, J.; Polasky, S.; Hawthorne, P.
2014-12-01
Sustainable development requires providing for human well-being by meeting basic demands for food, energy and consumer goods and services, all while maintaining an environment capable of sustaining the provisioning of those demands for future generations. Failure to meet the basic needs of human well-being is not an ethically viable option, and strategies for doubling agricultural production and providing energy and goods for a growing population exist. However, the question is, at what cost to environmental quality? We developed an integrated modeling approach to test strategies for meeting multiple objectives within the limits of the earth system. We use scenarios to explore a range of assumptions on socio-economic factors like population growth, per capita income and technological change; food systems factors like food waste, production intensification and expansion, and meat demand; and technological developments in energy efficiency and wastewater treatment. We use these scenarios to test the conditions under which we can meet the simultaneous goals of sustainable development.
Design and Testing of an Active Heat Rejection Radiator with Digital Turn-Down Capability
NASA Technical Reports Server (NTRS)
Sunada, Eric; Birur, Gajanana C.; Ganapathi, Gani B.; Miller, Jennifer; Berisford, Daniel; Stephan, Ryan
2010-01-01
NASA's proposed lunar lander, Altair, will be exposed to vastly different external environment temperatures. The challenges to the active thermal control system (ATCS) are compounded by unfavorable transients in the internal waste heat dissipation profile: the lowest heat load occurs in the coldest environment while peak loads coincide with the warmest environment. The current baseline for the ATCS working fluid is a 50/50 inhibited propylene glycol/water mixture with a freeze temperature around -35 C. While the overall size of the radiator's heat rejection area is dictated by the worst case hot scenario, a turn-down feature is necessary to tolerate the worst case cold scenario. A radiator with digital turn-down capability is being designed as a robust means to maintain cabin environment and equipment temperatures while minimizing mass and power consumption. It utilizes active valving to isolate and render ineffective any number of parallel flow tubes which span across the ATCS radiator. Several options were assessed in a trade-study to accommodate flow tube isolation and how to deal with the stagnant fluid that would otherwise remain in the tube. Bread-board environmental tests were conducted for options to drain the fluid from a turned-down leg as well as an option to allow a leg to freeze/thaw. Each drain option involved a positive displacement gear pump with different methods of providing a pressure head to feed it. Test results showed that a start-up heater used to generate vapor at the tube inlet held the most promise for tube evacuation. Based on these test results and conclusions drawn from the trade-study, a full-scale radiator design is being developed for the Altair mission profile.
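The digital turn-down idea lends itself to a back-of-the-envelope sizing check: if only active tubes reject heat, total rejection scales roughly with the number of open legs. The sketch below works through that idealization; the tube count and per-tube rejection are hypothetical values, and fin conduction into stagnant legs is ignored.

```python
def rejected_heat(q_max_per_tube_w: float, n_active: int) -> float:
    """Idealized heat rejection of a digital turn-down radiator: only tubes
    with flow reject heat; conduction into stagnant legs is ignored."""
    return q_max_per_tube_w * n_active

N_TUBES = 20                  # hypothetical number of parallel flow tubes
Q_PER_TUBE_W = 300.0          # hypothetical per-tube rejection in the hot case

for n_active in (20, 10, 4, 1):
    q = rejected_heat(Q_PER_TUBE_W, n_active)
    print(f"{n_active:2d}/{N_TUBES} tubes active -> {q:6.0f} W "
          f"(turn-down ratio {N_TUBES / n_active:.0f}:1)")
```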
Runway Incursion Prevention System Testing at the Wallops Flight Facility
NASA Technical Reports Server (NTRS)
Jones, Denise R.
2005-01-01
A Runway Incursion Prevention System (RIPS) integrated with a Synthetic Vision System concept (SVS) was tested at the Reno/Tahoe International Airport (RNO) and Wallops Flight Facility (WAL) in the summer of 2004. RIPS provides enhanced surface situational awareness and alerts of runway conflicts in order to prevent runway incidents while also improving operational capability. A series of test runs was conducted using a Gulfstream-V (G-V) aircraft as the test platform and a NASA test aircraft and a NASA test van as incurring traffic. The purpose of the study, from the RIPS perspective, was to evaluate the RIPS airborne incursion detection algorithms and associated alerting and airport surface display concepts, focusing on crossing runway incursion scenarios. This paper gives an overview of the RIPS, WAL flight test activities, and WAL test results.
Hypothetical Scenario Generator for Fault-Tolerant Diagnosis
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
The Hypothetical Scenario Generator for Fault-tolerant Diagnostics (HSG) is an algorithm being developed in conjunction with other components of artificial-intelligence systems for automated diagnosis and prognosis of faults in spacecraft, aircraft, and other complex engineering systems. By incorporating prognostic capabilities along with advanced diagnostic capabilities, these developments hold promise to increase the safety and affordability of the affected engineering systems by making it possible to obtain timely and accurate information on the statuses of the systems and predicting impending failures well in advance. The HSG is a specific instance of a hypothetical-scenario generator that implements an innovative approach for performing diagnostic reasoning when data are missing. The special purpose served by the HSG is to (1) look for all possible ways in which the present state of the engineering system can be mapped with respect to a given model and (2) generate a prioritized set of future possible states and the scenarios of which they are parts.
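A toy version of that two-step idea (enumerate model-consistent states, then prioritize them) can be sketched as below. The component model, mode names, and prior probabilities are invented for illustration and do not reflect the actual HSG implementation.

```python
from itertools import product

# Toy system model: each component has a set of possible modes with prior probabilities.
MODEL = {
    "pump":   {"nominal": 0.95, "degraded": 0.04, "failed": 0.01},
    "valve":  {"open": 0.90, "stuck": 0.10},
    "sensor": {"ok": 0.97, "biased": 0.03},
}

def hypothetical_scenarios(observations):
    """Enumerate full system states consistent with (possibly partial)
    observations and rank them by prior probability."""
    names = list(MODEL)
    candidates = []
    for modes in product(*(MODEL[n] for n in names)):
        state = dict(zip(names, modes))
        if all(state[k] == v for k, v in observations.items()):
            prob = 1.0
            for n in names:
                prob *= MODEL[n][state[n]]
            candidates.append((prob, state))
    return sorted(candidates, key=lambda c: c[0], reverse=True)

# Missing-data case: only the valve state is observed.
for prob, state in hypothetical_scenarios({"valve": "stuck"}):
    print(f"{prob:.4f}  {state}")
```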
Analysis of the Operational Test and Evaluation of the CBRNE Crime Scene Modeller (C2SM)
2014-07-09
International Society for Optical Engineering, vol.7305, 730509 (10 pp), 2009. 2 It should be noted that these two projects are somewhat unique in...effectiveness of the capability as well as an opportunity to receive an arm’s length peer evaluation by an audience of International expert LE personnel with...Operators. Proceedings of the SPIE - The International Society for Optical Engineering, vol.7666, 76660N (8 pp.), 2010. 8 Note: the scenario is
Remote sensing and field test capabilities at U.S. Army Dugway Proving Ground
NASA Astrophysics Data System (ADS)
Pearson, James T.; Herron, Joshua P.; Marshall, Martin S.
2011-11-01
U.S. Army Dugway Proving Ground (DPG) is a Major Range and Test Facility Base (MRTFB) with the mission of testing chemical and biological defense systems and materials. DPG facilities include state-of-the-art laboratories, extensive test grids, controlled environment calibration facilities, and a variety of referee instruments for required test measurements. Among these referee instruments, DPG has built up a significant remote sensing capability for both chemical and biological detection. Technologies employed for remote sensing include FTIR spectroscopy, UV spectroscopy, Raman-shifted eye-safe lidar, and other elastic backscatter lidar systems. These systems provide referee data for bio-simulants, chemical simulants, toxic industrial chemicals (TICs), and toxic industrial materials (TIMs). In order to realize a successful large scale open-air test, each type of system requires calibration and characterization. DPG has developed specific calibration facilities to meet this need. These facilities are the Joint Ambient Breeze Tunnel (JABT), and the Active Standoff Chamber (ASC). The JABT and ASC are open ended controlled environment tunnels. Each includes validation instrumentation to characterize simulants that are disseminated. Standoff systems are positioned at typical field test distances to measure characterized simulants within the tunnel. Data from different types of systems can be easily correlated using this method, making later open air test results more meaningful. DPG has a variety of large scale test grids available for field tests. After and during testing, data from the various referee instruments is provided in a visual format to more easily draw conclusions on the results. This presentation provides an overview of DPG's standoff testing facilities and capabilities, as well as example data from different test scenarios.
Remote sensing and field test capabilities at U.S. Army Dugway Proving Ground
NASA Astrophysics Data System (ADS)
Pearson, James T.; Herron, Joshua P.; Marshall, Martin S.
2012-05-01
U.S. Army Dugway Proving Ground (DPG) is a Major Range and Test Facility Base (MRTFB) with the mission of testing chemical and biological defense systems and materials. DPG facilities include state-of-the-art laboratories, extensive test grids, controlled environment calibration facilities, and a variety of referee instruments for required test measurements. Among these referee instruments, DPG has built up a significant remote sensing capability for both chemical and biological detection. Technologies employed for remote sensing include FTIR spectroscopy, UV spectroscopy, Raman-shifted eye-safe lidar, and other elastic backscatter lidar systems. These systems provide referee data for bio-simulants, chemical simulants, toxic industrial chemicals (TICs), and toxic industrial materials (TIMs). In order to realize a successful large scale open-air test, each type of system requires calibration and characterization. DPG has developed specific calibration facilities to meet this need. These facilities are the Joint Ambient Breeze Tunnel (JABT), and the Active Standoff Chamber (ASC). The JABT and ASC are open ended controlled environment tunnels. Each includes validation instrumentation to characterize simulants that are disseminated. Standoff systems are positioned at typical field test distances to measure characterized simulants within the tunnel. Data from different types of systems can be easily correlated using this method, making later open air test results more meaningful. DPG has a variety of large scale test grids available for field tests. After and during testing, data from the various referee instruments is provided in a visual format to more easily draw conclusions on the results. This presentation provides an overview of DPG's standoff testing facilities and capabilities, as well as example data from different test scenarios.
Extinguishing agent for magnesium fire, phases 5 and 6
NASA Astrophysics Data System (ADS)
Beeson, H. D.; Tapscott, R. E.; Mason, B. E.
1987-07-01
This report documents the validation testing of the extinguishing system for metal fires developed as part of Phases 1 to 4. The results of this validation testing form the basis of information from which draft military specifications necessary to procure the agent and the agent delivery system may be developed. The developed system was tested against a variety of large-scale metal fire scenarios and the capabilities of the system were assessed. In addition the response of the system to storage and to changes in ambient conditions was tested. Results of this testing revealed that the developed system represented a reliable metal fire extinguishing system that could control and extinguish very large metal fires. The specifications developed for the agent and for the delivery system are discussed in detail.
Lunar stepping stones to a manned Mars exploration scenario
NASA Technical Reports Server (NTRS)
Davidson, W. L.; Stump, W. R.
1992-01-01
The initial trips to Mars by humans will be the first real severing of our dependence on Earth's environment. Common sense dictates that a human departure from Earth measured in years, to explore a distant planet, requires systems, techniques, and operations that have solid credibility proven with space experience. The space test and verification experience must occur with Mars-like conditions but under proving-ground conditions with good instrumentation, close monitoring, and fast emergency recovery capabilities. The lunar environment is the only arena that satisfies the requirements of a space planetary proving-ground. The objective of this scenario is to demonstrate a program planning approach that has human presence at Mars as the goal but, prudently, capitalizes on manned lunar project facilities, operations, and experience to enable a safe journey for the first Mars crews. The emphasis in lunar application objectives is to perform productive science and resources exploitation missions. Most of the Mars mission aspects can be proven in the lunar environment providing 'stepping stones' to conducting the first human mission to travel to Mars and return safely to Earth.
Reduced infectivity of waterborne viable but nonculturable Helicobacter pylori strain SS1 in mice.
Boehnke, Kevin F; Eaton, Kathryn A; Fontaine, Clinton; Brewster, Rebecca; Wu, Jianfeng; Eisenberg, Joseph N S; Valdivieso, Manuel; Baker, Laurence H; Xi, Chuanwu
2017-08-01
Helicobacter pylori infection has been consistently associated with lack of access to clean water and proper sanitation, but no studies have demonstrated that the transmission of viable but nonculturable (VBNC) H. pylori can occur from drinking contaminated water. In this study, we used a laboratory mouse model to test whether waterborne VBNC H. pylori could cause gastric infection. We performed five mouse experiments to assess the infectivity of VBNC H. pylori in various exposure scenarios. VBNC viability was examined using Live/Dead staining and Biolog phenotype metabolism arrays. High doses of VBNC H. pylori in water were chosen to test the "worst-case" scenario for different periods of time. One experiment also investigated the infectious capabilities of VBNC SS1 using gavage. Further, immunocompromised mice were exposed to examine infectivity among potentially vulnerable groups. After exposure, mice were euthanized and their stomachs were examined for H. pylori infection using culture and PCR methodology. VBNC cells were membrane intact and retained metabolic activity. Mice exposed to VBNC H. pylori via drinking water and gavage were not infected, despite the various exposure scenarios (immunocompromised, high doses) that might have permitted infection with VBNC H. pylori. The positive controls exposed to viable, culturable H. pylori did become infected. While other studies have used viable, culturable SS1 via gavage or drinking water exposures to successfully infect mice, in our study waterborne VBNC SS1 failed to colonize mice under all test conditions. Future studies could examine different H. pylori strains in similar exposure scenarios to compare the relative infectivity of the VBNC vs the viable, culturable state, which would help inform future risk assessments of H. pylori in water. © 2017 The Authors. Helicobacter Published by John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Horsham, Gary A. P.
1992-01-01
The structure and composition of a new, emerging software application, which models and analyzes space exploration scenario options for feasibility based on technology development projections is presented. The software application consists of four main components: a scenario generator for designing and inputting scenario options and constraints; a processor which performs algorithmic coupling and options analyses of mission activity requirements and technology capabilities; a results display which graphically and textually shows coupling and options analysis results; and a data/knowledge base which contains information on a variety of mission activities and (power and propulsion) technology system capabilities. The general long-range study process used by NASA to support recent studies is briefly introduced to provide the primary basis for comparison for discussing the potential advantages to be gained from developing and applying this kind of application. A hypothetical example of a scenario option to facilitate the best conceptual understanding of what the application is, how it works, or the operating methodology, and when it might be applied is presented.
NASA Technical Reports Server (NTRS)
Horsham, Gary A. P.
1991-01-01
The structure and composition of a new, emerging software application, which models and analyzes space exploration scenario options for feasibility based on technology development projections is presented. The software application consists of four main components: a scenario generator for designing and inputting scenario options and constraints; a processor which performs algorithmic coupling and options analyses of mission activity requirements and technology capabilities; a results display which graphically and textually shows coupling and options analysis results; and a data/knowledge base which contains information on a variety of mission activities and (power and propulsion) technology system capabilities. The general long-range study process used by NASA to support recent studies is briefly introduced to provide the primary basis for comparison for discussing the potential advantages to be gained from developing and applying this kind of application. A hypothetical example of a scenario option to facilitate the best conceptual understanding of what the application is, how it works, or the operating methodology, and when it might be applied is presented.
A Sensor Failure Simulator for Control System Reliability Studies
NASA Technical Reports Server (NTRS)
Melcher, K. J.; Delaat, J. C.; Merrill, W. C.; Oberle, L. G.; Sadler, G. G.; Schaefer, J. H.
1986-01-01
A real-time Sensor Failure Simulator (SFS) was designed and assembled for the Advanced Detection, Isolation, and Accommodation (ADIA) program. Various designs were considered. The design chosen features an IBM-PC/XT. The PC is used to drive analog circuitry for simulating sensor failures in real-time. A user defined scenario describes the failure simulation for each of the five incoming sensor signals. Capabilities exist for editing, saving, and retrieving the failure scenarios. The SFS has been tested closed-loop with the Controls Interface and Monitoring (CIM) unit, the ADIA control, and a real-time F100 hybrid simulation. From a productivity viewpoint, the menu driven user interface has proven to be efficient and easy to use. From a real-time viewpoint, the software controlling the simulation loop executes at greater than 100 cycles/sec.
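The description above (user-defined failure scenarios applied to five incoming sensor signals in a real-time loop) maps naturally onto a small failure-injection routine. The sketch below is a schematic stand-in, not the ADIA SFS software: the channel names, failure parameters, and sample times are all hypothetical.

```python
import math

def apply_failure(value, t, failure):
    """Corrupt one sensor sample according to a simple failure description."""
    if failure is None or t < failure["start"]:
        return value
    kind = failure["type"]
    if kind == "bias":
        return value + failure["magnitude"]
    if kind == "drift":
        return value + failure["rate"] * (t - failure["start"])
    if kind == "stuck":
        return failure["magnitude"]
    raise ValueError(f"unknown failure type: {kind}")

# One scenario entry per incoming sensor signal (None means the channel stays healthy).
SCENARIO = {
    "N1": {"type": "bias",  "start": 2.0, "magnitude": 150.0},
    "N2": None,
    "T4": {"type": "drift", "start": 1.0, "rate": 25.0},
    "P6": {"type": "stuck", "start": 3.0, "magnitude": 0.0},
    "WF": None,
}

for t in (0.0, 1.5, 2.5, 3.5):    # sample times in seconds
    clean = {name: 100.0 + 10.0 * math.sin(t) for name in SCENARIO}
    failed = {name: apply_failure(v, t, SCENARIO[name]) for name, v in clean.items()}
    print(f"t={t:.1f}s  " + ", ".join(f"{n}={v:7.2f}" for n, v in failed.items()))
```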
Simulating economic effects of disruptions in the telecommunications infrastructure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cox, Roger Gary; Barton, Dianne Catherine; Reinert, Rhonda K.
2004-01-01
CommAspen is a new agent-based model for simulating the interdependent effects of market decisions and disruptions in the telecommunications infrastructure on other critical infrastructures in the U.S. economy such as banking and finance, and electric power. CommAspen extends and modifies the capabilities of Aspen-EE, an agent-based model previously developed by Sandia National Laboratories to analyze the interdependencies between the electric power system and other critical infrastructures. CommAspen has been tested on a series of scenarios in which the communications network has been disrupted, due to congestion and outages. Analysis of the scenario results indicates that communications networks simulated by the model behave as their counterparts do in the real world. Results also show that the model could be used to analyze the economic impact of communications congestion and outages.
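To make the interdependency idea concrete, the sketch below shows a stripped-down agent-based setup in which banking agents can only complete transactions while a telecom network is up, so an outage window translates directly into lost activity. This is a toy illustration under my own assumptions, not CommAspen's actual agent design or data.

```python
import random

random.seed(1)

class CommsNetwork:
    """Telecom infrastructure with a single scheduled outage window."""
    def __init__(self, outage_start, outage_end):
        self.outage = (outage_start, outage_end)

    def available(self, t):
        start, end = self.outage
        return not (start <= t < end)

class BankAgent:
    """Completes transactions only when the telecom infrastructure is up."""
    def __init__(self, comms):
        self.comms = comms
        self.completed = 0
        self.lost = 0

    def step(self, t):
        demand = random.randint(5, 15)      # transactions requested this tick
        if self.comms.available(t):
            self.completed += demand
        else:
            self.lost += demand             # economic impact of the outage

comms = CommsNetwork(outage_start=20, outage_end=35)
banks = [BankAgent(comms) for _ in range(3)]
for t in range(60):
    for bank in banks:
        bank.step(t)

print("completed:", sum(b.completed for b in banks),
      "lost to outage:", sum(b.lost for b in banks))
```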
A sensor failure simulator for control system reliability studies
NASA Astrophysics Data System (ADS)
Melcher, K. J.; Delaat, J. C.; Merrill, W. C.; Oberle, L. G.; Sadler, G. G.; Schaefer, J. H.
A real-time Sensor Failure Simulator (SFS) was designed and assembled for the Advanced Detection, Isolation, and Accommodation (ADIA) program. Various designs were considered. The design chosen features an IBM-PC/XT. The PC is used to drive analog circuitry for simulating sensor failures in real-time. A user defined scenario describes the failure simulation for each of the five incoming sensor signals. Capabilities exist for editing, saving, and retrieving the failure scenarios. The SFS has been tested closed-loop with the Controls Interface and Monitoring (CIM) unit, the ADIA control, and a real-time F100 hybrid simulation. From a productivity viewpoint, the menu driven user interface has proven to be efficient and easy to use. From a real-time viewpoint, the software controlling the simulation loop executes at greater than 100 cycles/sec.
NASA Technical Reports Server (NTRS)
Culbert, Christopher J.; Mongrard, Olivier; Satoh, Naoki; Goodliff, Kandyce; Seaman, Calvin H.; Troutman, Patrick; Martin, Eric
2011-01-01
The International Space Exploration Coordination Group (ISECG) was established in response to The Global Exploration Strategy (GES): The Framework for Coordination developed by fourteen space agencies* and released in May 2007. This GES Framework Document recognizes that preparing for human space exploration is a stepwise process, starting with basic knowledge and culminating in a sustained human presence in deep space. ISECG has developed several optional global exploration mission scenarios enabling the phased transition from human operations in Low Earth Orbit (LEO) and utilization of the International Space Station (ISS) to human missions beyond LEO leading ultimately to human missions to cis-lunar space, the Moon, Near Earth Asteroids, Mars and its environs. Mission scenarios provide the opportunity for judging various exploration approaches in a manner consistent with agreed international goals and strategies. Each ISECG notional mission scenario reflects a series of coordinated human and robotic exploration missions over a 25-year horizon. Mission scenarios are intended to provide insights into next steps for agency investments, following on the success of the ISS. They also provide a framework for advancing the definition of Design Reference Missions (DRMs) and the concepts for capabilities contained within. Each of the human missions contained in the scenarios has been characterized by a DRM which is a top level definition of mission sequence and the capabilities needed to execute that mission. While DRMs are generally destination focused, they will comprise capabilities which are reused or evolved from capabilities used at other destinations. In this way, an evolutionary approach to developing a robust set of capabilities to sustainably explore our solar system is defined. Agencies also recognize that jointly planning for our next steps, building on the accomplishments of ISS, is important to ensuring the robustness and sustainability of any human exploration plan. Developing a shared long-term vision is important, but agencies recognize this is an evolutionary process and requires consideration of many strategic factors. Strategic factors such as the implications of an emerging commercial space industry in LEO, the opportunity provided by extending ISS lifetime to at least 2020, and the importance of defining a plan which is sustainable in light of inevitable domestic policy shifts are timely for agency consideration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greitzer, Frank L.; Podmore, Robin
2008-11-17
The focus of the present study is on improved training approaches to accelerate learning and improved methods for analyzing effectiveness of tools within a high-fidelity power grid simulated environment. A theory-based model has been developed to document and understand the mental processes that an expert power system operator uses when making critical decisions. The theoretical foundation for the method is based on the concepts of situation awareness, the methods of cognitive task analysis, and the naturalistic decision making (NDM) approach of Recognition Primed Decision Making. The method has been systematically explored and refined as part of a capability demonstration of a high-fidelity real-time power system simulator under normal and emergency conditions. To examine NDM processes, we analyzed transcripts of operator-to-operator conversations during the simulated scenario to reveal and assess NDM-based performance criteria. The results of the analysis indicate that the proposed framework can be used constructively to map or assess the Situation Awareness Level of the operators at each point in the scenario. We can also identify the mental models and mental simulations that the operators employ at different points in the scenario. This report documents the method, describes elements of the model, and provides appendices that document the simulation scenario and the associated mental models used by operators in the scenario.
NASA Astrophysics Data System (ADS)
Juarsa, M.; Giarno; Rohman, A. N.; Heru K., G. B.; Witoko, J. P.; Sony Tjahyani, D. T.
2018-02-01
The need for large-scale experimental facilities to investigate the phenomenon of natural circulation flow becomes a necessity in the development of nuclear reactor safety management. The FASSIP-01 loop has been built to determine natural circulation flow rate performance in large-scale media, with the aim of reducing errors in the results for its application in the design of new-generation reactors. Commissioning is needed to define the capability of the FASSIP-01 loop and to prescribe the experimental limitations. For this commissioning, a two-scenario experimental method was used. The first scenario is a static condition test, conducted to verify the measurement system response over 24 hours without electrical load on the heater and cooler, both with and without water inside the rectangular loop. The second scenario is a dynamic condition test aimed at understanding the flow rate; the dynamic test was conducted using a heater power of 5627 watts and a coolant flow rate in the HSS loop of 9.35 LPM. The results show that the temperature characterization in the static test leads to a recommendation that experiments should be done at night, because the environmental temperature is more stable than in the afternoon, varying only about 1°C - 3°C. In the dynamic test, the water temperature difference between the inlet and outlet in the heater area is quite large, about 7 times the temperature difference in the cooler area. The calculated natural circulation flow rate is much larger, by about a factor of 300, than the measured flow rate, with different flow rate profiles.
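As a sanity check on the dynamic test numbers, the natural circulation mass flow implied by a steady-state energy balance over the heater is m_dot = Q / (cp * dT). The sketch below evaluates this for the quoted 5627 W heater power; the inlet-outlet temperature rises are hypothetical values, since the abstract reports only the ratio between heater-side and cooler-side differences.

```python
CP_WATER = 4180.0            # J/(kg K), approximate for the test temperature range

def natural_circulation_flow(heater_power_w: float, delta_t_k: float) -> float:
    """Steady-state mass flow implied by an energy balance over the heater:
    m_dot = Q / (cp * dT)."""
    return heater_power_w / (CP_WATER * delta_t_k)

Q = 5627.0                    # heater power used in the dynamic test (W)
for dT in (3.5, 7.0, 14.0):   # hypothetical inlet-outlet temperature rises (K)
    m_dot = natural_circulation_flow(Q, dT)
    print(f"dT = {dT:4.1f} K -> m_dot = {m_dot:.3f} kg/s "
          f"(~{m_dot * 60:.1f} L/min for water)")
```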
Lunar base - A stepping stone to Mars
NASA Technical Reports Server (NTRS)
Duke, M. B.; Mendell, W. W.; Roberts, B. B.
1985-01-01
Basic elements of technology and programmatic development are identified that appear relevant to the Case for Mars, starting from a base on the moon. The moon is a logical stepping stone toward human exploration of Mars because a lunar base can provide the first test of human ability to use the resources of another planetary body to provide basic materials for life support. A lunar base can provide the first long-term test of human capability to work and live in a reduced (but not zero) gravity field. A lunar base requires creation of the elements of a space transportation system that will be necessary to deliver large payloads to Mars and the space operations capability and experience necessary to carry out a Mars habitation program efficiently and with high reliability. A lunar base is feasible for the first decade of the 21st Century. Scenarios have been studied that provide advanced capability by 2015 within budget levels that are less than historical U.S. space expenditures (Apollo). Early return on the investment in terms of knowledge, practical experience and lunar products are important in gaining momentum for an expanded human exploration of the solar system and the eventual colonization of Mars.
Wu, Qunjian; Yan, Bin; Zeng, Ying; Zhang, Chi; Tong, Li
2018-05-03
The electroencephalogram (EEG) signal represents a subject's specific brain activity patterns and is considered as an ideal biometric given its superior invisibility, non-clonality, and non-coercion. In order to enhance its applicability in identity authentication, a novel EEG-based identity authentication method is proposed based on self- or non-self-face rapid serial visual presentation. In contrast to previous studies that extracted EEG features from rest state or motor imagery, the designed paradigm could obtain a distinct and stable biometric trait with a lower time cost. Channel selection was applied to select specific channels for each user to enhance system portability and improve discriminability between users and imposters. Two different imposter scenarios were designed to test system security, which demonstrate the capability of anti-deception. Fifteen users and thirty imposters participated in the experiment. The mean authentication accuracy values for the two scenarios were 91.31 and 91.61%, with 6 s time cost, which illustrated the precision and real-time capability of the system. Furthermore, in order to estimate the repeatability and stability of our paradigm, another data acquisition session is conducted for each user. Using the classification models generated from the previous sessions, a mean false rejected rate of 7.27% has been achieved, which demonstrates the robustness of our paradigm. Experimental results reveal that the proposed paradigm and methods are effective for EEG-based identity authentication.
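The reported accuracy and false rejection figures reduce to simple counts over accepted and rejected attempts. The sketch below shows one way such metrics could be computed from per-attempt classifier decisions; the decision arrays are hypothetical placeholders, not the study's recordings.

```python
import numpy as np

def false_rejection_rate(genuine_decisions):
    """Fraction of genuine-user attempts that the system rejected."""
    d = np.asarray(genuine_decisions, dtype=bool)
    return 1.0 - d.mean()

def authentication_accuracy(genuine_decisions, impostor_decisions):
    """Overall accuracy: genuine attempts accepted plus impostor attempts rejected."""
    genuine = np.asarray(genuine_decisions, dtype=bool)
    impostor = np.asarray(impostor_decisions, dtype=bool)
    correct = np.sum(genuine) + np.sum(~impostor)
    return correct / (len(genuine) + len(impostor))

# Hypothetical per-attempt outcomes (True = accepted by the classifier).
genuine = np.array([True] * 51 + [False] * 4)      # 55 genuine attempts
impostor = np.array([False] * 108 + [True] * 2)    # 110 impostor attempts

print(f"FRR = {100 * false_rejection_rate(genuine):.2f}%")
print(f"accuracy = {100 * authentication_accuracy(genuine, impostor):.2f}%")
```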
Gene panel testing for inherited cancer risk.
Hall, Michael J; Forman, Andrea D; Pilarski, Robert; Wiesner, Georgia; Giri, Veda N
2014-09-01
Next-generation sequencing technologies have ushered in the capability to assess multiple genes in parallel for genetic alterations that may contribute to inherited risk for cancers in families. Thus, gene panel testing is now an option in the setting of genetic counseling and testing for cancer risk. This article describes the many gene panel testing options clinically available to assess inherited cancer susceptibility, the potential advantages and challenges associated with various types of panels, clinical scenarios in which gene panels may be particularly useful in cancer risk assessment, and testing and counseling considerations. Given the potential issues for patients and their families, gene panel testing for inherited cancer risk is recommended to be offered in conjunction or consultation with an experienced cancer genetic specialist, such as a certified genetic counselor or geneticist, as an integral part of the testing process. Copyright © 2014 by the National Comprehensive Cancer Network.
Tokunaga, Hironobu; Ando, Hirotaka; Obika, Mikako; Miyoshi, Tomoko; Tokuda, Yasuharu; Bautista, Miho; Kataoka, Hitomi; Terasawa, Hidekazu
2014-01-01
Objectives We report the preliminary development of a unique Web-based instrument for assessing and teaching knowledge and developing clinical thinking called the “Sequential Questions and Answers” (SQA) test. Included in this feasibility report are physicians’ answers to the Sequential Questions and Answers pre- and posttests and their brief questionnaire replies. Methods The authors refined the SQA test case scenario for content, ease of modifications of case scenarios, test uploading and answer retrieval. Eleven geographically distant physicians evaluated the SQA test, taking the pretest and posttest within two weeks. These physicians completed a brief questionnaire about the SQA test. Results Eleven physicians completed the SQA pre- and posttest; all answers were downloaded for analysis. They reported the ease of website login and navigating within the test module together with many helpful suggestions. Their average posttest score gain was 53% (p=0.012). Conclusions We report the successful launch of a unique Web-based instrument referred to as the Sequential Questions and Answers test. This distinctive test combines teaching organization of the clinical narrative into an assessment tool that promotes acquiring medical knowledge and clinical thinking. We successfully demonstrated the feasibility of geographically distant physicians to access the SQA instrument. The physicians’ helpful suggestions will be added to future SQA test versions. Medical schools might explore the integration of this multi-language-capable SQA assessment and teaching instrument into their undergraduate medical curriculum. PMID:25341203
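The reported 53% average posttest gain with p = 0.012 suggests a paired comparison of pre- and posttest scores. The sketch below shows how such a paired t-test could be run with SciPy; the eleven score pairs are hypothetical, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical pre- and posttest percentage scores for 11 physicians;
# the actual SQA study data are not reproduced here.
pre  = np.array([20, 25, 30, 15, 40, 35, 28, 22, 18, 33, 26], dtype=float)
post = np.array([75, 80, 78, 70, 95, 88, 82, 73, 68, 90, 79], dtype=float)

gain = post.mean() - pre.mean()
t_stat, p_value = stats.ttest_rel(post, pre)    # paired (repeated-measures) t-test
print(f"mean gain = {gain:.1f} percentage points, t = {t_stat:.2f}, p = {p_value:.4f}")
```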
NASA Astrophysics Data System (ADS)
Trigo, Guilherme F.; Maass, Bolko; Krüger, Hans; Theil, Stephan
2018-01-01
Accurate autonomous navigation capabilities are essential for future lunar robotic landing missions with a pin-point landing requirement, since, in the absence of direct line of sight to ground control during critical approach and landing phases, or when facing long signal delays, this capability is needed to establish a guidance solution that reaches the landing site reliably. This paper focuses on the processing and evaluation of data collected from flight tests that consisted of scaled descent scenarios where the unmanned helicopter of approximately 85 kg approached a landing site from altitudes of 50 m down to 1 m for a downrange distance of 200 m. Printed crater targets were distributed along the ground track and their detection provided earth-fixed measurements. The Crater Navigation (CNav) algorithm used to detect and match the crater targets is an unmodified method used for real lunar imagery. We analyze the absolute position and attitude solutions of CNav obtained and recorded during these flight tests, and investigate the attainable quality of vehicle pose estimation using both CNav and measurements from a tactical-grade inertial measurement unit. The navigation filter proposed for this end corrects and calibrates the high-rate inertial propagation with the less frequent crater navigation fixes through a closed-loop, loosely coupled hybrid setup. Finally, the attainable accuracy of the fused solution is evaluated by comparison with the on-board ground-truth solution of a dual-antenna high-grade GNSS receiver. It is shown that the CNav is an enabler for building autonomous navigation systems with high quality and suitability for exploration mission scenarios.
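The loosely coupled structure described here (high-rate inertial propagation corrected by infrequent absolute fixes) can be illustrated with a minimal one-dimensional Kalman filter. The sketch below is schematic only: the constant-velocity state model, noise levels, fix interval, and accelerometer value are assumptions, not the paper's filter design.

```python
import numpy as np

# Minimal 1-D constant-velocity filter: state x = [position, velocity].
dt = 0.01                               # 100 Hz inertial propagation
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
H = np.array([[1.0, 0.0]])              # crater fix observes position only
Q = np.diag([1e-4, 1e-3])               # process noise (tuning assumption)
R = np.array([[4.0]])                   # 2 m (1-sigma) fix noise, assumed

x = np.zeros(2)
P = np.eye(2)

def propagate(x, P, accel_meas):
    x = F @ x + B * accel_meas
    P = F @ P @ F.T + Q
    return x, P

def update_with_fix(x, P, pos_fix):
    y = pos_fix - H @ x                     # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

for step in range(1000):                    # 10 s of simulated flight
    x, P = propagate(x, P, accel_meas=0.2)  # placeholder accelerometer value
    if step % 200 == 199:                   # crater fix every 2 s
        true_pos = 0.5 * 0.2 * ((step + 1) * dt) ** 2
        x, P = update_with_fix(x, P, pos_fix=np.array([true_pos]))

print("estimated position/velocity:", x)
print("covariance diagonal:", np.diag(P))
```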
NASA Technical Reports Server (NTRS)
Aquilina, Rudolph A.
2015-01-01
The SMART-NAS Testbed for Safe Trajectory Based Operations Project will deliver an evaluation capability, critical to the ATM community, allowing full NextGen and beyond-NextGen concepts to be assessed and developed. To meet this objective a strong focus will be placed on concept integration and validation to enable a gate-to-gate trajectory-based system capability that satisfies a full vision for NextGen. The SMART-NAS for Safe TBO Project consists of six sub-projects. Three of the sub-projects are focused on exploring and developing technologies, concepts and models for evolving and transforming air traffic management operations in the ATM+2 time horizon, while the remaining three sub-projects are focused on developing the tools and capabilities needed for testing these advanced concepts. Function Allocation, Networked Air Traffic Management and Trajectory Based Operations are developing concepts and models. SMART-NAS Test-bed, System Assurance Technologies and Real-time Safety Modeling are developing the tools and capabilities to test these concepts. Simulation and modeling capabilities will include the ability to assess multiple operational scenarios of the national airspace system, accept data feeds, allowing shadowing of actual operations in either real-time, fast-time and/or hybrid modes of operations in distributed environments, and enable integrated examinations of concepts, algorithms, technologies, and NAS architectures. An important focus within this project is to enable the development of a real-time, system-wide safety assurance system. The basis of such a system is a continuum of information acquisition, analysis, and assessment that enables awareness and corrective action to detect and mitigate potential threats to continuous system-wide safety at all levels. This process, which currently can only be done post operations, will be driven towards "real-time" assessments in the 2035 time frame.
NASA Technical Reports Server (NTRS)
Bibb, Karen L.; Prabhu, Ramadas K.
2004-01-01
In support of the Columbia Accident Investigation, inviscid computations of the aerodynamic characteristics for various Shuttle Orbiter damage scenarios were performed using the FELISA unstructured CFD solver. Computed delta aerodynamics were compared with the reconstructed delta aerodynamics in order to postulate a progression of damage through the flight trajectory. By performing computations at hypervelocity flight and CF4 tunnel conditions, a bridge was provided between wind tunnel testing in Langley's 20-Inch CF4 facility and the flight environment experienced by Columbia during re-entry. The rapid modeling capability of the unstructured methodology allowed the computational effort to keep pace with the wind tunnel and, at times, guide the wind tunnel efforts. These computations provided a detailed view of the flowfield characteristics and the contribution of orbiter components (such as the vertical tail and wing) to aerodynamic forces and moments that were unavailable from wind tunnel testing. The damage scenarios are grouped into three categories. Initially, single and multiple missing full RCC panels were analyzed to determine the effect of damage location and magnitude on the aerodynamics. Next is a series of cases with progressive damage, increasing in severity, in the region of RCC panel 9. The final group is a set of wing leading edge and windward surface deformations that model possible structural deformation of the wing skin due to internal heating of the wing structure. By matching the aerodynamics from selected damage scenarios to the reconstructed flight aerodynamics, a progression of damage that is consistent with the flight data, debris forensics, and wind tunnel data is postulated.
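Matching damage scenarios to flight data amounts to comparing computed increments (damaged minus baseline aerodynamic coefficients) against the reconstructed deltas. The sketch below shows that comparison pattern with invented coefficient values; the scenario names and numbers are illustrative only and are not results from the investigation.

```python
import numpy as np

# Hypothetical increments (damaged minus baseline) in two moment coefficients
# for three candidate damage scenarios, plus the value reconstructed from
# flight data at one trajectory point. All numbers are invented.
candidates = {
    "missing RCC panel 6":      np.array([-0.0006, 0.0003]),
    "missing RCC panels 8-9":   np.array([-0.0014, 0.0008]),
    "leading-edge deformation": np.array([-0.0009, 0.0004]),
}
flight_delta = np.array([-0.0013, 0.0007])

def rms_error(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

best = min(candidates, key=lambda k: rms_error(candidates[k], flight_delta))
for name, delta in candidates.items():
    print(f"{name:26s} RMS error vs flight: {rms_error(delta, flight_delta):.2e}")
print("closest match:", best)
```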
Hagerman, Amy D; Ward, Michael P; Anderson, David P; Looney, J Chris; McCarl, Bruce A
2013-07-01
In this study our aim was to value the benefits of rapid, effective trace-back capability - based on a livestock identification system - in the event of a foot and mouth disease (FMD) outbreak. We simulated an FMD outbreak in the Texas High Plains, an area of high livestock concentration, beginning in a large feedlot. Disease spread was simulated under different time dependent animal tracing scenarios. In the specific scenario modeled (incursion of FMD within a large feedlot, detection within 14 days and 90% effective tracing), simulation suggested that control costs of the outbreak significantly increase if tracing does not occur until day 10 as compared to the baseline of tracing on day 2. In addition, control costs are significantly increased if effectiveness were to drop to 30% as compared to the baseline of 90%. Results suggest potential benefits from rapid effective tracing in terms of reducing government control costs; however, a variety of other scenarios need to be explored before determining in which situations rapid effective trace-back capability is beneficial. Copyright © 2012 Elsevier B.V. All rights reserved.
Multi-Scenario Use Case based Demonstration of Buildings Cybersecurity Framework Webtool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gourisetti, Sri Nikhil G.; Mylrea, Michael E.; Gervais, Easton L.
The purpose of this paper is to demonstrate the cybersecurity and software capabilities of the Buildings Cybersecurity Framework (BCF) webtool. The webtool is designed based on the BCF document and existing NIST standards. Its capabilities and features are depicted through a building use case with four different investment scenarios geared towards improving the cybersecurity posture of the building. The BCF webtool also facilitates implementation of the goals outlined in the Presidential Executive Order (EO) on Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure (May 2017). In realization of the EO goals, BCF includes five core elements: Identify, Protect, Detect, Respond, and Recover, to help determine various policy and process level vulnerabilities and provide mitigation strategies. With the BCF webtool, an organization can perform a cybersecurity self-assessment; determine the current cybersecurity posture; define investment based goals to achieve a target state; connect the cybersecurity posture with business processes, functions, and continuity; and finally, develop plans to answer critical organizational cybersecurity questions. In this paper, the webtool and its core capabilities are depicted by performing an extensive comparative assessment over four different scenarios.
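A self-assessment of the kind described can be thought of as aggregating answers into a per-element maturity score across the five core elements. The sketch below is a generic illustration of that idea; the 0-4 scale, the yes/partial/no credit values, and the sample answers are my assumptions, not the BCF webtool's actual scoring scheme.

```python
CORE_ELEMENTS = ["Identify", "Protect", "Detect", "Respond", "Recover"]

def posture_score(answers):
    """Average per-element maturity (0-4 scale) from yes/partial/no answers.
    The scoring scale and credit values are illustrative, not the BCF webtool's own."""
    credit = {"yes": 1.0, "partial": 0.5, "no": 0.0}
    scores = {}
    for element in CORE_ELEMENTS:
        responses = answers[element]
        scores[element] = 4.0 * sum(credit[r] for r in responses) / len(responses)
    return scores

current = posture_score({
    "Identify": ["yes", "partial", "no"],
    "Protect":  ["partial", "partial", "yes"],
    "Detect":   ["no", "no", "partial"],
    "Respond":  ["yes", "no", "no"],
    "Recover":  ["partial", "no", "no"],
})
for element, score in current.items():
    print(f"{element:8s} {score:.1f} / 4.0")
```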
Highly immersive virtual reality laparoscopy simulation: development and future aspects.
Huber, Tobias; Wunderling, Tom; Paschold, Markus; Lang, Hauke; Kneist, Werner; Hansen, Christian
2018-02-01
Virtual reality (VR) applications with head-mounted displays (HMDs) have had an impact on information and multimedia technologies. The current work aimed to describe the process of developing a highly immersive VR simulation for laparoscopic surgery. We combined a VR laparoscopy simulator (LapSim) and a VR-HMD to create a user-friendly VR simulation scenario. Continuous clinical feedback was an essential aspect of the development process. We created an artificial VR (AVR) scenario by integrating the simulator video output with VR game components of figures and equipment in an operating room. We also created a highly immersive VR surrounding (IVR) by integrating the simulator video output with a 360° video of a standard laparoscopy scenario in the department's operating room. Clinical feedback led to optimization of the visualization, synchronization, and resolution of the virtual operating rooms (in both the IVR and the AVR). Preliminary testing results revealed that individuals experienced a high degree of exhilaration and presence, with rare events of motion sickness. The technical performance showed no significant difference compared to that achieved with the standard LapSim. Our results provided a proof of concept for the technical feasibility of a custom highly immersive VR-HMD setup. Future technical research is needed to improve the visualization, immersion, and capability of interacting within the virtual scenario.
Wealth distribution across communities of adaptive financial agents
NASA Astrophysics Data System (ADS)
DeLellis, Pietro; Garofalo, Franco; Lo Iudice, Francesco; Napoletano, Elena
2015-08-01
This paper studies the trading volumes and wealth distribution of a novel agent-based model of an artificial financial market. In this model, heterogeneous agents, behaving according to the Von Neumann and Morgenstern utility theory, may mutually interact. A Tobin-like tax (TT) on successful investments and a flat tax are compared to assess the effects on the agents’ wealth distribution. We carry out extensive numerical simulations in two alternative scenarios: (i) a reference scenario, where the agents keep their utility function fixed, and (ii) a focal scenario, where the agents are adaptive and self-organize in communities, emulating their neighbours by updating their own utility function. Specifically, the interactions among the agents are modelled through a directed scale-free network to account for the presence of community leaders, and the herding-like effect is tested against the reference scenario. We observe that our model is capable of replicating the benefits and drawbacks of the two taxation systems and that the interactions among the agents strongly affect the wealth distribution across the communities. Remarkably, the communities benefit from the presence of leaders with successful trading strategies, and are more likely to increase their average wealth. Moreover, this emulation mechanism mitigates the decrease in trading volumes, which is a typical drawback of TTs.
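The sketch below is a much-simplified, hypothetical stand-in for the comparison described: toy agents accumulate wealth under a Tobin-like tax on successful trades versus a flat wealth tax, and a Gini coefficient summarizes the resulting distribution. The rates, trading rule, and agent count are invented; the paper's utility-driven, network-coupled agents are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(tax, n_agents=500, n_steps=2000, tobin_rate=0.002, flat_rate=5e-5):
    """Toy wealth dynamics under a Tobin-like tax on gains vs a flat wealth tax."""
    wealth = np.ones(n_agents)
    for _ in range(n_steps):
        trades = rng.random(n_agents) < 0.3          # who trades this step
        returns = rng.normal(0.0, 0.01, n_agents)    # idiosyncratic trade outcome
        gains = wealth * returns * trades
        if tax == "tobin":
            gains = np.where(gains > 0, gains * (1 - tobin_rate), gains)
        wealth += gains
        if tax == "flat":
            wealth *= (1 - flat_rate)
        wealth = np.maximum(wealth, 1e-6)
    return wealth

for tax in ("tobin", "flat"):
    w = simulate(tax)
    gini = np.abs(w[:, None] - w[None, :]).mean() / (2 * w.mean())
    print(tax, "mean wealth:", round(w.mean(), 3), "Gini:", round(gini, 3))
```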
NASA Astrophysics Data System (ADS)
Du, E.; Cai, X.; Minsker, B. S.
2014-12-01
Agriculture comprises about 80 percent of the total water consumption in the US. Under conditions of water shortage and fully committed water rights, market-based water allocations could be promising instruments for agricultural water redistribution from marginally profitable areas to more profitable ones. Previous studies on water markets have mainly focused on theoretical or statistical analysis; however, how water users' heterogeneous physical attributes and decision rules about water use and water-rights trading affect water market efficiency has been less addressed. In this study, we developed an agent-based model to evaluate the benefits of an agricultural water market in the Guadalupe River Basin during drought events. Agricultural agents with different attributes (i.e., soil type for crops, annual water diversion permit and precipitation) are defined to simulate the dynamic feedback between water availability, irrigation demand and water trading activity. Diversified crop irrigation rules and water bidding rules are tested in terms of crop yield, agricultural profit, and water-use efficiency. The model was coupled with a real-time hydrologic model and run under different water scarcity scenarios. Preliminary results indicate that an agricultural water market is capable of increasing crop yield, agricultural profit, and water-use efficiency. This capability is more significant under moderate drought scenarios than in mild and severe drought scenarios. The water market mechanism also increases agricultural resilience to climate uncertainty by reducing crop yield variance in drought events. The challenges of implementing an agricultural water market under climate uncertainty are also discussed.
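A minimal sketch of the market mechanism at the core of such a model is given below: agents with surplus water sell to agents with unmet demand and higher marginal profit, at a midpoint price. The agent names, attributes, and prices are invented for illustration, and the hydrologic coupling and bidding rules of the actual model are omitted.

```python
# Toy bilateral water market: move water from low- to high-marginal-profit agents.
def clearing_trades(agents):
    trades = []
    sellers = sorted([a for a in agents if a["water"] > a["demand"]],
                     key=lambda a: a["marginal_profit"])
    buyers = sorted([a for a in agents if a["water"] < a["demand"]],
                    key=lambda a: -a["marginal_profit"])
    for b in buyers:
        for s in sellers:
            if s["marginal_profit"] >= b["marginal_profit"]:
                continue
            qty = min(b["demand"] - b["water"], s["water"] - s["demand"])
            if qty <= 0:
                continue
            price = 0.5 * (s["marginal_profit"] + b["marginal_profit"])
            s["water"] -= qty; b["water"] += qty
            trades.append((s["name"], b["name"], qty, round(price, 2)))
    return trades

agents = [
    {"name": "corn_A",    "water": 120.0, "demand": 80.0,  "marginal_profit": 15.0},
    {"name": "cotton_B",  "water": 60.0,  "demand": 100.0, "marginal_profit": 40.0},
    {"name": "sorghum_C", "water": 90.0,  "demand": 70.0,  "marginal_profit": 10.0},
]
print(clearing_trades(agents))
```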
Prototyping and Simulation of Robot Group Intelligence using Kohonen Networks.
Wang, Zhijun; Mirdamadi, Reza; Wang, Qing
2016-01-01
Intelligent agents such as robots can form ad hoc networks and replace human beings in many dangerous scenarios, such as a complicated disaster relief site. This project prototypes and builds a computer simulator to simulate robot kinetics, unsupervised learning using Kohonen networks, as well as group intelligence when an ad hoc network is formed. Each robot is modeled using an object with a simple set of attributes and methods that define its internal states and possible actions it may take under certain circumstances. As a result, simple, reliable, and affordable robots can be deployed to form the network. The simulator simulates a group of robots as an unsupervised learning unit and tests the learning results under scenarios with different complexities. The simulation results show that a group of robots could demonstrate highly collaborative behavior on a complex terrain. This study could potentially provide a software simulation platform for testing the individual and group capabilities of robots before they are designed and manufactured. Therefore, results of the project have the potential to reduce the cost and improve the efficiency of robot design and building.
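The Kohonen (self-organizing map) update that such a simulator could use for unsupervised learning is sketched below with NumPy; the map size, learning-rate and neighbourhood schedules, and the synthetic "terrain feature" samples are illustrative assumptions rather than the project's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(1)
grid = rng.random((8, 8, 2))                      # 8x8 map of 2-D weight vectors
coords = np.stack(np.meshgrid(np.arange(8), np.arange(8), indexing="ij"), axis=-1)

def train_step(x, t, n_steps):
    lr = 0.5 * (1 - t / n_steps)                  # decaying learning rate
    sigma = 3.0 * (1 - t / n_steps) + 0.5         # decaying neighbourhood radius
    bmu = np.unravel_index(np.argmin(((grid - x) ** 2).sum(-1)), (8, 8))
    dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
    h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]   # neighbourhood function
    grid[...] += lr * h * (x - grid)              # pull neighbourhood toward the sample

# Train on synthetic "terrain feature" samples, e.g. (slope, obstacle density) pairs.
samples = rng.random((2000, 2))
for t, x in enumerate(samples):
    train_step(x, t, len(samples))
print("trained corner weights:", np.round(grid[0, 0], 3), np.round(grid[-1, -1], 3))
```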
Cross-modal individual recognition in wild African lions.
Gilfillan, Geoffrey; Vitale, Jessica; McNutt, John Weldon; McComb, Karen
2016-08-01
Individual recognition is considered to have been fundamental in the evolution of complex social systems and is thought to be a widespread ability throughout the animal kingdom. Although robust evidence for individual recognition remains limited, recent experimental paradigms that examine cross-modal processing have demonstrated individual recognition in a range of captive non-human animals. It is now highly relevant to test whether cross-modal individual recognition exists within wild populations and thus examine how it is employed during natural social interactions. We address this question by testing audio-visual cross-modal individual recognition in wild African lions (Panthera leo) using an expectancy-violation paradigm. When presented with a scenario where the playback of a loud-call (roaring) broadcast from behind a visual block is incongruent with the conspecific previously seen there, subjects responded more strongly than during the congruent scenario where the call and individual matched. These findings suggest that lions are capable of audio-visual cross-modal individual recognition and provide a useful method for studying this ability in wild populations. © 2016 The Author(s).
Implementing a self-structuring data learning algorithm
NASA Astrophysics Data System (ADS)
Graham, James; Carson, Daniel; Ternovskiy, Igor
2016-05-01
In this paper, we elaborate on what we did to implement our self-structuring data learning algorithm. To recap, we are working to develop a data learning algorithm that will eventually be capable of goal-driven pattern learning and extrapolation of more complex patterns from less complex ones. At this point we have developed a conceptual framework for the algorithm, but have yet to discuss our actual implementation and the considerations and shortcuts we needed to take to create said implementation. We will elaborate on our initial setup of the algorithm and the scenarios we used to test our early-stage algorithm. While we want this to be a general algorithm, it is necessary to start with a simple scenario or two to provide a viable development and testing environment. To that end, our discussion will be geared toward what we include in our initial implementation and why, as well as what concerns we may have. In the future, we expect to be able to apply our algorithm to a more general approach, but to do so within a reasonable time, we needed to pick a place to start.
Towards an autonomous telescope system: the Test-Bed Telescope project
NASA Astrophysics Data System (ADS)
Racero, E.; Ocaña, F.; Ponz, D.; the TBT Consortium
2015-05-01
In the context of the Space Situational Awareness (SSA) programme of ESA, it is foreseen to deploy several large robotic telescopes in remote locations to provide surveillance and tracking services for man-made as well as natural near-Earth objects (NEOs). The present project, termed Telescope Test Bed (TBT), is being developed under ESA's General Studies and Technology Programme and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario, consisting of two telescopes located in Spain and Australia, to collect representative test data for precursor NEO services. It is foreseen that this test-bed environment will be used to validate future prototype software systems as well as to evaluate remote monitoring and control techniques. The test-bed system will be capable of delivering astrometric and photometric data of the observed objects in near real-time. This contribution describes the current status of the project.
ASC FY17 Implementation Plan, Rev. 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamilton, P. G.
The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions.
NASA Astrophysics Data System (ADS)
Taylor, Ron; Downey, Jack; Wood, Jeffrey; Lin, Yen-Hung; Bugata, Bharathi; Fan, Dongsheng; Hess, Carl; Wylie, Mark
2016-10-01
In this work, the SEMI specification for reticle and pod management (E109) with internal reticle library support has been integrated for the first time on KLA-Tencor's Teron™ and TeraScan™ reticle inspection tools. Scheduling of reticle jobs by the Manufacturing Execution System and simultaneous pod transfers by the Automated Material Handling System have also been integrated and tested. GLOBALFOUNDRIES worked collaboratively with KLA-Tencor to successfully implement these capabilities. Both library and non-library scenarios have been demonstrated for comparison in a real production environment, resulting in a productivity increase of approximately 29% by making use of the library. Reticle re-qualification test cases were used for the comparison in this work.
NASA Astrophysics Data System (ADS)
Aghion, S.; Ariga, A.; Bollani, M.; Ereditato, A.; Ferragut, R.; Giammarchi, M.; Lodari, M.; Pistillo, C.; Sala, S.; Scampoli, P.; Vladymyrov, M.
2018-05-01
Nuclear emulsions are capable of very high position resolution in the detection of ionizing particles. This feature can be exploited to directly resolve the micrometric-scale fringe pattern produced by a matter-wave interferometer for low energy positrons (in the 10–20 keV range). We have tested the performance of emulsion films in this specific scenario. Exploiting silicon nitride diffraction gratings as absorption masks, we produced periodic patterns with features comparable to the expected interferometer signal. Test samples with periodicities of 6, 7 and 20 μm were exposed to the positron beam, and the patterns were clearly reconstructed. Our results support the feasibility of matter-wave interferometry experiments with positrons.
Radial Basis Function Neural Network Application to Power System Restoration Studies
Sadeghkhani, Iman; Ketabi, Abbas; Feuillet, Rene
2012-01-01
One of the most important issues in power system restoration is overvoltages caused by transformer switching. These overvoltages might damage some equipment and delay power system restoration. This paper presents a radial basis function neural network (RBFNN) to study transformer switching overvoltages. To achieve good generalization capability for the developed RBFNN, equivalent parameters of the network are added to the RBFNN inputs. The developed RBFNN is trained with the worst-case scenario of switching angle and remanent flux and tested for typical cases. The simulated results for a portion of the 39-bus New England test system show that the proposed technique can estimate the peak values and duration of switching overvoltages with good accuracy. PMID:22792093
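For readers unfamiliar with the structure of an RBFNN, the sketch below fits Gaussian basis functions to synthetic data by least squares. The three inputs and the target function are invented stand-ins for the paper's switching-angle, remanent-flux, and network-equivalent inputs and its simulation-derived overvoltage targets.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_features(X, centers, width):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

X = rng.random((200, 3))       # stand-ins for switching angle, remanent flux, equivalent impedance
y = 1.5 + 0.8 * np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] * X[:, 2]   # stand-in for peak overvoltage (p.u.)

centers = X[rng.choice(len(X), 25, replace=False)]               # RBF centers picked from the data
Phi = rbf_features(X, centers, width=0.3)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)                      # output-layer weights

X_test = rng.random((5, 3))
y_hat = rbf_features(X_test, centers, width=0.3) @ w
print(np.round(y_hat, 3))
```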
Medical Data Architecture Project Status
NASA Technical Reports Server (NTRS)
Krihak, M.; Middour, C.; Lindsey, A.; Marker, N.; Wolfe, S.; Winther, S.; Ronzano, K.; Bolles, D.; Toscano, W.; Shaw, T.
2017-01-01
The Medical Data Architecture (MDA) project supports the Exploration Medical Capability (ExMC) element's aim to minimize or reduce the risk of adverse health outcomes and decrements in performance due to in-flight medical capabilities on human exploration missions. To mitigate this risk, the ExMC MDA project addresses the technical limitations identified in ExMC Gap Med 07: We do not have the capability to comprehensively process medically-relevant information to support medical operations during exploration missions. This gap identifies that the current International Space Station (ISS) medical data management includes a combination of data collection and distribution methods that are minimally integrated with on-board medical devices and systems. Furthermore, there is a variety of data sources and methods of data collection. For an exploration mission, the seamless management of such data will enable a crew that is more autonomous than under the current ISS paradigm. The MDA will develop capabilities that support automated data collection and will address the functionality needed, and the challenges involved, in executing a self-contained medical system that approaches crew health care delivery without assistance from ground support. To attain this goal, the first year of the MDA project focused on reducing technical risk, developing documentation and instituting iterative development processes that established the basis for the first version of MDA software (or Test Bed 1). Test Bed 1 is based on a nominal operations scenario authored by the ExMC Element Scientist. This narrative was decomposed into a Concept of Operations that formed the basis for Test Bed 1 requirements. These requirements were successfully vetted through the MDA Test Bed 1 System Requirements Review, which permitted the MDA project to begin software code development and component integration. This paper highlights the MDA objectives, development processes, and accomplishments, and identifies the fiscal year 2017 milestones and deliverables in the upcoming year.
Integration of Multiple Data Sources to Simulate the Dynamics of Land Systems
Deng, Xiangzheng; Su, Hongbo; Zhan, Jinyan
2008-01-01
In this paper we present and develop a new model, which we have called Dynamics of Land Systems (DLS). The DLS model is capable of integrating multiple data sources to simulate the dynamics of a land system. Three main modules are incorporated in DLS: a spatial regression module, to explore the relationship between land uses and influencing factors, a scenario analysis module of the land uses of a region during the simulation period and a spatial disaggregation module, to allocate land use changes from a regional level to disaggregated grid cells. A case study on Taips County in North China is incorporated in this paper to test the functionality of DLS. The simulation results under the baseline, economic priority and environmental scenarios help to understand the land system dynamics and project near future land-use trajectories of a region, in order to focus management decisions on land uses and land use planning. PMID:27879726
Booth, N.L.; Everman, E.J.; Kuo, I.-L.; Sprague, L.; Murphy, L.
2011-01-01
The U.S. Geological Survey National Water Quality Assessment Program has completed a number of water-quality prediction models for nitrogen and phosphorus for the conterminous United States as well as for regional areas of the nation. In addition to estimating water-quality conditions at unmonitored streams, the calibrated SPAtially Referenced Regressions On Watershed attributes (SPARROW) models can be used to produce estimates of yield, flow-weighted concentration, or load of constituents in water under various land-use condition, land-use change, or resource management scenarios. A web-based decision support infrastructure has been developed to provide access to SPARROW simulation results on stream water-quality conditions and to offer sophisticated scenario testing capabilities for research and water-quality planning via a graphical user interface with familiar controls. The SPARROW decision support system (DSS) is delivered through a web browser over an Internet connection, making it widely accessible to the public in a format that allows users to easily display water-quality conditions and to describe, test, and share modeled scenarios of future conditions. SPARROW models currently supported by the DSS are based on the modified digital versions of the 1:500,000-scale River Reach File (RF1) and 1:100,000-scale National Hydrography Dataset (medium-resolution, NHDPlus) stream networks. © 2011 American Water Resources Association. This article is a U.S. Government work and is in the public domain in the USA.
Deep Space Network Capabilities for Receiving Weak Probe Signals
NASA Technical Reports Server (NTRS)
Asmar, Sami; Johnston, Doug; Preston, Robert
2005-01-01
Planetary probes can encounter mission scenarios where communication is not favorable during critical maneuvers or emergencies, such as launch, initial acquisition, landing, trajectory corrections, and safing. Communication challenges due to sub-optimum antenna pointing or transmitted power, amplitude/frequency dynamics, etc., can prevent lock-up on the signal and extraction of telemetry. Examples include the loss of Mars Observer, the nutation of Ulysses, the Galileo antenna, the Mars Pathfinder and Mars Exploration Rovers Entry, Descent, and Landing, and the Cassini Saturn Orbit Insertion. A Deep Space Network capability to handle such cases has been used successfully to receive signals and characterize these scenarios. This paper describes the capability and highlights the cases of the critical communications for the Mars rovers and the Saturn Orbit Insertion, as well as preparations for radio tracking of the Huygens probe at (non-DSN) radio telescopes.
Comparative Assessment and Decision Support System for Strategic Military Airlift Capability
NASA Technical Reports Server (NTRS)
Salmon, John; Iwata, Curtis; Mavris, Dimitri; Weston, Neil; Fahringer, Philip
2011-01-01
The Lockheed Martin Aeronautics Company has been awarded several programs to modernize the aging C-5 military transport fleet. In order to ensure its continuation amidst budget cuts, it was important to engage the decision makers by providing an environment to analyze the benefits of the modernization program. This paper describes an interface that allows the user to change inputs such as the scenario airfields, take-off conditions, and reliability characteristics. The underlying logistics surrogate model was generated using data from a discrete-event simulation. Various visualizations, such as intercontinental flight paths illustrated in 3D, have been created to aid the user in analyzing scenarios and performing comparative assessments for various output logistics metrics. The capability to rapidly and dynamically evaluate and compare scenarios was developed, enabling real-time strategy exploration and trade-offs.
2010-06-01
"Military Scenario Definition Language (MSDL) for Nontraditional Warfare Scenarios," Paper 09S-SIW-001, Proceedings of the Spring Simulation... "Update to the M&S Community," Paper 09S-SIW-002, Proceedings of the Spring Simulation Interoperability Workshop, Simulation Interoperability... "Multiple Simulations: An Application of the Military Scenario Definition Language (MSDL)," Paper 09S-SIW-003, Proc. of the Spring Simulation
Active glass-type human augmented cognition system considering attention and intention
NASA Astrophysics Data System (ADS)
Kim, Bumhwi; Ojha, Amitash; Lee, Minho
2015-10-01
Human cognition is the result of an interaction of several complex cognitive processes with limited capabilities. Therefore, the primary objective of human cognitive augmentation is to assist and expand these limited human cognitive capabilities, independently or together. In this study, we propose a glass-type human augmented cognition system that attempts to actively assist human memory functions by providing relevant, necessary, and intended information while constantly assessing the intention of the user. To achieve this, we exploit selective attention and intention processes. Although the system can be used in various real-life scenarios, we test its performance in a person-identification scenario. To detect the intended face, the system analyses gaze points and changes in pupil size to determine the intention of the user; taken together, sustained gaze and pupil dilation indicate that the user intends to know the identity of, and information about, the person in question. The system then retrieves several clues through a speech recognition system, retrieves relevant information about the face, and finally displays it through a head-mounted display. We present the performance of several components of the system. Our results show that active, relevant assistance based on the user's intention significantly helps enhance memory functions.
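A minimal sketch of the gaze-plus-pupil intention test described above is given below. The dwell-time and dilation thresholds, sample rate, and region labels are invented placeholders; the real system's recognition and retrieval pipeline is not represented here.

```python
# Toy intention detector: long fixation on a face plus relative pupil dilation
# is taken as a proxy for "the user wants identity information".
def detect_intention(gaze_samples, pupil_samples, dwell_thresh_s=1.0,
                     dilation_thresh=0.15, sample_rate_hz=30):
    dwell_s = sum(1 for g in gaze_samples if g == "face_region") / sample_rate_hz
    baseline = sum(pupil_samples[:10]) / 10.0          # early samples as baseline
    recent = sum(pupil_samples[-10:]) / 10.0           # latest samples
    dilation = (recent - baseline) / baseline
    return dwell_s >= dwell_thresh_s and dilation >= dilation_thresh

gaze = ["face_region"] * 40 + ["background"] * 5       # ~1.3 s of fixation at 30 Hz
pupil = [3.0] * 10 + [3.2] * 25 + [3.6] * 10           # pupil diameter in mm, drifting upward
print(detect_intention(gaze, pupil))                   # -> True
```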
A Storm Surge and Inundation Model of the Back River Watershed at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Loftis, Jon Derek; Wang, Harry V.; DeYoung, Russell J.
2013-01-01
This report on a Virginia Institute of Marine Science project demonstrates that sub-grid modeling technology (now part of the Chesapeake Bay Inundation Prediction System, CIPS) can incorporate high-resolution Lidar measurements provided by NASA Langley Research Center into the sub-grid model framework to resolve detailed topographic features for use as a hydrological transport model for run-off simulations within NASA Langley and Langley Air Force Base. Rainfall over land that accumulates in the ditches and channels resolved by the model sub-grid was simulated to represent the run-off induced by heavy precipitation. Possessing capabilities for both storm surge and run-off simulations, the CIPS model was then applied to simulate real storm events, starting with Hurricane Isabel in 2003. It is shown that the model can generate highly accurate on-land inundation maps, as demonstrated by excellent comparison of the Langley tidal gauge time-series data (CAPABLE.larc.nasa.gov) and spatial patterns of real storm wrack-line measurements with the model results simulated during Hurricanes Isabel (2003), Irene (2011), and a 2009 Nor'easter. With confidence built upon the model's performance, sea level rise scenarios from the ICCP (International Climate Change Partnership) were also included in the model scenario runs to simulate future inundation cases.
NASA Astrophysics Data System (ADS)
Russo, David
2017-11-01
The main goal of this study was to test the capability of irrigation water-based and soil-based approaches to control nitrate and chloride mass fluxes and concentrations below the root zone of agricultural fields irrigated with treated waste water (TWW). Using numerical simulations of flow and transport in a relatively fine-textured, unsaturated, spatially heterogeneous flow domain, the scenarios examined include: (i) irrigating with TWW only (REF); (ii) irrigation water is substituted between TWW and desalinized water (ADW); (iii) soil includes a capillary barrier (CB) and irrigating with TWW only (CB + TWW); and (iv) combination of (ii) and a CB (CB + ADW). Considering groundwater quality protection, plausible goals are: (i) to minimize solute discharges leaving the root zone, and (ii) to maximize the probability that solute concentrations leaving the root zone will not exceed a prescribed, critical value. Results of the analyses suggest that in the case of a seasonal crop (a corn field) subject to irrigations only, with respect to the first goal, the CB + TWW and CB + ADW scenarios provide similar, excellent results, better than the ADW scenario; with respect to the second goal, however, the CB + ADW scenario gave substantially better results than the CB + TWW scenario. In the case of a multiyear, perennial crop (a citrus orchard), subject to a sequence of irrigation and rainfall periods, for both solutes, and, particularly, nitrate, with respect to the two goals, both the ADW and CB + ADW scenarios perform better than the CB + TWW scenario. As compared with the REF and CB + TWW scenarios, the ADW and CB + ADW scenarios substantially reduce nitrogen mass fluxes to the groundwater and to the atmosphere, and, essentially, did not reduce nitrogen mass fluxes to the trees. Similar, and even better, results were demonstrated for a relatively coarse-textured, spatially heterogeneous soil.
NASA Astrophysics Data System (ADS)
Nikitczuk, Jason; Weinberg, Brian; Mavroidis, Constantinos
2006-03-01
In this paper we present the design and control algorithms for novel electro-rheological fluid based torque generation elements that will be used to drive the joint of a new type of portable and controllable Active Knee Rehabilitation Orthotic Device (AKROD) for gait retraining in stroke patients. The AKROD is composed of straps and rigid components for attachment to the leg, with a central hinge mechanism where a gear system is connected. The key features of AKROD include: a compact, lightweight design with highly tunable torque capabilities through a variable damper component; full portability with on-board power, control circuitry, and sensors (encoder and torque); and real-time capabilities for closed-loop computer control for optimizing gait retraining. The variable damper component is achieved through an electro-rheological fluid (ERF) element that connects to the output of the gear system. Using the electrically controlled rheological properties of ERFs, compact brakes capable of supplying high resistive and controllable torques are developed. A preliminary prototype for AKROD v.2 has been developed and tested in our laboratory. AKROD's v.2 ERF resistive actuator was tested in laboratory experiments using our custom-made ERF Testing Apparatus (ETA). ETA provides a computer-controlled environment to test ERF brakes and actuators in various conditions and scenarios, including emulating the interaction between human muscles involved with the knee and AKROD's ERF actuators/brakes. In our preliminary results, AKROD's ERF resistive actuator was tested in closed-loop torque control experiments. A hybrid (non-linear, adaptive) Proportional-Integral (PI) torque controller was implemented to achieve this goal.
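A bare-bones discrete PI torque loop of the kind mentioned above can be sketched as follows, using a first-order lag as a stand-in for the ERF brake dynamics. The gains and plant constants are illustrative, not AKROD values, and the hybrid adaptive features of the actual controller are omitted.

```python
# Toy closed-loop PI torque control of a resistive (brake-like) actuator.
def simulate_pi(torque_ref=5.0, kp=2.0, ki=8.0, dt=0.001, t_end=0.5, tau=0.05, gain=1.0):
    torque, integ, log = 0.0, 0.0, []
    for _ in range(int(t_end / dt)):
        error = torque_ref - torque
        integ += error * dt
        command = kp * error + ki * integ               # PI control signal (e.g. ERF field command)
        command = max(0.0, command)                     # a brake can only resist, not drive
        torque += dt * (gain * command - torque) / tau  # first-order brake response
        log.append(torque)
    return log

log = simulate_pi()
print("final torque (Nm):", round(log[-1], 3))
```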
QuickStrike ASOC Battlefield Simulation: Preparing the War Fighter to Win
NASA Technical Reports Server (NTRS)
Jones, Richard L.
2010-01-01
The QuickStrike ASOC (Air Support Operations Center) Battlefield Simulation fills a crucial gap in USAF and United Kingdom Close Air Support (CAS) and airspace manager training. The system now provides six squadrons with the capability to conduct total-mission training events whenever the personnel and time are available. When the 111th ASOC returned from their first deployment to Afghanistan, they realized the training available prior to deployment was inadequate. They sought an organic training capability focused on the ASOC mission that was low cost, simple to use, adaptable, and available now. Using a commercial off-the-shelf simulation, they developed a complete training system by adapting the simulation to their training needs. Through more than two years of spiral development, incorporating lessons learned, the system has matured, and can now realistically replicate the Tactical Operations Center (TOC) in Kabul, Afghanistan, the TOC supporting the mission in Iraq, or can expand to support a major conflict scenario. The training system provides a collaborative workspace for the training audience and exercise control group via integrated software and workstations that can easily adapt to new mission requirements and TOC configurations. The system continues to mature. Based on inputs from the war fighter, new capabilities have been incorporated to add realism and simplify the scenario development process. The QuickStrike simulation can now import TBMCS Air Tasking Order air mission data and can provide air and ground tracks to a common operating picture, presented through either C2PC or JADOCS. This organic capability to practice team processes and tasks and to conduct mission rehearsals proved its value in the 111th ASOS's next deployment. The ease of scenario development and the simple-to-learn, intuitive, game-like interface enable the squadrons to develop and share scenarios incorporating lessons learned from every deployment. These war fighters have now filled the training gap and have the capability they need to train to win.
Space Station Freedom extravehicular activity systems evolution study
NASA Technical Reports Server (NTRS)
Rouen, Michael
1990-01-01
Evaluation of Space Station Freedom (SSF) support of manned exploration is in progress to identify SSF extravehicular activity (EVA) system evolution requirements and capabilities. The output from these studies will provide data to support the preliminary design process to ensure that Space Station EVA system requirements for future missions (including the transportation node) are adequately considered and reflected in the baseline design. The study considers SSF support of future missions and the EVA system baseline to determine adequacy of EVA requirements and capabilities and to identify additional requirements, capabilities, and necessary technology upgrades. The EVA demands levied by formal requirements and indicated by evolutionary mission scenarios are high for the out-years of Space Station Freedom. An EVA system designed to meet the baseline requirements can easily evolve to meet evolution demands with few exceptions. Results to date indicate that upgrades or modifications to the EVA system may be necessary to meet the full range of EVA thermal environments associated with the transportation node. Work continues to quantify the EVA capability in this regard. Evolution mission scenarios with EVA and ground unshielded nuclear propulsion engines are inconsistent with anthropomorphic EVA capabilities.
Defining a Simulation Capability Hierarchy for the Modeling of a SeaBase Enabler (SBE)
2010-09-01
…ability to maintain the sea lanes of communication. Relief efforts in crisis-stricken countries like India in 2007, Aceh, Indonesia, and Sri Lanka in… the number of entities that were built into the scenario run for each category. (Table residue: Advanced Scenario Results - Speed, Cargo Rate, Escorts, SURF)
ISECG Global Exploration Roadmap: A Stepwise Approach to Deep Space Exploration
NASA Technical Reports Server (NTRS)
Martinez, Roland; Goodliff, Kandyce; Whitley, Ryan
2013-01-01
In 2011, ISECG released the Global Exploration Roadmap (GER), advancing the "Global Exploration Strategy: The Framework for Coordination" by articulating the perspectives of participating agencies on exploration goals and objectives, mission scenarios, and coordination of exploration preparatory activities. The GER featured a stepwise development and demonstration of capabilities ultimately required for human exploration of Mars. In 2013 the GER was updated to reflect the ongoing evolution of agencies' exploration policies and plans, informed by individual agency and coordinated analysis activities that are relevant to various elements of the GER framework, as well as coordinated stakeholder engagement activities. For this release of version 2 of the GER in the mid-2013 timeframe, a modified mission scenario is presented, more firmly reflecting the importance of a stepwise evolution of critical capabilities provided by multiple partners, necessary for executing increasingly complex missions to multiple destinations and leading to human exploration of Mars. This paper will describe the updated mission scenario, the changes since the release of version 1, the mission themes incorporated into the scenario, and risk reduction for Mars missions provided by exploration at various destinations.
Group 1: Scenario design and development issues
NASA Technical Reports Server (NTRS)
Sherwin, P.
1981-01-01
All LOFT scenarios and flight segments should be designed on the basis of a detailed statement of specific objectives. These objectives must state what kind of situation is to be addressed and why. The origin, routing, and destination of a particular scenario should be dictated by the specific objectives for that scenario or leg. Other factors to be considered are the desired weather, climate, etc. The simulator visual system, as well as other capabilities and limitations, must be considered at a very early stage of scenario design. The simulator navigation area must be appropriate and must coincide with current Jeppesen charts. Much of the realism of LOFT is destroyed if the crew is unable to use current manuals and other materials.
Fontaine, Joseph J.; Jorgensen, Christopher; Stuber, Erica F.; Gruber, Lutz F.; Bishop, Andrew A.; Lusk, Jeffrey J.; Zach, Eric S.; Decker, Karie L.
2017-01-01
We know economic and social policy has implications for ecosystems at large, but the consequences for a given geographic area or specific wildlife population are more difficult to conceptualize and communicate. Species distribution models, which extrapolate species-habitat relationships across ecological scales, are capable of predicting population changes in distribution and abundance in response to management and policy, and thus, are an ideal means for facilitating proactive management within a larger policy framework. To illustrate the capabilities of species distribution modeling in scenario planning for wildlife populations, we projected an existing distribution model for ring-necked pheasants (Phasianus colchicus) onto a series of alternative future landscape scenarios for Nebraska, USA. Based on our scenarios, we qualitatively and quantitatively estimated the effects of agricultural policy decisions on pheasant populations across Nebraska, in specific management regions, and at wildlife management areas.
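A minimal sketch of this projection step, assuming an invented logistic habitat model and synthetic land-cover grids (not the published pheasant model), is shown below; the comparison simply totals predicted relative abundance under a baseline and an alternative policy landscape.

```python
import numpy as np

# Hypothetical fitted habitat-suitability coefficients (logistic link).
coef = {"intercept": -1.0, "grassland": 2.5, "cropland": 0.8, "developed": -3.0}

def predict_abundance(landscape):
    """Relative abundance per cell from land-cover fractions."""
    eta = (coef["intercept"]
           + coef["grassland"] * landscape["grassland"]
           + coef["cropland"] * landscape["cropland"]
           + coef["developed"] * landscape["developed"])
    return 1.0 / (1.0 + np.exp(-eta))

rng = np.random.default_rng(2)
base = {k: rng.random((50, 50)) * v
        for k, v in {"grassland": 0.4, "cropland": 0.5, "developed": 0.1}.items()}

# Scenario: a conservation policy converts 10% of cropland to grassland.
scenario = dict(base)
scenario["grassland"] = base["grassland"] + 0.1 * base["cropland"]
scenario["cropland"] = 0.9 * base["cropland"]

print("baseline total:", round(predict_abundance(base).sum(), 1),
      "scenario total:", round(predict_abundance(scenario).sum(), 1))
```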
NASA Technical Reports Server (NTRS)
Anderson, Steve; Horne, Gary; Meyer, Ted; Triola, Larry
2012-01-01
Data farming uses simulation modeling, high performance computing, and analysis to examine questions of interest with large possibility spaces. This methodology allows for the examination of whole landscapes of potential outcomes and provides the capability of executing enough experiments so that outliers might be captured and examined for insights. This capability may be quite informative when used to examine the plethora of "What if?" questions that result when examining potential scenarios that our forces may face in the uncertain world of the future. Many of these scenarios most certainly will be challenging, and solutions may depend on interagency and international collaboration as well as the need for inter-disciplinary scientific inquiry preceding these events. In this paper, we describe data farming and illustrate it in the context of application to questions inherent to military decision-making as we consider alternate future scenarios.
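A toy data-farming loop, with an invented scenario model and factor space, might look like the following. The point is the structure - a designed sweep plus replications so that outliers can surface - not the placeholder model itself.

```python
import itertools
import random

def scenario_model(red_strength, blue_sensors, weather, seed):
    """Placeholder stochastic scenario: returns a count of successful detections."""
    rng = random.Random(seed)
    detect_p = min(0.95, 0.2 + 0.15 * blue_sensors - 0.1 * (weather == "storm"))
    return sum(rng.random() < detect_p / red_strength for _ in range(100))

design = list(itertools.product([1.0, 1.5, 2.0],      # red_strength levels
                                [1, 2, 3, 4],          # blue_sensors levels
                                ["clear", "storm"]))   # weather levels
results = []
for point in design:
    for rep in range(30):                              # replications per design point
        results.append((*point, scenario_model(*point, seed=rep)))

print("design points:", len(design), "total runs:", len(results))
print("worst-case run:", min(results, key=lambda r: r[-1]))
```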
Risk Unbound: Threat, Catastrophe, and the End of Homeland Security
2015-09-01
…Defense (DOD) models) is now the prevalent model for developing plans. Capabilities-based within the national preparedness system is defined as… capabilities-based planning is the accounting for scenarios through organizational capability development, and the search for commonality and structure… of providing perfect security, and demonstrate the limitations of risk-based security practices. This thesis presents an argument in three parts
Dual Mission Scenarios for the Human Lunar Campaign - Performance, Cost and Risk Benefits
NASA Technical Reports Server (NTRS)
Saucillo, Rudolph J.; Reeves, David M.; Chrone, Jonathan D.; Stromgren, Chel; Reeves, John D.; North, David D.
2008-01-01
Scenarios for human lunar operations with capabilities significantly beyond Constellation Program baseline missions are potentially feasible based on the concept of dual, sequential missions utilizing a common crew and a single Ares I/CEV (Crew Exploration Vehicle). For example, scenarios possible within the scope of baseline technology planning include outpost-based sortie missions and dual sortie missions. Top level cost benefits of these dual sortie scenarios may be estimated by comparison to the Constellation Program reference two-mission-per-year lunar campaign. The primary cost benefit is the accomplishment of Mission B with a "single launch solution" since no Ares I launch is required. Cumulative risk to the crew is lowered since crew exposure to launch risks and Earth return risks are reduced versus comparable Constellation Program reference two-mission-per-year scenarios. Payload-to-the-lunar-surface capability is substantially increased in the Mission B sortie as a result of additional propellant available for Lunar Lander #2 descent. This additional propellant is a result of EDS #2 transferring a smaller stack through trans-lunar injection and using remaining propellant to perform a portion of the lunar orbit insertion (LOI) maneuver. This paper describes these dual mission concepts, including cost, risk and performance benefits per lunar sortie site, and provides an initial feasibility assessment.
78 FR 71435 - Policy Statement on the Scenario Design Framework for Stress Testing
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-29
... Statement III. Summary of Comments A. Design of Stress Test Scenarios B. Additional Variables C. Severely... policy statement and its overall organization. A. Design of Stress Test Scenarios Commenters suggested a variety of ways for the Board to alter or improve the design of stress test scenarios, including by making...
NASA Technical Reports Server (NTRS)
1979-01-01
A manned remote work station (MRWS) mission scenario, broken down into three time phases, was selected as the basis for analysis of the MRWS flight article requirements and concepts. The mission roles for the three time phases, along with supporting tradeoff and evaluation studies, were used to identify key issues requiring simulation. The MRWS is discussed in terms of its capability to perform such operations as support of Spacelab experiments, servicing and repair of satellites, and construction. Future considerations for the use of the MRWS are also given.
NASA Technical Reports Server (NTRS)
Sweet, Adam
2008-01-01
The IVHM Project in the Aviation Safety Program has funded research in electrical power system (EPS) health management. This problem domain contains both discrete and continuous behavior, and thus is directly relevant for the hybrid diagnostic tool HyDE. In FY2007 work was performed to expand the HyDE diagnosis model of the ADAPT system. The work completed resulted in a HyDE model with the capability to diagnose five times the number of ADAPT components previously tested. The expanded diagnosis model passed a corresponding set of new ADAPT fault injection scenario tests with no incorrect faults reported. The time required for the HyDE diagnostic system to isolate the fault varied widely between tests; this variance was reduced by tuning HyDE input parameters. These results and other diagnostic design trade-offs are discussed. Finally, possible future improvements for both the HyDE diagnostic model and HyDE itself are presented.
Flooding Capability for River-based Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Prescott, Steven; Ryan, Emerald
2015-10-01
This report describes the initial investigation into modeling and simulation tools for application of riverine flooding representation as part of the Risk-Informed Safety Margin Characterization (RISMC) Pathway external hazards evaluations. The report provides examples of different flooding conditions and scenarios that could impact river and watershed systems. Both 2D and 3D modeling approaches are described.
Automated Generation and Assessment of Autonomous Systems Test Cases
NASA Technical Reports Server (NTRS)
Barltrop, Kevin J.; Friberg, Kenneth H.; Horvath, Gregory A.
2008-01-01
This slide presentation reviews issues concerning verification and validation testing of autonomous spacecraft, which routinely culminates in the exploration of anomalous or faulted mission-like scenarios, using the work involved in the Dawn mission's tests as examples. Prioritizing which scenarios to develop usually comes down to focusing on the most vulnerable areas and ensuring the best return on investment of test time. Rules-of-thumb strategies often come into play, such as injecting applicable anomalies prior to, during, and after system state changes, or creating cases that ensure good safety-net algorithm coverage. Although experience and judgment in test selection can lead to high levels of confidence about the majority of a system's autonomy, it is likely that important test cases are overlooked. One method to fill in potential test coverage gaps is to automatically generate and execute test cases using algorithms that ensure desirable properties about the coverage; for example, generating cases for all possible fault monitors and across all state-change boundaries. Of course, the scope of coverage is determined by the test environment capabilities, where a faster-than-real-time, high-fidelity, software-only simulation would allow the broadest coverage. Even real-time systems that can be replicated and run in parallel, and that have reliable set-up and operations features, provide an excellent resource for automated testing. Making detailed predictions for the outcome of such tests can be difficult, and when algorithmic means are employed to produce hundreds or even thousands of cases, generating predicts individually is impractical, and generating predicts with tools requires executable models of the design and environment that themselves require a complete test program. Therefore, evaluating the results of a large number of mission scenario tests poses special challenges. A good approach to address this problem is to automatically score the results based on a range of metrics. Although the specific means of scoring depends highly on the application, the use of formal scoring metrics has high value in identifying and prioritizing anomalies and in presenting an overall picture of the state of the test program. In this paper we present a case study based on automatic generation and assessment of faulted test runs for the Dawn mission, and discuss its role in optimizing the allocation of resources for completing the test program.
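The generate-then-score idea described above can be sketched as follows; the monitor names, state-change boundaries, injection timings, and scoring weights are invented placeholders rather than Dawn test artifacts.

```python
import itertools

# Hypothetical fault monitors, state-change boundaries, and injection timings.
MONITORS = ["power_bus_undervolt", "thruster_stuck", "star_tracker_dropout"]
STATE_CHANGES = ["launch->cruise", "cruise->approach", "approach->orbit"]
TIMING = ["before", "during", "after"]

def generate_cases():
    """Enumerate every monitor x boundary x timing combination as a test case."""
    return [{"monitor": m, "boundary": b, "timing": t}
            for m, b, t in itertools.product(MONITORS, STATE_CHANGES, TIMING)]

def score_run(telemetry):
    """Weighted score: fault detected, correct isolation, timely safe-mode entry."""
    score = 0.0
    score += 0.5 if telemetry["fault_detected"] else 0.0
    score += 0.3 if telemetry["isolated_component"] == telemetry["injected_component"] else 0.0
    score += 0.2 if telemetry["safe_mode_latency_s"] < 60 else 0.0
    return score

cases = generate_cases()
print(len(cases), "generated cases")          # 3 x 3 x 3 = 27
example_telemetry = {"fault_detected": True, "isolated_component": "thruster_stuck",
                     "injected_component": "thruster_stuck", "safe_mode_latency_s": 42}
print("example score:", score_run(example_telemetry))
```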
Microgrid optimized resource dispatch for public-purpose resiliency and sustainability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burr, Michael; Camilleri, John; Lubkeman, David
Communities in Atlantic coastal regions have in recent years sought to improve the resiliency of their critical infrastructure and public services, especially to protect against hurricanes and other events capable of causing widespread damage and disruption. As the backbone of any community’s critical functions, the electricity distribution system requires high resiliency in order to maintain local energy delivery services. Against this backdrop, the Project sought to develop a resilient energy microgrid control system capable of integrating distributed renewable energy resources, natural gas CHP units, energy storage, and demand-side management technologies in near-real-time optimization schemes for the community of Olney, Md. The Montgomery County Planning Board in 2005 established the Olney Town Center area as a “civic center/town commons,” in part because it serves as a key point of interaction in the community – but also because it contains numerous vital community assets. With a total peak electrical load of about 8 MW (including Montgomery General Hospital with a 2.4 MW peak), the Project area is a business and essential services area, directly serving a suburban population of more than 33,000 residents. It contains a hospital, police station, two fire stations, two schools, grocery stores, and gas stations, and the community’s water tower, among other things. Moreover, the location stands at the crossroads of two state highways that represent major regional arteries for commerce and public safety in Montgomery County. These characteristics made the Project area an appropriate setting for considering microgrid deployment. It presented a model of a typical Maryland suburban community, with geographic dispersion of vital assets over a sizeable area, and a combination of overhead distribution lines and underground cables serving those critical loads. Such a representative model helped to ensure the solutions developed and the scenarios tested would be readily applicable to other communities in the state and the region. Further, the Project’s outcomes and lessons provide insights to guide community microgrid design and development in many locations. To achieve Project objectives – including those established by the U.S. Department of Energy (DOE) National Energy Technology Laboratory – the Project team researched, developed, and tested in simulation a set of microgrid controls capable of maintaining electricity supplies for critical community loads in the event of a regional utility outage lasting many days or even weeks. Testing and analysis showed that the microgrid would be capable of maintaining electricity supply to critical loads essentially indefinitely in most outage scenarios, while also substantially improving overall reliability for microgrid customers. With targeted improvements in local utility distribution infrastructure, test analysis showed that the microgrid would be capable of reducing annual electricity outages for critical loads by 98%. Further, to help achieve environmental and efficiency policy goals established by both the State of Maryland and the federal government, the team designed the system to reduce the annual carbon footprint of served loads by 20%, and to improve system energy efficiency for those loads by at least 20%. Testing showed that, as designed, the system is capable of meeting these performance requirements, with potential for further improvements through more effective thermal energy utilization.
This Final Report, comprising four volumes and 11 annexes, presents the results of these project efforts, including feasibility assessment (section F) and guidance for decision-makers considering prospective deployment of public-purpose microgrid systems in Maryland communities.
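As a rough illustration of the priority-based islanded dispatch such a control system must perform, the sketch below serves critical loads first from CHP, solar, and storage, shedding non-critical load when generation falls short. All capacities and load values are illustrative placeholders, not the Olney design values.

```python
# Toy hourly dispatch for an islanded microgrid with a critical-load priority.
def dispatch_hour(critical_kw, noncritical_kw, solar_kw, chp_cap_kw, soc_kwh, batt_kw, batt_kwh):
    supply = solar_kw + chp_cap_kw
    served_critical = min(critical_kw, supply)
    remaining = supply - served_critical
    if served_critical < critical_kw:                      # draw on storage for critical load
        discharge = min(critical_kw - served_critical, batt_kw, soc_kwh)
        served_critical += discharge
        soc_kwh -= discharge
    served_noncritical = min(noncritical_kw, remaining)
    surplus = remaining - served_noncritical
    charge = min(surplus, batt_kw, batt_kwh - soc_kwh)     # absorb any surplus into storage
    soc_kwh += charge
    shed = (critical_kw - served_critical) + (noncritical_kw - served_noncritical)
    return served_critical, served_noncritical, shed, soc_kwh

soc = 2000.0  # kWh of stored energy
for hour, (crit, noncrit, solar) in enumerate([(2400, 4000, 0), (2400, 4200, 1500), (2600, 3800, 300)]):
    out = dispatch_hour(crit, noncrit, solar, chp_cap_kw=4500, soc_kwh=soc, batt_kw=1000, batt_kwh=4000)
    soc = out[3]
    print(f"hour {hour}: critical served {out[0]:.0f} kW, shed {out[2]:.0f} kW, SOC {soc:.0f} kWh")
```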
Tomographic capabilities of the new GEM based SXR diagnostic of WEST
NASA Astrophysics Data System (ADS)
Jardin, A.; Mazon, D.; O'Mullane, M.; Mlynar, J.; Loffelmann, V.; Imrisek, M.; Chernyshova, M.; Czarski, T.; Kasprowicz, G.; Wojenski, A.; Bourdelle, C.; Malard, P.
2016-07-01
The tokamak WEST (Tungsten Environment in Steady-State Tokamak) will start operating by the end of 2016 as a test bed for the ITER divertor components in long pulse operation. In this context, radiative cooling of heavy impurities like tungsten (W) in the Soft X-ray (SXR) range [0.1 keV; 20 keV] is a critical issue for plasma core performance. Thus reliable tools are required to monitor the local impurity density and avoid W accumulation. The WEST SXR diagnostic will be equipped with two new GEM (Gas Electron Multiplier) based poloidal cameras, allowing 2D tomographic reconstructions to be performed in tunable energy bands. In this paper the tomographic capabilities of the Minimum Fisher Information (MFI) algorithm developed for Tore Supra and upgraded for WEST are investigated, in particular through a set of emissivity phantoms and the standard WEST scenario, including reconstruction errors, the influence of noise, and computational time.
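A toy, 1-D illustration of an MFI-style inversion (iteratively reweighted regularized least squares with smoothing weights roughly proportional to 1/f) is sketched below. The chord geometry, phantom, and regularization strength are invented and bear no relation to the WEST diagnostic or the actual MFI implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 40
x = np.linspace(-1, 1, n)
phantom = np.exp(-(x / 0.4) ** 2) + 0.3 * np.exp(-((x - 0.5) / 0.1) ** 2)  # core + blob

T = rng.random((25, n)) * (rng.random((25, n)) > 0.7)    # sparse, made-up chord geometry matrix
g = T @ phantom + rng.normal(0, 0.01, 25)                # noisy line-integrated signals

D = np.diff(np.eye(n), axis=0)                           # discrete gradient operator, (n-1, n)
f = np.full(n, max(g.mean(), 0.0) / max(T.sum(axis=1).mean(), 1e-9))  # flat first guess
lam = 0.05
for _ in range(8):                                       # MFI-style reweighting iterations
    w_mid = 1.0 / np.maximum(0.5 * (f[1:] + f[:-1]), 1e-3)   # Fisher-like weights ~ 1/f
    A = T.T @ T + lam * D.T @ np.diag(w_mid) @ D
    f = np.linalg.solve(A, T.T @ g)
    f = np.maximum(f, 0.0)

print("relative reconstruction error:",
      round(np.linalg.norm(f - phantom) / np.linalg.norm(phantom), 3))
```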
Landing System Development- Design and Test Prediction of a Lander Leg Using Nonlinear Analysis
NASA Astrophysics Data System (ADS)
Destefanis, Stefano; Buchwald, Robert; Pellegrino, Pasquale; Schroder, Silvio
2014-06-01
Several mission studies have been performed focusing on soft and precise landing using landing legs. Examples of such missions are Mars Sample Return scenarios (MSR), Lunar landing scenarios (MoonNEXT, Lunar Lander) and small-body sample return studies (Marco Polo, MMSR, Phootprint). Such missions foresee a soft landing on the planet surface for delivering payload in a controlled manner and limiting the landing loads. To ensure a successful final landing phase, a landing system is needed that is capable of absorbing the residual velocities (vertical, horizontal and angular) at touchdown and ensuring a controlled attitude after landing. Such requirements can be fulfilled by using landing legs with adequate damping. The Landing System Development (LSD) study, currently in its phase 2, foresees the design, analysis, verification, manufacturing and testing of a representative landing leg breadboard based on the Phase B design of the ESA Lunar Lander. Drop tests of a single leg will be performed both on rigid and soft ground, at several impact angles. The activity is covered under ESA contract with TAS-I as Prime Contractor, responsible for analysis and verification, Astrium GmbH for design and test, and QinetiQ Space for manufacturing. Drop tests will be performed at the Institute of Space Systems of the German Aerospace Center (DLR-RY) in Bremen. This paper presents an overview of the analytical simulations (test predictions and design verification) performed, comparing the results produced by the Astrium-made multi-body model (rigid bodies, with nonlinearities accounted for in mechanical joints and force definitions, based on development tests) and the TAS-I-made nonlinear explicit model (fully deformable bodies).
Malo, Sergio; Fateri, Sina; Livadas, Makis; Mares, Cristinel; Gan, Tat-Hean
2017-07-01
Ultrasonic guided waves testing is a technique successfully used in many industrial scenarios worldwide. For many complex applications, the dispersive nature and multimode behavior of the technique still poses a challenge for correct defect detection capabilities. In order to improve the performance of the guided waves, a 2-D compressed pulse analysis is presented in this paper. This novel technique combines the use of pulse compression and dispersion compensation in order to improve the signal-to-noise ratio (SNR) and temporal-spatial resolution of the signals. The ability of the technique to discriminate different wave modes is also highlighted. In addition, an iterative algorithm is developed to identify the wave modes of interest using adaptive peak detection to enable automatic wave mode discrimination. The employed algorithm is developed in order to pave the way for further in situ applications. The performance of Barker-coded and chirp waveforms is studied in a multimodal scenario where longitudinal and flexural wave packets are superposed. The technique is tested in both synthetic and experimental conditions. The enhancements in SNR and temporal resolution are quantified as well as their ability to accurately calculate the propagation distance for different wave modes.
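The pulse-compression half of that idea reduces, in its simplest form, to matched filtering of a coded excitation such as a chirp, as in the sketch below. The dispersion-compensation step and the 2-D analysis of the paper are omitted, and all waveform parameters are illustrative.

```python
import numpy as np

fs = 1e6                                    # sample rate, Hz
t = np.arange(0, 200e-6, 1 / fs)            # 200-sample excitation window
f0, f1 = 50e3, 150e3                        # linear chirp band, Hz
chirp = np.sin(2 * np.pi * (f0 * t + 0.5 * (f1 - f0) / t[-1] * t ** 2))

# Received trace: two echoes buried in noise.
rng = np.random.default_rng(0)
trace = np.zeros(4000)
for delay, amp in [(800, 1.0), (1400, 0.6)]:
    trace[delay:delay + chirp.size] += amp * chirp
trace += rng.normal(0, 0.5, trace.size)

compressed = np.correlate(trace, chirp, mode="same")    # matched filter (pulse compression)
print("strongest echo near sample:", int(np.argmax(np.abs(compressed))))
```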
Sticky foam as a less-than-lethal technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, S.H.
1996-12-31
Sandia National Labs (SNL) in 1994 completed a project funded by the National Institute of Justice (NIJ) to determine the applicability of sticky foam for correctional applications. Sticky foam is an extremely tacky, tenacious material used to block, entangle, and impair individuals. The NIJ project developed a gun capable of firing multiple shots of sticky foam, tested the gun and sticky foam effectiveness on SNL volunteers acting out prison and law enforcement scenarios, and had the gun and sticky foam evaluated by correctional representatives. Based on the NIJ project work, SNL supported the Marine Corps Mission, Operation United Shield, with sticky foam guns and supporting equipment to assist in the withdrawal of UN Peacekeepers from Somalia. Prior to the loan of the equipment, the Marines were given training in sticky foam characterization, toxicology, safety issues, cleanup and waste disposal, use limitations, use protocol and precautions, emergency facial clean-up, skin cleanup, gun filling, targeting and firing, and gun cleaning. The Marine Corps successfully used the sticky foam guns as part of that operation. This paper describes these recent developments of sticky foam for non-lethal uses and some of the lessons learned from scenario and application testing.
Fernández, Roemi; Salinas, Carlota; Montes, Héctor; Sarria, Javier
2014-01-01
The motivation of this research was to explore the feasibility of detecting and locating fruits from different kinds of crops in natural scenarios. To this end, a unique, modular and easily adaptable multisensory system and a set of associated pre-processing algorithms are proposed. The proposed multisensory rig combines a high-resolution colour camera and a multispectral system for the detection of fruits, as well as for the discrimination of the different elements of the plants, and a Time-Of-Flight (TOF) camera that provides fast acquisition of distances, enabling the localisation of the targets in the coordinate space. A controlled lighting system completes the set-up, increasing its flexibility for use in different working conditions. The pre-processing algorithms designed for the proposed multisensory system include a pixel-based classification algorithm that labels areas of interest belonging to fruits and a registration algorithm that combines the results of the aforementioned classification algorithm with the data provided by the TOF camera for the 3D reconstruction of the desired regions. Several experimental tests have been carried out in outdoor conditions in order to validate the capabilities of the proposed system. PMID:25615730
Countering MANPADS: study of new concepts and applications: part two
NASA Astrophysics Data System (ADS)
Maltese, Dominique; Vergnolle, Jean-François; Aragones, Julien; Renaudat, Mathieu
2007-04-01
Recent ground-to-air attacks on aircraft with Man-Portable Air Defense Systems (MANPADS) have revealed a new threat to both military and civilian aircraft. Consequently, the implementation of protection systems (i.e., Directed Infra-Red Counter Measures - DIRCM) against IR-guided missiles has become unavoidable. In the near future, aircraft will need detection, tracking, identification, targeting and jamming capabilities to face MANPADS threats. In addition, multiple-missile attacks are an increasingly common scenario to deal with. In this paper, a practical example of the DIRCM systems under study at SAGEM DEFENSE & SECURITY is presented. The article is a continuation of a previous SPIE paper. Self-protection solutions include built-in and automatic locking-on, tracking, identification and laser jamming capabilities, including defeat assessment. Target designations are provided by a Missile Warning System. Target scenarios including multiple threats are considered to design system architectures. In a first step, the article reviews the context, current and future threats (IR seekers of different generations...), and the scenarios used for system definition. It then focuses on potential self-protection systems under study at SAGEM DEFENSE & SECURITY. Different strategies including target identification, multi-band laser and active imagery have previously been studied in order to design DIRCM system solutions. Results of self-protection simulations are provided for different MANPADS scenarios to highlight the key problems to solve. Data have been obtained from simulation software modeling full DIRCM system architectures on technical and operational scenarios (parametric studies).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reimus, Paul William
This report provides documentation of the mathematical basis for a colloid-facilitated radionuclide transport modeling capability that can be incorporated into GDSA-PFLOTRAN. It also provides numerous test cases against which the modeling capability can be benchmarked once the model is implemented numerically in GDSA-PFLOTRAN. The test cases were run using a 1-D numerical model developed by the author, and the inputs and outputs from the 1-D model are provided in an electronic spreadsheet supplement to this report so that all cases can be reproduced in GDSA-PFLOTRAN, and the outputs can be directly compared with the 1-D model. The cases include examples of all potential scenarios in which colloid-facilitated transport could result in the accelerated transport of a radionuclide relative to its transport in the absence of colloids. Although it cannot be claimed that all the model features that are described in the mathematical basis were rigorously exercised in the test cases, the goal was to test the features that matter the most for colloid-facilitated transport; i.e., slow desorption of radionuclides from colloids, slow filtration of colloids, and equilibrium radionuclide partitioning to colloids that is strongly favored over partitioning to immobile surfaces, resulting in a substantial fraction of radionuclide mass being associated with mobile colloids.
Spacecraft Environmental Testing SMAP (Soil, Moisture, Active, Passive)
NASA Technical Reports Server (NTRS)
Fields, Keith
2014-01-01
Testing a complete, fully assembled spacecraft to verify that it will survive the environment to which it will be exposed during its mission is a formidable task in itself. However, the "test like you fly" philosophy sometimes gets compromised because of cost, design, and/or schedule. This paper describes the thermal-vacuum and mass properties testing of the Soil Moisture Active Passive (SMAP) Earth-orbiting satellite. SMAP will provide global observations of soil moisture and freeze/thaw state (the hydrosphere state). SMAP hydrosphere state measurements will be used to enhance understanding of processes that link the water, energy, and carbon cycles, and to extend the capabilities of weather and climate prediction models. The paper explains the problems encountered, and the solutions developed, which minimized the risk typically associated with such an arduous process. Also discussed is the future of testing on expensive, long lead-time spacecraft: will we ever reach the "build and shoot" scenario with minimal or no verification testing?
Plan Execution Interchange Language (PLEXIL)
NASA Technical Reports Server (NTRS)
Estlin, Tara; Jonsson, Ari; Pasareanu, Corina; Simmons, Reid; Tso, Kam; Verma, Vandi
2006-01-01
Plan execution is a cornerstone of spacecraft operations, irrespective of whether the plans to be executed are generated on board the spacecraft or on the ground. Plan execution frameworks vary greatly, due to both different capabilities of the execution systems, and relations to associated decision-making frameworks. The latter dependency has made the reuse of execution and planning frameworks more difficult, and has all but precluded information sharing between different execution and decision-making systems. As a step in the direction of addressing some of these issues, a general plan execution language, called the Plan Execution Interchange Language (PLEXIL), is being developed. PLEXIL is capable of expressing concepts used by many high-level automated planners and hence provides an interface to multiple planners. PLEXIL includes a domain description that specifies command types, expansions, constraints, etc., as well as feedback to the higher-level decision-making capabilities. This document describes the grammar and semantics of PLEXIL. It includes a graphical depiction of this grammar and illustrative rover scenarios. It also outlines ongoing work on implementing a universal execution system, based on PLEXIL, using state-of-the-art rover functional interfaces and planners as test cases.
In-vehicle group activity modeling and simulation in sensor-based virtual environment
NASA Astrophysics Data System (ADS)
Shirkhodaie, Amir; Telagamsetti, Durga; Poshtyar, Azin; Chan, Alex; Hu, Shuowen
2016-05-01
Human group activity recognition is a very complex and challenging task, especially for Partially Observable Group Activities (POGA) that occur in confined spaces with limited visual observability and often under severe occlusion. In this paper, we present the IRIS Virtual Environment Simulation Model (VESM) for the modeling and simulation of dynamic POGA. More specifically, we address sensor-based modeling and simulation of a specific category of POGA, called In-Vehicle Group Activities (IVGA). In VESM, human-like animated characters, called humanoids, are employed to simulate complex in-vehicle group activities within the confined space of a modeled vehicle. Each articulated humanoid is kinematically modeled with physical attributes and appearances comparable to its human counterpart. Each humanoid exhibits harmonious full-body motion - simulating human-like gestures and postures, facial expressions, and hand motions for coordinated dexterity. VESM facilitates the creation of interactive scenarios consisting of multiple humanoids with different personalities and intentions, which are capable of performing complicated human activities within the confined space inside a typical vehicle. In this paper, we demonstrate the efficiency and effectiveness of VESM in terms of its capability to seamlessly generate time-synchronized, multi-source, and correlated imagery datasets of IVGA, which are useful for the training and testing of multi-source full-motion video processing and annotation. Furthermore, we demonstrate full-motion video processing of such simulated scenarios under different operational contextual constraints.
Lopez-Iturri, Peio; Aguirre, Erik; Trigo, Jesús Daniel; Astrain, José Javier; Azpilicueta, Leyre; Serrano, Luis; Villadangos, Jesús; Falcone, Francisco
2018-01-29
In the context of hospital management and operation, Intensive Care Units (ICU) are among the most challenging environments in terms of time responsiveness and criticality, in which adequate resource management and signal processing play a key role in overall system performance. In this work, a context-aware Intensive Care Unit is implemented and analyzed to provide scalable signal acquisition capabilities, as well as tracking and access control. Wireless channel analysis is performed by means of hybrid optimized 3D Ray Launching deterministic simulation to assess potential interference impact as well as to provide the required coverage/capacity thresholds for the employed transceivers. Wireless system operation within the ICU scenario, considering conventional transceiver operation, is feasible in terms of quality of service for the complete scenario. Extensive measurements of overall interference levels have also been carried out, enabling subsequent adequate coverage/capacity estimations for a set of Zigbee-based nodes. Real system operation has been tested with ad-hoc designed Zigbee wireless motes, employing lightweight communication protocols to minimize energy and bandwidth usage. An ICU information gathering application and a software architecture for Visitor Access Control have been implemented, providing monitoring of the Boxes' external doors and identification of visitors via an RFID system. The results enable a solution that provides ICU access control and tracking capabilities not previously exploited, providing a step forward in the implementation of a Smart Health framework.
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among the various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability to provide accurate sensitivity measurements. However, the conventional variance-based method only considers the uncertainty contributions of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework that allows flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model, and parametric. Furthermore, each layer of uncertainty sources can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network, with the different uncertainty components represented as uncertain nodes. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility in how uncertainty components are grouped. The variance-based sensitivity analysis is thus extended to investigate the importance of a wider range of uncertainty sources: scenario, model, and other combinations of uncertainty components that represent key model system processes (e.g., the groundwater recharge process, the flow and reactive transport process). For testing and demonstration purposes, the developed methodology was applied to a real-world groundwater reactive transport model with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measures for any uncertainty sources formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and help decision-makers formulate policies and strategies.
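For readers unfamiliar with variance-based measures, the sketch below shows a standard Monte Carlo (pick-freeze) estimate of first-order Sobol indices, the single-parameter building block that the framework above generalizes; the model, the sampler, and the commented toy example are placeholders, not the groundwater model from the study.

```python
import numpy as np

def first_order_sobol(model, sampler, n=10000, seed=0):
    """Pick-freeze Monte Carlo estimate of first-order Sobol indices.
    model(X) maps an (n, d) input array to n outputs; sampler(n, rng) draws
    n independent input vectors."""
    rng = np.random.default_rng(seed)
    A, B = sampler(n, rng), sampler(n, rng)
    yA, yB = model(A), model(B)
    var_y = np.var(np.concatenate([yA, yB]), ddof=1)
    indices = []
    for i in range(A.shape[1]):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                        # resample only parameter i
        yABi = model(ABi)
        indices.append(np.mean(yB * (yABi - yA)) / var_y)   # Saltelli-type estimator
    return np.array(indices)

# Toy usage with the Ishigami test function (assumed example, not from the paper):
# sampler = lambda n, rng: rng.uniform(-np.pi, np.pi, size=(n, 3))
# model = lambda X: np.sin(X[:, 0]) + 7*np.sin(X[:, 1])**2 + 0.1*X[:, 2]**4*np.sin(X[:, 0])
# print(first_order_sobol(model, sampler))
```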
Modeling a student-classroom interaction in a tutorial-like system using learning automata.
Oommen, B John; Hashem, M Khaled
2010-02-01
Almost all of the learning paradigms used in machine learning, learning automata (LA), and learning theory in general use the philosophy of a Student (learning mechanism) attempting to learn from a teacher. This paradigm has been generalized in a myriad of ways, including the scenario where there are multiple teachers or a hierarchy of mechanisms that collectively achieve the learning. In this paper, we consider a departure from this paradigm by allowing the Student to be a member of a classroom of Students, where, for the most part, we permit each member of the classroom not only to learn from the teacher(s) but also to "extract" information from any of his fellow Students. This paper deals with issues concerning the modeling, decision-making process, and testing of such a scenario within the LA context. The main result that we show is that a weak learner can actually benefit from this capability of utilizing the information that he gets from a superior colleague, provided this information transfer is done appropriately. As far as we know, the whole concept of Students learning from both a teacher and from a classroom of Students is novel and unreported in the literature. The proposed Student-classroom interaction has been tested for numerous strategies and for different environments, including the established benchmarks, and the results show that Students can improve their learning by interacting with each other. For example, for some interaction strategies, a weak Student can improve his learning by up to 73% when interacting with a classroom of Students that includes Students of various capabilities. In these interactions, the Student does not have a priori knowledge of the identity or characteristics of the Students who offer their assistance.
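As background on the LA update rule that such Student models build on, here is a minimal sketch of the classic Linear Reward-Inaction (L_RI) scheme; the learning rate and the function interface are illustrative and do not represent the interaction strategies studied in the paper.

```python
import numpy as np

def lri_update(p, action, rewarded, lam=0.05):
    """One Linear Reward-Inaction (L_RI) step on an action-probability vector:
    on reward, probability mass shifts toward the chosen action; on penalty,
    the probabilities are left unchanged."""
    p = np.asarray(p, dtype=float)
    if rewarded:
        p = (1.0 - lam) * p       # shrink every action probability
        p[action] += lam          # then return the freed mass to the chosen action
    return p

# Example: p = lri_update([0.25, 0.25, 0.25, 0.25], action=2, rewarded=True)
```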
A publicly available toxicogenomics capability for supporting predictive toxicology and meta-analysis depends on availability of gene expression data for chemical treatment scenarios, the ability to locate and aggregate such information by chemical, and broad data coverage within...
A Novel Method to Handle the Effect of Uneven Sampling Effort in Biodiversity Databases
Pardo, Iker; Pata, María P.; Gómez, Daniel; García, María B.
2013-01-01
How reliable are results on the spatial distribution of biodiversity based on databases? Many studies have evidenced the uncertainty related to this kind of analysis due to sampling effort bias and the need for its quantification. Although a number of methods are available for this, little is known about their statistical limitations and discrimination capability, which could seriously constrain their use. We assess for the first time the discrimination capacity of two widely used methods and a proposed new one (FIDEGAM), all based on species accumulation curves, under different scenarios of sampling exhaustiveness using Receiver Operating Characteristic (ROC) analyses. Additionally, we examine to what extent the output of each method represents the sampling completeness in a simulated scenario where the true species richness is known. Finally, we apply FIDEGAM to a real situation and explore the spatial patterns of plant diversity in a National Park. FIDEGAM showed an excellent discrimination capability to distinguish between well and poorly sampled areas regardless of sampling exhaustiveness, whereas the other methods failed. Accordingly, FIDEGAM values were strongly correlated with the true percentage of species detected in a simulated scenario, whereas sampling completeness estimated with the other methods showed no relationship due to null discrimination capability. Quantifying sampling effort is necessary to account for the uncertainty in biodiversity analyses; however, not all proposed methods are equally reliable. Our comparative analysis demonstrated that FIDEGAM was the most accurate discriminator method in all scenarios of sampling exhaustiveness, and therefore, it can be efficiently applied to most databases in order to enhance the reliability of biodiversity analyses. PMID:23326357
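To give a flavour of the accumulation-curve statistics and ROC check being compared (this is not FIDEGAM itself), the toy sketch below scores a grid cell by the mean final slope of randomized species accumulation curves and then measures how well that score separates labelled well- and poorly sampled cells; the slope criterion, permutation count, and labels are assumptions for illustration.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def accumulation_slope(records, n_perm=50, seed=0):
    """Mean last-step gain of randomized species accumulation curves.
    records: binary matrix, rows = sampling events, columns = species."""
    rng = np.random.default_rng(seed)
    slopes = []
    for _ in range(n_perm):
        order = rng.permutation(records.shape[0])
        seen = np.cumsum(records[order], axis=0) > 0
        richness = seen.sum(axis=1)                 # species accumulated after k events
        slopes.append(richness[-1] - richness[-2])  # near zero -> curve has flattened
    return float(np.mean(slopes))

# Discrimination check, ROC-style: a lower slope should indicate a better-sampled cell
# scores = [accumulation_slope(cell) for cell in cells]
# auc = roc_auc_score(well_sampled_labels, -np.array(scores))
```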
A methodology for evaluation of an interactive multispectral image processing system
NASA Technical Reports Server (NTRS)
Kovalick, William M.; Newcomer, Jeffrey A.; Wharton, Stephen W.
1987-01-01
Because of the considerable cost of an interactive multispectral image processing system, an evaluation of a prospective system should be performed to ascertain if it will be acceptable to the anticipated users. Evaluation of a developmental system indicated that the important system elements include documentation, user friendliness, image processing capabilities, and system services. The criteria and evaluation procedures for these elements are described herein. The following factors contributed to the success of the evaluation of the developmental system: (1) careful review of documentation prior to program development, (2) construction and testing of macromodules representing typical processing scenarios, (3) availability of other image processing systems for referral and verification, and (4) use of testing personnel with an applications perspective and experience with other systems. This evaluation was done in addition to and independently of program testing by the software developers of the system.
Towards a Passive Low-Cost In-Home Gait Assessment System for Older Adults
Wang, Fang; Stone, Erik; Skubic, Marjorie; Keller, James M.; Abbott, Carmen; Rantz, Marilyn
2013-01-01
In this paper, we propose a webcam-based system for in-home gait assessment of older adults. A methodology has been developed to extract gait parameters, including walking speed, step time and step length, from a three-dimensional voxel reconstruction built from two calibrated webcam views. The gait parameters were validated against a GAITRite mat and a Vicon motion capture system in the lab with 13 participants and 44 tests, and again against GAITRite for 8 older adults in senior housing. Excellent agreement, with intra-class correlation coefficients of 0.99 and repeatability coefficients between 0.7% and 6.6%, was found for walking speed, step time and step length given the limitations of frame rate and voxel resolution. The system was further tested with 10 seniors in a scripted scenario representing everyday activities in an unstructured environment. The results demonstrate that the system is capable of serving as a daily gait assessment tool for fall risk assessment and other medical applications. Furthermore, we found that residents displayed different gait patterns during their clinical GAITRite tests compared to the realistic scenario, namely a mean increase of 21% in walking speed, a mean decrease of 12% in step time, and a mean increase of 6% in step length. These findings provide support for continuous gait assessment in the home for capturing habitual gait. PMID:24235111
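As a simplified illustration of the gait-parameter extraction (a reduction of the voxel-based pipeline, not the authors' implementation), the sketch below computes walking speed, mean step time, and mean step length from a sequence of estimated footfall times and positions.

```python
import numpy as np

def gait_parameters(footfalls):
    """footfalls: array-like of (time_s, x_m, y_m) footfall estimates in walking order."""
    f = np.asarray(footfalls, dtype=float)
    times, xy = f[:, 0], f[:, 1:3]
    step_lengths = np.linalg.norm(np.diff(xy, axis=0), axis=1)   # distance between footfalls
    step_times = np.diff(times)                                  # time between footfalls
    walking_speed = step_lengths.sum() / (times[-1] - times[0])  # total distance / total time
    return walking_speed, step_times.mean(), step_lengths.mean()

# Example: gait_parameters([(0.0, 0.0, 0.0), (0.55, 0.6, 0.0), (1.1, 1.2, 0.05)])
```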
Deployment of a Testbed in a Brazilian Research Network using IPv6 and Optical Access Technologies
NASA Astrophysics Data System (ADS)
Martins, Luciano; Ferramola Pozzuto, João; Olimpio Tognolli, João; Chaves, Niudomar Siqueira De A.; Reggiani, Atilio Eduardo; Hortêncio, Claudio Antonio
2012-04-01
This article presents the implementation of a testbed and the experimental results obtained with it on the Brazilian Experimental Network of the government-sponsored "GIGA Project." The use of IPv6 integrated with current and emerging optical architectures and technologies, such as dense wavelength division multiplexing and 10-gigabit Ethernet in the core, and gigabit-capable passive optical network and optical distribution network in the access segment, was tested. These protocols, architectures, and optical technologies are promising and form part of a new worldwide technological scenario that is being widely adopted in enterprise and provider networks.
Executive Summary of Propulsion on the Orion Abort Flight-Test Vehicles
NASA Technical Reports Server (NTRS)
Jones, Daniel S.; Koelfgen, Syri J.; Barnes, Marvin W.; McCauley, Rachel J.; Wall, Terry M.; Reed, Brian D.; Duncan, C. Miguel
2012-01-01
The NASA Orion Flight Test Office was tasked with conducting a series of flight tests in several launch abort scenarios to certify that the Orion Launch Abort System is capable of delivering astronauts aboard the Orion Crew Module to a safe environment, away from a failed booster. The first of this series was the Orion Pad Abort 1 Flight-Test Vehicle, which was successfully flown on May 6, 2010 at the White Sands Missile Range in New Mexico. This paper provides a brief overview of the three propulsive subsystems used on the Pad Abort 1 Flight-Test Vehicle. An overview of the propulsive systems originally planned for future flight-test vehicles is also provided, which also includes the cold gas Reaction Control System within the Crew Module, and the Peacekeeper first stage rocket motor encased within the Abort Test Booster aeroshell. Although the Constellation program has been cancelled and the operational role of the Orion spacecraft has significantly evolved, lessons learned from Pad Abort 1 and the other flight-test vehicles could certainly contribute to the vehicle architecture of many future human-rated space launch vehicles.
Testing of a Stitched Composite Large-Scale Multi-Bay Pressure Box
NASA Technical Reports Server (NTRS)
Jegley, Dawn; Rouse, Marshall; Przekop, Adam; Lovejoy, Andrew
2016-01-01
NASA has created the Environmentally Responsible Aviation (ERA) Project to develop technologies to reduce aviation's impact on the environment. A critical aspect of this pursuit is the development of a lighter, more robust airframe to enable the introduction of unconventional aircraft configurations. NASA and The Boeing Company have worked together to develop a structural concept that is lightweight and an advancement beyond state-of-the-art composite structures. The Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) is an integrally stiffened panel design in which elements are stitched together. The PRSEUS concept is designed to maintain residual load-carrying capability under a variety of damage scenarios. A series of building-block tests was conducted to evaluate the fundamental assumptions related to the capability and advantages of PRSEUS panels. The final step in the building-block series is an 80%-scale pressure box representing a portion of the center section of a Hybrid Wing Body (HWB) transport aircraft. The testing of this article under maneuver load and internal pressure load conditions is the subject of this paper. The experimental evaluation of this article, along with the other building-block tests and the accompanying analyses, has demonstrated the viability of a PRSEUS center body for the HWB vehicle. Additionally, much of the development effort is also applicable to traditional tube-and-wing aircraft, advanced aircraft configurations, and other structures where weight and through-the-thickness strength are design considerations.
NASA Technical Reports Server (NTRS)
McElroy, Mark; Jackson, Wade; Pankow, Mark
2016-01-01
It is not easy to isolate the damage mechanisms associated with low-velocity impact in composites using traditional experiments. In this work, a new experiment is presented with the goal of generating data representative of progressive damage processes caused by low-velocity impact in composite materials. Carbon fiber reinforced polymer test specimens were indented quasi-statically such that a biaxial-bending state of deformation was achieved. As a result, a three-dimensional damage process, involving delamination and delamination-migration, was observed and documented using ultrasonic and x-ray computed tomography. Results from two different layups are presented in this paper. Delaminations occurred at up to three different interfaces and interacted with one another via transverse matrix cracks. Although this damage pattern is much less complex than that of low-velocity impact on a plate, it is more complex than that of a standard delamination coupon test and provides a way to generate delamination, matrix cracking, and delamination-migration in a controlled manner. By limiting the damage process in the experiment to three delaminations, the same damage mechanisms seen during impact could be observed but in a simplified manner. This type of data is useful in stages of model development and validation when the model is capable of simulating simple tests, but not yet capable of simulating more complex and realistic damage scenarios.
JIMM: the next step for mission-level models
NASA Astrophysics Data System (ADS)
Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.
2001-09-01
The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product are done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic representation of simulation entities, its data analysis capability, and its robust configuration management process, JIMM can be used to support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). An MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, Michel; Archer, Bill; Hendrickson, Bruce
The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.
NASA Astrophysics Data System (ADS)
Labak, Peter; Sussman, Aviva; Rowlands, Aled; Chiappini, Massimo; Malich, Gregor; MacLeod, Gordon; Sankey, Peter; Sweeney, Jerry; Tuckwell, George
2016-04-01
The Integrated Field Exercise of 2014 (IFE14) was a field event held in the Hashemite Kingdom of Jordan (with concurrent activities in Austria) that tested the operational and technical capabilities of a Comprehensive Test Ban Treaty (CTBT) on-site inspection (OSI). During an OSI, up to 40 inspectors search a 1000 km2 inspection area for evidence of a nuclear explosion. Over 250 experts from ~50 countries were involved in IFE14 (the largest simulation of an OSI to date) and worked in a number of different capacities, ranging from the Exercise Management and Control Teams, which executed the scenario in which the exercise was played, to the participants performing as members of the Inspection Team (IT). One of the main objectives of IFE14 was to test Treaty-allowed inspection techniques, including a number of geophysical and remote sensing methods. In order to develop a scenario in which the simulated exercise could be carried out, a number of physical features in the IFE14 inspection area were designed and engineered by the Scenario Task Force Group (STF) for the IT to detect by applying the geophysical and remote sensing inspection technologies, as well as other techniques allowed by the CTBT. For example, in preparation for IFE14, the STF modeled a seismic triggering event that was provided to the IT to prompt them to detect and localize aftershocks in the vicinity of a possible explosion. Similarly, the STF planted shallow targets such as borehole casings and pipes for detection by other geophysical methods. In addition, airborne technologies, which included multi-spectral imaging, were deployed so that the IT could identify freshly exposed surfaces, imported materials and other areas that had been subject to modification. This presentation will introduce the CTBT and OSI, explain the IFE14 in terms of goals specific to geophysical and remote sensing methods, and show how both the preparation for and execution of IFE14 met those goals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Brian; Oppel, Fred; Rigdon, Brian
2012-09-13
This package contains classes that capture high-level aspects of characters and vehicles. Vehicles manage seats and riders. Vehicles and characters now can be configured to compose different behaviors and have certain capabilities, by adding them through xml data. These behaviors and capabilities are not included in this package, but instead are part of other packages such as mobility behavior, path planning, sight, sound. Entity is not dependent on these other packages. This package also contains the icons used for Umbra applications Dante Scenario Editor, Dante Tabletop and OpShed. This assertion includes a managed C++ wrapper code (EntityWrapper) to enable C# applications, such as Dante Scenario Editor, Dante Tabletop, and OpShed, to incorporate this library.
MESSOC capabilities and results [Model for Estimating Space Station Operations Costs]
NASA Technical Reports Server (NTRS)
Shishko, Robert
1990-01-01
MESSOC (Model for Estimating Space Station Operations Costs) is the result of a multi-year effort by NASA to understand and model the mature operations cost of Space Station Freedom. This paper focuses on MESSOC's ability to contribute to life-cycle cost analyses through its logistics equations and databases. Together, these afford MESSOC the capability to project not only annual logistics costs for a variety of Space Station scenarios, but critical non-cost logistics results such as annual Station maintenance crewhours, upweight/downweight, and on-orbit sparing availability as well. MESSOC results using current logistics databases and baseline scenario have already shown important implications for on-orbit maintenance approaches, space transportation systems, and international operations cost sharing.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-20
... Development and Distribution of Annual Stress Test Scenarios AGENCY: Federal Deposit Insurance Corporation... distributing the stress test scenarios for the annual stress tests required by the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010 as implemented by the Annual Stress Test final rule (``Stress...
Coupling Matched Molecular Pairs with Machine Learning for Virtual Compound Optimization.
Turk, Samo; Merget, Benjamin; Rippmann, Friedrich; Fulle, Simone
2017-12-26
Matched molecular pair (MMP) analyses are widely used in compound optimization projects to gain insights into structure-activity relationships (SAR). The analysis is traditionally done via statistical methods but can also be combined with machine learning (ML) approaches to extrapolate to novel compounds. The MMP/ML method introduced here combines a fragment-based MMP implementation with different machine learning methods to obtain automated SAR decomposition and prediction. To test the prediction capabilities and model transferability, two different compound optimization scenarios were designed: (1) "new fragments," which occurs when exploring new fragments for a defined compound series, and (2) "new static core and transformations," which resembles, for instance, the identification of a new compound series. Very good results were achieved by all employed machine learning methods, especially for the new fragments case, but overall deep neural network models performed best, allowing reliable predictions also for the new static core and transformations scenario, where comprehensive SAR knowledge of the compound series is missing. Furthermore, we show that models trained on all available data have higher generalizability compared to models trained on focused series and can extend beyond the chemical space covered in the training data. Thus, coupling MMP with deep neural networks provides a promising approach to making high quality predictions on various data sets and in different compound optimization scenarios.
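To make the MMP-plus-ML idea concrete, here is a heavily simplified sketch: matched pairs are enumerated from pre-fragmented compounds that share a core, and a generic regressor is trained to predict the activity change of a fragment swap. The record format, one-hot fragment encoding, and random-forest choice are assumptions; the paper itself uses fragment-based descriptors and, for the best results, deep neural networks.

```python
import itertools
from sklearn.feature_extraction import DictVectorizer
from sklearn.ensemble import RandomForestRegressor

def build_mmp_table(records):
    """records: list of dicts {"core": str, "frag": str, "pIC50": float}, assuming
    fragmentation was done upstream. Returns (pair_features, activity_deltas)."""
    by_core = {}
    for r in records:
        by_core.setdefault(r["core"], []).append(r)
    pairs, deltas = [], []
    for mols in by_core.values():
        for a, b in itertools.permutations(mols, 2):     # ordered pair: frag_a -> frag_b
            pairs.append({"from": a["frag"], "to": b["frag"]})
            deltas.append(b["pIC50"] - a["pIC50"])
    return pairs, deltas

# pairs, deltas = build_mmp_table(training_records)
# X = DictVectorizer().fit_transform(pairs)              # one-hot encode the fragment swap
# model = RandomForestRegressor(n_estimators=200).fit(X, deltas)
```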
NASA Astrophysics Data System (ADS)
Duquet, Jean Remi; Bergeron, Pierre; Blodgett, Dale E.; Couture, Jean; Macieszczak, Maciej; Mayrand, Michel; Chalmers, Bruce A.; Paradis, Stephane
1998-03-01
The Research and Development group at Lockheed Martin Canada, in collaboration with the Defence Research Establishment Valcartier, has undertaken a research project in order to capture and analyze the real-time and functional requirements of a next generation Command and Control System (CCS) for the Canadian Patrol Frigates, integrating Multi- Sensor Data Fusion (MSDF), Situation and Threat Assessment (STA) and Resource Management (RM). One important aspect of the project is to define how the use of Artificial Intelligence may optimize the performance of an integrated, real-time MSDF/STA/RM system. A closed-loop simulation environment is being developed to facilitate the evaluation of MSDF/STA/RM concepts, algorithms and architectures. This environment comprises (1) a scenario generator, (2) complex sensor, hardkill and softkill weapon models, (3) a real-time monitoring tool, (4) a distributed Knowledge-Base System (KBS) shell. The latter is being completely redesigned and implemented in-house since no commercial KBS shell could adequately satisfy all the project requirements. The closed- loop capability of the simulation environment, together with its `simulated real-time' capability, allows the interaction between the MSDF/STA/RM system and the environment targets during the execution of a scenario. This capability is essential to measure the performance of many STA and RM functionalities. Some benchmark scenarios have been selected to demonstrate quantitatively the capabilities of the selected MSDF/STA/RM algorithms. The paper describes the simulation environment and discusses the MSDF/STA/RM functionalities currently implemented and their performance as an automatic CCS.
The role of opacity and transparency in achieving strategic stability in South Asia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rajain, Arpit; Ashraf, Tariq Mahmud
According to international relations theory, deterrence can be used as a tool to achieve stability between potentially hostile nations. India and Pakistan's long history of periodic crises raises the question of how they can achieve deterrence stability. 'Transparency' describes the flow of information between parties and plays a key role in establishing a deterrence relationship. This paper studies the balance needed between opacity and transparency in nuclear topics for the maintenance of deterrence stability between India and Pakistan. States with nuclear weapons are postulated to implement transparency in four categories: potential, capability, intent, and resolve. The study applies these categories to the nuclear components of the ongoing India-Pakistan Composite Dialogue Working Group for Peace and Security including CBMs. To focus our efforts, we defined four scenarios to characterize representative strategic/military/political conditions. The scenarios are combinations of these two sets of opposite poles: competition - cooperation; extremism - moderation (to be understood primarily in a religious/nationalistic sense). We describe each scenario in terms of select focal areas (nuclear doctrine, nuclear command and control, nuclear stockpile, nuclear delivery/defensive systems, and conventional force posture). The scenarios help frame the realm of possibilities, and have been described in terms of expected conditions for the focal areas. We then use the conditions in each scenario to prescribe a range of information-sharing actions that the two countries could take to increase stability. We also highlight the information that should not be shared. These actions can be political (e.g., declarations), procedural (e.g., advance notice of certain military activities), or technologically based (e.g., seismic monitoring of the nuclear test moratorium).
Crespel, Amélie; Zambonino-Infante, José-Luis; Mazurais, David; Koumoundouros, George; Fragkoulis, Stefanos; Quazuguel, Patrick; Huelvan, Christine; Madec, Laurianne; Servili, Arianna; Claireaux, Guy
2017-01-01
Ocean acidification is a recognized consequence of anthropogenic carbon dioxide (CO2) emission into the atmosphere. Despite its threat to marine ecosystems, little is presently known about the capacity of fish to respond efficiently to this acidification. In adult fish, acid-base regulatory capacities are believed to be relatively competent to respond to hypercapnic conditions. However, fish in early life stages could be particularly sensitive to environmental factors, as organs and important physiological functions become progressively operational during this period. In this study, the responses of European sea bass (Dicentrarchus labrax) larvae reared under three ocean acidification scenarios, i.e., control (present condition, pCO2 = 590 µatm, pH on the total scale = 7.9), low acidification (intermediate IPCC scenario, pCO2 = 980 µatm, pH = 7.7), and high acidification (most severe IPCC scenario, pCO2 = 1520 µatm, pH = 7.5), were compared across multiple levels of biological organization. From 2 to 45 days post-hatching, chronic exposure to the different scenarios had limited influence on the survival and growth of the larvae (in the low acidification condition only) and had no apparent effect on the digestive developmental processes. The high acidification condition induced both faster mineralization and a reduction in skeletal deformities. Global (microarray) and targeted (qPCR) analysis of transcript levels in whole larvae did not reveal any significant changes in gene expression across the tested acidification conditions. Overall, this study suggests that contemporary sea bass larvae are already capable of coping with projected acidification conditions without having to mobilize specific defense mechanisms.
77 FR 70124 - Policy Statement on the Scenario Design Framework for Stress Testing
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-23
... Statement on the Scenario Design Framework for Stress Testing AGENCY: Board of Governors of the Federal... Board is requesting public comment on a policy statement on the approach to scenario design for stress testing that would be used in connection with the supervisory and company-run stress tests conducted under...
78 FR 9633 - Policy Statement on the Scenario Design Framework for Stress Testing
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-11
... Statement on the Scenario Design Framework for Stress Testing AGENCY: Board of Governors of the Federal... design for stress testing that would be used in connection with the supervisory and company-run stress...) requesting public comment on a policy statement on the approach to scenario design for stress testing that...
Computer-aided testing of pilot response to critical in-flight events
NASA Technical Reports Server (NTRS)
Giffin, W. C.; Rockwell, T. H.
1984-01-01
This research on pilot response to critical in-flight events employs a unique methodology including an interactive computer-aided scenario-testing system. Navigation displays, instrument-panel displays, and assorted textual material are presented on a touch-sensitive CRT screen. Problem diagnosis scenarios, destination-diversion scenarios and combined destination/diagnostic tests are available. A complete time history of all data inquiries and responses is maintained. Sample results of diagnosis scenarios obtained from testing 38 licensed pilots are presented and discussed.
Human and Robotic Space Mission Use Cases for High-Performance Spaceflight Computing
NASA Technical Reports Server (NTRS)
Doyle, Richard; Bergman, Larry; Some, Raphael; Whitaker, William; Powell, Wesley; Johnson, Michael; Goforth, Montgomery; Lowry, Michael
2013-01-01
Spaceflight computing is a key resource in NASA space missions and a core determining factor of spacecraft capability, with ripple effects throughout the spacecraft, end-to-end system, and the mission; it can be aptly viewed as a "technology multiplier" in that advances in onboard computing provide dramatic improvements in flight functions and capabilities across the NASA mission classes, and will enable new flight capabilities and mission scenarios, increasing science and exploration return per mission-dollar.
Barnett, Ralph L; Liber, Theodore
2006-02-22
Use of unassisted human push capability arises from time to time in the areas of crowd and animal control, the security of locked doors, the integrity of railings, the removal of tree stumps and entrenched vehicles, the maneuvering of furniture, and athletic pursuits such as US football or wrestling. Depending on the scenario, human push capability involves strength, weight, weight distribution, push angle, footwear/floor friction, and the friction between the upper body and the pushed object. Simple models are used to establish the relationships among these factors.
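One way to make such a model concrete is the friction-limited case sketched below (a minimal example of my own, not the authors' models): a person of weight W pushing at an angle theta above the horizontal can deliver at most F*cos(theta) horizontally, subject to the traction condition F*cos(theta) <= mu*(W + F*sin(theta)) at the feet. Strength limits, weight distribution, and upper-body friction are ignored here.

```python
import math

def max_horizontal_push(weight_n, mu, push_angle_deg):
    """Friction-limited horizontal push (N). push_angle_deg is the angle of the
    applied force above the horizontal: a positive angle presses the pusher's
    feet harder into the floor (by reaction) and raises the traction limit."""
    theta = math.radians(push_angle_deg)
    denom = math.cos(theta) - mu * math.sin(theta)
    if denom <= 0:
        return float("inf")          # traction is no longer the limiting factor
    f_total = mu * weight_n / denom  # largest total push force before the feet slip
    return f_total * math.cos(theta)

# Assumed numbers: an 800 N person, rubber sole on dry concrete (mu ~ 0.8),
# pushing 20 degrees above horizontal -> roughly 900 N of horizontal push.
# print(max_horizontal_push(800, 0.8, 20))
```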
Toward XML Representation of NSS Simulation Scenario for Mission Scenario Exchange Capability
2003-09-01
The Naval Simulation System (NSS) is a powerful computer program developed by the Navy to provide a force-on-force modeling and simulation
Satellite Survivability Module
NASA Astrophysics Data System (ADS)
Buehler, P.; Smith, J.
The Satellite Survivability Module (SSM) is an end-to-end, physics-based, performance prediction model for directed energy engagement of orbiting spacecraft. SSM was created as an add-on module for the Satellite Tool Kit (STK). Two engagement types are currently supported: laser engagement of the focal plane array of an imaging spacecraft; and Radio Frequency (RF) engagement of spacecraft components. This paper will focus on the laser engagement scenario, the process by which it is defined, and how we use this tool to support a future laser threat detection system experiment. For a laser engagement, the user creates a spacecraft, defines its optical system, adds any protection techniques used by the optical system, introduces a laser threat, and then defines the atmosphere through which the laser will pass. SSM models the laser engagement and its impact on the spacecraft's optical system using four impact levels: degradation, saturation, damage, and destruction. Protection techniques, if employed, will mitigate engagement effects. SSM currently supports two laser protection techniques. SSM allows the user to create and implement a variety of "what if" scenarios. Satellites can be placed in a variety of orbits. Threats can be placed anywhere on the Earth or, for version 2.0, on other satellites. Satellites and threats can be mixed and matched to examine possibilities. Protection techniques for a particular spacecraft can be turned on or off individually; and can be arranged in any order to simulate more complicated protection schemes. Results can be displayed as 2-D or 3-D visualizations, or as textual reports. A new report feature available in version 2.0 will allow laser effects data to be displayed dynamically during scenario execution. In order to test SSM capabilities, the Ball team used SSM to model several engagement scenarios for our future laser threat detection system experiment. Actual test sites, along with actual laser, optics, and detector characteristics were entered into SSM to determine what effects we can expect to see, and to what extent. We concluded that SSM results are accurate when compared to actual field test results. The work is currently funded by the Air Force Research Laboratory, Space Vehicles directorate at Kirtland AFB, New Mexico, under contract number FA9453-06-C-0371.
NASA Technical Reports Server (NTRS)
Allgood, Daniel C.
2016-01-01
The objective of the presented work was to develop validated computational fluid dynamics (CFD) based methodologies for predicting propellant detonations and their associated blast environments. Applications of interest were scenarios relevant to rocket propulsion test and launch facilities. All model development was conducted within the framework of the Loci/CHEM CFD tool due to its reliability and robustness in predicting high-speed combusting flow-fields associated with rocket engines and plumes. During the course of the project, verification and validation studies were completed for hydrogen-fueled detonation phenomena such as shock-induced combustion, confined detonation waves, vapor cloud explosions, and deflagration-to-detonation transition (DDT) processes. The DDT validation cases included predicting flame acceleration mechanisms associated with turbulent flame-jets and flow-obstacles. Excellent comparison between test data and model predictions were observed. The proposed CFD methodology was then successfully applied to model a detonation event that occurred during liquid oxygen/gaseous hydrogen rocket diffuser testing at NASA Stennis Space Center.
Flight test of a passive millimeter-wave imaging system
NASA Astrophysics Data System (ADS)
Martin, Christopher A.; Manning, Will; Kolinko, Vladimir G.; Hall, Max
2005-05-01
A real-time passive millimeter-wave imaging system with a wide-field of view and 3K temperature sensitivity is described. The system was flown on a UH-1H helicopter in a flight test conducted by the U.S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD). We collected approximately eight hours of data over the course of the two-week flight test. Flight data was collected in horizontal and vertical polarizations at look down angles from 0 to 40 degrees. Speeds varied from 0 to 90 knots and altitudes varied from 0' to 1000'. Targets imaged include roads, freeways, railroads, houses, industrial buildings, power plants, people, streams, rivers, bridges, cars, trucks, trains, boats, planes, runways, treelines, shorelines, and the horizon. The imaging system withstood vibration and temperature variations, but experienced some RF interference. The flight test demonstrated the system's capabilities as an airborne navigation and surveillance aid. It also performed in a personnel recovery scenario.
Mars Sample Return and Flight Test of a Small Bimodal Nuclear Rocket and ISRU Plant
NASA Technical Reports Server (NTRS)
George, Jeffrey A.; Wolinsky, Jason J.; Bilyeu, Michael B.; Scott, John H.
2014-01-01
A combined Nuclear Thermal Rocket (NTR) flight test and Mars Sample Return mission (MSR) is explored as a means of "jump-starting" NTR development. Development of a small-scale engine with relevant fuel and performance could more affordably and quickly "pathfind" the way to larger scale engines. A flight test with subsequent inflight postirradiation evaluation may also be more affordable and expedient compared to ground testing and associated facilities and approvals. Mission trades and a reference scenario based upon a single expendable launch vehicle (ELV) are discussed. A novel "single stack" spacecraft/lander/ascent vehicle concept is described configured around a "top-mounted" downward firing NTR, reusable common tank, and "bottom-mount" bus, payload and landing gear. Requirements for a hypothetical NTR engine are described that would be capable of direct thermal propulsion with either hydrogen or methane propellant, and modest electrical power generation during cruise and Mars surface insitu resource utilization (ISRU) propellant production.
NASA Technical Reports Server (NTRS)
Liou, J. C.
2012-01-01
Presentation outline: (1) The NASA Orbital Debris (OD) Engineering Model -- a mathematical model capable of predicting OD impact risks for the ISS and other critical space assets; (2) the NASA OD Evolutionary Model -- a physical model capable of predicting the future debris environment based on user-specified scenarios; (3) the NASA Standard Satellite Breakup Model -- a model describing the outcome of a satellite breakup (explosion or collision).
Modeling the Cloud to Enhance Capabilities for Crises and Catastrophe Management
2016-11-16
... order for cloud computing infrastructures to be successfully deployed in real world scenarios as tools for crisis and catastrophe management, where ... Statement of the Problem Studied: As cloud computing becomes the dominant computational infrastructure [1] and cloud technologies make a transition to hosting ... 1. Formulate rigorous mathematical models representing technological capabilities and resources in cloud computing for performance modeling and
Tan, Chee-Heng; Teh, Ying-Wah
2013-08-01
The main obstacles to mass adoption of cloud computing for database operations in healthcare organizations are data security and privacy issues. In this paper, it is shown that IT services, particularly hardware performance evaluation of virtual machines, can be accomplished effectively without IT personnel gaining access to actual data for diagnostic and remediation purposes. The proposed mechanisms utilize hypothetical data from the TPC-H benchmark to achieve two objectives. First, the underlying hardware performance and consistency are monitored via a control system constructed from TPC-H queries. Second, a mechanism to construct stress-testing scenarios on the host is envisaged, using a single TPC-H query or a combination of queries, so that the resource threshold point can be verified, i.e., whether the virtual machine is still capable of serving critical transactions at this constraining juncture. This threshold point uses server run queue size as an input parameter and serves two purposes. First, it provides the boundary threshold to the control system, so that periodic learning on the synthetic data sets for performance evaluation does not reach the host's constraint level. Second, when the host undergoes a hardware change, stress-testing scenarios are simulated on the host by loading it up to this resource threshold level, for subsequent response-time verification against real and critical transactions.
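A minimal sketch of the threshold-driven stress test described above is shown below: synthetic query streams are added one at a time while a run-queue proxy is monitored, and loading stops once the assumed threshold is reached. The psql invocation, database name, threshold value, and use of the 1-minute load average as a run-queue proxy are all illustrative assumptions, not details from the paper.

```python
import os
import subprocess
import time

RUNQ_THRESHOLD = 8            # assumed host run-queue limit for this example

def start_tpch_stream(query_file):
    """Launch one synthetic TPC-H query stream (placeholder psql call)."""
    return subprocess.Popen(["psql", "-d", "tpch", "-f", query_file],
                            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)

def stress_until_threshold(query_file, max_streams=64, settle_s=10):
    """Add streams until the load average (run-queue proxy) reaches the threshold."""
    streams = []
    level = 0.0
    for n in range(1, max_streams + 1):
        streams.append(start_tpch_stream(query_file))
        time.sleep(settle_s)                 # let the load settle
        level = os.getloadavg()[0]           # 1-minute load average (Unix only)
        print(f"streams={n} load={level:.1f}")
        if level >= RUNQ_THRESHOLD:
            break
    for p in streams:                        # stop the synthetic workload
        p.terminate()
    return n, level
```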
Stereoscopy in cinematographic synthetic imagery
NASA Astrophysics Data System (ADS)
Eisenmann, Jonathan; Parent, Rick
2009-02-01
In this paper we present experiments and results pertaining to the perception of depth in stereoscopic viewing of synthetic imagery. In computer animation, typical synthetic imagery is highly textured and uses stylized illumination of abstracted material models by abstracted light source models. While there have been numerous studies concerning stereoscopic capabilities, conventions for staging and cinematography in stereoscopic movies have not yet been well-established. Our long-term goal is to measure the effectiveness of various cinematography techniques on the human visual system in a theatrical viewing environment. We would like to identify the elements of stereoscopic cinema that are important in terms of enhancing the viewer's understanding of a scene as well as providing guidelines for the cinematographer relating to storytelling. In these experiments we isolated stereoscopic effects by eliminating as many other visual cues as is reasonable. In particular, we aim to empirically determine what types of movement in synthetic imagery affect the perceptual depth sensing capabilities of our viewers. Using synthetic imagery, we created several viewing scenarios in which the viewer is asked to locate a target object's depth in a simple environment. The scenarios were specifically designed to compare the effectiveness of stereo viewing, camera movement, and object motion in aiding depth perception. Data were collected showing the error between the choice of the user and the actual depth value, and patterns were identified that relate the test variables to the viewer's perceptual depth accuracy in our theatrical viewing environment.
Combined sewer overflow control with LID based on SWMM: an example in Shanghai, China.
Liao, Z L; Zhang, G Q; Wu, Z H; He, Y; Chen, H
2015-01-01
Although low impact development (LID) has been widely applied across developed countries to mitigate the negative impacts of combined sewer overflows (CSOs) on the urban hydrological environment, it has not yet been widely used in developing countries. In this paper, a typical combined sewer system in an urbanized area of Shanghai, China was used to demonstrate how to design and choose CSO control solutions with LID using a stormwater management model. We constructed and simulated three types of CSO control scenarios. Our findings support the notion that LID measures perform favorably for CSO reduction. Nevertheless, the green scenarios, which are composed entirely of LID measures, fail to achieve the maximal effectiveness in CSO reduction, whereas the gray-green scenarios (LID measures combined with gray measures) achieve it. The unit cost-effectiveness of each type of scenario ranks as: green scenario > gray-green scenario > gray scenario. In practice, because a storage tank has already been built in the study catchment, a purely green scenario is not attainable here. Through comprehensive evaluation and comparison, the gray-green scenario F, which combines a storage tank, bio-retention, and rain barrels, is considered the most feasible option in this case.
The Agriculture Model Intercomparison and Improvement Project (AgMIP) (Invited)
NASA Astrophysics Data System (ADS)
Rosenzweig, C.
2010-12-01
The Agricultural Model Intercomparison and Improvement Project (AgMIP) is a distributed climate-scenario simulation exercise for historical model intercomparison and future climate change conditions with participation of multiple crop and world agricultural trade modeling groups around the world. The goals of AgMIP are to improve substantially the characterization of risk of hunger and world food security due to climate change and to enhance adaptation capacity in both developing and developed countries. Historical period results will spur model improvement and interaction among major modeling groups, while future period results will lead directly to tests of adaptation and mitigation strategies across a range of scales. AgMIP will consist of a multi-scale impact assessment utilizing the latest methods for climate and agricultural scenario generation. Scenarios and modeling protocols will be distributed on the web, and multi-model results will be collated and analyzed to ensure the widest possible coverage of agricultural crops and regions. AgMIP will place regional changes in agricultural production in a global context that reflects new trading opportunities, imbalances, and shortages in world markets resulting from climate change and other driving forces for food supply. Such projections are essential inputs from the Vulnerability, Impacts, and Adaptation (VIA) research community to the Intergovernmental Panel on Climate Change Fifth Assessment (AR5), now underway, and the UN Framework Convention on Climate Change. They will set the context for local-scale vulnerability and adaptation studies, supply test scenarios for national-scale development of trade policy instruments, provide critical information on changing supply and demand for water resources, and elucidate interactive effects of climate change and land use change. AgMIP will not only provide crucially-needed new global estimates of how climate change will affect food supply and hunger in the agricultural regions of the world, but it will also build the capabilities of developing countries to estimate how climate change will affect their supply and demand for food.
Autonomous Deep-Space Optical Navigation Project
NASA Technical Reports Server (NTRS)
D'Souza, Christopher
2014-01-01
This project will advance the autonomous deep-space navigation capability applied to the Autonomous Rendezvous and Docking (AR&D) Guidance, Navigation and Control (GNC) system by testing it on hardware, particularly on a flight processor, with a goal of limited testing in the Integrated Power, Avionics and Software (IPAS) facility with the ARCM (Asteroid Retrieval Crewed Mission) DRO (Distant Retrograde Orbit) AR&D scenario. The technology to be harnessed is called 'optical flow', also known as 'visual odometry'. It is being matured in automotive and SLAM (Simultaneous Localization and Mapping) applications but has yet to be applied to spacecraft navigation. In light of the tremendous potential of this technique, we believe that NASA needs to design an optical navigation architecture that will use it. The technique is flexible enough to be applicable to navigating around planetary bodies, such as asteroids.
Cagiltay, Nergiz Ercil; Ozcelik, Erol; Sengul, Gokhan; Berker, Mustafa
2017-11-01
In neurosurgery education, there is a paradigm shift from time-based training to a criterion-based model, for which competency and assessment become very critical. Even though virtual reality simulators provide alternatives for improving education and assessment in neurosurgery programs and allow for several objective assessment measures, there are not many tools for assessing the overall performance of trainees. This study aims to develop and validate a tool for assessing the overall performance of participants in a simulation-based endoneurosurgery training environment. A training program was developed at two levels, endoscopy practice and beginning surgical practice, based on four scenarios. Then, three experiments were conducted with three corresponding groups of participants (Experiment 1: 45 participants (32 beginners, 13 experienced); Experiment 2: 53 (40 beginners, 13 experienced); Experiment 3: 26 (14 novices, 12 intermediate)). The results were analyzed to identify the common factors among the performance measurements of these experiments. Then, a factor capable of assessing the overall skill levels of surgical residents was extracted. Afterwards, the proposed measure was tested for its ability to estimate the experience levels of the participants. Finally, the level of realism of these educational scenarios was assessed. The factor formed by time, distance, and accuracy on simulated tasks provided an overall performance indicator. The prediction accuracy was much higher for beginners than for experienced surgeons in Experiments 1 and 2. When the non-dominant hand is used in a surgical procedure-based scenario, skill levels of surgeons can be better predicted. The results indicate that the scenarios in Experiments 1 and 2 can be used as an assessment tool for beginners, and that scenario 2 in Experiment 3 can be used as an assessment tool for intermediate and novice levels. It can be concluded that forming the balance between perceived action capacities and skills is critical for better designing and developing skill assessment surgical simulation tools.
Bayesian inference for heterogeneous caprock permeability based on above zone pressure monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Namhata, Argha; Small, Mitchell J.; Dilmore, Rober
The presence of faults/fractures or highly permeable zones in the primary sealing caprock of a CO2 storage reservoir can result in leakage of CO2. Monitoring of leakage requires the capability to detect and resolve the onset, location, and volume of leakage in a systematic and timely manner. Pressure-based monitoring possesses such capabilities. This study demonstrates a basis for monitoring network design based on the characterization of CO2 leakage scenarios through an assessment of the integrity and permeability of the caprock inferred from above zone pressure measurements. Four representative heterogeneous fractured seal types are characterized to demonstrate seal permeability ranging from highly permeable to impermeable. Based on Bayesian classification theory, the probability of each fractured caprock scenario given above zone pressure measurements with measurement error is inferred. The sensitivity to injection rate and caprock thickness is also evaluated and the probability of proper classification is calculated. The time required to distinguish between above zone pressure outcomes and the associated leakage scenarios is also computed.
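The Bayesian classification step can be illustrated with a small sketch under assumed numbers: each seal type predicts an above-zone pressure change, the observation carries Gaussian measurement error, and Bayes' rule yields the posterior probability of each scenario. The scenario labels, predicted values, prior, and error level below are hypothetical.

```python
# Minimal sketch of Bayesian scenario classification with Gaussian
# measurement error; numbers are illustrative, not from the study.
import numpy as np
from scipy.stats import norm

# Hypothetical predicted pressure changes (MPa) for four seal types
predicted = {"impermeable": 0.00, "low_perm": 0.05, "moderate": 0.20, "fractured": 0.60}
prior = {k: 0.25 for k in predicted}   # uniform prior over scenarios
sigma = 0.05                           # assumed measurement error (MPa)

def posterior(observed_dp):
    """Posterior probability of each scenario given an observed pressure change."""
    like = {k: norm.pdf(observed_dp, loc=v, scale=sigma) for k, v in predicted.items()}
    z = sum(like[k] * prior[k] for k in like)
    return {k: like[k] * prior[k] / z for k in like}

print(posterior(0.18))   # most of the mass should fall on "moderate"
```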
Design and development of a bio-inspired, under-actuated soft gripper.
Hassan, Taimoor; Manti, Mariangela; Passetti, Giovanni; d'Elia, Nicolò; Cianchetti, Matteo; Laschi, Cecilia
2015-08-01
The development of robotic devices able to perform manipulation tasks mimicking the human hand has been investigated on a large scale. This work stands in the challenging scenario where soft materials are combined with bio-inspired design in order to develop soft grippers with improved grasping and holding capabilities. We present a low-cost, under-actuated and adaptable soft gripper, highlighting its design and manufacturing process. In particular, a critical analysis is made among three versions of the gripper with the same design and actuation mechanism, but based on different materials. A novel actuation principle has been implemented in both cases, in order to reduce the encumbrance of the entire system and improve its aesthetics. Grasping and holding capabilities have been tested for each device, with target objects varying in shape, size and material. Results highlight the synergy between the geometry and the intrinsic properties of the soft material, showing the way to novel design principles for soft grippers.
Network issues for large mass storage requirements
NASA Technical Reports Server (NTRS)
Perdue, James
1992-01-01
File servers and supercomputing environments need high performance networks to balance the I/O requirements seen in today's demanding computing scenarios. UltraNet is one solution which permits both high aggregate transfer rates and high task-to-task transfer rates, as demonstrated in actual tests. UltraNet provides this capability as both a Server-to-Server and Server-to-Client access network, giving the supercomputing center the following advantages: highest performance Transport Level connections (to 40 MBytes/sec effective rates); matches the throughput of the emerging high performance disk technologies, such as RAID, parallel head transfer devices and software striping; supports standard network and file system applications using a socket-based application program interface such as FTP, rcp, rdump, etc.; supports access to the Network File System (NFS) and large aggregate bandwidth for large NFS usage; provides access to a distributed, hierarchical data server capability using the DISCOS UniTree product; and supports file server solutions available from multiple vendors, including Cray, Convex, Alliant, FPS, IBM, and others.
A wetting and drying scheme for ROMS
Warner, John C.; Defne, Zafer; Haas, Kevin; Arango, Hernan G.
2013-01-01
The processes of wetting and drying have many important physical and biological impacts on shallow water systems. Inundation and dewatering effects on coastal mud flats and beaches occur on various time scales ranging from storm surge, periodic rise and fall of the tide, to infragravity wave motions. To correctly simulate these physical processes with a numerical model requires the capability of the computational cells to become inundated and dewatered. In this paper, we describe a method for wetting and drying based on an approach consistent with a cell-face blocking algorithm. The method allows water to always flow into any cell, but prevents outflow from a cell when the total depth in that cell is less than a user defined critical value. We describe the method, the implementation into the three-dimensional Regional Oceanographic Modeling System (ROMS), and exhibit the new capability under three scenarios: an analytical expression for shallow water flows, a dam break test case, and a realistic application to part of a wetland area along the Georgia Coast, USA.
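A one-dimensional sketch of the cell-face blocking rule, under assumed variable names (this is not the ROMS code): flow into a cell is always allowed, while outflow from a donor cell is zeroed when that cell's total depth falls below the critical value.

```python
# Illustrative 1-D sketch of cell-face blocking for wetting and drying.
import numpy as np

D_CRIT = 0.1  # user-defined critical depth [m]

def blocked_face_flux(depth, raw_flux):
    """Zero out any face flux whose upstream (donor) cell is too shallow.

    depth    : total water depth per cell, shape (n,)
    raw_flux : unblocked flux at interior faces, shape (n-1,); positive flux
               moves water from cell i to cell i+1.
    """
    flux = raw_flux.copy()
    donor_left = flux > 0             # donor is cell i
    donor_right = flux < 0            # donor is cell i+1
    flux[donor_left & (depth[:-1] < D_CRIT)] = 0.0
    flux[donor_right & (depth[1:] < D_CRIT)] = 0.0
    return flux

depth = np.array([0.05, 0.50, 0.30])
raw = np.array([0.2, -0.1])           # first face would drain the nearly dry cell
print(blocked_face_flux(depth, raw))  # -> [0.0, -0.1]
```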
Radiation-Hardened Circuitry Using Mask-Programmable Analog Arrays. Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Britton, Jr., Charles L.; Ericson, Milton Nance; Bobrek, Miljko
As the recent accident at Fukushima Daiichi so vividly demonstrated, telerobotic technologies capable of withstanding high radiation environments need to be readily available to enable operations, repair, and recovery under severe accident scenarios where human entry is extremely dangerous or not possible. Telerobotic technologies that enable remote operation in high dose rate environments have undergone revolutionary improvement over the past few decades. However, much of this technology cannot be employed in nuclear power environments due to the radiation sensitivity of the electronics and the organic insulator materials currently in use. This is the final report of the activities involving the NEET 2 project Radiation Hardened Circuitry Using Mask-Programmable Analog Arrays. We present a detailed functional block diagram of the proposed data acquisition system, the thought process leading to technical decisions, the implemented system, and the tested results from the systems. This system will be capable of monitoring at least three parameters of importance to nuclear reactor monitoring: temperature, radiation level, and pressure.
A setup for Gaia-DR1: the star formation history of our thin disc environment
NASA Astrophysics Data System (ADS)
Miret-Roig, N.; Romero-Gómez, M.; Figueras, F.; Mor, R.
2017-03-01
The first Gaia Data Release (Gaia-DR1, 14 September 2016) primes the pump and paves the way for a new golden age of galactic astronomy. Gaia-DR1 will provide new parallaxes and proper motions for about two million well-behaved Tycho-2 stars in the solar neighborhood. This TGAS (Tycho-Gaia Astrometric Solution) catalogue is being obtained through the combination of the Gaia observations with the positions of the stars obtained by Hipparcos (ESA 1997), when available, or Tycho-2. The aim of the work presented here has been to evaluate the capabilities of Gaia and future ground-based spectroscopic surveys to derive the dynamical age and place of birth of the Young Local Associations (YLAs). Test particle simulations in realistic galactic potentials and different scenarios for the accuracy of astrometric and spectroscopic data allow us to quantify our future capabilities to trace back in time the star formation history of our thin disc environment.
Anklam, Charles; Kirby, Adam; Sharevski, Filipo; Dietz, J Eric
2015-01-01
Active shooting violence at confined settings, such as educational institutions, poses serious security concerns to public safety. In studying the effects of active shooter scenarios, the common denominator associated with all events, regardless of reason/intent for shooter motives, or type of weapons used, was the location chosen and time expended between the beginning of the event and its culmination. This in turn directly correlates to number of casualties incurred in any given event. The longer the event protracts, the more casualties are incurred until law enforcement or another barrier can react and culminate the situation. Using AnyLogic technology, devise modeling scenarios to test multiple hypotheses against free-agent modeling simulation to determine the best method to reduce casualties associated with active shooter scenarios. Test four possible scenarios of responding to active shooter in a public school setting using agent-based computer modeling techniques-scenario 1: basic scenario where no access control or any type of security is used within the school; scenario 2, scenario assumes that concealed carry individual(s) (5-10 percent of the work force) are present in the school; scenario 3, scenario assumes that the school has assigned resource officer; scenario 4, scenario assumes that the school has assigned resource officer and concealed carry individual(s) (5-10 percent) present in the school. Statistical data from modeling scenarios indicating which tested hypothesis resulted in fewer casualties and quicker culmination of event. The use of AnyLogic proved the initial hypothesis that a decrease on response time to an active shooter scenario directly reduced victim casualties. Modeling tests show statistically significant fewer casualties in scenarios where on scene armed responders such as resource officers and concealed carry personnel were present.
Acadia National Park Climate Change Scenario Planning Workshop summary
Star, Jonathan; Fisichelli, Nicholas; Bryan, Alexander; Babson, Amanda; Cole-Will, Rebecca; Miller-Rushing, Abraham J.
2016-01-01
This report summarizes outcomes from a two-day scenario planning workshop for Acadia National Park, Maine (ACAD). The primary objective of the workshop was to help ACAD senior leadership make management and planning decisions based on up-to-date climate science and assessments of future uncertainty. The workshop was also designed as a training program, helping build participants' capabilities to develop and use scenarios. The details of the workshop are given in later sections. The climate scenarios presented here are based on published global climate model output. The scenario implications for resources and management decisions are based on expert knowledge distilled through scientist-manager interaction during workgroup break-out sessions at the workshop. Thus, the descriptions below are from these small-group discussions in a workshop setting and should not be taken as vetted research statements of responses to the climate scenarios, but rather as insights and examinations of possible futures (Martin et al. 2011, McBride et al. 2012).
Data near processing support for climate data analysis
NASA Astrophysics Data System (ADS)
Kindermann, Stephan; Ehbrecht, Carsten; Hempelmann, Nils
2016-04-01
Climate data repositories grow in size exponentially. Scalable data-near processing capabilities are required to meet future data analysis requirements and to replace current "download and process at home" workflows. On the one hand, these processing capabilities should be accessible via standardized interfaces (e.g. OGC WPS); on the other hand, a large variety of processing tools, toolboxes and deployment alternatives have to be supported and maintained at the data/processing center. We present a community approach of a modular and flexible system supporting the development, deployment and maintenance of OGC WPS based web processing services. This approach is organized in an open source github project (called "bird-house") supporting individual processing services ("birds", e.g. climate index calculations, model data ensemble calculations), which rely on basic common infrastructural components (e.g. installation and deployment recipes, analysis code dependency management). To support easy deployment at data centers as well as at home institutes (e.g. for testing and development), the system supports the management of the often very complex package dependency chains of climate data analysis packages as well as docker based packaging and installation. We present a concrete deployment scenario at the German Climate Computing Center (DKRZ). DKRZ hosts a multi-petabyte climate archive, which is integrated into the European ENES and worldwide ESGF data infrastructures, and also operates an HPC center supporting (model) data production and data analysis. The deployment scenario also includes openstack based data cloud services to support data import and data distribution for bird-house based WPS web processing services. Current challenges for inter-institutional deployments of web processing services supporting the European and international climate modeling community as well as the climate impact community are highlighted. Aspects supporting future WPS based cross-community usage scenarios, such as data reuse and data provenance, are also reflected upon.
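For orientation, a hedged client-side sketch of how such a WPS service is typically queried: a standard WPS 1.0.0 GetCapabilities request lists the processes a birdhouse-style deployment offers. The endpoint URL is a placeholder, not an actual DKRZ service address.

```python
# Discover the processes offered by a (hypothetical) OGC WPS endpoint.
import requests
import xml.etree.ElementTree as ET

WPS_URL = "https://example.org/wps"  # placeholder birdhouse deployment

resp = requests.get(WPS_URL, params={
    "service": "WPS",
    "version": "1.0.0",
    "request": "GetCapabilities",
}, timeout=30)
resp.raise_for_status()

root = ET.fromstring(resp.content)
ns = {"ows": "http://www.opengis.net/ows/1.1",
      "wps": "http://www.opengis.net/wps/1.0.0"}
for proc in root.findall(".//wps:Process", ns):
    ident = proc.find("ows:Identifier", ns)
    print(ident.text if ident is not None else "?")
```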
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gschwind, Benoit, E-mail: benoit.gschwind@mines-paristech.fr; Lefevre, Mireille, E-mail: mireille.lefevre@mines-paristech.fr; Blanc, Isabelle, E-mail: isabelle.blanc@mines-paristech.fr
This article proposes a new method to assess the health impact of populations exposed to fine particles (PM2.5) during their whole lifetime, which is suitable for comparative analysis of energy scenarios. The method takes into account the variation of particle concentrations over time as well as the evolution of population cohorts. Its capabilities are demonstrated for two pathways of European energy system development up to 2050: the Baseline (BL) and the Low Carbon, Maximum Renewable Power (LC-MRP). These pathways were combined with three sets of assumptions about emission control measures: Current Legislation (CLE), Fixed Emission Factors (FEFs), and the Maximum Technically Feasible Reductions (MTFRs). Analysis was carried out for 45 European countries. Average PM2.5 concentration over Europe in the LC-MRP/CLE scenario is reduced by 58% compared with the BL/FEF case. Health impacts (expressed in days of loss of life expectancy) decrease by 21%. For the LC-MRP/MTFR scenario the average PM2.5 concentration is reduced by 85% and the health impact by 34%. The methodology was developed within the framework of the EU's FP7 EnerGEO project and was implemented in the Platform of Integrated Assessment (PIA). The Platform enables performing health impact assessments for various energy scenarios. - Highlights: • A new method to assess health impact of PM2.5 for energy scenarios is proposed. • An algorithm to compute Loss of Life Expectancy attributable to exposure to PM2.5 is depicted. • Its capabilities are demonstrated for two pathways of European energy system development up to 2050. • Integrating the temporal evolution of PM2.5 is of great interest for assessing the potential impacts of energy scenarios.
A Context-Aware Model to Provide Positioning in Disaster Relief Scenarios
Moreno, Daniel; Ochoa, Sergio F.; Meseguer, Roc
2015-01-01
The effectiveness of the work performed during disaster relief efforts is highly dependent on the coordination of activities conducted by the first responders deployed in the affected area. Such coordination, in turn, depends on an appropriate management of geo-referenced information. Therefore, enabling first responders to count on positioning capabilities during these activities is vital to increase the effectiveness of the response process. The positioning methods used in this scenario must assume a lack of infrastructure-based communication and electrical energy, which usually characterizes affected areas. Although positioning systems such as the Global Positioning System (GPS) have been shown to be useful, we cannot assume that all devices deployed in the area (or most of them) will have positioning capabilities by themselves. Typically, many first responders carry devices that are not capable of performing positioning on their own, but that require such a service. In order to help increase the positioning capability of first responders in disaster-affected areas, this paper presents a context-aware positioning model that allows mobile devices to estimate their position based on information gathered from their surroundings. The performance of the proposed model was evaluated using simulations, and the obtained results show that mobile devices without positioning capabilities were able to use the model to estimate their position. Moreover, the accuracy of the positioning model has been shown to be suitable for conducting most first response activities. PMID:26437406
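As an illustration of the general idea only (not a reproduction of the paper's model), a device lacking GPS could estimate its position as a signal-strength-weighted centroid of positions reported by nearby GPS-capable peers; the weighting scheme below is an assumption.

```python
# Illustrative sketch: weighted-centroid position estimate from neighbours.
def estimate_position(neighbours):
    """neighbours: list of (x, y, rssi_dbm) tuples from GPS-capable peers."""
    if not neighbours:
        return None
    # Convert RSSI to a positive weight: stronger signal -> larger weight.
    weights = [10 ** (rssi / 20.0) for _, _, rssi in neighbours]
    total = sum(weights)
    x = sum(w * n[0] for w, n in zip(weights, neighbours)) / total
    y = sum(w * n[1] for w, n in zip(weights, neighbours)) / total
    return x, y

print(estimate_position([(0.0, 0.0, -40), (10.0, 0.0, -60), (0.0, 10.0, -70)]))
```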
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gratia, Pierre; Hu, Wayne; Enrico Fermi Institute and Kavli Institute for Cosmological Physics, University of Chicago,South Ellis Avenue, Chicago, IL 60637
Attempts to modify gravity in the infrared typically require a screening mechanism to ensure consistency with local tests of gravity. These screening mechanisms fit into three broad classes; we investigate theories which are capable of exhibiting more than one type of screening. Specifically, we focus on a simple model which exhibits both Vainshtein and kinetic screening. We point out that due to the two characteristic length scales in the problem, the type of screening that dominates depends on the mass of the sourcing object, allowing for different phenomenology at different scales. We consider embedding this double screening phenomenology in a broader cosmological scenario and show that the simplest examples that exhibit double screening are radiatively stable.
Are Caribbean reef sharks, Carcharhinus perezi, able to perceive human body orientation?
Ritter, Erich K; Amin, Raid
2014-05-01
The present study examines the potential capability of Caribbean reef sharks to perceive human body orientation, as well as discussing the sharks' swimming patterns in a person's vicinity. A standardized video method was used to record the scenario of single SCUBA divers kneeling in the sand and the approach patterns of sharks, combined with a control group of two divers kneeling back-to-back. When approaching a single test-subject, significantly more sharks preferred to swim outside the person's field of vision. The results suggest that these sharks are able to identify human body orientation, but the mechanisms used and factors affecting nearest distance of approach remain unclear.
Wright, Adam; Sittig, Dean F
2015-01-01
Objective Clinical decision support (CDS) is essential for delivery of high-quality, cost-effective, and safe healthcare. The authors sought to evaluate the CDS capabilities across electronic health record (EHR) systems. Methods We evaluated the CDS implementation capabilities of 8 Office of the National Coordinator for Health Information Technology Authorized Certification Body (ONC-ACB)-certified EHRs. Within each EHR, the authors attempted to implement 3 user-defined rules that utilized the various data and logic elements expected of typical EHRs and that represented clinically important evidenced-based care. The rules were: 1) if a patient has amiodarone on his or her active medication list and does not have a thyroid-stimulating hormone (TSH) result recorded in the last 12 months, suggest ordering a TSH; 2) if a patient has a hemoglobin A1c result >7% and does not have diabetes on his or her problem list, suggest adding diabetes to the problem list; and 3) if a patient has coronary artery disease on his or her problem list and does not have aspirin on the active medication list, suggest ordering aspirin. Results Most evaluated EHRs lacked some CDS capabilities; 5 EHRs were able to implement all 3 rules, and the remaining 3 EHRs were unable to implement any of the rules. One of these did not allow users to customize CDS rules at all. The most frequently found shortcomings included the inability to use laboratory test results in rules, limit rules by time, use advanced Boolean logic, perform actions from the alert interface, and adequately test rules. Conclusion Significant improvements in the EHR certification and implementation procedures are necessary. PMID:26104739
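Rule 1 can be expressed as a short piece of executable logic; the patient record structure and field names below are hypothetical, since each certified EHR exposes such data through its own rule editor or API.

```python
# Minimal sketch of rule 1 from the study as executable logic (hypothetical
# data structure): fire if amiodarone is on the active medication list and
# no TSH result has been recorded in the last 12 months.
from datetime import date, timedelta

def needs_tsh_alert(patient, today=None):
    today = today or date.today()
    on_amiodarone = any(m.lower().startswith("amiodarone")
                        for m in patient["active_medications"])
    recent_tsh = any(lab["name"] == "TSH" and
                     lab["date"] >= today - timedelta(days=365)
                     for lab in patient["lab_results"])
    return on_amiodarone and not recent_tsh

patient = {
    "active_medications": ["Amiodarone 200 mg"],
    "lab_results": [{"name": "TSH", "date": date(2013, 1, 15)}],
}
print(needs_tsh_alert(patient, today=date(2014, 6, 1)))  # -> True, suggest a TSH
```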
Synthetic circuit designs for earth terraformation.
Solé, Ricard V; Montañez, Raúl; Duran-Nebreda, Salva
2015-07-18
Mounting evidence indicates that our planet might experience runaway effects associated with rising temperatures and ecosystem overexploitation, leading to catastrophic shifts on short time scales. Remediation scenarios capable of counterbalancing these effects involve geoengineering, sustainable practices and carbon sequestration, among others. None of these scenarios seems powerful enough to achieve the desired restoration of safe boundaries. We hypothesize that synthetic organisms with the appropriate engineering design could be used to safely prevent declines in some stressed ecosystems and help improve carbon sequestration. Such schemes would include engineering mutualistic dependencies that prevent undesired evolutionary processes. We hypothesize that some particular design principles introduce inescapable constraints on the engineered organisms that act as effective firewalls. Testing these designed organisms can be achieved using controlled bioreactor models, with single and heterogeneous populations, and accurate computational models spanning different scales (from genetic constructs and metabolic pathways to population dynamics). Our hypothesis points towards future anthropogenic action that would effectively act as a terraforming process. It also implies a major challenge for existing biosafety policies, since we suggest the release of modified organisms as a potentially necessary strategy for success.
Definition of technology development missions for early space station satellite servicing, volume 2
NASA Technical Reports Server (NTRS)
1983-01-01
The results of all aspects of the early space station satellite servicing study tasks are presented. These results include identification of servicing tasks (and locations), identification of servicing mission system and detailed objectives, functional/operational requirements analyses of multiple servicing scenarios, assessment of critical servicing technology capabilities and development of an evolutionary capability plan, design and validation of selected servicing technology development missions (TDMs), identification of space station satellite servicing accommodation needs, and the cost and schedule implications of acquiring both required technology capability development and conducting the selected TDMs.
Rapid impact testing for quantitative assessment of large populations of bridges
NASA Astrophysics Data System (ADS)
Zhou, Yun; Prader, John; DeVitis, John; Deal, Adrienne; Zhang, Jian; Moon, Franklin; Aktan, A. Emin
2011-04-01
Although the widely acknowledged shortcomings of visual inspection have fueled significant advances in the areas of non-destructive evaluation and structural health monitoring (SHM) over the last several decades, the actual practice of bridge assessment has remained largely unchanged. The authors believe the lack of adoption, especially of SHM technologies, is related to the 'single structure' scenarios that drive most research. To overcome this, the authors have developed a concept for a rapid single-input, multiple-output (SIMO) impact testing device that will be capable of capturing modal parameters and estimating flexibility/deflection basins of common highway bridges during routine inspections. The device is composed of a trailer-mounted impact source (capable of delivering a 50 kip impact) and retractable sensor arms, and will be controlled by automated data acquisition, processing, and modal parameter estimation software. The research presented in this paper covers (a) the theoretical basis for SISO, SIMO and MIMO impact testing to estimate flexibility, (b) proof-of-concept numerical studies using a finite element model, and (c) a pilot implementation on an operating highway bridge. Results indicate that the proposed approach can estimate modal flexibility within a few percent of static flexibility; however, the estimated modal flexibility matrix is only reliable for the substructures associated with the various SIMO tests. To overcome this shortcoming, a modal 'stitching' approach for substructure integration to estimate the full eigenvector matrix is developed, and preliminary results of these methods are also presented.
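The flexibility estimate rests on the standard modal superposition relation: with mass-normalized mode shapes and natural frequencies identified from the impact tests, the flexibility matrix is approximated by summing the outer products of the mode shapes scaled by the inverse squared frequencies. The sketch below uses placeholder numbers, not data from the pilot bridge.

```python
# Standard modal-flexibility relation: F ~ sum_r phi_r phi_r^T / omega_r^2
# for mass-normalized mode shapes phi_r and natural frequencies omega_r.
import numpy as np

def modal_flexibility(phi, omega):
    """phi: (n_dof, n_modes) mass-normalized mode shapes; omega: (n_modes,) rad/s."""
    return sum(np.outer(phi[:, r], phi[:, r]) / omega[r] ** 2
               for r in range(phi.shape[1]))

phi = np.array([[0.10, 0.12],
                [0.14, 0.00],
                [0.10, -0.12]])        # hypothetical identified shapes
omega = np.array([12.0, 31.0]) * 2 * np.pi  # hypothetical frequencies
F = modal_flexibility(phi, omega)
print(F @ np.array([1.0, 1.0, 1.0]))   # deflection basin under unit loads
```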
Status of the Correlation Process of the V-HAB Simulation with Ground Tests and ISS Telemetry Data
NASA Technical Reports Server (NTRS)
Ploetner, Peter; Anderson, Molly S.; Czupalla, Markus; Ewert, Micahel K.; Roth, Christof Martin; Zhulov, Anton
2012-01-01
The Virtual Habitat (V-HAB) is a dynamic Life Support System (LSS) simulation created to investigate future human spaceflight missions. V-HAB provides the capability to optimize LSS during early design phases. Furthermore, it allows simulation of worst-case scenarios which cannot be tested in reality. In a nutshell, the tool allows the testing of LSS robustness by means of computer simulations. V-HAB is a modular simulation consisting of: (1) a Closed Environment Module, (2) a Crew Module, (3) a Biological Module, and (4) a Physio-Chemical Module. The focus of the paper will be the correlation and validation of V-HAB against ground test and flight data. The ECLSS technologies (CDRA, CCAA, OGA, etc.) are correlated one by one against available ground test data, which is briefly described in this paper. The technology models in V-HAB are merged to simulate the ISS ECLSS. This simulation is correlated against telemetry data from the ISS, including the water recovery system and the air revitalization system. Finally, an analysis of the results is included in this paper.
Atmospheric Transport Modelling and Radionuclide Analysis for the NPE 2015 scenario
NASA Astrophysics Data System (ADS)
Ross, J. Ole; Bollhöfer, Andreas; Heidmann, Verena; Krais, Roman; Schlosser, Clemens; Gestermann, Nicolai; Ceranna, Lars
2017-04-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all kinds of nuclear explosions. The International Monitoring System (IMS) is in place and about 90% complete to verify compliance with the CTBT. The stations of the waveform technologies are capable of detecting seismic, hydro-acoustic and infrasonic signals for detection, localization, and characterization of explosions. For practicing CTBT verification procedures and the interplay between the International Data Centre (IDC) and National Data Centres (NDCs), preparedness exercises (NPEs) are regularly performed with selected events representing a fictitious CTBT violation. The German NDC's expertise for radionuclide analyses and operation of station RN33 is provided by the Federal Office for Radiation Protection (BfS), while Atmospheric Transport Modelling (ATM) for CTBT purposes is performed at the Federal Institute for Geosciences and Natural Resources (BGR) for the combination of the radionuclide findings with waveform evidence. The radionuclide part of the NPE 2015 scenario is tackled in a joint effort by BfS and BGR. First, the NPE 2015 spectra are analysed, fission products are identified, and respective activity concentrations are derived. Special focus is on isotopic ratios, which allow for source characterization and event timing. For atmospheric backtracking, the binary coincidence method is applied both to SRS fields from the IDC and WMO-RSMC and to in-house backward simulations in higher resolution for the first affected samples. Results are compared with the WebGrape PSR and the spatio-temporal domain with high atmospheric release probability is determined. The ATM results together with the radionuclide fingerprint are used for identification of waveform candidate events. Comparative forward simulations of atmospheric dispersion for candidate events are performed. Finally, the overall consistency of various source scenarios is assessed and a fictitious government briefing on the findings is given.
2010-03-01
structure. Table 10 below provides a depiction of all the scenarios analyzed and the following subsections detail the advantages and disadvantages of...total cost of approximately $1 million more. Advantageously, the number of bases required to sustain the necessary throughput capability is...is easier to obtain. This is a particular advantage over the first scenario, as long as the host nation(s) is supportive of a two-shift operation
NASA Technical Reports Server (NTRS)
Beach, B. E.
1981-01-01
Beginning with scenario design and development issues, Eastern Airlines committed itself to the full four-hour LOFT training format without additional time for specific maneuvers. Abnormal and emergency conditions, pacing, and quiet periods are included in the scenarios, which are written for the instructor to follow verbatim. Simulator capabilities; performance assessment; training vs. checking; crew composition and scheduling; satisfactory completion; the use of video performance printouts; the number of instructors; instructor training and standardization; and initial, transition, and upgrade training are discussed.
Advanced instrumentation for Solar System gravitational physics
NASA Astrophysics Data System (ADS)
Peron, Roberto; Bellettini, G.; Berardi, S.; Boni, A.; Cantone, C.; Coradini, A.; Currie, D. G.; Dell'Agnello, S.; Delle Monache, G. O.; Fiorenza, E.; Garattini, M.; Iafolla, V.; Intaglietta, N.; Lefevre, C.; Lops, C.; March, R.; Martini, M.; Nozzoli, S.; Patrizi, G.; Porcelli, L.; Reale, A.; Santoli, F.; Tauraso, R.; Vittori, R.
2010-05-01
The Solar System is a complex laboratory for testing gravitational physics. Indeed, its scale and hierarchical structure make possible a wide range of tests for gravitational theories, studying the motion of both natural and artificial objects. The usual methodology makes use of tracking information related to the bodies, fitted by a suitable dynamical model. Different equations of motion are provided by different theories, which can therefore be tested and compared. Future exploration scenarios show the possibility of placing deep-space probes near the Sun or in the outer Solar System, thereby extending the available experimental data sets. In particular, the Earth-Moon system is the most accurately known gravitational three-body laboratory, and it is undergoing a new, strong wave of research and exploration (both robotic and manned). In addition, the benefits of a synergetic study of planetary science and gravitational physics are of the greatest importance (as shown by the success of the Apollo program), especially in the Earth-Moon, Mars-Phobos, Jovian and Saturnian sub-systems. These scenarios open critical issues regarding the quality of the available dynamical models, i.e. their capability of fitting data without an excessive number of empirical hypotheses. A typical case is represented by non-gravitational phenomena, which in general are difficult to model. More generally, gravitation tests with Lunar Laser Ranging, inner or outer Solar System probes and the appearance of so-called 'anomalies' (like the one indicated by the Pioneers), whatever their real origin (either instrumental effects or new physics), show the necessity of a coordinated improvement of tracking and modelization techniques. A common research path will be discussed, employing the development and use of advanced instrumentation to cope with current limitations of Solar System gravitational tests. In particular, the use of high-sensitivity accelerometers, combined with microwave and laser tracking, will be discussed.
Twisk, Divera; Vlakveld, Willem; Mesken, Jolieke; Shope, Jean T; Kok, Gerjo
2013-06-01
Road injuries are a prime cause of death in early adolescence. Often road safety education (RSE) is used to target risky road behaviour in this age group. These RSE programmes are frequently based on the assumption that deliberate risk taking rather than lack of competency underlies risk behaviour. This study tested the competency of 10-13 year olds, by examining their decisions - as pedestrians and cyclists - in dealing with blind spot areas around lorries. Also, the effects of an awareness programme and a competency programme on these decisions were evaluated. Table-top models were used, representing seven scenarios that differed in complexity: one basic scenario to test the identification of blind spot areas, and 6 traffic scenarios to test behaviour in traffic situations of low or high task complexity. Using a quasi-experimental design (pre-test and post-test reference group design without randomization), the programme effects were assessed by requiring participants (n=62) to show, for each table-top traffic scenario, how they would act if they were in that traffic situation. On the basic scenario, at pre-test 42% of the youngsters identified all blind spots correctly, but only 27% showed safe behaviour in simple scenarios and 5% in complex scenarios. The competency programme yielded improved performance on the basic scenario but not on the traffic scenarios, whereas the awareness programme did not result in any improvements. The correlation between improvements on the basic scenarios and the traffic scenarios was not significant. Young adolescents have not yet mastered the necessary skills for safe performance in simple and complex traffic situations, thus underlining the need for effective prevention programmes. RSE may improve the understanding of blind spot areas but this does not 'automatically' transfer to performance in traffic situations. Implications for the design of RSE are discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.
Prospective testing of neo-deterministic seismic hazard scenarios for the Italian territory
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Magrin, Andrea; Vaccari, Franco; Kossobokov, Vladimir; Panza, Giuliano F.
2013-04-01
A reliable and comprehensive characterization of expected seismic ground shaking, eventually including the related time information, is essential in order to develop effective mitigation strategies and increase earthquake preparedness. Moreover, any effective tool for SHA must demonstrate its capability in anticipating the ground shaking associated with large earthquake occurrences, a result that can be attained only through a rigorous verification and validation process. So far, the major problems in classical probabilistic methods for seismic hazard assessment, PSHA, have consisted in the adequate description of earthquake recurrence, particularly for the largest and sporadic events, and of the attenuation models, which may be unable to account for the complexity of the medium and of the seismic sources and are often weakly constrained by the available observations. Current computational resources and physical knowledge of the seismic wave generation and propagation processes nowadays allow for viable numerical and analytical alternatives to the use of attenuation relations. Accordingly, a scenario-based neo-deterministic approach to seismic hazard assessment, NDSHA, has been proposed, which allows considering a wide range of possible seismic sources as the starting point for deriving scenarios by means of full waveform modeling. The method does not make use of attenuation relations and naturally supplies realistic time series of ground shaking, including reliable estimates of ground displacement readily applicable to seismic isolation techniques. Based on NDSHA, an operational integrated procedure for seismic hazard assessment has been developed that allows for the definition of time-dependent scenarios of ground shaking through the routine updating of formally defined earthquake predictions. The integrated NDSHA procedure for seismic input definition, which is currently applied to the Italian territory, combines different pattern recognition techniques, designed for the space-time identification of strong earthquakes, with algorithms for the realistic modeling of ground motion. Accordingly, a set of deterministic scenarios of ground motion at bedrock, referring to the time interval when a strong event is likely to occur within the alerted area, can be defined by means of full waveform modeling, both at regional and local scale. CN and M8S predictions, as well as the related time-dependent ground motion scenarios associated with the alarmed areas, have been regularly updated every two months since 2006. The routine application of the time-dependent NDSHA approach provides information that can be useful in assigning priorities for timely mitigation actions and, at the same time, allows for rigorous prospective testing and validation of the proposed methodology. As an example, for sites where ground shaking values greater than 0.2 g are estimated at bedrock, further investigations can be performed taking into account local soil conditions, to assess the performance of relevant structures, such as historical and strategic buildings. The issues related to prospective testing and validation of the time-dependent NDSHA scenarios will be discussed, illustrating the results obtained for recent strong earthquakes in Italy, including the May 20, 2012 Emilia earthquake.
Executive Summary of Propulsion on the Orion Abort Flight-Test Vehicles
NASA Technical Reports Server (NTRS)
Jones, Daniel S.; Brooks, Syri J.; Barnes, Marvin W.; McCauley, Rachel J.; Wall, Terry M.; Reed, Brian D.; Duncan, C. Miguel
2012-01-01
The National Aeronautics and Space Administration Orion Flight Test Office was tasked with conducting a series of flight tests in several launch abort scenarios to certify that the Orion Launch Abort System is capable of delivering astronauts aboard the Orion Crew Module to a safe environment, away from a failed booster. The first of this series was the Orion Pad Abort 1 Flight-Test Vehicle, which was successfully flown on May 6, 2010 at the White Sands Missile Range in New Mexico. This report provides a brief overview of the three propulsive subsystems used on the Pad Abort 1 Flight-Test Vehicle. An overview of the propulsive systems originally planned for future flight-test vehicles is also provided, which also includes the cold gas Reaction Control System within the Crew Module, and the Peacekeeper first stage rocket motor encased within the Abort Test Booster aeroshell. Although the Constellation program has been cancelled and the operational role of the Orion spacecraft has significantly evolved, lessons learned from Pad Abort 1 and the other flight-test vehicles could certainly contribute to the vehicle architecture of many future human-rated space launch vehicles
Operational modeling system with dynamic-wave routing
Ishii, A.L.; Charlton, T.J.; Ortel, T.W.; Vonnahme, C.C.; ,
1998-01-01
A near real-time streamflow-simulation system utilizing continuous-simulation rainfall-runoff generation with dynamic-wave routing is being developed by the U.S. Geological Survey in cooperation with the Du Page County Department of Environmental Concerns for a 24-kilometer reach of Salt Creek in Du Page County, Illinois. This system is needed in order to more effectively manage the Elmhurst Quarry Flood Control Facility, an off-line stormwater diversion reservoir located along Salt Creek. Near real-time simulation capabilities will enable the testing and evaluation of potential rainfall, diversion, and return-flow scenarios on water-surface elevations along Salt Creek before implementing diversions or return flows. The climatological inputs for the continuous-simulation rainfall-runoff model, Hydrologic Simulation Program - FORTRAN (HSPF), are obtained by Internet access and from a network of radio-telemetered precipitation gages reporting to a base-station computer. The unit-area runoff time series generated from HSPF are the input for the dynamic-wave routing model, Full Equations (FEQ). The Generation and Analysis of Model Simulation Scenarios (GENSCN) interface is used as a pre- and post-processor for managing input data and displaying and managing simulation results. The GENSCN interface includes a variety of graphical and analytical tools for evaluation and quick visualization of the results of operational scenario simulations and thereby makes it possible to obtain the full benefit of the fully distributed dynamic routing results.
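A hedged sketch of the hand-off between the two models: the unit-area runoff series from the rainfall-runoff model is scaled by subbasin area (with unit conversion) to form the lateral-inflow series consumed by the routing model. The areas, units, and time step below are assumptions for illustration, not values from the Salt Creek system.

```python
# Convert a unit-area runoff series (mm/hr) over a subbasin into a
# lateral-inflow series (m^3/s) for a dynamic-wave routing model.
unit_runoff_mm_per_hr = [0.0, 1.2, 3.5, 2.0, 0.4]  # hypothetical time series
subbasin_area_km2 = 8.6                            # hypothetical subbasin area

def lateral_inflow_m3s(unit_runoff, area_km2):
    """Scale mm/hr over area_km2 to m^3/s."""
    factor = area_km2 * 1e6 * 1e-3 / 3600.0   # km^2 -> m^2, mm -> m, hr -> s
    return [q * factor for q in unit_runoff]

print(lateral_inflow_m3s(unit_runoff_mm_per_hr, subbasin_area_km2))
```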
Coordinated Development and Deployment of Scenarios for Sustained Assessment
NASA Astrophysics Data System (ADS)
Lipschultz, F.; Weaver, C. P.; Leidner, A. K.; Delgado, A.; Grambsch, A.
2017-12-01
There has been a clear need for a more coordinated Federal government approach for authoritative, climate-relevant scenarios to support growing demands by decision-makers, to meet stakeholder needs for consistent approaches and guidance, and to better address the needs of the impacts, adaptation and vulnerability community. To begin to satisfy these decision-support needs, in early 2015 the U.S. Global Change Research Program (USGCRP) began coordinated production of scenario information for use across a suite of USGCRP activities. These have been implemented in the 4th National Climate Assessment (NCA4), the Climate Science Special Report and the Climate Resilience Toolkit (CRT), all of which are intended to help better organize, summarize, and communicate science to decision-makers as they think about our future. First, USGCRP introduced and implemented an explicit risk-framing approach across the entire scenario enterprise to encourage exploration of tail risks. A suite of scenario products was developed framed around three simplified storylines: `Lower', `Higher', and `Upper Bound' departures from current baselines. Second, USGCRP developed future climate information for the U.S. using Representative Concentration Pathway (RCP) 8.5 and RCP 4.5, including a weighted mean of Global Climate Models and adoption of an improved statistical downscaling approach across USGCRP products. Additional variables were derived from the downscaled parameters for use across USGCRP reports and in the CRT's Climate Explorer tool. Third, and given the need to address other tightly-coupled global changes in a more integrated way, a set of population, housing density, and impervious surface projections were developed based on global scenarios. In addition, USGCRP and the National Ocean Council developed scenarios of future sea-level rise and coastal-flood hazard for the U.S. and integrated them into existing Federal capabilities to support preparedness planning. To better convey these scenario components, next steps include capability for dynamic interaction between NCA4 products and CRT to permit users to explore and customize relevant information for their decision at spatial scales that matter to them, as well as links to more in-depth CRT content.
NASA Astrophysics Data System (ADS)
Sussman, A. J.; Macleod, G.; Labak, P.; Malich, G.; Rowlands, A. P.; Craven, J.; Sweeney, J. J.; Chiappini, M.; Tuckwell, G.; Sankey, P.
2015-12-01
The Integrated Field Exercise of 2014 (IFE14) was an event held in the Hashemite Kingdom of Jordan (with concurrent activities in Austria) that tested the operational and technical capabilities of an on-site inspection (OSI) within the CTBT verification regime. During an OSI, up to 40 international inspectors will search an area for evidence of a nuclear explosion. Over 250 experts from ~50 countries were involved in IFE14 (the largest simulation of a real OSI to date) and worked from a number of different directions, such as the Exercise Management and Control Teams (which executed the scenario in which the exercise was played) and those participants performing as members of the Inspection Team (IT). One of the main objectives of IFE14 was to test and integrate Treaty allowed inspection techniques, including a number of geophysical and remote sensing methods. In order to develop a scenario in which the simulated exercise could be carried out, suites of physical features in the IFE14 inspection area were designed and engineered by the Scenario Task Force (STF) that the IT could detect by applying the geophysical and remote sensing inspection technologies, in addition to other techniques allowed by the CTBT. For example, in preparation for IFE14, the STF modeled a seismic triggering event that was provided to the IT to prompt them to detect and localize aftershocks in the vicinity of a possible explosion. Similarly, the STF planted shallow targets such as borehole casings and pipes for detection using other geophysical methods. In addition, airborne technologies, which included multi-spectral imaging, were deployed such that the IT could identify freshly exposed surfaces, imported materials, and other areas that had been subject to modification. This presentation will introduce the CTBT and OSI, explain the IFE14 in terms of the goals specific to geophysical and remote sensing methods, and show how both the preparation for and execution of IFE14 meet those goals.
Providing a parallel and distributed capability for JMASS using SPEEDES
NASA Astrophysics Data System (ADS)
Valinski, Maria; Driscoll, Jonathan; McGraw, Robert M.; Meyer, Bob
2002-07-01
The Joint Modeling And Simulation System (JMASS) is a Tri-Service simulation environment that supports engineering and engagement-level simulations. As JMASS is expanded to support other Tri-Service domains, the current set of modeling services must be expanded for High Performance Computing (HPC) applications by adding support for advanced time-management algorithms, parallel and distributed topologies, and high speed communications. By providing support for these services, JMASS can better address modeling domains requiring parallel, computationally intense calculations such as clutter, vulnerability and lethality calculations, and underwater-based scenarios. A risk reduction effort implementing some HPC services for JMASS using the SPEEDES (Synchronous Parallel Environment for Emulation and Discrete Event Simulation) Simulation Framework has recently concluded. As an artifact of the JMASS-SPEEDES integration, not only can HPC functionality be brought to the JMASS program through SPEEDES, but an additional HLA-based capability can be demonstrated that further addresses interoperability issues. The JMASS-SPEEDES integration provided a means of adding HLA capability to preexisting JMASS scenarios through an implementation of the standard JMASS port communication mechanism that allows players to communicate.
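The port-based player communication relies on the familiar publish/subscribe pattern; the sketch below illustrates that pattern only and is not the JMASS or SPEEDES API.

```python
# Generic topic-based publish/subscribe sketch for decoupled message
# exchange between simulation players (illustrative only).
from collections import defaultdict

class MessageBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

bus = MessageBus()
bus.subscribe("radar/track", lambda m: print("player received:", m))
bus.publish("radar/track", {"id": 42, "range_km": 12.7})
```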
Information Management for Unmanned Systems: Combining DL-Reasoning with Publish/Subscribe
NASA Astrophysics Data System (ADS)
Moser, Herwig; Reichelt, Toni; Oswald, Norbert; Förster, Stefan
Sharing capabilities and information between collaborating entities by using modern information and communication technology is a core principle in complex distributed civil or military mission scenarios. Previous work proved the suitability of Service-oriented Architectures for modelling and sharing the participating entities' capabilities. Albeit providing a satisfactory model for capability sharing, pure service-orientation curtails expressiveness for information exchange as opposed to dedicated data-centric communication principles. In this paper we introduce an Information Management System which combines OWL ontologies and automated reasoning with publish/subscribe systems, providing for a shared but decoupled data model. While confirming existing related research results, we emphasise the novel application of, and lack of practical experience with, Semantic Web technologies in areas other than originally intended; that is, aiding decision support and software design in the context of a mission scenario for an unmanned system. Experiments within a complex simulation environment show the immediate benefits of a semantic information-management and -dissemination platform: clear separation of concerns in code and data model, increased service re-usability and extensibility, as well as regulation of data flow and respective system behaviour through declarative rules.
Incorporation of Geosensor Networks into the Internet of Things for Environmental Monitoring
NASA Astrophysics Data System (ADS)
Habibi, R.; Alesheikh, A. A.
2015-12-01
Thanks to recent advances in miniaturization and the falling costs of sensors and communication technologies, the Internet in particular, the number of internet-connected things has grown tremendously. Moreover, geosensors, with their capability to generate data at high spatial and temporal resolution, measure a vast diversity of environmental parameters, and operate automatically, provide powerful support for environmental monitoring tasks. Geosensor nodes are inherently heterogeneous in terms of hardware capabilities and communication protocols for taking part in Internet of Things scenarios. Therefore, ensuring interoperability is an important step. In this respect, the focus of this paper is particularly on the incorporation of geosensor networks into the Internet of Things through an architecture for monitoring real-time environmental data with the use of OGC Sensor Web Enablement standards. This approach and its applicability are discussed in the context of an air pollution monitoring scenario.
Exploration Medical System Technical Development
NASA Technical Reports Server (NTRS)
McGuire, K.; Middour, C.; Cerro, J.; Burba, T.; Hanson, A.; Reilly, J.; Mindock, J.
2017-01-01
The Exploration Medical Capability (ExMC) Element systems engineering goals include defining the technical system needed to implement exploration medical capabilities for Mars. This past year, scenarios captured in the medical system concept of operations laid the foundation for systems engineering technical development work. The systems engineering team analyzed scenario content to identify interactions between the medical system, crewmembers, the exploration vehicle, and the ground system. This enabled the definition of functions the medical system must provide and of interfaces to crewmembers and other systems. These analyses additionally led to the development of a conceptual medical system architecture. The work supports the ExMC community-wide understanding of the functional exploration needs to be met by the medical system, the subsequent development of medical system requirements, and the system verification and validation approach utilizing terrestrial analogs and precursor exploration missions.
Robotic disaster recovery efforts with ad-hoc deployable cloud computing
NASA Astrophysics Data System (ADS)
Straub, Jeremy; Marsh, Ronald; Mohammad, Atif F.
2013-06-01
Autonomous operation of search and rescue (SaR) robots is an ill-posed problem, further complicated by the dynamic disaster recovery environment. In a typical SaR response scenario, responder robots will require different levels of processing capability during various parts of the response effort and will need to utilize multiple algorithms. Placing all of these capabilities onboard the robot precludes algorithm-specific performance optimization and results in mediocre performance. An architecture for an ad-hoc, deployable cloud environment suitable for use in a disaster response scenario is presented. Under this model, each service provider is optimized for its task and maintains a database of situation-relevant information. This service-oriented architecture (SOA 3.0) compliant framework also serves as an example of the efficient use of SOA 3.0 in an actual cloud application.
Shuttle Abort Flight Management (SAFM) - Application Overview
NASA Technical Reports Server (NTRS)
Hu, Howard; Straube, Tim; Madsen, Jennifer; Ricard, Mike
2002-01-01
One of the most demanding tasks that must be performed by the Space Shuttle flight crew is determining whether, when, and where to abort the vehicle should engine or system failures occur during ascent or entry. Current Shuttle abort procedures involve paging through complicated paper checklists to decide on the type of abort and where to abort. Additional checklists then lead the crew through a series of actions to execute the desired abort. This process is even more difficult and time consuming in the absence of ground communications, since the ground flight controllers have analysis tools and information that are currently not available in the Shuttle cockpit. Crew workload, specifically for abort procedures, will be greatly simplified with the implementation of the Space Shuttle Cockpit Avionics Upgrade (CAU) project. The intent of CAU is to maximize crew situational awareness and reduce flight workload through enhanced controls and displays and an onboard abort assessment and determination capability. SAFM was developed to help satisfy the CAU objectives by providing the crew with dynamic information about the capability of the vehicle to perform a variety of abort options during ascent and entry. This paper presents an overview of the SAFM application. As shown in Figure 1, SAFM processes the vehicle navigation state and other guidance information to provide the CAU displays with evaluations of abort options, as well as landing site recommendations. This is accomplished by three main SAFM components: the Sequencer Executive, the Powered Flight Function, and the Glided Flight Function. The Sequencer Executive dispatches the Powered and Glided Flight Functions to evaluate the vehicle's capability to execute the current mission (or current abort), as well as more than 15 hypothetical abort options or scenarios. Scenarios are sequenced and evaluated throughout powered and glided flight. Abort scenarios evaluated include Abort to Orbit (ATO), Transatlantic Abort Landing (TAL), East Coast Abort Landing (ECAL) and Return to Launch Site (RTLS). Sequential and simultaneous engine failures are assessed, and landing footprint information is provided during actual entry scenarios as well as hypothetical "loss of thrust now" scenarios during ascent.
2013-01-01
Background The extent to which a genomic test will be used in practice is affected by factors such as ability of the test to correctly predict response to treatment (i.e. sensitivity and specificity of the test), invasiveness of the testing procedure, test cost, and the probability and severity of side effects associated with treatment. Methods Using discrete choice experimentation (DCE), we elicited preferences of the public (Sample 1, N = 533 and Sample 2, N = 525) and cancer patients (Sample 3, N = 38) for different attributes of a hypothetical genomic test for guiding cancer treatment. Samples 1 and 3 considered the test/treatment in the context of an aggressive curable cancer (scenario A) while the scenario for sample 2 was based on a non-aggressive incurable cancer (scenario B). Results In aggressive curable cancer (scenario A), everything else being equal, the odds ratio (OR) of choosing a test with 95% sensitivity was 1.41 (versus a test with 50% sensitivity) and willingness to pay (WTP) was $1331, on average, for this amount of improvement in test sensitivity. In this scenario, the OR of choosing a test with 95% specificity was 1.24 times that of a test with 50% specificity (WTP = $827). In non-aggressive incurable cancer (scenario B), the OR of choosing a test with 95% sensitivity was 1.65 (WTP = $1344), and the OR of choosing a test with 95% specificity was 1.50 (WTP = $1080). Reducing severity of treatment side effects from severe to mild was associated with large ORs in both scenarios (OR = 2.10 and 2.24 in scenario A and B, respectively). In contrast, patients had a very large preference for 95% sensitivity of the test (OR = 5.23). Conclusion The type and prognosis of cancer affected preferences for genomically-guided treatment. In aggressive curable cancer, individuals emphasized more on the sensitivity rather than the specificity of the test. In contrast, for a non-aggressive incurable cancer, individuals put similar emphasis on sensitivity and specificity of the test. While the public expressed strong preference toward lowering severity of side effects, improving sensitivity of the test had by far the largest influence on patients’ decision to use genomic testing. PMID:24176050
JSpOC Mission System Application Development Environment
NASA Astrophysics Data System (ADS)
Luce, R.; Reele, P.; Sabol, C.; Zetocha, P.; Echeverry, J.; Kim, R.; Golf, B.
2012-09-01
The Joint Space Operations Center (JSpOC) Mission System (JMS) is the program of record tasked with replacing the legacy Space Defense Operations Center (SPADOC) and Astrodynamics Support Workstation (ASW) capabilities by the end of FY2015 as well as providing additional Space Situational Awareness (SSA) and Command and Control (C2) capabilities post-FY2015. To meet the legacy replacement goal, the JMS program is maturing a government Service Oriented Architecture (SOA) infrastructure that supports the integration of mission applications while acquiring mature industry and government mission applications. Future capabilities required by the JSpOC after 2015 will require development of new applications and procedures as well as the exploitation of new SSA data sources. To support the post FY2015 efforts, the JMS program is partnering with the Air Force Research Laboratory (AFRL) to build a JMS application development environment. The purpose of this environment is to: 1) empower the research & development community, through access to relevant tools and data, to accelerate technology development, 2) allow the JMS program to communicate user capability priorities and requirements to the developer community, 3) provide the JMS program with access to state-of-the-art research, development, and computing capabilities, and 4) support market research efforts by identifying outstanding performers that are available to shepherd into the formal transition process. The application development environment will consist of both unclassified and classified environments that can be accessed over common networks (including the Internet) to provide software developers, scientists, and engineers everything they need (e.g., building block JMS services, modeling and simulation tools, relevant test scenarios, documentation, data sources, user priorities/requirements, and SOA integration tools) to develop and test mission applications. The developed applications will be exercised in these relevant environments with representative data sets to help bridge the gap between development and integration into the operational JMS enterprise.
Health care delivery system for long duration manned space operations
NASA Technical Reports Server (NTRS)
Logan, J. S.; Shulman, E. L.; Johnson, P. C.
1983-01-01
Specific requirements for medical support of a long-duration manned facility in a low earth orbit derive from inflight medical experience, projected medical scenarios, mission related spacecraft and environmental hazards, health maintenance, and preventive medicine. A sequential buildup of medical capabilities tailored to increasing mission complexity is proposed. The space station health maintenance facility must provide preventive, diagnostic, and therapeutic medical support as immediate rescue capability may not exist.
2016-09-01
Congress. Consequently, as prepared now, this report does not help DOD leaders identify assets that could be used in a cyber crisis scenario...Guidance. GAO-13-128. Washington, D.C.: October 24, 2012. Defense Cyber Efforts: Management Improvements Needed to Enhance Programs Protecting the...DEFENSE CIVIL SUPPORT DOD Needs to Identify National Guard’s Cyber Capabilities and Address Challenges in Its
Systematics for checking geometric errors in CNC lathes
NASA Astrophysics Data System (ADS)
Araújo, R. P.; Rolim, T. L.
2015-10-01
Non-idealities present in machine tools directly compromise both the geometry and the dimensions of machined parts, generating deviations from the design. Given the competitive scenario among different companies, it is necessary to know the geometric behavior of these machines in order to establish their processing capability, avoiding waste of time and materials and satisfying customer requirements. Although geometric tests are important and necessary to verify that a machine is operating correctly, and thus to prevent future damage, most users do not apply such tests to their machines, for lack of knowledge or of proper motivation, essentially due to two factors: the long testing time and the high cost of testing. This work proposes a systematic procedure for checking straightness and perpendicularity errors in CNC lathes that demands little time and cost while offering high metrological reliability, to be used on the factory floors of small and medium-size businesses to ensure the quality of their products and make them competitive.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Paul A.; Cooper, Candice Frances; Burnett, Damon J.
Light body armor development for the warfighter is based on trial-and-error testing of prototype designs against ballistic projectiles. Torso armor testing against blast is virtually nonexistent but necessary to ensure adequate protection against injury to the heart and lungs. In this report, we discuss the development of a high-fidelity human torso model, its merging with the existing Sandia Human Head-Neck Model, and the development of the modeling and simulation (M&S) capabilities necessary to simulate wound injury scenarios. Using the new Sandia Human Torso Model, we demonstrate the advantage of virtual simulation in the investigation of wound injury as it relates to the warfighter experience. We present the results of virtual simulations of blast loading and ballistic projectile impact to the torso with and without notional protective armor. In this manner, we demonstrate the advantages of applying a modeling and simulation approach to the investigation of wound injury and to relative merit assessments of protective body armor without the need for trial-and-error testing.
Multi-GPU three dimensional Stokes solver for simulating glacier flow
NASA Astrophysics Data System (ADS)
Licul, Aleksandar; Herman, Frédéric; Podladchikov, Yuri; Räss, Ludovic; Omlin, Samuel
2016-04-01
Here we present how we have recently developed a three-dimensional Stokes solver on GPUs and applied it to glacier flow. We numerically solve the Stokes momentum balance equations together with the incompressibility equation, while also taking into account the strong nonlinearity of ice rheology. We have developed a fully three-dimensional numerical MATLAB application based on an iterative finite difference scheme with preconditioning of residuals. Differential equations are discretized on a regular staggered grid. We have ported the application to C-CUDA to run it on GPUs in parallel, using MPI. We demonstrate the accuracy and efficiency of our model with a manufactured analytical solution test for three-dimensional Stokes ice sheet models (Leng et al., 2013) and by comparison with other well-established ice sheet models on the diagnostic ISMIP-HOM benchmark experiments (Pattyn et al., 2008). The results show that our model can accurately and efficiently solve the Stokes system of equations in a variety of test scenarios, while preserving good parallel efficiency on up to 80 GPUs. For example, in 3D test scenarios with 250,000 grid points our solver converges in around 3 minutes for single precision computations and around 10 minutes for double precision computations. We have also optimized the code to run efficiently on our newly acquired state-of-the-art GPU cluster, octopus. This allows us to solve our problem on more than 20 million grid points by simply increasing the number of GPUs used, while keeping the computation time the same. In future work we will apply our solver to real-world applications and implement free surface evolution capabilities. REFERENCES Leng, W., Ju, L., Gunzburger, M. & Price, S., 2013. Manufactured solutions and the verification of three-dimensional Stokes ice-sheet models. The Cryosphere 7, 19-29. Pattyn, F., Perichon, L., Aschwanden, A., Breuer, B., de Smedt, B., Gagliardini, O., Gudmundsson, G.H., Hindmarsh, R.C.A., Hubbard, A., Johnson, J.V., Kleiner, T., Konovalov, Y., Martin, C., Payne, A.J., Pollard, D., Price, S., Rückamp, M., Saito, F., Souček, O., Sugiyama, S. & Zwinger, T., 2008. Benchmark experiments for higher-order and full-Stokes ice sheet models (ISMIP-HOM). The Cryosphere 2, 95-108.
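The parallel-sided slab with Glen's-law rheology is the kind of analytical benchmark commonly used to verify such solvers; the short script below evaluates the textbook slab velocity profile with generic parameter values (this is an illustration, not the Leng et al. manufactured solution or the authors' configuration).

```python
# Analytical velocity profile for an infinite parallel-sided ice slab with
# Glen's law rheology (du/dz = 2*A*tau^n), a common verification target for
# Stokes ice-flow solvers. Generic textbook values, not the paper's benchmark.
import numpy as np

n, A = 3.0, 1e-24                    # Glen exponent [-], rate factor [Pa^-3 s^-1]
rho, g = 910.0, 9.81                 # ice density [kg m^-3], gravity [m s^-2]
alpha, H = np.deg2rad(0.5), 1000.0   # surface slope [rad], thickness [m]

z = np.linspace(0.0, H, 101)                       # height above the bed [m]
u = (2.0 * A / (n + 1.0)) * (rho * g * np.sin(alpha)) ** n \
    * (H ** (n + 1.0) - (H - z) ** (n + 1.0))      # velocity [m s^-1]

seconds_per_year = 365.25 * 24 * 3600
print(f"surface velocity: {u[-1] * seconds_per_year:.2f} m/yr")
```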
Joint FACET: the Canada-Netherlands initiative to study multisensor data fusion systems
NASA Astrophysics Data System (ADS)
Bosse, Eloi; Theil, Arne; Roy, Jean; Huizing, Albert G.; van Aartsen, Simon
1998-09-01
This paper presents the progress of a collaborative effort between Canada and The Netherlands in analyzing multi-sensor data fusion systems, e.g. for potential application to their respective frigates. In view of their overlapping interest in studying and comparing the applicability and performance of advanced state-of-the-art Multi-Sensor Data Fusion (MSDF) techniques, the two research establishments involved decided to join their efforts in the development of MSDF testbeds. This resulted in the so-called Joint FACET, a highly modular and flexible series of applications capable of processing both real and synthetic input data. Joint FACET allows the user to create and edit test scenarios with multiple ships, sensors and targets, generate realistic sensor outputs, and process these outputs with a variety of MSDF algorithms. These MSDF algorithms can also be tested using typical experimental data collected during live military exercises.
Large Field Photogrammetry Techniques in Aircraft and Spacecraft Impact Testing
NASA Technical Reports Server (NTRS)
Littell, Justin D.
2010-01-01
The Landing and Impact Research Facility (LandIR) at NASA Langley Research Center is a 240 ft. high A-frame structure which is used for full-scale crash testing of aircraft and rotorcraft vehicles. Because the LandIR provides a unique capability to introduce impact velocities in the forward and vertical directions, it is also serving as the facility for landing tests on full-scale and sub-scale Orion spacecraft mass simulators. Recently, a three-dimensional photogrammetry system was acquired to assist with the gathering of vehicle flight data before, throughout and after the impact. This data provides the basis for the post-test analysis and data reduction. Experimental setups for pendulum swing tests on vehicles having both forward and vertical velocities can extend to 50 x 50 x 50 foot cubes, while weather, vehicle geometry, and other constraints make each experimental setup unique to each test. This paper will discuss the specific calibration techniques for large fields of views, camera and lens selection, data processing, as well as best practice techniques learned from using the large field of view photogrammetry on a multitude of crash and landing test scenarios unique to the LandIR.
Model Capabilities | Regional Energy Deployment System Model | Energy
representation of those effects throughout the scenario. Because those effects are highly non-linear and other models, limited foresight, price penalties for rapid growth, and other non-linear effects
NASA Astrophysics Data System (ADS)
Živanović, Dragan; Simić, Milan; Kokolanski, Zivko; Denić, Dragan; Dimcev, Vladimir
2018-04-01
A software-supported procedure for the generation of long, complex test sequences, suitable for testing instruments for the detection of standard voltage quality (VQ) disturbances, is presented in this paper. This solution for test signal generation includes significant improvements over the computer-based signal generator presented and described in a previously published paper [1]. The generator is based on virtual instrumentation software for defining the basic signal parameters, a data acquisition card (NI 6343), and a power amplifier that raises the output voltage to the nominal RMS value of 230 V. The definition of basic signal parameters in the LabVIEW application software is supported using script files, which allows simple repetition of specific test signals and the combination of several different test sequences into a complex composite test waveform. The basic advantage of this generator compared to similar signal generation solutions is the possibility of generating long test sequences according to predefined complex test scenarios, including various combinations of VQ disturbances defined in accordance with the European standard EN 50160. Experimental verification of the presented signal generator's capability is performed by testing the commercial power quality analyzer Fluke 435 Series II. The paper presents some characteristic complex test signals with various disturbances, together with the logged data obtained from the tested power quality analyzer.
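A scripted composite test waveform of the kind described can be sketched in a few lines; the sample rate, dip depth, and harmonic level below are illustrative assumptions, not the generator's actual test scenarios.

```python
# Minimal sketch (assumed parameters) of composing a test sequence of a
# 230 V / 50 Hz mains signal with scripted EN 50160-style disturbances
# (here: a dip to 70 % of nominal and added 5th-harmonic distortion).
import numpy as np

fs, f0, u_rms = 50_000, 50.0, 230.0        # sample rate [Hz], mains frequency, RMS
t = np.arange(0, 10.0, 1.0 / fs)           # 10 s test sequence
u = np.sqrt(2) * u_rms * np.sin(2 * np.pi * f0 * t)

# Scripted disturbances, analogous to entries in a test-scenario script file:
dip = (t >= 2.0) & (t < 2.5)               # dip to 70 % of nominal for 0.5 s
u[dip] *= 0.70
harm = (t >= 6.0) & (t < 8.0)              # 8 % fifth harmonic for 2 s
u[harm] += 0.08 * np.sqrt(2) * u_rms * np.sin(2 * np.pi * 5 * f0 * t[harm])

print(f"samples: {u.size}, RMS over full record: {np.sqrt(np.mean(u**2)):.1f} V")
```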
Horowitz, A.J.; Smith, J.J.; Elrick, K.A.
2001-01-01
A prototype 14-L Teflon® churn splitter was evaluated for whole-water sample-splitting capabilities over a range of sediment concentrations and grain sizes, as well as for potential chemical contamination from both organic and inorganic constituents. These evaluations represent a 'best-case' scenario because they were performed in the controlled environment of a laboratory and used monomineralic silica sand slurries of known concentration made up in deionized water. Further, all splitting was performed by a single operator, and all the requisite concentration analyses were performed by a single laboratory. The prototype Teflon® churn splitter did not appear to supply significant concentrations of either organic or inorganic contaminants at current U.S. Geological Survey (USGS) National Water Quality Laboratory detection and reporting limits when test samples were prepared using current USGS protocols. As with the polyethylene equivalent of the prototype Teflon® churn, the maximum usable whole-water suspended sediment concentration for the prototype churn appears to lie between 1,000 and 10,000 milligrams per liter (mg/L). Further, the maximum grain-size limit appears to lie between 125 and 250 micrometers (µm). Tests to determine the efficacy of the valve baffle indicate that it must be retained to facilitate representative whole-water subsampling.
A prototype Crew Medical Restraint System (CMRS) for Space Station Freedom
NASA Technical Reports Server (NTRS)
Johnston, S. L.; Eichstadt, F. T.; Billica, R. D.
1992-01-01
The Crew Medical Restraint System (CMRS) is a prototype system designed and developed for use as a universally deployable medical restraint/workstation on Space Station Freedom (SSF), the Shuttle Transportation System (STS), and the Assured Crew Rescue Vehicle (ACRV) for support of an ill or injured crewmember requiring stabilization and transportation to Earth. The CMRS will support all medical capabilities of the Health Maintenance Facility (HMF) by providing a restraint/interface system for all equipment (advanced life support packs, defibrillator, ventilator, portable oxygen supply, IV pump, transport monitor, transport aspirator, and intravenous fluids delivery system) and personnel (patient and crew medical officers). It must be functional within the STS, ACRV, and all SSF habitable volumes. The CMRS will allow for medical capabilities within CPR, ACLS and ATLS standards of care. This must all be accomplished for a worst-case transport time scenario of 24 hours from SSF to a definitive medical care facility on Earth. A presentation of the above design prototype, with its subsequent one-year SSF/HMF and STS/ACRV high-fidelity mock-up ground-based simulation testing, will be given. Parabolic flight and underwater Weightless Test Facility evaluations will also be demonstrated for various medical contingencies. The final design configuration to date will be discussed, along with future space program impact considerations.
Tyrer, Jonathan P; Guo, Qi; Easton, Douglas F; Pharoah, Paul D P
2013-06-06
The development of genotyping arrays containing hundreds of thousands of rare variants across the genome and advances in high-throughput sequencing technologies have made feasible empirical genetic association studies to search for rare disease susceptibility alleles. As single-variant testing is underpowered to detect associations, the development of statistical methods to combine analysis across variants, so-called "burden tests", is an area of active research interest. We previously developed a method, the admixture maximum likelihood test, to test multiple common variants for association with a trait of interest. We have extended this method, called the rare admixture maximum likelihood test (RAML), for the analysis of rare variants. In this paper we compare the performance of RAML with six other burden tests designed to test for association of rare variants. We used simulation testing over a range of scenarios to compare the power of RAML to the other rare variant association testing methods. These scenarios modelled differences in effect variability, the average direction of effect and the proportion of associated variants. We evaluated the power for all the different scenarios. RAML tended to have the greatest power for most scenarios where the proportion of associated variants was small, whereas SKAT-O performed a little better for the scenarios with a higher proportion of associated variants. The RAML method makes no assumptions about the proportion of variants that are associated with the phenotype of interest or the magnitude and direction of their effect. The method is flexible and can be applied to both dichotomous and quantitative traits and allows for the inclusion of covariates in the underlying regression model. The RAML method performed well compared to the other methods over a wide range of scenarios. Generally power was moderate in most of the scenarios, underlining the need for large sample sizes in any form of association testing.
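As background, a simple collapsing burden test, one of the class of methods RAML is compared against, can be sketched as a regression of the phenotype on each subject's rare-allele count; the data below are simulated, and this is a generic burden test for illustration, not the RAML algorithm itself.

```python
# Minimal sketch (simulated data) of a simple collapsing "burden" test: regress
# a dichotomous phenotype on each subject's count of rare minor alleles.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_subjects, n_variants = 2000, 50
maf = rng.uniform(0.001, 0.01, n_variants)                  # rare variant frequencies
genotypes = rng.binomial(2, maf, size=(n_subjects, n_variants))
burden = genotypes.sum(axis=1)                              # rare-allele count per subject

# Simulate a phenotype in which carrying rare alleles raises risk (log-OR 0.4 per allele)
logit_p = -1.0 + 0.4 * burden
phenotype = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

model = sm.Logit(phenotype, sm.add_constant(burden)).fit(disp=False)
print(f"burden log-OR: {model.params[1]:.2f}, p-value: {model.pvalues[1]:.2g}")
```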
NASA Astrophysics Data System (ADS)
Wang, Hexiang; Schuster, Eugenio; Rafiq, Tariq; Kritz, Arnold; Ding, Siye
2016-10-01
Extensive research has been conducted to find high-performance operating scenarios characterized by high fusion gain, good confinement, plasma stability and possible steady-state operation. A key plasma property that is related to both the stability and performance of these advanced plasma scenarios is the safety factor profile. A key component of the EAST research program is the exploration of non-inductively driven steady-state plasmas with the recently upgraded heating and current drive capabilities that include lower hybrid current drive and neutral beam injection. Anticipating the need for tight regulation of the safety factor profile in these plasma scenarios, a first-principles-driven (FPD) control-oriented model is proposed to describe the safety factor profile evolution in EAST in response to the different actuators. The TRANSP simulation code is employed to tailor the FPD model to the EAST tokamak geometry and to convert it into a form suitable for control design. The FPD control-oriented model's prediction capabilities are demonstrated by comparing predictions with experimental data from EAST. Supported by the US DOE under DE-SC0010537, DE-FG02-92ER54141 and DE-SC0013977.
Husak, Gregory J.; Michaelsen, Joel C.; Funk, Christopher C.
2007-01-01
Evaluating a range of scenarios that accurately reflect precipitation variability is critical for water resource applications. Inputs to these applications can be provided using location- and interval-specific probability distributions. These distributions make it possible to estimate the likelihood of rainfall being within a specified range. In this paper, we demonstrate the feasibility of fitting cell-by-cell probability distributions to grids of monthly interpolated, continent-wide data. Future work will then detail applications of these grids to improved satellite-remote sensing of drought and interpretations of probabilistic climate outlook forum forecasts. The gamma distribution is well suited to these applications because it is fairly familiar to African scientists, and capable of representing a variety of distribution shapes. This study tests the goodness-of-fit using the Kolmogorov–Smirnov (KS) test, and compares these results against another distribution commonly used in rainfall events, the Weibull. The gamma distribution is suitable for roughly 98% of the locations over all months. The techniques and results presented in this study provide a foundation for use of the gamma distribution to generate drivers for various rain-related models. These models are used as decision support tools for the management of water and agricultural resources as well as food reserves by providing decision makers with ways to evaluate the likelihood of various rainfall accumulations and assess different scenarios in Africa.
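The cell-by-cell fitting step can be illustrated with a short script that fits a gamma distribution to one grid cell's monthly rainfall record and applies the KS test; the rainfall values below are synthetic stand-ins, not the interpolated continent-wide data.

```python
# Minimal sketch (synthetic data) of fitting a gamma distribution to monthly
# rainfall for one grid cell and checking the fit with the Kolmogorov-Smirnov test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rainfall_mm = rng.gamma(shape=2.0, scale=40.0, size=30)   # 30 years of one calendar month

shape, loc, scale = stats.gamma.fit(rainfall_mm, floc=0)  # fix the location at zero
ks_stat, p_value = stats.kstest(rainfall_mm, "gamma", args=(shape, loc, scale))
print(f"shape={shape:.2f}, scale={scale:.1f} mm, KS p-value={p_value:.2f}")
```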
NASA Astrophysics Data System (ADS)
Momoh, James A.; Salkuti, Surender Reddy
2016-06-01
This paper proposes a stochastic optimization technique for solving the Voltage/VAr control problem including load demand and Renewable Energy Resource (RER) variation. The RERs introduce stochastic behavior into the system inputs. Voltage/VAr control is an important challenge and a prime means of handling power system complexity and reliability; hence it is a fundamental requirement for all utility companies. A robust and efficient Voltage/VAr optimization technique is needed to meet peak demand and reduce system losses. Voltages beyond their limits may damage costly substation devices as well as equipment at the consumer end. In particular, RERs introduce additional disturbances, and some RERs are not capable of meeting the VAr demand. Therefore, there is a strong need for Voltage/VAr control in an RER environment. This paper aims at the development of an optimal scheme for Voltage/VAr control involving RERs. In this paper, the Latin Hypercube Sampling (LHS) method is used to cover the full range of variables while maximally satisfying the marginal distributions. A backward scenario reduction technique is used to reduce the number of scenarios effectively while maximally retaining the fitting accuracy of the samples. The developed optimization scheme is tested on the IEEE 24-bus Reliability Test System (RTS), considering load demand and RER variation.
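The sampling and reduction steps can be sketched as follows, with assumed input distributions and a simple greedy reduction loosely in the spirit of backward scenario reduction; this is an illustration, not the authors' implementation.

```python
# Minimal sketch (assumed distributions) of drawing scenarios for two uncertain
# inputs with Latin Hypercube Sampling and then greedily reducing them.
import numpy as np
from scipy.stats import qmc, norm

sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(n=200)                                  # uniform LHS samples in [0,1]^2
load = norm(loc=1.0, scale=0.1).ppf(u[:, 0])               # per-unit load demand
wind = norm(loc=0.5, scale=0.2).ppf(u[:, 1]).clip(0, 1)    # per-unit wind output
scenarios = np.column_stack([load, wind])
prob = np.full(len(scenarios), 1.0 / len(scenarios))

# Greedy backward reduction: repeatedly drop the scenario whose removal cost
# (probability times distance to its nearest neighbour) is smallest, and
# reassign its probability to that neighbour.
target = 20
while len(scenarios) > target:
    d = np.linalg.norm(scenarios[:, None, :] - scenarios[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    nearest = d.argmin(axis=1)
    cost = prob * d[np.arange(len(scenarios)), nearest]
    drop = cost.argmin()
    prob[nearest[drop]] += prob[drop]
    scenarios = np.delete(scenarios, drop, axis=0)
    prob = np.delete(prob, drop)

print(len(scenarios), prob.sum())   # 20 retained scenarios, probabilities sum to 1
```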
Development of crash imminent test scenarios for Integrated Vehicle-Based Safety Systems
DOT National Transportation Integrated Search
2007-04-01
This report identifies crash imminent test scenarios based on common pre-crash scenarios for integrated vehicle-based safety systems that alert the driver of a light vehicle or a heavy truck to an impending rear-end, lane change, or run-off-road cras...
Education and Public Outreach and Engagement at NASA's Analog Missions in 2012
NASA Technical Reports Server (NTRS)
Watkins, Wendy L.; Janoiko, Barbara A.; Mahoney, Erin; Hermann, Nicole B.
2013-01-01
Analog missions are integrated, multi-disciplinary activities that test key features of future human space exploration missions in an integrated fashion to gain a deeper understanding of system-level interactions and operations early in conceptual development. These tests often are conducted in remote and extreme environments that are representative in one or more ways of future spaceflight destinations. They may also be conducted at NASA facilities, using advanced modeling and human-in-the-loop scenarios. As NASA develops a capability-driven framework to transport crew to a variety of space environments, it will use analog missions to gather requirements and develop the technologies necessary to ensure successful exploration beyond low Earth orbit. NASA's Advanced Exploration Systems (AES) Division conducts these high-fidelity integrated tests, including the coordination and execution of a robust education and public outreach (EPO) and engagement program for each mission. Conducting these mission scenarios in unique environments not only provides an opportunity to test the EPO concepts for the particular future-mission scenario, such as the best methods for conducting events with a communication time delay, but it also provides an avenue to deliver NASA's human space exploration key messages. These analogs are extremely exciting to students and the public, and they are performed in such a way that the public can feel like part of the mission. They also provide an opportunity for crew members to obtain training in education and public outreach activities similar to what they would perform in space. The analog EPO team is responsible for the coordination and execution of the events, the overall social media component for each mission, and public affairs events such as media visits and interviews. They also create new and exciting ways to engage the public, manage and create website content, coordinate video footage for missions, and coordinate and integrate each activity into the mission timeline. In 2012, the AES Analog Missions Project performed three distinct missions: NASA Extreme Environment Mission Operations (NEEMO), which simulated a mission to an asteroid using an undersea laboratory; the In-Situ Resource Utilization (ISRU) Field Test, which simulated a robotic mission to the moon searching and drilling for water; and Research and Technology Studies (RATS) integrated tests, which also simulated a mission to an asteroid. This paper will discuss the education and public engagement that occurred during these missions.
Defining an Approach for Future Close Air Support Capability
2017-01-01
may take on the form of a force-mix study that considers multiple joint scenarios and missions. viii Acknowledgments The authors would like to thank...the Army and other services on an approach for defining future CAS capability. 9 Colin Clark, “Air...unit; one British soldier was killed, and five others were wounded.15 Only one A-10 was shot down during all of OIF and OEF. However, it should be
2011-01-01
Background Postnatal and antenatal anti-D prophylaxis have dramatically reduced maternal sensitisations and cases of rhesus disease in babies born to women with RhD negative blood group. Recent scientific advances mean that non-invasive prenatal diagnosis (NIPD), based on the presence of cell-free fetal DNA in maternal plasma, could be used to target prophylaxis on "at risk" pregnancies where the fetus is RhD positive. This paper provides the first assessment of cost-effectiveness of NIPD-targeted prophylaxis compared to current policies. Methods We conducted an economic analysis of NIPD implementation in England and Wales. Two scenarios were considered. Scenario 1 assumed that NIPD will be only used to target antenatal prophylaxis with serology tests continuing to direct post-delivery prophylaxis. In Scenario 2, NIPD would also displace postnatal serology testing if an RhD negative fetus was identified. Costs were estimated from the provider's perspective for both scenarios together with a threshold royalty fee per test. Incremental costs were compared with clinical implications. Results The basic cost of an NIPD in-house test is £16.25 per sample (excluding royalty fee). The two-dose antenatal prophylaxis policy recommended by NICE is estimated to cost the NHS £3.37 million each year. The estimated threshold royalty fee is £2.18 and £8.83 for Scenarios 1 and 2 respectively. At a £2.00 royalty fee, mass NIPD testing would produce no saving for Scenario 1 and £507,154 per annum for Scenario 2. Incremental cost-effectiveness analysis indicates that, at a test sensitivity of 99.7% and this royalty fee, NIPD testing in Scenario 2 will generate one additional sensitisation for every £9,190 saved. If a single-dose prophylaxis policy were implemented nationally, as recently recommended by NICE, Scenario 2 savings would fall. Conclusions Currently, NIPD testing to target anti-D prophylaxis is unlikely to be sufficiently cost-effective to warrant its large scale introduction in England and Wales. Only minor savings are calculated and, balanced against this, the predicted increase in maternal sensitisations may be unacceptably high. Reliability of NIPD assays still needs to be demonstrated rigorously in different ethnic minority populations. First trimester testing is unlikely to alter this picture significantly although other emerging technologies may. PMID:21244652
Szczepura, Ala; Osipenko, Leeza; Freeman, Karoline
2011-01-18
Postnatal and antenatal anti-D prophylaxis have dramatically reduced maternal sensitisations and cases of rhesus disease in babies born to women with RhD negative blood group. Recent scientific advances mean that non-invasive prenatal diagnosis (NIPD), based on the presence of cell-free fetal DNA in maternal plasma, could be used to target prophylaxis on "at risk" pregnancies where the fetus is RhD positive. This paper provides the first assessment of cost-effectiveness of NIPD-targeted prophylaxis compared to current policies. We conducted an economic analysis of NIPD implementation in England and Wales. Two scenarios were considered. Scenario 1 assumed that NIPD will be only used to target antenatal prophylaxis with serology tests continuing to direct post-delivery prophylaxis. In Scenario 2, NIPD would also displace postnatal serology testing if an RhD negative fetus was identified. Costs were estimated from the provider's perspective for both scenarios together with a threshold royalty fee per test. Incremental costs were compared with clinical implications. The basic cost of an NIPD in-house test is £16.25 per sample (excluding royalty fee). The two-dose antenatal prophylaxis policy recommended by NICE is estimated to cost the NHS £3.37 million each year. The estimated threshold royalty fee is £2.18 and £8.83 for Scenarios 1 and 2 respectively. At a £2.00 royalty fee, mass NIPD testing would produce no saving for Scenario 1 and £507,154 per annum for Scenario 2. Incremental cost-effectiveness analysis indicates that, at a test sensitivity of 99.7% and this royalty fee, NIPD testing in Scenario 2 will generate one additional sensitisation for every £9,190 saved. If a single-dose prophylaxis policy were implemented nationally, as recently recommended by NICE, Scenario 2 savings would fall. Currently, NIPD testing to target anti-D prophylaxis is unlikely to be sufficiently cost-effective to warrant its large scale introduction in England and Wales. Only minor savings are calculated and, balanced against this, the predicted increase in maternal sensitisations may be unacceptably high. Reliability of NIPD assays still needs to be demonstrated rigorously in different ethnic minority populations. First trimester testing is unlikely to alter this picture significantly although other emerging technologies may.
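The break-even logic behind the quoted threshold royalty fee can be illustrated with a small calculation; assuming the annual saving varies linearly with the per-test fee (an assumption made here only for illustration, not stated in the study), the Scenario 2 figures above imply a volume of roughly 74,000 tests per year.

```python
# Toy sketch of the break-even logic implied by the Scenario 2 figures quoted
# above: a £2.00 royalty fee yields ~£507,154/yr saving and £8.83 is the
# threshold (zero-saving) fee. Linearity in the per-test fee is an assumption
# made here for illustration only.
SAVING_AT_2, FEE_LOW, FEE_THRESHOLD = 507_154.0, 2.00, 8.83

tests_per_year = SAVING_AT_2 / (FEE_THRESHOLD - FEE_LOW)   # ~74,000 tests/yr implied
for fee in (2.00, 5.00, 8.83):
    saving = tests_per_year * (FEE_THRESHOLD - fee)
    print(f"royalty £{fee:.2f}: implied saving £{saving:,.0f} per year")
```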
Nour, Svetlana; LaRosa, Jerry; Inn, Kenneth G W
2011-08-01
The present challenge for the international emergency radiobioassay community is to analyze contaminated samples rapidly while maintaining high quality results. The National Institute of Standards and Technology (NIST) runs a radiobioassay measurement traceability testing program to evaluate the radioanalytical capabilities of participating laboratories. The NIST Radiochemistry Intercomparison Program (NRIP) started more than 10 years ago, and emergency performance testing was added to the program seven years ago. Radiobioassay turnaround times under the NRIP program for routine production and under emergency response scenarios are 60 d and 8 h, respectively. Because measurement accuracy and sample turnaround time are very critical in a radiological emergency, response laboratories' analytical systems are best evaluated and improved through traceable Performance Testing (PT) programs. The NRIP provides participant laboratories with metrology tools to evaluate their performance and to improve it. The program motivates the laboratories to optimize their methodologies and minimize the turnaround time of their results. Likewise, NIST has to make adjustments and periodical changes in the bioassay test samples in order to challenge the participating laboratories continually. With practice, radioanalytical measurements turnaround time can be reduced to 3-4 h.
Use of Model Payload for Europa Mission Development
NASA Technical Reports Server (NTRS)
Lewis, Kari; Klaasan, Ken; Susca, Sara; Oaida, Bogdan; Larson, Melora; Vanelli, Tony; Murray, Alex; Jones, Laura; Thomas, Valerie; Frank, Larry
2016-01-01
This paper discusses the basis for the Model Payload and how it was used to develop the mission design, observation and data acquisition strategy, needed spacecraft capabilities, spacecraft-payload interface needs, mission system requirements and operational scenarios.
Direct Observation of Accretion onto a Hypernova's Newly Formed Black Hole
NASA Astrophysics Data System (ADS)
Milisavljevic, Dan
2017-09-01
Models of energetic core-collapse supernovae and long-duration gamma-ray bursts often invoke engine-driven scenarios associated with the formation of compact objects that input energy into the explosion. To date, only indirect evidence of black holes or magnetars formed in these events exists, from observations obtained when the explosions are most luminous. Here we request a modest 15 ks Chandra pilot observation of the exceptionally important nearby hypernova SN2002ap to test models that predict the X-ray emission associated with its remnant black hole to be detectable after 15 yr of ejecta expansion. Direct observation of a newly formed "baby" black hole would be a landmark discovery capable of opening up new ways to investigate fundamental aspects of the core-collapse process.
OntoGene web services for biomedical text mining.
Rinaldi, Fabio; Clematide, Simon; Marques, Hernani; Ellendorff, Tilia; Romacker, Martin; Rodriguez-Esteban, Raul
2014-01-01
Text mining services are rapidly becoming a crucial component of various knowledge management pipelines, for example in the process of database curation, or for exploration and enrichment of biomedical data within the pharmaceutical industry. Traditional architectures, based on monolithic applications, do not offer sufficient flexibility for a wide range of use case scenarios, and therefore open architectures, as provided by web services, are attracting increased interest. We present an approach towards providing advanced text mining capabilities through web services, using a recently proposed standard for textual data interchange (BioC). The web services leverage a state-of-the-art platform for text mining (OntoGene) which has been tested in several community-organized evaluation challenges, with top ranked results in several of them.
Fritscher, Karl; Schuler, Benedikt; Link, Thomas; Eckstein, Felix; Suhm, Norbert; Hänni, Markus; Hengg, Clemens; Schubert, Rainer
2008-01-01
Fractures of the proximal femur are one of the principal causes of mortality among elderly persons. Traditional methods for determining femoral fracture risk rely on measurements of bone mineral density (BMD). However, BMD alone is not sufficient to predict bone failure load for an individual patient, and additional parameters have to be determined for this purpose. In this work, an approach that uses statistical models of appearance to identify relevant regions and parameters for the prediction of biomechanical properties of the proximal femur is presented. Using Support Vector Regression, the proposed model-based approach is capable of predicting two different biomechanical parameters accurately and fully automatically in two different testing scenarios.
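The regression step can be sketched with scikit-learn's Support Vector Regression applied to synthetic appearance-model parameters; the features, target, and numbers below are stand-ins for illustration, not the authors' pipeline or patient data.

```python
# Minimal sketch (synthetic data) of Support Vector Regression mapping
# appearance-model mode weights to a biomechanical target such as failure load.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 10))                     # appearance-model mode weights
failure_load = 4000 + 800 * X[:, 0] - 300 * X[:, 1] + rng.normal(0, 150, 120)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=10.0))
model.fit(X[:100], failure_load[:100])             # train on the first 100 "cases"
print("predicted failure load [N]:", model.predict(X[100:105]).round(0))
```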
Integration of the Electrodynamic Dust Shield on a Lunar Habitat Demonstration Unit
NASA Technical Reports Server (NTRS)
Calle, C. I.; Immer, C. D.; Ferreira, J.; Hogue, M. D.; Chen, A.; Csonka, M. W.; VanSuetendael, N.; Snyder, S. J.
2010-01-01
NASA is developing a Habitat Demonstration Unit (HDU) to investigate the feasibility of lunar surface technologies and lunar ground operations. The HDU will define and validate lunar scenario architecture through field analog testing. It will contain a four-port vertical habitat module with docking demonstration capabilities. The Electrodynamic Dust Shield (EDS) is being incorporated into the HDU to demonstrate dust removal from a viewport and from a door prior to docking procedures. In this paper, we describe our efforts to scale up the EDS to protect a viewport 20 cm in diameter. We also describe the development of several 20 cm x 25 cm EDS patches to demonstrate dust removal from one of the HDU doors.
OntoGene web services for biomedical text mining
2014-01-01
Text mining services are rapidly becoming a crucial component of various knowledge management pipelines, for example in the process of database curation, or for exploration and enrichment of biomedical data within the pharmaceutical industry. Traditional architectures, based on monolithic applications, do not offer sufficient flexibility for a wide range of use case scenarios, and therefore open architectures, as provided by web services, are attracting increased interest. We present an approach towards providing advanced text mining capabilities through web services, using a recently proposed standard for textual data interchange (BioC). The web services leverage a state-of-the-art platform for text mining (OntoGene) which has been tested in several community-organized evaluation challenges, with top ranked results in several of them. PMID:25472638
NASA Technical Reports Server (NTRS)
Hall, Laverne
1995-01-01
Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment and subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.
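As a generic illustration of the kind of question such a performance model answers, the sketch below uses the simpy discrete-event library (an assumption chosen here for illustration; the original MIPS model was built in NETWORK 2.5) to estimate queueing delay for image products contending for a shared processing resource; all rates are invented.

```python
# Generic discrete-event sketch (simpy, NOT the NETWORK 2.5 tool named above):
# how long do image products queue for a shared processor at a given load?
import random
import simpy

RNG = random.Random(3)
waits = []

def image_product(env, cpu):
    """One image product: queue for the shared processor, then get processed."""
    arrival = env.now
    with cpu.request() as req:
        yield req                                     # wait for a free processor
        waits.append(env.now - arrival)
        yield env.timeout(RNG.expovariate(1 / 8.0))   # ~8 min mean processing time

def workload(env, cpu):
    """Products arrive at random, ~10 min apart on average."""
    while True:
        yield env.timeout(RNG.expovariate(1 / 10.0))
        env.process(image_product(env, cpu))

env = simpy.Environment()
cpu = simpy.Resource(env, capacity=1)
env.process(workload(env, cpu))
env.run(until=8 * 60)                                 # simulate an 8-hour shift
print(f"products served: {len(waits)}, "
      f"mean wait for the processor: {sum(waits) / len(waits):.1f} min")
```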
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.
Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.
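The flavour of unit test proposed, a small check of an essential function such as mass balance, can be sketched as follows; the toy facility model is invented for illustration and is not drawn from any actual fuel cycle simulator.

```python
# Minimal sketch of a mass-balance unit test on a toy reprocessing facility
# model. The facility model is a stand-in written for illustration only.
import unittest

def reprocess(feed_kg, recovery_fraction=0.99):
    """Split a spent-fuel feed into recovered product and waste streams."""
    product = feed_kg * recovery_fraction
    waste = feed_kg - product
    return product, waste

class MassBalanceTest(unittest.TestCase):
    def test_streams_sum_to_feed(self):
        feed = 1000.0  # kg heavy metal
        product, waste = reprocess(feed)
        self.assertAlmostEqual(product + waste, feed, places=9)

if __name__ == "__main__":
    unittest.main()
```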
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linke, J.; Bolt, H.; Breitbach, G.
1994-12-31
To assess the lifetime and the long-term heat removal capabilities of plasma facing components in future thermonuclear fusion reactors such as ITER, neutron irradiation and subsequent high heat flux tests will be most essential. The effect of neutron damage will be simulated in material test reactors (such as the HFR-Petten) in a fission neutron environment. To investigate the heat loads during normal and off-normal operation scenarios, a 60 kW electron beam test stand (Juelich Divertor Test Facility in Hot Cells, JUDITH) has been installed in a hot cell which can be operated by remote handling techniques. In this facility, inertially cooled test coupons can be handled as well as small actively cooled divertor mock-ups. A special clamping mechanism for small test coupons (25 mm x 25 mm x 35 mm) with an integrated coolant channel within a copper or TZM heat sink has been developed and tested in an electron beam test bed. This method is an attractive alternative to costly large-scale tests on complete divertor modules. The temperature and stress fields in individual CFC or beryllium tiles brazed to a metallic heat sink (e.g. copper or TZM) can be investigated before and after neutron irradiation with moderate effort.
Steady state scenario development with elevated minimum safety factor on DIII-D
Holcomb, Christopher T.; Ferron, John R.; Luce, Timothy C.; ...
2014-08-15
On DIII-D, a high-β scenario with minimum safety factor (qmin) near 1.4 has been optimized with new tools and shown to be a favourable candidate for long-pulse or steady-state operation in future devices. Furthermore, the new capability to redirect up to 5 MW of neutral beam injection (NBI) from on- to off-axis improves the ability to sustain elevated qmin with a less peaked pressure profile. The observed changes increase the ideal magnetohydrodynamic (MHD) n = 1 mode βN limit, thus providing a path forward for increasing the noninductive current drive fraction by operating at high βN. Quasi-stationary discharges free of tearing modes have been sustained at βN = 3.5 and βT = 3.6% for two current profile diffusion timescales (about 3 s), limited by neutral beam duration. The discharges have normalized fusion performance expected to give fusion gain Q ≈ 5 in a device the size of ITER. Analysis of the poloidal flux evolution and current drive balance shows that the loop voltage profile is almost relaxed even with 25% of the current driven inductively, and qmin remains elevated near 1.4. Our observations increase confidence that the current profile will not evolve to one unstable to a tearing mode. In preliminary tests, a divertor heat flux reduction technique based on producing a radiating mantle with neon injection appears compatible with this operating scenario. 0D model extrapolations suggest it may be possible to push this scenario up to 100% noninductive current drive by raising βN. Similar discharges with qmin = 1.5–2 were susceptible to tearing modes and off-axis fishbones, and with qmin > 2 lower normalized global energy confinement time is observed.
Europa Explorer Operational Scenarios Development
NASA Technical Reports Server (NTRS)
Lock, Robert E.; Pappalardo, Robert T.; Clark, Karla B.
2008-01-01
In 2007, NASA conducted four advanced mission concept studies for outer planets targets: Europa, Ganymede, Titan and Enceladus. The studies were conducted in close cooperation with the planetary science community. Of the four, the Europa Explorer Concept Study focused on refining mission options, science trades and implementation details for a potential flagship mission to Europa in the 2015 timeframe. A science definition team (SDT) was appointed by NASA to guide the study. A JPL-led engineering team worked closely with the science team to address 3 major focus areas: 1) credible cost estimates, 2) rationale and logical discussion of radiation risk and mitigation approaches, and 3) better definition and exploration of science operational scenario trade space. This paper will address the methods and results of the collaborative process used to develop Europa Explorer operations scenarios. Working in concert with the SDT, and in parallel with the SDT's development of a science value matrix, key mission capabilities and constraints were challenged by the science and engineering members of the team. Science goals were advanced and options were considered for observation scenarios. Data collection and return strategies were tested via simulation, and mission performance was estimated and balanced with flight and ground system resources and science priorities. The key to this successful collaboration was a concurrent development environment in which all stakeholders could rapidly assess the feasibility of strategies for their success in the full system context. Issues of science and instrument compatibility, system constraints, and mission opportunities were treated analytically and objectively leading to complementary strategies for observation and data return. Current plans are that this approach, as part of the system engineering process, will continue as the Europa Explorer Concept Study moves toward becoming a development project.
NASA Astrophysics Data System (ADS)
Wang, G.; Mayes, M. A.
2017-12-01
Microbially-explicit soil organic matter (SOM) decomposition models are thought to be more biologically realistic than conventional models. Current testing and evaluation of microbial models mainly uses steady-state analysis with time-invariant forcing (i.e., constant soil temperature, moisture, and litter input). The findings from such simplified analyses are assumed to be capable of representing the model responses under field soil conditions with seasonal driving forces. Here we show that simulations with seasonal forcing may yield findings distinct from those of steady-state simulations with time-invariant forcing data. We evaluate the response of soil organic C (SOC) to litter addition (L+) in a subtropical pine forest using the calibrated Microbial-ENzyme Decomposition (MEND) model. We implemented two sets of modeling analyses, each including two scenarios, i.e., control (CR) vs. litter addition (L+). The first set (Set1) uses fixed soil temperature and moisture, with constant litter input under Scenario CR and increased constant litter input under Scenario L+. The second set (Set2) employs hourly soil temperature and moisture and monthly litter input under Scenario CR. Under Scenario L+ of Set2, a logistic function with an upper plateau represents the increasing trend of litter input to SOM. We conduct long-term simulations to ensure that the models reach steady state for Set1 or dynamic equilibrium for Set2. In Set2, litter addition increases SOC by 29%. However, the steady-state SOC pool sizes of Set1 would not respond to L+ as long as the chemical composition of the litter remained the same. Our results indicate the necessity of implementing dynamic model simulations with seasonal forcing data, which can lead to modeling results qualitatively different from those of steady-state analysis with time-invariant forcing data.
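The litter-addition forcing of Set2, a logistic rise toward an upper plateau, can be sketched as below; the parameter values are illustrative assumptions, not the calibrated MEND inputs.

```python
# Minimal sketch (assumed parameter values) of a logistic litter-input forcing
# with an upper plateau, of the kind used for the litter-addition scenario.
import numpy as np

def litter_input(t_years, base=300.0, plateau=450.0, k=0.8, t_mid=5.0):
    """Litter input rising logistically from `base` toward `plateau`;
    all parameter values here are illustrative, not calibrated."""
    return base + (plateau - base) / (1.0 + np.exp(-k * (t_years - t_mid)))

t = np.arange(0, 20, 0.5)
print(litter_input(t[[0, 10, 39]]).round(1))   # near base, at the midpoint, near plateau
```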
Morgan, Simon; Morgan, Andy; Kerr, Rohan; Tapley, Amanda; Magin, Parker
2016-01-01
Abstract Objective To assess the effectiveness of an educational intervention on test-ordering attitudes and intended practice of GP trainees, and any associations between changes in test ordering and trainee characteristics. Design Preworkshop and postworkshop survey of attitudes to test ordering, intended test-ordering practices for 3 clinical scenarios (fatigue, screening, and shoulder pain), and tolerance for uncertainty. Setting Three Australian regional general practice training providers. Participants General practice trainees (N = 167). Intervention A 2-hour workshop session and an online module. Main outcome measures Proportion of trainees who agreed with attitudinal statements before and after the workshop; proportion of trainees who would order tests, mean number of tests ordered, and number of appropriate and inappropriate tests ordered for each scenario before and after the workshop. Results Of 167 trainees, 132 (79.0%) completed both the preworkshop and postworkshop questionnaires. A total of 122 trainees attended the workshop. At baseline, 88.6% thought that tests can harm patients, 84.8% believed overtesting was a problem, 72.0% felt pressured by patients, 52.3% believed that tests would reassure patients, and 50.8% thought that they were less likely to be sued if they ordered tests. There were desirable changes in all attitudes after the workshop. Before the workshop, the mean number of tests that trainees would have ordered was 4.4, 4.8, and 1.5 for the fatigue, screening, and shoulder pain scenarios, respectively. After the workshop there were decreases in the mean number of both appropriate tests (decrease of 0.94) and inappropriate tests (decrease of 0.24) in the fatigue scenario; there was no change in the mean number of appropriate tests and a decrease in inappropriate tests (decrease of 0.76) in the screening scenario; and there was an increase in the proportion of trainees who would appropriately not order tests in the shoulder pain scenario. There were no significant associations between changes in test ordering and trainee demographic characteristics or tolerance for uncertainty subscale scores. Conclusion General practice trainees have conflicting attitudes to test ordering and demonstrate nonrational test ordering in 3 common scenarios. A workshop on rational test ordering led to desirable changes in attitudes and more rational intended test ordering. Our findings inform the development of appropriate educational interventions that address nonrational testing in family medicine. PMID:27629671
Morgan, Simon; Morgan, Andy; Kerr, Rohan; Tapley, Amanda; Magin, Parker
2016-09-01
To assess the effectiveness of an educational intervention on test-ordering attitudes and intended practice of GP trainees, and any associations between changes in test ordering and trainee characteristics. Preworkshop and postworkshop survey of attitudes to test ordering, intended test-ordering practices for 3 clinical scenarios (fatigue, screening, and shoulder pain), and tolerance for uncertainty. Three Australian regional general practice training providers. General practice trainees (N = 167). A 2-hour workshop session and an online module. Proportion of trainees who agreed with attitudinal statements before and after the workshop; proportion of trainees who would order tests, mean number of tests ordered, and number of appropriate and inappropriate tests ordered for each scenario before and after the workshop. Of 167 trainees, 132 (79.0%) completed both the preworkshop and postworkshop questionnaires. A total of 122 trainees attended the workshop. At baseline, 88.6% thought that tests can harm patients, 84.8% believed overtesting was a problem, 72.0% felt pressured by patients, 52.3% believed that tests would reassure patients, and 50.8% thought that they were less likely to be sued if they ordered tests. There were desirable changes in all attitudes after the workshop. Before the workshop, the mean number of tests that trainees would have ordered was 4.4, 4.8, and 1.5 for the fatigue, screening, and shoulder pain scenarios, respectively. After the workshop there were decreases in the mean number of both appropriate tests (decrease of 0.94) and inappropriate tests (decrease of 0.24) in the fatigue scenario; there was no change in the mean number of appropriate tests and a decrease in inappropriate tests (decrease of 0.76) in the screening scenario; and there was an increase in the proportion of trainees who would appropriately not order tests in the shoulder pain scenario. There were no significant associations between changes in test ordering and trainee demographic characteristics or tolerance for uncertainty subscale scores. General practice trainees have conflicting attitudes to test ordering and demonstrate nonrational test ordering in 3 common scenarios. A workshop on rational test ordering led to desirable changes in attitudes and more rational intended test ordering. Our findings inform the development of appropriate educational interventions that address nonrational testing in family medicine. Copyright© the College of Family Physicians of Canada.
Scenario analysis and strategic planning: practical applications for radiology practices.
Lexa, Frank James; Chan, Stephen
2010-05-01
Modern business science has many tools that can be of great value to radiologists and their practices. One of the most important and underused is long-term planning. Part of the problem has been the pace of change. Making a 5-year plan makes sense only if you develop robust scenarios of the possible future conditions you will face. Scenario analysis is one of many highly regarded tools that can improve your predictive capability. However, as with many tools, it pays to have some training and to get practical tips on how to improve its value. It also helps to learn from other people's mistakes rather than your own. The authors discuss both theoretical and practical issues in using scenario analysis to improve your planning process. They discuss actionable ways this set of tools can be applied in a group meeting or retreat. Copyright (c) 2010 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Severe Accident Test Station Design Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snead, Mary A.; Yan, Yong; Howell, Michael
The purpose of the ORNL severe accident test station (SATS) is to provide a platform for evaluation of advanced fuels under projected beyond design basis accident (BDBA) conditions. The SATS delivers the capability to map the behavior of advanced fuel concepts under accident scenarios across various temperature and pressure profiles, steam and steam-hydrogen gas mixtures, and thermal shock. The overall facility will include parallel capabilities for examination of fuels and irradiated materials (in-cell) and non-irradiated materials (out-of-cell) at BDBA conditions as well as design basis accident (DBA) or loss of coolant accident (LOCA) conditions. A supporting analytical infrastructure to provide the data needs for the fuel-modeling components of the Fuel Cycle Research and Development (FCRD) program will also be put in place in parallel. This design report contains the information for the first, second and third phases of design and construction of the SATS. The first phase consisted of the design and construction of an out-of-cell BDBA module intended for examination of non-irradiated materials. The second phase of this work was to construct the BDBA in-cell module to test irradiated fuels and materials as well as the module for DBA (i.e., LOCA) testing out-of-cell. The third phase was to build the in-cell DBA module. The details of the design constraints and requirements for the in-cell facility have been closely captured during the deployment of the out-of-cell SATS modules to ensure effective future implementation of the in-cell modules.
ARCADE small-scale docking mechanism for micro-satellites
NASA Astrophysics Data System (ADS)
Boesso, A.; Francesconi, A.
2013-05-01
The development of on-orbit autonomous rendezvous and docking (ARD) capabilities represents a key point for a number of appealing mission scenarios that include activities of on-orbit servicing, automated assembly of modular structures and active debris removal. As of today, especially in the field of micro-satellites ARD, many fundamental technologies are still missing or require further developments and micro-gravity testing. In this framework, the University of Padova, Centre of Studies and Activities for Space (CISAS), developed the Autonomous Rendezvous Control and Docking Experiment (ARCADE), a technology demonstrator intended to fly aboard a BEXUS stratospheric balloon. The goal was to design, build and test, in critical environment conditions, a proximity relative navigation system, a custom-made reaction wheel and a small-size docking mechanism. The ARCADE docking mechanism was designed against a comprehensive set of requirements and it can be classified as small-scale, central, gender mating and unpressurized. The large use of commercial components makes it low-cost and simple to be manufactured. Last, it features a good tolerance to off-nominal docking conditions and a by-design soft docking capability. The final design was extensively verified to be compliant with its requirements by means of numerical simulations and physical testing. In detail, the dynamic behaviour of the mechanism in both nominal and off-nominal conditions was assessed with the multibody dynamics analysis software MD ADAMS 2010 and functional tests were carried out within the fully integrated ARCADE experiment to ensure the docking system efficacy and to highlight possible issues. The most relevant results of the study will be presented and discussed in conclusion to this paper.
Wound Ballistics Modeling for Blast Loading Blunt Force Impact and Projectile Penetration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Paul A.
Light body armor development for the warfighter is based on trial-and-error testing of prototype designs against ballistic projectiles. Torso armor testing against blast is nonexistent but necessary to protect the heart and lungs. In tests against ballistic projectiles, protective apparel is placed over ballistic clay and the projectiles are fired into the armor/clay target. The clay represents the human torso and its behind-armor, permanent deflection is the principal metric used to assess armor protection. Although this approach provides relative merit assessment of protection, it does not examine the behind-armor blunt trauma to crucial torso organs. We propose a modeling and simulation (M&S) capability for wound injury scenarios to the head, neck, and torso of the warfighter. We will use this toolset to investigate the consequences of, and mitigation against, blast exposure, blunt force impact, and ballistic projectile penetration leading to damage of critical organs comprising the central nervous, cardiovascular, and respiratory systems. We will leverage Sandia codes and our M&S expertise on traumatic brain injury to develop virtual anatomical models of the head, neck, and torso and the simulation methodology to capture the physics of wound mechanics. Specifically, we will investigate virtual wound injuries to the head, neck, and torso without and with protective armor to demonstrate the advantages of performing injury simulations for the development of body armor. The proposed toolset constitutes a significant advance over current methods by providing a virtual simulation capability to investigate wound injury and optimize armor design without the need for extensive field testing.
Constraints and Approach for Selecting the Mars Surveyor '01 Landing Site
NASA Technical Reports Server (NTRS)
Golombek, M.; Bridges, N.; Gilmore, M.; Haldemann, A.; Parker, T.; Saunders, R.; Spencer, D.; Smith, J.; Weitz, C.
1999-01-01
There are many similarities between the Mars Surveyor '01 (MS '01) landing site selection process and that of Mars Pathfinder. The selection process includes two parallel activities in which engineers define and refine the capabilities of the spacecraft through design, testing and modeling and scientists define a set of landing site constraints based on the spacecraft design and landing scenario. As for Pathfinder, the safety of the site is without question the single most important factor, for the simple reason that failure to land safely yields no science and exposes the mission and program to considerable risk. The selection process must be thorough and defensible and capable of surviving multiple withering reviews similar to the Pathfinder decision. On Pathfinder, this was accomplished by attempting to understand the surface properties of sites using available remote sensing data sets and models based on them. Science objectives are factored into the selection process only after the safety of the site is validated. Finally, as for Pathfinder, the selection process is being done in an open environment with multiple opportunities for community involvement including open workshops, with education and outreach opportunities.
Constraints, Approach and Present Status for Selecting the Mars Surveyor 2001 Landing Site
NASA Technical Reports Server (NTRS)
Golombek, M.; Anderson, F.; Bridges, N.; Briggs, G.; Gilmore, M.; Gulick, V.; Haldemann, A.; Parker, T.; Saunders, R.; Spencer, D.;
1999-01-01
There are many similarities between the Mars Surveyor '01 (MS '01) landing site selection process and that of Mars Pathfinder. The selection process includes two parallel activities in which engineers define and refine the capabilities of the spacecraft through design, testing and modeling and scientists define a set of landing site constraints based on the spacecraft design and landing scenario. As for Pathfinder, the safety of the site is without question the single most important factor, for the simple reason that failure to land safely yields no science and exposes the mission and program to considerable risk. The selection process must be thorough, defensible and capable of surviving multiple withering reviews similar to the Pathfinder decision. On Pathfinder, this was accomplished by attempting to understand the surface properties of sites using available remote sensing data sets and models based on them. Science objectives are factored into the selection process only after the safety of the site is validated. Finally, as for Pathfinder, the selection process is being done in an open environment with multiple opportunities for community involvement including open workshops, with education and outreach opportunities.
Archaeogeophysical tests in water saturated and under water scenarios at the Hydrogeosite Laboratory
NASA Astrophysics Data System (ADS)
Capozzoli, Luigi; De Martino, Gregory; Giampaolo, Valeria; Perciante, Felice; Rizzo, Enzo
2016-04-01
The growing interest in underwater archaeology, witnessed by numerous archaeological campaigns carried out in marine and lacustrine environments across the Mediterranean region, poses a challenge of great importance for the archaeogeophysical discipline. Careful use of geophysical techniques can support archaeological research in identifying and analysing undiscovered cultural heritage submerged near rivers and the sea. Over the past decades, geophysical methods have been applied successfully in archaeology: an integrated approach based on electric, electromagnetic and magnetic techniques has shown the ability to detect and reconstruct archaeological remains in the subsoil and to define their spatial distribution, thereby limiting excavation activities. The capability of geophysics can nevertheless be limited by the low geophysical contrast between archaeological structures and the surrounding environment; in particular, problems of resolution, depth of investigation and sensitivity associated with each technique can distort the reading of the subsurface and prevent the identification of archaeological remains. This problem is amplified when the geophysical approach is applied in very humid environments, such as lacustrine and marine scenarios, or in soils with high clay content that hinder the propagation of geophysical signals. To improve our geophysical knowledge in lacustrine and coastal scenarios, a complex and innovative research project was carried out at the CNR Hydrogeosite laboratory, which made it possible to perform an archaeogeophysical experiment under controlled conditions. The designed archaeological context focused on the Roman age, and various elements of different shapes and materials were placed at different depths in the subsoil. The preliminary project activities with some scenarios were presented last year; here we present the final results of the project, in which different scenarios were set up for GPR and ERT investigations. Several phases were performed: buried objects were covered by different thicknesses of sediment and different soil water contents were defined, and geophysical measurements were also acquired in an underwater scenario. The 2D and 3D acquisitions allowed the limits and capabilities of the GPR and resistivity measurements to be identified.
Thermal Model Predictions of Advanced Stirling Radioisotope Generator Performance
NASA Technical Reports Server (NTRS)
Wang, Xiao-Yen J.; Fabanich, William Anthony; Schmitz, Paul C.
2014-01-01
This presentation describes the capabilities of a three-dimensional thermal power model of the Advanced Stirling Radioisotope Generator (ASRG). The performance of the ASRG is presented for different scenarios, such as a Venus flyby with or without the auxiliary cooling system.
Comprehension and engagement in survey interviews with virtual agents
Conrad, Frederick G.; Schober, Michael F.; Jans, Matt; Orlowski, Rachel A.; Nielsen, Daniel; Levenstein, Rachel
2015-01-01
This study investigates how an onscreen virtual agent's dialog capability and facial animation affect survey respondents' comprehension and engagement in “face-to-face” interviews, using questions from US government surveys whose results have far-reaching impact on national policies. In the study, 73 laboratory participants were randomly assigned to respond in one of four interviewing conditions, in which the virtual agent had either high or low dialog capability (implemented through Wizard of Oz) and high or low facial animation, based on motion capture from a human interviewer. Respondents, whose faces were visible to the Wizard (and videorecorded) during the interviews, answered 12 questions about housing, employment, and purchases on the basis of fictional scenarios designed to allow measurement of comprehension accuracy, defined as the fit between responses and US government definitions. Respondents answered more accurately with the high-dialog-capability agents, requesting clarification more often particularly for ambiguous scenarios; and they generally treated the high-dialog-capability interviewers more socially, looking at the interviewer more and judging high-dialog-capability agents as more personal and less distant. Greater interviewer facial animation did not affect response accuracy, but it led to more displays of engagement—acknowledgments (verbal and visual) and smiles—and to the virtual interviewer's being rated as less natural. The pattern of results suggests that a virtual agent's dialog capability and facial animation differently affect survey respondents' experience of interviews, behavioral displays, and comprehension, and thus the accuracy of their responses. The pattern of results also suggests design considerations for building survey interviewing agents, which may differ depending on the kinds of survey questions (sensitive or not) that are asked. PMID:26539138
Benefit Assessment of the Precision Departure Release Capability Concept
NASA Technical Reports Server (NTRS)
Palopo, Kee; Chatterji, Gano B.; Lee, Hak-Tae
2011-01-01
A Precision Departure Release Capability concept is being evaluated by both the National Aeronautics and Space Administration and the Federal Aviation Administration as part of a larger goal of improving throughput, efficiency and capacity in integrated departure, arrival and surface operations. The concept is believed to have the potential to increase flight efficiency and throughput by avoiding missed assigned slots and minimizing the speed increase or path stretch needed to recover a slot. The main thrust of the paper is determining the impact of early and late departures from the departure runway when an aircraft has a slot assigned either at a meter fix or at the arrival airport. Results reported in the paper are for two scenarios. The first scenario considers flights out of Dallas/Fort Worth destined for Hartsfield-Jackson International Airport in Atlanta flying through the Meridian meter fix in the Memphis Center with miles-in-trail constraints. The second scenario considers flights destined for George Bush Intercontinental/Houston Airport with a specified airport arrival rate constraint. Results show that delay reduction can be achieved by allowing reasonable speed changes in scheduling. It was determined that the traffic volume between Dallas/Fort Worth and Atlanta via the Meridian fix is low and the departure times are spread enough that large departure schedule uncertainty can be tolerated. Flights can depart early or late within 90 minutes without accruing much more delay due to the miles-in-trail constraint at the Meridian fix. In the Houston scenario, 808 arrivals from 174 airports were considered. Results show that delay experienced by the 16 Dallas/Fort Worth departures is higher if the initial schedules of the remaining 792 flights are kept unaltered while they are rescheduled. Analysis shows that the probability of getting the initially assigned slot back after perturbation and rescheduling decreases with increasing standard deviation of the departure delay distribution. Results show that most Houston arrivals can be expected to be on time based on the assumed zero-mean Normal departure delay distributions achievable by the Precision Departure Release Capability. In the current system, airport-departure delay, which is the sum of gate-departure delay and taxi-out delay, is observed at the airports. This delay acts as a bias, which can be reduced by the Precision Departure Release Capability.
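To make the relationship between departure-time uncertainty and slot recovery concrete, the short Monte Carlo sketch below estimates the chance that a zero-mean Normal departure error stays inside an assigned slot window as the standard deviation grows. The ±5-minute slot tolerance and the sigma values are illustrative assumptions, not parameters from the study.

```python
# Illustrative Monte Carlo sketch (not the paper's model): probability that a
# departure stays within an assumed +/-5 min slot window as the standard
# deviation of a zero-mean Normal departure error grows.
import random

def slot_hold_probability(sigma_min, tolerance_min=5.0, trials=100_000, rng=None):
    """Fraction of simulated departure errors that fall inside the slot window."""
    rng = rng or random.Random(1)
    hits = sum(abs(rng.gauss(0.0, sigma_min)) <= tolerance_min for _ in range(trials))
    return hits / trials

for sigma in (1, 5, 15, 90):   # minutes
    print(f"sigma = {sigma:2d} min -> P(keep slot) ~ {slot_hold_probability(sigma):.2f}")
```

As expected, the probability falls monotonically with sigma, which is the qualitative trend the abstract reports.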
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paladino, Domenico; Auban, Olivier; Zboray, Robert
The benefits of using codes with 3-D capabilities to address safety issues of LWRs will be applicable both to the current generation of nuclear reactors and to future ALWRs. The phenomena governing the containment response in case of some postulated severe accident scenarios include gas (air, hydrogen, steam) stratification in the containment, gas distribution between containment compartments, wall condensation, etc. These phenomena are driven by buoyant high-momentum injection (jets) and/or low-momentum injection (plumes). For instance, mixing in the immediate vicinity of the postulated line break is mainly dominated by very high velocity efflux, while low-momentum flows are responsible for most of the transport processes within the containment. A project named SETH is currently in progress under the auspices of 15 OECD countries, with the aim of creating an experimental database suitable to assess the 3-D code capabilities in analyzing key physical phenomena relevant for LWR safety analysis. This paper describes some results of two SETH tests, performed in the PANDA facility (located at PSI in Switzerland), focusing on plumes flowing near a containment wall. The plumes are generated by injecting a constant amount of steam into one of two interconnected vessels initially filled with air. In one of the two tests, the temperature of the injected steam and the initial containment wall and fluid temperatures allowed for condensation during the test. (authors)
Closed Environment Module - modularization and extension of the V-HAB
NASA Astrophysics Data System (ADS)
Plötner, Peter; Czupalla, M. Markus; Zhukov, Anton
2012-07-01
The `Virtual Habitat' (V-HAB) is a Life Support System (LSS) simulation created to provide the possibility of dynamic simulation of LSS for future human spaceflight missions. V-HAB creates the option to optimize LSS during early design phases. Furthermore, it allows the simulation of, e.g., worst-case scenarios which cannot be tested in reality. In a nutshell, the tool allows the testing of LSS robustness by means of computer simulations. V-HAB is a modular simulation consisting of a Closed Environment Module (CEM), a Crew Module, a Biological Module and a Physio-Chemical Module. The focus of the paper will be the Closed Environment Module (CEM), which is the core of V-HAB. The main function of the CEM is the embedding of all modules in the entire simulation and the control of the LSS. The CEM includes the possibility to simulate an arbitrary number of compartments and tanks, with interaction between connected compartments. Furthermore, a control program to actuate the LSS technologies was implemented in the CEM and is also introduced. In this paper the capabilities of the CEM are introduced based on selected test cases. In particular the following capabilities are demonstrated: Supply; Leakage; ON/OFF controller; Power management; Un-/docking; Controller for tanks with maximum filling degree. The CEM of the V-HAB simulation was verified by simulating the Atmosphere Revitalization part of the ISS and comparing it to actual measurement data. The results of this analysis are also presented in the paper.
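As an illustration of one of the listed capabilities, the sketch below shows a generic hysteresis (ON/OFF) controller that keeps a tank below a maximum filling degree. It is not V-HAB code; the thresholds, fill rates, and time step are assumptions chosen only to make the sketch runnable.

```python
# Minimal sketch of an ON/OFF (hysteresis) controller for a tank with a maximum
# filling degree. Thresholds, rates, and time step are illustrative assumptions.
def on_off_controller(fill_fraction, pump_on, high=0.95, low=0.80):
    """Stop filling above `high`, resume below `low`, hold state in between."""
    if fill_fraction >= high:
        return False
    if fill_fraction <= low:
        return True
    return pump_on  # inside the deadband, keep the previous state

fill, pump_on, dt = 0.5, True, 60.0          # fill fraction, pump state, time step (s)
inflow, demand = 1.0e-4, 4.0e-5              # fractional fill change per second
for step in range(200):
    pump_on = on_off_controller(fill, pump_on)
    fill += ((inflow if pump_on else 0.0) - demand) * dt
    fill = max(0.0, min(1.0, fill))
print(f"final fill degree: {fill:.2f}, pump on: {pump_on}")
```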
Robustness of assembly supply chain networks by considering risk propagation and cascading failure
NASA Astrophysics Data System (ADS)
Tang, Liang; Jing, Ke; He, Jie; Stanley, H. Eugene
2016-10-01
An assembly supply chain network (ASCN) is composed of manufacturers located in different geographical regions. To analyze the robustness of this ASCN when it suffers from catastrophe disruption events, we construct a cascading failure model of risk propagation. In our model, different disruption scenarios s are considered and the probability equation of all disruption scenarios is developed. Using production capability loss as the robustness index (RI) of an ASCN, we conduct a numerical simulation to assess its robustness. Through simulation, we compare the network robustness at different values of linking intensity and node threshold and find that weak linking intensity or high node threshold increases the robustness of the ASCN. We also compare network robustness levels under different disruption scenarios.
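For readers unfamiliar with threshold-style cascading-failure models, the sketch below simulates one on a random dependency network and reports production capability loss as the robustness index. The graph model, failure rule, and parameter values are our own simplifying assumptions, not the paper's exact formulation.

```python
# Illustrative threshold-based cascading-failure simulation on a random supply
# network; parameters are assumptions, not the paper's values.
import random

def simulate_cascade(n=200, link_prob=0.05, threshold=0.5, seed_failures=5, rng=None):
    rng = rng or random.Random(0)
    # suppliers[i] = set of nodes that node i depends on
    suppliers = [{j for j in range(n) if j != i and rng.random() < link_prob} for i in range(n)]
    failed = set(rng.sample(range(n), seed_failures))
    changed = True
    while changed:
        changed = False
        for i in range(n):
            if i in failed or not suppliers[i]:
                continue
            lost = len(suppliers[i] & failed) / len(suppliers[i])
            if lost > threshold:          # node fails once too many suppliers are down
                failed.add(i)
                changed = True
    return len(failed) / n                # robustness index: production capability loss

print(f"capability loss: {simulate_cascade():.2f}")
```

Raising the node threshold in this toy model makes nodes harder to topple and the cascade smaller, which is the same qualitative direction the authors report.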
The Integrated Landscape Modeling partnership - Current status and future directions
Mushet, David M.; Scherff, Eric J.
2016-01-28
The Integrated Landscape Modeling (ILM) partnership is an effort by the U.S. Geological Survey (USGS) and U.S. Department of Agriculture (USDA) to identify, evaluate, and develop models to quantify services derived from ecosystems, with a focus on wetland ecosystems and conservation effects. The ILM partnership uses the Integrated Valuation of Ecosystem Services and Tradeoffs (InVEST) modeling platform to facilitate regional quantifications of ecosystem services under various scenarios of land-cover change that are representative of differing conservation program and practice implementation scenarios. To date, the ILM InVEST partnership has resulted in capabilities to quantify carbon stores, amphibian habitat, plant-community diversity, and pollination services. Work to include waterfowl and grassland bird habitat quality is in progress. Initial InVEST modeling has been focused on the Prairie Pothole Region (PPR) of the United States; future efforts might encompass other regions as data availability and knowledge increase as to how functions affecting ecosystem services differ among regions.The ILM partnership is also developing the capability for field-scale process-based modeling of depressional wetland ecosystems using the Agricultural Policy/Environmental Extender (APEX) model. Progress was made towards the development of techniques to use the APEX model for closed-basin depressional wetlands of the PPR, in addition to the open systems that the model was originally designed to simulate. The ILM partnership has matured to the stage where effects of conservation programs and practices on multiple ecosystem services can now be simulated in selected areas. Future work might include the continued development of modeling capabilities, as well as development and evaluation of differing conservation program and practice scenarios of interest to partner agencies including the USDA’s Farm Service Agency (FSA) and Natural Resources Conservation Service (NRCS). When combined, the ecosystem services modeling capabilities of InVEST and the process-based abilities of the APEX model should provide complementary information needed to meet USDA and the Department of the Interior information needs.
Cooperative Collision Avoidance Step 1 - Technology Demonstration Flight Test Report. Revision 1
NASA Technical Reports Server (NTRS)
Trongale, Nicholas A.
2006-01-01
The National Aeronautics and Space Administration (NASA) Access 5 Project Office sponsored a cooperative collision avoidance flight demonstration program for unmanned aircraft systems (UAS). This flight test was accomplished between September 21st and September 27th, 2005, from the Mojave Airport, Mojave, California. The objective of these flights was to collect data for the Access 5 Cooperative Collision Avoidance (CCA) Work Package simulation effort, i.e., to gather data under select conditions to allow validation of the CCA simulation. The capabilities to be demonstrated and subsequently verified in simulation were: demonstrate the ability to detect cooperative traffic and provide situational awareness to the ROA pilot; demonstrate the ability to track the detected cooperative traffic and provide position information to the ROA pilot; demonstrate the ability to determine collision potential with detected cooperative traffic and provide notification to the ROA pilot; demonstrate that the CCA subsystem provides information in sufficient time for the ROA pilot to initiate an evasive maneuver to avoid collision; demonstrate an evasive maneuver that avoids collision with the threat aircraft; and lastly, demonstrate the ability to assess the adequacy of the maneuver and determine that the collision potential has been avoided. The Scaled Composites, LLC Proteus Optionally Piloted Vehicle (OPV) was chosen as the test platform. Proteus was manned by two on-board pilots but was also capable of being controlled from an Air Vehicle Control Station (AVCS) located on the ground. For this demonstration, Proteus was equipped with cooperative collision sensors and the required hardware and software to place the data on the downlink. Prior to the flight phase, a detailed set of flight test scenarios was developed to address the flight test objectives. Two cooperative collision avoidance sensors were utilized for detecting aircraft in the evaluation: Traffic Alert and Collision Avoidance System-II (TCAS-II) and Automatic Dependent Surveillance-Broadcast (ADS-B). A single intruder aircraft, a NASA Gulfstream III (G-III), was used during all the flight testing. During the course of the testing, six geometrically different near-collision scenarios were evaluated. These six scenarios were each tested using various combinations of sensors and collision avoidance software. Of the 54 planned test points, 49 were accomplished successfully. Proteus flew a total of 21.5 hours during the testing and the G-III flew 19.8 hours. The testing fully achieved all flight test objectives. The Flight IPT performed an analysis to determine the accuracy of the simulation model used to predict the location of the host aircraft downstream during an avoidance maneuver. The data collected by this flight program were delivered to the Access 5 Cooperative Collision Avoidance (CCA) Work Package Team, which was responsible for reporting on their analysis of this flight data.
NASA Astrophysics Data System (ADS)
Crutcher, Richard I.; Jones, R. W.; Moore, Michael R.; Smith, S. F.; Tolley, Alan L.; Rochelle, Robert W.
1997-02-01
A prototype 'smart' repeater that provides interoperability capabilities for radio communication systems in multi-agency and multi-user scenarios is being developed by the Oak Ridge National Laboratory. The smart repeater functions as a deployable communications platform that can be dynamically reconfigured to cross-link the radios of participating federal, state, and local government agencies. This interconnection capability improves the coordination and execution of multi-agency operations, including coordinated law enforcement activities and general emergency or disaster response scenarios. The repeater provides multiple channels of operation in the 30-50, 118-136, 138-174, and 403-512 MHz land mobile communications and aircraft bands while providing the ability to cross-connect among multiple frequencies, bands, modulation types, and encryption formats. Additionally, two telephone interconnects provide links to the fixed and cellular telephone networks. The 800- and 900-MHz bands are not supported by the prototype, but the modular design of the system accommodates future retrofits to extend frequency capabilities with minimal impact to the system. Configuration of the repeater is through a portable personal computer with a Windows-based graphical interface control screen that provides dynamic reconfiguration of network interconnections and formats.
Feedback controlled, reactor relevant, high-density, high-confinement scenarios at ASDEX Upgrade
NASA Astrophysics Data System (ADS)
Lang, P. T.; Blanken, T. C.; Dunne, M.; McDermott, R. M.; Wolfrum, E.; Bobkov, V.; Felici, F.; Fischer, R.; Janky, F.; Kallenbach, A.; Kardaun, O.; Kudlacek, O.; Mertens, V.; Mlynek, A.; Ploeckl, B.; Stober, J. K.; Treutterer, W.; Zohm, H.; ASDEX Upgrade Team
2018-03-01
One main programme topic at the ASDEX Upgrade all-metal-wall tokamak is development of a high-density regime with central densities at reactor grade level while retaining high-confinement properties. This required development of appropriate control techniques capable of coping with the pellet tool, a powerful means of fuelling but one which presented challenges to the control system in handling the related perturbations. Real-time density profile control was demonstrated, raising the core density well above the Greenwald density while retaining the edge density in order to avoid confinement losses. Recently, a new model-based approach was implemented that allows direct control of the central density. Investigations focussed first on the N-seeding scenario owing to its proven potential to yield confinement enhancements. Combining pellets and N seeding was found to improve the divertor buffering further and enhance the operational range accessible. For core densities up to about the Greenwald density, a clear improvement with respect to the non-seeding reference was achieved; however, at higher densities this benefit is reduced. This behaviour is attributed to recurrence of an outward shift of the edge density profile, resulting in a reduced peeling-ballooning stability. This is similar to the shift seen during strong gas puffing, which is required to prevent impurity influx in ASDEX Upgrade. First tests indicate that highly-shaped plasma configurations like the ITER base-line scenario respond very well to pellet injection, showing efficient fuelling with no measurable impact on the edge density profile.
Amendola, Alessandra; Coen, Sabrina; Belladonna, Stefano; Pulvirenti, F Renato; Clemens, John M; Capobianchi, M Rosaria
2011-08-01
Diagnostic laboratories need automation that facilitates efficient processing and workflow management to meet today's challenges of expanding services and reducing cost while maintaining the highest levels of quality. The processing efficiency of two commercially available automated systems for quantifying HIV-1 and HCV RNA, the Abbott m2000 system and the Roche COBAS Ampliprep/COBAS TaqMan 96 (docked) system (CAP/CTM), was evaluated in a mid/high-throughput workflow laboratory using a representative daily workload of 24 HCV and 72 HIV samples. Three test scenarios were evaluated: A) one run with four batches on the CAP/CTM system, B) two runs on the Abbott m2000, and C) one run using the Abbott m2000 maxCycle feature (maxCycle) for co-processing these assays. Cycle times for processing, throughput and hands-on time were evaluated. Overall processing cycle time was 10.3, 9.1 and 7.6 h for Scenarios A), B) and C), respectively. Total hands-on time for each scenario was, in order, 100.0 (A), 90.3 (B) and 61.4 min (C). The interface of an automated analyzer to the laboratory workflow, notably system setup for samples and reagents and clean-up functions, is as important as the automation capability of the analyzer for the overall impact on processing efficiency and operator hands-on time.
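To make the reported scenario metrics easier to compare, the short calculation below normalizes hands-on time by the 96-sample daily workload (24 HCV + 72 HIV). The per-sample figures are our own arithmetic on the numbers quoted above, not values from the study.

```python
# Simple arithmetic on the reported scenario metrics (24 HCV + 72 HIV = 96 samples);
# the per-sample normalization is our own illustrative calculation.
scenarios = {
    "A (CAP/CTM, 4 batches)": {"cycle_h": 10.3, "hands_on_min": 100.0},
    "B (m2000, 2 runs)":      {"cycle_h": 9.1,  "hands_on_min": 90.3},
    "C (m2000, maxCycle)":    {"cycle_h": 7.6,  "hands_on_min": 61.4},
}
samples = 24 + 72
for name, m in scenarios.items():
    print(f"{name}: {m['cycle_h']:.1f} h cycle, "
          f"{m['hands_on_min'] / samples:.2f} min hands-on per sample")
```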
Computer modeling of tank track elastomers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lesuer, D.R.; Goldberg, A.; Patt, J.
Computer models of the T142, T156 and the British Chieftain tank tracks have been studied as part of a program to examine the tank-track-pad failure problem. The modeling is based on the finite element method, with two different models being used to evaluate the thermal and mechanical response of the tracks. Modeling has enabled us to evaluate the influence of track design, elastomer formulation and operating scenario on the response of the track. The results of these analyses have been evaluated against experimental tests that quantify the extent of damage development in elastomers and thus indicate the likelihood of pad failure due to "cutting and chunking." The primary characteristics influencing the temperatures achieved in the track are the heat-generation rate and the track geometry. The heat-generation rate is related to the viscoelastic material properties of the elastomer, track design and loading/operating scenario. For all designs and materials studied, stresses produced during contact with a flat roadway surface were not considered large enough to damage the pad. Operating scenarios were studied in which the track pad contacts rigid bars representing idealized obstacles in cross-country terrain. A highly localized obstacle showed the possibility for subsurface mechanical damage to the track pad due to obstacle contact. Contact with a flat rigid bar produced higher tensile stresses that were near the damage thresholds for this material and thus capable of producing cutting and chunking failures.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-07
... of the Two-Pronged Test 2. Identification of Affected Class I Areas 3. Control Scenarios Examined 4... two-pronged test to the Transport Rule control scenario and the source-specific BART control scenario... approaches to identify the Class I areas ``affected'' by the Transport Rule as an alternative control program...
In-Space Cryogenic Propellant Depot (ISCPD) Architecture Definitions and Systems Studies
NASA Technical Reports Server (NTRS)
Fikes, John C.; Howell, Joe T.; Henley, Mark
2006-01-01
The objectives of the ISCPD Architecture Definitions and Systems Studies were to determine high leverage propellant depot architecture concepts, system configuration trades, and related technologies to enable more ambitious and affordable human and robotic exploration of the Earth Neighborhood and beyond. This activity identified architectures and concepts that preposition and store propellants in space for exploration and commercial space activities, consistent with Exploration Systems Research and Technology (ESR&T) objectives. Commonalities across mission scenarios for these architecture definitions, depot concepts, technologies, and operations were identified that also best satisfy the Vision of Space Exploration. Trade studies were conducted, technology development needs identified and assessments performed to drive out the roadmap for obtaining an in-space cryogenic propellant depot capability. The Boeing Company supported the NASA Marshall Space Flight Center (MSFC) by conducting this Depot System Architecture Development Study. The primary objectives of this depot architecture study were: (1) determine high leverage propellant depot concepts and related technologies; (2) identify commonalities across mission scenarios of depot concepts, technologies, and operations; (3) determine the best depot concepts and key technology requirements and (4) identify technology development needs including definition of ground and space test article requirements.
Track classification within wireless sensor network
NASA Astrophysics Data System (ADS)
Doumerc, Robin; Pannetier, Benjamin; Moras, Julien; Dezert, Jean; Canevet, Loic
2017-05-01
In this paper, we present our study on track classification, taking into account environmental information and target estimated states. The tracker uses several motion models adapted to different target dynamics (pedestrian, ground vehicle and SUAV, i.e. small unmanned aerial vehicle) and works in a centralized architecture. The main idea is to explore both the classification given by heterogeneous sensors and the classification obtained with our fusion module. The fusion module, presented in this paper, provides a class for each track according to track location, velocity and associated uncertainty. To model the likelihood of each class, a fuzzy approach is used that considers constraints on the target's capability to move in the environment. The evidential reasoning approach based on Dempster-Shafer Theory (DST) is then used to perform a time integration of this classifier output. The fusion rules are tested and compared on real data obtained with our wireless sensor network. In order to handle realistic ground target tracking scenarios, we use an autonomous smart computer deposited in the surveillance area. After the calibration step of the heterogeneous sensor network, our system is able to handle real data from a wireless ground sensor network. The performance of this system is evaluated in a real exercise for an intelligence operation ("hunter hunt" scenario).
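For readers unfamiliar with the Dempster-Shafer step, the sketch below combines two scans' class evidence on a single track with Dempster's rule over the frame {pedestrian, vehicle, SUAV}. The mass values are illustrative assumptions, not outputs of the paper's fuzzy classifier.

```python
# Minimal Dempster-Shafer combination sketch for fusing per-scan class evidence
# on a track (classes: pedestrian, vehicle, SUAV). Masses are illustrative.
from itertools import product

FRAME = frozenset({"pedestrian", "vehicle", "suav"})

def combine(m1, m2):
    """Dempster's rule of combination over focal sets given as frozensets."""
    fused, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {s: w / (1.0 - conflict) for s, w in fused.items()}

scan1 = {frozenset({"vehicle"}): 0.6, FRAME: 0.4}                 # weak vehicle evidence
scan2 = {frozenset({"vehicle"}): 0.5, frozenset({"suav"}): 0.2, FRAME: 0.3}
track_belief = combine(scan1, scan2)
for focal, mass in sorted(track_belief.items(), key=lambda kv: -kv[1]):
    print(set(focal), round(mass, 3))
```

Repeating the combination scan after scan is the "time integration" the abstract refers to: consistent evidence concentrates mass on one class, while conflicting evidence is renormalized away.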
Life sciences biomedical research planning for Space Station
NASA Technical Reports Server (NTRS)
Primeaux, Gary R.; Michaud, Roger; Miller, Ladonna; Searcy, Jim; Dickey, Bernistine
1987-01-01
The Biomedical Research Project (BmRP), a major component of the NASA Life Sciences Space Station Program, incorporates a laboratory for the study of the effects of microgravity on the human body, and the development of techniques capable of modifying or counteracting these effects. Attention is presently given to a representative scenario of BmRP investigations and associated engineering analyses, together with an account of the evolutionary process by which the scenarios and the Space Station design requirements they entail are identified. Attention is given to a tether-implemented 'variable gravity centrifuge'.
Scenario management and automated scenario generation
NASA Astrophysics Data System (ADS)
McKeever, William; Gilmour, Duane; Lehman, Lynn; Stirtzinger, Anthony; Krause, Lee
2006-05-01
The military planning process utilizes simulation to determine the appropriate course of action (COA) that will achieve a campaign end state. However, due to the difficulty of developing and generating simulation-level COAs, only a few COAs are simulated. This may have been appropriate for traditional conflicts, but the evolution of warfare from attrition-based to effects-based strategies, as well as the complexities of 4th-generation warfare and asymmetric adversaries, has placed additional demands on military planners and simulation. To keep pace with this dynamic, changing environment, planners must be able to perform continuous, multiple, "what-if" COA analysis. Scenario management and generation are critical elements to achieving this goal. An effects-based scenario generation research project demonstrated the feasibility of automated scenario generation techniques which support multiple stove-pipe and emerging broad-scope simulations. This paper will discuss a case study in which the scenario generation capability was employed to support COA simulations to identify plan effectiveness. The study demonstrated the effectiveness of using multiple simulation runs to evaluate the effectiveness of alternate COAs in achieving the overall campaign (metrics-based) objectives. The paper will discuss how scenario generation technology can be employed to allow military commanders and mission planning staff to understand the impact of command decisions on the battlespace of tomorrow.
Non-invasive prenatal testing for single gene disorders: exploring the ethics.
Deans, Zuzana; Hill, Melissa; Chitty, Lyn S; Lewis, Celine
2013-07-01
Non-invasive prenatal testing for single gene disorders is now clearly on the horizon. This new technology offers obvious clinical benefits such as safe testing early in pregnancy. Before widespread implementation, it is important to consider the possible ethical implications. Four hypothetical scenarios are presented that highlight how ethical ideals of respect for autonomy, privacy and fairness may come into play when offering non-invasive prenatal testing for single gene disorders. The first scenario illustrates the moral case for using these tests for 'information only', identifying a potential conflict between larger numbers of women seeking the benefits of the test and the wider social impact of funding tests that do not offer immediate clinical benefit. The second scenario shows how the simplicity and safety of non-invasive prenatal testing could lead to more autonomous decision-making and, conversely, how this could also lead to increased pressure on women to take up testing. In the third scenario we show how, unless strong safeguards are put in place, offering non-invasive prenatal testing could become routinised, with informed consent undermined, and that women who are newly diagnosed as carriers may be particularly vulnerable. The final scenario introduces the possibility of a conflict between the moral rights of a woman and those of her partner through testing for single gene disorders. This analysis informs our understanding of the potential impacts of non-invasive prenatal testing for single gene disorders on clinical practice and has implications for future policy and guidelines for prenatal care.
ORIGEN-based Nuclear Fuel Inventory Module for Fuel Cycle Assessment: Final Project Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skutnik, Steven E.
The goal of this project, “ORIGEN-based Nuclear Fuel Depletion Module for Fuel Cycle Assessment”, is to create a physics-based reactor depletion and decay module for the Cyclus nuclear fuel cycle simulator in order to assess nuclear fuel inventories over a broad space of reactor operating conditions. The overall goal of this approach is to facilitate evaluations of nuclear fuel inventories for a broad space of scenarios, including extended used nuclear fuel storage and cascading impacts on fuel cycle options such as actinide recovery in used nuclear fuel, particularly for multiple recycle scenarios. The advantage of a physics-based approach (compared to a recipe-based approach, which has typically been employed for fuel cycle simulators) is its inherent flexibility; such an approach can more readily accommodate the broad space of potential isotopic vectors that may be encountered under advanced fuel cycle options. In order to develop this flexible reactor analysis capability, we are leveraging the Origen nuclear fuel depletion and decay module from SCALE to produce a standalone “depletion engine” which will serve as the kernel of a Cyclus-based reactor analysis module. The ORIGEN depletion module is a rigorously benchmarked and extensively validated tool for nuclear fuel analysis, and thus its incorporation into the Cyclus framework can bring these capabilities to bear on the problem of evaluating long-term impacts of fuel cycle option choices on relevant metrics of interest, including materials inventories and availability (for multiple recycle scenarios), long-term waste management and repository impacts, etc. Developing this Origen-based analysis capability for Cyclus requires the refinement of the Origen analysis sequence to the point where it can reasonably be compiled as a standalone sequence outside of SCALE; i.e., wherein all of the computational aspects of Origen (including reactor cross-section library processing and interpolation, input and output processing, and depletion/decay solvers) can be self-contained in a single executable sequence. Further, to embed this capability into other software environments (such as the Cyclus fuel cycle simulator) requires that Origen’s capabilities be encapsulated into a portable, self-contained library which other codes can then call directly through function calls, thereby directly accessing the solver and data processing capabilities of Origen. Additional components relevant to this work include modernization of the reactor data libraries used by Origen for conducting nuclear fuel depletion calculations. This work has included the development of new fuel assembly lattices not previously available (such as for CANDU heavy-water reactor assemblies) as well as validation of updated lattices for light-water reactors updated to employ modern nuclear data evaluations. The CyBORG reactor analysis module as developed under this workscope is fully capable of dynamic calculation of depleted fuel compositions from all commercial U.S. reactor assembly types as well as a number of international fuel types, including MOX, VVER, MAGNOX, and PHWR CANDU fuel assemblies. In addition, the Origen-based depletion engine allows CyBORG to evaluate novel fuel assembly and reactor design types via creation of Origen reactor data libraries via SCALE.
The establishment of this new modeling capability affords fuel cycle modelers a substantially improved ability to model dynamically-changing fuel cycle and reactor conditions, including recycled fuel compositions from fuel cycle scenarios involving material recycle into thermal-spectrum systems.
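To illustrate the calling pattern described above (a depletion/decay solver exposed as a library that a fuel-cycle module invokes through function calls), here is a deliberately minimal sketch. The class name, method signature, and library file name are hypothetical placeholders, not the actual Origen/SCALE or Cyclus/CyBORG API.

```python
# Purely illustrative pseudo-wrapper: how a fuel-cycle facility module might call a
# standalone depletion kernel. `DepletionEngine`, its methods, and the library name
# are hypothetical placeholders, NOT the actual Origen/SCALE or Cyclus (CyBORG) API.
class DepletionEngine:
    """Stand-in for a self-contained depletion/decay solver exposed as a library."""
    def __init__(self, reactor_data_library: str):
        self.library = reactor_data_library     # assumed name of a pre-generated lattice library

    def deplete(self, initial_grams: dict, specific_power_mw_per_t: float, days: float) -> dict:
        # A real engine would interpolate cross sections from the library and solve
        # the depletion/decay equations; this stub just echoes the input so the
        # sketch stays self-contained and runnable.
        return dict(initial_grams)

engine = DepletionEngine("pwr17x17_assumed.lib")
fresh_fuel = {"U235": 4.5e4, "U238": 9.55e5}    # grams per tonne, ~4.5 wt% enrichment (assumed)
discharged = engine.deplete(fresh_fuel, specific_power_mw_per_t=38.0, days=3 * 365)
print(discharged)
```

In the real coupling, the engine would return burnup-dependent discharged compositions; the stub above only mirrors the interface shape a facility module would call.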
Energy Absorbing Seat System for an Agricultural Aircraft
NASA Technical Reports Server (NTRS)
Kellas, Sotiris; Jones, Lisa E. (Technical Monitor)
2002-01-01
A task was initiated to improve the energy absorption capability of an existing aircraft seat through cost-effective retrofitting, while keeping the seat-weight increase to a minimum. This task was undertaken as an extension of NASA's ongoing safety research and commitment to general aviation customer needs. Only vertical crash scenarios were considered in this task, which required the energy absorbing system to protect the seat occupant over a range of crash speeds up to 31 ft/sec. It was anticipated that the forward and/or side crash accelerations could be attenuated with the aid of airbags, the technology for which is currently available in automobiles and military helicopters. The steps followed included preliminary crush load determination, conceptual design of cost-effective energy absorbers, fabrication and testing (static and dynamic) of energy absorbers, system analysis, design and fabrication of a dummy seat/rail assembly, dynamic testing of the dummy seat/rail assembly, and finally, testing of the actual modified seat system with a dummy occupant. A total of ten full-scale tests were performed, including three of the actual aircraft seat. Results from the full-scale tests indicated that occupant loads were attenuated successfully to survivable levels.
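As a rough feel for the crush-load sizing step, the calculation below estimates the stroke a constant-force absorber needs to arrest the 31 ft/sec vertical impact quoted in the abstract. The 15 g occupant deceleration limit is an illustrative assumption, not the design value used in the task.

```python
# Back-of-envelope stroke estimate for a constant-force energy absorber.
# The 31 ft/s impact speed is from the abstract; the 15 g limit is assumed.
G = 32.174                 # gravitational acceleration, ft/s^2
impact_speed = 31.0        # ft/s
decel_limit_g = 15.0       # assumed constant occupant deceleration, in g

stroke_ft = impact_speed**2 / (2.0 * decel_limit_g * G)
print(f"required crush stroke ~ {stroke_ft * 12:.1f} in at a constant {decel_limit_g:.0f} g")
```

Under these assumptions the required stroke comes out to roughly a foot, which illustrates why the crush load and the available stroke have to be determined together.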
Simulation of future stream alkalinity under changing deposition and climate scenarios.
Welsch, Daniel L; Cosby, B Jack; Hornberger, George M
2006-08-31
Models of soil and stream water acidification have typically been applied under scenarios of changing acidic deposition; however, climate change is usually ignored. Soil air CO2 concentrations have the potential to increase as the climate warms and becomes wetter, thus affecting soil and stream water chemistry by initially increasing stream alkalinity at the expense of reducing base saturation levels on soil exchange sites. We simulate this change by applying a series of physically based coupled models capable of predicting soil air CO2 and stream water chemistry. We predict daily stream water alkalinity for a small catchment in the Virginia Blue Ridge for 60 years into the future given stochastically generated daily climate values. This is done for nine different combinations of climate and deposition. The scenarios for both climate and deposition include a static scenario, a scenario of gradual change, and a scenario of abrupt change. We find that stream water alkalinity continues to decline for all scenarios (average decrease of 14.4 microeq L-1) except where the climate is gradually warming and becoming more moist (average increase of 13 microeq L-1). In all other scenarios, base cation removal from catchment soils is responsible for the limited alkalinity increase resulting from climate change. This has implications given the extent to which acidification models are used to establish policy and legislation concerning deposition and emissions.
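The scenario structure (static, gradual, and abrupt change driving a stochastic daily simulation) can be sketched generically as below; the 60-year horizon matches the abstract, while the +2 degC magnitude, step year, and noise level are illustrative assumptions.

```python
# Sketch of the three scenario shapes (static, gradual, abrupt) applied to a single
# climate driver over a 60-year horizon; magnitudes and noise are assumptions.
import random

YEARS = 60

def scenario_offset(kind: str, year: int, total_change=2.0, step_year=30):
    if kind == "static":
        return 0.0
    if kind == "gradual":
        return total_change * year / YEARS
    if kind == "abrupt":
        return total_change if year >= step_year else 0.0
    raise ValueError(kind)

rng = random.Random(42)
for kind in ("static", "gradual", "abrupt"):
    # stochastic daily-style noise collapsed to one draw per year for brevity
    series = [15.0 + scenario_offset(kind, y) + rng.gauss(0.0, 0.5) for y in range(YEARS)]
    print(f"{kind:8s}: year-60 driver ~ {series[-1]:.1f} degC")
```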
Caregivers' willingness-to-pay for Alzheimer's disease medications in Canada.
Oremus, Mark; Tarride, Jean-Eric; Pullenayegum, Eleanor; Clayton, Natasha; Mugford, Gerry; Godwin, Marshall; Huan, Allen; Bacher, Yves; Villalpando, Juan-Manual; Gill, Sudeep S; Lanctôt, Krista L; Herrmann, Nathan; Raina, Parminder
2015-01-01
We studied caregivers' willingness-to-pay for Alzheimer's disease drug therapy. We recruited 216 caregivers of persons with mild or moderate Alzheimer's disease and presented them with four scenarios describing a hypothetical Alzheimer's disease medication. The scenarios described the medication as capable of either treating the symptoms of disease or modifying the course of disease. The scenarios also presented two different probabilities of adverse effects occurrence, i.e., 0% or 30%. Most caregivers said they would pay out-of-pocket for the medication, with support for such payment ranging from 68% to 93%, depending on the specific scenario. The highest level of support was for the 'disease modifying and no adverse effects' scenario, while the lowest level was for the 'symptom treatment and 30% chance of adverse effects' scenario. On average, caregivers' monthly willingness-to-pay out-of-pocket for the medication ranged from $214 to $277 (Canadian dollars). Dollar amounts were highest for the 'disease modifying and no adverse effects' scenario and lowest for the 'symptom treatment and 30% chance of adverse effects' scenario. Support for out-of-pocket payment and specific dollar amounts were highest when the medication did not involve adverse effects. Caregivers placed more value on the absence of adverse effects than on drug efficacy. © The Author(s) 2013 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
The Electronic Library Workstation--Today.
ERIC Educational Resources Information Center
Nolte, James
1990-01-01
Describes the components--hardware, software and applications, CD-ROM and online reference resources, and telecommunications links--of an electronic library workstation in use at Clarkson University (Potsdam, New York). Data manipulation, a hypothetical research scenario, and recommended workstation capabilities are also discussed. (MES)
Conducting Safe and Efficient Airport Surface Operations in a NextGen Environment
NASA Technical Reports Server (NTRS)
Jones, Denise R.; Prinzel, Lawrence J., III; Bailey, Randall E.; Arthur, Jarvis J., III; Barnes, James R.
2016-01-01
The Next Generation Air Transportation System (NextGen) vision proposes many revolutionary operational concepts, such as surface trajectory-based operations (STBO) and technologies, including display of traffic information and movements, airport moving maps (AMM), and proactive alerts of runway incursions and surface traffic conflicts, to deliver an overall increase in system capacity and safety. A piloted simulation study was conducted at the National Aeronautics and Space Administration (NASA) Langley Research Center to evaluate the ability of a flight crew to conduct safe and efficient airport surface operations while utilizing an AMM. Position accuracy of traffic was varied, and the effect of traffic position accuracy on airport conflict detection and resolution (CD&R) capability was measured. Another goal was to evaluate the crew's ability to safely conduct STBO by assessing the impact of providing traffic intent information, CD&R system capability, and the display of STBO guidance to the flight crew on both head-down and head-up displays (HUD). Nominal scenarios and off-nominal conflict scenarios were conducted using 12 airline crews operating in a simulated Memphis International Airport terminal environment. The data suggest that all traffic should be shown on the airport moving map, whether qualified or unqualified, and conflict detection and resolution technologies provide significant safety benefits. Despite the presence of traffic information on the map, collisions or near-collisions still occurred; when indications or alerts were generated in these same scenarios, the incidents were averted. During the STBO testing, the flight crews met their required time-of-arrival at route end within 10 seconds on 98 percent of the trials, well within the acceptable performance bounds of 15 seconds. Traffic intent information was found to be useful in determining the intent of conflicting traffic, with graphical presentation preferred. The CD&R system was only minimally effective during STBO because the prevailing visibility was sufficient for visual detection of conflicting traffic. Overall, the pilots indicated STBO increased general situation awareness but also negatively impacted workload, reduced the ability to watch for other traffic, and increased head-down time.
A Water Rich Mars Surface Mission Scenario
NASA Technical Reports Server (NTRS)
Hoffman, Stephen J.; Andrews, Alida; Joosten, B. Kent; Watts, Kevin
2017-01-01
In an on-going effort to make human Mars missions more affordable and sustainable, NASA continues to investigate the innovative leveraging of technological advances in conjunction with the use of accessible Martian resources directly applicable to these missions. One of the resources with the broadest utility for human missions is water. Many past studies of human Mars missions assumed a complete lack of water derivable from local sources. However, recent advances in our understanding of the Martian environment provide growing evidence that Mars may be more "water rich" than previously suspected. This is based on data indicating that substantial quantities of water are mixed with surface regolith, bound in minerals located at or near the surface, and buried in large glacier-like forms. This paper describes an assessment of what could be done in a "water rich" human Mars mission scenario. A description of what is meant by "water rich" in this context is provided, including a quantification of the water that would be used by crews in this scenario. The different types of potential feedstock that could be used to generate these quantities of water are described, drawing on the most recently available assessments of data being returned from Mars. This paper specifically focuses on sources that appear to be buried quantities of water ice. (An assessment of other potential feedstock materials is documented in another paper.) Technologies and processes currently used in terrestrial polar regions are reviewed. One process with a long history of use on Earth and with potential application on Mars - the Rodriguez Well - is described and results of an analysis simulating the performance of such a well on Mars are presented. These results indicate that a Rodriguez Well capable of producing the quantities of water identified for a "water rich" human mission is within the capabilities assumed to be available on the Martian surface, as envisioned in other comparable Evolvable Mars Campaign assessments. The paper concludes by capturing additional findings and describing additional simulations and tests that should be conducted to better characterize the performance of the identified terrestrial technologies for accessing subsurface ice, as well as the Rodriguez Well, under Mars environmental conditions.
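As a first-order illustration of what a Rodriguez-well-style melt operation involves, the energy balance below converts delivered heat into water yield. The assumed ice temperature, heater power, and the neglect of losses to the surrounding ice and regolith are our own simplifications, not figures from the paper's simulations.

```python
# First-order energy balance for melting subsurface ice (Rodriguez-well style):
# water yield per unit of heat delivered, ignoring losses to the surroundings.
CP_ICE = 2.1e3            # J/(kg K), specific heat of ice
L_FUSION = 3.34e5         # J/kg, latent heat of fusion
ice_temp_c = -60.0        # assumed initial ice temperature
heater_power_w = 5.0e3    # assumed heat delivered to the melt pool

energy_per_kg = CP_ICE * (0.0 - ice_temp_c) + L_FUSION     # J to warm and melt 1 kg
kg_per_day = heater_power_w * 86_400 / energy_per_kg
print(f"~{kg_per_day:.0f} kg of water per day at {heater_power_w / 1e3:.0f} kW (no losses)")
```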
Crew Exploration Vehicle Service Module Ascent Abort Coverage
NASA Technical Reports Server (NTRS)
Tedesco, Mark B.; Evans, Bryan M.; Merritt, Deborah S.; Falck, Robert D.
2007-01-01
The Crew Exploration Vehicle (CEV) is required to maintain continuous abort capability from lift off through destination arrival. This requirement is driven by the desire to provide the capability to safely return the crew to Earth after failure scenarios during the various phases of the mission. This paper addresses abort trajectory design considerations, concept of operations and guidance algorithm prototypes for the portion of the ascent trajectory following nominal jettison of the Launch Abort System (LAS) until safe orbit insertion. Factors such as abort system performance, crew load limits, natural environments, crew recovery, and vehicle element disposal were investigated to determine how to achieve continuous vehicle abort capability.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-28
.... OCC-2012-0016] Policy Statement on the Principles for Development and Distribution of Annual Stress... in developing and distributing the stress test scenarios for the annual stress test required by the... by the Annual Stress Test final rule (Stress Test Rule) published on October 9, 2012. Under the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-03
... Development and Distribution of Annual Stress Test Scenarios AGENCY: Federal Deposit Insurance Corporation... (``covered banks'') to conduct annual stress tests, report the results of such stress tests to the... summary of the results of the stress tests. On October 15, 2012, the FDIC published in the Federal...
NASA Technical Reports Server (NTRS)
Arneson, Heather; Evans, Antony D.; Li, Jinhua; Wei, Mei Yueh
2017-01-01
Integrated Demand Management (IDM) is a near- to mid-term NASA concept that proposes to address mismatches in air traffic system demand and capacity by using strategic flow management capabilities to pre-condition demand into the more tactical Time-Based Flow Management System (TBFM). This paper describes an automated simulation capability to support IDM concept development. The capability closely mimics existing human-in-the-loop (HITL) capabilities, while automating both the human components and collaboration between operational systems, and speeding up the real-time aircraft simulations. Such a capability allows for parametric studies to be carried out that can inform the HITL simulations, identifying breaking points and parameter values at which significant changes in system behavior occur. The paper describes the initial validation of the automated simulation capability against results from previous IDM HITL experiments, quantifying the differences. The simulator is then used to explore the performance of the IDM concept under the simple scenario of a capacity constrained airport under a wide range of wind conditions.
NASA Astrophysics Data System (ADS)
Podestà, M.; Gorelenkova, M.; Gorelenkov, N. N.; White, R. B.
2017-09-01
Alfvénic instabilities (AEs) are well known as a potential cause of enhanced fast ion transport in fusion devices. Given a specific plasma scenario, quantitative predictions of (i) expected unstable AE spectrum and (ii) resulting fast ion transport are required to prevent or mitigate the AE-induced degradation in fusion performance. Reduced models are becoming an attractive tool to analyze existing scenarios as well as for scenario prediction in time-dependent simulations. In this work, a neutral beam heated NSTX discharge is used as reference to illustrate the potential of a reduced fast ion transport model, known as kick model, that has been recently implemented for interpretive and predictive analysis within the framework of the time-dependent tokamak transport code TRANSP. Predictive capabilities for AE stability and saturation amplitude are first assessed, based on given thermal plasma profiles only. Predictions are then compared to experimental results, and the interpretive capabilities of the model further discussed. Overall, the reduced model captures the main properties of the instabilities and associated effects on the fast ion population. Additional information from the actual experiment enables further tuning of the model’s parameters to achieve a close match with measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Podestà, M.; Gorelenkova, M.; Gorelenkov, N. N.
Alfvénic instabilities (AEs) are well known as a potential cause of enhanced fast ion transport in fusion devices. Given a specific plasma scenario, quantitative predictions of (i) expected unstable AE spectrum and (ii) resulting fast ion transport are required to prevent or mitigate the AE-induced degradation in fusion performance. Reduced models are becoming an attractive tool to analyze existing scenarios as well as for scenario prediction in time-dependent simulations. In this work, a neutral beam heated NSTX discharge is used as reference to illustrate the potential of a reduced fast ion transport model, known as kick model, that has been recently implemented for interpretive and predictive analysis within the framework of the time-dependent tokamak transport code TRANSP. Predictive capabilities for AE stability and saturation amplitude are first assessed, based on given thermal plasma profiles only. Predictions are then compared to experimental results, and the interpretive capabilities of the model further discussed. Overall, the reduced model captures the main properties of the instabilities and associated effects on the fast ion population. Finally, additional information from the actual experiment enables further tuning of the model's parameters to achieve a close match with measurements.
NASA Technical Reports Server (NTRS)
Roychoudhury, Indranil; Daigle, Matthew; Goebel, Kai; Spirkovska, Lilly; Sankararaman, Shankar; Ossenfort, John; Kulkarni, Chetan; McDermott, William; Poll, Scott
2016-01-01
As new operational paradigms and additional aircraft are being introduced into the National Airspace System (NAS), maintaining safety in such a rapidly growing environment becomes more challenging. It is therefore desirable to have an automated framework to provide an overview of the current safety of the airspace at different levels of granularity, as well an understanding of how the state of the safety will evolve into the future given the anticipated flight plans, weather forecast, predicted health of assets in the airspace, and so on. Towards this end, as part of our earlier work, we formulated the Real-Time Safety Monitoring (RTSM) framework for monitoring and predicting the state of safety and to predict unsafe events. In our previous work, the RTSM framework was demonstrated in simulation on three different constructed scenarios. In this paper, we further develop the framework and demonstrate it on real flight data from multiple data sources. Specifically, the flight data is obtained through the Shadow Mode Assessment using Realistic Technologies for the National Airspace System (SMART-NAS) Testbed that serves as a central point of collection, integration, and access of information from these different data sources. By testing and evaluating using real-world scenarios, we may accelerate the acceptance of the RTSM framework towards deployment. In this paper we demonstrate the framework's capability to not only estimate the state of safety in the NAS, but predict the time and location of unsafe events such as a loss of separation between two aircraft, or an aircraft encountering convective weather. The experimental results highlight the capability of the approach, and the kind of information that can be provided to operators to improve their situational awareness in the context of safety.
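To make the loss-of-separation prediction concrete, the sketch below shows how such an event can be detected for two aircraft modeled as constant-velocity tracks; the 5 nmi threshold, the flat 2-D geometry, and the function names are illustrative assumptions and are not drawn from the RTSM framework or the SMART-NAS Testbed.

# Minimal illustration (not the RTSM framework): predict a horizontal
# loss of separation between two aircraft modeled as constant-velocity
# 2-D tracks. All names and the 5 nmi threshold are illustrative.
import math

SEPARATION_NM = 5.0  # assumed horizontal separation minimum

def predict_los(p1, v1, p2, v2, horizon_s=1200.0):
    """Return (t_los, min_dist) if the tracks fall below the separation
    minimum within the horizon, else None. Positions in nmi, velocities
    in nmi/s."""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]        # relative position
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]        # relative velocity
    vmag2 = vx * vx + vy * vy
    # time of closest approach, clamped to the prediction horizon
    t_cpa = 0.0 if vmag2 == 0 else max(0.0, min(horizon_s, -(rx * vx + ry * vy) / vmag2))
    dx, dy = rx + vx * t_cpa, ry + vy * t_cpa
    d_min = math.hypot(dx, dy)
    return (t_cpa, d_min) if d_min < SEPARATION_NM else None

# Example: two aircraft converging at right angles at about 432 knots each.
print(predict_los((0.0, 0.0), (0.12, 0.0), (40.0, -40.0), (0.0, 0.12)))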
Trevors, J T
2010-06-01
Methods to research the origin of microbial life are limited. However, microorganisms were the first organisms on the Earth capable of cell growth and division, and interactions with their environment, other microbial cells, and eventually with diverse eukaryotic organisms. The origin of microbial life and the supporting scientific evidence are both an enigma and a scientific priority. Numerous hypotheses have been proposed, scenarios imagined, speculations presented in papers, insights shared, and assumptions made without supporting experimentation, which have led to limited progress in understanding the origin of microbial life. The use of the human imagination to envision the origin of life events, without supporting experimentation, observation and independently replicated experiments required for science, is a significant constraint. The challenge remains how to better understand the origin of microbial life using observations and experimental methods as opposed to speculation, assumptions, scenarios, envisioning events and un-testable hypotheses. This is not an easy challenge as experimental design and plausible hypothesis testing are difficult. Since past approaches have been inconclusive in providing evidence for the origin of microbial life mechanisms and the manner in which genetic instructions were encoded into DNA/RNA, it is reasonable and logical to propose that progress will be made when testable, plausible hypotheses and methods are used in the origin of microbial life research, and the experimental observations are, or are not, reproduced in independent laboratories. These perspectives will be discussed in this article as well as the possibility that a pre-biotic film preceded a microbial biofilm as a possible micro-location for the origin of microbial cells capable of growth and division. 2010 Elsevier B.V. All rights reserved.
Schampaert, Stéphanie; van't Veer, Marcel; van de Vosse, Frans N; Pijls, Nico H J; de Mol, Bas A; Rutten, Marcel C M
2011-09-01
The Impella 2.5 left percutaneous (LP), a relatively new transvalvular assist device, challenges the position of the intra-aortic balloon pump (IABP), which has a long record in supporting patients after myocardial infarction and cardiac surgery. However, while more costly and more demanding in management, the advantages of the Impella 2.5 LP are yet to be established. The aim of this study was to evaluate the benefits of the 40 cc IABP and the Impella 2.5 LP operating at 47,000 rpm in vitro, and compare their circulatory support capabilities in terms of cardiac output, coronary flow, cardiac stroke work, and arterial blood pressure. Clinical scenarios of cardiogenic preshock and cardiogenic shock (CS), with blood pressure depression, lowered cardiac output, and constant heart rate of 80 bpm, were modeled in a model-controlled mock circulation, featuring a systemic, pulmonary, and coronary vascular bed. The ventricles, represented by servomotor-operated piston pumps, included the Frank-Starling mechanism. The systemic circulation was modeled with a flexible tube having close-to-human aortic dimensions and compliance properties. Proximally, it featured a branch mimicking the brachiocephalic arteries and a physiologically correct coronary flow model. The rest of the systemic and pulmonary impedance was modeled by four-element Windkessel models. In this system, the enhancement of coronary flow and blood pressure was tested with both support systems under healthy and pathological conditions. Hemodynamic differences between the IABP and the Impella 2.5 LP were small. In our laboratory model, both systems approximately yielded a 10% cardiac output increase and a 10% coronary flow increase. However, since the Impella 2.5 LP provided significantly better left ventricular unloading, the circulatory support capabilities were slightly in favor of the Impella 2.5 LP. On the other hand, pulsatility was enhanced with the IABP and lowered with the Impella 2.5 LP. The support capabilities of both the IABP and the Impella 2.5 LP strongly depended on the simulated hemodynamic conditions. Maximum hemodynamic benefits were achieved when mechanical circulatory support was applied on a simulated scenario of deep CS. © 2011, Copyright Eindhoven University of Technology (TU/e). Artificial Organs © 2011, International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
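For reference, the four-element Windkessel mentioned above is commonly written (in its parallel form, with characteristic resistance $Z_c$, inertance $L$, peripheral resistance $R_p$, and compliance $C$) as an input impedance

$$Z(\omega) = \frac{j\omega L\,Z_c}{Z_c + j\omega L} + \frac{R_p}{1 + j\omega R_p C};$$

this is a standard textbook formulation, and the study does not state which variant or parameter values were used in the mock circulation.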
12 CFR Appendix A to Part 252 - Policy Statement on the Scenario Design Framework for Stress Testing
Code of Federal Regulations, 2014 CFR
2014-01-01
... 12 Banks and Banking 4 2014-01-01 2014-01-01 false Policy Statement on the Scenario Design... YY) Pt. 252, App. A Appendix A to Part 252—Policy Statement on the Scenario Design Framework for... (stress test rules) implementing section 165(i) of the Dodd-Frank Wall Street Reform and Consumer...
Constellation Architecture Team-Lunar Scenario 12.0 Habitation Overview
NASA Technical Reports Server (NTRS)
Kennedy, Kriss J.; Toups, Larry D.; Rudisill, Marianne
2010-01-01
This paper will describe an overview of the Constellation Architecture Team Lunar Scenario 12.0 (LS-12) surface habitation approach and concept performed during the study definition. The Lunar Scenario 12 architecture study focused on two primary habitation approaches: a horizontally-oriented habitation module (LS-12.0) and a vertically-oriented habitation module (LS-12.1). This paper will provide an overview of the 12.0 lunar surface campaign, the associated outpost architecture, habitation functionality, concept description, system integration strategy, mass and power resource estimates. The Scenario 12 architecture resulted from combining three previous scenario attributes from Scenario 4 "Optimized Exploration", Scenario 5 "Fission Surface Power System" and Scenario 8 "Initial Extensive Mobility" into Scenario 12 along with an added emphasis on defining the excursion ConOps while the crew is away from the outpost location. This paper will describe an overview of the CxAT-Lunar Scenario 12.0 habitation concepts and their functionality. The Crew Operations area includes basic crew accommodations such as sleeping, eating, hygiene and stowage. The EVA Operations area includes additional EVA capability beyond the suitlock function such as suit maintenance, spares stowage, and suit stowage. The Logistics Operations area includes the enhanced accommodations for 180 days such as enhanced life support systems hardware, consumable stowage, spares stowage, interconnection to the other habitation elements, a common interface mechanism for future growth, and mating to a pressurized rover or Pressurized Logistics Module (PLM). The Mission & Science Operations area includes enhanced outpost autonomy such as an IVA glove box, life support, medical operations, and exercise equipment.
Reverse Engineering Crosswind Limits - A New Flight Test Technique?
NASA Technical Reports Server (NTRS)
Asher, Troy A.; Williams, Timothy L.; Strovers, Brian K.
2013-01-01
During modification of a Gulfstream III test bed aircraft for an experimental flap project, all roll spoiler hardware had to be removed to accommodate the test article. In addition to evaluating the effects on performance and flying qualities resulting from the modification, the test team had to determine crosswind limits for an airplane previously certified with roll spoilers. Predictions for the modified aircraft indicated the maximum amount of steady state sideslip available during the approach and landing phase would be limited by aileron authority rather than by rudder. Operating out of a location that tends to be very windy, an arbitrary and conservative wind limit would have either been overly restrictive or potentially unsafe if chosen poorly. When determining a crosswind limit, how much reserve roll authority was necessary? Would the aircraft, as configured, have suitable handling qualities for long-term use as a flying test bed? To answer these questions, the test team combined two typical flight test techniques into a new maneuver called the sideslip-to-bank maneuver, and was able to gather flying qualities data, evaluate aircraft response and measure trends for various crosswind scenarios. This paper will describe the research conducted, the maneuver, flight conditions, predictions, and results from this in-flight evaluation of crosswind capability.
Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meisner, Robert; McCoy, Michel; Archer, Bill
2013-09-11
The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Moreover, ASC’s business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools.
Overview of Experimental Capabilities - Supersonics
NASA Technical Reports Server (NTRS)
Banks, Daniel W.
2007-01-01
This viewgraph presentation gives an overview of experimental capabilities applicable to the area of supersonic research. The contents include: 1) EC Objectives; 2) SUP.11: Elements; 3) NRA; 4) Advanced Flight Simulator Flexible Aircraft Simulation Studies; 5) Advanced Flight Simulator Flying Qualities Guideline Development for Flexible Supersonic Transport Aircraft; 6) Advanced Flight Simulator Rigid/Flex Flight Control; 7) Advanced Flight Simulator Rapid Sim Model Exchange; 8) Flight Test Capabilities Advanced In-Flight Infrared (IR) Thermography; 9) Flight Test Capabilities In-Flight Schlieren; 10) Flight Test Capabilities CLIP Flow Calibration; 11) Flight Test Capabilities PFTF Flowfield Survey; 12) Ground Test Capabilities Laser-Induced Thermal Acoustics (LITA); 13) Ground Test Capabilities Doppler Global Velocimetry (DGV); 14) Ground Test Capabilities Doppler Global Velocimetry (DGV); and 15) Ground Test Capabilities EDL Optical Measurement Capability (PIV) for Rigid/Flexible Decelerator Models.
Li, Chih-Huang; Kuan, Win-Sen; Mahadevan, Malcolm; Daniel-Underwood, Lynda; Chiu, Te-Fa; Nguyen, H Bryant
2012-07-01
Medical simulation has been used to teach critical illness in a variety of settings. This study examined the effect of didactic lectures compared with simulated case scenario in a medical simulation course on the early management of severe sepsis. A prospective multicentre randomised study was performed enrolling resident physicians in emergency medicine from four hospitals in Asia. Participants were randomly assigned to a course that included didactic lectures followed by a skills workshop and simulated case scenario (lecture-first) or to a course that included a skills workshop and simulated case scenario followed by didactic lectures (simulation-first). A pre-test was given to the participants at the beginning of the course, post-test 1 was given after the didactic lectures or simulated case scenario depending on the study group assignment, then a final post-test 2 was given at the end of the course. Performance on the simulated case scenario was evaluated with a performance task checklist. 98 participants were enrolled in the study. Post-test 2 scores were significantly higher than pre-test scores in all participants (80.8 ± 12.0% vs 65.4 ± 12.2%, p<0.01). There was no difference in pre-test scores between the two study groups. The lecture-first group had significantly higher post-test 1 scores than the simulation-first group (78.8 ± 10.6% vs 71.6 ± 12.6%, p<0.01). There was no difference in post-test 2 scores between the two groups. The simulated case scenario task performance completion was 90.8% (95% CI 86.6% to 95.0%) in the lecture-first group compared with 83.8% (95% CI 79.5% to 88.1%) in the simulation-first group (p=0.02). A medical simulation course can improve resident physician knowledge in the early management of severe sepsis. Such a course should include a comprehensive curriculum that includes didactic lectures followed by simulation experience.
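The reported group comparison can be reproduced approximately from the summary statistics alone, as in the sketch below; the equal 49/49 split of the 98 participants is an assumption, since the abstract does not give the group sizes.

# Rough check of the reported post-test 1 comparison from summary
# statistics alone. Equal group sizes (49/49) are an assumption; the
# abstract only states that 98 residents were enrolled.
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(mean1=78.8, std1=10.6, nobs1=49,
                            mean2=71.6, std2=12.6, nobs2=49,
                            equal_var=False)   # Welch's t-test
print(f"t = {t:.2f}, p = {p:.4f}")             # p comes out well below 0.01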
Prospects for steady-state scenarios on JET
NASA Astrophysics Data System (ADS)
Litaudon, X.; Bizarro, J. P. S.; Challis, C. D.; Crisanti, F.; DeVries, P. C.; Lomas, P.; Rimini, F. G.; Tala, T. J. J.; Akers, R.; Andrew, Y.; Arnoux, G.; Artaud, J. F.; Baranov, Yu F.; Beurskens, M.; Brix, M.; Cesario, R.; DeLa Luna, E.; Fundamenski, W.; Giroud, C.; Hawkes, N. C.; Huber, A.; Joffrin, E.; Pitts, R. A.; Rachlew, E.; Reyes-Cortes, S. D. A.; Sharapov, S. E.; Zastrow, K. D.; Zimmermann, O.; JET EFDA contributors, the
2007-09-01
In the 2006 experimental campaign, progress has been made on JET to operate non-inductive scenarios at higher applied powers (31 MW) and density (nl ~ 4 × 10^19 m^-3), with ITER-relevant safety factor (q95 ~ 5) and plasma shaping, taking advantage of the new divertor capabilities. The extrapolation of the performance using transport modelling benchmarked on the experimental database indicates that the foreseen power upgrade (~45 MW) will allow the development of non-inductive scenarios where the bootstrap current is maximized together with the fusion yield and not, as in present-day experiments, at its expense. The tools for the long-term JET programme are the new ITER-like ICRH antenna (~15 MW), an upgrade of the NB power (35 MW/20 s or 17.5 MW/40 s), a new ITER-like first wall, a new pellet injector for edge localized mode control together with improved diagnostic and control capability. Operation with the new wall will set new constraints on non-inductive scenarios that are already addressed experimentally and in the modelling. The fusion performance and driven current that could be reached at high density and power have been estimated using either 0D or 1-1/2D validated transport models. In the high power case (45 MW), the calculations indicate the potential for the operational space of the non-inductive regime to be extended in terms of current (~2.5 MA) and density (nl > 5 × 10^19 m^-3), with high βN (βN > 3.0) and a fraction of the bootstrap current within 60-70% at high toroidal field (~3.5 T).
McCoy, Allison B; Wright, Adam; Sittig, Dean F
2015-09-01
Clinical decision support (CDS) is essential for delivery of high-quality, cost-effective, and safe healthcare. The authors sought to evaluate the CDS capabilities across electronic health record (EHR) systems. We evaluated the CDS implementation capabilities of 8 Office of the National Coordinator for Health Information Technology Authorized Certification Body (ONC-ACB)-certified EHRs. Within each EHR, the authors attempted to implement 3 user-defined rules that utilized the various data and logic elements expected of typical EHRs and that represented clinically important evidenced-based care. The rules were: 1) if a patient has amiodarone on his or her active medication list and does not have a thyroid-stimulating hormone (TSH) result recorded in the last 12 months, suggest ordering a TSH; 2) if a patient has a hemoglobin A1c result >7% and does not have diabetes on his or her problem list, suggest adding diabetes to the problem list; and 3) if a patient has coronary artery disease on his or her problem list and does not have aspirin on the active medication list, suggest ordering aspirin. Most evaluated EHRs lacked some CDS capabilities; 5 EHRs were able to implement all 3 rules, and the remaining 3 EHRs were unable to implement any of the rules. One of these did not allow users to customize CDS rules at all. The most frequently found shortcomings included the inability to use laboratory test results in rules, limit rules by time, use advanced Boolean logic, perform actions from the alert interface, and adequately test rules. Significant improvements in the EHR certification and implementation procedures are necessary. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
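As an illustration of how such user-defined rules reduce to simple data-and-logic checks, the sketch below encodes rule 1 (amiodarone without a TSH result in the last 12 months) over a toy patient record; the record structure and field names are hypothetical and do not correspond to any particular EHR's API.

# Illustrative encoding of rule 1 from the abstract (amiodarone without a
# TSH result in the last 12 months -> suggest ordering a TSH). The patient
# record structure and field names are hypothetical, not from any EHR.
from datetime import date, timedelta

def tsh_reminder(patient, today=None):
    today = today or date.today()
    on_amiodarone = any(m.lower() == "amiodarone"
                        for m in patient["active_medications"])
    recent_tsh = any(r["test"] == "TSH" and r["date"] >= today - timedelta(days=365)
                     for r in patient["lab_results"])
    if on_amiodarone and not recent_tsh:
        return "Suggest ordering a TSH (no result in the last 12 months)."
    return None

patient = {"active_medications": ["Amiodarone", "Metoprolol"],
           "lab_results": [{"test": "TSH", "date": date(2013, 5, 1)}]}
print(tsh_reminder(patient, today=date(2015, 1, 15)))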
Autonomous Vision Navigation for Spacecraft in Lunar Orbit
NASA Astrophysics Data System (ADS)
Bader, Nolan A.
NASA aims to achieve unprecedented navigational reliability for the first manned lunar mission of the Orion spacecraft in 2023. A technique for accomplishing this is to integrate autonomous feature tracking as an added means of improving position and velocity estimation. In this thesis, a template matching algorithm and optical sensor are tested onboard three simulated lunar trajectories using linear covariance techniques under various conditions. A preliminary characterization of the camera gives insight into its ability to determine azimuth and elevation angles to points on the surface of the Moon. A navigation performance analysis shows that an optical camera sensor can aid in decreasing position and velocity errors, particularly in a loss of communication scenario. Furthermore, it is found that camera quality and computational capability are driving factors affecting the performance of such a system.
Simulation of a weather radar display for over-water airborne radar approaches
NASA Technical Reports Server (NTRS)
Clary, G. R.
1983-01-01
Airborne radar approach (ARA) concepts are being investigated as a part of NASA's Rotorcraft All-Weather Operations Research Program on advanced guidance and navigation methods. This research is being conducted using both piloted simulations and flight test evaluations. For the piloted simulations, a mathematical model of the airborne radar was developed for over-water ARAs to offshore platforms. This simulated flight scenario requires radar simulation of point targets, such as oil rigs and ships, distributed sea clutter, and transponder beacon replies. Radar theory, weather radar characteristics, and empirical data derived from in-flight radar photographs are combined to model a civil weather/mapping radar typical of those used in offshore rotorcraft operations. The resulting radar simulation is realistic and provides the needed simulation capability for ongoing ARA research.
Improving Conflict Alert Performance Using Moving Target Detector Data.
1982-06-01
DOT/FAA/RD-82/47; DOT/FAA/CT-81-... List-of-figures fragments recoverable from the scanned report: Differences for Stochastic Case; Illustration of Scenarios for Warning Time Tests; Illustration of Scenarios Used for Nuisance Alert Area; Nuisance Alert Area Analysis of Scenario 3 with a Target Velocity of 480 Knots and SPMB=SPPB=2.8 nmi.
Toward Interactive Scenario Analysis and Exploration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gayle, Thomas R.; Summers, Kenneth Lee; Jungels, John
2015-01-01
As Modeling and Simulation (M&S) tools have matured, their applicability and importance have increased across many national security challenges. In particular, they provide a way to test how something may behave without the need to do real world testing. However, current and future changes across several factors including capabilities, policy, and funding are driving a need for rapid response or evaluation in ways that many M&S tools cannot address. Issues around large data, computational requirements, delivery mechanisms, and analyst involvement already exist and pose significant challenges. Furthermore, rising expectations, rising input complexity, and increasing depth of analysis will only increase the difficulty of these challenges. In this study we examine whether innovations in M&S software coupled with advances in "cloud" computing and "big-data" methodologies can overcome many of these challenges. In particular, we propose a simple, horizontally-scalable distributed computing environment that could provide the foundation (i.e. "cloud") for next-generation M&S-based applications based on the notion of "parallel multi-simulation". In our context, the goal of parallel multi-simulation is to consider as many simultaneous paths of execution as possible. Therefore, with sufficient resources, the complexity is dominated by the cost of single scenario runs as opposed to the number of runs required. We show the feasibility of this architecture through a stable prototype implementation coupled with the Umbra Simulation Framework [6]. Finally, we highlight the utility through multiple novel analysis tools and by showing the performance improvement compared to existing tools.
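A minimal sketch of the parallel multi-simulation idea, using a plain worker pool rather than the Umbra Simulation Framework or the prototype described above; the scenario function is a stand-in for one expensive simulation path.

# Toy illustration of "parallel multi-simulation": many independent scenario
# runs dispatched to a pool of workers, so that with enough workers the wall
# time is dominated by a single run rather than by the number of runs.
import random
from multiprocessing import Pool

def run_scenario(seed):
    rng = random.Random(seed)
    # stand-in for one expensive simulation path
    return seed, sum(rng.random() for _ in range(100_000))

if __name__ == "__main__":
    with Pool() as pool:
        results = pool.map(run_scenario, range(32))   # 32 execution paths
    best = max(results, key=lambda r: r[1])
    print(f"{len(results)} paths explored; best seed = {best[0]}")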
A Novel UAV Electric Propulsion Testbed for Diagnostics and Prognostics
NASA Technical Reports Server (NTRS)
Gorospe, George E., Jr.; Kulkarni, Chetan S.
2017-01-01
This paper presents a novel hardware-in-the-loop (HIL) testbed for systems level diagnostics and prognostics of an electric propulsion system used in UAVs (unmanned aerial vehicle). Referencing the all electric, Edge 540T aircraft used in science and research by NASA Langley Flight Research Center, the HIL testbed includes an identical propulsion system, consisting of motors, speed controllers and batteries. Isolated under a controlled laboratory environment, the propulsion system has been instrumented for advanced diagnostics and prognostics. To produce flight-like loading on the system a slave motor is coupled to the motor under test (MUT) and provides variable mechanical resistance, and the capability of introducing nondestructive mechanical wear-like frictional loads on the system. This testbed enables the verification of mathematical models of each component of the propulsion system, the repeatable generation of flight-like loads on the system for fault analysis, test-to-failure scenarios, and the development of advanced system level diagnostics and prognostics methods. The capabilities of the testbed are extended through the integration of a LabVIEW-based client for the Live Virtual Constructive Distributed Environment (LVCDC) Gateway which enables both the publishing of generated data for remotely located observers and prognosers and the synchronization of the testbed propulsion system with vehicles in the air. The developed HIL testbed gives researchers easy access to a scientifically relevant portion of the aircraft without the overhead and dangers encountered during actual flight.
Development of the PRSEUS Multi-Bay Pressure Box for a Hybrid Wing Body Vehicle
NASA Technical Reports Server (NTRS)
Jegley, Dawn C.; Velicki, Alexander
2015-01-01
NASA has created the Environmentally Responsible Aviation Project to explore and document the feasibility, benefits, and technical risk of advanced vehicle configurations and enabling technologies that will reduce the impact of aviation on the environment. A critical aspect of this pursuit is the development of a lighter, more robust airframe that will enable the introduction of unconventional aircraft configurations that have higher lift-to-drag ratios, reduced drag, and lower community noise. Although such novel configurations like the Hybrid Wing Body (HWB) offer better aerodynamic performance as compared to traditional tube-and-wing aircraft, their blended wing shapes also pose significant new design challenges. Developing an improved structural concept that is capable of meeting the structural weight fraction allocated for these non-circular pressurized cabins is the primary obstacle in implementing large lifting-body designs. To address this challenge, researchers at NASA and The Boeing Company are working together to advance new structural concepts like the Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS), which is an integrally stiffened panel design that is stitched together and designed to maintain residual load-carrying capabilities under a variety of damage scenarios. The large-scale multi-bay fuselage test article described in this paper is the final specimen in a building-block test program that was conceived to demonstrate the feasibility of meeting the structural weight goals established for the HWB pressure cabin.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Austin; Martin, Gregory; Hurtt, James
As revised interconnection standards for grid-tied photovoltaic (PV) inverters address new advanced grid support functions (GSFs), there is increasing interest in inverter performance in the case of abnormal grid conditions. The growth of GSF-enabled inverters has outpaced the industry standards that define their operation, although recently published updates to UL1741 with Supplement SA define test conditions for GSFs such as volt-var control, frequency-watt control, and voltage/frequency ride-through, among others. A comparative experimental evaluation has been completed on four commercially available, three-phase PV inverters in the 24.0-39.8 kVA power range on their GSF capability and the effect on abnormal grid condition response. This study examines the impact particular GSF implementations have on run-on times during islanding conditions, peak voltages in load rejection overvoltage scenarios, and peak currents during single-phase and three-phase fault events for individual inverters. This report reviews comparative test data, which shows that GSFs have little impact on the metrics of interest in most test cases.
Experimental Evaluation of Grid Support Enabled PV Inverter Response to Abnormal Grid Conditions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Austin A; Martin, Gregory D; Hurtt, James
As revised interconnection standards for grid-tied photovoltaic (PV) inverters address new advanced grid support functions (GSFs), there is increasing interest in inverter performance in the case of abnormal grid conditions. The growth of GSF-enabled inverters has outpaced the industry standards that define their operation, although recently published updates to UL1741 Supplement SA define test conditions for GSFs such as volt-var control, frequency-watt control, and voltage/frequency ride-through, among others. This paper describes the results of a comparative experimental evaluation on four commercially available, three-phase PV inverters in the 24.0-39.8 kVA power range on their GSF capability and its effect on abnormal grid condition response. The evaluation examined the impact particular GSF implementations have on run-on times during islanding conditions, peak voltages in load rejection overvoltage scenarios, and peak currents during single-phase and three-phase fault events for individual inverters. Testing results indicated a wide variance in the performance of GSF-enabled inverters across the various test cases.
Moolenaar, Lobke M; Broekmans, Frank J M; van Disseldorp, Jeroen; Fauser, Bart C J M; Eijkemans, Marinus J C; Hompes, Peter G A; van der Veen, Fulco; Mol, Ben Willem J
2011-10-01
To compare the cost effectiveness of ovarian reserve testing in in vitro fertilization (IVF). A Markov decision model based on data from the literature and original patient data. Decision analytic framework. Computer-simulated cohort of subfertile women aged 20 to 45 years who are eligible for IVF. [1] No treatment, [2] up to three cycles of IVF limited to women under 41 years and no ovarian reserve testing, [3] up to three cycles of IVF with dose individualization of gonadotropins according to ovarian reserve, and [4] up to three cycles of IVF with ovarian reserve testing and exclusion of expected poor responders after the first cycle, with no treatment scenario as the reference scenario. Cumulative live birth over 1 year, total costs, and incremental cost-effectiveness ratios. The cumulative live birth was 9.0% in the no treatment scenario, 54.8% for scenario 2, 70.6% for scenario 3 and 51.9% for scenario 4. Absolute costs per woman for these scenarios were €0, €6,917, €6,678, and €5,892 for scenarios 1, 2, 3, and 4, respectively. Incremental cost-effectiveness ratios (ICER) for scenarios 2, 3, and 4 were €15,166, €10,837, and €13,743 per additional live birth. Sensitivity analysis showed the model to be robust over a wide range of values. Individualization of the follicle-stimulating hormone dose according to ovarian reserve is likely to be cost effective in women who are eligible for IVF, but this effectiveness needs to be confirmed in randomized clinical trials. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
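The incremental cost-effectiveness ratios can be roughly reproduced from the quoted live-birth rates and per-woman costs, as in the sketch below; small differences from the published figures are expected from rounding and from model details not given in the abstract.

# Back-of-the-envelope ICER check against the no-treatment reference,
# using only the cumulative live-birth rates and per-woman costs quoted
# in the abstract.
reference = {"live_birth": 0.090, "cost": 0.0}      # scenario 1: no treatment
scenarios = {2: {"live_birth": 0.548, "cost": 6917.0},
             3: {"live_birth": 0.706, "cost": 6678.0},
             4: {"live_birth": 0.519, "cost": 5892.0}}

for name, s in scenarios.items():
    icer = (s["cost"] - reference["cost"]) / (s["live_birth"] - reference["live_birth"])
    print(f"Scenario {name}: ICER ≈ €{icer:,.0f} per additional live birth")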
Cassim, Naseem; Coetzee, Lindi Marie; Schnippel, Kathryn; Glencross, Deborah Kim
2017-01-01
During 2016, the National Health Laboratory Service (NHLS) introduced laboratory-based reflexed Cryptococcal antigen (CrAg) screening to detect early Cryptococcal disease in immunosuppressed HIV+ patients with a confirmed CD4 count of 100 cells/μl or less. The aim of this study was to assess cost-per-result of a national screening program across different tiers of laboratory service, with variable daily CrAg test volumes. The impact of potential ART treatment guideline and treatment target changes on CrAg volumes, platform choice and laboratory workflow is considered. CD4 data (with counts ≤ 100 cells/μl) from the fiscal year 2015/16 were extracted from the NHLS Corporate Data Warehouse and used to project anticipated daily CrAg testing volumes with appropriately-matched CrAg testing platforms allocated at each of 52 NHLS CD4 laboratories. A cost-per-result was calculated for four scenarios, including the existing service status quo (Scenario-I), and three other settings (as Scenarios II-IV) which were based on information from recent antiretroviral (ART) guidelines, District Health Information System (DHIS) data and UNAIDS 90/90/90 HIV/AIDS treatment targets. Scenario-II forecast CD4 testing offered only to new ART initiates recorded at DHIS. Scenario-III projected all patients notified as HIV+, but not yet on ART (recorded at DHIS) and Scenario-IV forecast CrAg screening in 90% of estimated HIV+ patients across South Africa (also DHIS). Stata was used to assess daily CrAg volumes at the 5th, 10th, 25th, 50th, 75th, 90th and 95th percentiles across 52 CD4-laboratories. Daily volumes were used to determine technical effort/operator staff costs (% full time equivalent) and cost-per-result for all scenarios. Daily volumes ranged between 3 and 64 samples for Scenario-I at the 5th and 95th percentile. Similarly, daily volume ranges of 1-12, 2-45 and 5-100 CrAg-directed samples were noted for Scenarios II, III and IV respectively. A cut-off of 30 CrAg tests per day defined use of either LFA or EIA platform. LFA cost-per-result ranged from $8.24 to $5.44 and EIA cost-per-result between $5.58 and $4.88 across the range of test volumes. The technical effort across scenarios ranged from 3.2-27.6% depending on test volumes and platform used. The study reported the impact of programmatic testing requirements on varying CrAg test volumes that subsequently influenced choice of testing platform, laboratory workflow and cost-per-result. A novel percentiles approach is described that enables an overview of the cost-per-result across a national program. This approach facilitates cross-subsidisation of more expensive lower volume sites with cost-efficient, more centralized higher volume laboratories, mitigating against the risk of costing tests at a single site.
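A minimal sketch of the volume-driven platform choice described above: the 30-tests-per-day cut-off and the cost-per-result endpoints are taken from the abstract, while the linear interpolation between those endpoints (and the 3-100 volume span used for it) is an illustrative assumption rather than the study's cost model.

# Sketch of the platform-selection logic described above: sites running
# more than ~30 CrAg tests/day use the EIA platform, lower-volume sites
# the LFA. The linear interpolation between the quoted cost-per-result
# endpoints is an illustrative assumption, not the study's cost model.
def cost_per_result(daily_volume, low_vol=3, high_vol=100):
    platform = "EIA" if daily_volume > 30 else "LFA"
    lo, hi = (5.58, 4.88) if platform == "EIA" else (8.24, 5.44)
    frac = min(1.0, max(0.0, (daily_volume - low_vol) / (high_vol - low_vol)))
    return platform, round(lo + frac * (hi - lo), 2)

for volume in (3, 12, 45, 100):    # roughly the quoted percentile volumes
    print(volume, cost_per_result(volume))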
Final Report: Assessment of Combined Heat and Power Premium Power Applications in California
DOE Office of Scientific and Technical Information (OSTI.GOV)
Norwood, Zack; Lipman, Tim; Marnay, Chris
2008-09-30
This report analyzes the current economic and environmental performance of combined heat and power (CHP) systems in power interruption intolerant commercial facilities. Through a series of three case studies, key trade-offs are analyzed with regard to the provision of black-out ride-through capability with the CHP systems and the resulting ability to avoid the need for at least some diesel backup generator capacity located at the case study sites. Each of the selected sites currently has a CHP or combined heating, cooling, and power (CCHP) system in addition to diesel backup generators. In all cases the CHP/CCHP system has a small fraction of the electrical capacity of the diesel generators. Although none of the selected sites currently have the ability to run the CHP systems as emergency backup power, all could be retrofitted to provide this blackout ride-through capability, and new CHP systems can be installed with this capability. The following three sites/systems were used for this analysis: (1) Sierra Nevada Brewery - Using 1 MW of installed Molten Carbonate Fuel Cells operating on a combination of digester gas (from the beer brewing process) and natural gas, this facility can produce electricity and heat for the brewery and attached bottling plant. The major thermal load on-site is to keep the brewing tanks at appropriate temperatures. (2) NetApp Data Center - Using 1.125 MW of Hess Microgen natural gas fired reciprocating engine-generators, with exhaust gas and jacket water heat recovery attached to over 300 tons of adsorption chillers, this combined cooling and power system provides electricity and cooling to a data center with a 1,200 kW peak electrical load. (3) Kaiser Permanente Hayward Hospital - With 180 kW of Tecogen natural gas fired reciprocating engine-generators this CHP system generates steam for space heating, and hot water for a city hospital. For all sites, similar assumptions are made about the economic and technological constraints of the power generation system. Using the Distributed Energy Resource Customer Adoption Model (DER-CAM) developed at the Lawrence Berkeley National Laboratory, we model three representative scenarios and find the optimal operation scheduling, yearly energy cost, and energy technology investments for each scenario below: Scenario 1 - Diesel generators and CHP/CCHP equipment as installed in the current facility. Scenario 1 represents a baseline forced investment in currently installed energy equipment. Scenario 2 - Existing CHP equipment installed with blackout ride-through capability to replace approximately the same capacity of diesel generators. In Scenario 2 the cost of the replaced diesel units is saved; however, additional capital cost for the controls and switchgear for blackout ride-through capability is necessary. Scenario 3 - Fully optimized site analysis, allowing DER-CAM to specify the number of diesel and CHP/CCHP units (with blackout ride-through capability) that should be installed, ignoring any constraints on backup generation. Scenario 3 allows DER-CAM to optimize scheduling and number of generation units from the currently available technologies at a particular site. The results of this analysis, using real data to model the optimal scheduling of hypothetical and actual CHP systems for a brewery, data center, and hospital, lead to some interesting conclusions. First, facilities with high heating loads will typically prove to be the most appropriate for CHP installation from a purely economic standpoint.
Second, absorption/adsorption cooling systems may only be economically feasible if the efficiency of these chillers can be increased above the current best system efficiency. At a coefficient of performance (COP) of 0.8, for instance, an adsorption chiller paired with a natural gas generator with waste heat recovery at a facility with large cooling loads, like a data center, will cost no less on a yearly basis than purchasing electricity and natural gas directly from a utility. Third, if the reliability of CHP systems proves to be at least as high as that of diesel generators (which we expect to be the case), the CHP system could replace the diesel generator at little or no additional cost. This is true if the thermal to electric (relative) load of those facilities was already high enough to economically justify a CHP system. Last, in terms of greenhouse gas emissions, the modeled CHP and CCHP systems provide some degree of decreased emissions relative to systems with less CHP installed. The emission reduction can be up to 10% in the optimized case (Scenario 3) in the application with the highest relative thermal load, in this case the hospital. Although these results should be qualified because they are only based on the three case studies, the general results and lessons learned are expected to be applicable across a broad range of potential and existing CCHP systems.
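The following back-of-the-envelope arithmetic illustrates why a COP of 0.8 is limiting: per kWh of fuel, the recovered-heat cooling displaces only a small amount of electric-chiller electricity. The recoverable-heat fraction and the electric-chiller COP are assumptions, and capital and maintenance costs, which drive the report's conclusion, are not included.

# Illustrative arithmetic for the adsorption-chiller point above: at a COP
# of 0.8, how much electric-chiller load does heat-recovery cooling displace
# per kWh of fuel? Values other than the 0.8 COP are assumptions.
eta_heat   = 0.45   # recoverable waste-heat fraction (assumed)
cop_adsorp = 0.8    # adsorption chiller COP (from the report)
cop_elec   = 4.0    # conventional electric chiller COP (assumed)

cooling_per_kwh_fuel = eta_heat * cop_adsorp          # kWh of cooling delivered
displaced_chiller_kwh = cooling_per_kwh_fuel / cop_elec
print(f"{cooling_per_kwh_fuel:.2f} kWh cooling, displacing only "
      f"{displaced_chiller_kwh:.2f} kWh of electric-chiller electricity")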
Limardi, S; Rocco, G; Stievano, A; Vellone, E; Valle, A; Torino, F; Alvaro, R
2014-01-01
Nurses, following their ethical mandate, collaborate with other health and social professionals or people involved in caring activities. Caregivers in this context are becoming more and more significant for the family or the cared-for person, since through their stable presence and emotional proximity they play a pivotal caring role. To maximize the contribution of caregivers, objective tools that emphasize their skill sets are necessary. The cross-cultural adaptation and validation of the Family Decision Making Self-Efficacy Scale is part of a larger project aimed at understanding the resilience of caregivers in the field of palliative care. Self-efficacy is one of the aspects of personality most closely associated with resilience. Self-efficacy is shown in a specific context; therefore, the study and evaluation of its level require capabilities that enable individuals to perceive themselves as effective in a particular circumstance. The Family Decision Making Self-Efficacy Scale assesses the behavior of caregivers of patients at the end of their life. The Family Decision Making Self-Efficacy Scale was translated (forward and back translation) and was adapted to the Italian clinical cultural setting by a research team that included experts in palliative care, native translators with experience in nursing and experts in nursing. A consensus on the wording of each item in relation to semantic, idiomatic, experiential and conceptual equivalence was sought. The clarity of the wording and the pertinence of the items of the scenario with the conscious patient and with the unconscious patient were evaluated by a group of caregivers who tested the instrument. The Italian version of the instrument included 12 items for the scenario with the conscious patient and 12 for the scenario with the unconscious patient. The working group expressed consensus on the pre-testing version of the instrument. The pre-testing version of the scale was tested on 60 caregivers, 47 taking care of conscious patients and 13 taking care of unconscious patients. In both cases the content of the items was judged relevant and understandable. The results for the cross-cultural validation were satisfactory and allowed the application of the instrument in the Italian context.
NASA Technical Reports Server (NTRS)
Arneson, Heather; Evans, Antony D.; Li, Jinhua; Wei, Mei Yueh
2017-01-01
Integrated Demand Management (IDM) is a near- to mid-term NASA concept that proposes to address mismatches in air traffic system demand and capacity by using strategic flow management capabilities to pre-condition demand into the more tactical Time-Based Flow Management System (TBFM). This paper describes an automated simulation capability to support IDM concept development. The capability closely mimics existing human-in-the-loop (HITL) capabilities, automating both the human components and collaboration between operational systems, and speeding up the real-time aircraft simulations. Such a capability allows for parametric studies that will inform the HITL simulations, identifying breaking points and parameter values at which significant changes in system behavior occur. This paper also describes the initial validation of individual components of the automated simulation capability, and an example application comparing the performance of the IDM concept under two TBFM scheduling paradigms. The results and conclusions from this simulation compare closely to those from previous HITL simulations using similar scenarios, providing an initial validation of the automated simulation capability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Xunxiang; Ang, Caen K.; Singh, Gyanender P.
Driven by the need to enlarge the safety margins of nuclear fission reactors in accident scenarios, research and development of accident-tolerant fuel has become an important topic in the nuclear engineering and materials community. A continuous-fiber SiC/SiC composite is under consideration as a replacement for traditional zirconium alloy cladding owing to its high-temperature stability, chemical inertness, and exceptional irradiation resistance. An important task is the development of characterization techniques for SiC/SiC cladding, since traditional work using rectangular bars or disks cannot directly provide useful information on the properties of SiC/SiC composite tubes for fuel cladding applications. At Oak Ridge National Laboratory, experimental capabilities are under development to characterize the modulus, microcracking, and hermeticity of as-fabricated, as-irradiated SiC/SiC composite tubes. Resonant ultrasound spectroscopy has been validated as a promising technique to evaluate the elastic properties of SiC/SiC composite tubes and microcracking within the material. A similar technique, impulse excitation, is efficient in determining the basic mechanical properties of SiC bars prepared by chemical vapor deposition; it also has potential for application in studying the mechanical properties of SiC/SiC composite tubes. Complete evaluation of the quality of the developed coatings, a major mitigation strategy against gas permeation and hydrothermal corrosion, requires the deployment of various experimental techniques, such as scratch indentation, tensile pulling-off tests, and scanning electron microscopy. In addition, a comprehensive permeation test station is being established to assess the hermeticity of SiC/SiC composite tubes and to determine the H/D/He permeability of SiC/SiC composites. This report summarizes the current status of the development of these experimental capabilities.
Ahmadi, Mahmoud Kamal; Fawaz, Samar; Jones, Charles H.; Zhang, Guojian
2015-01-01
Yersiniabactin (Ybt) is a mixed nonribosomal peptide-polyketide natural product natively produced by the pathogen Yersinia pestis. The compound enables iron scavenging capabilities upon host infection and is biosynthesized by a nonribosomal peptide synthetase featuring a polyketide synthase module. This pathway has been engineered for expression and biosynthesis using Escherichia coli as a heterologous host. In the current work, the biosynthetic process for Ybt formation was improved through the incorporation of a dedicated step to eliminate the need for exogenous salicylate provision. When this improvement was made, the compound was tested in parallel applications that highlight the metal-chelating nature of the compound. In the first application, Ybt was assessed as a rust remover, demonstrating a capacity of ∼40% compared to a commercial removal agent and ∼20% relative to total removal capacity. The second application tested Ybt in removing copper from a variety of nonbiological and biological solution mixtures. Success across a variety of media indicates potential utility in diverse scenarios that include environmental and biomedical settings. PMID:26025901
2013-07-08
LAS VEGAS, Nev. – The Boeing Company performed simulated contingency water landing scenarios with a mock-up CST-100 spacecraft at Bigelow Aerospace's headquarters near Las Vegas. The CST-100 is designed for ground landings, but could splash down on the water, if necessary. During the water tests, Department of Defense search-and-recovery personnel practiced pulling five Boeing engineers out of the capsule and to safety. The tests are part of the company’s ongoing work supporting its funded Space Act Agreement with NASA’s Commercial Crew Program, or CCP, during the Commercial Crew Integrated Capability, or CCiCap, initiative. CCP is intended to lead to the availability of commercial human spaceflight services for government and commercial customers to low-Earth orbit. Future development and certification initiatives eventually will lead to the availability of human spaceflight services for NASA to send its astronauts to the International Space Station, where critical research is taking place daily. For more information about CCP, go to http://www.nasa.gov/commercialcrew. Photo credit: Boeing
Evolution of CO2 and H2O on Mars: A cold Early History?
NASA Technical Reports Server (NTRS)
Niles, P. B.; Michalski, J.
2011-01-01
The martian climate has long been thought to have evolved substantially through history from a warm and wet period to the current cold and dry conditions on the martian surface. This view has been challenged based primarily on evidence that the early Sun had a substantially reduced luminosity and that a greenhouse atmosphere would be difficult to sustain on Mars for long periods of time. In addition, the evidence for a warm, wet period of martian history is far from conclusive with many of the salient features capable of being explained by an early cold climate. An important test of the warm, wet early Mars hypothesis is the abundance of carbonates in the crust [1]. Recent high precision isotopic measurements of the martian atmosphere and discoveries of carbonates on the martian surface provide new constraints on the evolution of the martian atmosphere. This work seeks to apply these constraints to test the feasibility of the cold early scenario
NSCL and FRIB at Michigan State University: Nuclear science at the limits of stability
NASA Astrophysics Data System (ADS)
Gade, A.; Sherrill, B. M.
2016-05-01
The National Superconducting Cyclotron Laboratory (NSCL) at Michigan State University (MSU) is a scientific user facility that offers beams of rare isotopes at a wide range of energies. This article describes the facility, its capabilities, and some of the experimental devices used to conduct research with rare isotopes. The versatile nuclear science program carried out by researchers at NSCL continues to address the broad challenges of the field, employing sensitive experimental techniques that have been developed and optimized for measurements with rare isotopes produced by in-flight separation. Selected examples showcase the broad program, capabilities, and the relevance for forefront science questions in nuclear physics, addressing, for example, the limits of nuclear existence; the nature of the nuclear force; the origin of the elements in the cosmos; the processes that fuel explosive scenarios in the Universe; and tests for physics beyond the standard model of particle physics. NSCL will cease operations in approximately 2021. The future program will be carried out at the Facility for Rare Isotope Beams, FRIB, presently under construction on the MSU campus adjacent to NSCL. FRIB will provide fast, stopped, and reaccelerated beams of rare isotopes at intensities exceeding NSCL’s capabilities by three orders of magnitude. An outlook will be provided on the enormous opportunities that will arise upon completion of FRIB in the early 2020s.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tamanini, Nicola; Caprini, Chiara; Barausse, Enrico
We investigate the capability of various configurations of the space interferometer eLISA to probe the late-time background expansion of the universe using gravitational wave standard sirens. We simulate catalogues of standard sirens composed of massive black hole binaries whose gravitational radiation is detectable by eLISA, and which are likely to produce an electromagnetic counterpart observable by future surveys. The main issue for the identification of a counterpart resides in the capability of obtaining an accurate enough sky localisation with eLISA. This seriously challenges the capability of four-link (2 arm) configurations to successfully constrain the cosmological parameters. Conversely, six-link (3 arm) configurations have the potential to provide a test of the expansion of the universe up to z ∼ 8 which is complementary to other cosmological probes based on electromagnetic observations only. In particular, in the most favourable scenarios, they can provide a significant constraint on H_0 at the level of 0.5%. Furthermore, (Ω_M, Ω_Λ) can be constrained to a level competitive with present SNIa results. On the other hand, the lack of massive black hole binary standard sirens at low redshift allows dark energy to be constrained only at the level of a few percent.
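To make the idea concrete, the following is a minimal sketch (not the authors' analysis pipeline) of how a catalogue of standard sirens constrains the background expansion: it fits H0 and Omega_M to mock luminosity distances in a flat LCDM model. The mock catalogue, the 1% distance errors, the redshift range, and the fit bounds are illustrative assumptions.

    # Minimal sketch: fitting (H0, Omega_M) to mock standard-siren distances.
    # Assumes a flat LCDM background and illustrative 1% distance errors;
    # this is NOT the eLISA analysis pipeline described in the abstract.
    import numpy as np
    from scipy.integrate import quad
    from scipy.optimize import curve_fit

    C_KM_S = 299792.458  # speed of light [km/s]

    def lum_dist(z, H0, Om):
        """Luminosity distance [Mpc] in flat LCDM."""
        integrand = lambda zp: 1.0 / np.sqrt(Om * (1 + zp)**3 + (1 - Om))
        dc = np.array([quad(integrand, 0.0, zi)[0] for zi in np.atleast_1d(z)])
        return (1 + np.atleast_1d(z)) * (C_KM_S / H0) * dc

    # Mock catalogue: a handful of massive-black-hole-binary sirens out to z ~ 8
    rng = np.random.default_rng(0)
    z_obs = np.sort(rng.uniform(0.5, 8.0, 25))
    d_true = lum_dist(z_obs, 67.0, 0.32)
    d_obs = d_true * (1 + 0.01 * rng.standard_normal(z_obs.size))  # ~1% errors

    popt, pcov = curve_fit(lum_dist, z_obs, d_obs, p0=[70.0, 0.3],
                           sigma=0.01 * d_obs, absolute_sigma=True,
                           bounds=([50.0, 0.05], [90.0, 0.95]))
    print("H0 = %.2f +/- %.2f, Omega_M = %.3f +/- %.3f"
          % (popt[0], np.sqrt(pcov[0, 0]), popt[1], np.sqrt(pcov[1, 1])))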
ENERGY AND OUR ENVIRONMENT: A SYSTEMS AND LIFE CYCLE PERSPECTIVE
This is a presentation to the North Carolina BREATE Conference on March 28, 2017. This presentation provides an overview of energy modeling capabilities in ORD, and includes examples related to scenario development, water-energy nexus, bioenergy, etc. The focus is on system ap...
Estimated migration rates under scenarios of global climate change.
Jay R. Malcolm; Adam Markham; Ronald P. Neilson; Michael Garaci
2002-01-01
Greenhouse-induced warming and resulting shifts in climatic zones may exceed the migration capabilities of some species. We used fourteen combinations of General Circulation Models (GCMs) and Global Vegetation Models (GVMs) to investigate possible migration rates required under doubled-CO2 climatic forcing.
USDA-ARS?s Scientific Manuscript database
Ballistic delivery capability is essential to delivering vaccines and other therapeutics effectively to both livestock and wildlife in many global scenarios. Here, lyophilized poly(ethylene glycol) (PEG)-glycolide dimethacrylate crosslinked but degradable hydrogels were assessed as payload vehicles ...
Development, Demonstration, and Control of a Testbed for Multiterminal HVDC System
Li, Yalong; Shi, Xiaojie M.; Liu, Bo; ...
2016-10-21
This paper presents the development of a scaled four-terminal high-voltage direct current (HVDC) testbed, including hardware structure, communication architecture, and different control schemes. The developed testbed is capable of emulating typical operation scenarios including system start-up, power variation, line contingency, and converter station failure. Some unique scenarios are also developed and demonstrated, such as online control mode transition and station re-commission. In particular, a dc line current control is proposed, through the regulation of a converter station at one terminal. By controlling a dc line current to zero, the transmission line can be opened by using relatively low-cost HVDC disconnects with low current interrupting capability, instead of the more expensive dc circuit breaker. Utilizing the dc line current control, an automatic line current limiting scheme is developed. As a result, when a dc line is overloaded, the line current control will be automatically activated to regulate current within the allowable maximum value.
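The dc line current control described above can be illustrated with a toy simulation: a PI regulator on one terminal's voltage drives the line current to zero so that a low-rated disconnect could open the line. The line parameters, remote-terminal voltage, and controller gains below are invented for illustration and are not taken from the testbed.

    # Toy simulation of regulating a dc line current to zero with a PI controller
    # acting on one converter terminal's voltage. Line R, L and controller gains
    # are illustrative assumptions, not parameters of the actual testbed.
    R, L = 0.5, 0.05          # line resistance [ohm] and inductance [H] (assumed)
    V_remote = 400.0          # voltage held by the remote terminal [V] (assumed)
    Kp, Ki = 2.0, 400.0       # PI gains (assumed)
    dt, T = 1e-4, 0.5         # time step and horizon [s]

    i, integ = 200.0, 0.0     # initial line current [A], integrator state
    i_ref = 0.0               # target: zero current so the line can be opened
    for _ in range(int(T / dt)):
        err = i_ref - i
        integ += err * dt
        v_local = V_remote + Kp * err + Ki * integ   # controlled terminal voltage
        di = (v_local - V_remote - R * i) / L        # line dynamics: L di/dt = dV - R i
        i += di * dt
    # The current decays toward zero, at which point a low-cost disconnect could open the line.
    print("final line current: %.3f A" % i)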
Development of a GIS-based spill management information system.
Martin, Paul H; LeBoeuf, Eugene J; Daniel, Edsel B; Dobbins, James P; Abkowitz, Mark D
2004-08-30
Spill Management Information System (SMIS) is a geographic information system (GIS)-based decision support system designed to effectively manage the risks associated with accidental or intentional releases of a hazardous material into an inland waterway. SMIS provides critical planning and impact information to emergency responders in anticipation of, or following such an incident. SMIS couples GIS and database management systems (DBMS) with the 2-D surface water model CE-QUAL-W2 Version 3.1 and the air contaminant model Computer-Aided Management of Emergency Operations (CAMEO) while retaining full GIS risk analysis and interpretive capabilities. Live 'real-time' data links are established within the spill management software to utilize current meteorological information and flowrates within the waterway. Capabilities include rapid modification of modeling conditions to allow for immediate scenario analysis and evaluation of 'what-if' scenarios. The functionality of the model is illustrated through a case study of the Cheatham Reach of the Cumberland River near Nashville, TN.
Graves, Robert W.; Aagaard, Brad T.
2011-01-01
Using a suite of five hypothetical finite-fault rupture models, we test the ability of long-period (T>2.0 s) ground-motion simulations of scenario earthquakes to produce waveforms throughout southern California consistent with those recorded during the 4 April 2010 Mw 7.2 El Mayor-Cucapah earthquake. The hypothetical ruptures are generated using the methodology proposed by Graves and Pitarka (2010) and require, as inputs, only a general description of the fault location and geometry, event magnitude, and hypocenter, as would be done for a scenario event. For each rupture model, two Southern California Earthquake Center three-dimensional community seismic velocity models (CVM-4m and CVM-H62) are used, resulting in a total of 10 ground-motion simulations, which we compare with recorded ground motions. While the details of the motions vary across the simulations, the median levels match the observed peak ground velocities reasonably well, with the standard deviation of the residuals generally within 50% of the median. Simulations with the CVM-4m model yield somewhat lower variance than those with the CVM-H62 model. Both models tend to overpredict motions in the San Diego region and underpredict motions in the Mojave desert. Within the greater Los Angeles basin, the CVM-4m model generally matches the level of observed motions, whereas the CVM-H62 model tends to overpredict the motions, particularly in the southern portion of the basin. The variance in the peak velocity residuals is lowest for a rupture that has significant shallow slip (<5 km depth), whereas the variance in the residuals is greatest for ruptures with large asperities below 10 km depth. Overall, these results are encouraging and provide confidence in the predictive capabilities of the simulation methodology, while also suggesting some regions in which the seismic velocity models may need improvement.
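The goodness-of-fit measure described above can be sketched as the bias and scatter of natural-log residuals between observed and simulated peak ground velocities; the station values below are placeholders, not data from the El Mayor-Cucapah comparison.

    # Sketch: bias and scatter of ln-residuals between observed and simulated
    # peak ground velocities (PGV). The arrays are placeholder values, not data
    # from the El Mayor-Cucapah comparison.
    import numpy as np

    pgv_obs = np.array([3.2, 1.1, 0.8, 5.4, 2.0])   # cm/s, observed (placeholder)
    pgv_sim = np.array([2.9, 1.4, 0.6, 6.1, 1.7])   # cm/s, simulated (placeholder)

    resid = np.log(pgv_obs / pgv_sim)               # ln residual per station
    print("median bias: %.2f" % np.median(resid))   # > 0 means the simulation underpredicts
    # A scatter of roughly 50% about the median corresponds to a std of about ln(1.5) ~ 0.4
    print("std dev:     %.2f" % np.std(resid))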
Code of Federal Regulations, 2013 CFR
2013-01-01
... Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY ANNUAL STRESS TEST § 46.2 Definitions. For... appropriate for use in the stress tests under this part, including, but not limited to, baseline, adverse, and severely adverse scenarios. Stress test means a process to assess the potential impact of scenarios on the...
Code of Federal Regulations, 2014 CFR
2014-01-01
... Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY ANNUAL STRESS TEST § 46.2 Definitions. For... appropriate for use in the stress tests under this part, including, but not limited to, baseline, adverse, and severely adverse scenarios. Stress test means a process to assess the potential impact of scenarios on the...
NASA Astrophysics Data System (ADS)
Demir, I.
2013-12-01
Recent developments in web technologies make it easy to manage and visualize large data sets with the general public. Novel visualization techniques and dynamic user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The floodplain simulation system is a web-based 3D interactive flood simulation environment for creating real-world flooding scenarios. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create and modify predefined scenarios, control environmental parameters, and evaluate flood mitigation techniques. The web-based simulation system provides an environment for children and adults to learn about flooding, flood damage, and the effects of development and human activity in the floodplain. The system provides various scenarios customized to fit the age and education level of the users. This presentation provides an overview of the web-based flood simulation system and demonstrates the capabilities of the system for various flooding and land-use scenarios.
Scenario Decomposition for 0-1 Stochastic Programs: Improvements and Asynchronous Implementation
Ryan, Kevin; Rajan, Deepak; Ahmed, Shabbir
2016-05-01
Our recently proposed scenario decomposition algorithm for stochastic 0-1 programs finds an optimal solution by evaluating and removing individual solutions that are discovered by solving scenario subproblems. In this work, we develop an asynchronous, distributed implementation of the algorithm which has computational advantages over existing synchronous implementations. Improvements to both the synchronous and asynchronous algorithms are proposed. We also test the algorithm on well-known stochastic 0-1 programs from the SIPLIB test library and are able to solve one previously unsolved instance from the test set.
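A toy, brute-force sketch of the evaluate-and-remove idea behind scenario decomposition is shown below for a two-scenario stochastic 0-1 program with three binary variables; the data are invented, and this is not the authors' distributed implementation.

    # Toy illustration of the evaluate-and-remove idea in scenario decomposition
    # for a stochastic 0-1 program (brute-force subproblems, invented data).
    # min_x  sum_s p_s * c_s.x   s.t.  a_s.x >= b_s for every scenario s,  x in {0,1}^3
    from itertools import product

    p = [0.5, 0.5]                              # scenario probabilities (assumed)
    c = [[4, 3, 6], [5, 2, 4]]                  # scenario costs (assumed)
    a = [[2, 1, 3], [1, 2, 2]]; b = [3, 3]      # scenario constraints (assumed)

    def feasible(x, s):
        return sum(ai * xi for ai, xi in zip(a[s], x)) >= b[s]

    def cost(x, s):
        return sum(ci * xi for ci, xi in zip(c[s], x))

    excluded, best, best_val = set(), None, float("inf")
    while True:
        # Scenario subproblems (non-anticipativity relaxed): give a lower bound
        # and a pool of candidate solutions.
        lb, candidates = 0.0, set()
        for s in range(len(p)):
            sols = [x for x in product((0, 1), repeat=3)
                    if feasible(x, s) and x not in excluded]
            xs = min(sols, key=lambda x: cost(x, s))
            lb += p[s] * cost(xs, s)
            candidates.add(xs)
        # Evaluate candidates in the full problem, then remove them from future searches.
        for x in candidates:
            if all(feasible(x, s) for s in range(len(p))):
                val = sum(p[s] * cost(x, s) for s in range(len(p)))
                if val < best_val:
                    best, best_val = x, val
            excluded.add(x)
        if lb >= best_val:                       # bounds meet: best is optimal
            break
    print("optimal x =", best, "expected cost =", best_val)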
Spacesuit Portable Life Support System Breadboard (PLSS 1.0) Development and Test Results
NASA Technical Reports Server (NTRS)
Watts, Carly A.; Vogel, Matt
2012-01-01
A multi-year effort has been carried out at the Johnson Space Center to develop an advanced EVA PLSS design intended to further the current state of the art by increasing operational flexibility, reducing consumables, and increasing robustness. This multi-year effort has culminated in the construction and operation of PLSS 1.0, a test rig that simulates full functionality of the advanced PLSS design. PLSS 1.0 integrates commercial off-the-shelf hardware with prototype technology development components, including the primary and secondary oxygen regulators, ventilation loop fan, Rapid Cycle Amine (RCA) swingbed, and Spacesuit Water Membrane Evaporator (SWME). PLSS 1.0 was tested from June 17th through September 30th, 2011. Testing accumulated 233 hours over 45 days, while executing 119 test points. An additional 164 hours of operational time were accrued during the test series, bringing the total operational time for PLSS 1.0 testing to 397 hours. Specific PLSS 1.0 test objectives assessed during this testing include: (1) confirming prototype components perform in a system-level test as they have performed during component-level testing; (2) identifying unexpected system-level interactions; (3) operating PLSS 1.0 in nominal steady-state EVA modes to baseline subsystem performance with respect to metabolic rate, ventilation loop pressure and flow rate, and environmental conditions; (4) simulating nominal transient EVA operational scenarios; (5) simulating contingency EVA operational scenarios; and (6) further evaluating prototype technology development components. Successful testing of the PLSS 1.0 provided a large database of test results that characterize system-level and component performance. With the exception of several minor anomalies, the PLSS 1.0 test rig performed as expected. Documented anomalies and observations include: (1) ventilation loop fan controller issues at high fan speeds (near 70,000 rpm, whereas the fan speed during nominal operations would be closer to 35,000 rpm); (2) RCA performance at boundary conditions, including carbon dioxide and water vapor saturation events, as well as reduced vacuum quality; (3) SWME valve anomalies (4 documented cases where the SWME failed to respond to a control signal or physically jammed, preventing SWME control); and (4) reduction of SWME hollow fiber hydrophobicity and significant reduction of the SWME degassing capability after significant accumulated test time.
Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.; ...
2016-06-09
Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.
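As an illustration of what one entry in such a unit-test library might look like, the sketch below tests mass conservation across a hypothetical separations step; the separate() function and its recovery fraction are invented for the example and are not part of the proposed library.

    # Hypothetical unit test for an "essential function" of a fuel cycle simulator:
    # mass conservation across a separations step. The separate() function and its
    # 99.9% recovery assumption are illustrative, not from the proposed library.
    def separate(feed_kg, recovery=0.999):
        """Split a feed stream into product and waste streams."""
        product = feed_kg * recovery
        waste = feed_kg - product
        return product, waste

    def test_separations_mass_balance():
        feed = 1000.0                                    # kg heavy metal
        product, waste = separate(feed)
        assert abs((product + waste) - feed) < 1e-9      # mass is conserved
        assert abs(product - 999.0) < 1e-6 and waste >= 0.0  # recovery applied as specified

    if __name__ == "__main__":
        test_separations_mass_balance()
        print("separations mass-balance unit test passed")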
Human Exploration Missions - Maturing Technologies to Sustain Crews
NASA Technical Reports Server (NTRS)
Mukai, Chiaki; Koch, Bernhard; Reese, Terrence G.
2012-01-01
Human exploration missions beyond low earth orbit will be long duration with abort scenarios of days to months. Providing crews with the essentials of life such as clean air and potable water means recycling human metabolic wastes back to useful products. Individual technologies are under development for such things as CO2 scrubbing, recovery of O2 from CO2, turning waste water into potable water, and so on. But in order to fully evaluate and mature these technologies, they must be tested in a relevant, high-functionality environment: a systems environment where technologies are challenged with real human metabolic wastes. It is for this purpose that an integrated systems ground testing capability at the Johnson Space Center is being readied. The relevant environment will include deep space habitat human accommodations, a sealed atmosphere of 8 psi total pressure and 32% oxygen concentration, life support systems (food, air, water), communications, crew accommodations, medical, EVA, tools, etc. Testing periods will approximate those of the expected missions (such as a near-Earth asteroid, Earth-Moon L2 or L1, the moon, and Mars). This type of integrated testing is needed not only for research and technology development but later during the mission design, development, test, and evaluation phases of preparing for the mission.
Does the thought of death contribute to the memory benefit of encoding with a survival scenario?
Bugaiska, Aurélia; Mermillod, Martial; Bonin, Patrick
2015-01-01
Four studies tested whether the thought of death contributes to the survival processing advantage found in memory tests (i.e., the survival effect). In the first study, we replicated the "Dying To Remember" (DTR) effect identified by Burns and colleagues whereby activation of death thoughts led to better retention than an aversive control situation. In Study 2, we compared an ancestral survival scenario, a modern survival scenario and a "life-after-death" scenario. The modern survival scenario and the dying scenario led to higher levels of recall than the ancestral scenario. In Study 3, we used a more salient death-thought scenario in which people imagine themselves on death row. Results showed that the "death-row" scenario yielded a level of recall similar to that of the ancestral survival condition. We also collected ratings of death-related thoughts (Studies 3 and 4) and of survival-related and planning thoughts (Study 4). The ratings indicated that death-related thoughts were induced more by the dying scenarios than by the survival scenarios, whereas the reverse was observed for both survival-related and planning thoughts. The findings are discussed in the light of two contrasting views of the influence of mortality salience in the survival effect.
Scenario based approach for multiple source Tsunami Hazard assessment for Sines, Portugal
NASA Astrophysics Data System (ADS)
Wronna, M.; Omira, R.; Baptista, M. A.
2015-08-01
In this paper, we present a scenario-based approach for tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE. Sines holds one of the most important deep-water ports, which contains oil-bearing, petrochemical, liquid bulk, coal and container terminals. The port and its industrial infrastructures are facing the ocean southwest towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water Model With Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level) and MHHW (mean higher high water). For each scenario, inundation is described by maximum values of wave height, flow depth, drawback, runup and inundation distance. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario and the influence of the tide on the aggregate scenario. The results confirm the composite of the Horseshoe and Marques de Pombal faults as the worst-case scenario. It governs the aggregate scenario with about 60% and inundates an area of 3.5 km².
Testing and Analysis Validation of a Metallic Repair Applied to a PRSEUS Tension Panel
NASA Technical Reports Server (NTRS)
Przekop, Adam; Jegley, Dawn C.
2013-01-01
A design and analysis of a repair concept applicable to a stiffened composite panel based on the Pultruded Rod Stitched Efficient Unitized Structure was recently completed. The damage scenario considered was a midbay-to-midbay saw-cut with a severed stiffener, flange and skin. Advanced modeling techniques such as mesh-independent definition of compliant fasteners and elastic-plastic material properties for metal parts were utilized in the finite element analysis supporting the design effort. A bolted metallic repair was selected so that it could be easily applied in the operational environment. The present work describes results obtained from a tension panel test conducted to validate both the repair concept and finite element analysis techniques used in the design effort. The test proved that the proposed repair concept is capable of sustaining load levels that are higher than those resulting from the current working stress allowables. This conclusion enables upward revision of the stress allowables that had been kept at an overly-conservative level due to concerns associated with repairability of the panels. Correlation of test data with finite element analysis results is also presented and assessed.
Simulation Test of a Head-Worn Display with Ambient Vision Display for Unusual Attitude Recovery
NASA Technical Reports Server (NTRS)
Arthur, Jarvis (Trey) J., III; Nicholas, Stephanie N.; Shelton, Kevin J.; Ballard, Kathryn; Prinzel, Lawrence J., III; Ellis, Kyle E.; Bailey, Randall E.; Williams, Steven P.
2017-01-01
Head-Worn Displays (HWDs) are envisioned as a possible equivalent to a Head-Up Display (HUD) in commercial and general aviation. A simulation experiment was conducted to evaluate whether the HWD can provide a level of performance equivalent to or better than a HUD in terms of unusual attitude recognition and recovery. A prototype HWD with ambient vision capability, which was varied (on/off) as an independent variable, was tested for attitude awareness. The simulation experiment was conducted in two parts: 1) short unusual attitude recovery scenarios where the aircraft is placed in an unusual attitude and a single-pilot crew recovered the aircraft; and, 2) a two-pilot crew operating in a realistic flight environment with "off-nominal" events to induce unusual attitudes. The data showed few differences in unusual attitude recognition and recovery performance between the tested head-down, head-up, and head-worn display concepts. The presence and absence of ambient vision stimulation was inconclusive. The ergonomic influences of the head-worn display, necessary to implement the ambient vision experimentation, may have influenced the pilot ratings and acceptance of the concepts.
Mobile and stationary laser weapon demonstrators of Rheinmetall Waffe Munition
NASA Astrophysics Data System (ADS)
Ludewigt, K.; Riesbeck, Th.; Baumgärtel, Th.; Schmitz, J.; Graf, A.; Jung, M.
2014-10-01
For some years Rheinmetall Waffe Munition has successfully developed, realised and tested a variety of versatile high energy laser (HEL) weapon systems for air- and ground-defence scenarios such as C-RAM and UXO clearing. By employing beam superimposition technology and a modular laser weapon concept, the total optical power has been successively increased. Stationary weapon platforms and now military mobile vehicles have been equipped with high energy laser effectors. Our contribution summarises the most recent development stages of Rheinmetall's high energy laser weapon program. We present three different vehicle-based HEL demonstrators: the 5 kW class Mobile HEL Effector Track V integrated in an M113 tank, the 20 kW class Mobile HEL Effector Wheel XX integrated in a multirole armoured vehicle GTK Boxer 8x8 and the 50 kW class Mobile HEL Effector Container L integrated in a reinforced container carried by an 8x8 truck. As a highlight, a stationary 30 kW Laser Weapon Demonstrator shows the capability to defeat saturated attacks of RAM targets and unmanned aerial vehicles. In 2013, all HEL demonstrators were tested in a firing campaign at the Rheinmetall testing centre in Switzerland. Major results of these tests are presented.
An EEG-Based Person Authentication System with Open-Set Capability Combining Eye Blinking Signals
Wu, Qunjian; Zeng, Ying; Zhang, Chi; Tong, Li; Yan, Bin
2018-01-01
The electroencephalogram (EEG) signal represents a subject's specific brain activity patterns and is considered an ideal biometric given its superior forgery prevention. However, the accuracy and stability of current EEG-based person authentication systems are still unsatisfactory in practical applications. In this paper, a multi-task EEG-based person authentication system combining eye blinking is proposed, which can achieve high precision and robustness. Firstly, we design a novel EEG-based biometric evoked paradigm using self- or non-self-face rapid serial visual presentation (RSVP). The designed paradigm obtains a distinct and stable biometric trait from EEG with a lower time cost. Secondly, event-related potential (ERP) features and morphological features are extracted from the EEG signals and eye blinking signals, respectively. Thirdly, a convolutional neural network and a back-propagation neural network are separately designed to produce score estimates for the EEG features and the eye blinking features. Finally, a score fusion technique based on the least squares method is proposed to obtain the final estimation score. The performance of the multi-task authentication system is improved significantly compared to the system using EEG only, with the average accuracy increasing from 92.4% to 97.6%. Moreover, open-set authentication tests with additional imposters and permanence tests for users are conducted to simulate practical scenarios, which have never been employed in previous EEG-based person authentication systems. A mean false accepted rate (FAR) of 3.90% and a mean false rejected rate (FRR) of 3.87% are accomplished in the open-set authentication tests and permanence tests, respectively, which illustrates the open-set authentication and permanence capability of our system. PMID:29364848
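The least-squares score fusion step can be sketched as follows, using synthetic EEG and eye-blink scores rather than the paper's data; the weights, threshold, and resulting error rates are purely illustrative.

    # Sketch of least-squares score fusion: learn weights that map the EEG score
    # and the eye-blink score to a single authentication score. Scores and labels
    # below are synthetic; this is not the paper's trained system.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    labels = rng.integers(0, 2, n)                     # 1 = genuine, 0 = impostor
    eeg_score = labels + 0.3 * rng.standard_normal(n)  # synthetic modality scores
    eye_score = labels + 0.5 * rng.standard_normal(n)

    # Least-squares fit of fused = w0 + w1*eeg + w2*eye to the labels
    A = np.column_stack([np.ones(n), eeg_score, eye_score])
    w, *_ = np.linalg.lstsq(A, labels, rcond=None)
    fused = A @ w

    threshold = 0.5                                    # illustrative decision threshold
    decision = fused > threshold
    far = np.mean(decision[labels == 0])               # false accept rate
    frr = np.mean(~decision[labels == 1])              # false reject rate
    print("fusion weights:", np.round(w, 3), "FAR=%.3f FRR=%.3f" % (far, frr))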
NASA Astrophysics Data System (ADS)
Snarski, Steve; Menozzi, Alberico; Sherrill, Todd; Volpe, Chris; Wille, Mark
2010-04-01
This paper describes experimental results from recent live-fire data collects that demonstrate the capability of a prototype system for projectile detection and tracking. This system, which is being developed at Applied Research Associates, Inc., under the FightSight program, consists of a high-speed thermal camera and sophisticated image processing algorithms to detect and track projectiles. The FightSight operational vision is automated situational intelligence to detect, track, and graphically map large-scale firefights and individual shooting events onto command and control (C2) systems in real time (shot location and direction, weapon ID, movements and trends). Gaining information on enemy-fire trajectories allows educated inferences on the enemy's intent, disposition, and strength. Our prototype projectile detection and tracking system was tested at the Joint Readiness Training Center (Ft Polk, LA) during live-fire convoy and mortar registration exercises in the summer of 2009. It was also tested during staged military-operations-on-urban-terrain (MOUT) firefight events at Aberdeen Test Center (Aberdeen, MD) under the Hostile Fire Defeat Army Technology Objective midterm experiment, also in the summer of 2009, where we introduced fusion with acoustic and EO sensors to provide 3D localization and near-real-time display of firing events. Results are presented in this paper that demonstrate effective and accurate detection and localization of weapon fire (5.56mm, 7.62mm, .50cal, 81/120mm mortars, 40mm) in diverse and challenging environments (dust, heat, day and night, rain, arid open terrain, urban clutter). FightSight's operational capabilities demonstrated under these live-fire data collects can support close-combat scenarios. As development continues, FightSight will be able to feed C2 systems with a symbolic map of enemy actions.
Access to edge scenarios for testing a scraper element in early operation phases of Wendelstein 7-X
Holbe, H.; Pedersen, T. Sunn; Geiger, J.; ...
2016-01-29
The edge topology of magnetic fusion devices is decisive for the control of the plasma exhaust. In Wendelstein 7-X, the island divertor concept will be used, for which the edge topology can change significantly as the internal currents in a plasma discharge evolve towards steady-state. Consequently, the device has been optimized to minimize such internal currents, in particular the bootstrap current [1]. Nonetheless, there are predicted pulse scenarios where effects of the remaining internal currents could potentially lead to overload of plasma-facing components. These internal currents are predicted to evolve on long time scales (tens of seconds) so their effects on the edge topology and the divertor heat loads may not be experimentally accessible in the first years of W7-X operation, where only relatively short pulses are possible. However, we show here that for at least one important long-pulse divertor operation issue, relevant physics experiments can be performed already in short-pulse operation, through judicious adjustment of the edge topology by the use of the existing coil sets. The specific issue studied here is a potential overload of the divertor element edges. This overload might be mitigated by the installation of an extra set of plasma-facing components, so-called scraper elements, as suggested in earlier publications. It is shown here that by a targeted control of edge topology, the effectiveness of such scraper elements can be tested already with uncooled test-scraper elements in short-pulse operation. Furthermore, this will allow an early and well-informed decision on whether long-pulse-capable (actively cooled) scraper elements should be built and installed.
A space transportation system operations model
NASA Technical Reports Server (NTRS)
Morris, W. Douglas; White, Nancy H.
1987-01-01
Presented is a description of a computer program which permits assessment of the operational support requirements of space transportation systems functioning in both a ground- and space-based environment. The scenario depicted provides for the delivery of payloads from Earth to a space station and beyond using upper stages based at the station. Model results are scenario dependent and rely on the input definitions of delivery requirements, task times, and available resources. Output is in terms of flight rate capabilities, resource requirements, and facility utilization. A general program description, program listing, input requirements, and sample output are included.
Envisioning Cognitive Robots for Future Space Exploration
NASA Technical Reports Server (NTRS)
Huntsberger, Terry; Stoica, Adrian
2010-01-01
Cognitive robots in the context of space exploration are envisioned with advanced capabilities of model building, continuous planning/re-planning, self-diagnosis, as well as the ability to exhibit a level of 'understanding' of new situations. An overview of some JPL components (e.g. CASPER, CAMPOUT) and a description of the architecture CARACaS (Control Architecture for Robotic Agent Command and Sensing) that combines these in the context of a cognitive robotic system operating in various scenarios are presented. Finally, two examples of typical scenarios, a multi-robot construction mission and a human-robot mission involving direct collaboration with humans, are given.
Nuclear Thermal Rocket/Vehicle Design Options for Future NASA Missions to the Moon and Mars
NASA Technical Reports Server (NTRS)
Borowski, Stanley K.; Corban, Robert R.; Mcguire, Melissa L.; Beke, Erik G.
1995-01-01
The nuclear thermal rocket (NTR) provides a unique propulsion capability to planners/designers of future human exploration missions to the Moon and Mars. In addition to its high specific impulse (approximately 850-1000 s) and engine thrust-to-weight ratio (approximately 3-10), the NTR can also be configured as a 'dual mode' system capable of generating electrical power for spacecraft environmental systems, communications, and enhanced stage operations (e.g., refrigeration for long-term liquid hydrogen storage). At present the Nuclear Propulsion Office (NPO) is examining a variety of mission applications for the NTR ranging from an expendable, single-burn, trans-lunar injection (TLI) stage for NASA's First Lunar Outpost (FLO) mission to all propulsive, multiburn, NTR-powered spacecraft supporting a 'split cargo-piloted sprint' Mars mission architecture. Each application results in a particular set of requirements in areas such as the number of engines and their respective thrust levels, restart capability, fuel operating temperature and lifetime, cryofluid storage, and stage size. Two solid core NTR concepts are examined -- one based on NERVA (Nuclear Engine for Rocket Vehicle Application) derivative reactor (NDR) technology, and a second concept which utilizes a ternary carbide 'twisted ribbon' fuel form developed by the Commonwealth of Independent States (CIS). The NDR and CIS concepts have an established technology database involving significant nuclear testing at or near representative operating conditions. Integrated systems and mission studies indicate that clusters of two to four 15 to 25 klbf NDR or CIS engines are sufficient for most of the lunar and Mars mission scenarios currently under consideration. This paper provides descriptions and performance characteristics for the NDR and CIS concepts, summarizes NASA's First Lunar Outpost and Mars mission scenarios, and describes characteristics for representative cargo and piloted vehicles compatible with a reference 240 t-class heavy lift launch vehicle (HLLV) and smaller 120 t HLLV option. Attractive performance characteristics and high-leverage technologies associated with both the engine and stage are identified, and supporting parametric sensitivity data is provided. The potential for commonality of engine and stage components to satisfy a broad range of lunar and Mars missions is also discussed.
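As a back-of-the-envelope illustration of why the high specific impulse matters, the rocket equation can be applied to a trans-lunar injection burn; the ~3.1 km/s delta-v and the 450 s chemical comparison value are generic assumptions, not figures from the paper.

    \Delta v = g_0 \, I_{sp} \ln\frac{m_0}{m_f}, \qquad
    \frac{m_0}{m_f} = \exp\!\left(\frac{3100\ \mathrm{m/s}}{9.81\ \mathrm{m/s^2}\times 900\ \mathrm{s}}\right) \approx 1.42
    \quad\text{versus}\quad
    \exp\!\left(\frac{3100}{9.81\times 450}\right) \approx 2.02\ \text{for a chemical stage.}

Under these assumed values, an NTR stage needs roughly a 30% propellant fraction for the burn, compared with about 50% for a chemical stage performing the same maneuver.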
Representation of Probability Density Functions from Orbit Determination using the Particle Filter
NASA Technical Reports Server (NTRS)
Mashiku, Alinda K.; Garrison, James; Carpenter, J. Russell
2012-01-01
Statistical orbit determination enables us to obtain estimates of the state and the statistical information of its region of uncertainty. In order to obtain an accurate representation of the probability density function (PDF) that incorporates higher order statistical information, we propose the use of nonlinear estimation methods such as the Particle Filter. The Particle Filter (PF) is capable of providing a PDF representation of the state estimates whose accuracy is dependent on the number of particles or samples used. For this method to be applicable to real case scenarios, we need a way of accurately representing the PDF in a compressed manner with little information loss. Hence we propose using the Independent Component Analysis (ICA) as a non-Gaussian dimensional reduction method that is capable of maintaining higher order statistical information obtained using the PF. Methods such as the Principal Component Analysis (PCA) are based on utilizing up to second order statistics, and hence will not suffice in maintaining maximum information content. Both the PCA and the ICA are applied to two scenarios that involve a highly eccentric orbit with a lower a priori uncertainty covariance and a less eccentric orbit with a higher a priori uncertainty covariance, to illustrate the capability of the ICA in relation to the PCA.
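The contrast between PCA and ICA can be sketched on a synthetic, non-Gaussian particle cloud using off-the-shelf implementations; the particles below are generated from skewed sources and are not orbit-determination output.

    # Sketch: compress a non-Gaussian cloud of particle-filter-like samples with PCA
    # and with ICA, and compare how much higher-order structure (skewness) each
    # retained component keeps. Particles are synthetic, not orbit-determination output.
    import numpy as np
    from scipy.stats import skew
    from sklearn.decomposition import PCA, FastICA

    rng = np.random.default_rng(2)
    n = 5000
    # Two skewed (non-Gaussian) sources mixed into a 4-D "state" cloud
    sources = np.column_stack([rng.exponential(1.0, n) - 1.0,
                               rng.lognormal(0.0, 0.5, n) - np.exp(0.125)])
    mixing = rng.standard_normal((2, 4))
    particles = sources @ mixing + 0.05 * rng.standard_normal((n, 4))

    pca = PCA(n_components=2).fit(particles)
    ica = FastICA(n_components=2, random_state=0).fit(particles)

    # ICA components recover the skewed sources; PCA components dilute the skewness
    print("skewness of PCA components:", np.round(skew(pca.transform(particles)), 2))
    print("skewness of ICA components:", np.round(skew(ica.transform(particles)), 2))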
Validation of the SEPHIS Program for the Modeling of the HM Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kyser, E.A.
The SEPHIS computer program is currently being used to evaluate the effect of all process variables on the criticality safety of the HM 1st Uranium Cycle process in H Canyon. Its use has three main purposes: (1) to provide a better technical basis for those process variables that do not have any realistic effect on the criticality safety of the process; (2) to qualitatively study those conditions that have been previously recognized to affect the nuclear safety of the process, or additional conditions that modeling has indicated may pose a criticality safety issue; and (3) to judge the adequacy of existing or future neutron monitor locations in the detection of the initial stages of reflux for specific scenarios. Although SEPHIS generally over-predicts the distribution of uranium to the organic phase, it is a capable simulation tool as long as the user recognizes its biases and takes special care when using the program for scenarios where the prediction bias is non-conservative. The temperature coefficient used by SEPHIS is poor at predicting the effect of temperature on uranium extraction for the 7.5 percent TBP used in the HM process. Therefore, SEPHIS should not be used to study temperature-related scenarios. However, it may be used within normal operating temperatures when other process variables are being studied. Care must be given to understanding the prediction bias and its effect on any conclusion for the particular scenario that is under consideration. Uranium extraction with aluminum nitrate is over-predicted more severely than for nitric acid systems. However, the extraction section of the 1A bank has sufficient excess capability that these errors, while relatively large, still allow SEPHIS to be used to develop reasonable qualitative assessments for reflux scenarios. However, high losses to the 1AW stream cannot be modeled by SEPHIS.
Human in the Loop Integrated Life Support Systems Ground Testing
NASA Technical Reports Server (NTRS)
Henninger, Donald L.; Marmolejo, Jose A.; Seaman, Calvin H.
2012-01-01
Human exploration missions beyond low earth orbit will be long duration with abort scenarios of days to months. This necessitates provisioning the crew with all the things they will need to sustain themselves while carrying out mission objectives. Systems engineering and integration is critical, to the point where extensive integrated testing of life support systems on the ground is required to identify and mitigate risks. Ground test facilities (human-rated altitude chambers) at the Johnson Space Center are being readied to integrate all the systems for a mission along with a human test crew. The relevant environment will include deep space habitat human accommodations, a sealed atmosphere capable of 14.7 to 8 psi total pressure and 21 to 32% oxygen concentration, life support systems (food, air, and water), communications, crew accommodations, medical, EVA, tools, etc. Testing periods will approximate those of the expected missions (such as a near-Earth asteroid, Earth-Moon L2 or L1, the moon, or Mars). This type of integrated testing is needed for research and technology development as well as later during the mission design, development, test, and evaluation (DDT&E) phases of an approved program. Testing will evolve to be carried out at the mission level ("fly the mission on the ground"). Mission testing will also serve to inform the public and provide the opportunity for active participation by international, industrial and academic partners.
Cost-effectiveness of tubal patency tests.
Verhoeve, H R; Moolenaar, L M; Hompes, P; van der Veen, F; Mol, B W J
2013-04-01
Guidelines are not in agreement on the most effective diagnostic scenario for tubal patency testing; we therefore evaluated the cost-effectiveness of invasive tubal testing in subfertile couples compared with no testing and treatment. We performed a cost-effectiveness analysis in a decision analytic framework, applied to a computer-simulated cohort of subfertile women. We evaluated six scenarios: (1) no tests and no treatment; (2) immediate treatment without tubal testing; (3) delayed treatment without tubal testing; (4) hysterosalpingogram (HSG), followed by immediate or delayed treatment, according to diagnosis (tailored treatment); (5) HSG and a diagnostic laparoscopy (DL) in case HSG does not prove tubal patency, followed by tailored treatment; and (6) DL followed by tailored treatment. The main outcome measure was expected cumulative live births after 3 years; secondary outcomes were cost per couple and the incremental cost-effectiveness ratio. For a 30-year-old woman with otherwise unexplained subfertility for 12 months, 3-year cumulative live birth rates were 51.8, 78.1, 78.4, 78.4, 78.6 and 78.4%, and costs per couple were €0, €6968, €5063, €5410, €5405 and €6163 for scenarios 1, 2, 3, 4, 5 and 6, respectively. The incremental cost-effectiveness ratios, compared with scenario 1 (reference strategy), were €26,541, €19,046, €20,372, €20,150 and €23,184 for scenarios 2, 3, 4, 5 and 6, respectively. Sensitivity analysis showed the model to be robust over a wide range of values for the variables. The most cost-effective scenario is to perform no diagnostic tubal tests and to delay in vitro fertilisation (IVF) treatment for at least 12 months for women younger than 38 years old, and to perform no tubal tests and start immediate IVF treatment from the age of 39 years. If an invasive diagnostic test is planned, HSG followed by tailored treatment, or a DL if HSG shows no tubal patency, is more cost-effective than DL. © 2013 The Authors. BJOG An International Journal of Obstetrics and Gynaecology © 2013 RCOG.
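For orientation, the reported ratios follow from the usual incremental cost-effectiveness definition; a rough check with the rounded figures above (small differences are due to rounding) gives:

    \mathrm{ICER}_i = \frac{C_i - C_1}{E_i - E_1}, \qquad
    \mathrm{ICER}_3 \approx \frac{5063 - 0}{0.784 - 0.518} \approx 19{,}000\ \text{euro per additional live birth,}

which is consistent with the reported €19,046 for scenario 3.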
Global and Regional Sea Level Rise Scenarios for the United States
NASA Technical Reports Server (NTRS)
Sweet, William V.; Kopp, Robert E.; Weaver, Christopher P.; Obeysekera, Jayantha; Horton, Radley M.; Thieler, E. Robert; Zervas, Chris
2017-01-01
The Sea Level Rise and Coastal Flood Hazard Scenarios and Tools Interagency Task Force, jointly convened by the U.S. Global Change Research Program (USGCRP) and the National Ocean Council (NOC), began its work in August 2015. The Task Force has focused its efforts on three primary tasks: 1) updating scenarios of global mean sea level (GMSL) rise, 2) integrating the global scenarios with regional factors contributing to sea level change for the entire U.S. coastline, and 3) incorporating these regionally appropriate scenarios within coastal risk management tools and capabilities deployed by individual agencies in support of the needs of specific stakeholder groups and user communities. This technical report focuses on the first two of these tasks and reports on the production of gridded relative sea level (RSL, which includes both ocean-level change and vertical land motion) projections for the United States associated with an updated set of GMSL scenarios. In addition to supporting the longer-term Task Force effort, this new product will be an important input into the USGCRP Sustained Assessment process and upcoming Fourth National Climate Assessment (NCA4) due in 2018. This report also serves as a key technical input into the in-progress USGCRP Climate Science Special Report (CSSR).
Medical Scenarios Relevant to Spaceflight
NASA Technical Reports Server (NTRS)
Bacal, Kira; Hurst, Victor; Doerr, Harold
2004-01-01
The Medical Operational Support Team (MOST) was tasked by the JSC Space Medicine and Life Sciences Directorate (SLSD) to incorporate medical simulation into 1) medical training for astronaut-crew medical officers (CMO) and medical flight control teams and 2) evaluations of procedures and resources required for medical care aboard the International Space Station (ISS). Development of evidence-based medical scenarios that mimic the physiology observed during spaceflight will be needed for the MOST to complete these two tasks. The MOST used a human patient simulator, the ISS-like resources in the Medical Simulation Laboratory (MSL), and evidence from space operations, military operations and medical literature to develop space-relevant medical scenarios. These scenarios include conditions concerning airway management, Advanced Cardiac Life Support (ACLS) and mitigating anaphylactic symptoms. The MOST has used these space-relevant medical scenarios to develop a preliminary space medical training regimen for NASA flight surgeons, Biomedical Flight Controllers (Biomedical Engineers; BME) and CMO-analogs. This regimen is conducted by the MOST in the MSL. The MOST has the capability to develop evidence-based space-relevant medical scenarios that can help SLSD 1) demonstrate the proficiency of medical flight control teams to mitigate space-relevant medical events and 2) validate next-generation medical equipment and procedures for space medicine applications.
ERIC Educational Resources Information Center
van der Meulen, Ineke; van de Sandt-Koenderman, W. Mieke E.; Duivenvoorden, Hugo J.; Ribbers, Gerard M.
2010-01-01
Background: This study explores the psychometric qualities of the Scenario Test, a new test to assess daily-life communication in severe aphasia. The test is innovative in that it: (1) examines the effectiveness of verbal and non-verbal communication; and (2) assesses patients' communication in an interactive setting, with a supportive…
NASA Astrophysics Data System (ADS)
Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo; Tinti, Stefano
2013-04-01
TRIDEC is a EU-FP7 Project whose main goal is, in general terms, to develop suitable strategies for the management of crises possibly arising in the Earth management field. The general paradigms adopted by TRIDEC to develop those strategies include intelligent information management, the capability of managing dynamically increasing volumes and dimensionality of information in complex events, and collaborative decision making in systems that are typically very loosely coupled. The two areas where TRIDEC applies and tests its strategies are tsunami early warning and industrial subsurface development. In the field of tsunami early warning, TRIDEC aims at developing a Decision Support System (DSS) that integrates 1) a set of seismic, geodetic and marine sensors devoted to the detection and characterisation of possible tsunamigenic sources and to monitoring the time and space evolution of the generated tsunami, 2) large-volume databases of pre-computed numerical tsunami scenarios, 3) a proper overall system architecture. Two test areas are dealt with in TRIDEC: the western Iberian margin and the eastern Mediterranean. In this study, we focus on the western Iberian margin with special emphasis on the Portuguese coasts. The strategy adopted in TRIDEC plans to populate two different databases, called "Virtual Scenario Database" (VSDB) and "Matching Scenario Database" (MSDB), both of which deal only with earthquake-generated tsunamis. In the VSDB we simulate numerically few large-magnitude events generated by the major known tectonic structures in the study area. Heterogeneous slip distributions on the earthquake faults are introduced to simulate events as "realistically" as possible. The members of the VSDB represent the unknowns that the TRIDEC platform must be able to recognise and match during the early crisis management phase. On the other hand, the MSDB contains a very large number (order of thousands) of tsunami simulations performed starting from many different simple earthquake sources of different magnitudes and located in the "vicinity" of the virtual scenario earthquake. In the DSS perspective, the members of the MSDB have to be suitably combined based on the information coming from the sensor networks, and the results are used during the crisis evolution phase to forecast the degree of exposition of different coastal areas. We provide examples from both databases whose members are computed by means of the in-house software called UBO-TSUFD, implementing the non-linear shallow-water equations and solving them over a set of nested grids that guarantee a suitable spatial resolution (few tens of meters) in specific, suitably chosen, coastal areas.
A data fusion approach for mapping daily evapotranspiration at field scale
USDA-ARS?s Scientific Manuscript database
The capability for mapping water consumption over cropped landscapes on a daily and seasonal basis is increasingly relevant given forecasted scenarios of reduced water availability. Prognostic modeling of water losses to the atmosphere, or evapotranspiration (ET), at field or finer scales in agricul...
Ballistic missile defense technologies
NASA Astrophysics Data System (ADS)
1985-09-01
A report on Ballistic Missile Technologies includes the following: Executive summary; Introduction; Ballistic missiles then and now; Deterrence, U.S. nuclear strategy, and BMD; BMD capabilities and the strategic balance; Crisis stability, arms race stability, and arms control issues; Ballistic missile defense technologies; Feasibility; Alternative future scenarios; Alternative R&D programs.
Virtual reality simulator for vitreoretinal surgery using integrated OCT data.
Kozak, Igor; Banerjee, Pat; Luo, Jia; Luciano, Cristian
2014-01-01
Operative practice using surgical simulators has become a part of training in many surgical specialties, including ophthalmology. We introduce a virtual reality retina surgery simulator capable of integrating optical coherence tomography (OCT) scans from real patients for practicing vitreoretinal surgery using different pathologic scenarios.
Performance Evaluation of a Prototyped Wireless Ground Sensor Network
2005-03-01
The network was capable of dynamic adaptation to failure and degradation. Subject terms: Wireless Sensor Network, Unmanned Sensor, Unattended... The evaluated scenarios included outdoor, urban and indoor environments. The characteristics of wireless sensor networks, types...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Villone, F.; Mastrostefano, S.; Calabrò, G.
2014-08-15
One of the main FAST (Fusion Advanced Studies Torus) goals is to have a flexible experiment capable of testing tools and scenarios for safe and reliable tokamak operation, in order to support ITER and help the final DEMO design. In particular, in this paper, we focus on operation close to a possible border of stability related to low-q operation. To this purpose, a new FAST scenario has been designed at I_p = 10 MA, B_T = 8.5 T, q_95 ≈ 2.3. Transport simulations, carried out by using the code JETTO and the first-principle transport model GLF23, indicate that, under these conditions, FAST could achieve an equivalent Q ≈ 3.5. FAST will be equipped with a set of internal active coils for feedback control, which will produce magnetic perturbations with toroidal number n = 1 or n = 2. Magnetohydrodynamic (MHD) mode analysis and feedback control simulations performed with the codes MARS, MARS-F, and CarMa (both assuming the presence of a perfectly conducting wall and using the exact 3D resistive wall structure) show the possibility of the FAST conductive structures to stabilize n = 1 ideal modes. This therefore leaves room for active mitigation of the resistive mode (down to a characteristic time of 1 ms) for safety purposes, i.e., to avoid dangerous MHD-driven plasma disruption, when working close to the machine limits and at magnetic and kinetic energy densities not far from reactor values.
A vision system planner for increasing the autonomy of the Extravehicular Activity Helper/Retriever
NASA Technical Reports Server (NTRS)
Magee, Michael
1993-01-01
The Extravehicular Activity Retriever (EVAR) is a robotic device currently being developed by the Automation and Robotics Division at the NASA Johnson Space Center to support activities in the neighborhood of the Space Shuttle or Space Station Freedom. As the name implies, the Retriever's primary function will be to provide the capability to retrieve tools and equipment or other objects which have become detached from the spacecraft, but it will also be able to rescue a crew member who may have become inadvertently de-tethered. Later goals will include cooperative operations between a crew member and the Retriever such as fetching a tool that is required for servicing or maintenance operations. This paper documents a preliminary design for a Vision System Planner (VSP) for the EVAR that is capable of achieving visual objectives provided to it by a high level task planner. Typical commands which the task planner might issue to the VSP relate to object recognition, object location determination, and obstacle detection. Upon receiving a command from the task planner, the VSP then plans a sequence of actions to achieve the specified objective using a model-based reasoning approach. This sequence may involve choosing an appropriate sensor, selecting an algorithm to process the data, reorienting the sensor, adjusting the effective resolution of the image using lens zooming capability, and/or requesting the task planner to reposition the EVAR to obtain a different view of the object. An initial version of the Vision System Planner which realizes the above capabilities using simulated images has been implemented and tested. The remaining sections describe the architecture and capabilities of the VSP and its relationship to the high level task planner. In addition, typical plans that are generated to achieve visual goals for various scenarios are discussed. Specific topics to be addressed will include object search strategies, repositioning of the EVAR to improve the quality of information obtained from the sensors, and complementary usage of the sensors and redundant capabilities.
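A minimal sketch of the goal-to-action planning described above is given below; the goal names, sensor choices, and rules are illustrative placeholders rather than the actual VSP design.

    # Illustrative sketch of a model-based vision planner that turns a high-level
    # visual goal into a sequence of sensing actions. Goals, sensors, and rules
    # are placeholders, not the actual VSP implementation.
    def plan_vision_actions(goal, object_size_m, range_m):
        """Return an ordered list of actions to achieve a visual objective."""
        actions = []
        # Choose a sensor appropriate to the objective (assumed rule of thumb)
        sensor = "stereo_camera" if goal in ("locate", "detect_obstacle") else "mono_camera"
        actions.append(("select_sensor", sensor))
        # Adjust effective resolution via zoom if the object subtends too small an angle
        if object_size_m / range_m < 0.01:
            actions.append(("zoom", "narrow_fov"))
        actions.append(("point_sensor", "object_estimate"))
        actions.append(("run_algorithm", {"recognize": "model_matching",
                                          "locate": "triangulation",
                                          "detect_obstacle": "range_thresholding"}[goal]))
        # If the viewing geometry is poor, ask the task planner to reposition the EVAR
        if range_m > 20.0:
            actions.append(("request_reposition", "closer_viewpoint"))
        return actions

    print(plan_vision_actions("locate", object_size_m=0.1, range_m=25.0))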
Analytical modeling of transport aircraft crash scenarios to obtain floor pulses
NASA Technical Reports Server (NTRS)
Wittlin, G.; Lackey, D.
1983-01-01
The KRASH program was used to analyze transport aircraft candidate crash scenarios. Aircraft floor pulses and seat/occupant responses are presented. Results show that: (1) longitudinal-only pulses can be represented by equivalent step inputs and/or static requirements; (2) the L1649 crash test floor longitudinal pulse for the aft direction (forward inertia) is less than 9g static or an equivalent 5g pulse; aft inertia accelerations are extremely small (≤3g) for representative crash scenarios; (3) a viable procedure to relate crash scenario floor pulses to standard laboratory dynamic and static test data using state-of-the-art analysis and test procedures was demonstrated; and (4) floor pulse magnitudes are expected to be lower for wide-body aircraft than for smaller narrow-body aircraft.
Bauminger-Zviely, Nirit; Kugelmass, Dana Shoham
2013-02-01
Affective bonding, social attention, and intersubjective capabilities are all conditions for jealousy, and are deficient in autism. Thus, examining jealousy and attachment may elucidate the socioemotional deficit in autism spectrum disorders (ASD). Jealousy was provoked in 30 high-functioning children with ASD (HFASD) and 30 typical children (ages 3-6 years) through two triadic social (storybook-reading) scenarios - mother-child-rival and stranger-child-rival. A control nonsocial scenario included mother/stranger-book. For both groups, higher jealousy expressions emerged for mother than stranger, and for social than nonsocial scenarios. Attachment security (using Attachment Q-Set) was lower for HFASD than typical groups, but attachment correlated negatively with jealous verbalizations for both groups and with jealous eye gazes for HFASD. Implications for understanding jealousy's developmental complexity and the socioemotional deficit in ASD are discussed.
Process Integrated Mechanism for Human-Computer Collaboration and Coordination
2012-09-12
...system we implemented the TAFLib library that provides the communication with TAF. The data received from the TAF server is collected in a data structure... send new commands and flight plans for the UAVs to the TAF server. Test scenarios: Several scenarios have been implemented to test and prove our... areas. Shooting Enemies: The basic scenario proved the successful integration of PIM and the TAF simulation environment. Subsequently we improved the CP...
NASA Astrophysics Data System (ADS)
Kröger, Knut; Creutzburg, Reiner
2013-05-01
The aim of this paper is to show the usefulness of modern forensic software tools for processing large-scale digital investigations. In particular, we focus on the new version of Nuix 4.2 and compare it with AccessData FTK 4.2, X-Ways Forensics 16.9 and Guidance Encase Forensic 7 regarding its performance, functionality, usability and capability. We will show how these software tools work with large forensic images and how capable they are in examining complex and big data scenarios.
Emulytics for Cyber-Enabled Physical Attack Scenarios: Interim LDRD Report of Year One Results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clem, John; Urias, Vincent; Atkins, William Dee
Sandia National Laboratories has funded the research and development of a new capability to interactively explore the effects of cyber exploits on the performance of physical protection systems. This informal, interim report of progress summarizes the project’s basis and year one (of two) accomplishments. It includes descriptions of confirmed cyber exploits against a representative testbed protection system and details the development of an emulytics capability to support live, virtual, and constructive experiments. This work will help stakeholders better engineer, operate, and maintain reliable protection systems.
Deterministic approach for multiple-source tsunami hazard assessment for Sines, Portugal
NASA Astrophysics Data System (ADS)
Wronna, M.; Omira, R.; Baptista, M. A.
2015-11-01
In this paper, we present a deterministic approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe). Sines has one of the most important deep-water ports, which has oil-bearing, petrochemical, liquid-bulk, coal, and container terminals. The port and its industrial infrastructures face the ocean southwest towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model wIth Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level), and MHHW (mean higher high water). For each scenario, the tsunami hazard is described by maximum values of wave height, flow depth, drawback, maximum inundation area and run-up. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario, and the influence of the tide on the aggregate scenario. The results confirm the composite source of Horseshoe and Marques de Pombal faults as the worst-case scenario, with wave heights of over 10 m, which reach the coast approximately 22 min after the rupture. It dominates the aggregate scenario by about 60 % of the impact area at the test site, considering maximum wave height and maximum flow depth. The HSMPF scenario inundates a total area of 3.5 km2.
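To make the aggregation step concrete, the sketch below shows one way per-scenario maximum wave-height grids could be combined into an aggregate hazard grid (cell-wise maximum) with the static tide treated as a vertical offset, and how the share of the area dominated by each source could be tallied. The grid sizes, tide values, and the offset treatment are assumptions for illustration and do not reproduce the NSWING nested-grid simulations.

```python
# Illustrative sketch of building an "aggregate scenario" hazard grid from
# per-scenario maximum wave heights: the aggregate value at each cell is the
# maximum over the individual source scenarios, and the static tide effect is
# represented as a vertical offset of the reference sea level. Array shapes,
# tide values, and the offset treatment are assumptions for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Maximum wave height grids (metres) for six hypothetical source scenarios,
# on a common 100 x 100 nested grid.
scenario_hmax = [rng.gamma(shape=2.0, scale=1.5, size=(100, 100)) for _ in range(6)]

# Aggregate scenario: cell-wise maximum across all sources.
aggregate_hmax = np.maximum.reduce(scenario_hmax)

# Static tidal stages relative to mean sea level (illustrative values, metres).
tidal_offsets = {"MLLW": -1.0, "MSL": 0.0, "MHHW": +1.0}

for stage, dz in tidal_offsets.items():
    # A simple proxy for tide influence: shift the effective water level.
    effective_hmax = aggregate_hmax + dz
    print(f"{stage}: max aggregate height = {effective_hmax.max():.1f} m")

# Share of the grid where a given scenario dominates the aggregate.
dominant = np.argmax(np.stack(scenario_hmax), axis=0)
for i in range(6):
    share = np.mean(dominant == i) * 100
    print(f"scenario {i}: dominates {share:.0f}% of cells")
```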
Web Based Tool for Mission Operations Scenarios
NASA Technical Reports Server (NTRS)
Boyles, Carole A.; Bindschadler, Duane L.
2008-01-01
A conventional practice for spaceflight projects is to document scenarios in a monolithic Operations Concept document. Such documents can be hundreds of pages long and may require laborious updates. Software development practice utilizes scenarios in the form of smaller, individual use cases, which are often structured and managed using UML. We have developed a process and a web-based scenario tool that utilizes a similar philosophy of smaller, more compact scenarios (but avoids the formality of UML). The need for a scenario process and tool became apparent during the authors' work on a large astrophysics mission. It was noted that every phase of the Mission (e.g., formulation, design, verification and validation, and operations) looked back to scenarios to assess completeness of requirements and design. It was also noted that terminology needed to be clarified and structured to assure communication across all levels of the project. Attempts to manage, communicate, and evolve scenarios at all levels of a project using conventional tools (e.g., Excel) and methods (Scenario Working Group meetings) were not effective given limitations on budget and staffing. The objective of this paper is to document the scenario process and tool created to offer projects a low-cost capability to create, communicate, manage, and evolve scenarios throughout project development. The process and tool have the further benefit of allowing the association of requirements with particular scenarios, establishing and viewing relationships between higher- and lower-level scenarios, and the ability to place all scenarios in a shared context. The resulting structured set of scenarios is widely visible (using a web browser), easily updated, and can be searched according to various criteria including the level (e.g., Project, System, and Team) and Mission Phase. Scenarios are maintained in a web-accessible environment that provides a structured set of scenario fields and allows for maximum visibility across the project. One key aspect is that the tool was built for a scenario process that accounts for stakeholder input, review, comment, and concurrence. By creating well-designed opportunities for stakeholder input and concurrence and by making the scenario content easily accessible to all project personnel, we maximize the opportunities for stakeholders to both understand and agree on the concepts for how their mission is to be carried out.
Reed, H; Leckey, Cara A C; Dick, A; Harvey, G; Dobson, J
2018-01-01
Ultrasonic damage detection and characterization is commonly used in nondestructive evaluation (NDE) of aerospace composite components. In recent years there has been increased development of guided wave based methods. In real materials and structures, these dispersive waves result in complicated behavior in the presence of complex damage scenarios. Model-based characterization methods utilize accurate three-dimensional finite element models (FEMs) of guided wave interaction with realistic damage scenarios to aid in defect identification and classification. This work describes an inverse solution for realistic composite damage characterization by comparing the wavenumber-frequency spectra of experimental and simulated ultrasonic inspections. The composite laminate material properties are first verified through a Bayesian solution (Markov chain Monte Carlo), enabling uncertainty quantification surrounding the characterization. The FEM is then parameterized with a damage model capable of describing the typical complex damage created by impact events in composites, and a study is undertaken to assess the efficacy of the proposed damage model and the comparative metrics between the experimental and simulated output. The damage is characterized through a transdimensional Markov chain Monte Carlo solution, enabling a flexible damage model capable of adapting to the complex damage geometry investigated here. The posterior probability distributions of the individual delamination petals as well as the overall envelope of the damage site are determined. Copyright © 2017 Elsevier B.V. All rights reserved.
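As a concrete illustration of the Bayesian step described above, the sketch below runs a plain random-walk Metropolis-Hastings sampler on a toy one-parameter forward model. The forward model, noise level, prior bounds, and proposal width are all assumed for illustration; the paper's actual inversion compares wavenumber-frequency spectra from 3D finite element simulations and uses a transdimensional sampler for the damage geometry.

```python
# Minimal Metropolis-Hastings sketch for Bayesian parameter estimation, here a
# toy one-parameter "material property" inferred by matching a simulated
# response to noisy measurements. Everything numerical here is an assumption
# for illustration, not the paper's model.

import numpy as np

rng = np.random.default_rng(1)

def forward_model(stiffness, freqs):
    # Toy dispersion-like response: phase velocity grows with sqrt(stiffness).
    return np.sqrt(stiffness) * np.sqrt(freqs)

freqs = np.linspace(0.1, 1.0, 50)
true_stiffness = 70.0
sigma = 0.2
data = forward_model(true_stiffness, freqs) + rng.normal(0, sigma, freqs.size)

def log_posterior(stiffness):
    if not (10.0 < stiffness < 200.0):      # uniform prior bounds (assumed)
        return -np.inf
    resid = data - forward_model(stiffness, freqs)
    return -0.5 * np.sum((resid / sigma) ** 2)

samples, current, logp = [], 50.0, log_posterior(50.0)
for _ in range(20000):
    proposal = current + rng.normal(0, 2.0)           # random-walk proposal
    logp_prop = log_posterior(proposal)
    if np.log(rng.uniform()) < logp_prop - logp:      # accept/reject step
        current, logp = proposal, logp_prop
    samples.append(current)

post = np.array(samples[5000:])                       # discard burn-in
print(f"posterior mean = {post.mean():.1f}, 95% CI = "
      f"({np.percentile(post, 2.5):.1f}, {np.percentile(post, 97.5):.1f})")
```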
Internet Data Delivery for Future Space Missions
NASA Technical Reports Server (NTRS)
Rash, James; Hogie, Keith; Casasanta, Ralph; Hennessy, Joseph F. (Technical Monitor)
2002-01-01
This paper presents work being done at NASA/GSFC (Goddard Space Flight Center) on applying standard Internet applications and protocols to meet the technology challenge of future satellite missions. Internet protocols (IP) can provide seamless dynamic communication among heterogeneous instruments, spacecraft, ground stations, and constellations of spacecraft. A primary component of this work is to design and demonstrate automated end-to-end transport of files in a dynamic space environment using off-the-shelf, low-cost, commodity-level standard applications and protocols. These functions and capabilities will become increasingly significant in the years to come as both Earth and space science missions fly more sensors and the present labor-intensive, mission-specific techniques for processing and routing data become prohibitively expensive. This paper describes how an IP-based communication architecture can support existing operations concepts and how it will enable some new and complex communication and science concepts. The authors identify specific end-to-end file transfers all the way from instruments to control centers and scientists, and then describe how each data flow can be supported using standard Internet protocols and applications. The scenarios include normal data downlink and command uplink as well as recovery scenarios for both onboard and ground failures. The scenarios are based on an Earth orbiting spacecraft with data rates and downlink capabilities from 300 Kbps to 4 Mbps. Many examples are based on designs currently being investigated for the Global Precipitation Measurement (GPM) mission.
Rychert, Marta; Wilkins, Chris
2015-12-01
In mid-July 2013, New Zealand passed the Psychoactive Substances Act (PSA), which allowed 'low risk' psychoactive products ('legal highs') to be approved for legal sale. In early May 2014, following public protest, the Psychoactive Substances Amendment Act (PSAA) was passed banning animal testing of psychoactive products, potentially making the new regime unworkable. This study investigated strategies to overcome the impasse created by the animal testing ban. Solutions to the impasse were investigated using 'scenario' and 'stakeholder' analysis. Legislation, parliamentary debates, and regulatory statements related to the PSA and animal testing were reviewed. Strategies to resolve the impasse were discussed with stakeholders including Psychoactive Substances Regulatory Authority (PSRA) officials, health officials, a legal high industry lawyer, and a leading legal highs manufacturer. This process generated six possible scenarios and five decision-making criteria of key importance to major stakeholders. Scenarios were then evaluated based on feedback from the industry and regulators. The six scenarios were: (1) pragmatic modification of the animal testing ban; (2) waiting until new non-animal test models are internationally accepted; (3) use of non-validated replacement test methods; (4) judicial challenge of the animal testing ban; (5) 'creative compliance' by only presenting human clinical trial results; and (6) philosophical re-conceptualisation of the 'benefits' from psychoactive products. Options 1 and 5 appear to be the most attractive overall solutions. However, both rely on a new political consensus and astute framing of the issues by political communicators. Political decision makers may be happy to accept Scenario 2, which would impose significant delays. A 'failed' pharmaceutical product with psychoactive effects may have the test data required to be approved under Scenarios 1 and 5. Ultimately, the pleasurable benefits from psychoactive products may need to be included in the debate. Copyright © 2015 Elsevier B.V. All rights reserved.
An inverse approach to perturb historical rainfall data for scenario-neutral climate impact studies
NASA Astrophysics Data System (ADS)
Guo, Danlu; Westra, Seth; Maier, Holger R.
2018-01-01
Scenario-neutral approaches are being used increasingly for climate impact assessments, as they allow water resource system performance to be evaluated independently of climate change projections. An important element of these approaches is the generation of perturbed series of hydrometeorological variables that form the inputs to hydrologic and water resource assessment models, with most scenario-neutral studies to date considering only shifts in the average and a limited number of other statistics of each climate variable. In this study, a stochastic generation approach is used to perturb not only the average of the relevant hydrometeorological variables, but also attributes such as intermittency and extremes. An optimization-based inverse approach is developed to obtain hydrometeorological time series with uniform coverage across the possible ranges of rainfall attributes (referred to as the 'exposure space'). The approach is demonstrated on a widely used rainfall generator, WGEN, for a case study at Adelaide, Australia, and is shown to be capable of producing evenly distributed samples over the exposure space. The inverse approach expands the applicability of the scenario-neutral approach in evaluating a water resource system's sensitivity to a wider range of plausible climate change scenarios.
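The inverse idea can be illustrated with a deliberately simplified daily rainfall generator standing in for WGEN: target values of selected rainfall attributes are specified, and an optimizer searches for generator parameters whose simulated series reproduce them. The two-parameter generator, the attribute set (annual total and wet-day fraction), and the optimizer settings below are assumptions for illustration only.

```python
# Sketch of the inverse approach: choose target rainfall attributes (one point
# of the "exposure space") and search for stochastic-generator parameters whose
# simulated series reproduce them. The simple two-parameter daily generator is
# a stand-in for WGEN; all numbers are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize

N_DAYS = 365 * 30  # 30 years of daily rainfall

def simulate_rainfall(p_wet, mean_wet_mm, seed=42):
    rng = np.random.default_rng(seed)             # fixed seed -> smooth objective
    wet = rng.uniform(size=N_DAYS) < p_wet        # occurrence process
    depth = rng.exponential(mean_wet_mm, N_DAYS)  # intensity on wet days
    return np.where(wet, depth, 0.0)

def attributes(series):
    annual_total = series.sum() / 30.0
    wet_fraction = np.mean(series > 0.0)
    return np.array([annual_total, wet_fraction])

def objective(params, target):
    p_wet, mean_wet = params
    if not (0.01 < p_wet < 0.99 and 0.1 < mean_wet < 50.0):
        return 1e6
    sim = attributes(simulate_rainfall(p_wet, mean_wet))
    # Normalised squared mismatch between simulated and target attributes.
    return np.sum(((sim - target) / target) ** 2)

# One point on the exposure space: 20% wetter and slightly more intermittent
# than an assumed historical climate of 450 mm/yr with 22% wet days.
target = np.array([450.0 * 1.2, 0.20])

result = minimize(objective, x0=[0.22, 5.0], args=(target,), method="Nelder-Mead")
p_opt, mu_opt = result.x
print(f"optimised parameters: p_wet={p_opt:.3f}, mean wet-day depth={mu_opt:.2f} mm")
print("achieved attributes:", attributes(simulate_rainfall(p_opt, mu_opt)))
```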
Yue, Wencong; Cai, Yanpeng; Xu, Linyu; Yang, Zhifeng; Yin, Xin'An; Su, Meirong
2017-07-11
To improve the capabilities of conventional methodologies in facilitating industrial water allocation under uncertain conditions, an integrated approach was developed through the combination of operational research, uncertainty analysis, and violation risk analysis methods. The developed approach can (a) address complexities of industrial water resources management (IWRM) systems, (b) reflect multiple uncertainties and risks of the system and incorporate them into a general optimization framework, and (c) support robust actions for industrial production in consideration of water supply capacity and wastewater discharge control. The developed method was then demonstrated in a water-stressed city (i.e., the City of Dalian), northeastern China. Three scenarios were proposed according to the city's industrial plans. The results indicated that in the planning year of 2020 (a) the production of civilian-used steel ships and machine-made paper & paperboard would decrease significantly, (b) the violation risk of chemical oxygen demand (COD) discharge under scenario 1 would be the most prominent, compared with those under scenarios 2 and 3, (c) the maximal total economic benefit under scenario 2 would be higher than the benefit under scenario 3, and (d) the production of rolling contact bearings, rail vehicles, and commercial vehicles would be promoted.
Biosecurity through Public Health System Design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beyeler, Walter E.; Finley, Patrick D.; Arndt, William
We applied modeling and simulation to examine the real-world tradeoffs between developing-country public-health improvement and the need to improve the identification, tracking, and security of agents with bio-weapons potential. Traditionally, the international community has applied facility-focused strategies for improving biosecurity and biosafety. This work examines how system-level assessments and improvements can foster biosecurity and biosafety. We modeled medical laboratory resources and capabilities to identify scenarios where biosurveillance goals are transparently aligned with public health needs, and resources are distributed in a way that maximizes their ability to serve patients while minimizing security and safety risks. Our modeling platform simulates key processes involved in healthcare system operation, such as sample collection, transport, and analysis at medical laboratories. The research reported here extends the prior art by providing two key components for comparative performance assessment: a model of patient interaction dynamics, and the capability to perform uncertainty quantification. In addition, we have outlined a process for incorporating quantitative biosecurity and biosafety risk measures. Two test problems were used to exercise these research products and examine (a) systemic effects of technological innovation and (b) right-sizing of laboratory networks.
Constraints, Approach, and Status of Mars Surveyor 2001 Landing Site Selection
NASA Technical Reports Server (NTRS)
Golombek, M.; Bridges, N.; Briggs, G.; Gilmore, M.; Haldemann, A.; Parker, T.; Saunders, R.; Spencer, D.; Smith, J.; Soderblom, L.
1999-01-01
There are many similarities between the Mars Surveyor '01 (MS '01) landing site selection process and that of Mars Pathfinder. The selection process includes two parallel activities in which engineers define and refine the capabilities of the spacecraft through design, testing and modeling and scientists define a set of landing site constraints based on the spacecraft design and landing scenario. As for Pathfinder, the safety of the site is without question the single most important factor, for the simple reason that failure to land safely yields no science and exposes the mission and program to considerable risk. The selection process must be thorough and defensible and capable of surviving multiple withering reviews similar to the Pathfinder decision. On Pathfinder, this was accomplished by attempting to understand the surface properties of sites using available remote sensing data sets and models based on them. Science objectives are factored into the selection process only after the safety of the site is validated. Finally, as for Pathfinder, the selection process is being done in an open environment with multiple opportunities for community involvement including open workshops, with education and outreach opportunities. Additional information is contained in the original extended abstract.
Unmanned ground vehicles for integrated force protection
NASA Astrophysics Data System (ADS)
Carroll, Daniel M.; Mikell, Kenneth; Denewiler, Thomas
2004-09-01
The combination of Command and Control (C2) systems with Unmanned Ground Vehicles (UGVs) provides Integrated Force Protection from the Robotic Operation Command Center. Autonomous UGVs are directed as Force Projection units. UGV payloads and fixed sensors provide situational awareness while unattended munitions provide a less-than-lethal response capability. Remote resources serve as automated interfaces to legacy physical devices such as manned response vehicles, barrier gates, fence openings, garage doors, and remote power on/off capability for unmanned systems. The Robotic Operations Command Center executes the Multiple Resource Host Architecture (MRHA) to simultaneously control heterogeneous unmanned systems. The MRHA graphically displays video, map, and status for each resource using wireless digital communications for integrated data, video, and audio. Events are prioritized and the user is prompted with audio alerts and text instructions for alarms and warnings. A control hierarchy of missions and duty rosters support autonomous operations. This paper provides an overview of the key technology enablers for Integrated Force Protection with details on a force-on-force scenario to test and demonstrate concept of operations using Unmanned Ground Vehicles. Special attention is given to development and applications for the Remote Detection Challenge and Response (REDCAR) initiative for Integrated Base Defense.
Integration of unmanned systems for tactical operations within hostile environments
NASA Astrophysics Data System (ADS)
Maddux, Gary A.; Bosco, Charles D.; Lawrence, James D.
2006-05-01
The University of Alabama in Huntsville (UAH) is currently investigating techniques and technologies for the integration of a small unmanned aerial vehicle (SUAV) with small unmanned ground vehicles (SUGV). Each vehicle has its own set of unique capabilities, but the efficient integration of the two for a specific application requires modifying and integrating both systems. UAH has been flying and testing an autonomously-controlled small helicopter, called the Flying Bassett (Base Airborne Surveillance and Sensing for Emergency Threat Tracking) for over a year. Recently, integrated operations were performed with four SUGVs, the Matilda (Mesa Robotics, Huntsville, AL), the US Navy Vanguard, the UAH Rover, and the Penetrator (Mesa Robotics). The program has progressed from 1) building an air and ground capability for video and infrared surveillance, 2) integration with ground vehicles in realistic scenarios, to 3) deployment and recovery of ground vehicles. The work was done with the cooperation of the US Army at Ft. Benning, GA and Redstone Arsenal, AL, the Federal Bureau of Investigation in Huntsville, AL, the US Naval Reserve in Knoxville, TN, and local emergency organizations. The results so far have shown that when the air and ground systems are employed together, their utility is greatly enhanced.
NASA Astrophysics Data System (ADS)
Ding, S.; Garofalo, A. M.; Qian, J.; Cui, L.; McClenaghan, J. T.; Pan, C.; Chen, J.; Zhai, X.; McKee, G.; Ren, Q.; Gong, X.; Holcomb, C. T.; Guo, W.; Lao, L.; Ferron, J.; Hyatt, A.; Staebler, G.; Solomon, W.; Du, H.; Zang, Q.; Huang, J.; Wan, B.
2017-05-01
Systematic experimental and modeling investigations on DIII-D show attractive transport properties of fully non-inductive high βp plasmas. Experiments on DIII-D show that the large-radius internal transport barrier (ITB), a key feature providing excellent confinement in the high βp regime, is maintained when the scenario is extended from q95 ˜ 12 to 7 and from rapid to near-zero toroidal rotation. The robustness of confinement versus rotation was predicted by gyrofluid modeling showing dominant neoclassical ion energy transport even without the E × B shear effect. The physics mechanism of turbulence suppression, we found, is the Shafranov shift, which is essential and sets a βp threshold for large-radius ITB formation in the high βp scenario on DIII-D. This is confirmed by two different parameter-scan experiments, one for a βN scan and the other for a q95 scan. They both give the same βp threshold at 1.9 in the experiment. The experimental trend of increasing thermal transport with decreasing βp is consistent with transport modeling. The progress toward the high βp scenario on Experimental Advanced Superconducting Tokamak (EAST) is reported. The very first step of extending the high βp scenario on DIII-D to long pulse on EAST is to establish a long pulse H-mode with ITB on EAST. This paper shows the first 61 s fully non-inductive H-mode with stationary ITB feature and actively cooled ITER-like tungsten divertor in the very recent EAST experiment. The successful use of lower hybrid wave as a key tool to optimize the current profile in the EAST experiment is also introduced. Results show that as the electron density is increased, the fully non-inductive current profile broadens on EAST. The improved understanding and modeling capability are also used to develop advanced scenarios for the China Fusion Engineering Test Reactor. Overall, these results provide encouragement that the high βp regime can be extended to a lower safety factor and very low rotation, providing a potential path to high performance steady state operation in future devices.
Capturing Essential Information to Achieve Safe Interoperability.
Weininger, Sandy; Jaffe, Michael B; Rausch, Tracy; Goldman, Julian M
2017-01-01
In this article, we describe the role of "clinical scenario" information to assure the safety of interoperable systems, as well as the system's ability to deliver the requisite clinical functionality to improve clinical care. Described are methods and rationale for capturing the clinical needs, workflow, hazards, and device interactions in the clinical environment. Key user (clinician and clinical engineer) needs and system requirements can be derived from this information, therefore, improving the communication from clinicians to medical device and information technology system developers. This methodology is intended to assist the health care community, including researchers, standards developers, regulators, and manufacturers, by providing clinical definition to support requirements in the systems engineering process, particularly those focusing on development of Integrated Clinical Environments described in standard ASTM F2761. Our focus is on identifying and documenting relevant interactions and medical device capabilities within the system using a documentation tool called medical device interface data sheets and mitigating hazardous situations related to workflow, product usability, data integration, and the lack of effective medical device-health information technology system integration to achieve safe interoperability. Portions of the analysis of a clinical scenario for a "patient-controlled analgesia safety interlock" are provided to illustrate the method. Collecting better clinical adverse event information and proposed solutions can help identify opportunities to improve current device capabilities and interoperability and support a learning health system to improve health care delivery. Developing and analyzing clinical scenarios are the first steps in creating solutions to address vexing patient safety problems and enable clinical innovation. A Web-based research tool for implementing a means of acquiring and managing this information, the Clinical Scenario Repository™ (MD PnP Program), is described.
Towards improved capability and confidence in coupled atmospheric and wildland fire modeling
NASA Astrophysics Data System (ADS)
Sauer, Jeremy A.
This dissertation work is aimed at improving the capability and confidence in a modernized and improved version of Los Alamos National Laboratory's coupled atmospheric and wildland fire dynamics model, Higrad-Firetec. Higrad is the hydrodynamics component of this large eddy simulation model that solves the three-dimensional, fully compressible Navier-Stokes equations, incorporating a dynamic eddy viscosity formulation through a two-scale turbulence closure scheme. Firetec is the vegetation, drag forcing, and combustion physics portion that is integrated with Higrad. The modern version of Higrad-Firetec incorporates multiple numerical methodologies and high performance computing aspects which combine to yield a unique tool capable of augmenting theoretical and observational investigations in order to better understand the multi-scale, multi-phase, and multi-physics phenomena involved in coupled atmospheric and environmental dynamics. More specifically, the current work includes extended functionality and validation efforts targeting component processes in coupled atmospheric and wildland fire scenarios. Since observational data of sufficient quality and resolution to validate the fully coupled atmosphere-wildfire scenario simply do not exist, we instead seek to validate components of the full, prohibitively convoluted process. This manuscript provides, first, an introduction and background to the application space of Higrad-Firetec. Second, we document the model formulation, solution procedure, and a simple scalar transport verification exercise. Third, we validate model results against observational data for time-averaged flow field metrics in and above four idealized forest canopies. Fourth, we carry out a validation effort for the non-buoyant jet in a crossflow scenario (to which an analogy can be made for atmosphere-wildfire interactions), comparing model results to laboratory data of both steady-in-time and unsteady-in-time metrics. Finally, an extension of the model's multi-phase physics is implemented, allowing for the representation of multiple collocated fuels as separately evolving constituents, leading to differences in the resulting rate of spread and total burned area. In combination, these efforts demonstrate improved capability, increased validation of component functionality, and the unique applicability of the Higrad-Firetec modeling framework. As a result, this work provides a substantially more robust foundation for future, more widely acceptable investigations into the complexities of coupled atmospheric and wildland fire behavior.
Systems Analysis | Hydrogen and Fuel Cells | NREL
risks. Analysts also develop least-cost scenarios for hydrogen infrastructure rollout in support of the opportunities for multi-sector integration using hydrogen systems as well as the capability and cost associated with the H2USA public-private collaboration.
2011-02-01
capabilities for airbags, sensors, and seatbelts have tailored the code for applications in the automotive industry. Currently the code contains...larger intervals. In certain contact scenarios where contacting parts are moving relative to each other in a rapid fashion, such as airbag deployment
Memex Meets Madonna: Multimedia at the Intersection of Information and Entertainment.
ERIC Educational Resources Information Center
Kinney, Thomas
1992-01-01
Proposes a personal information management technology called Memex-TV that might develop from advances in entertainment technology. Topics addressed include capabilities of the new system, system design, the scenario for development of Memex-TV as an entertainment technology spin-off, current entertainment technology trends, a typical evening with…
This draft report supports application of two recently developed water modeling tools, the BASINS and WEPP climate assessment tools. The report presents a series of short case studies designed to illustrate the capabilities of these tools for conducting scenario based assessments...
Expressions of Critical Thinking in Role-Playing Simulations: Comparisons across Roles
ERIC Educational Resources Information Center
Ertmer, Peggy A.; Strobel, Johannes; Cheng, Xi; Chen, Xiaojun; Kim, Hannah; Olesova, Larissa; Sadaf, Ayesha; Tomory, Annette
2010-01-01
The development of critical thinking is crucial in professional education to augment the capabilities of pre-professional students. One method for enhancing critical thinking is participation in role-playing simulation-based scenarios where students work together to resolve a potentially real situation. In this study, undergraduate nursing…
Flexibility on storage-release based distributed hydrologic modeling with object-oriented approach
USDA-ARS?s Scientific Manuscript database
With the availability of advanced hydrologic data in the public domain such as remotely sensed and climate change scenario data, there is a need for a modeling framework that is capable of using these data to simulate and extend hydrologic processes with multidisciplinary approaches for sustainable ...
Exploring an Alternative Model of Human Reproductive Capability: A Creative Learning Activity
ERIC Educational Resources Information Center
Cherif, Abour H.; Jedlicka, Dianne M.
2012-01-01
Biological and social evolutionary processes, along with social and cultural developments, have allowed humans to separate procreation from pleasurable/recreational sexual activity. As a class learning project, an alternative, hypothetical reproductive scenario is presented: "What if humans were biologically ready to conceive only during one…
Lenard, James; Badea-Romero, Alexandro; Danton, Russell
2014-12-01
An increasing proportion of new vehicles are being fitted with autonomous emergency braking systems. It is difficult for consumers to judge the effectiveness of these safety systems for individual models unless their performance is evaluated through track testing under controlled conditions. This paper aimed to contribute to the development of relevant test conditions by describing typical circumstances of pedestrian accidents. Cluster analysis was applied to two large British databases and both highlighted an urban scenario in daylight and fine weather where a small pedestrian walks across the road, especially from the near kerb, in clear view of a driver who is travelling straight ahead. For each dataset a main test configuration was defined to represent the conditions of the most common accident scenario along with test variations to reflect the characteristics of less common accident scenarios. Some of the variations pertaining to less common accident circumstances or to a minority of casualties in these scenarios were proposed as optional or supplementary test elements for an outstanding performance rating. Many considerations are incorporated into the final design and implementation of an actual testing regime, such as cost and the state of development of technology; only the representation of accident data lay within the scope of this paper. It would be desirable to ascertain the wider representativeness of the results by analysing accident data from other countries in a similar manner. Copyright © 2014 Elsevier Ltd. All rights reserved.
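A minimal sketch of how such a cluster analysis over coded accident records could be set up is given below, using synthetic one-hot-encoded categorical features and k-means. The feature set, coding, and clustering method are illustrative assumptions; the actual British databases, variables, and algorithm used in the study may differ.

```python
# Sketch of one way to cluster coded pedestrian-accident records and read off
# representative test scenarios. The synthetic features, one-hot coding, and
# use of k-means are illustrative assumptions, not the study's method.

import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n = 500  # synthetic accident records

records = pd.DataFrame({
    "light":      rng.choice(["daylight", "dark"], n, p=[0.7, 0.3]),
    "weather":    rng.choice(["fine", "rain"], n, p=[0.8, 0.2]),
    "area":       rng.choice(["urban", "rural"], n, p=[0.85, 0.15]),
    "ped_motion": rng.choice(["crossing_near_kerb", "crossing_far_kerb", "along_road"],
                             n, p=[0.5, 0.3, 0.2]),
    "veh_motion": rng.choice(["straight", "turning"], n, p=[0.8, 0.2]),
})

# One-hot encode the categorical variables so k-means can operate on them.
X = pd.get_dummies(records).to_numpy(dtype=float)

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
records["cluster"] = km.labels_

# Characterise each cluster by the most common level of every variable,
# which is essentially how a representative test scenario would be read off.
for c, group in records.groupby("cluster"):
    profile = {col: group[col].mode()[0] for col in records.columns if col != "cluster"}
    print(f"cluster {c} ({len(group)} records): {profile}")
```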
DARPA counter-sniper program: Phase 1 Acoustic Systems Demonstration results
NASA Astrophysics Data System (ADS)
Carapezza, Edward M.; Law, David B.; Csanadi, Christina J.
1997-02-01
During October 1995 through May 1996, the Defense Advanced Research Projects Agency sponsored the development of prototype systems that exploit acoustic muzzle blast and ballistic shock wave signatures to accurately predict the location of gunfire events and associated shooter locations using either single or multiple volumetric arrays. The output of these acoustic systems is an estimate of the shooter location and a classification estimate of the caliber of the shooter's weapon. A portable display and control unit provides both graphical and alphanumeric shooter-location-related information integrated on a two-dimensional digital map of the defended area. The final Phase I Acoustic Systems Demonstration field tests were completed in May. These tests were held at the USMC Base Camp Pendleton Military Operations Urban Training (MOUT) facility. The tests were structured to provide challenging gunfire-related scenarios with significant reverberation and multi-path conditions. Special shot geometries and false alarms were included in these tests to probe potential system vulnerabilities and to determine the performance and robustness of the systems. Five prototypes developed by U.S. companies and one Israeli-developed prototype were tested. This analysis quantifies the spatial resolution estimation capability (azimuth, elevation, and range) of these prototypes and describes their ability to accurately classify the type of bullet fired in a challenging urban-like setting.
Transitioning from Software Requirements Models to Design Models
NASA Technical Reports Server (NTRS)
Lowry, Michael (Technical Monitor); Whittle, Jon
2003-01-01
Summary: 1. Proof-of-concept of state machine synthesis from scenarios - CTAS case study. 2. CTAS team wants to use the synthesis algorithm to validate trajectory generation. 3. Extending the synthesis algorithm towards requirements validation: (a) scenario relationships, (b) methodology for generalizing/refining scenarios, and (c) interaction patterns to control synthesis. 4. Initial ideas tested on conflict detection scenarios.
Top-attack modeling and automatic target detection using synthetic FLIR scenery
NASA Astrophysics Data System (ADS)
Weber, Bruce A.; Penn, Joseph A.
2004-09-01
A series of experiments has been performed to verify the utility of algorithmic tools for the modeling and analysis of cold-target signatures in synthetic, top-attack, FLIR video sequences. The tools include: MuSES/CREATION for the creation of synthetic imagery with targets, an ARL target detection algorithm to detect embedded synthetic targets in scenes, and an ARL scoring algorithm, using Receiver Operating Characteristic (ROC) curve analysis, to evaluate detector performance. Cold-target detection variability was examined as a function of target emissivity, surrounding clutter type, and target placement in non-obscuring clutter locations. Detector metrics were also individually scored so as to characterize the effect of signature/clutter variations. Results show that using these tools, a detailed, physically meaningful target detection analysis is possible and that scenario-specific target detectors may be developed by selective choice and/or weighting of detector metrics. However, developing these tools into a reliable predictive capability will require the extension of these results to the modeling and analysis of a large number of data sets configured for a wide range of target and clutter conditions. Finally, these tools should also be useful for the comparison of competing detection algorithms by providing well-defined and controllable target detection scenarios, as well as for the training and testing of expert human observers.
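The ROC-based scoring mentioned above can be illustrated with a short sketch that pools detector confidence scores for target and clutter-only image chips and sweeps the decision threshold. The score distributions below are synthetic stand-ins, not output from the ARL detector or scoring algorithm.

```python
# Sketch of ROC-based detector scoring: pooled confidence scores for target
# and clutter-only chips are swept over the decision threshold to trade off
# detection rate against false-alarm rate. The score distributions are
# synthetic stand-ins for illustration.

import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)

# Detector confidence scores: targets score higher on average than clutter,
# with overlap that reflects, e.g., low target-to-background contrast.
target_scores = rng.normal(loc=1.0, scale=0.6, size=200)
clutter_scores = rng.normal(loc=0.0, scale=0.6, size=2000)

scores = np.concatenate([target_scores, clutter_scores])
labels = np.concatenate([np.ones(200), np.zeros(2000)])

fpr, tpr, thresholds = roc_curve(labels, scores)
print(f"area under ROC curve = {roc_auc_score(labels, scores):.3f}")

# Probability of detection at a fixed false-alarm rate of 1%.
idx = np.searchsorted(fpr, 0.01)
print(f"Pd at 1% false-alarm rate is about {tpr[idx]:.2f}")
```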
Automated Construction of Molecular Active Spaces from Atomic Valence Orbitals.
Sayfutyarova, Elvira R; Sun, Qiming; Chan, Garnet Kin-Lic; Knizia, Gerald
2017-09-12
We introduce the atomic valence active space (AVAS), a simple and well-defined automated technique for constructing active orbital spaces for use in multiconfiguration and multireference (MR) electronic structure calculations. Concretely, the technique constructs active molecular orbitals capable of describing all relevant electronic configurations emerging from a targeted set of atomic valence orbitals (e.g., the metal d orbitals in a coordination complex). This is achieved via a linear transformation of the occupied and unoccupied orbital spaces from an easily obtainable single-reference wave function (such as a Hartree-Fock or Kohn-Sham calculation) based on projectors onto targeted atomic valence orbitals. We discuss the premises, theory, and implementation of the idea, and several of its variations are tested. To investigate the performance and accuracy, we calculate the excitation energies for various transition-metal complexes in typical application scenarios. Additionally, we follow the homolytic bond breaking process of a Fenton reaction along its reaction coordinate. While the described AVAS technique is not a universal solution to the active space problem, its premises are fulfilled in many application scenarios of transition-metal chemistry and bond dissociation processes. In these cases the technique makes MR calculations easier to execute, easier to reproduce by any user, and simplifies the determination of the appropriate size of the active space required for accurate results.
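A rough numpy rendering of the projector-based selection idea is sketched below: occupied and virtual orbitals are rotated so that their overlap with a chosen set of target valence atomic orbitals concentrates in a few eigenvectors, which are kept if their overlap eigenvalue exceeds a threshold. The orthonormal-AO simplification, random placeholder matrices, and threshold value are assumptions for illustration; details of the published AVAS algorithm may differ from this simplification.

```python
# Rough sketch of an AVAS-style selection step. The AO basis is taken as
# orthonormal and the MO coefficients are random placeholders; a real
# calculation would take these from a Hartree-Fock or Kohn-Sham run.

import numpy as np

rng = np.random.default_rng(0)

n_ao, n_occ = 20, 8
target_aos = [3, 4, 5, 6, 7]        # indices of the targeted valence AOs (assumed)
threshold = 0.2

# Random orthonormal MO coefficient matrix (columns = MOs) in an orthonormal AO basis.
mo_coeff, _ = np.linalg.qr(rng.normal(size=(n_ao, n_ao)))
c_occ, c_vir = mo_coeff[:, :n_occ], mo_coeff[:, n_occ:]

def select(c_block):
    # Overlap of each MO in the block with the span of the target AOs.
    p = c_block[target_aos, :]                 # projection onto target AO rows
    m = p.T @ p                                # (n_mo x n_mo) overlap-with-target matrix
    w, u = np.linalg.eigh(m)                   # eigenvalues lie in [0, 1]
    rotated = c_block @ u                      # rotate MOs to concentrate the overlap
    active = rotated[:, w > threshold]         # keep strongly overlapping orbitals
    return w, active

w_occ, act_occ = select(c_occ)
w_vir, act_vir = select(c_vir)
print("occupied overlap eigenvalues:", np.round(w_occ, 2))
print(f"active space: {act_occ.shape[1]} occupied + {act_vir.shape[1]} virtual orbitals")
```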
ERIC Educational Resources Information Center
Cochran, H. Keith
This paper contains two scenario-type assignments for students in a university tests and measurements class as well as a collection of materials developed by actual students in response to these assignments. An opening explanation argues that education students, often nearing the end of their program when they take the tests and measurement…
Interactive specification acquisition via scenarios: A proposal
NASA Technical Reports Server (NTRS)
Hall, Robert J.
1992-01-01
Some reactive systems are most naturally specified by giving large collections of behavior scenarios. These collections not only specify the behavior of the system, but also provide good test suites for validating the implemented system. Due to the complexity of the systems and the number of scenarios, however, it appears that automated assistance is necessary to make this software development process workable. Interactive Specification Acquisition Tool (ISAT) is a proposed interactive system for supporting the acquisition and maintenance of a formal system specification from scenarios, as well as automatic synthesis of control code and automated test generation. This paper discusses the background, motivation, proposed functions, and implementation status of ISAT.
Creating pedestrian crash scenarios in a driving simulator environment.
Chrysler, Susan T; Ahmad, Omar; Schwarz, Chris W
2015-01-01
In 2012 in the United States, pedestrian injuries accounted for 3.3% of all traffic injuries but, disproportionately, pedestrian fatalities accounted for roughly 14% of traffic-related deaths (NHTSA 2014 ). In many other countries, pedestrians make up more than 50% of those injured and killed in crashes. This research project examined driver response to crash-imminent situations involving pedestrians in a high-fidelity, full-motion driving simulator. This article presents a scenario development method and discusses experimental design and control issues in conducting pedestrian crash research in a simulation environment. Driving simulators offer a safe environment in which to test driver response and offer the advantage of having virtual pedestrian models that move realistically, unlike test track studies, which by nature must use pedestrian dummies on some moving track. An analysis of pedestrian crash trajectories, speeds, roadside features, and pedestrian behavior was used to create 18 unique crash scenarios representative of the most frequent and most costly crash types. For the study reported here, we only considered scenarios where the car is traveling straight because these represent the majority of fatalities. We manipulated driver expectation of a pedestrian both by presenting intersection and mid-block crossing as well as by using features in the scene to direct the driver's visual attention toward or away from the crossing pedestrian. Three visual environments for the scenarios were used to provide a variety of roadside environments and speed: a 20-30 mph residential area, a 55 mph rural undivided highway, and a 40 mph urban area. Many variables of crash situations were considered in selecting and developing the scenarios, including vehicle and pedestrian movements; roadway and roadside features; environmental conditions; and characteristics of the pedestrian, driver, and vehicle. The driving simulator scenarios were subjected to iterative testing to adjust time to arrival triggers for the pedestrian actions. This article discusses the rationale behind creating the simulator scenarios and some of the procedural considerations for conducting this type of research. Crash analyses can be used to construct test scenarios for driver behavior evaluations using driving simulators. By considering trajectories, roadway, and environmental conditions of real-world crashes, representative virtual scenarios can serve as safe test beds for advanced driver assistance systems. The results of such research can be used to inform pedestrian crash avoidance/mitigation systems by identifying driver error, driver response time, and driver response choice (i.e., steering vs. braking).
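One recurring mechanic in such scenarios is the time-to-arrival trigger that launches the virtual pedestrian. The sketch below computes a trigger distance from an assumed constant vehicle speed, pedestrian walking speed, and pedestrian offset from the travel lane; the numbers and the closed-form treatment are illustrative, since the study tuned its triggers iteratively in the simulator.

```python
# Sketch of a time-to-arrival (TTA) trigger: start the virtual pedestrian when
# the subject vehicle is a certain distance from the conflict point, chosen so
# the pedestrian reaches the vehicle's path after the desired TTA. Speeds,
# distances, and the uniform-speed assumption are illustrative.

def trigger_distance(vehicle_speed_mph, ped_walk_speed_mps, ped_offset_m):
    """Distance (m) from the conflict point at which to trigger the pedestrian.

    vehicle_speed_mph: subject vehicle speed, assumed constant
    ped_walk_speed_mps: pedestrian walking speed
    ped_offset_m: distance the pedestrian must cover to reach the vehicle path
    """
    v_mps = vehicle_speed_mph * 0.44704           # mph -> m/s
    tta_s = ped_offset_m / ped_walk_speed_mps     # time for pedestrian to reach the path
    return v_mps * tta_s                          # vehicle distance covered in that time

# Example: 40 mph urban scenario, pedestrian 4 m from the near-side travel lane,
# walking at 1.4 m/s.
d = trigger_distance(40.0, 1.4, 4.0)
print(f"trigger the pedestrian when the vehicle is {d:.1f} m from the crossing "
      f"(about {d / (40.0 * 0.44704):.1f} s time-to-arrival)")
```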
Contributions of axionlike particles to lepton dipole moments
Marciano, W. J.; Masiero, A.; Paradisi, P.; ...
2016-12-30
We examined contributions of a spin-0 axionlike particle (ALP) to lepton dipole moments, g-2 and EDMs. Barr-Zee and light-by-light loop effects from a light pseudoscalar ALP are found to be capable of resolving the longstanding muon g-2 discrepancy at the expense of relatively large ALP-γγ couplings. We also discussed the compatibility of such large couplings with direct experimental constraints and perturbative unitarity bounds. Future tests of such a scenario are described. For CP-violating ALP couplings, the electron EDM is found to probe much smaller, theoretically more easily accommodated ALP interactions. We advocate future planned improvements in electron EDM searches as a way to not only significantly constrain ALP parameters, but also potentially unveil a new source of CP violation which could have far-reaching ramifications.
Mars Tumbleweed: FY2003 Conceptual Design Assessment
NASA Technical Reports Server (NTRS)
Antol, Jeffrey; Calhoun, Philip C.; Flick, John J.; Hajos, Gregory A.; Keys, Jennifer P.; Stillwagen, Frederic H.; Krizan, Shawn A.; Strickland, Christopher V.; Owens, Rachel; Wisniewski, Michael
2005-01-01
NASA LaRC is studying concepts for a new type of Mars exploration vehicle that would be propelled by the wind. Known as the Mars Tumbleweed, it would derive mobility through use of the Martian surface winds. Tumbleweeds could conceivably travel greater distances, cover larger areas of the surface, and provide access to areas inaccessible by conventional vehicles. They would be lightweight and relatively inexpensive, allowing a multiple vehicle network to be deployed on a single mission. Tumbleweeds would be equipped with sensors for conducting science and serve as scouts searching broad areas to identify specific locations for follow-on investigation by other explorers. An extensive assessment of LaRC Tumbleweed concepts was conducted in FY03, including refinement of science mission scenarios, definition of supporting subsystems (structures, power, communications), testing in wind tunnels, and development of a dynamic simulation capability.
Unification of nonclassicality measures in interferometry
NASA Astrophysics Data System (ADS)
Yuan, Xiao; Zhou, Hongyi; Gu, Mile; Ma, Xiongfeng
2018-01-01
From an operational perspective, nonclassicality characterizes the exotic behavior in a physical process which cannot be explained with Newtonian physics. There are several widely used measures of nonclassicality, including coherence, discord, and entanglement, each proven to be essential resources in particular situations. There exists evidence of fundamental connections among the three measures. However, the sources of nonclassicality are still regarded differently and such connections are yet to be elucidated. Here, we introduce a general framework of defining a unified nonclassicality with an operational motivation founded on the capability of interferometry. Nonclassicality appears differently as coherence, discord, and entanglement in different scenarios with local measurement, weak basis-independent measurement, and strong basis-independent measurement, respectively. Our results elaborate how these three measures are related and how they can be transformed from each other. Experimental schemes are proposed to test the results.
NASA Astrophysics Data System (ADS)
Bai, Yang; Tofel, Pavel; Hadas, Zdenek; Smilek, Jan; Losak, Petr; Skarvada, Pavel; Macku, Robert
2018-06-01
The capability of a linear kinetic energy harvester - a cantilever-structured piezoelectric energy harvester - to harvest human motion in real-life activities is investigated. The whole loop of design, simulation, fabrication, and testing of the energy harvester is presented. With the smart wristband/watch-sized energy harvester, a root-mean-square output power of 50 μW is obtained from real-life hand-arm motion in daily life. Such a power level is enough to make some low-power sensors self-powered. This paper provides a good and reliable comparison to harvesters with nonlinear structures. It also helps designers decide whether or not to choose a nonlinear structure for a particular energy harvester based on the application scenario.
On a simulation study for reliable and secured smart grid communications
NASA Astrophysics Data System (ADS)
Mallapuram, Sriharsha; Moulema, Paul; Yu, Wei
2015-05-01
Demand response is one of the key smart grid applications that aims to reduce power generation at peak hours and maintain a balance between supply and demand. With the support of communication networks, energy consumers can become active actors in the energy management process by adjusting or rescheduling their electricity usage during peak hours based on utilities' pricing incentives. Nonetheless, the integration of communication networks exposes the smart grid to cyber-attacks. In this paper, we developed a smart grid simulation test-bed and designed evaluation scenarios. By leveraging the capabilities of the Matlab and ns-3 simulation tools, we conducted a simulation study to evaluate the impact of cyber-attacks on the demand response application. Our data show that cyber-attacks could seriously disrupt smart grid operations, thus confirming the need for secure and resilient communication networks for supporting smart grid operations.
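As a toy illustration of why a spoofed price signal matters for demand response, the sketch below shifts a flexible share of load out of high-price hours and compares the resulting peak with and without a falsified (flat) price signal. The load profile, price curve, and response model are invented for illustration and are not the Matlab/ns-3 test-bed described above.

```python
# Toy demand-response illustration: consumers defer a fraction of flexible load
# away from hours where the price exceeds a threshold; a spoofed flat price
# signal removes that response and restores the original peak. All numbers are
# invented for illustration.

import numpy as np

hours = np.arange(24)
base_load = 50 + 30 * np.exp(-((hours - 18) ** 2) / 8.0)   # MW, evening peak
price = 20 + 25 * np.exp(-((hours - 18) ** 2) / 8.0)       # $/MWh, peaks with load

def respond(load, price_signal, threshold=35.0, flexible_share=0.2):
    """Shift the flexible share of load out of high-price hours to off-peak hours."""
    shifted = load.copy()
    high = price_signal > threshold
    moved = (load[high] * flexible_share).sum()
    shifted[high] *= (1 - flexible_share)
    shifted[~high] += moved / (~high).sum()      # spread deferred energy evenly
    return shifted

normal = respond(base_load, price)
# Cyber-attack scenario: the price signal is spoofed flat, so no load responds.
attacked = respond(base_load, np.full(24, 20.0))

print(f"peak without demand response: {base_load.max():.1f} MW")
print(f"peak with demand response:    {normal.max():.1f} MW")
print(f"peak under spoofed prices:    {attacked.max():.1f} MW")
```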
Practical considerations in Bayesian fusion of point sensors
NASA Astrophysics Data System (ADS)
Johnson, Kevin; Minor, Christian
2012-06-01
Sensor data fusion has long been a topic of considerable research, but a rigorous and quantitative understanding of the benefits of fusing specific types of sensor data remains elusive. Often, sensor fusion is performed on an ad hoc basis with the assumption that overall detection capabilities will improve, only to discover later, after expensive and time-consuming laboratory and/or field testing, that little advantage was gained. The work presented here discusses these issues with theoretical and practical considerations in the context of fusing chemical sensors with binary outputs. Results are given for the potential performance gains one could expect with such systems, as well as the practical difficulties involved in implementing an optimal Bayesian fusion strategy with realistic scenarios. Finally, a discussion of the biases that inaccurate statistical estimates introduce into the results and their consequences is presented.
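A minimal sketch of Bayesian fusion for conditionally independent binary sensors is given below: each alarm or non-alarm contributes a likelihood ratio built from that sensor's assumed detection and false-alarm probabilities, and the fused posterior follows from the prior in log-odds form. The sensor characteristics and prior are illustrative assumptions; the practical difficulty discussed in the paper is precisely that such statistics are rarely known accurately.

```python
# Minimal sketch of Bayesian fusion of independent binary chemical sensors:
# each output contributes a likelihood ratio built from its assumed detection
# and false-alarm probabilities. All probabilities here are illustrative.

import numpy as np

# (probability of detection, probability of false alarm) for three sensors.
sensors = [(0.90, 0.05), (0.80, 0.10), (0.70, 0.02)]
prior_present = 0.01                     # assumed prior probability of the chemical

def fused_posterior(outputs, sensors, prior):
    """Posterior P(present | binary outputs) assuming conditionally independent sensors."""
    log_odds = np.log(prior / (1 - prior))
    for y, (pd, pfa) in zip(outputs, sensors):
        if y == 1:
            log_odds += np.log(pd / pfa)             # alarm: likelihood ratio Pd/Pfa
        else:
            log_odds += np.log((1 - pd) / (1 - pfa)) # no alarm: (1-Pd)/(1-Pfa)
    return 1.0 / (1.0 + np.exp(-log_odds))

for outputs in [(1, 1, 1), (1, 1, 0), (1, 0, 0), (0, 0, 0)]:
    p = fused_posterior(outputs, sensors, prior_present)
    print(f"outputs {outputs}: P(present) = {p:.4f}")
```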
Reaction control system/remote manipulator system automation
NASA Technical Reports Server (NTRS)
Hiers, Harry K.
1990-01-01
The objectives of this project are to evaluate the capability of the Procedural Reasoning System (PRS) in a typical real-time Space Shuttle application and to assess its potential for use on Space Station Freedom. PRS, developed by SRI International, is a result of research in automating the monitoring and control of spacecraft systems. The particular application selected for the present work is the automation of malfunction handling procedures for the Shuttle Remote Manipulator System (SRMS). The SRMS malfunction procedures will be encoded within the PRS framework, a crew interface appropriate to the RMS application will be developed, and the real-time data interface software developed. The resulting PRS will then be integrated with the high-fidelity On-orbit Simulation of the NASA Johnson Space Center's System Engineering Simulator, and tests under various SRMS fault scenarios will be conducted.
Direct LiT Electrolysis in a Metallic Fusion Blanket
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olson, Luke
2016-09-30
A process that simplifies the extraction of tritium from molten lithium-based breeding blankets was developed. The process is based on the direct electrolysis of lithium tritide using a ceramic Li ion conductor that replaces the molten salt extraction step. Extraction of tritium in the form of lithium tritide from the blankets/targets of fusion/fission reactors is critical in order to maintain low concentrations. This is needed to decrease the potential tritium permeation to the surroundings and large releases from unforeseen accident scenarios. Extraction is complicated due to the required low tritium concentration limits and because of the high affinity of tritium for the blanket. This work identified, developed, and tested the use of ceramic lithium ion conductors capable of recovering hydrogen and deuterium through an electrolysis step at high temperatures.
Direct Lit Electrolysis In A Metallic Lithium Fusion Blanket
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colon-Mercado, H.; Babineau, D.; Elvington, M.
2015-10-13
A process that simplifies the extraction of tritium from molten lithium-based breeding blankets was developed. The process is based on the direct electrolysis of lithium tritide using a ceramic Li ion conductor that replaces the molten salt extraction step. Extraction of tritium in the form of lithium tritide from the blankets/targets of fission/fusion reactors is critical in order to maintain low concentrations. This is needed to decrease the potential tritium permeation to the surroundings and large releases from unforeseen accident scenarios. Because of the high affinity of tritium for the blanket, extraction is complicated at the required low levels. This work identified, developed, and tested the use of ceramic lithium ion conductors capable of recovering hydrogen and deuterium through an electrolysis step at high temperatures.
NASA Technical Reports Server (NTRS)
DeCristofaro, Michael A.; Lansdowne, Chatwin A.; Schlesinger, Adam M.
2014-01-01
NASA has identified standardized wireless mesh networking as a key technology for future human and robotic space exploration. Wireless mesh networks enable rapid deployment and provide coverage in undeveloped regions. Mesh networks are also self-healing, resilient, and extensible, qualities not found in traditional infrastructure-based networks. Mesh networks can offer lower size, weight, and power (SWaP) than overlapped infrastructure per application. To better understand the maturity, characteristics, and capability of the technology, we developed an 802.11 mesh network consisting of a combination of heterogeneous commercial off-the-shelf devices and open-source firmware and software packages. Various streaming applications were operated over the mesh network, including voice and video, and performance measurements were made under different operating scenarios. During the testing, several issues with the currently implemented mesh network technology were identified and outlined for future work.
A smart grid simulation testbed using Matlab/Simulink
NASA Astrophysics Data System (ADS)
Mallapuram, Sriharsha; Moulema, Paul; Yu, Wei
2014-06-01
The smart grid is the integration of computing and communication technologies into a power grid with the goal of enabling real-time control and a reliable, secure, and efficient energy system [1]. With the increased interest of the research community and stakeholders in the smart grid, a number of solutions and algorithms have been developed and proposed to address issues related to smart grid operations and functions. Those technologies and solutions need to be tested and validated before implementation using software simulators. In this paper, we developed a general smart grid simulation model in the MATLAB/Simulink environment, which integrates renewable energy resources, energy storage technology, and load monitoring and control capability. To demonstrate and validate the effectiveness of our simulation model, we created simulation scenarios and performed simulations using a real-world data set provided by the Pecan Street Research Institute.
Paroissien, Jean-Baptiste; Darboux, Frédéric; Couturier, Alain; Devillers, Benoît; Mouillot, Florent; Raclot, Damien; Le Bissonnais, Yves
2015-03-01
Global climate and land use changes could strongly affect soil erosion and the capability of soils to sustain agriculture, and in turn impact regional or global food security. The objective of our study was to develop a method to assess soil sustainability to erosion under changes in land use and climate. The method was applied in a typical mixed Mediterranean landscape in a wine-growing watershed (75 km²) within the Languedoc region (La Peyne, France) for two periods: a first period with the current climate and land use and a second period with the climate and land use scenarios at the end of the twenty-first century. The Intergovernmental Panel on Climate Change A1B future rainfall scenarios from the Météo France general circulation model were coupled with four contrasting land use change scenarios that were designed using a spatially explicit land use change model. Mean annual erosion rate was estimated with an expert-based soil erosion model. Soil life expectancy was assessed using soil depth. Soil erosion rate and soil life expectancy were combined into a sustainability index. The median simulated soil erosion rate for the current period was 3.5 t/ha/year and the soil life expectancy was 273 years, showing a low sustainability of soils. For the future period with the same land use distribution, the median simulated soil erosion rate was 4.2 t/ha/year and the soil life expectancy was 249 years. The results show that soil erosion rate and soil life expectancy are more sensitive to changes in land use than to changes in precipitation. Among the scenarios tested, institution of a mandatory grass cover in vineyards seems to be an efficient means of significantly improving soil sustainability, both in terms of decreased soil erosion rates and increased soil life expectancies. Copyright © 2014 Elsevier Ltd. All rights reserved.
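A back-of-the-envelope version of the soil life expectancy idea is sketched below: an erosion rate in t/ha/year is converted to a depth-loss rate through an assumed bulk density, and life expectancy is the available soil depth divided by that rate. The bulk density and soil depth are assumptions for illustration, and the calculation is not intended to reproduce the paper's index, which rests on its own soil depth data and expert-based erosion model.

```python
# Illustrative life-expectancy arithmetic: convert an erosion rate in t/ha/year
# to a depth-loss rate via an assumed bulk density, then divide the available
# soil depth by that rate. Bulk density and soil depth are assumed values.

erosion_t_ha_yr = 3.5          # median simulated erosion rate from the abstract
bulk_density_t_m3 = 1.3        # assumed soil bulk density
soil_depth_m = 0.10            # assumed available soil depth (illustrative)

# 1 t/ha/yr = 1e-4 t/m2/yr; depth loss = mass loss / bulk density.
mass_loss_t_m2_yr = erosion_t_ha_yr / 10_000
depth_loss_m_yr = mass_loss_t_m2_yr / bulk_density_t_m3

life_expectancy_yr = soil_depth_m / depth_loss_m_yr
print(f"depth loss is roughly {depth_loss_m_yr * 1000:.2f} mm/year")
print(f"soil life expectancy is roughly {life_expectancy_yr:.0f} years")
```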
Development of test scenarios for off-roadway crash countermeasures based on crash statistics
DOT National Transportation Integrated Search
2002-09-01
This report presents the results from an analysis of off-roadway crashes and proposes a set of crash-imminent scenarios to objectively test countermeasure systems for light vehicles (passenger cars, sport utility vehicles, vans, and pickup trucks) ba...
Doherty, John U; Kort, Smadar; Mehran, Roxana; Schoenhagen, Paul; Soman, Prem; Dehmer, Greg J; Doherty, John U; Schoenhagen, Paul; Amin, Zahid; Bashore, Thomas M; Boyle, Andrew; Calnon, Dennis A; Carabello, Blase; Cerqueira, Manuel D; Conte, John; Desai, Milind; Edmundowicz, Daniel; Ferrari, Victor A; Ghoshhajra, Brian; Mehrotra, Praveen; Nazarian, Saman; Reece, T Brett; Tamarappoo, Balaji; Tzou, Wendy S; Wong, John B; Doherty, John U; Dehmer, Gregory J; Bailey, Steven R; Bhave, Nicole M; Brown, Alan S; Daugherty, Stacie L; Dean, Larry S; Desai, Milind Y; Duvernoy, Claire S; Gillam, Linda D; Hendel, Robert C; Kramer, Christopher M; Lindsay, Bruce D; Manning, Warren J; Mehrotra, Praveen; Patel, Manesh R; Sachdeva, Ritu; Wann, L Samuel; Winchester, David E; Wolk, Michael J; Allen, Joseph M
2018-04-01
This document is 1 of 2 companion appropriate use criteria (AUC) documents developed by the American College of Cardiology, American Association for Thoracic Surgery, American Heart Association, American Society of Echocardiography, American Society of Nuclear Cardiology, Heart Rhythm Society, Society for Cardiovascular Angiography and Interventions, Society of Cardiovascular Computed Tomography, Society for Cardiovascular Magnetic Resonance, and Society of Thoracic Surgeons. This document addresses the evaluation and use of multimodality imaging in the diagnosis and management of valvular heart disease, whereas the second, companion document addresses this topic with regard to structural heart disease. Although there is clinical overlap, the documents addressing valvular and structural heart disease are published separately, albeit with a common structure. The goal of the companion AUC documents is to provide a comprehensive resource for multimodality imaging in the context of valvular and structural heart disease, encompassing multiple imaging modalities. Using standardized methodology, the clinical scenarios (indications) were developed by a diverse writing group to represent patient presentations encountered in everyday practice and included common applications and anticipated uses. Where appropriate, the scenarios were developed on the basis of the most current American College of Cardiology/American Heart Association guidelines. A separate, independent rating panel scored the 92 clinical scenarios in this document on a scale of 1 to 9. Scores of 7 to 9 indicate that a modality is considered appropriate for the clinical scenario presented. Midrange scores of 4 to 6 indicate that a modality may be appropriate for the clinical scenario, and scores of 1 to 3 indicate that a modality is considered rarely appropriate for the clinical scenario. The primary objective of the AUC is to provide a framework for the assessment of these scenarios by practices that will improve and standardize physician decision making. AUC publications reflect an ongoing effort by the American College of Cardiology to critically and systematically create, review, and categorize clinical situations where diagnostic tests and procedures are utilized by physicians caring for patients with cardiovascular diseases. The process is based on the current understanding of the technical capabilities of the imaging modalities examined. Copyright © 2017. Published by Elsevier Inc.
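The score bands described above map directly to appropriateness categories; the small helper below encodes that mapping (the function name and interface are ours, not part of the AUC documents).

```python
# Minimal helper reflecting the AUC score bands described in the abstract
# (7-9 appropriate, 4-6 may be appropriate, 1-3 rarely appropriate).
def auc_category(score: int) -> str:
    if not 1 <= score <= 9:
        raise ValueError("AUC scores run from 1 to 9")
    if score >= 7:
        return "Appropriate"
    if score >= 4:
        return "May Be Appropriate"
    return "Rarely Appropriate"

assert auc_category(8) == "Appropriate"
assert auc_category(5) == "May Be Appropriate"
assert auc_category(2) == "Rarely Appropriate"
```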
Doherty, John U; Kort, Smadar; Mehran, Roxana; Schoenhagen, Paul; Soman, Prem
2017-12-01
This document is 1 of 2 companion appropriate use criteria (AUC) documents developed by the American College of Cardiology, American Association for Thoracic Surgery, American Heart Association, American Society of Echocardiography, American Society of Nuclear Cardiology, Heart Rhythm Society, Society for Cardiovascular Angiography and Interventions, Society of Cardiovascular Computed Tomography, Society for Cardiovascular Magnetic Resonance, and Society of Thoracic Surgeons. This document addresses the evaluation and use of multimodality imaging in the diagnosis and management of valvular heart disease, whereas the second, companion document addresses this topic with regard to structural heart disease. Although there is clinical overlap, the documents addressing valvular and structural heart disease are published separately, albeit with a common structure. The goal of the companion AUC documents is to provide a comprehensive resource for multimodality imaging in the context of valvular and structural heart disease, encompassing multiple imaging modalities. Using standardized methodology, the clinical scenarios (indications) were developed by a diverse writing group to represent patient presentations encountered in everyday practice and included common applications and anticipated uses. Where appropriate, the scenarios were developed on the basis of the most current American College of Cardiology/American Heart Association guidelines. A separate, independent rating panel scored the 92 clinical scenarios in this document on a scale of 1 to 9. Scores of 7 to 9 indicate that a modality is considered appropriate for the clinical scenario presented. Midrange scores of 4 to 6 indicate that a modality may be appropriate for the clinical scenario, and scores of 1 to 3 indicate that a modality is considered rarely appropriate for the clinical scenario. The primary objective of the AUC is to provide a framework for the assessment of these scenarios by practices that will improve and standardize physician decision making. AUC publications reflect an ongoing effort by the American College of Cardiology to critically and systematically create, review, and categorize clinical situations where diagnostic tests and procedures are utilized by physicians caring for patients with cardiovascular diseases. The process is based on the current understanding of the technical capabilities of the imaging modalities examined.
van der Meulen, Ineke; van de Sandt-Koenderman, W Mieke E; Duivenvoorden, Hugo J; Ribbers, Gerard M
2010-01-01
This study explores the psychometric qualities of the Scenario Test, a new test to assess daily-life communication in severe aphasia. The test is innovative in that it: (1) examines the effectiveness of verbal and non-verbal communication; and (2) assesses patients' communication in an interactive setting, with a supportive communication partner. To determine the reliability, validity, and sensitivity to change of the Scenario Test and discuss its clinical value. The Scenario Test was administered to 122 persons with aphasia after stroke and to 25 non-aphasic controls. Analyses were performed for the entire group of persons with aphasia, as well as for a subgroup of persons unable to communicate verbally (n = 43). Reliability (internal consistency, test-retest reliability, inter-judge, and intra-judge reliability) and validity (internal validity, convergent validity, known-groups validity) and sensitivity to change were examined using standard psychometric methods. The Scenario Test showed high levels of reliability. Internal consistency (Cronbach's alpha = 0.96; item-rest correlations = 0.58-0.82) and test-retest reliability (ICC = 0.98) were high. Agreement between judges in total scores was good, as indicated by the high inter- and intra-judge reliability (ICC = 0.86-1.00). Agreement in scores on the individual items was also good (square-weighted kappa values 0.61-0.92). The test demonstrated good levels of validity. A principal component analysis for categorical data identified two dimensions, interpreted as general communication and communicative creativity. Correlations with three other instruments measuring communication in aphasia, that is, Spontaneous Speech interview from the Aachen Aphasia Test (AAT), Amsterdam-Nijmegen Everyday Language Test (ANELT), and Communicative Effectiveness Index (CETI), were moderate to strong (0.50-0.85) suggesting good convergent validity. Group differences were observed between persons with aphasia and non-aphasic controls, as well as between persons with aphasia unable to use speech to convey information and those able to communicate verbally; this indicates good known-groups validity. The test was sensitive to changes in performance, measured over a period of 6 months. The data support the reliability and validity of the Scenario Test as an instrument for examining daily-life communication in aphasia. The test focuses on multimodal communication; its psychometric qualities enable future studies on the effect of Alternative and Augmentative Communication (AAC) training in aphasia.
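For readers unfamiliar with the reliability statistics quoted above, the sketch below computes Cronbach's alpha and item-rest correlations for a hypothetical persons-by-items score matrix; it is generic psychometrics, not the study's data or code.

```python
# Sketch of two reliability statistics of the kind reported for the Scenario
# Test, computed on a synthetic persons-by-items score matrix.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def item_rest_correlations(scores: np.ndarray) -> np.ndarray:
    total = scores.sum(axis=1)
    return np.array([np.corrcoef(scores[:, j], total - scores[:, j])[0, 1]
                     for j in range(scores.shape[1])])

rng = np.random.default_rng(0)
ability = rng.normal(size=(100, 1))                       # shared trait
demo = ability + rng.normal(scale=0.7, size=(100, 6))     # 6 correlated items
print(round(cronbach_alpha(demo), 2), np.round(item_rest_correlations(demo), 2))
```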
Reducing a Knowledge-Base Search Space When Data Are Missing
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
This software addresses the problem of how to efficiently execute a knowledge base in the presence of missing data. Computationally, this is an exponentially expensive operation that without heuristics generates a search space of 1 + 2^n possible scenarios, where n is the number of rules in the knowledge base. Even for a knowledge base of the most modest size, say 16 rules, it would produce 65,537 possible scenarios. The purpose of this software is to reduce the complexity of this operation to a more manageable size. The problem that this system solves is to develop an automated approach that can reason in the presence of missing data. This is a meta-reasoning capability that repeatedly calls a diagnostic engine/model to provide prognoses and prognosis tracking. In the big picture, the scenario generator takes as its input the current state of a system, including probabilistic information from Data Forecasting. Using model-based reasoning techniques, it returns an ordered list of fault scenarios that could be generated from the current state, i.e., the plausible future failure modes of the system as it presently stands. The scenario generator models a Potential Fault Scenario (PFS) as a black box, the input of which is a set of states tagged with priorities and the output of which is one or more potential fault scenarios tagged by a confidence factor. The results from the system are used by a model-based diagnostician to predict the future health of the monitored system.
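The quoted growth of the unpruned search space is easy to verify; the snippet below checks the 1 + 2^n count for a 16-rule knowledge base (the pruning heuristics themselves are not reproduced here).

```python
# Back-of-the-envelope check of the scenario-count formula quoted above:
# an unpruned search over n rules yields 1 + 2**n scenarios (65,537 for n = 16).
def unpruned_scenarios(n_rules: int) -> int:
    return 1 + 2 ** n_rules

assert unpruned_scenarios(16) == 65_537
for n in (8, 16, 32):
    print(n, unpruned_scenarios(n))
```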
Oremus, Mark; Tarride, Jean-Eric; Raina, Parminder; Thabane, Lehana; Foster, Gary; Goldsmith, Charlie H; Clayton, Natasha
2012-11-01
Alzheimer's disease (AD) is a neurodegenerative disorder highlighted by progressive declines in cognitive and functional abilities. Our objective was to assess the general public's maximum willingness to pay ((M)WTP) for an increase in annual personal income taxes to fund unrestricted access to AD medications. We randomly recruited 500 Canadians nationally and used computer-assisted telephone interviewing to administer a questionnaire. The questionnaire contained four 'efficacy' scenarios describing an AD medication as capable of symptomatically treating cognitive decline or modifying disease progression. The scenarios also described the medication as having no adverse effects or a 30% chance of adverse effects. We randomized participants to order of scenarios and willingness-to-pay bid values; (M)WTP for each scenario was the highest accepted bid for that scenario. We conducted linear regression and bootstrap sensitivity analyses to investigate potential determinants of (M)WTP. Mean (M)WTP was highest for the 'disease modification/no adverse effects' scenario ($Can130.26) and lowest for the 'symptomatic treatment/30% chance of adverse effects' scenario ($Can99.16). Bootstrap analyses indicated none of our potential determinants (e.g. age, sex) were associated with participants' (M)WTP. The general public is willing to pay higher income taxes to fund unrestricted access to AD (especially disease-modifying) medications. Consequently, the public should favour placing new AD medications on public drug plans. As far as we are aware, no other study has elicited the general public's willingness to pay for AD medications.
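The elicitation rule stated above, taking the highest accepted bid as (M)WTP, can be expressed as a one-line helper; the bid values in the example are invented.

```python
# Sketch of the elicitation rule described in the abstract: a participant's
# maximum willingness to pay is the highest bid they accepted.
def max_wtp(responses):
    """responses: mapping of bid value -> accepted (True/False)."""
    accepted = [bid for bid, yes in responses.items() if yes]
    return max(accepted) if accepted else None

print(max_wtp({25.0: True, 50.0: True, 100.0: True, 150.0: False}))  # 100.0
```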
Combinatorial structure of genome rearrangements scenarios.
Ouangraoua, Aïda; Bergeron, Anne
2010-09-01
In genome rearrangement theory, one of the elusive questions raised in recent years is the enumeration of rearrangement scenarios between two genomes. This problem is related to the uniform generation of rearrangement scenarios and the derivation of tests of statistical significance of the properties of these scenarios. Here we give an exact formula for the number of double-cut-and-join (DCJ) rearrangement scenarios between two genomes. We also construct effective bijections between the set of scenarios that sort a component and well-studied combinatorial objects such as parking functions, labeled trees, and Prüfer codes.
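As a concrete illustration of one of the combinatorial families mentioned, the brute-force check below counts parking functions of length n and compares the count with the closed form (n + 1)^(n - 1); it does not reproduce the paper's DCJ scenario formula.

```python
# Illustrative brute-force count of parking functions, one of the combinatorial
# families named in the abstract; their count is (n + 1)**(n - 1).
from itertools import product

def is_parking_function(seq):
    # A sequence is a parking function iff its sorted values satisfy b_i <= i.
    return all(v <= i + 1 for i, v in enumerate(sorted(seq)))

def count_parking_functions(n: int) -> int:
    return sum(is_parking_function(p) for p in product(range(1, n + 1), repeat=n))

for n in range(1, 6):
    assert count_parking_functions(n) == (n + 1) ** (n - 1)
print("brute-force counts match (n + 1)**(n - 1) for n = 1..5")
```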
High altitude airship configuration and power technology and method for operation of same
NASA Technical Reports Server (NTRS)
Choi, Sang H. (Inventor); Elliott, Jr., James R. (Inventor); King, Glen C. (Inventor); Park, Yeonjoon (Inventor); Kim, Jae-Woo (Inventor); Chu, Sang-Hyon (Inventor)
2011-01-01
A new High Altitude Airship (HAA) capable of various extended applications and mission scenarios, utilizing inventive onboard energy harvesting and power distribution systems. The power technology comprises an advanced thermoelectric (ATE) thermal energy conversion system. The high efficiency of multiple stages of ATE materials in a tandem mode, each suited for best performance within a particular temperature range, permits the ATE system to generate a high quantity of harvested energy for the extended mission scenarios. When a figure of merit of 5 is considered, the cascaded efficiency of the three-stage ATE system exceeds 60 percent.
Development of a tele-stethoscope and its application in pediatric cardiology.
Hedayioglu, F L; Mattos, S S; Moser, L; de Lima, M E
2007-01-01
Over the years, many attempts have been made to develop special stethoscopes for the teaching of auscultation. The objective of this article is to report on the experience with the development and implementation of an electronic stethoscope and a virtual library of cardiac sounds. There were four stages to this project: (1) the building of the prototype to acquire, filter and amplify the cardiac sounds, (2) the development of a software program to record, reproduce and visualize them, (3) the testing of the prototype in a clinical scenario, and (4) the development of an internet site to store and display the sounds collected. The first two stages are now complete. The prototype underwent an initial evaluation in a clinical scenario within the Unit and during virtual out-patient clinical sessions. One hundred auscultations were recorded during these tests. They were reviewed and discussed on-line by a panel of experienced cardiologists during the sessions. Although the sounds were considered "satisfactory" for diagnostic purposes by the cardiology team, they identified some qualitative differences in the electronically recorded auscultations, such as a higher pitch of the recorded sounds. Prospective clinical studies are now being conducted to further evaluate the interference of the electronic device with the physicians' capability to diagnose different cardiac conditions. An internet site (www.caduceusvirtual.com.br/auscultaped) was developed to host these cardiac auscultations. It is set up as a library of cardiac sounds, catalogued by pathologies, and already contains examples of auscultations of the majority of common congenital heart lesions, such as septal defects and valvar lesions.
Validation of the thermal code of RadTherm-IR, IR-Workbench, and F-TOM
NASA Astrophysics Data System (ADS)
Schwenger, Frédéric; Grossmann, Peter; Malaplate, Alain
2009-05-01
System assessment by image simulation requires synthetic scenarios that can be viewed by the device to be simulated. In addition to physical modeling of the camera, a reliable modeling of scene elements is necessary. Software products for modeling of target data in the IR should be capable of (i) predicting surface temperatures of scene elements over a long period of time and (ii) computing sensor views of the scenario. For such applications, FGAN-FOM acquired the software products RadTherm-IR (ThermoAnalytics Inc., Calumet, USA) and IR-Workbench (OKTAL-SE, Toulouse, France). Inspection of the accuracy of simulation results by validation is necessary before using these products for applications. In the first step of validation, the performance of both "thermal solvers" was determined through comparison of the computed diurnal surface temperatures of a simple object with the corresponding values from measurements. CUBI is a rather simple geometric object with well-known material parameters, which makes it suitable for testing and validating object models in the IR. It was used in this study as a test body. Comparison of calculated and measured surface temperature values will be presented, together with the results from the FGAN-FOM thermal object code F-TOM. In the second validation step, radiances of the simulated sensor views computed by RadTherm-IR and IR-Workbench will be compared with radiances retrieved from the recorded sensor images taken by the sensor that was simulated. Strengths and weaknesses of the models RadTherm-IR, IR-Workbench and F-TOM will be discussed.
Aerosol Delivery for Amendment Distribution in Contaminated Vadose Zones
NASA Astrophysics Data System (ADS)
Hall, R. J.; Murdoch, L.; Riha, B.; Looney, B.
2011-12-01
Remediation of contaminated vadose zones is often hindered by an inability to effectively distribute amendments. Many amendment-based approaches have been successful in saturated formations; however, they have not been widely pursued for treating contaminated unsaturated materials due to amendment distribution limitations. Aerosol delivery is a promising new approach for distributing amendments in contaminated vadose zones. Amendments are aerosolized and injected through well screens. During injection the aerosol particles are transported with the gas and deposited on the surfaces of soil grains. Resulting distributions are radially and vertically broad, which could not be achieved by injecting pure liquid-phase solutions. The objectives of this work were (A) to characterize transport and deposition behaviors of aerosols and (B) to develop capabilities for predicting the results of aerosol injection scenarios. Aerosol transport and deposition processes were investigated by conducting lab-scale injection experiments. These experiments involved injection of aerosols through a 2 m radius, sand-filled wedge. A particle analyzer was used to measure aerosol particle distributions over time, and sand samples were taken for amendment content analysis. Predictive capabilities were obtained by constructing a numerical model capable of simulating aerosol transport and deposition in porous media. Results from tests involving vegetable oil aerosol injection show that liquid contents appropriate for remedial applications could be readily achieved throughout the sand-filled wedge. Lab-scale tests conducted with aqueous aerosols show that liquid accumulation only occurs near the point of injection. Tests were also conducted using 200 g/L salt water as the aerosolized liquid. Liquid accumulations observed during salt water tests were minimal and similar to aqueous aerosol results. However, particles were measured, and salt deposited, distal to the point of injection. Differences between aqueous and oil deposition are assumed to occur due to surface interactions and the susceptibility of aqueous aerosols to evaporation. Distal salt accumulation during salt water aerosol tests suggests that solid salt forms as salt water aerosols evaporate. The solid salt aerosols are less likely to deposit, so they travel further than aqueous aerosols. A numerical model was calibrated using results from lab-scale tests. The calibrated model was then used to simulate field-scale aerosol injection. Results from field-scale simulations suggest that effective radii of influence on the scale of 8-10 meters could be achieved in partially saturated sand. The aerosol delivery process appears to be capable of distributing oil amendments over considerable volumes of formation at concentrations appropriate for remediation purposes. Thus far, evaporation has limited the liquid accumulation observed when distributing aqueous aerosols; however, results from salt water experiments suggest that injection of solid-phase aerosols can effectively distribute water-soluble amendments (electron donor, pH buffer, oxidants, etc.). Utilization of aerosol delivery could considerably expand treatment options for contaminated vadose zones at a wide variety of sites.
10 CFR 26.123 - Testing facility capabilities.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 1 2014-01-01 2014-01-01 false Testing facility capabilities. 26.123 Section 26.123 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Licensee Testing Facilities § 26.123 Testing facility capabilities. Each licensee testing facility shall have the capability, at the same...
10 CFR 26.123 - Testing facility capabilities.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 1 2012-01-01 2012-01-01 false Testing facility capabilities. 26.123 Section 26.123 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Licensee Testing Facilities § 26.123 Testing facility capabilities. Each licensee testing facility shall have the capability, at the same...
10 CFR 26.123 - Testing facility capabilities.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 1 2013-01-01 2013-01-01 false Testing facility capabilities. 26.123 Section 26.123 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Licensee Testing Facilities § 26.123 Testing facility capabilities. Each licensee testing facility shall have the capability, at the same...
10 CFR 26.123 - Testing facility capabilities.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 1 2011-01-01 2011-01-01 false Testing facility capabilities. 26.123 Section 26.123 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Licensee Testing Facilities § 26.123 Testing facility capabilities. Each licensee testing facility shall have the capability, at the same...
10 CFR 26.123 - Testing facility capabilities.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Testing facility capabilities. 26.123 Section 26.123 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Licensee Testing Facilities § 26.123 Testing facility capabilities. Each licensee testing facility shall have the capability, at the same...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-15
.... OCC-2012-0016] Policy Statement on the Principles for Development and Distribution of Annual Stress... and factors to be used by the OCC in developing and distributing the stress test scenarios for the annual stress test required by the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010 as...
C-arm rotation encoding with accelerometers.
Grzeda, Victor; Fichtinger, Gabor
2010-07-01
Fluoroscopic C-arms are being incorporated into computer-assisted interventions in increasing numbers. For these applications to work, the relative poses of the images must be known. To find the pose, tracking methods such as optical cameras, electromagnetic trackers, and radiographic fiducials have been used, all hampered by significant shortcomings. We propose to recover the rotational pose of the C-arm using the angle-sensing ability of accelerometers, by exploiting the capability of the accelerometer to measure tilt angles. By affixing the accelerometer to a C-arm, the accelerometer tracks the C-arm pose during rotation. To demonstrate this concept, a C-arm analogue was constructed with a webcam affixed to the C-arm model to mimic X-ray imaging. Then, by measuring the offset between the accelerometer angle readings and the webcam pose angle, an angle correction equation (ACE) was created to properly track the C-arm rotational pose. Several tests were performed on the webcam C-arm model using the ACEs to track the primary and secondary angle rotations of the model. We evaluated the capability of linear and polynomial ACEs to track the webcam C-arm pose angle for different rotational scenarios. The test results showed that the accelerometer could track the pose of the webcam C-arm model with an error of less than 1.0 degree. The accelerometer was successful in sensing the C-arm's rotation with clinically adequate accuracy in the webcam C-arm model.
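The calibration idea behind the angle correction equation can be sketched as a polynomial fit from accelerometer tilt readings to reference pose angles; the synthetic offsets below are assumptions, not the study's measured ACE coefficients.

```python
# Sketch of the ACE calibration idea: fit a linear or polynomial mapping from
# accelerometer readings to reference pose angles, then correct new readings.
# The synthetic bias, scale, and noise below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
true_angle = np.linspace(-40, 40, 81)                   # reference pose, degrees
accel_angle = 0.97 * true_angle + 1.5 + rng.normal(0, 0.3, true_angle.size)

linear_ace = np.polynomial.Polynomial.fit(accel_angle, true_angle, deg=1)
cubic_ace = np.polynomial.Polynomial.fit(accel_angle, true_angle, deg=3)

for name, ace in (("linear", linear_ace), ("cubic", cubic_ace)):
    err = np.abs(ace(accel_angle) - true_angle)
    print(f"{name} ACE: max residual {err.max():.2f} deg")
```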
Controlling Hazardous Releases while Protecting Passengers in Civil Infrastructure Systems
NASA Astrophysics Data System (ADS)
Rimer, Sara P.; Katopodes, Nikolaos D.
2015-11-01
The threat of accidental or deliberate toxic chemicals released into public spaces is a significant concern to public safety, and the real-time detection and mitigation of such hazardous contaminants has the potential to minimize harm and save lives. Furthermore, the safe evacuation of occupants during such a catastrophe is of utmost importance. This research develops a comprehensive means to address such scenarios, through both the sensing and control of contaminants, and the modeling of and potential communication to occupants as they evacuate. A computational fluid dynamics model is developed of a simplified public space characterized by a long conduit (e.g. airport terminal) with unidirectional ambient flow that is capable of detecting and mitigating the hazardous contaminant (via boundary ports) over several time horizons using model predictive control optimization. Additionally, a physical prototype is built to test the real-time feasibility of this computational flow control model. The prototype is a blower wind-tunnel with an elongated test section, with the capability of sensing (via digital camera) an injected 'contaminant' (propylene glycol smoke) and then mitigating that contaminant using actuators (compressed-air-operated vacuum nozzles) operated by a set of pressure regulators and a programmable controller. Finally, an agent-based model is developed to simulate "agents" (i.e. building occupants) as they evacuate a public space, and is coupled with the computational flow control model such that agents must interact with a dynamic, threatening environment. NSF-CMMI #0856438.
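The receding-horizon idea behind the flow control can be illustrated on a toy linear system: at each step an actuation sequence is optimized over a short horizon and only the first move is applied. The dynamics, cost weights, and actuator model below are invented stand-ins, not the CFD model or hardware described above.

```python
# Generic receding-horizon (model predictive) control loop on a toy linear
# "contaminant" model. Matrices, costs, and bounds are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

A = np.array([[0.9, 0.1], [0.0, 0.9]])     # toy transport/dilution dynamics
B = np.array([-0.5, -0.2])                 # effect of one extraction actuator
horizon, steps = 5, 15

def predict_cost(u_seq, x0):
    x, cost = x0.copy(), 0.0
    for u in u_seq:
        x = A @ x + B * u
        cost += x @ x + 0.01 * u * u       # contaminant mass + actuation effort
    return cost

x = np.array([1.0, 0.8])                   # initial contaminant state
for k in range(steps):
    res = minimize(predict_cost, np.zeros(horizon), args=(x,),
                   bounds=[(0.0, 1.0)] * horizon)
    u0 = res.x[0]                          # apply only the first optimized move
    x = A @ x + B * u0
    print(f"step {k:2d}  u = {u0:.2f}  |x| = {np.linalg.norm(x):.3f}")
```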
Delay/Disruption Tolerant Networks (DTN): Testing and Demonstration for Lunar Surface Applications
NASA Technical Reports Server (NTRS)
2009-01-01
This slide presentation reviews the testing of the Delay/Disruption Tolerant Network (DTN) designed for use with lunar surface applications. This is being done through the DTN Experimental Network (DEN), which permits access and testing by other NASA centers, DTN team members, and protocol developers. The objective of this work is to demonstrate DTN for high-return applications in lunar scenarios; provide DEN connectivity with analogs of Constellation elements, emulators, and other resources from DTN team members; serve as a wireless communications staging ground for remote analog excursions; and enable testing of detailed communication scenarios and evaluation of network performance. Three scenarios for DTN on the lunar surface are reviewed: motion imagery, voice and sensor telemetry, and navigation telemetry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henline, P.A.
1995-12-31
The increased use of UNIX based computer systems for machine control, data handling and analysis has greatly enhanced the operating scenarios and operating efficiency of the DIII-D tokamak. This paper will describe some of these UNIX systems and their specific uses. These include the plasma control system, the electron cyclotron heating control system, the analysis of electron temperature and density measurements and the general data acquisition system (which is collecting over 130 Mbytes of data). The speed and total capability of these systems has dramatically affected the ability to operate DIII-D. The improved operating scenarios include better plasma shape control due to the more thorough MHD calculations done between shots and the new ability to see the time dependence of profile data as it relates across different spatial locations in the tokamak. Other analysis which engenders improved operating abilities will be described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henline, P.A.
1995-10-01
The increased use of UNIX based computer systems for machine control, data handling and analysis has greatly enhanced the operating scenarios and operating efficiency of the DIII-D tokamak. This paper will describe some of these UNIX systems and their specific uses. These include the plasma control system, the electron cyclotron heating control system, the analysis of electron temperature and density measurements and the general data acquisition system (which is collecting over 130 Mbytes of data). The speed and total capability of these systems has dramatically affected the ability to operate DIII-D. The improved operating scenarios include better plasma shape control due to the more thorough MHD calculations done between shots and the new ability to see the time dependence of profile data as it relates across different spatial locations in the tokamak. Other analysis which engenders improved operating abilities will be described.
The rogue nature of hiatuses in a global warming climate
NASA Astrophysics Data System (ADS)
Sévellec, F.; Sinha, B.; Skliris, N.
2016-08-01
The nature of rogue events is their unlikelihood, and the recent unpredicted decade-long slowdown in surface warming, the so-called hiatus, may be such an event. However, given decadal variability in climate, global surface temperatures were never expected to increase monotonically with increasing radiative forcing. Here surface air temperature from 20 climate models is analyzed to estimate the historical and future likelihood of hiatuses and "surges" (faster than expected warming), showing that the global hiatus of the early 21st century was extremely unlikely. A novel analysis of future climate scenarios suggests that hiatuses will almost vanish and surges will strongly intensify by 2100 under a "business as usual" scenario. For "CO2 stabilisation" scenarios, hiatus and surge characteristics revert to typical 1940s values. These results suggest studying the hiatus of the early 21st century and future recurrences as rogue events, at the limit of the variability of current climate modelling capability.
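One simple way to operationalize "hiatus" and "surge" periods is to compare rolling decadal trends against the long-term trend, as in the sketch below; the thresholds and the synthetic temperature series are illustrative assumptions, not the paper's definitions.

```python
# Sketch of flagging hiatus/surge decades from a surface temperature series by
# comparing rolling decadal trends with the long-term trend. Thresholds and the
# synthetic series are illustrative only.
import numpy as np

def decadal_trends(years, temps, window=10):
    slopes = []
    for i in range(len(years) - window + 1):
        slopes.append(np.polyfit(years[i:i + window], temps[i:i + window], 1)[0])
    return np.array(slopes)

rng = np.random.default_rng(2)
years = np.arange(1950, 2021)
temps = (0.015 * (years - 1950) + 0.1 * np.sin(years / 8.0)
         + rng.normal(0, 0.05, years.size))

trends = decadal_trends(years, temps)
long_term = np.polyfit(years, temps, 1)[0]
hiatus = trends < 0.0                       # no warming over the decade
surge = trends > 2.0 * long_term            # much faster than expected warming
print(f"{hiatus.sum()} hiatus and {surge.sum()} surge decades of {trends.size}")
```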
A Distributed Online 3D-LIDAR Mapping System
NASA Astrophysics Data System (ADS)
Schmiemann, J.; Harms, H.; Schattenberg, J.; Becker, M.; Batzdorfer, S.; Frerichs, L.
2017-08-01
In this paper we present work done within the joint development project ANKommEn. It deals with the development of a highly automated robotic system for fast data acquisition in civil disaster scenarios. One of the main requirements is a versatile system; hence the concept embraces a machine cluster consisting of multiple fundamentally different robotic platforms. To cover a large variety of potential deployment scenarios, neither the absolute number of participants nor the precise individual layout of each platform is restricted within the conceptual design. This leads to a variety of special requirements, such as onboard and online data processing capabilities for each individual participant and efficient data exchange structures allowing reliable random data exchange between individual robots. We demonstrate the functionality and performance by means of a distributed mapping system evaluated with real-world data in challenging urban and rural indoor/outdoor scenarios.
Kane, Sara K; Lorant, Diane E
2018-05-24
To measure variation in delivery room supervision provided by neonatologists using hypothetical scenarios and to determine the factors used to guide entrustment decisions. A survey was distributed to members of the American Academy of Pediatrics Section on Perinatal Pediatrics. Neonatologists were presented with various newborn resuscitation scenarios and asked to choose the level of supervision they thought appropriate and to grade factors on their importance in making entrustment decisions. There was significant variation in the supervision neonatologists deemed necessary for most scenarios (deviation from the mode 0.36-0.69). Post-graduate year of training and environmental circumstances influence the amount of autonomy neonatologists grant trainees. Few neonatologists have an objective assessment of a trainee's competence in neonatal resuscitation available to them, and most never document how the trainee performed. Delivery room supervision is often determined by subjective evaluation of trainees' competence and may not provide a level of supervision congruent with their capability.
NDFOM Description for DNDO Summer Internship Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Budden, Brent Scott
2017-12-01
Nuclear Detection Figure of Merit (NDFOM) is a DNDO-funded project at LANL to develop a software framework that allows a user to evaluate a radiation detection scenario of interest, quickly obtaining results on detector performance. It is intended as a “first step” in detector performance assessment and is meant to be easily employed by subject matter experts (SMEs) and non-SMEs alike. The generic scenario consists of a potential source moving past a detector at a relative velocity and with a distance of closest approach. Such a scenario is capable of describing, e.g., vehicles driving through portal monitors, border patrol scanning suspected illicit materials with a handheld instrument, and first responders with backpack- or pager-based detectors (see Fig. 1). The backend library is prepopulated by the NDFOM developers to include sources and detectors of interest to DNDO and its community.
A Car-Steering Model Based on an Adaptive Neuro-Fuzzy Controller
NASA Astrophysics Data System (ADS)
Amor, Mohamed Anis Ben; Oda, Takeshi; Watanabe, Shigeyoshi
This paper is concerned with the development of a car-steering model for traffic simulation. Our focus in this paper is to propose a model of the steering behavior of a human driver for different driving scenarios. These scenarios are modeled in a unified framework using the idea of a target position. The proposed approach deals with the driver’s approximation and decision-making mechanisms in tracking a target position by means of fuzzy set theory. The main novelty of this paper lies in the development of a learning algorithm intended to imitate the driver’s self-learning from his driving experience and to mimic his maneuvers on the steering wheel, using linear networks as local approximators in the corresponding fuzzy areas. Results obtained from the simulation of an obstacle avoidance scenario show the capability of the model to carry out human-like behavior with emphasis on learned skills.
McClenaghan, Joseph; Garofalo, Andrea M.; Meneghini, Orso; ...
2017-08-03
In this study, transport modeling of a proposed ITER steady-state scenario based on DIII-D high poloidal-beta (βp) discharges finds that ITB formation can occur with either sufficient rotation or a negative central shear q-profile. The high βp scenario is characterized by a large bootstrap current fraction (80%), which reduces the demands on the external current drive, and a large-radius internal transport barrier, which is associated with excellent normalized confinement. Modeling predictions of the electron transport in the high βp scenario improve as q95 approaches levels similar to typical existing models of ITER steady state, and the ion transport is turbulence dominated. Typical temperature and density profiles from the non-inductive high βp scenario on DIII-D are scaled according to 0D modeling predictions of the requirements for achieving a Q = 5 steady-state fusion gain in ITER with 'day one' heating and current drive capabilities. Then, TGLF turbulence modeling is carried out under systematic variations of the toroidal rotation and the core q-profile. A high bootstrap fraction, high βp scenario is found to be near an ITB formation threshold, and either strong negative central magnetic shear or rotation is found to successfully provide the turbulence suppression required to achieve Q = 5.
Levis, James W; Barlaz, Morton A; Decarolis, Joseph F; Ranjithan, S Ranji
2014-04-01
Solid waste management (SWM) systems must proactively adapt to changing policy requirements, waste composition, and an evolving energy system to sustainably manage future solid waste. This study represents the first application of an optimizable dynamic life-cycle assessment framework capable of considering these future changes. The framework was used to draw insights by analyzing the SWM system of a hypothetical suburban U.S. city of 100 000 people over 30 years while considering changes to population, waste generation, and energy mix and costs. The SWM system included 3 waste generation sectors, 30 types of waste materials, and 9 processes for waste separation, treatment, and disposal. A business-as-usual scenario (BAU) was compared to three optimization scenarios that (1) minimized cost (Min Cost), (2) maximized diversion (Max Diversion), and (3) minimized greenhouse gas (GHG) emissions (Min GHG) from the system. The Min Cost scenario saved $7.2 million (12%) and reduced GHG emissions (3%) relative to the BAU scenario. Compared to the Max Diversion scenario, the Min GHG scenario cost approximately 27% less and more than doubled the net reduction in GHG emissions. The results illustrate how the timed-deployment of technologies in response to changes in waste composition and the energy system results in more efficient SWM system performance compared to what is possible from static analyses.
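In the same spirit as the scenario optimizations described above, the toy linear program below routes waste tonnage to processes to minimize cost subject to a diversion target; process names, costs, and emission factors are invented for illustration.

```python
# Toy linear program in the spirit of the Min Cost / Max Diversion / Min GHG
# scenario optimizations. All coefficients are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

tons = 100_000.0
processes = ["landfill", "recycling", "composting", "waste_to_energy"]
cost = np.array([45.0, 70.0, 55.0, 90.0])          # $ per ton (illustrative)
ghg = np.array([0.50, -0.30, -0.10, 0.20])         # t CO2e per ton (illustrative)
diverted = np.array([0.0, 1.0, 1.0, 1.0])          # counts toward diversion

# Minimize cost, require >= 40% diversion, and send all waste somewhere.
res = linprog(c=cost,
              A_ub=[-diverted], b_ub=[-0.4 * tons],
              A_eq=[np.ones(4)], b_eq=[tons],
              bounds=[(0, tons)] * 4, method="highs")

for name, x in zip(processes, res.x):
    print(f"{name:16s} {x:10.0f} t")
print(f"cost ${res.fun:,.0f}, GHG {ghg @ res.x:,.0f} t CO2e")
```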
Architecture for an integrated real-time air combat and sensor network simulation
NASA Astrophysics Data System (ADS)
Criswell, Evans A.; Rushing, John; Lin, Hong; Graves, Sara
2007-04-01
An architecture for an integrated air combat and sensor network simulation is presented. The architecture integrates two components: a parallel real-time sensor fusion and target tracking simulation, and an air combat simulation. By integrating these two simulations, it becomes possible to experiment with scenarios in which one or both sides in a battle have very large numbers of primitive passive sensors, and to assess the likely effects of those sensors on the outcome of the battle. Modern Air Power is a real-time theater-level air combat simulation that is currently being used as a part of the USAF Air and Space Basic Course (ASBC). The simulation includes a variety of scenarios from the Vietnam war to the present day, and also includes several hypothetical future scenarios. Modern Air Power includes a scenario editor, an order of battle editor, and full AI customization features that make it possible to quickly construct scenarios for any conflict of interest. The scenario editor makes it possible to place a wide variety of sensors including both high fidelity sensors such as radars, and primitive passive sensors that provide only very limited information. The parallel real-time sensor network simulation is capable of handling very large numbers of sensors on a computing cluster of modest size. It can fuse information provided by disparate sensors to detect and track targets, and produce target tracks.
Nilsson, Daniel; Lindman, Magdalena; Victor, Trent; Dozza, Marco
2018-04-01
Single-vehicle run-off-road crashes are a major traffic safety concern, as they are associated with a high proportion of fatal outcomes. In addressing run-off-road crashes, the development and evaluation of advanced driver assistance systems require test scenarios that are representative of the variability found in real-world crashes. We apply hierarchical agglomerative cluster analysis to identify similarities in a set of crash data variables; the resulting clusters can then be used as the basis for test scenario development. Out of 13 clusters, nine test scenarios are derived, corresponding to crashes characterised by: drivers drifting off the road in daytime and night-time, high-speed departures, high-angle departures on narrow roads, highways, snowy roads, loss of control on wet roadways, sharp curves, and high speeds on roads with severe road surface conditions. In addition, each cluster was analysed with respect to crash variables related to the crash cause and the reason for the unintended lane departure. The study shows that cluster analysis of representative data provides a statistically based method to identify relevant properties for run-off-road test scenarios. This was done to support the development of vehicle-based run-off-road countermeasures and driver behaviour models used in virtual testing. Future studies should use driver behaviour from naturalistic driving data to further define how test scenarios and behavioural causation mechanisms should be included. Copyright © 2018 Elsevier Ltd. All rights reserved.
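The clustering step can be sketched with standard tools: hierarchical (Ward) agglomeration of standardized crash variables, cut into 13 clusters as in the study; the variables and records below are synthetic stand-ins for the real crash data.

```python
# Sketch of hierarchical agglomerative clustering of crash records on a few
# encoded crash variables, cut into 13 clusters. Data are synthetic stand-ins.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
# Hypothetical encoded variables: speed (km/h), departure angle (deg),
# night (0/1), wet or snowy surface (0/1).
crashes = np.column_stack([
    rng.normal(90, 25, 300),
    rng.normal(8, 6, 300),
    rng.integers(0, 2, 300),
    rng.integers(0, 2, 300),
]).astype(float)

# Standardize so no single variable dominates the distance metric.
z = (crashes - crashes.mean(axis=0)) / crashes.std(axis=0)
tree = linkage(z, method="ward")
labels = fcluster(tree, t=13, criterion="maxclust")   # 13 clusters, as in the study
for c in np.unique(labels):
    print(f"cluster {c:2d}: {np.sum(labels == c):3d} crashes")
```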
NASA Astrophysics Data System (ADS)
McClenaghan, J.; Garofalo, A. M.; Meneghini, O.; Smith, S. P.
2016-10-01
Transport modeling of a proposed ITER steady-state scenario based on DIII-D high βP discharges finds that the core confinement may be improved with either sufficient rotation or a negative central shear q-profile. The high poloidal beta scenario is characterized by a large bootstrap current fraction (80%), which reduces the demands on the external current drive, and a large-radius internal transport barrier, which is associated with improved normalized confinement. Typical temperature and density profiles from the non-inductive high poloidal beta scenario on DIII-D are scaled according to 0D modeling predictions of the requirements for achieving Q=5 steady-state performance in ITER with "day one" H&CD capabilities. Then, TGLF turbulence modeling is carried out under systematic variations of the toroidal rotation and the core q-profile. Either strong negative central magnetic shear or rotation is found to successfully provide the turbulence suppression required to maintain the temperature and density profiles. This work was supported by the US Department of Energy under DE-FC02-04ER54698.
Overview of the EUROfusion Medium Size Tokamak scientific program
NASA Astrophysics Data System (ADS)
Bernert, Matthias; Bolzonella, Tommaso; Coda, Stefano; Hakola, Antti; Meyer, Hendrik; Eurofusion Mst1 Team; Tcv Team; Mast-U Team; ASDEX Upgrade Team
2017-10-01
Under the EUROfusion MST1 program, coordinated experiments are conducted at three European medium-sized tokamaks (ASDEX Upgrade, TCV and MAST-U). The program complements the JET program in preparing safe and efficient operation for ITER and DEMO. Work under MST1 benefits from cross-machine comparisons but also makes use of the unique capabilities of each device. For the 2017/2018 campaign, 25 topic areas were defined targeting three main objectives: 1) development towards an edge- and wall-compatible H-mode scenario with small or no ELMs; 2) investigation of disruptions in order to achieve better predictions and improve avoidance or mitigation schemes; 3) exploration of conventional and alternative divertor configurations for future high-P/R scenarios. This contribution gives an overview of the work done under MST1, exemplified by highlight results for each main objective from the last campaigns, such as evaluation of natural small-ELM scenarios, runaway mitigation and control, assessment of detachment in alternative divertor configurations, and highly radiative scenarios. See the author list of "H. Meyer et al. 2017 Nucl. Fusion 57, 102014".
Excreta Sampling as an Alternative to In Vivo Measurements at the Hanford Site.
Carbaugh, Eugene H; Antonio, Cheryl L; Lynch, Timothy P
2015-08-01
The capabilities of indirect radiobioassay by urine and fecal sample analysis were compared with the direct radiobioassay methods of whole body counting and lung counting for the most common radionuclides and inhalation exposure scenarios encountered by Hanford workers. Radionuclides addressed by in vivo measurement included 137Cs, 60Co, 154Eu, and 241Am as an indicator for plutonium mixtures. The same radionuclides were addressed using gamma energy analysis of urine samples, augmented by radiochemistry and alpha spectrometry methods for plutonium in urine and fecal samples. It was concluded that in vivo whole body counting and lung counting capability should be maintained at the Hanford Site for the foreseeable future; however, urine and fecal sample analysis could provide adequate, though degraded, monitoring capability for workers as a short-term alternative, should in vivo capability be lost due to planned or unplanned circumstances.
Study of gamma detection capabilities of the REWARD mobile spectroscopic system
NASA Astrophysics Data System (ADS)
Balbuena, J. P.; Baptista, M.; Barros, S.; Dambacher, M.; Disch, C.; Fiederle, M.; Kuehn, S.; Parzefall, U.
2017-07-01
REWARD is a novel mobile spectroscopic radiation detector system for Homeland Security applications. The system integrates gamma and neutron detection equipped with wireless communication. A comprehensive simulation study on its gamma detection capabilities in different radioactive scenarios is presented in this work. The gamma detection unit consists of a precise energy resolution system based on two stacked (Cd,Zn)Te sensors working in coincidence sum mode. The volume of each of these CZT sensors is 1 cm³. The investigated energy windows used to determine the detection capabilities of the detector correspond to the gamma emissions from 137Cs and 60Co radioactive sources (662 keV and 1173/1333 keV, respectively). Monte Carlo and Technology Computer-Aided Design (TCAD) simulations are combined to determine its sensing capabilities for different radiation sources and to estimate the limits of detection of the sensing unit as a function of source activity for several shielding materials.
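The energy-window counting implied above can be sketched as follows: count events whose deposited energy falls within windows around the 137Cs and 60Co lines. Window widths and the synthetic event list are assumptions for illustration.

```python
# Sketch of energy-window counting around the 137Cs and 60Co gamma lines.
# The window half-width and the synthetic event energies are illustrative.
import numpy as np

LINES_KEV = {"137Cs": [662.0], "60Co": [1173.0, 1333.0]}

def window_counts(energies_kev, half_width_kev=30.0):
    counts = {}
    for nuclide, lines in LINES_KEV.items():
        mask = np.zeros(len(energies_kev), dtype=bool)
        for line in lines:
            mask |= np.abs(energies_kev - line) <= half_width_kev
        counts[nuclide] = int(mask.sum())
    return counts

rng = np.random.default_rng(4)
background = rng.uniform(50, 2000, 5000)              # flat synthetic background
source = rng.normal(662.0, 15.0, 400)                 # smeared 137Cs photopeak
print(window_counts(np.concatenate([background, source])))
```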
Advanced Capabilities for Wind Tunnel Testing in the 21st Century
NASA Technical Reports Server (NTRS)
Kegelman, Jerome T.; Danehy, Paul M.; Schwartz, Richard J.
2010-01-01
Wind tunnel testing methods and test technologies for the 21st century using advanced capabilities are presented. These capabilities are necessary to capture more accurate, higher-quality test results by reducing the uncertainties in testing and to facilitate verification of computational tools for design. This paper discusses near-term developments underway in ground testing capabilities, which will enhance the quality of information on both the test article and airstream flow details. Also discussed is a selection of new capability investments that have been made to accommodate such developments. Examples include advanced experimental methods for measuring the test gas itself; efficient experiment methodologies, including quality assurance strategies within the test; and increasing test result information density by using extensive optical visualization together with computed flow field results. These points apply both to major investments in existing tunnel capabilities and to entirely new capabilities.
Space Launch System Co-Manifested Payload Options for Habitation
NASA Technical Reports Server (NTRS)
Smitherman, David
2015-01-01
The Space Launch System (SLS) has a co-manifested payload capability that will grow over time as the rocket matures and planned upgrades are implemented. The final configuration is planned to be capable of inserting a payload greater than 10 metric tons (mt) into a trans-lunar injection trajectory along with the crew in the Orion capsule and the service module. The co-manifested payload is located below the Orion and its service module in a 10-meter-high fairing, similar to the way the Saturn launch vehicle carried the lunar lander below the Apollo command and service modules. A variety of approaches that utilize this co-manifested payload capability to build up infrastructure in deep space have been explored in support of future asteroid, lunar, and Mars mission scenarios. This paper reports the findings from the Advanced Concepts Office study team at the NASA Marshall Space Flight Center, working with the Advanced Exploration Systems Program on the Exploration Augmentation Module Project. It includes some of the possible options for habitation in the co-manifested payload volume on SLS. Findings include module designs that can be developed in 10-mt increments to support these missions, including overall conceptual layouts, mass properties, and approaches for integration into various scenarios for near-term support of deep space habitat research and technology development, support of asteroid exploration, and long-range support of Mars transfer flights.
NASA Technical Reports Server (NTRS)
Colombano, Silvano P.; Kirchner, Frank; Spenneberg, Dirk; Starman, Jared; Hanratty, James; Kovsmeyer, David (Technical Monitor)
2003-01-01
NASA needs autonomous robotic exploration of difficult (rough and/or steep) scientifically interesting Martian terrains. Concepts involving distributed autonomy for cooperative robotic exploration are key to enabling new scientific objectives in robotic missions. We propose to utilize a legged robot as an adjunct scout to a rover for access to difficult, scientifically interesting terrains (rocky areas, slopes, cliffs). Our final mission scenario involves the Ames rover platform "K9" and Scorpion acting together to explore a steep cliff, with the Scorpion robot rappelling down using the K9 as an anchor as well as mission planner and executive. Cooperation concepts, including wheeled rappelling robots, have been proposed before. Here we propose to test the combined advantages of a wheeled vehicle with a legged scout, as well as the advantages of merging high-level planning and execution with biologically inspired, behavior-based robotics. We propose to use the 8-legged, multifunctional autonomous robot platform Scorpion, which is currently capable of walking on different terrains (rocks, sand, grass, ...) and of perceiving its environment and modifying its behavioral pattern accordingly. These capabilities would be extended to enable the Scorpion to communicate and cooperate with a partner robot and to climb over rocks, rubble piles, and objects with structural features. This will be done in the context of exploration of rough terrains in the neighborhood of the rover but inaccessible to it, culminating in the added capability of rappelling down a steep cliff for both vertical and horizontal terrain observation.
Assessing Threat Detection Scenarios through Hypothesis Generation and Testing
2015-12-01
Dog Day scenario ... Figure 1. Rankings of priority threats identified in the Dog Day scenario ... Figure 2. Rankings of priority ... making in uncertain environments relies heavily on pattern matching. Cohen, Freeman, and Wolf (1996) reported that features of the decision problem
Adams, Vanessa M; Pressey, Robert L; Álvarez-Romero, Jorge G
2016-01-01
Development of land resources can contribute to increased economic productivity but can also negatively affect the extent and condition of native vegetation, jeopardize the persistence of native species, reduce water quality, and erode ecosystem services. Spatial planning must therefore balance outcomes for conservation, development, and social goals. One approach to evaluating these trade-offs is scenario planning. In this paper we demonstrate methods for incorporating stakeholder preferences into scenario planning through both defining scenario objectives and evaluating the scenarios that emerge. In this way, we aim to develop spatial plans capable of informing actual land-use decisions. We used a novel approach to scenario planning that couples optimal land-use design and social evaluation of environmental outcomes. Four land-use scenarios combined differences in total clearing levels (10% and 20%) in our study region, the Daly Catchment Australia, with the presence or absence of spatial precincts to concentrate irrigated agriculture. We used the systematic conservation planning tool Marxan with Zones to optimally plan for multiple land-uses that met objectives for both conservation and development. We assessed the performance of the scenarios in terms of the number of objectives met and the degree to which existing land-use policies were compromised (e.g., whether clearing limits in existing guidelines were exceeded or not). We also assessed the land-use scenarios using expected stakeholder satisfaction with changes in the catchment to explore how the scenarios performed against social preferences. There were a small fraction of conservation objectives with high conservation targets (100%) that could not be met due to current land uses; all other conservation and development objectives were met in all scenarios. Most scenarios adhered to the existing clearing guidelines with only marginal exceedances of limits, indicating that the scenario objectives were compatible with existing policy. We found that two key stakeholder groups, agricultural and Indigenous residents, had divergent satisfaction levels with the amount of clearing and agricultural development. Based on the range of benefits and potential adverse impacts of each scenario, we suggest that the 10% clearing scenarios are most aligned with stakeholder preferences and best balance preferences across stakeholder groups. Our approach to scenario planning is applicable generally to exploring the potential conflicts between goals for conservation and development. Our case study is particularly relevant to current discussion about increased agricultural and pastoral development in northern Australia.
Pressey, Robert L.; Álvarez-Romero, Jorge G.
2016-01-01
Development of land resources can contribute to increased economic productivity but can also negatively affect the extent and condition of native vegetation, jeopardize the persistence of native species, reduce water quality, and erode ecosystem services. Spatial planning must therefore balance outcomes for conservation, development, and social goals. One approach to evaluating these trade-offs is scenario planning. In this paper we demonstrate methods for incorporating stakeholder preferences into scenario planning through both defining scenario objectives and evaluating the scenarios that emerge. In this way, we aim to develop spatial plans capable of informing actual land-use decisions. We used a novel approach to scenario planning that couples optimal land-use design and social evaluation of environmental outcomes. Four land-use scenarios combined differences in total clearing levels (10% and 20%) in our study region, the Daly Catchment Australia, with the presence or absence of spatial precincts to concentrate irrigated agriculture. We used the systematic conservation planning tool Marxan with Zones to optimally plan for multiple land-uses that met objectives for both conservation and development. We assessed the performance of the scenarios in terms of the number of objectives met and the degree to which existing land-use policies were compromised (e.g., whether clearing limits in existing guidelines were exceeded or not). We also assessed the land-use scenarios using expected stakeholder satisfaction with changes in the catchment to explore how the scenarios performed against social preferences. There were a small fraction of conservation objectives with high conservation targets (100%) that could not be met due to current land uses; all other conservation and development objectives were met in all scenarios. Most scenarios adhered to the existing clearing guidelines with only marginal exceedances of limits, indicating that the scenario objectives were compatible with existing policy. We found that two key stakeholder groups, agricultural and Indigenous residents, had divergent satisfaction levels with the amount of clearing and agricultural development. Based on the range of benefits and potential adverse impacts of each scenario, we suggest that the 10% clearing scenarios are most aligned with stakeholder preferences and best balance preferences across stakeholder groups. Our approach to scenario planning is applicable generally to exploring the potential conflicts between goals for conservation and development. Our case study is particularly relevant to current discussion about increased agricultural and pastoral development in northern Australia. PMID:27362347
Adding Learning to Knowledge-Based Systems: Taking the "Artificial" Out of AI
Daniel L. Schmoldt
1997-01-01
Both knowledge-based system (KBS) development and maintenance require time-consuming analysis of domain knowledge. Where example cases exist, KBS can be built, and later updated, by incorporating learning capabilities into their architecture. This applies to both supervised and unsupervised learning scenarios. In this paper, the important issues for learning systems-...
Ontological Relations and the Capability Maturity Model Applied in Academia
ERIC Educational Resources Information Center
de Oliveira, Jerônimo Moreira; Campoy, Laura Gómez; Vilarino, Lilian
2015-01-01
This work presents a new approach to the discovery, identification and connection of ontological elements within the domain of characterization in learning organizations. In particular, the study can be applied to contexts where organizations require planning, logic, balance, and cognition in knowledge creation scenarios, which is the case for the…
Energy Analysis Research | Energy Analysis | NREL
NREL's energy analysis research drives innovation through integration: systems analysis integrates all aspects of the laboratory's capability set to develop future energy system scenarios and to evaluate and understand the impact of markets, policies, and financing on technology uptake.
Towards an interplanetary internet: a proposed strategy for standardization
NASA Technical Reports Server (NTRS)
Hooke, A. J.
2002-01-01
This paper reviews the current set of standard data communications capabilities that exist to support advanced missions, discusses the architectural concepts for the future Interplanetary Internet, and suggests a standardized set of space communications protocols that can grow to support future scenarios in which human intelligence is widely distributed across the Solar System.
ERIC Educational Resources Information Center
Boyer, Elisebeth
2016-01-01
The research reported in this study examines the very first time the participants planned for and enacted science instruction within a "best-case scenario" teacher preparation program. Evidence from this study indicates that, within this context, preservice teachers are capable of implementing several of the discursive practices of…
2001-04-05
KENNEDY SPACE CENTER, FLA. -- During a staged mass casualty exercise in the Launch Complex 39 area, a paramedic checks an injured woman on the ground. Employees are playing the role of victims during a sniper scenario. The exercise is being staged to validate the capabilities of KSC's fire, medical, helicopter transport and security personnel to respond to such an event.
Are CMEs capable of producing Moreton waves? A case study: the 2006 December 6 event
NASA Astrophysics Data System (ADS)
Krause, G.; Cécere, M.; Zurbriggen, E.; Costa, A.; Francile, C.; Elaskar, S.
2018-02-01
Considering the chromosphere and a stratified corona, we examine, by performing 2D compressible magnetohydrodynamics simulations, the capability of a coronal mass ejection (CME) scenario to drive a Moreton wave. We find that given a typical flux rope (FR) magnetic configuration, in initial pseudo-equilibrium, the larger the magnetic field and the lighter (and hotter) the FR, the larger the amplitude and the speed of the chromospheric disturbance, which eventually becomes a Moreton wave. We present arguments to explain why Moreton waves are much rarer than CME occurrences. In the frame of the present model, we explicitly exclude the action of flares that could be associated with the CME. Analysing the Mach number, we find that only fast magnetosonic shock waves will be able to produce Moreton events. In these cases an overexpansion of the FR is always present and it is the main factor responsible for the Moreton generation. Finally, we show that this scenario can account for the Moreton wave of the 2006 December 6 event (Francile et al. 2013).
Integration of task level planning and diagnosis for an intelligent robot
NASA Technical Reports Server (NTRS)
Chan, Amy W.
1992-01-01
A satellite floating in space is diagnosed with a telerobot attached performing maintenance or replacement tasks. This research included three objectives. The first objective was to generate intelligent path planning for a robot to move around a satellite. The second objective was to diagnose possible faulty scenarios in the satellite. The third objective included two tasks. The first task was to combine intelligent path planning with diagnosis. The second task was to build an interface between the combined intelligent system and Robosim. The ability of a robot to deal with unexpected scenarios is particularly important in space, since the situation can differ from time to time; the telerobot must be capable of detecting that the situation has changed and, if necessary, altering its behavior based on the new situation. The feature of allowing a human-in-the-loop is also very important in space. In some extreme cases the situation is beyond the capability of a robot, so our research project allows the human to override the decision of a robot.
Interactive Molecular Graphics for Augmented Reality Using HoloLens.
Müller, Christoph; Krone, Michael; Huber, Markus; Biener, Verena; Herr, Dominik; Koch, Steffen; Reina, Guido; Weiskopf, Daniel; Ertl, Thomas
2018-06-13
Immersive technologies like stereo rendering, virtual reality, or augmented reality (AR) are often used in the field of molecular visualisation. Modern, comparably lightweight and affordable AR headsets like Microsoft's HoloLens open up new possibilities for immersive analytics in molecular visualisation. A crucial factor for a comprehensive analysis of molecular data in AR is the rendering speed. HoloLens, however, has limited hardware capabilities due to requirements like battery life, fanless cooling and weight. Consequently, insights from best practices for powerful desktop hardware may not be transferable. Therefore, we evaluate the capabilities of the HoloLens hardware for modern, GPU-enabled, high-quality rendering methods for the space-filling model commonly used in molecular visualisation. We also assess the scalability for large molecular data sets. Based on the results, we discuss ideas and possibilities for immersive molecular analytics. Besides more obvious benefits like the stereoscopic rendering offered by the device, this specifically includes natural user interfaces that use physical navigation instead of the traditional virtual one. Furthermore, we consider different scenarios for such an immersive system, ranging from educational use to collaborative scenarios.
A multi-sensor scenario for coastal surveillance
NASA Astrophysics Data System (ADS)
van den Broek, A. C.; van den Broek, S. P.; van den Heuvel, J. C.; Schwering, P. B. W.; van Heijningen, A. W. P.
2007-10-01
Maritime borders and coastal zones are susceptible to threats such as drug trafficking, piracy, and the undermining of economic activities. At TNO Defence, Security and Safety, various studies aim at improving situational awareness in a coastal zone. In this study we focus on multi-sensor surveillance of the coastal environment. We present a study on improving classification results for small sea surface targets using an advanced sensor suite and a scenario in which a small boat is approaching the coast. A next-generation sensor suite mounted on a tower has been defined, consisting of a maritime surveillance and tracking radar system capable of producing range profiles and ISAR imagery of ships, an advanced infrared camera, and a laser range profiler. For this suite we have developed a multi-sensor classification procedure, which is used to evaluate the capabilities for recognizing and identifying non-cooperative ships in coastal waters. We have found that the different sensors give complementary information. Each sensor has its own specific distance range in which it contributes most. A multi-sensor approach reduces the number of misclassifications, and reliable classification results are obtained earlier compared with a single-sensor approach.
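As an illustration of how outputs from such a sensor suite might be combined, the sketch below fuses per-sensor class likelihoods with a naive-Bayes-style late-fusion rule. The class list, likelihood values, and per-sensor weighting are invented for illustration and are not taken from the TNO classification procedure.

```python
import numpy as np

# Illustrative ship classes; the actual class set in the study is an assumption here.
CLASSES = ["cabin boat", "rhib", "fishing vessel", "patrol boat"]

def fuse_log_likelihoods(sensor_outputs, weights=None):
    """Naive-Bayes-style late fusion: sum per-sensor class log-likelihoods.

    sensor_outputs: dict mapping sensor name -> log-likelihoods per class.
    weights:        optional per-sensor weights, e.g. reflecting how informative
                    each sensor is at the current target range.
    """
    total = np.zeros(len(CLASSES))
    for sensor, loglik in sensor_outputs.items():
        w = 1.0 if weights is None else weights.get(sensor, 1.0)
        total += w * np.asarray(loglik)
    # Convert to normalised posterior probabilities (uniform prior assumed).
    post = np.exp(total - total.max())
    return post / post.sum()

# Hypothetical per-sensor outputs for one observed target.
outputs = {
    "radar_range_profile": np.log([0.20, 0.50, 0.20, 0.10]),
    "isar":                np.log([0.10, 0.60, 0.20, 0.10]),
    "infrared":            np.log([0.25, 0.35, 0.25, 0.15]),
    "laser_range_profile": np.log([0.20, 0.40, 0.30, 0.10]),
}
posterior = fuse_log_likelihoods(outputs)
print(dict(zip(CLASSES, posterior.round(3))))
```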
NASA Technical Reports Server (NTRS)
Contreras, Michael T.; Trease, Brian P.; Bojanowski, Cezary; Kulak, Ronald F.
2013-01-01
A wheel experiencing sinkage and slippage events poses a high risk to planetary rover missions as evidenced by the mobility challenges endured by the Mars Exploration Rover (MER) project. Current wheel design practice utilizes loads derived from a series of events in the life cycle of the rover which do not include (1) failure metrics related to wheel sinkage and slippage and (2) performance trade-offs based on grouser placement/orientation. Wheel designs are rigorously tested experimentally through a variety of drive scenarios and simulated soil environments; however, a robust simulation capability is still in development due to the myriad of complex interaction phenomena that contribute to wheel sinkage and slippage conditions such as soil composition, large deformation soil behavior, wheel geometry, nonlinear contact forces, terrain irregularity, etc. For the purposes of modeling wheel sinkage and slippage at an engineering scale, meshfree finite element approaches enable simulations that capture sufficient detail of wheel-soil interaction while remaining computationally feasible. This study implements the JPL wheel-soil benchmark problem in the commercial code environment utilizing the large deformation modeling capability of Smoothed Particle Hydrodynamics (SPH) meshfree methods. The nominal, benchmark wheel-soil interaction model that produces numerically stable and physically realistic results is presented and simulations are shown for both wheel traverse and wheel sinkage cases. A sensitivity analysis developing the capability and framework for future flight applications is conducted to illustrate the importance of perturbations to critical material properties and parameters. Implementation of the proposed soil-wheel interaction simulation capability and associated sensitivity framework has the potential to reduce experimentation cost and improve the early stage wheel design process.
Probabilistic Fracture Mechanics of Reactor Pressure Vessels with Populations of Flaws
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin; Backman, Marie; Williams, Paul
This report documents recent progress in developing a tool that uses the Grizzly and RAVEN codes to perform probabilistic fracture mechanics analyses of reactor pressure vessels in light water reactor nuclear power plants. The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. Because of the central role of the reactor pressure vessel (RPV) in a nuclear power plant, particular emphasis is being placed on developing capabilities to model fracture in embrittled RPVs to aid in the process surrounding decision making relating to life extension of existing plants. A typical RPV contains a large population of pre-existing flaws introduced during the manufacturing process. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation at one or more of these flaws during a transient event. This report documents development and initial testing of a capability to perform probabilistic fracture mechanics of large populations of flaws in RPVs using reduced order models to compute fracture parameters. The work documented here builds on prior efforts to perform probabilistic analyses of a single flaw with uncertain parameters, as well as earlier work to develop deterministic capabilities to model the thermo-mechanical response of the RPV under transient events, and compute fracture mechanics parameters at locations of pre-defined flaws. The capabilities developed as part of this work provide a foundation for future work, which will develop a platform that provides the flexibility needed to consider scenarios that cannot be addressed with the tools used in current practice.
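A minimal sketch of the kind of Monte Carlo loop described above (sampling a population of flaws and evaluating each one with a reduced-order fracture model) is shown below. The flaw-depth distribution, the reduced-order stress-intensity expression, the toughness distribution, and the loading value are placeholders, not the Grizzly/RAVEN implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

def rom_stress_intensity(depth_m, wall_stress_mpa):
    """Placeholder reduced-order model: K_I for a shallow surface flaw,
    K_I ~ 1.1 * sigma * sqrt(pi * a), in MPa*sqrt(m)."""
    return 1.1 * wall_stress_mpa * np.sqrt(np.pi * depth_m)

def sampled_toughness(n):
    """Placeholder fracture-toughness samples (MPa*sqrt(m)); a real analysis
    would use an embrittlement-shifted toughness distribution."""
    return rng.normal(loc=80.0, scale=15.0, size=n)

n_vessels = 10_000          # Monte Carlo realizations of the vessel
flaws_per_vessel = 500      # size of the pre-existing flaw population (assumed)
wall_stress = 300.0         # MPa, representative transient loading (assumed)

initiations = 0
for _ in range(n_vessels):
    depths = rng.lognormal(mean=np.log(0.003), sigma=0.5, size=flaws_per_vessel)
    k_applied = rom_stress_intensity(depths, wall_stress)
    k_ic = sampled_toughness(flaws_per_vessel)
    if np.any(k_applied > k_ic):      # crack initiation at one or more flaws
        initiations += 1

print(f"Estimated P(crack initiation) = {initiations / n_vessels:.4f}")
```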
The Behavior of a Stitched Composite Large-Scale Multi-Bay Pressure Box
NASA Technical Reports Server (NTRS)
Jegley, Dawn C.; Rouse, Marshall; Przekop, Adam; Lovejoy, Andrew E.
2016-01-01
NASA has created the Environmentally Responsible Aviation (ERA) Project to develop technologies to reduce impact of aviation on the environment. A critical aspect of this pursuit is the development of a lighter, more robust airframe to enable the introduction of unconventional aircraft configurations. NASA and The Boeing Company have worked together to develop a structural concept that is lightweight and an advancement beyond state-of-the-art composite structures. The Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) is an integrally stiffened panel design where elements are stitched together and designed to maintain residual load-carrying capabilities under a variety of damage scenarios. With the PRSEUS concept, through-the-thickness stitches are applied through dry fabric prior to resin infusion, and replace fasteners throughout each integral panel. Through-the-thickness reinforcement at discontinuities, such as along flange edges, has been shown to suppress delamination and turn cracks, which expands the design space and leads to lighter designs. The pultruded rod provides stiffening away from the more vulnerable skin surface and improves bending stiffness. A series of building block tests were evaluated to explore the fundamental assumptions related to the capability and advantages of PRSEUS panels. The final step in the building block series of tests is an 80%-scale pressure box representing a portion of the center section of a Hybrid Wing Body (HWB) transport aircraft. The testing of this test article under maneuver and internal pressure loading conditions is the subject of this paper. The experimental evaluation of this article, along with the other building block tests and the accompanying analyses, has demonstrated the viability of a PRSEUS center body for the HWB vehicle. Additionally, much of the development effort is also applicable to traditional tube-and-wing aircraft, advanced aircraft configurations, and other structures where weight and through-the-thickness strength are design considerations.
A Timing Synchronizer System for Beam Test Setups Requiring Galvanic Isolation
NASA Astrophysics Data System (ADS)
Meder, Lukas Dominik; Emschermann, David; Frühauf, Jochen; Müller, Walter F. J.; Becker, Jürgen
2017-07-01
In beam test setups, detector elements are analyzed together with a readout composed of frontend electronics (FEE) and usually a layer of field-programmable gate arrays (FPGAs). The FEE is in this scenario often directly connected to both the detector and the FPGA layer, which in many cases requires sharing the ground potentials of these layers. This setup can become problematic if parts of the detector need to be operated at different high-voltage potentials, since all of the FPGA boards need to receive a common clock and timing reference for getting the readout synchronized. Thus, in the context of the Compressed Baryonic Matter experiment a versatile timing synchronizer (TS) system was designed providing galvanically isolated timing distribution links over twisted-pair cables. As an electrical interface the so-called timing data processing board FPGA mezzanine card was created for being mounted onto FPGA-based advanced mezzanine cards for mTCA.4 crates. The FPGA logic of the TS system connects to this card and can be monitored and controlled through IPBus slow-control links. Evaluations show that the system is capable of stably synchronizing the FPGA boards of a beam test setup being integrated into a hierarchical TS network.
Karatay, Gülnaz; Gürarslan Baş, Nazan
2017-01-01
During the first phases of adolescent development, young people have little self-efficacy and resistance against substance use. The aim of this study was to demonstrate the effectiveness of role-playing scenarios on the self-efficacy of students in resisting substance use. A pre-test and post-test study design was used with a single group. The study was carried out with 245 secondary school students. The scenario-based training, developed by the researchers, was presented by the school counselors once a week for 4 weeks. For this purpose, a booklet of scenarios was prepared for the teachers. The role-playing scenarios were intended to improve adolescents' abilities to say "no" to substance offers, to prevent them from becoming addicted to certain substances, and to call for help if needed. The data of the study were collected using the Personal Information Form and the Self-Efficacy for Adolescences Protecting Substance Abuse Scale. The obtained data were assessed using percentages, chi-square, t test, and F test in the SPSS software. Results showed that, after the training, the mean score on the Self-Efficacy for Adolescences Protecting Substance Abuse Scale increased significantly (103.20 ± 20.00) compared with before the training (92.11 ± 17.08) (P < .05). Short-term outcomes of the class-based scenario training were observed to be effective in the development of students' self-efficacy to resist the temptations of substance use.
NASA Technical Reports Server (NTRS)
Rockwell, T. H.; Giffin, W. C.
1982-01-01
Computer displays using PLATO are illustrated. Diagnostic scenarios are described. A sample of subject data is presented. Destination diversion displays, a combined destination, diversion scenario, and critical in-flight event (CIFE) data collection/subject testing system are presented.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-13
... irradiation scenarios? F. How should the impact of delays in sampling, delays in testing, combined injury, and... biodosimeter for use in a mass exposure scenario, the development of proper radiation biodosimetry tools is a... clinical animal model testing might be necessary to demonstrate radiation biodosimeter performance? D...
12 CFR 652.61 - Capital planning.
Code of Federal Regulations, 2014 CFR
2014-01-01
... progressively severe stress scenarios developed by Farmer Mac appropriate to its business model and portfolios... stress testing, Farmer Mac must provide to OSMO a description of the expected and stressed scenarios that Farmer Mac intends to use to conduct its annual stress test under this section. (B) A description of all...
SMART-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios | Grid Modernization | NREL
Statistical summary of the U.S. distribution systems; world-class, high spatial/temporal resolution of solar data.
HIV status: the prima facie right not to know the result.
Chan, Tak Kwong
2016-02-01
When a patient regains consciousness from Cryptococcus meningitis, the clinician may offer an HIV test (in case it has not already been done) (scenario 1) or offer to tell the patient his HIV status (in case the test has already been performed with a positive result while the patient was unconscious) (scenario 2). Youngs and Simmonds proposed that the patient has the prima facie right to refuse an HIV test in scenario 1 but not the prima facie right not to be told the HIV status in scenario 2. I submit that the claims to the right of refusal in both scenarios are similarly strong, as they should both be grounded in privacy, self-determination or dignity. But a conscientious agent should bear in mind that members of the public also have the right not to be harmed. When the circumstance allows, a proper balance of the potential benefits and harm for all the competing parties should guide the clinical decision as to whose right should finally prevail. Where a full ethical analysis is not possible, the presumption should favour respecting the patient's right of refusal in both scenarios. Published by the BMJ Publishing Group Limited.
Nemesis Autonomous Test System
NASA Technical Reports Server (NTRS)
Barltrop, Kevin J.; Lee, Cin-Young; Horvath, Gregory A.; Clement, Bradley J.
2012-01-01
A generalized framework has been developed for systems validation that can be applied to both traditional and autonomous systems. The framework consists of an automated test case generation and execution system called Nemesis that rapidly and thoroughly identifies flaws or vulnerabilities within a system. By applying genetic optimization and goal-seeking algorithms on the test equipment side, a "war game" is conducted between a system and its complementary nemesis. The end result of the war games is a collection of scenarios that reveals any undesirable behaviors of the system under test. The software provides a reusable framework to evolve test scenarios with genetic algorithms, using an operational model of the system under test. It can automatically generate and execute test cases that reveal flaws in behaviorally complex systems. Genetic algorithms focus the exploration of tests on the set of test cases that most effectively reveals the flaws and vulnerabilities of the system under test. It leverages advances in state- and model-based engineering, which are essential in defining the behavior of autonomous systems. It also uses goal networks to describe test scenarios.
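A minimal sketch of the genetic-algorithm idea described above (evolving test scenarios against an operational model of the system under test and rewarding scenarios that expose undesirable behavior) follows. The scenario encoding, the toy system model, and the fitness measure are assumptions for illustration, not the Nemesis implementation.

```python
import random

random.seed(0)

# A "scenario" is a fixed-length sequence of command codes sent to the system
# under test; this encoding and the toy system model below are assumptions.
SCENARIO_LEN, POP_SIZE, GENERATIONS = 8, 40, 60
COMMANDS = list(range(6))

def run_system_model(scenario):
    """Toy stand-in for the operational model of the system under test:
    returns the number of undesirable behaviors the scenario provokes."""
    flaws, mode = 0, 0
    for cmd in scenario:
        mode = (mode + cmd) % 7
        if mode == 5 and cmd in (2, 3):   # hidden fault condition
            flaws += 1
    return flaws

def crossover(a, b):
    cut = random.randrange(1, SCENARIO_LEN)
    return a[:cut] + b[cut:]

def mutate(s, rate=0.1):
    return [random.choice(COMMANDS) if random.random() < rate else c for c in s]

population = [[random.choice(COMMANDS) for _ in range(SCENARIO_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    scored = sorted(population, key=run_system_model, reverse=True)
    parents = scored[:POP_SIZE // 4]                 # keep the most revealing scenarios
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=run_system_model)
print("most revealing scenario:", best, "| flaws exposed:", run_system_model(best))
```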
Wildlife Scenario Builder and User's Guide (Version 1.0, Beta Test)
The Wildlife Scenario Builder (WSB) was developed to improve the quality of wildlif...
Final Report Feasibility Study for the California Wave Energy Test Center (CalWavesm)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blakeslee, Samuel Norman; Toman, William I.; Williams, Richard B.
The California Wave Energy Test Center (CalWave) Feasibility Study project was funded over multiple phases by the Department of Energy to perform an interdisciplinary feasibility assessment to analyze the engineering, permitting, and stakeholder requirements to establish an open water, fully energetic, grid connected, wave energy test center off the coast of California for the purposes of advancing U.S. wave energy research, development, and testing capabilities. Work under this grant included wave energy resource characterization, grid impact and interconnection requirements, port infrastructure and maritime industry capability/suitability to accommodate the industry at research, demonstration and commercial scale, and macro and micro siting considerations. CalWave Phase I performed a macro-siting and down-selection process focusing on two potential test sites in California: Humboldt Bay and Vandenberg Air Force Base. This work resulted in the Vandenberg Air Force Base site being chosen as the most favorable site based on a peer reviewed criteria matrix. CalWave Phase II focused on four siting location alternatives along the Vandenberg Air Force Base coastline and culminated with a final siting down-selection. Key outcomes from this work include completion of preliminary engineering and systems integration work, a robust turnkey cost estimate, shoreside and subsea hazards assessment, storm wave analysis, lessons learned reports from several maritime disciplines, test center benchmarking as compared to existing international test sites, analysis of existing applicable environmental literature, the completion of a preliminary regulatory, permitting and licensing roadmap, robust interaction and engagement with state and federal regulatory agency personnel and local stakeholders, and the population of a Draft Federal Energy Regulatory Commission (FERC) Preliminary Application Document (PAD). Analysis of existing offshore oil and gas infrastructure was also performed to assess the potential value and re-use scenarios of offshore platform infrastructure and associated subsea power cables and shoreside substations. The CalWave project team was well balanced and was comprised of experts from industry, academia, state and federal regulatory agencies. The result of the CalWave feasibility study finds that the CalWave Test Center has the potential to provide the most viable path to commercialization for wave energy in the United States.
Ground truth seismic events and location capability at Degelen mountain, Kazakhstan
NASA Astrophysics Data System (ADS)
Trabant, Chad; Thurber, Clifford; Leith, William
2002-07-01
We utilized nuclear explosions from the Degelen Mountain sub-region of the Semipalatinsk Test Site (STS), Kazakhstan, to assess seismic location capability directly. Excellent ground truth information for these events was either known or was estimated from maps of the Degelen Mountain adit complex. Origin times were refined for events for which absolute origin time information was unknown using catalog arrival times, our ground truth location estimates, and a time baseline provided by fixing known origin times during a joint hypocenter determination (JHD). Precise arrival time picks were determined using a waveform cross-correlation process applied to the available digital data. These data were used in a JHD analysis. We found that very accurate locations were possible when high precision, waveform cross-correlation arrival times were combined with JHD. Relocation with our full digital data set resulted in a mean mislocation of 2 km and a mean 95% confidence ellipse (CE) area of 6.6 km² (90% CE: 5.1 km²); however, only 5 of the 18 computed error ellipses actually covered the associated ground truth location estimate. To test a more realistic nuclear test monitoring scenario, we applied our JHD analysis to a set of seven events (one fixed) using data only from seismic stations within 40° epicentral distance. Relocation with these data resulted in a mean mislocation of 7.4 km, with four of the 95% error ellipses covering less than 570 km² (90% CE: 438 km²), and the other two covering 1730 and 8869 km² (90% CE: 1331 and 6822 km²). Location uncertainties calculated using JHD often underestimated the true error, but a circular region with a radius equal to the mislocation covered less than 1000 km² for all events having more than three observations.
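The waveform cross-correlation step used to refine relative arrival times can be sketched as follows; the synthetic waveforms, sample rate, and onsets are illustrative, and a real analysis would operate on the recorded digital seismograms.

```python
import numpy as np

fs = 40.0                      # sample rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)

def synthetic_p_wave(onset_s):
    """Toy P arrival: a decaying wavelet starting at onset_s, plus noise."""
    rng = np.random.default_rng(int(onset_s * 100))
    w = np.where(t >= onset_s,
                 np.exp(-(t - onset_s) * 3.0) * np.sin(2 * np.pi * 5.0 * (t - onset_s)),
                 0.0)
    return w + 0.02 * rng.normal(size=t.size)

ref = synthetic_p_wave(onset_s=4.00)    # event with a trusted pick
new = synthetic_p_wave(onset_s=4.37)    # event whose pick we want to refine

# Full cross-correlation; the lag of the peak gives the relative delay.
cc = np.correlate(new - new.mean(), ref - ref.mean(), mode="full")
lag_samples = cc.argmax() - (ref.size - 1)
delay = lag_samples / fs

print(f"measured relative delay: {delay:.3f} s (true offset 0.37 s)")
```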
NASA Astrophysics Data System (ADS)
Oleksowicz, Selim A.; Burnham, Keith J.; Southgate, Adam; McCoy, Chris; Waite, Gary; Hardwick, Graham; Harrington, Cian; McMurran, Ross
2013-05-01
The sustainable development of vehicle propulsion systems that have mainly focused on reduction of fuel consumption (i.e. CO2 emission) has led not only to the development of systems connected with combustion processes but also to legislation and testing procedures. In recent years, the low carbon policy has made hybrid vehicles and fully electric vehicles (H/EVs) popular. The main virtue of these propulsion systems is their ability to restore some of the expended energy from kinetic movement, e.g. the braking process. Consequently new research and testing methods for H/EVs are currently being developed. This especially concerns the critical 'use-cases' for functionality tests within dynamic events, for both virtual simulations and real-time road tests. The use-cases for conventional vehicles in numerical simulations and road tests are well established. However, the wide variety of tests and their great number (close to a thousand) creates a need for selection, in the first place, and for the creation of critical use-cases suitable for testing H/EVs in both virtual and real-world environments. It is known that a marginal improvement in the regenerative braking ratio can significantly improve the vehicle range and, therefore, the economic cost of its operation. In modern vehicles, vehicle dynamics control systems play the principal role in safety, comfort and economic operation. Unfortunately, however, the existing standard road test scenarios are insufficient for H/EVs. Sector knowledge suggests that there are currently no agreed test scenarios to fully investigate the effects of brake blending between conventional and regenerative braking as well as the regenerative braking interaction with active driving safety systems (ADSS). The paper presents seven manoeuvres, which are considered to be suitable and highly informative for the development and examination of H/EVs with regenerative braking capability. The critical manoeuvres presented are considered to be appropriate for examination of the regenerative braking mode in relation to ADSS. The manoeuvres are also important for investigation of regenerative braking system properties/functionalities that are specified by the legal requirements concerning H/EVs braking systems. The last part of this paper shows simulation results for one of the proposed manoeuvres, which explicitly show the usefulness of the manoeuvre.
Environmental impact analysis with the airspace concept evaluation system
NASA Technical Reports Server (NTRS)
Augustine, Stephen; Capozzi, Brian; DiFelici, John; Graham, Michael; Thompson, Terry; Miraflor, Raymond M. C.
2005-01-01
The National Aeronautics and Space Administration (NASA) Ames Research Center has developed the Airspace Concept Evaluation System (ACES), which is a fast-time simulation tool for evaluating Air Traffic Management (ATM) systems. This paper describes linking a capability to ACES which can analyze the environmental impact of proposed future ATM systems. This provides the ability to quickly evaluate metrics associated with environmental impacts of aviation for inclusion in multi-dimensional cost-benefit analysis of concepts for evolution of the National Airspace System (NAS) over the next several decades. The methodology used here may be summarized as follows: 1) Standard Federal Aviation Administration (FAA) noise and emissions-inventory models, the Noise Impact Routing System (NIRS) and the Emissions and Dispersion Modeling System (EDMS), respectively, are linked to ACES simulation outputs; 2) appropriate modifications are made to ACES outputs to incorporate all information needed by the environmental models (e.g., specific airframe and engine data); 3) noise and emissions calculations are performed for all traffic and airports in the study area for each of several scenarios, as simulated by ACES; and 4) impacts of future scenarios are compared to the current NAS baseline scenario. This paper also provides the results of initial end-to-end, proof-of-concept runs of the integrated ACES and environmental-modeling capability. These preliminary results demonstrate the capability to assess whether forecast traffic growth is likely to be impeded by significant environmental impacts that could negatively affect communities throughout the nation.
A framework for modeling scenario-based barrier island storm impacts
Mickey, Rangley; Long, Joseph W.; Dalyander, P. Soupy; Plant, Nathaniel G.; Thompson, David M.
2018-01-01
Methods for investigating the vulnerability of existing or proposed coastal features to storm impacts often rely on simplified parametric models or one-dimensional process-based modeling studies that focus on changes to a profile across a dune or barrier island. These simple studies tend to neglect the impacts to curvilinear or alongshore varying island planforms, influence of non-uniform nearshore hydrodynamics and sediment transport, irregular morphology of the offshore bathymetry, and impacts from low magnitude wave events (e.g. cold fronts). Presented here is a framework for simulating regionally specific, low and high magnitude scenario-based storm impacts to assess the alongshore variable vulnerabilities of a coastal feature. Storm scenarios based on historic hydrodynamic conditions were derived and simulated using the process-based morphologic evolution model XBeach. Model results show that the scenarios predicted similar patterns of erosion and overwash when compared to observed qualitative morphologic changes from recent storm events that were not included in the dataset used to build the scenarios. The framework model simulations were capable of predicting specific areas of vulnerability in the existing feature and the results illustrate how this storm vulnerability simulation framework could be used as a tool to help inform the decision-making process for scientists, engineers, and stakeholders involved in coastal zone management or restoration projects.
Rawashdeh, Nathir A.
2018-01-01
Visual inspection through image processing of welding and shot-peened surfaces is necessary to overcome equipment limitations, avoid measurement errors, and accelerate processing to gain certain surface properties such as surface roughness. Therefore, it is important to design an algorithm to quantify surface properties, which enables us to overcome the aforementioned limitations. In this study, a proposed systematic algorithm is utilized to generate and compare the surface roughness of Tungsten Inert Gas (TIG) welded aluminum 6061-T6 alloy treated by two levels of shot-peening, high-intensity and low-intensity. This project is industrial in nature, and the proposed solution was originally requested by local industry to overcome equipment capabilities and limitations. In particular, surface roughness measurements are usually only possible on flat surfaces but not on other areas treated by shot-peening after welding, as in the heat-affected zone and weld beads. Therefore, those critical areas are outside of the measurement limitations. Using the proposed technique, the surface roughness measurements were possible to obtain for weld beads, high-intensity and low-intensity shot-peened surfaces. In addition, a 3D surface topography was generated and dimple size distributions were calculated for the three tested scenarios: control sample (TIG-welded only), high-intensity shot-peened, and low-intensity shot-peened TIG-welded Al6061-T6 samples. Finally, cross-sectional hardness profiles were measured for the three scenarios; in all scenarios, lower hardness measurements were obtained compared to the base metal alloy in the heat-affected zone and in the weld beads even after shot-peening treatments. PMID:29748520
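As a rough illustration of the kind of quantities such an algorithm produces, the sketch below derives an arithmetic-mean roughness (Ra) and a crude dimple-width distribution from a single synthetic height profile; the profile, calibration, and dimple definition are assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic height profile along one scan line of a shot-peened surface, in
# microns; a real profile would come from the processed surface image.
x_um = np.linspace(0.0, 500.0, 1000)
profile_um = 2.0 * np.sin(2 * np.pi * x_um / 60.0) + rng.normal(0.0, 0.4, x_um.size)

# Arithmetic mean roughness Ra: mean absolute deviation from the mean line.
mean_line = profile_um.mean()
ra = np.mean(np.abs(profile_um - mean_line))
print(f"Ra ~ {ra:.2f} um")

# Crude dimple statistics: contiguous runs below the mean line are treated
# as dimples; their widths form a simple size distribution.
below = (profile_um < mean_line).astype(int)
boundaries = np.flatnonzero(np.diff(np.r_[0, below, 0]))
runs = boundaries.reshape(-1, 2)                 # (start, end) index pairs
dimple_widths_um = (runs[:, 1] - runs[:, 0]) * (x_um[1] - x_um[0])
print(f"{dimple_widths_um.size} dimples, mean width ~ {dimple_widths_um.mean():.1f} um")
```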
'Disaster day': global health simulation teaching.
Mohamed-Ahmed, Rayan; Daniels, Alex; Goodall, Jack; O'Kelly, Emily; Fisher, James
2016-02-01
As society diversifies and globalisation quickens, the importance of teaching global health to medical undergraduates increases. For undergraduates, the majority of exposure to 'hands-on' teaching on global health occurs during optional elective periods. This article describes an innovative student-led initiative, 'Disaster Day', which used simulation to teach global health to undergraduates. The teaching day began with an introduction outlining the work of Médecins Sans Frontières and the basic principles of resuscitation. Students then undertook four interactive simulation scenarios: Infectious Diseases in a Refugee Camp, Natural Disaster and Crush Injury, Obstetric Emergency in a Low-Income Country, and Warzone Gunshot Wound. Sessions were facilitated by experienced doctors and fourth-year students who had been trained in the delivery of the scenarios. Students completed pre- and post-session evaluation forms that included the self-rating of confidence in eight learning domains (using a five-point Likert scale). Twenty-seven students voluntarily attended the session, and all provided written feedback. Analysis of the pre- and post-session evaluations demonstrated statistically significant improvements in confidence across all but one domain (Wilcoxon signed rank test). Free-text feedback was overwhelmingly positive, with students appreciating the practical aspect of the scenarios. Simulation-based teaching can provide students with 'hands-on' exposure to global health in a controlled, reproducible fashion and appears to help develop their confidence in a variety of learning domains. The more widespread use of such teaching methods is encouraged: helping tomorrow's doctors develop insight into global health challenges may produce more rounded clinicians capable of caring for more culturally diverse populations. © 2015 John Wiley & Sons Ltd.
Development of steady-state scenarios compatible with ITER-like wall conditions
NASA Astrophysics Data System (ADS)
Litaudon, X.; Arnoux, G.; Beurskens, M.; Brezinsek, S.; Challis, C. D.; Crisanti, F.; DeVries, P. C.; Giroud, C.; Pitts, R. A.; Rimini, F. G.; Andrew, Y.; Ariola, M.; Baranov, Yu F.; Brix, M.; Buratti, P.; Cesario, R.; Corre, Y.; DeLa Luna, E.; Fundamenski, W.; Giovannozzi, E.; Gryaznevich, M. P.; Hawkes, N. C.; Hobirk, J.; Huber, A.; Jachmich, S.; Joffrin, E.; Koslowski, H. R.; Liang, Y.; Loarer, Th; Lomas, P.; Luce, T.; Mailloux, J.; Matthews, G. F.; Mazon, D.; McCormick, K.; Moreau, D.; Pericoli, V.; Philipps, V.; Rachlew, E.; Reyes-Cortes, S. D. A.; Saibene, G.; Sharapov, S. E.; Voitsekovitch, I.; Zabeo, L.; Zimmermann, O.; Zastrow, K. D.; JET-EFDA Contributors, the
2007-12-01
A key issue for steady-state tokamak operation is to determine the edge conditions that are compatible both with good core confinement and with the power handling and plasma exhaust capabilities of the plasma facing components (PFCs) and divertor systems. A quantitative response to this open question will provide a robust scientific basis for reliable extrapolation of present regimes to an ITER compatible steady-state scenario. In this context, the JET programme addressing steady-state operation is focused on the development of non-inductive, high confinement plasmas with the constraints imposed by the PFCs. A new beryllium main chamber wall and tungsten divertor together with an upgrade of the heating/fuelling capability are currently in preparation at JET. Operation at higher power with this ITER-like wall will impose new constraints on non-inductive scenarios. Recent experiments have focused on the preparation for this new phase of JET operation. In this paper, progress in the development of advanced tokamak (AT) scenarios at JET is reviewed keeping this long-term objective in mind. The approach has consisted of addressing various critical issues separately during the 2006-2007 campaigns with a view to full scenario integration when the JET upgrades are complete. Regimes with internal transport barriers (ITBs) have been developed at q95 ~ 5 and high triangularity, δ (relevant to the ITER steady-state demonstration) by applying more than 30 MW of additional heating power reaching βN ~ 2 at Bo ~ 3.1 T. Operating at higher δ has allowed the edge pedestal and core densities to be increased pushing the ion temperature closer to that of the electrons. Although not yet fully integrated into a performance enhancing ITB scenario, Neon seeding has been successfully explored to increase the radiated power fraction (up to 60%), providing significant reduction of target tile power fluxes (and hence temperatures) and mitigation of edge localized mode (ELM) activity. At reduced toroidal magnetic field strength, high βN regimes have been achieved and q-profile optimization investigated for use in steady-state scenarios. Values of βN above the 'no-wall magnetohydrodynamic limit' (βN ~ 3.0) have been sustained for a resistive current diffusion time in high-δ configurations (at 1.2 MA/1.8 T). In this scenario, ELM activity has been mitigated by applying magnetic perturbations using error field correction coils to provide ergodization of the magnetic field at the plasma edge. In a highly shaped, quasi-double null X-point configuration, ITBs have been generated on the ion heat transport channel and combined with 'grassy' ELMs with ~30 MW of applied heating power (at 1.2 MA/2.7 T, q95 ~ 7). Advanced algorithms and system identification procedures have been developed with a view to developing simultaneously temperature and q-profile control in real-time. These techniques have so far been applied to the control of the q-profile evolution in JET AT scenarios.
NASA Astrophysics Data System (ADS)
Snowden, D. P.; Signell, R.; Knee, K.; Kupiec, J.; Bird, A.; Fratantonio, B.; Koeppen, W.; Wilcox, K.
2014-12-01
The distributed, service-oriented architecture of the US Integrated Ocean Observing System (US IOOS) has been implemented mostly independently by US IOOS partners, using different software approaches and different levels of compliance to standards. Some uniformity has been imparted by documenting the intended output data formats and content and service interface behavior. But to date, a rigorous testing of the distributed system of systems has not been done. To assess the functionality of this system, US IOOS is conducting a system integration test (http://github.com/ioos/system-test) that evaluates whether the services (i.e. SOS, OPeNDAP, WMS, CS/W) deployed to the 17 Federal partners and 11 Regional Associations can solve real-world problems. Scenarios were selected that both address IOOS societal goals and test different functionality of the data architecture. For example, one scenario performs an assessment of water level forecast skill by prompting the user for a bounding box and a temporal extent, searching metadata catalogs via a Catalog Services for the Web (CS/W) interface to discover available sea level observations and model results, extracting data from the identified service endpoints (either OPeNDAP or SOS), interpolating both modeled and observed data onto a common time base, and then comparing the skill of the various models. Other scenarios explore issues such as hypoxia and wading bird habitats. For each scenario, the entire workflow (user input, search, access, analysis and visualization) is captured in an IPython Notebook on GitHub. This allows the scenarios to be self-documenting as well as reproducible by anyone, using free software. The Python packages required to run the scenarios are all available on GitHub and Conda packages are available on binstar.org so that users can easily run the scenarios using the free Anaconda Python distribution. With the advent of hosted services such as Wakari, it is possible for anyone to reproduce these workflows for free, without installing any software locally, using just their web browser. Thus in addition to performing as a system integration test, this project serves to provide examples that anyone in the geoscience community can adapt to solve other real-world problems.
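The analysis core of the water-level scenario described above (interpolating modeled and observed series onto a common time base and computing a skill metric) might look like the sketch below; the time series are synthetic and the RMSE-based skill definition is one common choice rather than necessarily the one used in the notebook.

```python
import numpy as np
import pandas as pd

# Synthetic stand-ins for an observed tide-gauge record (6-minute data) and
# a model forecast (hourly data) over the same period.
obs_t = pd.date_range("2014-07-01", periods=240, freq="6min")
obs = pd.Series(0.5 * np.sin(np.linspace(0, 8 * np.pi, obs_t.size)), index=obs_t)

mod_t = pd.date_range("2014-07-01", periods=24, freq="h")
mod = pd.Series(0.45 * np.sin(np.linspace(0, 8 * np.pi, mod_t.size)) + 0.05,
                index=mod_t)

# Interpolate the observations onto the model's hourly time base.
common = mod_t
obs_h = obs.reindex(obs.index.union(common)).interpolate("time").reindex(common)

rmse = np.sqrt(((mod - obs_h) ** 2).mean())
skill = 1.0 - rmse / obs_h.std()          # simple RMSE-based skill score
print(f"RMSE = {rmse:.3f} m, skill = {skill:.2f}")
```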
A Testbed Demonstration of an Intelligent Archive in a Knowledge Building System
NASA Technical Reports Server (NTRS)
Ramapriyan, Hampapuram; Isaac, David; Morse, Steve; Yang, Wenli; Bonnlander, Brian; McConaughy, Gail; Di, Liping; Danks, David
2005-01-01
The last decade's influx of raw data and derived geophysical parameters from several Earth observing satellites to NASA data centers has created a data-rich environment for Earth science research and applications. While advances in hardware and information management have made it possible to archive petabytes of data and distribute terabytes of data daily to a broad community of users, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications in order to realize the full potential of these valuable datasets. In examining what is needed to enable this progress in the data provider environment that exists today and is expected to evolve in the next several years, we arrived at the concept of an Intelligent Archive in the context of a Knowledge Building System (IA/KBS). Our prior work and associated papers investigated usage scenarios, required capabilities, system architecture, data volume issues, and supporting technologies. We identified six key capabilities of an IA/KBS: Virtual Product Generation, Significant Event Detection, Automated Data Quality Assessment, Large-Scale Data Mining, Dynamic Feedback Loop, and Data Discovery and Efficient Requesting. Among these capabilities, large-scale data mining is perceived by many in the community to be an area of technical risk. One of the main reasons for this is that standard data mining research and algorithms operate on datasets that are several orders of magnitude smaller than the actual sizes of datasets maintained by realistic earth science data archives. Therefore, we defined a test-bed activity to implement a large-scale data mining algorithm in a pseudo-operational scale environment and to examine any issues involved. The application chosen for applying the data mining algorithm is wildfire prediction over the continental U.S. This paper reports a number of observations based on our experience with this test-bed. While proof-of-concept for data mining scalability and utility has been a major goal for the research reported here, it was not the only one. The other five capabilities of an IA/KBS named above have been considered as well, and an assessment of the implications of our experience for these other areas will also be presented. The lessons learned through the test-bed effort and presented in this paper will benefit technologists, scientists, and system operators as they consider introducing IA/KBS capabilities into production systems.
CIRCULAR POLARIZATION OF PULSAR WIND NEBULAE AND THE COSMIC-RAY POSITRON EXCESS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linden, Tim, E-mail: trlinden@uchicago.edu
2015-02-01
Recent observations by the PAMELA and AMS-02 telescopes have uncovered an anomalous rise in the positron fraction at energies above 10 GeV. One possible explanation for this excess is the production of primary electron/positron pairs through electromagnetic cascades in pulsar magnetospheres. This process results in a high multiplicity of electron/positron pairs within the wind-termination shock of pulsar wind nebulae (PWNe). A consequence of this scenario is that no circular polarization should be observed within PWNe, since the contributions from electrons and positrons exactly cancel. Here we note that current radio instruments are capable of setting meaningful limits on the circular polarization of synchrotron radiation in PWNe, which observationally test the model for pulsar production of the local positron excess. The observation of a PWN with detectable circular polarization would cast strong doubt on pulsar interpretations of the positron excess, while observations setting strong limits on the circular polarization of PWNe would lend credence to these models. Finally, we indicate which PWNe are likely to provide the best targets for observational tests of the AMS-02 excess.
Enhanced chemical weapon warning via sensor fusion
NASA Astrophysics Data System (ADS)
Flaherty, Michael; Pritchett, Daniel; Cothren, Brian; Schwaiger, James
2011-05-01
Torch Technologies Inc. is actively involved in chemical sensor networking and data fusion via multi-year efforts with Dugway Proving Ground (DPG) and the Defense Threat Reduction Agency (DTRA). The objective of these efforts is to develop innovative concepts and advanced algorithms that enhance our national Chemical Warfare (CW) test and warning capabilities via the fusion of traditional and non-traditional CW sensor data. Under Phase I, II, and III Small Business Innovative Research (SBIR) contracts with DPG, Torch developed the Advanced Chemical Release Evaluation System (ACRES) software to support non real-time CW sensor data fusion. Under Phase I and II SBIRs with DTRA in conjunction with the Edgewood Chemical Biological Center (ECBC), Torch is using the DPG ACRES CW sensor data fuser as a framework from which to develop the Cloud state Estimation in a Networked Sensor Environment (CENSE) data fusion system. Torch is currently developing CENSE to implement and test innovative real-time sensor network based data fusion concepts using CW and non-CW ancillary sensor data to improve CW warning and detection in tactical scenarios.
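A minimal sketch of a Bayesian-style fusion rule for networked chemical-sensor alarms of the kind such systems address is shown below; the detection and false-alarm probabilities, the prior, and the sensor mix are invented for illustration and are not the ACRES or CENSE algorithms.

```python
import numpy as np

def fuse_alarms(alarms, p_detect, p_false_alarm, prior=0.01):
    """Combine binary alarm states from networked CW sensors into a posterior
    probability that an agent cloud is present at a location of interest.

    alarms        : list of 0/1 alarm states, one per sensor
    p_detect      : per-sensor detection probability given agent present
    p_false_alarm : per-sensor false-alarm probability given agent absent
    """
    log_odds = np.log(prior / (1.0 - prior))
    for a, pd_, pfa in zip(alarms, p_detect, p_false_alarm):
        if a:
            log_odds += np.log(pd_ / pfa)
        else:
            log_odds += np.log((1.0 - pd_) / (1.0 - pfa))
    return 1.0 / (1.0 + np.exp(-log_odds))

# Three traditional CW point sensors plus one less specific ancillary sensor
# (all characteristics assumed).
alarms        = [1, 1, 0, 1]
p_detect      = [0.90, 0.85, 0.80, 0.60]
p_false_alarm = [0.02, 0.05, 0.05, 0.20]

print(f"P(agent cloud present) = {fuse_alarms(alarms, p_detect, p_false_alarm):.3f}")
```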
Implementation of Context Aware e-Health Environments Based on Social Sensor Networks
Aguirre, Erik; Led, Santiago; Lopez-Iturri, Peio; Azpilicueta, Leyre; Serrano, Luís; Falcone, Francisco
2016-01-01
In this work, context aware scenarios applied to e-Health and m-Health in the framework of typical households (urban and rural) by means of deploying Social Sensors will be described. Interaction with end-users and social/medical staff is achieved using a multi-signal input/output device, capable of sensing and transmitting environmental, biomedical or activity signals and information with the aid of a combined Bluetooth and Mobile system platform. The devices, which play the role of Social Sensors, are implemented and tested in order to guarantee adequate service levels in terms of multiple signal processing tasks as well as robustness in relation to the use of wireless transceivers and channel variability. Initial tests within a Living Lab environment have been performed in order to validate overall system operation. The results obtained show good acceptance of the proposed system both by end users as well as by medical and social staff, increasing interaction and social inclusion levels and reducing overall response time, with a compact and moderate cost solution that can readily be largely deployed. PMID:26938539
Conceptual study and key technology development for Mars Aeroflyby sample collection
NASA Astrophysics Data System (ADS)
Fujita, K.; Ozawa, T.; Okudaira, K.; Mikouchi, T.; Suzuki, T.; Takayanagi, H.; Tsuda, Y.; Ogawa, N.; Tachibana, S.; Satoh, T.
2014-01-01
Conceptual study of Mars Aeroflyby Sample Collection (MASC) is conducted as a part of the next Mars exploration mission currently under consideration at the Japan Aerospace Exploration Agency. In the mission scenario, an atmospheric entry vehicle is flown into the Martian atmosphere, collects Martian dust particles as well as atmospheric gases during the guided hypersonic flight, exits the Martian atmosphere, and is inserted into a parking orbit from which a return system departs for the Earth to deliver the dust and gas samples. In order to accomplish a controlled flight and a successful orbit insertion, aeroassist orbit transfer technologies are introduced into the guidance and control system. System analysis is conducted to assess the feasibility and to make a conceptual design, finding that the MASC system is feasible at a minimum system mass of approximately 600 kg. The aerogel, which is one of the candidates for the dust sample collector, is assessed by arcjet heating tests to examine its behavior when exposed to high-temperature gases, as well as by particle impingement tests to evaluate its dust capturing capability.
A Hierarchical Bayesian Model for Crowd Emotions
Urizar, Oscar J.; Baig, Mirza S.; Barakova, Emilia I.; Regazzoni, Carlo S.; Marcenaro, Lucio; Rauterberg, Matthias
2016-01-01
Estimation of emotions is an essential aspect in developing intelligent systems intended for crowded environments. However, emotion estimation in crowds remains a challenging problem due to the complexity in which human emotions are manifested and the capability of a system to perceive them in such conditions. This paper proposes a hierarchical Bayesian model to learn in an unsupervised manner the behavior of individuals and of the crowd as a single entity, and to explore the relation between behavior and emotions to infer emotional states. Information about the motion patterns of individuals is described using a self-organizing map, and a hierarchical Bayesian network builds probabilistic models to identify behaviors and infer the emotional state of individuals and the crowd. This model is trained and tested using data produced from simulated scenarios that resemble real-life environments. The conducted experiments tested the efficiency of our method to learn, detect and associate behaviors with emotional states, yielding accuracy levels of 74% for individuals and 81% for the crowd, similar in performance to existing methods for pedestrian behavior detection but with novel concepts regarding the analysis of crowds. PMID:27458366
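A minimal sketch of the self-organizing-map step (mapping individual motion features onto a small grid of prototypes whose occupancy can feed a higher-level inference layer) is given below; the grid size, learning schedule, and simulated data are illustrative, and the hierarchical Bayesian network itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy motion features: (vx, vy) velocity samples from two simulated crowd flows.
data = np.vstack([rng.normal([ 1.0, 0.0], 0.1, size=(200, 2)),    # calm, directed flow
                  rng.normal([-0.5, 0.8], 0.4, size=(200, 2))])   # agitated flow

grid_w, grid_h, epochs = 4, 4, 30
weights = rng.uniform(-1, 1, size=(grid_w * grid_h, 2))
coords = np.array([(i, j) for i in range(grid_w) for j in range(grid_h)], float)

for epoch in range(epochs):
    lr = 0.5 * (1 - epoch / epochs)              # decaying learning rate
    sigma = 2.0 * (1 - epoch / epochs) + 0.5     # decaying neighborhood radius
    for x in rng.permutation(data):
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))          # best-matching unit
        dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        h = np.exp(-dist2 / (2 * sigma ** 2))                      # neighborhood function
        weights += lr * h[:, None] * (x - weights)

# Each node now prototypes a motion pattern; per-node occupancy could feed the
# behavior/emotion inference layer.
bmus = np.argmin(((data[:, None, :] - weights[None]) ** 2).sum(-1), axis=1)
print("samples per SOM node:", np.bincount(bmus, minlength=grid_w * grid_h))
```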
Impacts of Inverter-Based Advanced Grid Support Functions on Islanding Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Austin; Hoke, Anderson; Miller, Brian
A long-standing requirement for inverters paired with distributed energy resources is that they disconnect from the electrical power system (EPS) when an electrical island is formed. In recent years, advanced grid support controls have been developed for inverters to provide voltage and frequency support by integrating functions such as voltage and frequency ride-through, volt-VAr control, and frequency-Watt control. With these new capabilities integrated into the inverter, additional examination is needed to determine how voltage and frequency support will impact pre-existing inverter functions like island detection. This paper inspects how advanced inverter functions will impact the inverter's ability to detect the formation of an electrical island. Results are presented for the unintentional islanding laboratory tests of three common residential-scale photovoltaic inverters performing various combinations of grid support functions. For the inverters tested, grid support functions prolonged island disconnection times slightly; however, it was found that in all scenarios the inverters disconnected well within two seconds, the limit imposed by IEEE Std 1547-2003.
Design of a microfluidic system for red blood cell aggregation investigation.
Mehri, R; Mavriplis, C; Fenech, M
2014-06-01
The purpose of this paper is to design a microfluidic apparatus capable of providing controlled flow conditions suitable for red blood cell (RBC) aggregation analysis. The linear velocity engendered by the controlled flow provides constant shear rates used to qualitatively analyze RBC aggregates. The design of the apparatus is based on numerical and experimental work. The numerical work consists of 3D numerical simulations performed using a research computational fluid dynamics (CFD) solver, Nek5000, while the experiments are conducted using a microparticle image velocimetry system. A Newtonian model is tested numerically and experimentally, and then blood is tested experimentally under several conditions (hematocrit, shear rate, and fluid suspension) for comparison with the simulation results. We find that using a velocity ratio of 4 between the two Newtonian fluids, the layer corresponding to blood expands to fill 35% of the channel thickness where the constant shear rate is achieved. For blood experiments, the velocity profile in the blood layer is approximately linear, resulting in the desired controlled conditions for the study of RBC aggregation under several flow scenarios.
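As a small worked example of the shear rate implied by an approximately linear velocity profile in the blood layer, the sketch below applies gamma_dot = delta_u / delta_y under assumed channel dimensions and velocities; none of the numbers are taken from the paper.

```python
# Shear rate implied by a linear velocity profile in the blood layer:
# gamma_dot = delta_u / delta_y.  All numbers below are assumed for illustration.

channel_thickness_m = 100e-6            # 100 micron microchannel (assumed)
blood_layer_fraction = 0.35             # blood fills ~35% of the thickness
blood_layer_thickness_m = blood_layer_fraction * channel_thickness_m

u_interface_m_s = 2.0e-3                # velocity at the fluid-fluid interface (assumed)
u_wall_m_s = 0.0                        # no-slip condition at the channel wall

shear_rate = (u_interface_m_s - u_wall_m_s) / blood_layer_thickness_m
print(f"approximate shear rate in the blood layer: {shear_rate:.0f} 1/s")
```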
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, J.R.; Marshall, M.E.; Barker, B.W.
In situations where cavity decoupling of underground nuclear explosions is a plausible evasion scenario, comprehensive seismic monitoring of any eventual CTBT will require the routine identification of many small seismic events with magnitudes in the range 2.0 < m_b < 3.5. However, since such events are not expected to be detected teleseismically, their magnitudes will have to be estimated from regional recordings using seismic phases and frequency bands different from those employed in the teleseismic m_b scale that is generally used to specify monitoring capability. Therefore, it is necessary to establish the m_b equivalences of any selected regional magnitude measures in order to estimate the expected detection statistics and thresholds of proposed CTBT seismic monitoring networks. In the investigations summarized in this report, this has been accomplished through analyses of synthetic data obtained by theoretically scaling observed regional seismic data recorded in Scandinavia and Central Asia from various tamped nuclear tests to obtain estimates of the corresponding seismic signals to be expected from small cavity-decoupled nuclear tests at those same source locations.
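One common way to express an m_b equivalence is a linear calibration between a regional magnitude measure and teleseismic m_b. The sketch below fits such a relation by least squares to invented magnitude pairs; it only illustrates the form of the mapping and is not the report's scaling analysis.

    # Illustrative calibration: m_b ~= a * m_regional + b, fitted to made-up data.
    import numpy as np

    m_regional = np.array([2.1, 2.5, 2.8, 3.0, 3.3, 3.6])
    m_b        = np.array([2.0, 2.4, 2.6, 2.9, 3.2, 3.5])

    a, b = np.polyfit(m_regional, m_b, 1)   # slope, intercept
    print(f"m_b ~= {a:.2f} * m_regional + {b:+.2f}")
    print("predicted m_b at regional magnitude 2.7:", round(a * 2.7 + b, 2))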
Adaptable radiation monitoring system and method
Archer, Daniel E [Livermore, CA; Beauchamp, Brock R [San Ramon, CA; Mauger, G Joseph [Livermore, CA; Nelson, Karl E [Livermore, CA; Mercer, Michael B [Manteca, CA; Pletcher, David C [Sacramento, CA; Riot, Vincent J [Berkeley, CA; Schek, James L [Tracy, CA; Knapp, David A [Livermore, CA
2006-06-20
A portable radioactive-material detection system capable of detecting radioactive sources moving at high speeds. The system has at least one radiation detector capable of detecting gamma radiation and coupled to a multichannel analyzer (MCA) capable of collecting spectral data in very small time bins of less than about 150 msec. A computer processor is connected to the MCA to determine from the spectral data whether a triggering event has occurred. Spectral data are stored on a data storage device, and a power source supplies power to the detection system. Various configurations of the detection system may be adaptably arranged for various radiation detection scenarios. In a preferred embodiment, the computer processor operates as a server that receives spectral data from other networked detection systems and communicates the collected data to a central data reporting system.
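A minimal sketch of the kind of bin-wise trigger test such a system might perform is shown below: gross counts in each short acquisition bin are compared against a Poisson-based decision level derived from an assumed background rate. The bin width, background rate, and threshold multiplier are illustrative assumptions, not values from the patent.

    # Toy trigger check over short (<150 ms) acquisition bins.
    import math
    import random

    BIN_SECONDS = 0.1        # 100 ms bins, within the <150 ms requirement
    BACKGROUND_CPS = 50.0    # assumed mean background count rate
    N_SIGMA = 5.0            # trigger threshold in standard deviations

    mu = BACKGROUND_CPS * BIN_SECONDS
    threshold = mu + N_SIGMA * math.sqrt(mu)   # Poisson-based decision level

    def triggered(counts_in_bin):
        return counts_in_bin > threshold

    # Simulated stream: mostly background, one bin with a passing source.
    stream = [random.randint(2, 9) for _ in range(20)]
    stream[12] += 40
    for i, counts in enumerate(stream):
        if triggered(counts):
            print(f"bin {i}: {counts} counts -> trigger (threshold {threshold:.1f})")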
NASA Technical Reports Server (NTRS)
Bentley, Nicole L.; Brower, David V.; Le, Suy Q.; Seaman, Calvin H.; Tang, Henry H.
2017-01-01
This paper presents the design and development of a friction-based coupling device for a fiber-optic monitoring system capable of measuring pressure, strain, and temperature that can be deployed on existing subsea structures. A summary is provided of the design concept, prototype development, prototype performance testing, and subsequent design refinements of the device. The results of laboratory testing of the first prototype performed at the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) are also included. Limitations of the initial concept were identified during testing, and design improvements were proposed and later implemented. These new features enhance the coupling of the sensor device and improve the monitoring system's measurement capabilities. A major challenge for a post-installed instrumentation monitoring system is to ensure adequate coupling between the instruments and the structure of interest for reliable measurements. Friction-based devices have the potential to overcome coupling limitations caused by marine growth and soil contamination on flowlines, risers, and other subsea structures. The work described in this paper investigates the design and testing of a friction-based coupling device (herein referred to as a friction clamp) suitable for pipelines and structures that are suspended in the water column as well as for those resting on the seabed. The monitoring elements consist of fiber-optic sensors bonded to a stainless steel clamshell assembly with a high-friction surface coating. The friction clamp incorporates a single-hinge design to facilitate installation and dual rows of opposing fasteners to distribute the clamping force along the structure. The friction clamp can be modified to be installed by commercial divers at shallow depths or by remotely operated vehicles in deep-water applications. NASA-JSC was involved in the selection and testing of the friction coating, and in the design and testing of the prototype clamp device. Four-inch-diameter and eight-inch-diameter sub-scale friction clamp prototypes were built and tested to evaluate the strain-measuring capabilities of the design under different loading scenarios. The testing revealed some limitations of the initial design concept, and subsequent refinements were explored to improve the measurement performance of the system. This study was part of a collaboration between NASA-JSC and Astro Technology Inc. within a study called Clear Gulf. The primary objective of the Clear Gulf study is to develop advanced instrumentation technologies that will improve operational safety and reduce the risk of hydrocarbon spillage. NASA provided unique insights, expansive test facilities, and technical expertise to advance technologies that will benefit the environment, the public, and commercial industries.
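As a hedged illustration of how fiber-optic measurements of this kind can be reduced to engineering units, the sketch below applies the standard fiber Bragg grating (FBG) relation, in which the relative wavelength shift is proportional to strain through a gauge factor of roughly 0.78 for silica fiber. The wavelengths are generic examples, not data from the clamp prototypes.

    # Generic FBG wavelength-shift-to-strain conversion (illustrative values).
    def fbg_strain(lambda_measured_nm, lambda_reference_nm, gauge_factor=0.78):
        """Return strain (microstrain) from an FBG wavelength shift.

        strain = (dLambda / lambda0) / k, with k ~ 0.78 for silica fiber.
        """
        d_lambda = lambda_measured_nm - lambda_reference_nm
        return (d_lambda / lambda_reference_nm) / gauge_factor * 1e6

    print(f"{fbg_strain(1550.12, 1550.00):.1f} microstrain")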
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Percivall, G.; Idol, T. A.
2015-12-01
Experts in climate modeling, remote sensing of the Earth, and cyberinfrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to predicted sea level rise. In a Policy Fact Sheet, "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increasing access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open-standards-based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if" scenarios included in-situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated the interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resulting High Resolution Flood Information System addressed access and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included identification of standards and best practices that help researchers and cities deal with climate-related issues. Results of the testbed will now be deployed in pilot applications. The testbed also identified areas of additional development needed to guide the scientific investments and cyberinfrastructure approaches required to improve the application of climate science research results to urban climate resilience.
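For orientation, a WCS 2.0 GetCoverage request of the kind used to pull gridded flood-model output can be assembled as shown below. The endpoint URL and coverage identifier are placeholders, not the actual Testbed 11 services.

    # Assemble (but do not send) a WCS 2.0 GetCoverage request URL.
    from urllib.parse import urlencode

    endpoint = "https://example.org/wcs"            # hypothetical server
    params = {
        "service": "WCS",
        "version": "2.0.1",
        "request": "GetCoverage",
        "coverageId": "flood_depth_sf_bay",         # hypothetical coverage name
        "format": "application/x-netcdf",           # NetCDF output, per the standards used
    }
    print(endpoint + "?" + urlencode(params))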
NASA Astrophysics Data System (ADS)
Delle Monache, L.; Rodriguez, L. M.; Meech, S.; Hahn, D.; Betancourt, T.; Steinhoff, D.
2016-12-01
It is necessary to accurately estimate the initial source characteristics in the event of an accidental or intentional release of a Chemical, Biological, Radiological, or Nuclear (CBRN) agent into the atmosphere. Accurate estimation of the source characteristics is important because they are often unknown, and Atmospheric Transport and Dispersion (AT&D) models rely heavily on these estimates to create hazard assessments. To correctly assess the source characteristics in an operational environment where time is critical, the National Center for Atmospheric Research (NCAR) has developed a Source Term Estimation (STE) method known as the Variational Iterative Refinement STE algorithm (VIRSA). VIRSA consists of a combination of modeling systems: an AT&D model, its corresponding STE model, a Hybrid Lagrangian-Eulerian Plume Model (H-LEPM), and its mathematical adjoint model. In an operational scenario where we have information regarding the infrastructure of a city, the AT&D model used is the Urban Dispersion Model (UDM), and when using this model in VIRSA we refer to the system as uVIRSA. In all other scenarios, where city infrastructure information is not readily available, the AT&D model used is the Second-order Closure Integrated PUFF model (SCIPUFF) and the system is referred to as sVIRSA. VIRSA was originally developed using SCIPUFF 2.4 for the Defense Threat Reduction Agency and integrated into the Hazard Prediction and Assessment Capability and the Joint Program for Information Systems Joint Effects Model. The results discussed here are the verification and validation of the upgraded system with SCIPUFF 3.0 and the newly implemented UDM capability. To verify uVIRSA and sVIRSA, synthetic concentration observation scenarios were created in urban and rural environments, and the results of this verification are shown. Finally, we validate the STE performance of uVIRSA using scenarios from the Joint Urban 2003 (JU03) experiment, held in Oklahoma City, and validate the performance of sVIRSA using scenarios from the FUsing Sensor Integrated Observing Network (FUSION) Field Trial 2007 (FFT07), held at Dugway Proving Ground in rural Utah.
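The core inverse problem behind source term estimation can be illustrated with a toy linear model: if predicted sensor concentrations depend linearly on candidate source strengths through a transport operator H, a least-squares fit to the observations recovers the release rates. The sketch below is only this toy problem, with a randomly generated operator and synthetic observations; it is not VIRSA, SCIPUFF, or UDM.

    # Toy linear source-strength inversion from synthetic concentration observations.
    import numpy as np

    rng = np.random.default_rng(0)
    n_sensors, n_candidate_sources = 12, 3

    H = rng.random((n_sensors, n_candidate_sources))   # toy dispersion operator
    q_true = np.array([5.0, 0.0, 0.0])                 # only source 0 is active
    obs = H @ q_true + 0.05 * rng.standard_normal(n_sensors)

    # A non-negativity constraint would be more realistic; plain least squares keeps it short.
    q_est, *_ = np.linalg.lstsq(H, obs, rcond=None)
    print("estimated source strengths:", np.round(q_est, 2))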
2013-09-04
LAS VEGAS, Nev. – Engineers prepare a mock-up of The Boeing Company's CST-100 spacecraft for the third and final series of simulated contingency water landing scenarios at Bigelow Aerospace's headquarters near Las Vegas. The CST-100 is designed for ground landings, but could splash down on the water, if necessary. The tests are part of the company’s ongoing work supporting its funded Space Act Agreement with NASA’s Commercial Crew Program, or CCP, during the Commercial Crew Integrated Capability, or CCiCap, initiative. CCP is intended to lead to the availability of commercial human spaceflight services for government and commercial customers to low-Earth orbit. Future development and certification initiatives eventually will lead to the availability of human spaceflight services for NASA to send its astronauts to the International Space Station, where critical research is taking place daily. For more information about CCP, go to http://www.nasa.gov/commercialcrew. Photo credit: Boeing/Kelly George
2013-09-04
LAS VEGAS, Nev. – A mock-up of The Boeing Company's CST-100 spacecraft is prepared for the third and final series of simulated contingency water landing scenarios at Bigelow Aerospace's headquarters near Las Vegas. The CST-100 is designed for ground landings, but could splash down on the water, if necessary. The tests are part of the company’s ongoing work supporting its funded Space Act Agreement with NASA’s Commercial Crew Program, or CCP, during the Commercial Crew Integrated Capability, or CCiCap, initiative. CCP is intended to lead to the availability of commercial human spaceflight services for government and commercial customers to low-Earth orbit. Future development and certification initiatives eventually will lead to the availability of human spaceflight services for NASA to send its astronauts to the International Space Station, where critical research is taking place daily. For more information about CCP, go to http://www.nasa.gov/commercialcrew. Photo credit: Boeing/Kelly George
2013-09-04
LAS VEGAS, Nev. – An engineer prepares a mock-up of The Boeing Company's CST-100 spacecraft for the third and final series of simulated contingency water landing scenarios at Bigelow Aerospace's headquarters near Las Vegas. The CST-100 is designed for ground landings, but could splash down on the water, if necessary. The tests are part of the company’s ongoing work supporting its funded Space Act Agreement with NASA’s Commercial Crew Program, or CCP, during the Commercial Crew Integrated Capability, or CCiCap, initiative. CCP is intended to lead to the availability of commercial human spaceflight services for government and commercial customers to low-Earth orbit. Future development and certification initiatives eventually will lead to the availability of human spaceflight services for NASA to send its astronauts to the International Space Station, where critical research is taking place daily. For more information about CCP, go to http://www.nasa.gov/commercialcrew. Photo credit: Boeing/Kelly George
2013-09-04
LAS VEGAS, Nev. – A mock-up of The Boeing Company's CST-100 spacecraft floats following the third and final series of simulated contingency water landing scenarios at Bigelow Aerospace's headquarters near Las Vegas. The CST-100 is designed for ground landings, but could splash down on the water, if necessary. The tests are part of the company’s ongoing work supporting its funded Space Act Agreement with NASA’s Commercial Crew Program, or CCP, during the Commercial Crew Integrated Capability, or CCiCap, initiative. CCP is intended to lead to the availability of commercial human spaceflight services for government and commercial customers to low-Earth orbit. Future development and certification initiatives eventually will lead to the availability of human spaceflight services for NASA to send its astronauts to the International Space Station, where critical research is taking place daily. For more information about CCP, go to http://www.nasa.gov/commercialcrew. Photo credit: Boeing/Kelly George
Cost estimation model for advanced planetary programs, fourth edition
NASA Technical Reports Server (NTRS)
Spadoni, D. J.
1983-01-01
The development of the planetary program cost model is discussed. The model was updated to incorporate cost data from the most recent US planetary flight projects and extensively revised to more accurately capture the information in the historical cost data base, which comprises the historical cost data for 13 unmanned lunar and planetary flight programs. The revision had a twofold objective: to increase the flexibility of the model in dealing with the broad scope of scenarios under consideration for future missions, and to maintain, and possibly improve, confidence in the model's capabilities, with an expected accuracy of 20%. The model development included a labor/cost proxy analysis, selection of the functional forms of the estimating relationships, and test statistics. An analysis of the model is discussed and two sample applications of the cost model are presented.
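Cost models of this kind are typically built from cost estimating relationships (CERs). The sketch below fits a generic power-law CER, cost = a * mass^b, to invented data points to show the functional form; it does not reproduce the model's actual estimating relationships or its historical data base.

    # Fit a generic power-law CER to made-up mass/cost pairs.
    import numpy as np

    mass_kg = np.array([250.0, 400.0, 650.0, 900.0, 1200.0])
    cost_musd = np.array([120.0, 180.0, 270.0, 360.0, 450.0])

    # Linearize: log(cost) = log(a) + b * log(mass)
    b, log_a = np.polyfit(np.log(mass_kg), np.log(cost_musd), 1)
    a = np.exp(log_a)
    print(f"CER: cost ~= {a:.2f} * mass^{b:.2f}")
    print(f"estimate for an 800 kg spacecraft: {a * 800**b:.0f} $M")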
NASA Technical Reports Server (NTRS)
Seymour, David C.; Martin, Michael A.; Nguyen, Huy H.; Greene, William D.
2005-01-01
The subject of mathematical modeling of the transient operation of liquid rocket engines is presented in overview form from the perspective of engineers working at the NASA Marshall Space Flight Center. The necessity of creating and utilizing accurate mathematical models as part of the liquid rocket engine development process has become well established and is likely to increase in importance in the future. The issues of design considerations for transient operation, development testing, and failure scenario simulation are discussed. An overview of the derivation of the basic governing equations is presented, along with a discussion of computational and numerical issues associated with the implementation of these equations in computer codes. Also, work in the field of generating usable fluid property tables is presented, along with an overview of efforts to be undertaken in the future to improve the tools used for the mathematical modeling process.
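As a minimal example of the lumped-parameter transient equations such models integrate, the sketch below advances a chamber-pressure fill transient, dp/dt = (R*T/V)*(mdot_in - mdot_out), with a simple choked-flow approximation for the outflow. All property values and the time step are assumptions chosen for illustration, not Marshall Space Flight Center model parameters.

    # Toy chamber-pressure start transient integrated with explicit Euler.
    R, T, V = 350.0, 3200.0, 0.05      # gas constant (J/kg-K), temperature (K), volume (m^3)
    THROAT_AREA = 2.0e-4               # effective throat area (m^2), assumed
    C_STAR = 1700.0                    # characteristic velocity (m/s), assumed

    def mdot_out(p):
        # Choked-flow approximation: mdot = p * A / c*
        return p * THROAT_AREA / C_STAR

    p, dt = 1.0e5, 1.0e-4              # initial pressure (Pa), time step (s)
    mdot_in = 2.5                      # commanded propellant inflow (kg/s), assumed
    for _ in range(20000):             # roughly 2 s of simulated start transient
        dpdt = (R * T / V) * (mdot_in - mdot_out(p))
        p += dpdt * dt
    print(f"chamber pressure after 2 s: {p / 1e6:.1f} MPa")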
NASA Technical Reports Server (NTRS)
Martin, Michael A.; Nguyen, Huy H.; Greene, William D.; Seymour, David C.
2003-01-01
The subject of mathematical modeling of the transient operation of liquid rocket engines is presented in overview form from the perspective of engineers working at the NASA Marshall Space Flight Center. The necessity of creating and utilizing accurate mathematical models as part of the liquid rocket engine development process has become well established and is likely to increase in importance in the future. The issues of design considerations for transient operation, development testing, and failure scenario simulation are discussed. An overview of the derivation of the basic governing equations is presented, along with a discussion of computational and numerical issues associated with the implementation of these equations in computer codes. Also, work in the field of generating usable fluid property tables is presented, along with an overview of efforts to be undertaken in the future to improve the tools used for the mathematical modeling process.
NASA Astrophysics Data System (ADS)
Pahar, Gourabananda; Dhar, Anirban
2017-04-01
A coupled solenoidal Incompressible Smoothed Particle Hydrodynamics (ISPH) model is presented for the simulation of sediment displacement in an erodible bed. The coupled framework consists of two separate incompressible modules: (a) a granular module and (b) a fluid module. The granular module uses a friction-based rheology model to calculate deviatoric stress components from pressure. The module is validated against the Bagnold flow profile and two standardized test cases of sediment avalanching. The fluid module resolves fluid flow inside and outside the porous domain. An interaction force pair containing fluid pressure, a viscous term, and a drag force acts as a bridge between the two flow modules. The coupled model is validated against three dam-break flow cases with different initial conditions of the movable bed. The simulated results are in good agreement with experimental data. A demonstrative case considering the effect of granular column failure under full or partial submergence highlights the capability of the coupled model for application in generalized scenarios.
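As a hedged illustration of the drag component of a fluid-granular interaction force, the sketch below evaluates a linear Darcy-type drag on the relative velocity, with permeability from the Kozeny-Carman relation. The coefficients and particle size are generic assumptions and are not the calibrated closure used in the paper.

    # Generic Darcy-type drag per unit volume between fluid and grains.
    import numpy as np

    def darcy_drag(u_fluid, u_grain, porosity, d50, mu=1.0e-3):
        """Per-unit-volume linear drag (N/m^3) on the fluid from the grains."""
        # Kozeny-Carman permeability (m^2)
        k = (porosity ** 3) * d50 ** 2 / (180.0 * (1.0 - porosity) ** 2)
        return -(mu / k) * porosity * (u_fluid - u_grain)

    f = darcy_drag(np.array([0.5, 0.0]), np.array([0.1, 0.0]), porosity=0.4, d50=1.0e-3)
    print("drag force per unit volume (N/m^3):", np.round(f, 1))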
Characterisation of Feature Points in Eye Fundus Images
NASA Astrophysics Data System (ADS)
Calvo, D.; Ortega, M.; Penedo, M. G.; Rouco, J.
The retinal vessel tree adds decisive knowledge in the diagnosis of numerous ophthalmologic pathologies such as hypertension or diabetes. One of the problems in the analysis of the retinal vessel tree is the lack of information on vessel depth, since image acquisition usually produces a 2D image. This leads to a scenario where two different vessels coinciding at a point could be misinterpreted as a single vessel forking into a bifurcation. For this reason, bifurcations and crossovers of vessels are considered feature points for tracking and labelling the retinal vascular tree. In this work a novel method for detecting and classifying these retinal vessel tree feature points is introduced. The method applies image techniques such as filtering and thinning to obtain a structure suitable for detecting the points, and classifies the points by studying their local environment. The methodology is tested using a standard database and the results show high classification capabilities.
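A common, simple heuristic for flagging candidate feature points on a one-pixel-wide vessel skeleton is to count the 8-connected neighbours of each skeleton pixel: roughly, three or more neighbours indicates a junction. The sketch below implements only this generic heuristic on a toy skeleton; it is not the paper's detection and classification method, and in practice pixels adjacent to a true junction are also flagged and would be merged into a single feature point.

    # Flag junction-like pixels on a one-pixel-wide binary skeleton.
    import numpy as np
    from scipy.ndimage import convolve

    def junction_candidates(skeleton):
        """Return coordinates of skeleton pixels with 3+ skeleton neighbours."""
        kernel = np.ones((3, 3), dtype=int)
        kernel[1, 1] = 0
        neighbours = convolve(skeleton.astype(int), kernel, mode="constant")
        return np.argwhere(skeleton & (neighbours >= 3))

    # Toy skeleton with a T-junction at (4, 4); real pipelines would first
    # filter and thin the vessel segmentation to obtain such a skeleton.
    skel = np.zeros((9, 9), dtype=bool)
    skel[4, :] = True
    skel[0:4, 4] = True
    print("junction-like pixels:", junction_candidates(skel).tolist())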
NASA Astrophysics Data System (ADS)
Li, Ping; Wang, Weiwei; Zhang, Chenxi; An, Yong; Song, Zhijian
2016-07-01
Intraoperative brain retraction leads to a misalignment between the intraoperative positions of the brain structures and their previous positions, as determined from preoperative images. In vitro swine brain sample uniaxial tests showed that the mechanical response of brain tissue to compression and extension could be described by the hyper-viscoelasticity theory. The brain retraction caused by the mechanical process is a combination of brain tissue compression and extension. In this paper, we first constructed a hyper-viscoelastic framework based on the extended finite element method (XFEM) to simulate intraoperative brain retraction. To explore its effectiveness, we then applied this framework to an in vivo brain retraction simulation. The simulation strictly followed the clinical scenario, in which seven swine were subjected to brain retraction. Our experimental results showed that the hyper-viscoelastic XFEM framework is capable of simulating intraoperative brain retraction and improving the navigation accuracy of an image-guided neurosurgery system (IGNS).
Real-Time Model and Simulation Architecture for Half- and Full-Bridge Modular Multilevel Converters
NASA Astrophysics Data System (ADS)
Ashourloo, Mojtaba
This work presents an equivalent model and simulation architecture for real-time electromagnetic transient analysis of either a half-bridge or a full-bridge modular multilevel converter (MMC) with 400 sub-modules (SMs) per arm. The proposed CPU/FPGA-based architecture is optimized for parallel implementation of the presented MMC model on the FPGA and benefits from a high-throughput floating-point computational engine. The developed real-time simulation architecture is capable of simulating MMCs with 400 SMs per arm at a time step of 825 nanoseconds. To address the difficulties of implementing the sorting process, a modified Odd-Even Bubble sort is presented in this work. Comparison of the results under various test scenarios reveals that the proposed real-time simulator represents the system responses in the same way as its off-line counterpart obtained from the PSCAD/EMTDC program.
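For reference, odd-even (transposition) bubble sorting alternates between comparing disjoint even- and odd-indexed neighbour pairs, which is why each phase can be executed fully in parallel in hardware. The software sketch below mirrors only the algorithm's structure, here applied to ranking hypothetical sub-module capacitor voltages; it is not the modified FPGA implementation described in the work.

    # Plain odd-even transposition sort (n phases guarantee a sorted result).
    def odd_even_sort(values):
        v = list(values)
        n = len(v)
        for phase in range(n):
            start = phase % 2                     # alternate even/odd phases
            for i in range(start, n - 1, 2):      # disjoint pairs -> parallelizable
                if v[i] > v[i + 1]:
                    v[i], v[i + 1] = v[i + 1], v[i]
        return v

    # e.g., ranking sub-module capacitor voltages before selecting SMs to insert
    print(odd_even_sort([2.31, 2.28, 2.35, 2.27, 2.33]))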
2014-01-01
Background: Using the Android platform as a notification instrument for diseases and disorders forms a new alternative for the computerization of epidemiological studies. Objective: The objective of our study was to construct a tool for gathering epidemiological data on schistosomiasis using the Android platform. Methods: The developed application (app), named the Schisto Track, is a tool for data capture and analysis that was designed to meet the needs of a traditional epidemiological survey. An initial version of the app was finished and tested in both real situations and simulations for epidemiological surveys. Results: The app proved to be a tool capable of automating activities, with data organization and standardization, easy data recovery (to enable interfacing with other systems), and a totally modular architecture. Conclusions: The proposed Schisto Track is in line with worldwide trends toward the use of smartphones with the Android platform for modeling epidemiological scenarios. PMID:25099881
Investigation of the detection of shallow tunnels using electromagnetic and seismic waves
NASA Astrophysics Data System (ADS)
Counts, Tegan; Larson, Gregg; Gürbüz, Ali Cafer; McClellan, James H.; Scott, Waymond R., Jr.
2007-04-01
Multimodal detection of subsurface targets such as tunnels, pipes, reinforcement bars, and structures has been investigated using both ground-penetrating radar (GPR) and seismic sensors with signal processing techniques to enhance localization capabilities. Both systems have been tested in bi-static configurations but the GPR has been expanded to a multi-static configuration for improved performance. The use of two compatible sensors that sense different phenomena (GPR detects changes in electrical properties while the seismic system measures mechanical properties) increases the overall system's effectiveness in a wider range of soils and conditions. Two experimental scenarios have been investigated in a laboratory model with nearly homogeneous sand. Images formed from the raw data have been enhanced using beamforming inversion techniques and Hough Transform techniques to specifically address the detection of linear targets. The processed data clearly indicate the locations of the buried targets of various sizes at a range of depths.
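As a generic illustration of Hough-transform detection of linear targets, the sketch below runs the scikit-image line Hough transform on a small synthetic binary image containing a diagonal feature; it uses none of the GPR or seismic data from the experiments.

    # Detect the dominant line in a synthetic binary image via the Hough transform.
    import numpy as np
    from skimage.transform import hough_line, hough_line_peaks

    image = np.zeros((64, 64), dtype=bool)
    rr = np.arange(10, 55)
    image[rr, rr] = True                      # a synthetic diagonal "target"

    h, angles, dists = hough_line(image)
    _, best_angles, best_dists = hough_line_peaks(h, angles, dists, num_peaks=1)
    print(f"detected line: angle={np.degrees(best_angles[0]):.1f} deg, "
          f"distance={best_dists[0]:.1f} px")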
Scenario-Testing: Decision Rules for Evaluating Conflicting Probabilistic Claims.
ERIC Educational Resources Information Center
Dudczak, Craig A.; Baker, David
Evaluators of argument are frequently confronted by conflicting claims. While these claims are usually based on probabilities, they are often resolved with the accepted claim treated as though it were "true," while the rejected claim is treated as though it were "false." Scenario testing is the label applied to a set of…