Sample records for probable future capabilities

  1. An approach to evaluating reactive airborne wind shear systems

    NASA Technical Reports Server (NTRS)

    Gibson, Joseph P., Jr.

    1992-01-01

    An approach to evaluating reactive airborne windshear detection systems was developed to support a deployment study for future FAA ground-based windshear detection systems. The deployment study methodology assesses potential future safety enhancements beyond planned capabilities. The reactive airborne systems will be an integral part of planned windshear safety enhancements. The approach to evaluating reactive airborne systems involves separate analyses for the landing and take-off scenarios. The analysis estimates the probability of effective warning, considering several factors including NASA energy height loss characteristics, reactive alert timing, and a probability distribution for microburst strength.
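
    The probability-of-effective-warning calculation lends itself to a simple Monte Carlo illustration. The sketch below follows the ingredients named in the abstract, but every distribution, coefficient, and threshold is an invented placeholder, not a value from the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Illustrative assumptions (not values from the study):
    strength = rng.lognormal(np.log(0.10), 0.4, N)   # microburst strength (peak F-factor)
    alert_delay = rng.normal(6.0, 1.5, N)            # reactive alert timing, seconds
    # energy height lost before recovery begins, growing with strength and delay
    energy_height_loss = 100.0 * strength * alert_delay   # metres, illustrative scaling

    margin = 120.0   # assumed available energy-height margin, metres
    effective = energy_height_loss < margin
    print(f"P(effective warning) ~ {effective.mean():.3f}")
    ```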

  2. Visualizing uncertainty about the future.

    PubMed

    Spiegelhalter, David; Pearson, Mike; Short, Ian

    2011-09-09

    We are all faced with uncertainty about the future, but we can get the measure of some uncertainties in terms of probabilities. Probabilities are notoriously difficult to communicate effectively to lay audiences, and in this review we examine current practice for communicating uncertainties visually, using examples drawn from sport, weather, climate, health, economics, and politics. Despite the burgeoning interest in infographics, there is limited experimental evidence on how different types of visualizations are processed and understood, although the effectiveness of some graphics clearly depends on the relative numeracy of an audience. Fortunately, it is increasingly easy to present data in the form of interactive visualizations and in multiple types of representation that can be adjusted to user needs and capabilities. Nonetheless, communicating deeper uncertainties resulting from incomplete or disputed knowledge--or from essential indeterminacy about the future--remains a challenge.

  3. Acceptance Probability (Pa) Analysis for Process Validation Lifecycle Stages.

    PubMed

    Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep

    2016-04-01

    This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from the specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with worst-case estimates of the mean. This approach permits a risk-based assessment of future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products. Quality attributes such as deliverable volume and assay per spray have stage-wise acceptance criteria that can be converted into an acceptance probability. Accepted statistical guidelines indicate processes with Cpk > 1.33 as performing well within statistical control and those with Cpk < 1.0 as "incapable" (1). A Cpk > 1.33 is associated with a centered process that will statistically produce less than 63 defective units per million. This is equivalent to an acceptance probability of >99.99%.
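
    For illustration, the quoted Cpk-to-defect-rate correspondence can be checked directly under the paper's own normality assumption. A minimal sketch (Python, scipy assumed available):

    ```python
    from scipy.stats import norm

    # A centered process with Cpk = 1.33 sits 3 * 1.33 ~ 4 sigma from each
    # specification limit, so the two-sided defect rate is 2 * Phi(-3 * Cpk).
    cpk = 1.33
    defect_rate = 2 * norm.cdf(-3 * cpk)
    print(f"defects per million: {defect_rate * 1e6:.0f}")   # ~66 ppm (~63 at exactly 4 sigma)
    print(f"acceptance probability: {1 - defect_rate:.6f}")  # > 99.99%
    ```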

  4. NetMOD v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merchant, Bion J

    2015-12-22

    NetMOD is a tool to model the performance of global ground-based explosion monitoring systems. Version 2.0 of the software supports the simulation of seismic, hydroacoustic, and infrasonic detection capability. The tool provides a user interface to execute simulations based upon a hypothetical definition of the monitoring system configuration, geophysical properties of the Earth, and detection analysis criteria. NetMOD will be distributed with a project file defining the basic performance characteristics of the International Monitoring System (IMS), a network of sensors operated by the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Network modeling is needed to assess and explain the potential effect of changes to the IMS, to prioritize station deployment and repair, and to assess the overall CTBTO monitoring capability currently and in the future. Currently the CTBTO uses version 1.0 of NetMOD, provided to them in early 2014. NetMOD will provide a modern tool that will cover all the simulations currently available and allow for the development of additional simulation capabilities of the IMS in the future. NetMOD simulates the performance of monitoring networks by estimating the relative amplitudes of the signal and noise measured at each of the stations within the network based upon known geophysical principles. From these signal and noise estimates, a probability of detection may be determined for each of the stations. The detection probabilities at each of the stations may then be combined to produce an estimate of the detection probability for the entire monitoring network.
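
    The final combination step can be sketched concretely. Assuming independent stations and a requirement that at least k stations detect an event (a common convention in seismic network studies; the abstract does not specify NetMOD's actual combination rule), the network probability follows from the Poisson-binomial distribution:

    ```python
    import numpy as np

    def network_detection_prob(station_probs, k_required=3):
        """P(at least k of n stations detect), assuming independent stations.
        Exact Poisson-binomial recursion via repeated convolution; an
        illustrative sketch, not NetMOD's implementation."""
        pmf = np.array([1.0])                     # P(0 detections) before any station
        for p in station_probs:
            pmf = np.convolve(pmf, [1 - p, p])    # fold in one station's outcome
        return pmf[k_required:].sum()

    # hypothetical per-station detection probabilities for one event
    ps = [0.95, 0.80, 0.60, 0.40, 0.25]
    print(f"P(>= 3 stations detect) = {network_detection_prob(ps, 3):.4f}")
    ```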

  5. Traffic handling capability of a broadband indoor wireless network using CDMA multiple access

    NASA Astrophysics Data System (ADS)

    Zhang, Chang G.; Hafez, H. M.; Falconer, David D.

    1994-05-01

    CDMA (code division multiple access) may be an attractive technique for wireless access to broadband services because of its multiple access simplicity and other appealing features. In order to investigate the traffic handling capabilities of a future network providing a variety of integrated services, this paper presents a study of a broadband indoor wireless network supporting high-speed traffic using CDMA multiple access. The results are obtained through simulation of an indoor environment and of the traffic capabilities of wireless access to broadband 155.5 Mb/s ATM-SONET networks using the mm-wave band. A distributed system architecture is employed and the system performance is measured in terms of call blocking probability and dropping probability. The impacts of base station density, traffic load, average holding time, and variable traffic sources on the system performance are examined. The improvement of system performance through techniques such as handoff, admission control, power control, and sectorization is also investigated.
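
    As a point of comparison for simulated blocking results, the classical Erlang B recursion gives the blocking probability for a fixed number of channels. CDMA capacity is soft (interference-limited), so this is only an illustrative baseline, not the paper's method:

    ```python
    def erlang_b(traffic_erlangs, channels):
        """Blocking probability from the Erlang B recursion:
        B(0) = 1; B(n) = A*B(n-1) / (n + A*B(n-1))."""
        b = 1.0
        for n in range(1, channels + 1):
            b = traffic_erlangs * b / (n + traffic_erlangs * b)
        return b

    # e.g., 40 erlangs of offered traffic on 50 logical channels (hypothetical)
    print(f"blocking ~ {erlang_b(40.0, 50):.4f}")
    ```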

  6. Radar Resource Management in a Dense Target Environment

    DTIC Science & Technology

    2014-03-01

    ...problem faced by networked MFRs. While relaxing our assumptions concerning information gain presents numerous challenges worth exploring, future research... linear programming; MFR, multifunction phased array radar; MILP, mixed integer linear programming; NATO, North Atlantic Treaty Organization; PDF, probability... 1: INTRODUCTION. Multifunction phased array radars (MFRs) are capable of performing various tasks in rapid succession. The performance of target search...

  7. The potential impact of MMICs on future satellite communications

    NASA Technical Reports Server (NTRS)

    Dunn, Vernon E.

    1988-01-01

    This is the Final Report representing the results of a 17-month study on the future trends and requirements of Monolithic Microwave Integrated Circuits (MMIC) for space communication applications. Specifically this report identifies potential space communication applications of MMICs, assesses the impact of MMIC on the classes of systems that were identified, determines the present status and probable 10-year growth in capability of required MMIC and competing technologies, identifies the applications most likely to benefit from further MMIC development and presents recommendations for NASA development activities to address the needs of these applications.

  8. The potential impact of MMICs on future satellite communications: Executive summary

    NASA Technical Reports Server (NTRS)

    Dunn, Vernon E.

    1988-01-01

    This Executive Summary presents the results of a 17-month study on the future trends and requirments for Monolithic Microwave Integrated circuits (MMIC) for space communication application. Specifically this report identifies potential space communication applications of MMICs, assesses the impact of MMIC on the classes of systems that were identified, determines the present status and probable 10-year growth in capability of required MMIC and competing technologies, identifies the applications most likely to benefit from further MMIC development, and presents recommendations for NASA development activities to address the needs of these applications.

  9. Instructional television utilization in the United States

    NASA Technical Reports Server (NTRS)

    Dumolin, J. R.

    1971-01-01

    Various aspects of utilizing instructional television (ITV) are summarized and evaluated and basic guidelines for future utilization of television as an instructional medium in education are considered. The role of technology in education, capabilities and limitations of television as an instructional media system and the state of ITV research efforts are discussed. Examples of various ongoing ITV programs are given and summarized. The problems involved in the three stages of the ITV process (production, distribution, and classroom utilization) are presented. A summary analysis outlines probable trends in future utilization.

  10. An Arrival and Departure Time Predictor for Scheduling Communication in Opportunistic IoT

    PubMed Central

    Pozza, Riccardo; Georgoulas, Stylianos; Moessner, Klaus; Nati, Michele; Gluhak, Alexander; Krco, Srdjan

    2016-01-01

    In this article, an Arrival and Departure Time Predictor (ADTP) for scheduling communication in opportunistic Internet of Things (IoT) is presented. The proposed algorithm learns the temporal patterns of encounters between IoT devices and predicts future arrival and departure times, and therefore future contact durations. By relying on such predictions, a neighbour discovery scheduler is proposed, capable of jointly optimizing discovery latency and power consumption in order to maximize communication time when contacts are expected with high probability and, at the same time, save power when contacts are expected with low probability. A comprehensive performance evaluation with different sets of synthetic and real-world traces shows that ADTP performs favourably with respect to the previous state of the art. This prediction framework opens opportunities for transmission planners and schedulers optimizing not only neighbour discovery, but the entire communication process. PMID:27827909

  11. An Arrival and Departure Time Predictor for Scheduling Communication in Opportunistic IoT.

    PubMed

    Pozza, Riccardo; Georgoulas, Stylianos; Moessner, Klaus; Nati, Michele; Gluhak, Alexander; Krco, Srdjan

    2016-11-04

    In this article, an Arrival and Departure Time Predictor (ADTP) for scheduling communication in opportunistic Internet of Things (IoT) is presented. The proposed algorithm learns the temporal patterns of encounters between IoT devices and predicts future arrival and departure times, and therefore future contact durations. By relying on such predictions, a neighbour discovery scheduler is proposed, capable of jointly optimizing discovery latency and power consumption in order to maximize communication time when contacts are expected with high probability and, at the same time, save power when contacts are expected with low probability. A comprehensive performance evaluation with different sets of synthetic and real-world traces shows that ADTP performs favourably with respect to the previous state of the art. This prediction framework opens opportunities for transmission planners and schedulers optimizing not only neighbour discovery, but the entire communication process.

  12. A Mobile Decision Aid for Determining Detection Probabilities for Acoustic Targets

    DTIC Science & Technology

    2002-08-01

    ...propagation mobile application. Personal Computer Memory Card International Association, an organization of some 500 companies that has developed a... SENSOR: Human ...and possible outputs, it was felt that for a mobile application, the interface and number of output parameters should be kept simple... value could be computed on the server and transmitted back to the mobile application for display. FUTURE CAPABILITIES: 2-D/3-D Displays. The full ABFA...

  13. Estimation of probability of failure for damage-tolerant aerospace structures

    NASA Astrophysics Data System (ADS)

    Halbert, Keith

    The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair of future inspections. Without these estimates maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem are collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools are lacking in several ways: they may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This dissertation describes and develops new PDTA methodologies that directly address the deficiencies of the currently used tools. The new methods are implemented as a free, publicly licensed and open source R software package that can be downloaded from the Comprehensive R Archive Network. The tools consist of two main components. First, an explicit (and expensive) Monte Carlo approach is presented which simulates the life of an aircraft structural component flight-by-flight. This straightforward MC routine can be used to provide defensible estimates of the failure probabilities for future flights and repair probabilities for future inspections under a variety of failure and maintenance scenarios. This routine is intended to provide baseline estimates against which to compare the results of other, more efficient approaches. Second, an original approach is described which models the fatigue process and future scheduled inspections as a hidden Markov model. This model is solved using a particle-based approximation and the sequential importance sampling algorithm, which provides an efficient solution to the PDTA problem. Sequential importance sampling is an extension of importance sampling to a Markov process, allowing for efficient Bayesian updating of model parameters. This model updating capability, the benefit of which is demonstrated, is lacking in other PDTA approaches. The results of this approach are shown to agree with the results of the explicit Monte Carlo routine for a number of PDTA problems. 
Extensions to the typical PDTA problem, which cannot be solved using currently available tools, are presented and solved in this work. These extensions include incorporating observed evidence (such as non-destructive inspection results), more realistic treatment of possible future repairs, and the modeling of failure involving more than one crack (the so-called continuing damage problem). The described hidden Markov model / sequential importance sampling approach to PDTA has the potential to improve aerospace structural safety and reduce maintenance costs by providing a more accurate assessment of the risk of failure and the likelihood of repairs throughout the life of an aircraft.
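
    To make the flight-by-flight simulation and importance-weighting ideas concrete, here is a heavily simplified sketch in the hidden-Markov / sequential-importance-sampling spirit of the approach (not code from the described R package). Particles carry a crack length, propagate flight by flight, and are re-weighted by the likelihood of each inspection finding nothing; the growth law, POD curve, critical length, and all constants are invented for illustration, and the resampling step is omitted:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 50_000                                   # particles

    a = rng.lognormal(np.log(0.5), 0.3, N)       # initial crack length, mm (assumed)
    w = np.full(N, 1.0 / N)                      # importance weights
    a_crit = 25.0                                # assumed critical crack length, mm
    pod = lambda a: 1.0 / (1.0 + np.exp(-(a - 5.0)))   # assumed inspection POD curve

    inspections = {2000, 4000}                   # flights with a scheduled inspection
    for flight in range(1, 6001):
        a *= np.exp(rng.normal(5e-4, 2e-4, N))   # stochastic per-flight growth (assumed)
        if flight in inspections:
            w *= 1.0 - pod(a)                    # likelihood that this crack was missed
            w /= w.sum()                         # Bayesian update: both inspections found nothing

    p_fail = w[a >= a_crit].sum()                # weighted mass beyond critical length
    print(f"P(crack exceeds critical length by flight 6000 | clean inspections) ~ {p_fail:.2e}")
    ```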

  14. From Landsat through SLI: Ball Aerospace Instrument Architecture for Earth Surface Monitoring

    NASA Astrophysics Data System (ADS)

    Wamsley, P. R.; Gilmore, A. S.; Malone, K. J.; Kampe, T. U.; Good, W. S.

    2017-12-01

    The Landsat legacy spans more than forty years of moderate-resolution, multi-spectral imaging of the Earth's surface. Applications for Landsat data include global environmental change, disaster planning and recovery, crop and natural resource management, and glaciology. In recent years, coastal water science has been greatly enhanced by the outstanding on-orbit performance of Landsat 8. Ball Aerospace designed and built the Operational Land Imager (OLI) instrument on Landsat 8, and is in the process of building OLI 2 for Landsat 9. Both instruments have the same design; however, improved performance is expected from OLI 2 due to greater image bit depth (14-bit on OLI 2 vs. 12-bit on OLI). Ball Aerospace is currently working on two novel instrument architectures applicable to Sustainable Land Imaging for Landsat 10 and beyond. With increased budget constraints probable for future missions, technological improvements must be included in future instrument architectures to enable increased capabilities at lower cost. Ball presents the instrument architectures and associated capabilities enabling new science in past, current, and future Landsat missions.

  15. The Blue Dot Workshop: Spectroscopic Search for Life on Extrasolar Planets

    NASA Technical Reports Server (NTRS)

    Des Marais, David J. (Editor)

    1997-01-01

    This workshop explored the key questions and challenges associated with detecting life on an extrasolar planet. The final product will be a NASA Conference Publication which includes the abstracts from 21 talks, summaries of key findings, and recommendations for future research. The workshop included sessions on three related topics: the biogeochemistry of biogenic gases in the atmosphere, the chemistry and spectroscopy of planetary atmospheres, and the remote sensing of planetary atmospheres and surfaces. With the observation that planetary formation is probably a common phenomenon, together with the advent of the technical capability to locate and describe extrasolar planets, this research area indeed has an exciting future.

  16. Recent Results on "Approximations to Optimal Alarm Systems for Anomaly Detection"

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2009-01-01

    An optimal alarm system and its approximations may use Kalman filtering for univariate linear dynamic systems driven by Gaussian noise to provide a layer of predictive capability. Predicted Kalman filter future process values and a fixed critical threshold can be used to construct a candidate level-crossing event over a predetermined prediction window. An optimal alarm system can be designed to elicit the fewest false alarms for a fixed detection probability in this particular scenario.
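
    For intuition, the level-crossing probability over the prediction window can be approximated from the Kalman filter's predicted means and variances. The sketch below treats per-step crossings as independent, which the optimal alarm formulation handles more carefully; it is an approximation, not the paper's exact computation:

    ```python
    import numpy as np
    from scipy.stats import norm

    def crossing_prob(pred_means, pred_vars, threshold):
        """Approximate P(process exceeds threshold at least once in the
        prediction window), treating steps as independent rather than
        using the exact joint-Gaussian level-crossing probability."""
        z = (threshold - np.asarray(pred_means)) / np.sqrt(pred_vars)
        return 1.0 - norm.cdf(z).prod()

    # hypothetical 5-step-ahead Kalman predictions and a fixed critical threshold
    print(crossing_prob([0.2, 0.5, 0.9, 1.1, 1.0], [0.10, 0.15, 0.20, 0.30, 0.40], 2.0))
    ```

    An alarm would then be raised whenever this probability exceeds a design level chosen to fix the detection probability while minimizing false alarms.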

  17. Diagnostic imaging applications; Proceedings of the Meeting, Amsterdam, Netherlands, October 8, 9, 1984

    NASA Technical Reports Server (NTRS)

    Beckenbach, E. S. (Editor)

    1984-01-01

    It is more important than ever that engineers have an understanding of the future needs of clinical and research medicine, and that physicians know something about probable future developments in instrumentation capabilities. Only by maintaining such a dialog can the most effective application of technological advances to medicine be achieved. This workshop attempted to provide this kind of information transfer in the limited field of diagnostic imaging. Biomedical research at the Jet Propulsion Laboratory is discussed, taking into account imaging results from space exploration missions, as well as biomedical research tasks based on these technologies. Attention is also given to current and future indications for magnetic resonance in medicine, high-speed quantitative digital microscopy, computer processing of radiographic images, computed tomography and its modern applications, positron emission tomography, and developments related to medical ultrasound.

  18. The Hurricane-Flood-Landslide Continuum

    NASA Technical Reports Server (NTRS)

    Negri, Andrew J.; Burkardt, Nina; Golden, Joseph H.; Halverson, Jeffrey B.; Huffman, George J.; Larsen, Matthew C.; McGinley, John A.; Updike, Randall G.; Verdin, James P.; Wieczorek, Gerald F.

    2005-01-01

    In August 2004, representatives from NOAA, NASA, the USGS, and other government agencies convened in San Juan, Puerto Rico, for a workshop to discuss a proposed research project called the Hurricane-Flood-Landslide Continuum (HFLC). The essence of the HFLC is to develop and integrate tools across disciplines to enable the issuance of regional guidance products for floods and landslides associated with major tropical rain systems, with sufficient lead time that local emergency managers can protect vulnerable populations and infrastructure. All three lead agencies are independently developing precipitation-flood-debris flow forecasting technologies, and all have a history of work on natural hazards both domestically and overseas. NOAA has the capability to provide tracking and prediction of storm rainfall, trajectory, and landfall, and is developing flood probability and magnitude capabilities. The USGS has the capability to evaluate the ambient stability of natural and man-made landforms, to assess landslide susceptibilities for those landforms, and to establish probabilities for initiation of landslides and debris flows. Additionally, the USGS has well-developed operational capacity for real-time monitoring and reporting of streamflow across distributed networks of automated gaging stations (http://water.usgs.gov/waterwatch/). NASA has the capability to provide sophisticated algorithms for satellite remote sensing of precipitation, land use, and, in the future, soil moisture. The Workshop sought to initiate discussion among the three agencies regarding their specific and highly complementary capabilities. The fundamental goal of the Workshop was to establish a framework that will leverage the strengths of each agency. Once a prototype system is developed, for example, in relatively data-rich Puerto Rico, it could be adapted for use in data-poor, low-infrastructure regions such as the Dominican Republic or Haiti. This paper provides an overview of the Workshop's goals, presentations, and recommendations with respect to the development of the HFLC.

  19. The United States should forego a damage-limitation capability against China

    NASA Astrophysics Data System (ADS)

    Glaser, Charles L.

    2017-11-01

    Bottom Lines • THE KEY STRATEGIC NUCLEAR CHOICE. Whether to attempt to preserve its damage-limitation capability against China is the key strategic nuclear choice facing the United States. The answer is much less clear-cut than when the United States faced the Soviet Union during the Cold War. • FEASIBILITY OF DAMAGE LIMITATION. Although technology has advanced significantly over the past three decades, future military competition between U.S. and Chinese forces will favor large-scale nuclear retaliation over significant damage limitation. • BENEFITS AND RISKS OF A DAMAGE-LIMITATION CAPABILITY. The benefits provided by a modest damage-limitation capability would be small, because the United States can meet its most important regional deterrent requirements without one. In comparison, the risks, which include an increased probability of accidental and unauthorized Chinese attacks, as well as strained U.S.-China relations, would be large. • FOREGO DAMAGE LIMITATION. These twin findings—the poor prospects for prevailing in the military competition, and the small benefits and likely overall decrease in U.S. security—call for a U.S. policy that foregoes efforts to preserve or enhance its damage-limitation capability.

  20. LEGEND, a LEO-to-GEO Environment Debris Model

    NASA Technical Reports Server (NTRS)

    Liou, Jer Chyi; Hall, Doyle T.

    2013-01-01

    LEGEND (LEO-to-GEO Environment Debris model) is a three-dimensional orbital debris evolutionary model that is capable of simulating the historical and future debris populations in the near-Earth environment. The historical component in LEGEND adopts a deterministic approach to mimic the known historical populations. Launched rocket bodies, spacecraft, and mission-related debris (rings, bolts, etc.) are added to the simulated environment. Known historical breakup events are reproduced, and fragments down to 1 mm in size are created. The LEGEND future projection component adopts a Monte Carlo approach and uses an innovative pair-wise collision probability evaluation algorithm to simulate the future breakups and the growth of the debris populations. This algorithm is based on a new "random sampling in time" approach that preserves characteristics of the traditional approach and captures the rapidly changing nature of the orbital debris environment. LEGEND is a Fortran 90-based numerical simulation program. It operates in a UNIX/Linux environment.

  1. Associations among habitat characteristics and meningeal worm prevalence in eastern South Dakota, USA

    USGS Publications Warehouse

    Jacques, Christopher N.; Jenks, Jonathan A.; Klaver, Robert W.; Dubay, Shelli A.

    2017-01-01

    Few studies have evaluated how wetland and forest characteristics influence the prevalence of meningeal worm (Parelaphostrongylus tenuis) infection of deer throughout the grassland biome of central North America. We used previously collected, county-level prevalence data to evaluate associations between habitat characteristics and probability of meningeal worm infection in white-tailed deer (Odocoileus virginianus) across eastern South Dakota, US. The highest-ranked binomial regression model for detecting probability of meningeal worm infection was spring temperature + summer precipitation + percent wetland; weight of evidence (wi = 0.71) favored this model over alternative models, though predictive capability was low (receiver operating characteristic = 0.62). Probability of meningeal worm infection increased by 1.3- and 1.6-fold for each 1-cm and 1-°C increase in summer precipitation and spring temperature, respectively. Similarly, probability of infection increased 1.2-fold for each 1% increase in wetland habitat. Our findings highlight the importance of wetland habitat in predicting meningeal worm infection across eastern South Dakota. Future research is warranted to evaluate the relationships between climatic conditions (e.g., drought, wet cycles) and deer habitat selection in maintaining P. tenuis along the western boundary of the parasite's range.

  2. Setting priorities for research on pollution reduction functions of agricultural buffers.

    PubMed

    Dosskey, Michael G

    2002-11-01

    The success of buffer installation initiatives and programs to reduce nonpoint source pollution of streams on agricultural lands will depend on the ability of local planners to locate and design buffers for specific circumstances with substantial and predictable results. Current predictive capabilities are inadequate, and major sources of uncertainty remain. An assessment of these uncertainties cautions that there is greater risk of overestimating buffer impact than underestimating it. Priorities for future research are proposed that will lead more quickly to major advances in predictive capabilities. Highest priority is given to work on the surface runoff filtration function, which is almost universally important to the amount of pollution reduction expected from buffer installation and for which major sources of uncertainty remain in predicting the level of impact. Foremost uncertainties surround the extent and consequences of runoff flow concentration and pollutant accumulation. Other buffer functions, including filtration of groundwater nitrate and stabilization of channel erosion sources of sediments, may be important in some regions. However, uncertainty surrounds our ability to identify and quantify the extent of site conditions where buffer installation can substantially reduce stream pollution in these ways. Deficiencies in predictive models reflect gaps in experimental information as well as in technology to account for spatial heterogeneity of pollutant sources, pathways, and buffer capabilities across watersheds. Since completion of a comprehensive watershed-scale buffer model is probably far off, immediate needs call for simpler techniques to gauge the probable impacts of buffer installation at local scales.

  3. Assessing the Utility of Seasonal SST Forecasts to the Fisheries Management Process: a Pacific Sardine Case Study

    NASA Astrophysics Data System (ADS)

    Tommasi, D.; Stock, C. A.

    2016-02-01

    It is well established that environmental fluctuations affect the productivity of numerous fish stocks. Recent advances in the prediction capability of dynamical global forecast systems, such as the state-of-the-art NOAA Geophysical Fluid Dynamics Laboratory (GFDL) 2.5-FLOR model, allow for climate predictions of fisheries-relevant variables at temporal scales relevant to the fishery management decision-making process. We demonstrate that the GFDL FLOR model produces skillful seasonal SST anomaly predictions over the continental shelf, where most of the global fish yield is generated. The availability of skillful SST projections at this "fishery relevant" scale raises the potential for better-constrained estimates of future fish biomass and improved harvest decisions. We assessed the utility of seasonal SST coastal shelf predictions for fisheries management using the case study of Pacific sardine. This fishery was selected because it is one of the few to already incorporate SST into its harvest guideline, and because it shows a robust recruitment-SST relationship. We quantified the effectiveness of management under the status quo harvest guideline (HG) and under alternative HGs incorporating future information at different levels of uncertainty. Usefulness of forecast SST to management was dependent on forecast uncertainty. If the standard deviation of the SST anomaly forecast residuals was less than 0.65, the alternative HG produced higher long-term yield and stock biomass, and reduced the probability of either catch or stock biomass falling below management-set threshold values as compared to the status quo. By contrast, the probability of biomass falling to extremely low values increased as compared to the status quo for all alternative HGs except for a perfectly known future SST case. To safeguard against such low-probability but costly events, a harvest cutoff biomass also has to be implemented in the HG.

  4. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2015-01-01

    Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD) Manual v.1.2. The capability of an inspection system is established by applications of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology that is implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD is based on directly observed occurrences rather than assumed functional forms. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit/miss and signal amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so that multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.
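
    The 90/95 criterion can be illustrated with the standard one-sided Clopper-Pearson lower bound for hit/miss data — a textbook construction consistent with, though not necessarily identical to, DOEPOD's internals:

    ```python
    from scipy.stats import beta

    def pod_lower_bound(hits, trials, confidence=0.95):
        """One-sided Clopper-Pearson lower confidence bound on POD
        from hit/miss data."""
        if hits == 0:
            return 0.0
        return beta.ppf(1 - confidence, hits, trials - hits + 1)

    # the classic demonstration: 29 hits in 29 trials establishes 90/95 POD
    print(f"{pod_lower_bound(29, 29):.3f}")  # ~0.902 > 0.90
    ```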

  5. The influence of institutional pressures on hospital electronic health record presence.

    PubMed

    Fareed, Naleef; Bazzoli, Gloria J; Farnsworth Mick, Stephen S; Harless, David W

    2015-05-01

    Electronic health records (EHR) are a promising form of health information technology that could help US hospitals improve on their quality of care and costs. During the study period explored (2005-2009), high expectations for EHR diffused across institutional stakeholders in the healthcare environment, which may have pressured hospitals to have EHR capabilities even in the presence of weak technical rationale for the technology. Using an extensive set of organizational theory-specific predictors, this study explored whether five factors - cause, constituents, content, context, and control - that reflect the nature of institutional pressures for EHR capabilities motivated hospitals to comply with these pressures. Using information from several national databases, an ordered probit regression model was estimated. The resulting predicted probabilities of EHR capabilities from the empirical model's estimates were used to test the study's five hypotheses, of which three were supported. When the underlying cause, dependence on constituents, or influence of control were high and potential countervailing forces were low, hospitals were more likely to employ strategic responses that were compliant with the institutional pressures for EHR capabilities. In light of these pressures, hospitals may have acquiesced, by having comprehensive EHR capabilities, or compromised, by having intermediate EHR capabilities, in order to maintain legitimacy in their environment. The study underscores the importance of our assessment for theory and policy development, and provides suggestions for future research.

  6. Solid motor diagnostic instrumentation. [design of self-contained instrumentation

    NASA Technical Reports Server (NTRS)

    Nakamura, Y.; Arens, W. E.; Wuest, W. S.

    1973-01-01

    A review of typical surveillance and monitoring practices followed during the flight phases of representative solid-propellant upper stages and apogee motors was conducted to evaluate the need for improved flight diagnostic instrumentation on future spacecraft. The capabilities of the flight instrumentation package were limited to the detection of whether or not the solid motor was the cause of failure and to the identification of probable primary failure modes. Conceptual designs of self-contained flight instrumentation packages capable of meeting these requirements were generated and their performance, typical cost, and unit characteristics determined. Comparisons of a continuous real-time and a thresholded hybrid design were made on the basis of performance, mass, power, cost, and expected life. The results of this analysis substantiated the feasibility of a self-contained independent flight instrumentation module as well as the existence of performance margins by which to exploit growth option applications.

  7. Evaluation of ultrasonic array imaging algorithms for inspection of a coarse grained material

    NASA Astrophysics Data System (ADS)

    Van Pamel, A.; Lowe, M. J. S.; Brett, C. R.

    2014-02-01

    Improving the ultrasound inspection capability for coarse-grained metals remains of longstanding interest to industry and the NDE research community and is expected to become increasingly important for next-generation power plants. A test sample of coarse-grained Inconel 625 which is representative of future power plant components has been manufactured to test the detectability of different inspection techniques. Conventional ultrasonic A-, B-, and C-scans showed the sample to be extraordinarily difficult to inspect due to its scattering behaviour. However, in recent years, array probes and Full Matrix Capture (FMC) imaging algorithms, which extract the maximum amount of information possible, have unlocked exciting possibilities for improvements. This article proposes a robust methodology to evaluate the detection performance of imaging algorithms, applying it to three FMC imaging algorithms: Total Focusing Method (TFM), Phase Coherent Imaging (PCI), and Decomposition of the Time Reversal Operator with Multiple Scattering (DORT MSF). The methodology considers the statistics of detection, presenting the detection performance as probability of detection (POD) and probability of false alarm (PFA). The data are captured in pulse-echo mode using 64-element array probes at centre frequencies of 1 MHz and 5 MHz. All three algorithms are shown to perform very similarly when comparing their flaw detection capabilities on this particular case.
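
    Of the three algorithms, TFM is the most compact to state: every transmit-receive pair of the FMC dataset is delayed to each image point and summed. A minimal numpy sketch, assuming a contact array on a homogeneous medium and omitting the usual envelope detection via a Hilbert transform:

    ```python
    import numpy as np

    def tfm_image(fmc, elem_x, fs, c, xs, zs):
        """Total Focusing Method: focus FMC data of shape
        (n_el, n_el, n_samples) at every image point by summing each
        tx/rx pair at its round-trip time of flight."""
        n_el = len(elem_x)
        tx, rx = np.meshgrid(range(n_el), range(n_el), indexing="ij")
        img = np.zeros((len(zs), len(xs)))
        for i, z in enumerate(zs):
            for j, x in enumerate(xs):
                d = np.hypot(elem_x - x, z)          # element-to-pixel distances
                t = (d[tx] + d[rx]) / c              # transmit path + receive path
                k = np.clip((t * fs).round().astype(int), 0, fmc.shape[2] - 1)
                img[i, j] = abs(fmc[tx, rx, k].sum())
        return img

    # e.g. img = tfm_image(fmc, elem_x, fs=50e6, c=5900.0, xs=..., zs=...)
    ```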

  8. The Science-Policy Link: Stakeholder Reactions to the Uncertainties of Future Sea Level Rise

    NASA Astrophysics Data System (ADS)

    Plag, H.; Bye, B.

    2011-12-01

    Policy makers and stakeholders in the coastal zone are equally challenged by the risk of an anticipated rise of coastal Local Sea Level (LSL) as a consequence of future global warming. Many low-lying and often densely populated coastal areas are under risk of increased inundation. More than 40% of the global population is living in or near the coastal zone and this fraction is steadily increasing. A rise in LSL will increase the vulnerability of coastal infrastructure and population dramatically, with potentially devastating consequences for the global economy, society, and environment. Policy makers are faced with a trade-off between imposing today the often very high costs of coastal protection and adaptation upon national economies and leaving the costs of potential major disasters to future generations. They are in need of actionable information that provides guidance for the development of coastal zones resilient to future sea level changes. Part of this actionable information comes from risk and vulnerability assessments, which require information on future LSL changes as input. In most cases, a deterministic approach has been applied based on predictions of the plausible range of future LSL trajectories as input. However, there is little consensus in the scientific community on how these trajectories should be determined, and what the boundaries of the plausible range are. Over the last few years, many publications in Science, Nature and other peer-reviewed scientific journals have revealed a broad range of possible futures and significant epistemic uncertainties and gaps concerning LSL changes. Based on the somewhat diffuse science input, policy and decision makers have made rather different choices for mitigation and adaptation in cases such as Venice, The Netherlands, New York City, and the San Francisco Bay area. Replacing the deterministic, prediction-based approach with a statistical one that fully accounts for the uncertainties and epistemic gaps would provide a different kind of science input to policy makers and stakeholders. Like in many other insurance problems (for example, earthquakes), where deterministic predictions are not possible and decisions have to be made on the basis of statistics and probabilities, the statistical approach to coastal resilience would require stakeholders to make decisions on the basis of probabilities instead of predictions. The science input for informed decisions on adaptation would consist of general probabilities of decadal to century scale sea level changes derived from paleo records, including the probabilities for large and rapid rises. Similar to other problems where the appearance of a hazard is associated with a high risk (like a fire in a house), this approach would also require a monitoring and warning system (a "smoke detector") capable of detecting any onset of a rapid sea level rise.

  9. The AIDS epidemic: the spread of a deadly disease in the biotech era.

    PubMed

    Noë, A; Verhofstede, C; Plum, J

    2004-01-01

    In the last two decades we have witnessed the progression of a newly introduced infection in humans. It is sobering that, despite a world-wide effort and tremendous progress in technical capabilities and scientific knowledge, we are still not able to control the global epidemic of HIV. In 2004 more than 40 million people were infected. Educational approaches to modifying risk-taking behavior are still the most critical component of prevention and the most important measure to limit the spread of the infection. Vaccine development, which is still far from promising, is probably the only way to control the disease in the future.

  10. Training in Small Business Retailing: Testing Human Capital Theory.

    ERIC Educational Resources Information Center

    Barcala, Marta Fernandez; Perez, Maria Jose Sanzo; Gutierrez, Juan Antonio Trespalacios

    1999-01-01

    Looks at four models of training demand: (1) probability of attending training in the near future; (2) probability of having attended training in the past; (3) probability of being willing to follow multimedia and correspondence courses; and (4) probability of repeating the experience of attending another training course in the near future.…

  11. Assessing risk based on uncertain avalanche activity patterns

    NASA Astrophysics Data System (ADS)

    Zeidler, Antonia; Fromm, Reinhard

    2015-04-01

    Avalanches may affect critical infrastructure and may cause great economic losses. The planning horizon of infrastructures, e.g. hydropower generation facilities, reaches well into the future. Based on the results of previous studies on the effect of changing meteorological parameters (precipitation, temperature) on avalanche activity, we assume that the risk pattern will change in future. Decision makers need to understand what the future might bring to best formulate their mitigation strategies. Therefore, we explore commercial risk software to calculate risk for the coming years that might help in decision processes. The software @RISK is known to many larger companies, and we therefore explore its capability to include avalanche risk simulations in order to guarantee comparability of different risks. In a first step, we develop a model for a hydropower generation facility that reflects the problem of changing avalanche activity patterns in future by selecting relevant input parameters and assigning likely probability distributions. The uncertain input variables include the probability of avalanches affecting an object, the vulnerability of an object, the expected costs for repairing the object, and the expected cost due to interruption. The crux is to find the distribution that best represents the input variables under changing meteorological conditions. Our focus is on including the uncertain probability of avalanches based on the analysis of past avalanche data and expert knowledge. In order to explore different likely outcomes, we base the analysis on three different climate scenarios (likely, worst case, baseline). For some variables, it is possible to fit a distribution to historical data, whereas in cases where the past dataset is insufficient or not available the software allows selection from over 30 different distribution types. The Monte Carlo simulation samples the probability distributions of the uncertain variables across all valid combinations of input values to simulate the range of possible outcomes. In our case the output is the expected risk (Euro/year) for each object (e.g. water intake) considered and for the entire hydropower generation system. The output is again a distribution that is interpreted by the decision makers; the final strategy depends on the needs and requirements of the end-user, which may be driven by personal preferences. In this presentation, we will show how we used uncertain information on future avalanche activity in commercial risk software, thereby bringing the knowledge of natural hazard experts to decision makers.
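
    The kind of calculation @RISK automates can be sketched outside the tool as plain Monte Carlo. Every distribution and cost figure below is a placeholder assumption, not a value from the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000  # Monte Carlo trials, one simulated year each

    # Illustrative assumptions for one object (e.g., a water intake):
    p_hit = rng.triangular(0.01, 0.03, 0.08, N)        # uncertain annual hit probability
    hit = rng.random(N) < p_hit                         # does an avalanche reach the object?
    vulnerability = rng.beta(2, 5, N)                   # damage fraction given a hit
    repair_cost = 2.0e6 * vulnerability                 # EUR, assumed replacement value
    interruption = rng.lognormal(np.log(3e5), 0.5, N)   # EUR, generation-outage losses

    loss = np.where(hit, repair_cost + interruption, 0.0)
    print(f"expected annual loss ~ EUR {loss.mean():,.0f}")
    print(f"95th percentile loss ~ EUR {np.percentile(loss, 95):,.0f}")
    ```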

  12. Cache-enabled small cell networks: modeling and tradeoffs.

    PubMed

    Baştuğ, Ejder; Bennis, Mehdi; Kountouris, Marios; Debbah, Mérouane

    We consider a network model where small base stations (SBSs) have caching capabilities as a means to alleviate the backhaul load and satisfy users' demand. The SBSs are stochastically distributed over the plane according to a Poisson point process (PPP) and serve their users either (i) by bringing the content from the Internet through a finite rate backhaul or (ii) by serving them from the local caches. We derive closed-form expressions for the outage probability and the average delivery rate as a function of the signal-to-interference-plus-noise ratio (SINR), SBS density, target file bitrate, storage size, file length, and file popularity. We then analyze the impact of key operating parameters on the system performance. It is shown that a certain outage probability can be achieved either by increasing the number of base stations or the total storage size. Our results and analysis provide key insights into the deployment of cache-enabled small cell networks (SCNs), which are seen as a promising solution for future heterogeneous cellular networks.
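
    The SINR/outage model lends itself to a quick Monte Carlo cross-check of any closed form. The sketch below drops SBSs as a PPP in a disk, serves the user at the origin from the nearest SBS under Rayleigh fading, and estimates outage; all parameter values are illustrative, not the paper's:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def outage_prob(density, radius, sinr_th_db, alpha=4.0, noise=1e-9, trials=20_000):
        """Monte Carlo outage probability at a user at the origin, served by
        the nearest SBS of a PPP (intensity `density` per m^2) and interfered
        by all others; Rayleigh fading, path-loss exponent alpha."""
        th = 10 ** (sinr_th_db / 10.0)
        outages = 0
        for _ in range(trials):
            n = rng.poisson(density * np.pi * radius**2)
            if n == 0:
                outages += 1
                continue
            r = radius * np.sqrt(rng.random(n))          # distances, uniform in the disk
            rx = rng.exponential(1.0, n) * r ** (-alpha)  # faded received powers
            k = np.argmin(r)                              # nearest SBS serves the user
            sinr = rx[k] / (rx.sum() - rx[k] + noise)
            outages += sinr < th
        return outages / trials

    print(outage_prob(density=1e-4, radius=500.0, sinr_th_db=0.0))
    ```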

  13. Total Probability of Collision as a Metric for Finite Conjunction Assessment and Collision Risk Management

    NASA Technical Reports Server (NTRS)

    Frigm, Ryan C.; Hejduk, Matthew D.; Johnson, Lauren C.; Plakalovic, Dragan

    2015-01-01

    On-orbit collision risk is becoming an increasing mission risk to all operational satellites in Earth orbit. Managing this risk can be disruptive to mission and operations, present challenges for decision-makers, and is time-consuming for all parties involved. With the planned capability improvements in detecting and tracking smaller orbital debris and capacity improvements to routinely predict on-orbit conjunctions, this mission risk will continue to grow in terms of likelihood and effort. It is a very real possibility that the future space environment will not allow collision risk management and mission operations to be conducted in the same manner as they are today. This paper presents the concept of a finite conjunction assessment - one where each discrete conjunction is not treated separately but, rather, as part of a continuous event that must be managed concurrently. The paper also introduces the Total Probability of Collision as an analogous metric for finite conjunction assessment operations and provides several options for its usage in a Concept of Operations.
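
    Under an independence assumption, a total probability of collision across a finite set of conjunctions reduces to the complement of the product of the per-event survival probabilities; the Pc values below are hypothetical:

    ```python
    import numpy as np

    def total_pc(pc_list):
        """P(at least one collision) across a set of conjunctions,
        assuming the individual events are independent:
        1 - prod(1 - Pc_i)."""
        return 1.0 - np.prod(1.0 - np.asarray(pc_list))

    # hypothetical Pc values for five upcoming conjunctions
    print(f"{total_pc([1e-5, 4e-5, 2.5e-4, 7e-6, 1e-4]):.2e}")  # ~4.1e-04, near the simple sum for small Pc
    ```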

  14. 2014 SRNL LDRD Annual Report, Rev. 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mcwhorter, S.

    2015-03-15

    Laboratory Directed Research and Development is a congressionally authorized program that provides the ‘innovation inspiration’ from which many of the Laboratory’s multi-discipline advancements are made in both science and engineering technology. The program is the backbone for ensuring that scientific, technical and engineering capabilities can meet current and future needs. It is an important tool in reducing the probability of technological surprise by allowing laboratory technical staff room to innovate and keep abreast of scientific breakthroughs. Drawing from the synergism among the EM and NNSA missions, and work from other federal agencies, ensures that LDRD is the key element in maintaining the vitality of SRNL’s technical programs. The LDRD program aims to position the Laboratory for new business in clean energy, national security, nuclear materials management and environmental stewardship by leveraging the unique capabilities of the Laboratory to yield foundational scientific research in core business areas, while aligning with SRS strategic initiatives and maintaining a vision for ultimate DOE applications.

  15. Symbolic inversion of control relationships in model-based expert systems

    NASA Technical Reports Server (NTRS)

    Thomas, Stan

    1988-01-01

    Symbolic inversion is examined from several perspectives. First, a number of symbolic algebra and mathematical tool packages were studied in order to evaluate their capabilities and methods, specifically with respect to symbolic inversion. Second, the KATE system (without hardware interface) was ported to a Zenith Z-248 microcomputer running Golden Common Lisp. The interesting thing about the port is that it allows the user to have measurements vary and components fail in a non-deterministic manner, based upon random values drawn from probability distributions. Third, INVERT was studied as currently implemented in KATE, its operation documented, some of its weaknesses identified, and corrections made to it. The corrections and enhancements are primarily in the way that logical conditions involving AND's and OR's and inequalities are processed. In addition, the capability to handle equalities was also added. Suggestions were also made regarding the handling of ranges in INVERT. Last, other approaches to the inversion process were studied and recommendations were made as to how future versions of KATE should perform symbolic inversion.

  16. Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.

    PubMed

    Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak

    2018-02-01

    The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as limitations of available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified resources, tools, and skillsets required, as well as associated criticality scores of the same, to meet terrestrial, U.S.-specific ideal medical solutions for conditions concerning for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future version development, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions.Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.

  17. A Conceptual Methodology for Assessing Acquisition Requirements Robustness against Technology Uncertainties

    NASA Astrophysics Data System (ADS)

    Chou, Shuo-Ju

    2011-12-01

    In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies and the noise variables are tactics and scenarios that will be associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and in program development budget and schedule requirements. Unfortunately, the Technology Readiness Assessment process currently used by acquisition program managers and decision-makers to measure technology uncertainty during critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe program technology elements' uncertainties can provide only a qualitative and nondescript estimate of the technology uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. This process should provide acquisition program managers and decision-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties and subsequent impacts on capability, budget, and schedule requirements resulted in the conclusion that an analysis process that coupled a probabilistic analysis technique such as Monte Carlo Simulations with quantitative and parametric models of technology performance impact and technology development time and cost requirements would allow the probabilities of meeting specific constraints of these requirements to be established. These probability-of-requirements-success metrics can then be used as a quantitative and probabilistic measure of program requirements robustness against technology uncertainties. Combined with a Multi-Objective Genetic Algorithm optimization process and a computer-based Decision Support System, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition, as well as formulation of program development and risk management strategies. 
To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques to provide a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties. In order to demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of this method was performed on a notional program for acquiring the Carrier-based Suppression of Enemy Air Defenses (SEAD) capability using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights regarding the benefits and inner workings of this methodology as well as its limitations that should be addressed in the future to narrow the gap between the current state and the desired state.
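
    A stripped-down version of the Monte Carlo requirements-robustness calculation at the heart of such a methodology might look as follows; the triangular uncertainty models and requirement thresholds are invented for illustration and are not taken from the SEAD/UCAS demonstration:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N = 100_000

    # Hypothetical technology-uncertainty models (triangular: low, mode, high)
    range_nmi  = rng.triangular(480, 600, 660, N)   # capability driven by tech performance
    dev_cost   = rng.triangular(0.9, 1.2, 2.0, N)   # development cost, $B
    dev_months = rng.triangular(30, 42, 72, N)      # development schedule, months

    # Assumed requirement thresholds
    ok = (range_nmi >= 550) & (dev_cost <= 1.5) & (dev_months <= 60)
    print(f"P(meet capability req.) ~ {np.mean(range_nmi >= 550):.2f}")
    print(f"P(meet budget req.)     ~ {np.mean(dev_cost <= 1.5):.2f}")
    print(f"P(meet schedule req.)   ~ {np.mean(dev_months <= 60):.2f}")
    print(f"P(all requirements met) ~ {ok.mean():.2f}  # requirements-robustness metric")
    ```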

  18. USAF Cyber Capability Development: A Vision for Future Cyber Warfare & a Concept for Education of Cyberspace Leaders

    DTIC Science & Technology

    2009-04-01

    Significant and interrelated problems are hindering the Air Force’s development of cyber warfare capabilities. The first is a lack of awareness about...why the AF has chosen to take cyber warfare on as a core capability on par with air and space. The second stems from the lack of a commonly...the cyber capabilities needed in the future? The contributions of this research include a strategic vision for future cyber warfare capabilities that

  19. The Everett-Wheeler interpretation and the open future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sudbery, Anthony

    2011-03-28

    I discuss the meaning of probability in the Everett-Wheeler interpretation of quantum mechanics, together with the problem of defining histories. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. In short, probability is degree of truth. These ideas relate to traditional naive ideas of time and chance. Indeed, I argue that Everettian quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.

  20. Cultural Differences in Young Adults' Perceptions of the Probability of Future Family Life Events.

    PubMed

    Speirs, Calandra; Huang, Vivian; Konnert, Candace

    2017-09-01

    Most young adults are exposed to family caregiving; however, little is known about their perceptions of their future caregiving activities such as the probability of becoming a caregiver for their parents or providing assistance in relocating to a nursing home. This study examined the perceived probability of these events among 182 young adults and the following predictors of their probability ratings: gender, ethnicity, work or volunteer experience, experiences with caregiving and nursing homes, expectations about these transitions, and filial piety. Results indicated that Asian or South Asian participants rated the probability of being a caregiver as significantly higher than Caucasian participants, and the probability of placing a parent in a nursing home as significantly lower. Filial piety was the strongest predictor of the probability of these life events, and it mediated the relationship between ethnicity and probability ratings. These findings indicate the significant role of filial piety in shaping perceptions of future life events.

  1. Assessment and Mission Planning Capability For Quantitative Aerothermodynamic Flight Measurements Using Remote Imaging

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas; Splinter, Scott; Daryabeigi, Kamran; Wood, William; Schwartz, Richard; Ross, Martin

    2008-01-01

    High resolution calibrated infrared imagery of vehicles during hypervelocity atmospheric entry or sustained hypersonic cruise has the potential to provide flight data on the distribution of surface temperature and the state of the airflow over the vehicle. In the early 1980s NASA sought to obtain high spatial resolution infrared imagery of the Shuttle during entry. Despite mission execution with a technically rigorous pre-planning capability, the single airborne optical system for this attempt was considered developmental and the scientific return was marginal. In 2005 the Space Shuttle Program again sponsored an effort to obtain imagery of the Orbiter. Imaging requirements were targeted towards Shuttle ascent; companion requirements for entry did not exist. The engineering community was allowed to define observation goals and incrementally demonstrate key elements of a quantitative spatially resolved measurement capability over a series of flights. These imaging opportunities were extremely beneficial and clearly demonstrated the capability to capture infrared imagery with mature and operational assets of the US Navy and the Missile Defense Agency. While successful, the usefulness of the imagery was, from an engineering perspective, limited. These limitations were mainly associated with uncertainties regarding operational aspects of data acquisition. These uncertainties, in turn, came about because of limited pre-flight mission planning capability and a poor understanding of several factors, including the infrared signature of the Shuttle, optical hardware limitations, atmospheric effects, and detector response characteristics. Operational details of sensor configuration, such as detector integration time and tracking system algorithms, were carried out ad hoc (best practices), which led to a low probability of target acquisition and to detector saturation. Leveraging the qualified success during Return-to-Flight, the NASA Engineering and Safety Center sponsored an assessment study focused on increasing the probability of returning spatially resolved scientific/engineering thermal imagery. This paper provides an overview of the assessment task and the systematic approach designed to establish confidence in the ability of existing assets to reliably acquire, track, and return global quantitative surface temperatures of the Shuttle during entry. A discussion of capability demonstration in support of a potential Shuttle boundary layer transition flight test is presented. Successful demonstration of a quantitative, spatially resolved, global temperature measurement on the proposed Shuttle boundary layer transition flight test could lead to potential future applications with hypersonic flight test programs within the USAF and DARPA, along with flight test opportunities supporting NASA's Project Constellation.

  2. Evaluating detection and estimation capabilities of magnetometer-based vehicle sensors

    NASA Astrophysics Data System (ADS)

    Slater, David M.; Jacyna, Garry M.

    2013-05-01

    In an effort to secure the northern and southern United States borders, MITRE has been tasked with developing Modeling and Simulation (M&S) tools that accurately capture the mapping between algorithm-level Measures of Performance (MOP) and system-level Measures of Effectiveness (MOE) for current/future surveillance systems deployed by the Customs and Border Protection Office of Technology Innovations and Acquisitions (OTIA). This analysis is part of a larger M&S undertaking. The focus is on two MOPs for magnetometer-based Unattended Ground Sensors (UGS). UGS are placed near roads to detect passing vehicles and estimate properties of the vehicle's trajectory such as bearing and speed. The first MOP considered is the probability of detection. We derive probabilities of detection for a network of sensors over an arbitrary number of observation periods and explore how the probability of detection changes when multiple sensors are employed. The performance of UGS is also evaluated based on the level of variance in the estimation of trajectory parameters. We derive the Cramer-Rao bounds for the variances of the estimated parameters in two cases: when no a priori information is known and when the parameters are assumed to be Gaussian with known variances. Sample results show that UGS perform significantly better in the latter case.
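
    The first MOP admits a compact sketch if one assumes each sensor-period trial is an independent Bernoulli event with a common single-sensor detection probability (a simplification of the paper's derivation):

```python
def network_detection_probability(p_single: float, n_sensors: int,
                                  n_periods: int = 1) -> float:
    """Probability that at least one of n independent sensors detects a
    passing vehicle in at least one observation period, assuming every
    sensor-period trial is independent with probability p_single."""
    return 1.0 - (1.0 - p_single) ** (n_sensors * n_periods)

# Example: three magnetometers, each with a 0.6 per-pass detection
# probability, observed over two periods.
print(network_detection_probability(0.6, n_sensors=3, n_periods=2))  # 0.995904
```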

  3. Systems identification and application systems development for monitoring the physiological and health status of crewmen in space

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.; Furukawa, S.; Vannordstrand, P. C.

    1975-01-01

    The use of automated, analytical techniques to aid medical support teams is suggested. Recommendations are presented for characterizing crew health in terms of: (1) whole-body function including physiological, psychological and performance factors; (2) a combination of critical performance indexes which consist of multiple factors of measurable parameters; (3) specific responses to low noise level stress tests; and (4) probabilities of future performance based on present and periodic examination of past performance. A concept is proposed for a computerized real time biomedical monitoring and health care system that would have the capability to integrate monitored data, detect off-nominal conditions based on current knowledge of spaceflight responses, predict future health status, and assist in diagnosis and alternative therapies. Mathematical models could play an important role in this approach, especially when operating in a real time mode. Recommendations are presented to update the present health monitoring systems in terms of recent advances in computer technology and biomedical monitoring systems.

  4. Developing our leaders in the future.

    PubMed

    Hackett, M; Spurgeon, P

    1998-01-01

    The role of the chief executive in a transformed organisation is an extremely challenging one. The development of vision, building a commitment to it and communicating it constantly are key skills for a chief executive. However, the need to build and empower the stakeholders within and outside the organisation to support the changes required to deliver the vision requires leaders who can connect with a wide range of people and build alliances and partnerships to secure organisational success. A passion for understanding human intervention and behaviour is needed to encourage, cajole and drive teams and individuals to own and commit to change and a new direction. This requires leaders who have imagination and creativity--who seek connections and thread them together to create order out of incoherence. These skills are not taught in schools or textbooks, but are probably innate. They are what separate leaders from the rest. These skills need to be developed. A movement towards encouraging experimentation, career transfers and more individuality is needed if capable leaders of the future are to appear.

  5. Bayesics

    NASA Astrophysics Data System (ADS)

    Skilling, John

    2005-11-01

    This tutorial gives a basic overview of Bayesian methodology, from its axiomatic foundation through the conventional development of data analysis and model selection to its rôle in quantum mechanics, and ending with some comments on inference in general human affairs. The central theme is that probability calculus is the unique language within which we can develop models of our surroundings that have predictive capability. These models are patterns of belief; there is no need to claim external reality. Contents: 1. Logic and probability; 2. Probability and inference; 3. Probability and model selection; 4. Prior probabilities; 5. Probability and frequency; 6. Probability and quantum mechanics; 7. Probability and fundamentalism; 8. Probability and deception; 9. Prediction and truth.
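
    Since the tutorial's central theme is probability calculus as the language of predictive models, a minimal discrete Bayesian update may help anchor it; the coin-model example is ours, not the tutorial's:

```python
def bayes_update(prior, likelihood):
    """Discrete Bayes rule: posterior is proportional to prior times likelihood."""
    unnormalised = [p * l for p, l in zip(prior, likelihood)]
    evidence = sum(unnormalised)  # normalising constant
    return [u / evidence for u in unnormalised]

# Two models of a coin: fair, or biased with P(heads) = 0.8.
prior = [0.5, 0.5]
likelihood_heads = [0.5, 0.8]
print(bayes_update(prior, likelihood_heads))  # [0.3846..., 0.6153...]
```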

  6. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    PubMed

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear, correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  7. Joint chemical agent detector (JCAD): the future of chemical agent detection

    NASA Astrophysics Data System (ADS)

    Laljer, Charles E.

    2003-08-01

    The Joint Chemical Agent Detector (JCAD) has continued development through 2002. The JCAD has completed Contractor Validation Testing (CVT) that included chemical warfare agent testing, environmental testing, electromagnetic interference testing, and platform integration validation. The JCAD provides state-of-the-art chemical warfare agent detection capability to military and homeland security operators. Intelligence sources estimate that over twenty countries have active chemical weapons programs. The spread of weapons of mass destruction (and the industrial capability for manufacture of these weapons) to third world nations and terrorist organizations has greatly increased the chemical agent threat to U.S. interests. Coupled with the potential for U.S. involvement in localized conflicts in an operational or support capacity, this increases the probability that the military Joint Services may encounter chemical agents anywhere in the world. The JCAD is a small (45 in³), lightweight (2 lb.) chemical agent detector for vehicle interiors, aircraft, individual personnel, shipboard, and fixed site locations. The system provides a common detection component across multi-service platforms. This common detector system will allow the Joint Services to use the same operational and support concept for more efficient utilization of resources. The JCAD detects, identifies, quantifies, and warns of the presence of chemical agents prior to onset of miosis. Upon detection of chemical agents, the detector provides local and remote audible and visual alarms to the operators. Advance warning will provide the vehicle crew and other personnel in the local area with the time necessary to protect themselves from the lethal effects of chemical agents. The JCAD is capable of being upgraded to protect against future chemical agent threats. The JCAD provides the operator with the warning necessary to survive and fight in a chemical warfare agent threat environment.

  8. Intelligent systems for the autonomous exploration of Titan and Enceladus

    NASA Astrophysics Data System (ADS)

    Furfaro, Roberto; Lunine, Jonathan I.; Kargel, Jeffrey S.; Fink, Wolfgang

    2008-04-01

    Future planetary exploration of the outer satellites of the Solar System will require higher levels of onboard automation, including autonomous determination of sites where the probability of significant scientific findings is highest. Generally, the level of needed automation is heavily influenced by the distance between Earth and the robotic explorer(s) (e.g. spacecraft(s), rover(s), and balloon(s)). Therefore, planning missions to the outer satellites mandates the analysis, design and integration within the mission architecture of semi- and/or completely autonomous intelligence systems. Such systems should (1) include software packages that enable fully automated and comprehensive identification, characterization, and quantification of feature information within an operational region, with subsequent target prioritization and selection for close-up reexamination; and (2) integrate existing information with acquired, "in transit" spatial and temporal sensor data to automatically perform intelligent planetary reconnaissance, which includes identification of sites with the highest potential to yield significant geological and astrobiological information. In this paper we review and compare some of the available Artificial Intelligence (AI) schemes and their adaptation to the problem of designing expert systems for onboard autonomous science to be performed in the course of outer satellite exploration. More specifically, the proposed fuzzy-logic framework is analyzed in some detail to show the effectiveness of such a scheme when applied to the problem of designing expert systems capable of identifying and further exploring regions on Titan and/or Enceladus that have the highest potential to yield evidence for past or present life. Based on available information (e.g., Cassini data), the current knowledge and understanding of the Titan and Enceladus environments is evaluated to define a path for the design of a fuzzy-based system capable of reasoning over collected data and of providing the inference required to autonomously optimize future outer satellite explorations.
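
    To make the fuzzy-logic approach concrete, here is a toy site-prioritization rule in Python. The membership functions, the inputs (a normalised methane signal and surface roughness), and the single min-rule are illustrative assumptions, not the expert system the authors propose:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return float(np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0))

def site_priority(methane_signal, surface_roughness):
    """Fuzzy rule: IF methane is high AND terrain is smooth THEN priority is high.
    The min operator implements the fuzzy AND."""
    high_methane = tri(methane_signal, 0.4, 1.0, 1.6)
    smooth = tri(surface_roughness, -0.6, 0.0, 0.6)
    return min(high_methane, smooth)

print(site_priority(0.9, 0.2))  # ~0.67: a promising but imperfect site
```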

  9. Artificial neural networks in gynaecological diseases: current and potential future applications.

    PubMed

    Siristatidis, Charalampos S; Chrelias, Charalampos; Pouliakis, Abraham; Katsimanis, Evangelos; Kassanos, Dimitrios

    2010-10-01

    Current (and probably future) practice of medicine is mostly associated with prediction and accurate diagnosis. Especially in clinical practice, there is an increasing interest in constructing and using valid models of diagnosis and prediction. Artificial neural networks (ANNs) are mathematical systems being used as a prospective tool for reliable, flexible and quick assessment. They demonstrate high power in evaluating multifactorial data, assimilating information from multiple sources and detecting subtle and complex patterns. Their capability and difference from other statistical techniques lies in performing nonlinear statistical modelling. They represent a new alternative to logistic regression, which is the most commonly used method for developing predictive models for outcomes resulting from partitioning in medicine. In combination with the other non-algorithmic artificial intelligence techniques, they provide useful software engineering tools for the development of systems in quantitative medicine. Our paper first presents a brief introduction to ANNs, then, using what we consider the best available evidence through paradigms, we evaluate the ability of these networks to serve as first-line detection and prediction techniques in some of the most crucial fields in gynaecology. Finally, through the analysis of their current application, we explore their dynamics for future use.
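
    The contrast with logistic regression can be sketched on synthetic data. The example below uses scikit-learn with invented two-dimensional features, so it illustrates the nonlinear-modelling point rather than any clinical result:

```python
from sklearn.datasets import make_moons
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic, non-linearly separable two-class data (two features).
X, y = make_moons(n_samples=600, noise=0.25, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

linear = LogisticRegression().fit(Xtr, ytr)
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(Xtr, ytr)

# The ANN's nonlinear decision boundary typically wins on such data.
print("logistic regression accuracy:", linear.score(Xte, yte))
print("neural network accuracy:     ", ann.score(Xte, yte))
```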

  10. The Science Advantage of a Redder Filter for WFIRST

    NASA Astrophysics Data System (ADS)

    Bauer, James; Stauffer, John; Milam, Stefanie N.; Holler, Bryan J.

    2018-01-01

    WFIRST will be capable of providing Hubble-quality imaging performance over several thousand square degrees of the sky. The wide-area, high spatial resolution survey data from WFIRST will probably be unsurpassed for many decades into the future. With the current baseline design, the WFIRST filter complement will extend from the bluest wavelength allowed by the optical design to a reddest filter (F184W) that has a red cutoff at 2.0 microns. Extension of the imaging capabilities even slightly beyond the 2.0 micron wavelength cut-off would provide significant advantages over the presently proposed science for objects both near and far. The inclusion of a Ks (2.0-2.3 micron) filter would result in a wider range and more comprehensive set of Solar System investigations. It would also extend the range of higher-redshift population studies. In this poster, we outline some of the science advantages of adding a K filter, similar in bandpass to the 2MASS Ks filter, in order to extend the wavelength range for WFIRST as far to the red as the thermal performance of the spacecraft allows.

  11. Target Detection Routine (TADER). User’s Guide.

    DTIC Science & Technology

    1987-09-01

    o System range capability subset (one record - omitted for standoff SLAR and penetrating system)
    o System inherent detection probability subset (IELT records, i.e., one per element type)
    o System capability modifier subset/A=1, E=1 (IELT records)
    o System capability modifier subset/A=1, E=2 (IELT records)
    o System capability modifier subset/A=2, E=1 (IELT records)
    o System capability modifier subset/A=2, E=2 (IELT records)
    Unit Data Set (one set

  12. Relationship between Future Time Orientation and Item Nonresponse on Subjective Probability Questions: A Cross-Cultural Analysis.

    PubMed

    Lee, Sunghee; Liu, Mingnan; Hu, Mengyao

    2017-06-01

    Time orientation is an unconscious yet fundamental cognitive process that provides a framework for organizing personal experiences in temporal categories of past, present and future, reflecting the relative emphasis given to these categories. Culture lies central to individuals' time orientation, leading to cultural variations in time orientation. For example, people from future-oriented cultures tend to emphasize the future and store information relevant for the future more than those from present- or past-oriented cultures. For survey questions that ask respondents to report expected probabilities of future events, this may translate into culture-specific question difficulties, manifested through systematically varying "I don't know" item nonresponse rates. This study drew on the time orientation theory and examined culture-specific nonresponse patterns on subjective probability questions using methodologically comparable population-based surveys from multiple countries. The results supported our hypothesis. Item nonresponse rates on these questions varied significantly in the way that future-orientation at the group as well as individual level was associated with lower nonresponse rates. This pattern did not apply to non-probability questions. Our study also suggested potential nonresponse bias. Examining culture-specific constructs, such as time orientation, as a framework for measurement mechanisms may contribute to improving cross-cultural research.

  13. Relationship between Future Time Orientation and Item Nonresponse on Subjective Probability Questions: A Cross-Cultural Analysis

    PubMed Central

    Lee, Sunghee; Liu, Mingnan; Hu, Mengyao

    2017-01-01

    Time orientation is an unconscious yet fundamental cognitive process that provides a framework for organizing personal experiences in temporal categories of past, present and future, reflecting the relative emphasis given to these categories. Culture lies central to individuals’ time orientation, leading to cultural variations in time orientation. For example, people from future-oriented cultures tend to emphasize the future and store information relevant for the future more than those from present- or past-oriented cultures. For survey questions that ask respondents to report expected probabilities of future events, this may translate into culture-specific question difficulties, manifested through systematically varying “I don’t know” item nonresponse rates. This study drew on the time orientation theory and examined culture-specific nonresponse patterns on subjective probability questions using methodologically comparable population-based surveys from multiple countries. The results supported our hypothesis. Item nonresponse rates on these questions varied significantly in the way that future-orientation at the group as well as individual level was associated with lower nonresponse rates. This pattern did not apply to non-probability questions. Our study also suggested potential nonresponse bias. Examining culture-specific constructs, such as time orientation, as a framework for measurement mechanisms may contribute to improving cross-cultural research. PMID:28781381

  14. Aeolian sand transport and aeolian deposits on Venus: A review

    NASA Astrophysics Data System (ADS)

    Kreslavsly, Mikhail A.; Bondarenko, Nataliya V.

    2017-06-01

    We review the current state of knowledge about aeolian sand transport and aeolian bedforms on planet Venus. This knowledge is limited by lack of observational data. Among the four planetary bodies of the Solar System with sufficient atmospheres in contact with solid surfaces, Venus has the densest atmosphere; the conditions there are transitional between those for terrestrial subaerial and subaqueous transport. The dense atmosphere causes low saltation threshold and short characteristic saltation length, and short scale length of the incipient dunes. A few lines of evidence indicate that the typical wind speeds exceed the saltation threshold; therefore, sand transport would be pervasive, if sand capable of saltation is available. Sand production on Venus is probably much slower than on the Earth; the major terrestrial sand sinks are also absent, however, lithification of sand through sintering is expected to be effective under Venus' conditions. Active transport is not detectable with the data available. Aeolian bedforms (transverse dunes) resolved in the currently available radar images occupy a tiny area on the planet; however, indirect observations suggest that small-scale unresolved aeolian bedforms are ubiquitous. Aeolian transport is probably limited by sand lithification causing shortage of saltation-capable material. Large impact events likely cause regional short-term spikes in aeolian transport by supplying a large amount of sand-size particles, as well as disintegration and activation of older indurated sand deposits. The data available are insufficient to understand whether the global aeolian sand transport occurs or not. More robust knowledge about aeolian transport on Venus is essential for future scientific exploration of the planet, in particular, for implementation and interpretation of geochemical studies of surface materials. High-resolution orbital radar imaging with local to regional coverage and desirable interferometric capabilities is the most effective way to obtain essential new knowledge about aeolian transport on Venus.

  15. Charts designate probable future oceanographic research fields

    NASA Technical Reports Server (NTRS)

    1968-01-01

    Charts outline the questions and problems of oceanographic research in the future. NASA uses the charts to estimate the probable requirements for instrumentation carried by satellites engaged in cooperative programs with other agencies concerned with identification, analysis, and solution of many of these problems.

  16. Molecular design of anticancer drug leads based on three-dimensional quantitative structure-activity relationship.

    PubMed

    Huang, Xiao Yan; Shan, Zhi Jie; Zhai, Hong Lin; Li, Li Na; Zhang, Xiao Yun

    2011-08-22

    Heat shock protein 90 (Hsp90) takes part in the development of several cancers. Novobiocin, a typical C-terminal inhibitor of Hsp90, will probably be used as an important anticancer drug in the future. In this work, we explored the valuable information provided by a three-dimensional quantitative structure-activity relationship (3D QSAR) and used it to design new novobiocin derivatives. The comparative molecular field analysis and comparative molecular similarity indices analysis models with high predictive capability were established, and their reliability is supported by the statistical parameters. Based on several important influence factors obtained from these models, six new novobiocin derivatives with higher inhibitory activities were designed and confirmed by molecular simulation with our models, providing potential anticancer drug leads for further research.

  17. Alternative electrical distribution system architectures for automobiles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Afridi, K.K.; Tabors, R.D.; Kassakian, J.G.

    At present most automobiles use a 12 V electrical system with point-to-point wiring. The capability of this architecture in meeting the needs of future electrical loads is questionable. Furthermore, with the development of electric vehicles (EVs) there is a greater need for a better architecture. In this paper the authors outline the limitations of the conventional architecture and identify alternatives. They also present a multi-attribute trade-off methodology which compares these alternatives, and identifies a set of Pareto optimal architectures. The system attributes traded off are cost, weight, losses and probability of failure. These are calculated by a computer program that has built-in component attribute models. System attributes of a few dozen architectures are also reported and the results analyzed. 17 refs.
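
    The Pareto screening step is easy to illustrate. In the sketch below only the dominance logic reflects the method described; the architecture names and normalised attribute vectors (cost, weight, losses, failure probability, all to be minimised) are invented:

```python
def pareto_optimal(architectures):
    """Return the names of architectures not dominated on every attribute."""
    def dominates(a, b):
        # a dominates b if it is no worse everywhere and better somewhere.
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    items = list(architectures.items())
    return [name for name, attrs in items
            if not any(dominates(other, attrs) for _, other in items)]

candidates = {
    "12V point-to-point": (1.0, 1.0, 1.0, 1.0),  # normalised baseline
    "48V multiplexed":    (1.2, 0.7, 0.6, 1.1),
    "dual-voltage bus":   (1.3, 0.8, 0.7, 1.2),  # dominated by 48V multiplexed
}
print(pareto_optimal(candidates))  # ['12V point-to-point', '48V multiplexed']
```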

  18. Medical Optimization Network for Space Telemedicine Resources

    NASA Technical Reports Server (NTRS)

    Shah, R. V.; Mulcahy, R.; Rubin, D.; Antonsen, E. L.; Kerstman, E. L.; Reyes, D.

    2017-01-01

    INTRODUCTION: Long-duration missions beyond low Earth orbit introduce new constraints to the space medical system such as the inability to evacuate to Earth, communication delays, and limitations in clinical skillsets. NASA recognizes the need to improve capabilities for autonomous care on such missions. As the medical system is developed, it is important to have an ability to evaluate the trade space of which resources will be most important. The Medical Optimization Network for Space Telemedicine Resources (MONSTR) was developed for this reason, and is now a system to gauge the relative importance of medical resources in addressing medical conditions. METHODS: A list of medical conditions of potential concern for an exploration mission was referenced from the Integrated Medical Model (IMM), a probabilistic model designed to quantify in-flight medical risk. The diagnostic and treatment modalities required to address best- and worst-case scenarios of each medical condition, at the terrestrial standard of care, were entered into a database. This list included tangible assets (e.g. medications) and intangible assets (e.g. clinical skills to perform a procedure). A team of physicians working within the Exploration Medical Capability Element of NASA's Human Research Program ranked each of the items listed according to its criticality. Data were then obtained from the IMM for the probability of occurrence of the medical conditions, including a breakdown of best and worst cases, during a Mars reference mission. The probability of occurrence information and the criticality of each resource were taken into account during analytics performed using Tableau software. RESULTS: A database and weighting system to evaluate all the diagnostic and treatment modalities was created by combining the probability of condition occurrence data with the criticalities assigned by the physician team. DISCUSSION: Exploration Medical Capabilities research at NASA is focused on providing a medical system to support crew medical needs in the context of a Mars mission. MONSTR is a novel approach to performing a quantitative risk analysis that will assess the relative value of individual resources needed for the diagnosis and treatment of various medical conditions. It will provide the operational and research communities at NASA with information to support informed decisions regarding areas of research investment, future crew training, and medical supplies manifested as part of the exploration medical system.
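
    The weighting idea can be mimicked in a few lines: rank each resource by summing, over the conditions it addresses, the condition's probability of occurrence times the resource's criticality. The condition probabilities and criticalities below are invented placeholders, not IMM data or the physicians' rankings:

```python
conditions = {            # condition -> probability of occurrence per mission
    "renal stone": 0.04,
    "skin laceration": 0.30,
}
needs = {                 # (condition, resource) -> criticality (1 low .. 5 high)
    ("renal stone", "ultrasound"): 5,
    ("renal stone", "IV fluids"): 4,
    ("skin laceration", "suture kit"): 3,
}

scores = {}
for (condition, resource), criticality in needs.items():
    scores[resource] = scores.get(resource, 0.0) + conditions[condition] * criticality

for resource, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{resource:10s} {score:.2f}")  # suture kit 0.90, ultrasound 0.20, ...
```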

  19. "If It Is Dreamable It Is Doable": The Role of Desired Job Flexibility in Imagining the Future

    ERIC Educational Resources Information Center

    Guglielmi, Dina; Chiesa, Rita; Mazzetti, Greta

    2016-01-01

    Purpose: The purpose of this paper is to compare how the dimension of attitudes toward future that consists in perception of dynamic future may be affected by desirable goals (desired job flexibility) and probable events (probable job flexibility) in a group of permanent vs temporary employees. Moreover the aim is to explore the gender differences…

  20. Analysis of the impact of safeguards criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mullen, M.F.; Reardon, P.T.

    As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; there are important interactions between variables, that is, the effects of a given variable often depend on the level or value of some other variable. With the methodology used in this study, these interactions can be quantitatively analyzed, and reasonably good approximate prediction equations can be developed.

  1. Joint Operations 2030 - Phase III Report: The JO 2030 Capability Set (Operations interarmees 2030 - Rapport Phase III: L’ensemble capacitaire JO 2030)

    DTIC Science & Technology

    2011-04-01

    a ‘strategy as process’ manner to develop capabilities that are flexible, adaptable and robust. 3.4 Future structures The need for agile...to develop models of the future security environment 3.4.10 Planning Under Deep Uncertainty Future structures The need for agile, flexible and... Organisation NEC Network Enabled Capability NGO Non Government Organisation NII Networking and Information Infrastructure PVO Private Voluntary

  2. Exploration Health Risks: Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Rhatigan, Jennifer; Charles, John; Hayes, Judith; Wren, Kiley

    2006-01-01

    Maintenance of human health on long-duration exploration missions is a primary challenge to mission designers. Indeed, human health risks are currently the largest risk contributors to the risks of evacuation or loss of the crew on long-duration International Space Station missions. We describe a quantitative assessment of the relative probabilities of occurrence of the individual risks to human safety and efficiency during space flight to augment the qualitative assessments used in this field to date. Quantitative probabilistic risk assessments will allow program managers to focus resources on those human health risks most likely to occur with undesirable consequences. Truly quantitative assessments are common, even expected, in the engineering and actuarial spheres, but that capability is just emerging in some arenas of life sciences research, such as identifying and minimizing the hazards to astronauts during future space exploration missions. Our expectation is that these results can be used to inform NASA mission design trade studies in the near future with the objective of preventing the highest-ranked human health risks. We identify and discuss statistical techniques to provide this risk quantification based on relevant sets of astronaut biomedical data from short and long duration space flights as well as relevant analog populations. We outline critical assumptions made in the calculations and discuss the rationale for these. Our efforts to date have focused on quantifying the probabilities of medical risks that are qualitatively perceived as relatively high: radiation sickness, cardiac dysrhythmias, medically significant renal stone formation due to increased calcium mobilization, decompression sickness as a result of EVA (extravehicular activity), and bone fracture due to loss of bone mineral density. We present these quantitative probabilities in order-of-magnitude comparison format so that relative risk can be gauged. We address the effects of conservative and nonconservative assumptions on the probability results. We discuss the methods necessary to assess mission risks once exploration mission scenarios are characterized. Preliminary efforts have produced results that are commensurate with earlier qualitative estimates of risk probabilities in this and other operational contexts, indicating that our approach may be usefully applied in support of the development of human health and performance standards for long-duration space exploration missions. This approach will also enable mission-specific probabilistic risk assessments for space exploration missions.

  3. The Probabilities of Unique Events

    DTIC Science & Technology

    2012-08-30

    social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as...investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only...of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of

  4. Directed Design of Experiments (DOE) for Determining Probability of Detection (POD) Capability of NDE Systems (DOEPOD)

    NASA Technical Reports Server (NTRS)

    Generazio, Ed

    2007-01-01

    This viewgraph presentation reviews some of the issues that specialists in nondestructive evaluation (NDE) face in determining the statistics of the probability of detection. There is discussion of the use of the binomial distribution and of the probability of hit. The presentation then reviews the concepts of Directed Design of Experiments for Validating Probability of Detection of Inspection Systems (DOEPOD). Several cases are reviewed and discussed. The concept of false calls is also reviewed.
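
    One staple of this binomial reasoning is the zero-miss demonstration: how many consecutive hits establish a target probability of detection at a given confidence. A minimal sketch of the standard textbook argument (not DOEPOD itself):

```python
import math

def successes_required(pod: float, confidence: float) -> int:
    """Smallest n such that n hits in n trials demonstrates a detection
    probability of at least `pod` at the given confidence, via the
    zero-miss binomial argument: 1 - pod**n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(pod))

# The familiar "29 of 29" rule for 90% POD at 95% confidence:
print(successes_required(0.90, 0.95))  # -> 29
```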

  5. Estimation of State Transition Probabilities: A Neural Network Model

    NASA Astrophysics Data System (ADS)

    Saito, Hiroshi; Takiyama, Ken; Okada, Masato

    2015-12-01

    Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.
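
    A minimal delta-rule reading of the proposed idea, for a two-state world with an invented learning rate; the published algorithm may differ in detail:

```python
import numpy as np

rng = np.random.default_rng(0)
true_p = 0.7                 # true probability of transitioning to state 1
p_hat, alpha = 0.5, 0.05     # initial estimate and learning rate

for _ in range(2000):
    observed = rng.random() < true_p             # next state: 1 or 0
    prediction_error = float(observed) - p_hat   # state prediction error
    p_hat += alpha * prediction_error            # delta-rule update

print(round(p_hat, 2))  # hovers near the true value, 0.7
```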

  6. Space Shuttle Launch Probability Analysis: Understanding History so We Can Predict the Future

    NASA Technical Reports Server (NTRS)

    Cates, Grant R.

    2014-01-01

    The Space Shuttle was launched 135 times and nearly half of those launches required 2 or more launch attempts. The Space Shuttle launch countdown historical data of 250 launch attempts provides a wealth of data that is important to analyze for strictly historical purposes as well as for use in predicting future launch vehicle launch countdown performance. This paper provides a statistical analysis of all Space Shuttle launch attempts including the empirical probability of launch on any given attempt and the cumulative probability of launch relative to the planned launch date at the start of the initial launch countdown. This information can be used to facilitate launch probability predictions of future launch vehicles such as NASA's Space Shuttle derived SLS. Understanding the cumulative probability of launch is particularly important for missions to Mars since the launch opportunities are relatively short in duration and one must wait for 2 years before a subsequent attempt can begin.
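
    The headline numbers translate directly into an empirical per-attempt probability and, under the simplifying assumption of independent attempts, a cumulative probability of having launched by attempt k:

```python
# Counts quoted in the abstract: 135 launches in 250 countdown attempts.
launches, attempts = 135, 250
p_attempt = launches / attempts           # ≈ 0.54 per countdown

for k in range(1, 5):
    p_by_k = 1 - (1 - p_attempt) ** k     # launched on or before attempt k
    print(f"attempt {k}: cumulative P(launch) = {p_by_k:.2f}")
```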

  7. A Stochastic Model for the Landing Dispersion of Hazard Detection and Avoidance Capable Flight Systems

    NASA Astrophysics Data System (ADS)

    Witte, L.

    2014-06-01

    To support landing-site assessments for HDA-capable flight systems and to facilitate trade studies between potential HDA architectures and the resulting probability of safe landing, a stochastic landing dispersion model has been developed.

  8. Scenarios of land use change for agriculture: the role of Land Evaluation in improving model simulation

    NASA Astrophysics Data System (ADS)

    Mereu, V.; Santini, M.; Dettori, G.; Muresu, P.; Spano, D.; Duce, P.

    2009-12-01

    Integrated scenarios of future climate and land use represent a useful input for impact studies about global changes. In particular, improving future land use simulations is essential for the agricultural sector, which is influenced by both biogeophysical constraints and human needs. Land use change models are often based mainly on statistical relationships between known land use distribution and biophysical or socio-economic factors, neglecting the necessary consideration of physical constraints that interact in making lands more or less capable for agriculture and suitable for supporting specific crops. In this study, a well-developed land use change model (CLUE@CMCC) was adapted to the Mediterranean basin case study, focusing on croplands. Several climate scenarios and future demands for croplands were combined to drive the model, while the same climate scenarios were used to more reliably allocate crops in the most suitable areas on the basis of Land Evaluation techniques. The probability for each map unit to sustain a specific crop, usually related to location characteristics, elasticity to conversion and competition among land use types, now includes specific crop-favoring location characteristics. Besides improving the consistency with which the land use change model allocates land for the future, the results can feed back to suggest feasibility limits or reasonable thresholds for adjusting land use demands during dynamic simulations.

  9. Automation, Autonomy & Megacities 2025: A Dark Preview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Assante, Michael; Bochman, Andrew

    This paper extrapolates from present trends to describe very plausible – and actually quite likely – future crises playing out in multiple global cities. While predicting the future is fraught with uncertainty, much of what occurs in the scenarios that follow is fully possible today and, absent a significant course change, probable in the timeframe discussed. The authors want to caveat that we are not commenting on a specific organization or technology deployment. It is not hard to find tech evangelists touting that ubiquitous and highly interconnected digital technology will bring great advances in productivity and efficiency, as well as new capabilities we cannot foresee. This paper attempts to reveal what is possible when these technologies are applied to critical infrastructure applications en masse without adequate security in densely populated cities that by their nature are less resilient than other environments. Megacities need and will deploy these new technologies to keep up with insatiable demand for energy, communications, transportation and other services, but it is important to recognize that they are also made more vulnerable by following this path.

  10. TANDI: threat assessment of network data and information

    NASA Astrophysics Data System (ADS)

    Holsopple, Jared; Yang, Shanchieh Jay; Sudit, Moises

    2006-04-01

    Current practice for combating cyber attacks typically uses Intrusion Detection Sensors (IDSs) to passively detect and block multi-stage attacks. This work leverages Level-2 fusion that correlates IDS alerts belonging to the same attacker, and proposes a threat assessment algorithm to predict potential future attacker actions. The algorithm, TANDI, reduces the problem complexity by separating the models of the attacker's capability and opportunity, and fuses the two to determine the attacker's intent. Unlike traditional Bayesian-based approaches, which require assigning a large number of edge probabilities, the proposed Level-3 fusion procedure uses only 4 parameters. TANDI has been implemented and tested with randomly created attack sequences. The results demonstrate that TANDI predicts future attack actions accurately as long as the attack is not part of a coordinated attack and contains no insider threats. In the presence of abnormal attack events, TANDI will alarm the network analyst for further analysis. The attempt to evaluate a threat assessment algorithm via simulation is the first in the literature, and shall open up a new avenue in the area of high-level fusion.

  11. Mediators of the Availability Heuristic in Probability Estimates of Future Events.

    ERIC Educational Resources Information Center

    Levi, Ariel S.; Pryor, John B.

    Individuals often estimate the probability of future events by the ease with which they can recall or cognitively construct relevant instances. Previous research has not precisely identified the cognitive processes mediating this "availability heuristic." Two potential mediators (imagery of the event, perceived reasons or causes for the…

  12. The Effect of Tungsten and Niobium on the Stress Relaxation Rates of Disk Alloy CH98

    NASA Technical Reports Server (NTRS)

    Gayda, John

    2003-01-01

    Gas turbine engines for future subsonic transports will probably have higher pressure ratios, which will require nickel-base superalloy disks with 1300 to 1400 F temperature capability. Several advanced disk alloys are being developed to fill this need. One of these, CH98, is a promising candidate for gas turbine engines and is being studied in NASA's Advanced Subsonic Technology (AST) program. For large disks, residual stresses generated during quenching from solution heat treatment are often reduced by a stabilization heat treatment, in which the disk is heated to 1500 to 1600 F for several hours followed by a static air cool. The reduction in residual stress levels lessens distortion during machining of disks. However, previous work on CH98 has indicated that stabilization treatments decrease creep capability. Additions of the refractory elements tungsten and niobium improve tensile and creep properties after stabilization, while maintaining good crack growth resistance at elevated temperatures. As the additions of refractory elements increase creep capability, they might also affect stress relaxation rates and therefore the reduction in residual stress levels obtained for a given stabilization treatment. To answer this question, the stress relaxation rates of CH98 with and without tungsten and niobium additions are compared in this paper for temperatures and times generally employed in stabilization treatments of modern disk alloys.

  13. Thermal bioaerosol cloud tracking with Bayesian classification

    NASA Astrophysics Data System (ADS)

    Smith, Christian W.; Dupuis, Julia R.; Schundler, Elizabeth C.; Marinelli, William J.

    2017-05-01

    The development of a wide area, bioaerosol early warning capability employing existing uncooled thermal imaging systems used for persistent perimeter surveillance is discussed. The capability exploits thermal imagers with other available data streams including meteorological data and employs a recursive Bayesian classifier to detect, track, and classify observed thermal objects with attributes consistent with a bioaerosol plume. Target detection is achieved based on similarity to a phenomenological model which predicts the scene-dependent thermal signature of bioaerosol plumes. Change detection in thermal sensor data is combined with local meteorological data to locate targets with the appropriate thermal characteristics. Target motion is tracked utilizing a Kalman filter and nearly constant velocity motion model for cloud state estimation. Track management is performed using a logic-based upkeep system, and data association is accomplished using a combinatorial optimization technique. Bioaerosol threat classification is determined using a recursive Bayesian classifier to quantify the threat probability of each tracked object. The classifier can accept additional inputs from visible imagers, acoustic sensors, and point biological sensors to improve classification confidence. This capability was successfully demonstrated for bioaerosol simulant releases during field testing at Dugway Proving Grounds. Standoff detection at a range of 700m was achieved for as little as 500g of anthrax simulant. Developmental test results will be reviewed for a range of simulant releases, and future development and transition plans for the bioaerosol early warning platform will be discussed.
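
    The recursive Bayesian classification step can be sketched in odds form. The prior and the 3:1 likelihood ratios below are invented, and this is a generic illustration rather than the fielded algorithm:

```python
def recursive_threat_update(prior: float, likelihood_ratio: float) -> float:
    """One recursive Bayes step in odds form: posterior odds equal prior
    odds times the likelihood ratio of the new observation under
    'bioaerosol plume' versus 'benign thermal object'."""
    odds = prior / (1.0 - prior) * likelihood_ratio
    return odds / (1.0 + odds)

# A track whose motion and thermal attributes each favour "plume" 3:1.
p = 0.05  # prior threat probability for a new track
for lr in (3.0, 3.0, 3.0):
    p = recursive_threat_update(p, lr)
    print(round(p, 3))  # 0.136, 0.321, 0.587
```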

  14. Low-Resolution Screening of Early Stage Acquisition Simulation Scenario Development Decisions

    DTIC Science & Technology

    2012-12-01

    6 seconds) incorporating reload times and assumptions. Phit for min range is assumed to be 100% (excepting FGM-148, which was estimated for a...User Interface HTN Hierarchical Task Network MCCDC Marine Corps Combat Development Command Phit Probability to hit the intended target Pkill...well beyond the scope of this study. 5. Weapon Capabilities Translation COMBATXXI develops situation probabilities to hit (Phit) and probabilities to

  15. An evaluation of the NASA/GSFC Barnes field spectral reflectometer model 14-758, using signal/noise as a measure of utility

    NASA Astrophysics Data System (ADS)

    Bell, R.; Labovitz, M. L.

    1982-07-01

    A Barnes field spectral reflectometer which collected information in 373 channels covering the region from 0.4 to 2.5 micrometers was assessed for signal utility. A band was judged unsatisfactory if the probability was 0.1 or greater that its signal-to-noise ratio was less than eight to one. For each of the bands, the probability of a noisy observation was estimated under a binomial assumption from a set of field crop spectra covering an entire growing season. A 95% confidence interval was calculated about each estimate, and bands whose lower confidence limits were greater than 0.1 were judged unacceptable. As a result, 283 channels were deemed statistically satisfactory. Excluded channels correspond to portions of the electromagnetic spectrum (EMS) where high atmospheric absorption and filter wheel overlap occur. In addition, the analyses uncovered intervals of unsatisfactory detection capability within the blue, red and far infrared regions of vegetation spectra. From the results of the analysis it was recommended that 90 channels monitored by the instrument under consideration be eliminated from future studies. These channels are tabulated and discussed.
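
    The screening rule reduces to a binomial confidence bound. The sketch below uses the normal approximation with invented counts; the study's exact interval construction may differ:

```python
import math

def noisy_band_lower_bound(noisy: int, n: int, z: float = 1.96) -> float:
    """Lower limit of an approximate 95% confidence interval for the
    probability that a band's observation is noisy (normal approximation
    to the binomial proportion)."""
    p = noisy / n
    return p - z * math.sqrt(p * (1 - p) / n)

# A band flagged noisy in 30 of 150 spectra: the lower limit ~0.136
# exceeds 0.1, so the band would be judged unacceptable.
print(round(noisy_band_lower_bound(30, 150), 3))
```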

  16. Genetic diversity of Timarete punctata (Annelida: Cirratulidae): Detection of pseudo-cryptic species and a potential biological invader

    NASA Astrophysics Data System (ADS)

    Seixas, Victor Corrêa; Zanol, Joana; Magalhães, Wagner F.; Paiva, Paulo Cesar

    2017-10-01

    Among the processes that drive biological invasions, the presence of asexual reproduction, as observed in many polychaetes, is an important feature because it allows a rapid spread and colonization in the invaded site. Despite its ecological importance for benthic communities, studies on the biological invasive context are rare for this abundant taxon. Here, the phylogeographic pattern of a common asexual reproducer polychaete, Timarete punctata, was analyzed at five sites along the Atlantic and Pacific Oceans to investigate if its wide distribution is associated to human-mediated transport. Sequences of COI and 16S revealed the presence of two cryptic species. One of them exhibits a wide distribution range (∼14,000 km), very low level of genetic diversity and a high frequency of shared haplotypes along sampled sites. The genetic pattern indicates that this species has probably been introduced in all sampled sites, and its wide distribution is associated to human-mediated transport. In addition, the great capability of T. punctata to reproduce by fragmentation makes the colonization process easier. Thus, the number of alien polychaete species is probably underestimated and future studies are necessary to reach a more realistic perspective.

  17. The long hold: Storing data at the National Archives

    NASA Technical Reports Server (NTRS)

    Thibodeau, Kenneth

    1992-01-01

    The National Archives is, in many respects, in a unique position. For example, I find people from other organizations describing an archival medium as one which will last for three to five years. At the National Archives, we deal with the centuries, not years. From our perspective, there is no archival medium for data storage, and we do not expect there will ever be one. Predicting the long-term future of information technology beyond a mere five or ten years approaches the occult arts. But one prediction is probably safe. It is that the technology will continue to change, at least until analysts start talking about the post-information age. If we did have a medium which lasted a hundred years or longer, we probably would not have a device capable of reading it. The issue of obsolescence, as opposed to media stability, is more complex and more costly. It is especially complex at the National Archives because of two other aspects of our peculiar position. The first aspect is that we deal with incoherent data. The second is that we are charged with satisfying unknown and unknowable requirements. A brief overview of these aspects is presented.

  18. An evaluation of the NASA/GSFC Barnes field spectral reflectometer model 14-758, using signal/noise as a measure of utility

    NASA Technical Reports Server (NTRS)

    Bell, R.; Labovitz, M. L.

    1982-01-01

    A Barnes field spectral reflectometer which collected information in 373 channels covering the region from 0.4 to 2.5 micrometers was assessed for signal utility. A band was judged unsatisfactory if the probability was 0.1 or greater that its signal-to-noise ratio was less than eight to one. For each of the bands, the probability of a noisy observation was estimated under a binomial assumption from a set of field crop spectra covering an entire growing season. A 95% confidence interval was calculated about each estimate, and bands whose lower confidence limits were greater than 0.1 were judged unacceptable. As a result, 283 channels were deemed statistically satisfactory. Excluded channels correspond to portions of the electromagnetic spectrum (EMS) where high atmospheric absorption and filter wheel overlap occur. In addition, the analyses uncovered intervals of unsatisfactory detection capability within the blue, red and far infrared regions of vegetation spectra. From the results of the analysis it was recommended that 90 channels monitored by the instrument under consideration be eliminated from future studies. These channels are tabulated and discussed.

  19. The impact of changing climate conditions on the hydrological behavior of several Mediterranean sub-catchments in Crete

    NASA Astrophysics Data System (ADS)

    Eirini Vozinaki, Anthi; Tapoglou, Evdokia; Tsanis, Ioannis

    2017-04-01

    Climate change, although already happening, poses a major threat capable of causing serious disruption to future societies and their economies. In this work, the climate change impact on the hydrological behavior of several Mediterranean sub-catchments in Crete is presented, together with the sensitivity of these hydrological systems to several climate change scenarios. The HBV hydrological model has been used, calibrated and validated for the study sub-catchments against measured weather and streamflow data and inputs. The impact of climate change on several hydro-meteorological parameters (i.e. precipitation, streamflow etc.) and hydrological signatures (i.e. spring flood peak, length and volume, base flow, flow duration curves, seasonality etc.) has been statistically elaborated and analyzed, defining areas of increased risk of flooding or drought. The potential impacts of climate change on current and future water resources have been quantified by driving the HBV model with current and future scenarios, respectively, for specific climate periods. This work aims to present an integrated methodology for the definition of future climate and hydrological risks and the prediction of future water resources behavior. Using the proposed methodology, future water resources management could be carried out rationally in Mediterranean sub-catchments prone to drought or flooding. The research reported in this paper was fully supported by the Project "Innovative solutions to climate change adaptation and governance in the water management of the Region of Crete - AQUAMAN" funded within the framework of the EEA Financial Mechanism 2009-2014.

  20. Impacts of representing sea-level rise uncertainty on future flood risks: An example from San Francisco Bay

    PubMed Central

    Oddo, Perry C.; Keller, Klaus

    2017-01-01

    Rising sea levels increase the probability of future coastal flooding. Many decision-makers use risk analyses to inform the design of sea-level rise (SLR) adaptation strategies. These analyses are often silent on potentially relevant uncertainties. For example, some previous risk analyses use the expected, best, or large quantile (i.e., 90%) estimate of future SLR. Here, we use a case study to quantify and illustrate how neglecting SLR uncertainties can bias risk projections. Specifically, we focus on the future 100-yr (1% annual exceedance probability) coastal flood height (storm surge including SLR) in the year 2100 in the San Francisco Bay area. We find that accounting for uncertainty in future SLR increases the return level (the height associated with a probability of occurrence) by half a meter from roughly 2.2 to 2.7 m, compared to using the mean sea-level projection. Accounting for this uncertainty also changes the shape of the relationship between the return period (the inverse probability that an event of interest will occur) and the return level. For instance, incorporating uncertainties shortens the return period associated with the 2.2 m return level from a 100-yr to roughly a 7-yr return period (∼15% probability). Additionally, accounting for this uncertainty doubles the area at risk of flooding (the area to be flooded under a certain height; e.g., the 100-yr flood height) in San Francisco. These results indicate that the method of accounting for future SLR can have considerable impacts on the design of flood risk management strategies. PMID:28350884
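
    The tail-fattening effect reported above can be reproduced qualitatively in a few lines of Monte Carlo. The surge and SLR distributions below are invented stand-ins, not the paper's fitted models:

        # Adding a distribution of SLR (instead of its mean) to a surge
        # distribution raises the upper tail of total flood height.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 200_000
        surge = rng.gumbel(loc=1.0, scale=0.25, size=n)           # metres (hypothetical)
        slr = rng.lognormal(mean=np.log(0.8), sigma=0.4, size=n)  # metres (hypothetical)

        flood_mean_slr = surge + slr.mean()   # "mean projection" approach
        flood_full = surge + slr              # uncertainty propagated

        for label, sample in [("mean SLR", flood_mean_slr), ("full SLR pdf", flood_full)]:
            level_100yr = np.quantile(sample, 0.99)   # 1% annual exceedance level
            print(f"{label}: 100-yr flood height = {level_100yr:.2f} m")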

  1. Impacts of representing sea-level rise uncertainty on future flood risks: An example from San Francisco Bay.

    PubMed

    Ruckert, Kelsey L; Oddo, Perry C; Keller, Klaus

    2017-01-01

    Rising sea levels increase the probability of future coastal flooding. Many decision-makers use risk analyses to inform the design of sea-level rise (SLR) adaptation strategies. These analyses are often silent on potentially relevant uncertainties. For example, some previous risk analyses use the expected, best, or large quantile (i.e., 90%) estimate of future SLR. Here, we use a case study to quantify and illustrate how neglecting SLR uncertainties can bias risk projections. Specifically, we focus on the future 100-yr (1% annual exceedance probability) coastal flood height (storm surge including SLR) in the year 2100 in the San Francisco Bay area. We find that accounting for uncertainty in future SLR increases the return level (the height associated with a probability of occurrence) by half a meter from roughly 2.2 to 2.7 m, compared to using the mean sea-level projection. Accounting for this uncertainty also changes the shape of the relationship between the return period (the inverse probability that an event of interest will occur) and the return level. For instance, incorporating uncertainties shortens the return period associated with the 2.2 m return level from a 100-yr to roughly a 7-yr return period (∼15% probability). Additionally, accounting for this uncertainty doubles the area at risk of flooding (the area to be flooded under a certain height; e.g., the 100-yr flood height) in San Francisco. These results indicate that the method of accounting for future SLR can have considerable impacts on the design of flood risk management strategies.

  2. Earth Satellite Population Instability: Underscoring the Need for Debris Mitigation

    NASA Technical Reports Server (NTRS)

    Liou, Jer-chyi; Johnson, N. L.

    2006-01-01

    A recent study by NASA indicates that the implementation of international orbital debris mitigation measures alone will not prevent a significant increase in the artificial Earth satellite population, beginning in the second half of this century. Whereas the focus of the aerospace community for the past 25 years has been on the curtailment of the generation of long-lived orbital debris, active remediation of the current orbital debris population should now be reconsidered to help preserve near-Earth space for future generations. In particular, we show in this paper that even if launch operations were to cease today, the population of space debris would continue to grow. Further, proposed remediation techniques do not appear to offer a viable solution. We therefore recommend that, while the aerospace community maintains the current debris-limiting mission regulations and postmission disposal procedures, future emphasis should be placed on finding new remediation technologies for solving this growing problem. Since the launch of Sputnik 1, space activities have created an orbital debris environment that poses increasing impact risks to existing space systems, including human space flight and robotic missions (1, 2). Currently, more than 9,000 Earth-orbiting man-made objects (including many breakup fragments), with a combined mass exceeding 5 million kilograms, are tracked by the US Space Surveillance Network and maintained in the US satellite catalog (3-5). Three accidental collisions between cataloged satellites during the period from late 1991 to early 2005 have already been documented (6), although fortunately none resulted in the creation of large, trackable debris clouds. Several studies conducted during 1991-2001 demonstrated, with assumed future launch rates, the unintended growth potential of the Earth satellite population, resulting from random, accidental collisions among resident space objects (7-13). In some low Earth orbit (LEO) altitude regimes where the number density of satellites is above a critical spatial density, the production rate of new satellites (i.e., debris) due to collisions exceeds the loss of objects due to orbital decay. NASA's evolutionary satellite population model LEGEND (LEO-to-GEO Environment Debris model), developed by the Orbital Debris Program Office at the NASA Lyndon B. Johnson Space Center, is a high-fidelity three-dimensional physical model that is capable of simulating the historical satellite environment, as well as the evolution of future debris populations (14, 15). The subject study assumed no rocket bodies and spacecraft were launched after December 2004, and no future disposal maneuvers were allowed for existing spacecraft, few of which currently have such a capability. The rate of satellite explosions would naturally decrease to zero within a few decades as the current satellite population ages. The LEGEND future projection adopts a Monte Carlo approach to simulate future on-orbit explosions and collisions. Within a given projection time step, once the explosion probability is estimated for an intact object, a random number is drawn and compared with the probability to determine if an explosion would occur. A similar procedure is applied to collisions for each pair of target and projectile involved within the same time step. Due to the nature of the Monte Carlo process, multiple projection runs must be performed and analyzed before one can draw reliable and meaningful conclusions from the outcome. A total of fifty 200-year future projection Monte Carlo simulations were executed and evaluated (16).
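
    The projection step described above reduces to a probability-threshold draw per object per time step. A schematic re-creation with invented annual event probabilities (LEGEND's actual breakup and collision models are far more detailed):

        # Realize stochastic events within one projection time step.
        import random

        def realize_events(objects, dt_years, seed=1):
            rng = random.Random(seed)
            events = []
            for obj_id, annual_prob in objects.items():
                p_step = 1.0 - (1.0 - annual_prob) ** dt_years  # prob within the step
                if rng.random() < p_step:
                    events.append(obj_id)
            return events

        # Hypothetical intact objects and annual explosion probabilities.
        intacts = {"RB-2044": 1e-3, "SC-1310": 5e-4, "RB-0917": 2e-3}
        print(realize_events(intacts, dt_years=5.0))

    Averaging the outcomes of many such runs (fifty, in the study) is what makes the Monte Carlo estimates reliable.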

  3. Why do we find ourselves around a yellow star instead of a red star?

    NASA Astrophysics Data System (ADS)

    Haqq-Misra, Jacob; Kopparapu, Ravi Kumar; Wolf, Eric T.

    2018-01-01

    M-dwarf stars are more abundant than G-dwarf stars, so our position as observers on a planet orbiting a G-dwarf raises questions about the suitability of other stellar types for supporting life. If we consider ourselves as typical, in the anthropic sense that our environment is probably a typical one for conscious observers, then we are led to the conclusion that planets orbiting in the habitable zone of G-dwarf stars should be the best place for conscious life to develop. But such a conclusion neglects the possibility that K-dwarfs or M-dwarfs could provide more numerous sites for life to develop, both now and in the future. In this paper we analyse this problem through Bayesian inference to demonstrate that our occurrence around a G-dwarf might be a slight statistical anomaly, but only the sort of chance event that we expect to occur regularly. Even if M-dwarfs provide more numerous habitable planets today and in the future, we still expect mid-G to early-K dwarf stars to be the most likely place for observers like ourselves. This suggests that observers with cognitive capabilities similar to ours are most likely to be found at the present time and place, rather than in the future or around much smaller stars.
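
    The Bayesian weighing described above can be caricatured in a few lines: the posterior probability of finding ourselves around a given stellar type is proportional to the number of candidate sites times a suitability weight. The numbers below are placeholders, not the paper's values:

        # Toy anthropic posterior over host-star spectral type.
        abundance = {"M": 0.75, "K": 0.12, "G": 0.06}   # relative star counts (illustrative)
        weight = {"M": 0.2, "K": 1.0, "G": 1.0}         # assumed habitability weight

        unnorm = {s: abundance[s] * weight[s] for s in abundance}
        total = sum(unnorm.values())
        posterior = {s: round(v / total, 3) for s, v in unnorm.items()}
        print(posterior)   # G need not dominate even if it is fully habitable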

  4. PROBABILITIES OF TEMPERATURE EXTREMES IN THE U.S.

    EPA Science Inventory

    The model Temperature Extremes Version 1.0 provides the capability to estimate the probability, for 332 locations in the 50 U.S. states, that an extreme temperature will occur for one or more consecutive days and/or for any number of days in a given month or season, based on stat...

  5. Next generation earth-to-orbit space transportation systems: Unmanned vehicles and liquid/hybrid boosters

    NASA Technical Reports Server (NTRS)

    Hueter, Uwe

    1991-01-01

    The United States civil space effort, when viewed from a launch vehicle perspective, tends to divide into pre-Shuttle and Shuttle eras. The pre-Shuttle era consisted of expendable launch vehicles, in which a broad set of capabilities was matured across a range of vehicles, followed by a clear reluctance to build on and utilize those systems. The Shuttle era marked the beginning of the U.S. venture into reusable space launch vehicles and the consolidation of launch systems onto this one vehicle. This led to a tremendous capability, but utilized men on a few missions where it was not essential and compromised launch capability resiliency in the long term. Launch vehicle failures between August 1985 and May 1986, of the Titan 34D, Shuttle Challenger, and Delta vehicles, resulted in a reassessment of U.S. launch vehicle capability. The reassessment resulted in President Reagan issuing a new National Space Policy in 1988 calling for more coordination between Federal agencies, broadening the launch capabilities, and preparing for manned flight beyond the Earth into the solar system. As a result, the Department of Defense (DoD) and NASA are jointly assessing the requirements and needs for this nation's future transportation system. Reliability/safety, a balanced fleet, and resiliency are the cornerstones of the future. An insight is provided into the current thinking in establishing future unmanned earth-to-orbit (ETO) space transportation needs and capabilities. A background of previous launch capabilities, future needs, current and proposed near-term systems, and system considerations to assure future mission needs will be met is presented. The focus is on propulsion options associated with unmanned cargo vehicles and the liquid boosters required to assure future mission needs will be met.

  6. The role of the bidirectional hydrogenase in cyanobacteria.

    PubMed

    Carrieri, Damian; Wawrousek, Karen; Eckert, Carrie; Yu, Jianping; Maness, Pin-Ching

    2011-09-01

    Cyanobacteria have tremendous potential to produce clean, renewable fuel in the form of hydrogen gas derived from solar energy and water. Of the two cyanobacterial enzymes capable of evolving hydrogen gas (nitrogenase and the bidirectional hydrogenase), the hox-encoded bidirectional Ni-Fe hydrogenase has a high theoretical potential. The physiological role of this hydrogenase is a highly debated topic and is poorly understood relative to that of the nitrogenase. Here the structure, assembly, and expression of this enzyme, as well as its probable roles in metabolism, are discussed and analyzed to gain perspective on its physiological role. It is concluded that the bidirectional hydrogenase in cyanobacteria primarily functions as a redox regulator for maintaining a proper oxidation/reduction state in the cell. Recommendations for future research to test this hypothesis are discussed.

  7. Automatic spacecraft detumbling by internal mass motion

    NASA Technical Reports Server (NTRS)

    Edwards, T. L.; Kaplan, M. H.

    1974-01-01

    In the operation of future manned space vehicles, there will always be a finite probability that an accident will occur which results in uncontrolled tumbling of a craft. Hard docking by a manned rescue vehicle is not acceptable because of the hazardous environment to which rescue crewmen would be exposed and excessive maneuvering accelerations during docking operations. A movable-mass control concept, which is activated upon initiation of tumbling and is autonomous, can convert tumbling motion into simple spin. The complete equations of motion for an asymmetric rigid spacecraft containing a movable mass are presented, and appropriate control law and system parameters are selected to minimize kinetic energy, resulting in simple spin about the major principal axis. Simulations indicate that for a large space station experiencing a collision, which results in tumbling, a 1% movable mass is capable of stabilizing motion in 2 hr.
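
    The claimed end state follows from a short energy argument: at fixed angular momentum magnitude L, the kinetic energy E = L^2/(2I) of simple spin is lowest about the axis of maximum inertia, so any internal energy dissipation (here, the moving mass) drives the body toward major-axis spin. A quick check with arbitrary inertia values:

        # Spin kinetic energy at fixed angular momentum, per principal axis.
        L = 1000.0                                                   # kg m^2/s, conserved
        I = {"minor": 800.0, "intermediate": 1200.0, "major": 1500.0}  # kg m^2 (illustrative)
        for axis, inertia in I.items():
            print(f"spin about {axis} axis: E = {L**2 / (2 * inertia):8.1f} J")
        # Lowest energy is spin about the major axis, the stable end state.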

  8. The use of a very high temperature nuclear reactor in the manufacture of synthetic fuels

    NASA Technical Reports Server (NTRS)

    Farbman, G. H.; Brecher, L. E.

    1976-01-01

    The three parts of a program directed toward creating a cost-effective nuclear hydrogen production system are described. The discussion covers the development of a very high temperature nuclear reactor (VHTR) as a nuclear heat and power source capable of producing the high temperature needed for hydrogen production and other processes; the development of a hydrogen generation process based on water decomposition, which can utilize the outputs of the VHTR and be integrated with many different ultimate hydrogen consuming processes; and the evaluation of the process applications of the nuclear hydrogen systems to assess the merits and potential payoffs. It is shown that the use of VHTR for the manufacture of synthetic fuels appears to have a very high probability of making a positive contribution to meeting the nation's energy needs in the future.

  9. An autonomous fault detection, isolation, and recovery system for a 20-kHz electric power distribution test bed

    NASA Technical Reports Server (NTRS)

    Quinn, Todd M.; Walters, Jerry L.

    1991-01-01

    Future space explorations will require long-term human presence in space. Space environments that provide working and living quarters for manned missions are becoming increasingly larger and more sophisticated. Monitoring and control of the space environment subsystems by expert system software, which emulates human reasoning processes, could maintain the health of the subsystems and help reduce the human workload. The autonomous power expert (APEX) system was developed to emulate a human expert's reasoning processes used to diagnose fault conditions in the domain of space power distribution. APEX is a fault detection, isolation, and recovery (FDIR) system capable of autonomous monitoring and control of the power distribution system. APEX consists of a knowledge base, a data base, an inference engine, and various support and interface software. APEX provides the user with an easy-to-use interactive interface. When a fault is detected, APEX informs the user of the detection. The user can direct APEX to isolate the probable cause of the fault. Once a fault has been isolated, the user can ask APEX to justify its fault isolation and to recommend actions to correct the fault. APEX implementation and capabilities are discussed.
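
    A toy caricature of the knowledge-base-driven isolation loop described above; APEX's actual rule base and inference engine are far richer, and the symptoms and rules here are invented:

        # Minimal rule-based fault isolation: fire the most specific matching rule.
        RULES = [
            ({"bus_voltage_low", "breaker_open"}, "tripped remote bus isolator"),
            ({"bus_voltage_low", "current_spike"}, "hard fault downstream of switch"),
            ({"frequency_drift"}, "inverter regulation fault"),
        ]

        def isolate(symptoms):
            matches = [(len(cond), cause) for cond, cause in RULES if cond <= symptoms]
            if not matches:
                return "no rule fired; refer to operator"
            return max(matches)[1]   # prefer the rule matching the most symptoms

        print(isolate({"bus_voltage_low", "current_spike", "noise"}))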

  10. The Effect of Stabilization Treatments on Disk Alloy CH98

    NASA Technical Reports Server (NTRS)

    Gayda, John; Gabb, Timothy P.; Ellis, David L.

    2003-01-01

    Gas turbine engines for future subsonic transports will probably have higher pressure ratios, which will require nickel-base superalloy disks with 1300 to 1400 F temperature capability. Several advanced disk alloys are being developed to fill this need. One of these, CH98, is a promising candidate for gas turbine engines and is being studied in NASA's Advanced Subsonic Technology (AST) program. For large disks, residual stresses generated during quenching from solution heat treatments are often reduced by a stabilization heat treatment, in which the disk is heated to 1500 or 1600 F for several hours followed by a static air cool. The reduction in residual stress levels lessens distortion during machining of disks. However, previous work on CH98 has indicated that stabilization treatments can also decrease creep capability. In this study, a systematic variation of stabilization temperature and time was investigated to determine its effect on 1300 F tensile and, more importantly, creep behavior. Dwell crack growth rates were also measured for selected stabilization conditions. As these advanced disk alloys may be given a supersolvus or a subsolvus solution heat treatment for a given application, it was decided that both options would be studied.

  11. Flow cytometry for the assessment of animal sperm integrity and functionality: state of the art

    PubMed Central

    Hossain, Md. Sharoare; Johannisson, Anders; Wallgren, Margareta; Nagy, Szabolcs; Siqueira, Amanda Pimenta; Rodriguez-Martinez, Heriberto

    2011-01-01

    Flow cytometry is now a recognized methodology within animal spermatology, and has moved from being a research tool to becoming routine in the assessment of animal semen destined for breeding. The availability of 'bench-top' flow cytometers and of newer and versatile markers for cell structure and function has allowed the instrumentation to measure more sperm parameters, from viability to reactiveness when exposed to exogenous stimuli, and has increased our capabilities to sort spermatozoa for potential fertilizing capacity or chromosomal sex. The present review summarizes the state of the art regarding flow cytometry applied to animal andrology, while keeping an open comparative intent. It critically evaluates the present and future capabilities of flow cytometry for the diagnostics of potential fertility and for the development of current reproductive technologies such as sperm freezing, sperm selection and sperm sorting. Flow cytometry methods will probably further revolutionize our understanding of sperm physiology and functionality, and will undoubtedly extend their application in isolating many uncharacterized features of spermatozoa. However, continuous follow-up of the methods is a necessity owing to technical developments and the complexity of mapping spermatozoa. PMID:21478895

  12. Directed Energy Weapons

    DTIC Science & Technology

    2007-12-01

    future business. In defense systems, the key to future business is the existence of funded programs. Military commanders understand the lethality and...directed energy capabilities that can provide visibility into the likely future business case for sustaining directed energy industry capabilities...the USD(I) staff to be a focal point for advocating improvement in all dimensions of directed energy intelligence. - The Director, Defense Intelligence

  13. Expressed Likelihood as Motivator: Creating Value through Engaging What’s Real

    PubMed Central

    Higgins, E. Tory; Franks, Becca; Pavarini, Dana; Sehnert, Steen; Manley, Katie

    2012-01-01

    Our research tested two predictions regarding how likelihood can have motivational effects as a function of how a probability is expressed. We predicted that describing the probability of a future event that could be either A or B using the language of high likelihood (“80% A”) rather than low likelihood (“20% B”), i.e., high rather than low expressed likelihood, would make a present activity more real and engaging, as long as the future event had properties relevant to the present activity. We also predicted that strengthening engagement from the high (vs. low) expressed likelihood of a future event would intensify the value of present positive and negative objects (in opposite directions). Both predictions were supported. There was also evidence that this intensification effect from expressed likelihood was independent of the actual probability or valence of the future event. What mattered was whether high versus low likelihood language was used to describe the future event. PMID:23940411

  14. Well-being, life satisfaction and capabilities of flood disaster victims

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Ootegem, Luc, E-mail: Luc.VanOotegem@UGent.be; SHERPPA–Ghent University; Verhofstadt, Elsy

    The individual well-being of flood disaster victims is examined making use of two concepts: life satisfaction and perceived capabilities in life. These concepts are compared in two samples: a representative sample of Flemish respondents and a specific sample of people who have been the victims of a pluvial flood. Well-being as life satisfaction is found not to be related to past or expected future flooding, whereas well-being as capabilities in life is negatively related to both past and expected future flooding. - Highlights: • Well-being as life satisfaction is not related to past or expected future flooding. • Well-being as capabilities in life is negatively related to flooding. • A disaster can scare people for the future because of the scars that it provokes. • Assess the impact of a disaster not only by monetary damage and life satisfaction.

  15. 14 CFR 417.209 - Malfunction turn analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... nozzle burn-through. For each cause of a malfunction turn, the analysis must establish the launch vehicle... the launch vehicle's turning capability in the event of a malfunction during flight. A malfunction... launch vehicle is capable. (4) The time, as a single value or a probability time distribution, when each...

  16. 14 CFR 417.209 - Malfunction turn analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... nozzle burn-through. For each cause of a malfunction turn, the analysis must establish the launch vehicle... the launch vehicle's turning capability in the event of a malfunction during flight. A malfunction... launch vehicle is capable. (4) The time, as a single value or a probability time distribution, when each...

  17. A methodology for evaluating detection performance of ultrasonic array imaging algorithms for coarse-grained materials.

    PubMed

    Van Pamel, Anton; Brett, Colin R; Lowe, Michael J S

    2014-12-01

    Improving the ultrasound inspection capability for coarse-grained metals remains of longstanding interest and is expected to become increasingly important for next-generation electricity power plants. Conventional ultrasonic A-, B-, and C-scans have been found to suffer from strong background noise caused by grain scattering, which can severely limit the detection of defects. However, in recent years, array probes and full matrix capture (FMC) imaging algorithms have unlocked exciting possibilities for improvements. To improve and compare these algorithms, we must rely on robust methodologies to quantify their performance. This article proposes such a methodology to evaluate the detection performance of imaging algorithms. For illustration, the methodology is applied to some example data using three FMC imaging algorithms: the total focusing method (TFM), phase-coherent imaging (PCI), and decomposition of the time-reversal operator with multiple scattering filter (DORT MSF). However, it is important to note that this is solely to illustrate the methodology; this article does not attempt the broader investigation of different cases that would be needed to compare the performance of these algorithms in general. The methodology considers the statistics of detection, presenting the detection performance as probability of detection (POD) and probability of false alarm (PFA). A test sample of coarse-grained nickel superalloy, manufactured to represent materials used for future power plant components and containing some simple artificial defects, is used to illustrate the method on the candidate algorithms. The data are captured in pulse-echo mode using 64-element array probes at center frequencies of 1 and 5 MHz. In this particular case, it turns out that all three algorithms perform very similarly when comparing their flaw detection capabilities.
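
    The core POD/PFA bookkeeping reduces to threshold sweeps over flaw-site and flaw-free-site amplitudes. A schematic illustration with synthetic amplitudes standing in for real TFM/PCI/DORT images:

        # POD and PFA as functions of an amplitude detection threshold.
        import numpy as np

        rng = np.random.default_rng(7)
        noise = rng.rayleigh(scale=1.0, size=5000)        # grain-noise pixels (synthetic)
        flaws = rng.normal(loc=4.0, scale=1.0, size=50)   # defect responses (synthetic)

        for thr in (2.0, 3.0, 4.0):
            pod = np.mean(flaws >= thr)   # fraction of flaw sites detected
            pfa = np.mean(noise >= thr)   # fraction of clean sites falsely flagged
            print(f"threshold {thr:.1f}: POD = {pod:.2f}, PFA = {pfa:.3f}")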

  18. The anticipated transition to adulthood: effects of culture and individual experience on Polish and Finnish adolescents' future orientations.

    PubMed

    Trempala, J; Malmberg, L E

    1998-05-01

    The purpose of this study was to describe the effect of a set of individual resources and cultural factors on adolescents' probability estimations of the occurrence of positive future events in three life domains: education, occupation, and family. The hypothesis was that the effects of culture and individual resources are interwoven in the formation process of future orientation. The sample consisted of 352 17-year-old Polish and Finnish girls and boys from vocational and upper secondary schools. The 78-item questionnaire developed by the authors was used to measure different aspects of future orientation (probability, valence, and extension of future events in three life domains) and individual resources (self-esteem, control beliefs, and social knowledge about normativity and the generation gap). Data analysis showed that culture separately affected individual resources and adolescents' expectations. However, the results broadly confirmed the thesis that culture has a limited effect on adolescents' expectations of the occurrence of future events. Moreover, these data suggested that the influence of sociocultural differences on adolescents' probability estimations is indirect. In the context of the presented data, the authors discuss their model of future orientation.

  19. Sensor planning for moving targets

    NASA Astrophysics Data System (ADS)

    Musman, Scott A.; Lehner, Paul; Elsaesser, Chris

    1994-10-01

    Planning a search for moving ground targets is difficult for humans and computationally intractable. This paper describes a technique to solve such problems. The main idea is to combine probability-of-detection assessments with computational search heuristics to generate sensor plans which approximately maximize either the probability of detection or a user-specified knowledge function (e.g., determining the target's probable destination; locating the enemy tanks). In contrast to supercomputer-based moving target search planning, our technique has been implemented using workstation technology. The data structures generated by sensor planning can be used to evaluate sensor reports during plan execution. Our system revises its objective function with each sensor report, allowing the user to assess both the current situation and the expected value of future information. This capability is particularly useful in situations involving a high rate of sensor reporting, helping the user focus attention on the sensor reports most pertinent to current needs. Our planning approach is implemented in a three-layer architecture. The layers are: mobility analysis, followed by sensor coverage analysis, and concluding with sensor plan analysis. Using these layers, the physical, spatial, and temporal characteristics of a scenario can be described in the first two layers, and the final analysis customized to specific intelligence objectives. The architecture also allows a user to customize operational parameters in each of the three major components of the system. As examples of these performance options, we briefly describe the mobility analysis and discuss issues affecting sensor plan analysis.
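
    The plan-generation idea can be illustrated with a greedy selection over candidate sensor placements, assuming independent misses; the options and detection probabilities below are invented:

        # Greedy sensor-plan construction maximizing joint detection probability.
        p_detect = {
            "UAV_pass_A": 0.35, "UAV_pass_B": 0.30,
            "ground_radar": 0.25, "acoustic_line": 0.15,
        }

        def joint_pod(chosen):
            miss = 1.0
            for s in chosen:
                miss *= 1.0 - p_detect[s]     # independent-miss assumption
            return 1.0 - miss

        chosen = []
        while len(chosen) < 3:
            best = max((s for s in p_detect if s not in chosen),
                       key=lambda s: joint_pod(chosen + [s]))
            chosen.append(best)
            print(f"add {best}: cumulative P(detect) = {joint_pod(chosen):.3f}")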

  20. Return to contingency: developing a coherent strategy for future R2E/R3 land medical capabilities.

    PubMed

    Ingram, Mike; Mahan, J

    2015-03-01

    Key to deploying forces in the future will be the provision of a rapidly deployable Deployed Hospital Capability. Developing this capability has been the focus of 34 Field Hospital and 2nd Medical Brigade over the last 18 months, and this paper gives a personal account of the development work to date. Future contingent Deployed Hospital Capability must meet the requirements of Defence; that is, to be rapidly deployable while delivering a hospital standard of care. The excellence seen in clinical delivery on recent operations is intensive in personnel, equipment, infrastructure and sustainment. The challenge in developing a coherent capability has been in balancing clinical capability and capacity against strategic load in light of recent advances in battlefield medicine. This paper explores the issues encountered and the solutions found to date in reconstituting a Very High Readiness Deployed Hospital Capability.

  1. Germ-line engineering, freedom, and future generations.

    PubMed

    Cooke, Elizabeth F

    2003-02-01

    New technologies in germ-line engineering have raised many questions about obligations to future generations. In this article, I focus on the importance of increasing freedom and the equality of freedom for present and future generations, because these two ideals are necessary for a just society and because they are most threatened by the wide-scale privatisation of GLE technologies. However, there are ambiguities in applying these ideals to the issue of genetic technologies. I argue that Amartya Sen's capability theory can be used as a framework to ensure freedom and equality in the use of GLE technology. Capability theory articulates the goal of equalising real freedom by bringing all people up to a threshold of basic human capabilities. Sen's capability theory can clarify the proper moral goal of GLE insofar as this technology could be used to bring people up to certain basic human capabilities, thereby increasing their real freedom. And by increasing the freedom of those who lack basic human capabilities, GLE can aid in decreasing the inequalities of freedom among classes of people.

  2. Forecasting of future earthquakes in the northeast region of India considering energy released concept

    NASA Astrophysics Data System (ADS)

    Zarola, Amit; Sil, Arjun

    2018-04-01

    This study presents the forecasting of the time and magnitude of the next earthquake in northeast India, using four probability distribution models (Gamma, Lognormal, Weibull and Log-logistic) and an updated earthquake catalog of magnitude Mw ≥ 6.0 events that occurred from 1737 to 2015 in the study area. On the basis of the past seismicity of the region, two types of conditional probabilities have been estimated using the best fit model and its parameters. The first is the probability that the seismic energy (e × 1020 ergs) expected to be released in the future earthquake exceeds a certain level of seismic energy (E × 1020 ergs). The second is the probability that the seismic energy (a × 1020 ergs/year) expected to be released per year exceeds a certain level of seismic energy per year (A × 1020 ergs/year). The log-likelihood functions (ln L) were also estimated for all four probability distribution models; a higher value of ln L indicates a better model and a lower value a worse one. The time of the future earthquake is forecasted by dividing the total seismic energy expected to be released in the future earthquake by the total seismic energy expected to be released per year. The epicentres of the recent 4 January 2016 Manipur earthquake (M 6.7), 13 April 2016 Myanmar earthquake (M 6.9) and 24 August 2016 Myanmar earthquake (M 6.8) are located in zones Z.12, Z.16 and Z.15, respectively, which are identified seismic source zones in the study area, showing that the proposed techniques and models yield good forecasting accuracy.
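
    The model-selection step described above (fit candidate distributions, compare maximized ln L) can be sketched with scipy; the sample below is synthetic, not the northeast India catalog:

        # Fit four candidate distributions and compare log-likelihoods.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        energies = rng.gamma(shape=1.5, scale=2.0, size=60)   # stand-in data

        candidates = {
            "Gamma": stats.gamma,
            "Lognormal": stats.lognorm,
            "Weibull": stats.weibull_min,
            "Log-logistic": stats.fisk,
        }
        for name, dist in candidates.items():
            params = dist.fit(energies, floc=0)               # fix location at 0
            lnL = np.sum(dist.logpdf(energies, *params))      # maximized log-likelihood
            print(f"{name:13s} ln L = {lnL:8.2f}")            # higher = better fit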

  3. Future southcentral US wildfire probability due to climate change

    USGS Publications Warehouse

    Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.

    2018-01-01

    Globally, changing fire regimes due to climate change are one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51 to +240%. The greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is the identification of climate changes at which the fire probability response (+, −) may change sign (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.

  4. Assessing the present and future probability of Hurricane Harvey's rainfall

    NASA Astrophysics Data System (ADS)

    Emanuel, Kerry

    2017-11-01

    We estimate, for current and future climates, the annual probability of areally averaged hurricane rain of Hurricane Harvey's magnitude by downscaling large numbers of tropical cyclones from three climate reanalyses and six climate models. For the state of Texas, we estimate that the annual probability of 500 mm of area-integrated rainfall was about 1% in the period 1981–2000 and will increase to 18% over the period 2081–2100 under Intergovernmental Panel on Climate Change (IPCC) AR5 representative concentration pathway 8.5. If the frequency of such events increases linearly between these two periods, then in 2017 the annual probability would be 6%, a sixfold increase since the late 20th century.
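
    The 2017 figure follows from straight-line interpolation between the two period midpoints; a quick check:

        # Linear interpolation of annual probability between period midpoints.
        p1, y1 = 0.01, 1990.5    # midpoint of 1981-2000
        p2, y2 = 0.18, 2090.5    # midpoint of 2081-2100
        p_2017 = p1 + (p2 - p1) * (2017 - y1) / (y2 - y1)
        print(f"{p_2017:.3f}")   # ~0.055, i.e. roughly 6% per year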

  5. Joint Chemical Agent Detector (JCAD): the future of chemical agent detection

    NASA Astrophysics Data System (ADS)

    Laljer, Charles E.; Owen, Jeffery L.

    2002-06-01

    The Joint Chemical Agent Detector (JCAD) will provide state-of-the-art chemical warfare agent detection capability to ground vehicle operators. Intelligence sources estimate that over twenty countries have active chemical weapons programs. The spread of chemical weapons to third-world nations, coupled with the potential for US involvement in these areas in an operational or support capacity, increases the probability that the Joint Services may encounter chemical agents and toxic industrial materials anywhere in the world. Currently fielded chemical agent detectors are bulky, labor intensive, and subject to false readings. No legacy detector is sensitive enough to provide detection and warning of the low-dose hazards associated with miosis contamination. The JCAD will provide a small, lightweight chemical agent detector for vehicle interiors, aircraft, individual personnel, shipboard, and fixed-site locations. The system provides common detection components across multi-service platforms. This common detector system will allow the Joint Services to use the same operational and support concept for more efficient utilization of resources. The JCAD will detect, identify, quantify, and warn of the presence of chemical agents prior to the onset of miosis. Upon detection of chemical agents, the detector will provide local and remote audible and visual alarms to the operators. Advance warning will provide the vehicle crew with the time necessary to protect themselves from the lethal effects of chemical agents. The JCAD will also be capable of being upgraded to protect against future chemical agent threats. The JCAD will provide vehicle operators with the warning necessary to survive and fight in a chemical warfare agent threat environment.

  6. The Agility Advantage: A Survival Guide for Complex Enterprises and Endeavors

    DTIC Science & Technology

    2011-09-01

    weighted probability 75th pctl weighted probability 13. Carman, K. G. and Kooreman, P., "Flu Shots, Mammogram, and the Perception of Probabilities," 2010...sharing capabilities until users can testify to the benefits. This creates a chicken-and-egg situation, because the consumers of information fi...Bibliography 563 Campen, Alan D. Look Closely at Network-Centric Warfare. Signal, January 2004. Carman, Katherine G., and Peter Kooreman. Flu Shots

  7. Evaluating Perceived Probability of Threat-Relevant Outcomes and Temporal Orientation in Flying Phobia.

    PubMed

    Mavromoustakos, Elena; Clark, Gavin I; Rock, Adam J

    2016-01-01

    Probability bias regarding threat-relevant outcomes has been demonstrated across anxiety disorders but has not been investigated in flying phobia. Individual temporal orientation (time perspective) may be hypothesised to influence estimates of negative outcomes occurring. The present study investigated whether probability bias could be demonstrated in flying phobia and whether probability estimates of negative flying events was predicted by time perspective. Sixty flying phobic and fifty-five non-flying-phobic adults were recruited to complete an online questionnaire. Participants completed the Flight Anxiety Scale, Probability Scale (measuring perceived probability of flying-negative events, general-negative and general positive events) and the Past-Negative, Future and Present-Hedonistic subscales of the Zimbardo Time Perspective Inventory (variables argued to predict mental travel forward and backward in time). The flying phobic group estimated the probability of flying negative and general negative events occurring as significantly higher than non-flying phobics. Past-Negative scores (positively) and Present-Hedonistic scores (negatively) predicted probability estimates of flying negative events. The Future Orientation subscale did not significantly predict probability estimates. This study is the first to demonstrate probability bias for threat-relevant outcomes in flying phobia. Results suggest that time perspective may influence perceived probability of threat-relevant outcomes but the nature of this relationship remains to be determined.

  8. Evaluating Perceived Probability of Threat-Relevant Outcomes and Temporal Orientation in Flying Phobia

    PubMed Central

    Mavromoustakos, Elena; Clark, Gavin I.; Rock, Adam J.

    2016-01-01

    Probability bias regarding threat-relevant outcomes has been demonstrated across anxiety disorders but has not been investigated in flying phobia. Individual temporal orientation (time perspective) may be hypothesised to influence estimates of negative outcomes occurring. The present study investigated whether probability bias could be demonstrated in flying phobia and whether probability estimates of negative flying events was predicted by time perspective. Sixty flying phobic and fifty-five non-flying-phobic adults were recruited to complete an online questionnaire. Participants completed the Flight Anxiety Scale, Probability Scale (measuring perceived probability of flying-negative events, general-negative and general positive events) and the Past-Negative, Future and Present-Hedonistic subscales of the Zimbardo Time Perspective Inventory (variables argued to predict mental travel forward and backward in time). The flying phobic group estimated the probability of flying negative and general negative events occurring as significantly higher than non-flying phobics. Past-Negative scores (positively) and Present-Hedonistic scores (negatively) predicted probability estimates of flying negative events. The Future Orientation subscale did not significantly predict probability estimates. This study is the first to demonstrate probability bias for threat-relevant outcomes in flying phobia. Results suggest that time perspective may influence perceived probability of threat-relevant outcomes but the nature of this relationship remains to be determined. PMID:27557054

  9. Concluding Remarks: The Current Status and Future Prospects for GRB Astronomy

    NASA Technical Reports Server (NTRS)

    Gehrels, Neil

    2009-01-01

    We are in a remarkable period of discovery in GRB astronomy. The current satellites, including Swift, Fermi, AGILE and INTEGRAL, are detecting and observing bursts of all varieties. Increasing capabilities for follow-up observations on the ground and in space are leading to rapid and deep coverage across the electromagnetic spectrum. The future will see continued operation of the current experiments along with future missions like SVOM, plus possible missions like JANUS and EXIST. An exciting expansion of capabilities is occurring in areas of gravitational waves and neutrinos that could open new windows on the GRB phenomenon. Increased IR capabilities on the ground and with missions like JWST will enable further exploration of high-redshift bursts. The future is bright.

  10. Navy/Marine Corps innovative science and technology developments for future enhanced mine detection capabilities

    NASA Astrophysics Data System (ADS)

    Holloway, John H., Jr.; Witherspoon, Ned H.; Miller, Richard E.; Davis, Kenn S.; Suiter, Harold R.; Hilton, Russell J.

    2000-08-01

    JMDT is a Navy/Marine Corps 6.2 Exploratory Development program that is closely coordinated with the 6.4 COBRA acquisition program. The objective of the program is to develop innovative science and technology to enhance future mine detection capabilities. Prior to transition to acquisition, the COBRA ATD was extremely successful in demonstrating a passive airborne multispectral video sensor system operating in the tactical Pioneer unmanned aerial vehicle (UAV), combined with an integrated ground station subsystem to detect and locate minefields from surf zone to inland areas. JMDT is investigating advanced technology solutions for future enhancements in minefield detection capability beyond the current COBRA ATD demonstrated capabilities. JMDT has recently been delivered next-generation, innovative hardware which was specified by the Coastal Systems Station and developed under contract. This hardware includes an agile-tuning multispectral, polarimetric, digital video camera and advanced multi-wavelength laser illumination technologies to extend the same sorts of multispectral detections from a UAV into the night and over shallow water and other difficult littoral regions. One of these illumination devices is an ultra-compact, highly-efficient near-IR laser diode array. The other is a multi-wavelength range-gateable laser. Additionally, in conjunction with this new technology, algorithm enhancements are being developed in JMDT for future naval capabilities which will outperform the already impressive record of automatic detection of minefields demonstrated by the COBRA ATD.

  11. A Model-Based Architecture Approach to Ship Design Linking Capability Needs to System Solutions

    DTIC Science & Technology

    2012-06-01

    NSSM NATO Sea Sparrow Missile RAM Rolling Airframe Missile CIWS Close-In Weapon System 3D Three Dimensional Ps Probability of Survival PHit ...example effectiveness model. The primary MOP is the inverse of the probability of taking a hit (1 − PHit), which, in this study, will be referred to as

  12. Time transfer techniques: Historical overview, current practices and future capabilities

    NASA Technical Reports Server (NTRS)

    Klepczynski, W. J.

    1984-01-01

    A brief historical review of time transfer techniques used during the last twenty years is presented. Methods currently used are discussed in terms of cost effectiveness as a function of accuracy achievable. Future trends are also discussed in terms of projected timekeeping capabilities.

  13. COMPUTER SUPPORT SYSTEMS FOR ESTIMATING CHEMICAL TOXICITY: PRESENT CAPABILITIES AND FUTURE TRENDS

    EPA Science Inventory

    Computer Support Systems for Estimating Chemical Toxicity: Present Capabilities and Future Trends

    A wide variety of computer-based artificial intelligence (AI) and decision support systems exist currently to aid in the assessment of toxicity for environmental chemicals. T...

  14. Assessing the present and future probability of Hurricane Harvey's rainfall.

    PubMed

    Emanuel, Kerry

    2017-11-28

    We estimate, for current and future climates, the annual probability of areally averaged hurricane rain of Hurricane Harvey's magnitude by downscaling large numbers of tropical cyclones from three climate reanalyses and six climate models. For the state of Texas, we estimate that the annual probability of 500 mm of area-integrated rainfall was about 1% in the period 1981-2000 and will increase to 18% over the period 2081-2100 under Intergovernmental Panel on Climate Change (IPCC) AR5 representative concentration pathway 8.5. If the frequency of such events increases linearly between these two periods, then in 2017 the annual probability would be 6%, a sixfold increase since the late 20th century.

  15. Military clouds: utilization of cloud computing systems at the battlefield

    NASA Astrophysics Data System (ADS)

    Süleyman, Sarıkürk; Volkan, Karaca; İbrahim, Kocaman; Ahmet, Şirzai

    2012-05-01

    Cloud computing is known as a novel information technology (IT) concept, which involves facilitated and rapid access to networks, servers, data storage media, applications and services via the Internet with minimum hardware requirements. The use of information systems and technologies at the battlefield is not new. Information superiority is a force multiplier and is crucial to mission success. Recent advances in information systems and technologies provide new means for decision makers and users to gain information superiority. These developments in information technologies have led to a new term, known as network centric capability. Like other network centric capable systems, cloud computing systems are operational today. In the near future, extensive use of military clouds at the battlefield is predicted. Integrating cloud computing logic into network centric applications will increase the flexibility, cost-effectiveness, efficiency and accessibility of network-centric capabilities. In this paper, cloud computing and network centric capability concepts are defined. Some commercial cloud computing products and applications are mentioned. Network centric capable applications are covered. Cloud computing supported battlefield applications are analyzed. The effects of cloud computing systems on network centric capability and on the information domain in future warfare are discussed. Battlefield opportunities and novelties which might be introduced to network centric capability by cloud computing systems are researched. The role of military clouds in future warfare is proposed in this paper. It was concluded that military clouds will be indispensable components of the future battlefield. Military clouds have the potential of improving network centric capabilities, increasing situational awareness at the battlefield and facilitating the settlement of information superiority.

  16. Reducing the Risk of Human Space Missions with INTEGRITY

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.; Dillon-Merill, Robin L.; Tri, Terry O.; Henninger, Donald L.

    2003-01-01

    The INTEGRITY Program will design and operate a test bed facility to help prepare for future beyond-LEO missions. The purpose of INTEGRITY is to enable future missions by developing, testing, and demonstrating advanced human space systems. INTEGRITY will also implement and validate advanced management techniques including risk analysis and mitigation. One important way INTEGRITY will help enable future missions is by reducing their risk. A risk analysis of human space missions is important in defining the steps that INTEGRITY should take to mitigate risk. This paper describes how a Probabilistic Risk Assessment (PRA) of human space missions will help support the planning and development of INTEGRITY to maximize its benefits to future missions. PRA is a systematic methodology to decompose the system into subsystems and components, to quantify the failure risk as a function of the design elements and their corresponding probability of failure. PRA provides a quantitative estimate of the probability of failure of the system, including an assessment and display of the degree of uncertainty surrounding the probability. PRA provides a basis for understanding the impacts of decisions that affect safety, reliability, performance, and cost. Risks with both high probability and high impact are identified as top priority. The PRA of human missions beyond Earth orbit will help indicate how the risk of future human space missions can be reduced by integrating and testing systems in INTEGRITY.
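
    A minimal sketch of the PRA arithmetic described above, with invented component probabilities: subsystems combine in series (any failure loses the system) or in parallel (all redundant units must fail). A real PRA also propagates uncertainty on each input; this toy omits that step:

        # Series/parallel failure-probability combination for a toy system.
        def series(*probs):            # system fails if ANY element fails
            ok = 1.0
            for p in probs:
                ok *= 1.0 - p
            return 1.0 - ok

        def parallel(*probs):          # fails only if ALL redundant units fail
            fail = 1.0
            for p in probs:
                fail *= p
            return fail

        life_support = parallel(1e-2, 1e-2)              # two redundant strings
        mission_loss = series(life_support, 5e-4, 2e-4)  # plus power and comms
        print(f"P(mission loss) ~ {mission_loss:.2e}")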

  17. Emergency preparedness: community-based short-term eruption forecasting at Campi Flegrei

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Marzocchi, Warner; Civetta, Lucia; Del Pezzo, Edoardo; Papale, Paolo

    2010-05-01

    A key element in emergency preparedness is to define, in advance, tools that assist decision makers and emergency management groups during crises. Such tools must be prepared ahead of time, accounting for all of the expertise and scientific knowledge accumulated through time. During a pre-eruptive phase, the key to sound short-term eruption forecasting is the analysis of the monitoring signals. This involves the capability (i) to recognize anomalous signals and to relate single or combined anomalies to physical processes, assigning them probability values, and (ii) to quickly provide an answer to the observed phenomena even when unexpected. Here we present a more than four-year-long process devoted to defining the pre-eruptive Event Tree (ET) for Campi Flegrei. A community of about 40 experts in volcanology and volcano monitoring, participating in two Italian projects on Campi Flegrei funded by the Italian Civil Protection, was constituted and trained during periodic meetings on the statistical methods and on the model BET_EF (Marzocchi et al., 2008) that forms the statistical package for ET definition. Model calibration was carried out through public elicitation sessions, preceded and followed by dedicated meetings and web forum discussions on the monitoring parameters, their accuracy and relevance, and their potential meanings. The calibrated ET allows anomalies in the monitored parameters to be recognized and interpreted, assigning probability values to each set of data. This process de-personalizes the difficult task of interpreting multi-parametric sets of data during on-going emergencies, and provides a view of the observed variations that accounts for the averaged, weighted opinion of the scientific community. An additional positive outcome of the described ET calibration process is that it provides a picture of the degree of confidence of the expert community in the capability of the many different monitored quantities to reveal significant variations in the state of the volcano. This picture is particularly useful since it can be used to guide future implementations of the monitoring network, as well as research investments aimed at substantially improving the capability to forecast short-term volcanic hazard.
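
    The arithmetic at the heart of event-tree forecasting is simply the product of conditional probabilities along a branch; a bare-bones sketch in the spirit of BET_EF, with placeholder node values rather than the elicited Campi Flegrei ones:

        # Event-tree branch probability = product of conditional node probabilities.
        nodes = {
            "unrest": 0.5,                   # P(unrest) in the forecast window (invented)
            "magmatic | unrest": 0.4,
            "eruption | magmatic": 0.3,
        }
        p_eruption = (nodes["unrest"]
                      * nodes["magmatic | unrest"]
                      * nodes["eruption | magmatic"])
        print(f"P(eruption) = {p_eruption:.3f}")   # 0.060 with these placeholders

    In BET_EF each node probability is itself a distribution updated by monitoring anomalies, which is what lets the tree respond to incoming data during a crisis.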

  18. Ares V an Enabling Capability for Future Space Astrophysics Missions

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2007-01-01

    The potential capability offered by an Ares V launch vehicle completely changes the paradigm for future space astrophysics missions. This presentation examines some details of this capability and its impact on potential missions. A specific case study is presented: implementing a 6 to 8 meter class monolithic UV/Visible telescope at an L2 orbit. Additionally discussed is how to extend the mission life of such a telescope to 30 years or longer.

  19. Implementation of the semiclassical quantum Fourier transform in a scalable system.

    PubMed

    Chiaverini, J; Britton, J; Leibfried, D; Knill, E; Barrett, M D; Blakestad, R B; Itano, W M; Jost, J D; Langer, C; Ozeri, R; Schaetz, T; Wineland, D J

    2005-05-13

    We report the implementation of the semiclassical quantum Fourier transform in a system of three beryllium ion qubits (two-level quantum systems) confined in a segmented multizone trap. The quantum Fourier transform is the crucial final step in Shor's algorithm, and it acts on a register of qubits to determine the periodicity of the quantum state's amplitudes. Because only probability amplitudes are required for this task, a more efficient semiclassical version can be used, for which only single-qubit operations conditioned on measurement outcomes are required. We apply the transform to several input states of different periodicities; the results enable the location of peaks corresponding to the original periods. This demonstration incorporates the key elements of a scalable ion-trap architecture, suggesting the future capability of applying the quantum Fourier transform to a large number of qubits as required for a useful quantum factoring algorithm.
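
    A small numerical illustration (not the trapped-ion experiment itself) of what the transform extracts: the Fourier transform of a state whose amplitudes repeat with period r concentrates its probability on integer multiples of N/r, which is how the period is read out:

        # QFT of a period-r amplitude pattern peaks at multiples of N/r.
        import numpy as np

        N, r = 64, 8
        amps = np.zeros(N)
        amps[::r] = 1.0                      # period-r amplitude pattern
        amps /= np.linalg.norm(amps)         # normalize the state

        probs = np.abs(np.fft.fft(amps) / np.sqrt(N)) ** 2
        peaks = np.nonzero(probs > 1e-9)[0]
        print(peaks)                         # multiples of N/r = 8: [0 8 16 ...]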

  20. Report of Study On Airlines' Anticipated Near Future Cockpit Control and Display Capabilities and Plans For Data Link Communication, Part 2

    DOT National Transportation Integrated Search

    1995-07-01

    In support of the Federal Aviation Administration (FAA) Airborne Data Link : Program, CTA INCORPORATED researched airlines' anticipated near future cockpit : control and display capabilities and associated plans for Data Link : communication. This ef...

  1. Rendering visual events as sounds: Spatial attention capture by auditory augmented reality.

    PubMed

    Stone, Scott A; Tata, Matthew S

    2017-01-01

    Many salient visual events tend to coincide with auditory events, such as seeing and hearing a car pass by. Information from the visual and auditory senses can be used to create a stable percept of the stimulus. Having access to related coincident visual and auditory information can help for spatial tasks such as localization. However, not all visual information has analogous auditory percepts, such as viewing a computer monitor. Here, we describe a system capable of detecting salient visual events and augmenting them into localizable auditory events. The system uses a neuromorphic camera (DAVIS 240B) to detect logarithmic changes of brightness intensity in the scene, which can be interpreted as salient visual events. Participants were blindfolded and asked to use the device to detect new objects in the scene, as well as to determine the direction of motion of a moving visual object. Results suggest the system is robust enough to allow for the simple detection of new salient stimuli, as well as accurate encoding of the direction of visual motion. Future successes are probable, as neuromorphic devices are likely to become faster and smaller, making this system much more feasible.
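
    The underlying event-generation rule (a contrast threshold on log intensity) can be sketched with two frames standing in for the camera's asynchronous per-pixel circuit; all values are invented:

        # DAVIS-style event generation: threshold on change of log intensity.
        import numpy as np

        def events_from_frames(frame_prev, frame_next, threshold=0.15):
            d = np.log(frame_next + 1e-6) - np.log(frame_prev + 1e-6)
            on = np.argwhere(d > threshold)    # brightness increased
            off = np.argwhere(d < -threshold)  # brightness decreased
            return on, off

        prev = np.full((4, 4), 100.0)
        nxt = prev.copy()
        nxt[1, 2] = 140.0                      # a new bright object appears
        on, off = events_from_frames(prev, nxt)
        print("ON events at:", on.tolist(), "OFF events at:", off.tolist())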

  2. Rendering visual events as sounds: Spatial attention capture by auditory augmented reality

    PubMed Central

    Stone, Scott A.; Tata, Matthew S.

    2017-01-01

    Many salient visual events tend to coincide with auditory events, such as seeing and hearing a car pass by. Information from the visual and auditory senses can be used to create a stable percept of the stimulus. Having access to related coincident visual and auditory information can help for spatial tasks such as localization. However, not all visual information has analogous auditory percepts, such as viewing a computer monitor. Here, we describe a system capable of detecting salient visual events and augmenting them into localizable auditory events. The system uses a neuromorphic camera (DAVIS 240B) to detect logarithmic changes of brightness intensity in the scene, which can be interpreted as salient visual events. Participants were blindfolded and asked to use the device to detect new objects in the scene, as well as to determine the direction of motion of a moving visual object. Results suggest the system is robust enough to allow for the simple detection of new salient stimuli, as well as accurate encoding of the direction of visual motion. Future successes are probable, as neuromorphic devices are likely to become faster and smaller, making this system much more feasible. PMID:28792518

  3. Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-09-20

    These are slides from a seminar given to the University of New Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety (NCS) validation. It uses the sensitivity profile data for an application, as computed by MCNP6, along with covariance files for the nuclear data to determine a baseline upper subcritical limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6 and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation, making full use of today's computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.

  4. Fire spread probabilities for experimental beds composed of mixedwood boreal forest fuels

    Treesearch

    M.B. Dickinson; E.A. Johnson; R. Artiaga

    2013-01-01

    Although fuel characteristics are assumed to have an important impact on fire regimes through their effects on extinction dynamics, limited capabilities exist for predicting whether a fire will spread in mixedwood boreal forest surface fuels. To improve predictive capabilities, we conducted 347 no-wind, laboratory test burns in surface fuels collected from the mixed-...

  5. The Origin of the United States Security Commitment to the Republic of Korea

    DTIC Science & Technology

    1987-06-01

    and war and of the effect of these objectives on our strategic plans, in the light of the probable fission bomb capability and possible thermonuclear bomb capability of the Soviet Union. A special ad hoc working group was formed under Nitze to conduct this study. This group took advantage of the

  6. Temporally Adaptive Sampling: A Case Study in Rare Species Survey Design with Marbled Salamanders (Ambystoma opacum)

    PubMed Central

    Charney, Noah D.; Kubel, Jacob E.; Eiseman, Charles S.

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224
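
    As a rough illustration of why concentrating effort in good years pays off, the toy simulation below (all parameters hypothetical, not from the study) compares uniform and temporally adaptive allocation of a fixed survey budget when detectability is regionally correlated across years.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_years, n_ponds, effort = 10, 200, 400            # hypothetical budget

    # Regionally correlated detectability: alternating good and poor years.
    p_year = np.where(np.arange(n_years) % 2 == 0, 0.7, 0.1)
    occupied = rng.random(n_ponds) < 0.3               # true occupancy

    def detections(surveys_per_year):
        hits = 0
        for y, n in enumerate(surveys_per_year):
            ponds = rng.integers(0, n_ponds, size=n)
            hits += np.sum(occupied[ponds] & (rng.random(n) < p_year[y]))
        return hits

    uniform = [effort // n_years] * n_years
    # Adaptive: spend the budget only in years flagged as good (e.g., by
    # sentinel ponds that are monitored every year).
    good = [y for y in range(n_years) if p_year[y] > 0.5]
    adaptive = [effort // len(good) if y in good else 0
                for y in range(n_years)]
    print(detections(uniform), detections(adaptive))
    ```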

  7. Comments on potential geologic and seismic hazards affecting coastal Ventura County, California

    USGS Publications Warehouse

    Ross, Stephanie L.; Boore, David M.; Fisher, Michael A.; Frankel, Arthur D.; Geist, Eric L.; Hudnut, Kenneth W.; Kayen, Robert E.; Lee, Homa J.; Normark, William R.; Wong, Florence L.

    2004-01-01

    This report examines the regional seismic and geologic hazards that could affect proposed liquefied natural gas (LNG) facilities in coastal Ventura County, California. Faults throughout this area are thought to be capable of producing earthquakes of magnitude 6.5 to 7.5, which could produce surface fault offsets of as much as 15 feet. Many of these faults are sufficiently well understood to be included in the current generation of the National Seismic Hazard Maps; others may become candidates for inclusion in future revisions as research proceeds. Strong shaking is the primary cause of earthquake damage, and this area is zoned with a high level of shaking hazard. The estimated probability of a magnitude 6.5 or larger earthquake (comparable in size to the 2003 San Simeon earthquake) occurring in the next 30 years within 30 miles of Platform Grace is 50-60%; for Cabrillo Port, the estimate is a 35% likelihood. Combining these probabilities of earthquake occurrence with relationships that give expected ground motions yields the estimated seismic-shaking hazard. In parts of the project area, the estimated shaking hazard is as high as that along the San Andreas Fault. The combination of long-period basin waves and LNG installations with large long-period resonances potentially increases this hazard.

  8. NASA capabilities roadmap: advanced telescopes and observatories

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee D.

    2005-01-01

    The NASA Advanced Telescopes and Observatories (ATO) Capability Roadmap addresses technologies necessary for NASA to enable future space telescopes and observatories operating in all electromagnetic bands, from x-rays to millimeter waves, and including gravitational waves. It has derived capability priorities from current and developing Space Missions Directorate (SMD) strategic roadmaps and, where appropriate, has ensured their consistency with other NASA Strategic and Capability Roadmaps. Technology topics include optics; wavefront sensing and control and interferometry; distributed and advanced spacecraft systems; cryogenic and thermal control systems; large precision structures for observatories; and the infrastructure essential to future space telescopes and observatories.

  9. A review of future remote sensing satellite capabilities

    NASA Technical Reports Server (NTRS)

    Calabrese, M. A.

    1980-01-01

    Existing, planned and future NASA capabilities in the field of remote sensing satellites are reviewed in relation to the use of remote sensing techniques for the identification of irrigated lands. The status of the currently operational Landsat 2 and 3 satellites is indicated, and it is noted that Landsat D is scheduled to be in operation in two years. The orbital configuration and instrumentation of Landsat D are discussed, with particular attention given to the thematic mapper, which is expected to improve capabilities for small field identification and crop discrimination and classification. Future possibilities are then considered, including a multi-spectral resource sampler supplying high spatial and temporal resolution data possibly based on push-broom scanning, Shuttle-maintained Landsat follow-on missions, a satellite to obtain high-resolution stereoscopic data, further satellites providing all-weather radar capability and the Large Format Camera.

  10. Oceans in the Outer Solar System: Future Exploration of Europa, Titan, and Enceladus

    NASA Astrophysics Data System (ADS)

    Johnson, T.; Clark, K.; Cutts, J.; Lunine, J.; Pappalardo, R.; Reh, K.

    Observational and theoretical evidence points to water-rich oceans or seas within several of the icy satellites of the outer planets, notably Europa and Enceladus, and hydrocarbon reservoirs within Titan. Here we report on concepts for future studies of these fascinating targets of high astrobiological relevance. Europa Exploration: Post-Galileo exploration of Europa presents several major technical challenges. We argue that four recent investments in technology and research allow a flagship-class Europa exploration mission that relies on demonstrated technologies and achieves the high-level science objectives. 1. Mass and Trip Time: Utilizing indirect Earth gravity-assist trajectories allows ~2000-3000 kg dry mass, permitting ~150-200 kg of science payload. 2. Radiation Tolerant Electronics: A significant program of radiation-hard technology development has been carried out by NASA. The necessary radiation-tolerant elements are now ready for flight. 3. Science Mission: The science mission would last approximately two years, with a Jupiter system science phase of ~1.5 yr and a 90-day nominal orbital mission at Europa, with significant probability of functioning much longer. 4. Planetary Protection: The ultimate fate of an orbiter will be impact with Europa. Planetary protection requirements will be met by radiation sterilization during the primary mission for most external and unshielded internal surfaces, combined with pre-launch sterilization of shielded components. We conclude that a flagship-class Europa mission with significant scientific capability can now be developed relying on existing technologies. Titan and Enceladus Exploration: Remarkable discoveries by Cassini/Huygens related to hydrocarbons at Titan and water-vapor geysering at Enceladus demand follow-up of these astrobiologically relevant worlds by future missions. An aerial platform capable of observing the surface of Titan from beneath the obscuring cloud cover and descending repeatedly to the surface can offer a powerful scientific capability. Taking advantage of both the density and the cold temperature of Titan's atmosphere, a hot-air balloon implementation provides long-duration operation at a very modest cost in terms of energy input. A Saturn orbiter making repeated encounters with Titan and Enceladus in a so-called cycler orbit can carry out new science at Enceladus while also providing high-bandwidth downlink communications for the aerial platform.

  11. Predicting future changes in Muskegon River Watershed game fish distributions under future land cover alteration and climate change scenarios

    USGS Publications Warehouse

    Steen, Paul J.; Wiley, Michael J.; Schaeffer, Jeffrey S.

    2010-01-01

    Future alterations in land cover and climate are likely to cause substantial changes in the ranges of fish species. Predictive distribution models are an important tool for assessing the probability that these changes will cause increases, decreases, or extirpations of species. Classification tree models that predict the probability of game fish presence were applied to the streams of the Muskegon River watershed, Michigan. The models were used to study three potential future scenarios: (1) land cover change only, (2) land cover change and a 3°C increase in air temperature by 2100, and (3) land cover change and a 5°C increase in air temperature by 2100. The analysis indicated that the expected change in air temperature and subsequent change in water temperatures would result in the decline of coldwater fish in the Muskegon watershed by the end of the 21st century, while cool- and warmwater species would significantly increase their ranges. The greatest decline detected was a 90% reduction in the probability that brook trout Salvelinus fontinalis would occur in Bigelow Creek. The greatest increase was a 276% increase in the probability that northern pike Esox lucius would occur in the Middle Branch River. Changes in land cover are expected to cause large changes in a few fish species, such as walleye Sander vitreus and Chinook salmon Oncorhynchus tshawytscha, but not to drive major changes in species composition. Managers can alter stream environmental conditions to maximize the probability that species will reside in particular stream reaches through application of the classification tree models. Such models represent a good way to predict future changes, as they give quantitative estimates of the n-dimensional niches for particular species.
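
    A hedged sketch of the modeling approach with synthetic data (the predictors, thresholds, and species response below are invented, not the study's): fit a classification tree to presence/absence records, then query it under a warming scenario.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(1)
    n = 500
    # Hypothetical reach predictors: July water temperature (C), % forest.
    temp = rng.uniform(12, 26, n)
    forest = rng.uniform(0, 1, n)
    # Synthetic truth: a coldwater species favours cold, forested reaches.
    present = (temp < 18) & (forest > 0.4)

    X = np.column_stack([temp, forest])
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, present)

    # Probability of presence for one reach under a +3 C warming scenario:
    print(tree.predict_proba([[16.0 + 3.0, 0.6]])[0, 1])
    ```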

  12. Seasonal streamflow prediction using ensemble streamflow prediction technique for the Rangitata and Waitaki River basins on the South Island of New Zealand

    NASA Astrophysics Data System (ADS)

    Singh, Shailesh Kumar

    2014-05-01

    Streamflow forecasts are essential for making critical decisions about the optimal allocation of water supplies for various demands, including irrigation for agriculture, habitat for fisheries, hydropower production, and flood warning. The major objectives of this study are to explore Ensemble Streamflow Prediction (ESP) based forecasting in New Zealand catchments and to highlight the present seasonal flow forecasting capability of the National Institute of Water and Atmospheric Research (NIWA). In this study a probabilistic forecast framework for ESP is presented. The basic assumption in ESP is that future weather patterns were experienced historically; hence, past forcing data can be used with current initial conditions to generate an ensemble of predictions. Small differences in initial conditions can result in large differences in the forecast. The initial state of the catchment can be obtained by continuously running the model up to the current time; this initial state is then used with past forcing data to generate an ensemble of future flows. The approach taken here is to run TopNet hydrological models with a range of past forcing data (precipitation, temperature, etc.) from current initial conditions. The collection of runs is called the ensemble. ESP gives probabilistic forecasts for flow: probability distributions can be derived from the ensemble members, capturing part of the intrinsic uncertainty in weather or climate. An ensemble streamflow prediction system that provides probabilistic hydrological forecasts with lead times up to 3 months is presented for the Rangitata, Ahuriri, Hooker, and Jollie rivers in the South Island of New Zealand. ESP-based seasonal forecasts have better skill than climatology. This system can provide better overall information for holistic water resource management.
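
    The ESP recipe is easy to state in code: run the hydrological model from today's state once per historical forcing year and read the forecast distribution off the ensemble. The sketch below uses a one-parameter toy bucket model and synthetic forcings (TopNet and real data are not reproduced here).

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def bucket_model(storage, precip, k=0.1):
        """Toy model: daily flow is a fixed fraction of current storage."""
        flows = []
        for p in precip:
            storage += p
            q = k * storage
            storage -= q
            flows.append(q)
        return np.array(flows)

    # 30 historical seasons of daily precipitation (synthetic stand-in).
    history = rng.gamma(shape=0.8, scale=4.0, size=(30, 90))

    s0 = 120.0   # current catchment state from simulation up to today
    ensemble = np.array([bucket_model(s0, year) for year in history])

    seasonal = ensemble.sum(axis=1)               # volume per member
    print(np.percentile(seasonal, [10, 50, 90]))  # probabilistic forecast
    ```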

  13. A Man-Machine System for Contemporary Counseling Practice: Diagnosis and Prediction.

    ERIC Educational Resources Information Center

    Roach, Arthur J.

    This paper looks at present and future capabilities for diagnosis and prediction in computer-based guidance efforts and reviews the problems and potentials which will accompany the implementation of such capabilities. In addition to necessary procedural refinement in prediction, future developments in computer-based educational and career…

  14. The effects of neighborhood views containing multiple environmental features on road traffic noise perception at dwellings.

    PubMed

    Leung, T M; Xu, J M; Chau, C K; Tang, S K; Pun-Cheng, L S C

    2017-04-01

    The importance of non-acoustical factors, including the type of visual environment, in human noise perception is increasingly recognized. To reveal the relationships between long-term noise annoyance and different types of neighborhood views, 2033 questionnaire responses were collected to study how perceptions of different combinations of views of sea, urban river, greenery, and/or noise barrier affect the annoyance responses of residents living in high-rise apartments in Hong Kong. The collected responses were used to formulate a multivariate model to predict the probability of invoking a high annoyance response from residents. Results showed that views of sea, urban river, or greenery could lower the probability, while views of a noise barrier could increase it. Views of greenery had a stronger noise moderation capability than views of sea or urban river. An interaction effect between views of water and views of a noise barrier exerted a negative influence on the noise annoyance moderation capability. The probability due to exposure to an environment containing views of noise barriers and urban rivers would be even higher than that due to exposure to an environment containing views of noise barriers alone.
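
    The direction of the reported effects can be written down as a logistic model; the coefficients below are invented for illustration only (the paper's fitted values are not reproduced), but their signs follow the abstract: water and greenery lower the probability, a barrier raises it, and the water-by-barrier interaction offsets part of the moderation.

    ```python
    import math

    def p_high_annoyance(noise_db, sea, greenery, barrier):
        """Hypothetical logistic model of high annoyance probability."""
        z = (-8.0 + 0.12 * noise_db        # illustrative coefficients only
             - 0.6 * sea - 0.9 * greenery  # greenery moderates more than sea
             + 0.7 * barrier
             + 0.4 * sea * barrier)        # interaction weakens water benefit
        return 1.0 / (1.0 + math.exp(-z))

    print(p_high_annoyance(70, sea=1, greenery=0, barrier=0))
    print(p_high_annoyance(70, sea=1, greenery=0, barrier=1))
    ```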

  15. GIS-based probability assessment of natural hazards in forested landscapes of Central and South-Eastern Europe.

    PubMed

    Lorz, C; Fürst, C; Galic, Z; Matijasic, D; Podrazky, V; Potocic, N; Simoncic, P; Strauch, M; Vacik, H; Makeschin, F

    2010-12-01

    We assessed the probability of three major natural hazards--windthrow, drought, and forest fire--for Central and South-Eastern European forests, which are major threats to the provision of forest goods and ecosystem services. In addition, we analyzed their spatial distribution and the implications for a future-oriented management of forested landscapes. For estimating the probability of windthrow, we used rooting depth and average wind speed. Probabilities of drought and fire were calculated from the climatic and total water balance during the growing season. As an approximation to climate change scenarios, we used a simplified approach with a general increase of pET by 20%. Monitoring data from the pan-European forest crown condition program and observed burnt areas and hot spots from the European Forest Fire Information System were used to test the plausibility of the probability maps. Regions with high probabilities of natural hazard are identified, and management strategies to minimize the probability of natural hazards are discussed. We suggest future research should focus on (i) estimating probabilities using process-based models (including sensitivity analysis), (ii) defining probability in terms of economic loss, (iii) including biotic hazards, (iv) using more detailed data sets on natural hazards, forest inventories, and climate change scenarios, and (v) developing a framework of adaptive risk management.

  16. 32 CFR 154.40 - General.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... effort to assess the probability of future behavior which could have an effect adverse to the national... prior experience with similar cases, reasonably suggest a degree of probability of prejudicial behavior...

  17. 32 CFR 154.40 - General.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... effort to assess the probability of future behavior which could have an effect adverse to the national... prior experience with similar cases, reasonably suggest a degree of probability of prejudicial behavior...

  18. Camouflage through colour change: mechanisms, adaptive value and ecological significance

    PubMed Central

    Flores, Augusto A. V.

    2017-01-01

    Animals from a wide range of taxonomic groups are capable of colour change, of which camouflage is one of the main functions. A considerable amount of past work on this subject has investigated species capable of extremely rapid colour change (in seconds). However, relatively slow colour change (over hours, days, weeks and months), as well as changes arising via developmental plasticity are probably more common than rapid changes, yet less studied. We discuss three key areas of colour change and camouflage. First, we review the mechanisms underpinning colour change and developmental plasticity for camouflage, including cellular processes, visual feedback, hormonal control and dietary factors. Second, we discuss the adaptive value of colour change for camouflage, including the use of different camouflage types. Third, we discuss the evolutionary–ecological implications of colour change for concealment, including what it can tell us about intraspecific colour diversity, morph-specific strategies, and matching to different environments and microhabitats. Throughout, we discuss key unresolved questions and present directions for future work, and highlight how colour change facilitates camouflage among habitats and arises when animals are faced with environmental changes occurring over a range of spatial and temporal scales. This article is part of the themed issue ‘Animal coloration: production, perception, function and application’. PMID:28533459

  19. Camouflage through colour change: mechanisms, adaptive value and ecological significance.

    PubMed

    Duarte, Rafael C; Flores, Augusto A V; Stevens, Martin

    2017-07-05

    Animals from a wide range of taxonomic groups are capable of colour change, of which camouflage is one of the main functions. A considerable amount of past work on this subject has investigated species capable of extremely rapid colour change (in seconds). However, relatively slow colour change (over hours, days, weeks and months), as well as changes arising via developmental plasticity are probably more common than rapid changes, yet less studied. We discuss three key areas of colour change and camouflage. First, we review the mechanisms underpinning colour change and developmental plasticity for camouflage, including cellular processes, visual feedback, hormonal control and dietary factors. Second, we discuss the adaptive value of colour change for camouflage, including the use of different camouflage types. Third, we discuss the evolutionary-ecological implications of colour change for concealment, including what it can tell us about intraspecific colour diversity, morph-specific strategies, and matching to different environments and microhabitats. Throughout, we discuss key unresolved questions and present directions for future work, and highlight how colour change facilitates camouflage among habitats and arises when animals are faced with environmental changes occurring over a range of spatial and temporal scales. This article is part of the themed issue 'Animal coloration: production, perception, function and application'. © 2017 The Authors.

  20. Summary of NASA Advanced Telescope and Observatory Capability Roadmap

    NASA Technical Reports Server (NTRS)

    Stahl, H. Phil; Feinberg, Lee

    2006-01-01

    The NASA Advanced Telescope and Observatory (ATO) Capability Roadmap addresses technologies necessary for NASA to enable future space telescopes and observatories operating in all electromagnetic bands, from x-rays to millimeter waves, and including gravitational waves. It lists capability priorities derived from current and developing Space Missions Directorate (SMD) strategic roadmaps. Technology topics include optics; wavefront sensing and control and interferometry; distributed and advanced spacecraft systems; cryogenic and thermal control systems; large precision structures for observatories; and the infrastructure essential to future space telescopes and observatories.

  1. Summary of NASA Advanced Telescope and Observatory Capability Roadmap

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Feinberg, Lee

    2007-01-01

    The NASA Advanced Telescope and Observatory (ATO) Capability Roadmap addresses technologies necessary for NASA to enable future space telescopes and observatories operating in all electromagnetic bands, from x-rays to millimeter waves, and including gravitational waves. It lists capability priorities derived from current and developing Space Missions Directorate (SMD) strategic roadmaps. Technology topics include optics; wavefront sensing and control and interferometry; distributed and advanced spacecraft systems; cryogenic and thermal control systems; large precision structures for observatories; and the infrastructure essential to future space telescopes and observatories.

  2. Software Architecture: Managing Design for Achieving Warfighter Capability

    DTIC Science & Technology

    2007-04-30

    The Government’s requirements and specifications for a new weapon...at the Preliminary Design Review (PDR) is likely to have a much higher probability of meeting the warfighters’ need for capability. Test-case...inventories of test cases are developed from the user-defined scenarios so that there is at least one test case for every scenario. The test cases will

  3. Infrared Astrophysics in the SOFIA Era - An Overview

    NASA Astrophysics Data System (ADS)

    Yorke, Harold W.

    2018-06-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) provides the international astronomical community access to a broad range of instrumentation covering wavelengths from the near to the far infrared. The high spectral resolution of many of these instruments in several wavelength bands is unmatched by any existing or planned near-future facility. The far-infrared polarization capability of one of its instruments, HAWC+, is also unique. Moreover, SOFIA allows for additional instrument augmentations, as new state-of-the-art photometric, spectrometric, and polarimetric capabilities have been added and are being further improved. The fact that SOFIA provides ample mass, power, and computing capability as well as 4 K cooling eases the constraints on future instrument design, technical readiness, and the instrument build to an extent not possible for space-borne missions. We will review SOFIA's current and planned capabilities and highlight specific science areas in which the stratospheric observatory will be able to significantly advance Origins science topics.

  4. Large-Eddy Simulation: Current Capabilities, Recommended Practices, and Future Research

    NASA Technical Reports Server (NTRS)

    Georgiadis, Nicholas J.; Rizzetta, Donald P.; Fureby, Christer

    2009-01-01

    This paper presents the results of an activity by the Large Eddy Simulation (LES) Working Group of the AIAA Fluid Dynamics Technical Committee to (1) address the current capabilities of LES, (2) outline recommended practices and key considerations for using LES, and (3) identify future research needs to advance the capabilities and reliability of LES for the analysis of turbulent flows. To address the current capabilities and future needs, a survey of eleven questions was posed to LES Working Group members to assemble a broad range of perspectives on important topics related to LES. The responses to these survey questions are summarized with the intent not to be a comprehensive dictate on LES, but rather to convey the perspective of one group on some important issues. A list of recommended practices is also provided; it does not treat all aspects of LES, but provides guidance on some of the key areas that should be considered.

  5. Human-machine interface hardware: The next decade

    NASA Technical Reports Server (NTRS)

    Marcus, Elizabeth A.

    1991-01-01

    In order to understand where human-machine interface hardware is headed, it is important to understand where we are today, how we got there, and what our goals for the future are. As computers become faster and more capable and programs become more sophisticated, it becomes apparent that interface hardware is the key to an exciting future in computing. How can a user interact with and control a seemingly limitless array of parameters effectively? Today, the answer is most often a limitless array of controls. The link between these controls and human sensory-motor capabilities does not utilize existing human capabilities to their full extent. Interface hardware for teleoperation and virtual environments is now facing a crossroads in design. Therefore, we as developers need to explore how the combination of interface hardware, human capabilities, and user experience can be blended to achieve the best performance today and in the future.

  6. Will Robots Ever Replace Attendants? Exploring the Current Capabilities and Future Potential of Robots in Education and Rehabilitation.

    ERIC Educational Resources Information Center

    Lees, David; LePage, Pamela

    1994-01-01

    This article describes the current capabilities and future potential of robots designed as supplements or replacements for human assistants or as tools for education and rehabilitation of people with disabilities. Review of robots providing educational, vocational, or independent living assistance concludes that eventually effective, reliable…

  7. A pilot study of naturally occurring high-probability request sequences in hostage negotiations.

    PubMed

    Hughes, James

    2009-01-01

    In the current study, the audiotapes from three hostage-taking situations were analyzed. Hostage negotiator requests to the hostage taker were characterized as either high or low probability. The results suggested that hostage-taker compliance to a hostage negotiator's low-probability request was more likely when a series of complied-with high-probability requests preceded the low-probability request. However, two of the three hostage-taking situations ended violently; therefore, the implications of the high-probability request sequence for hostage-taking situations should be assessed in future research.

  8. A PILOT STUDY OF NATURALLY OCCURRING HIGH-PROBABILITY REQUEST SEQUENCES IN HOSTAGE NEGOTIATIONS

    PubMed Central

    Hughes, James

    2009-01-01

    In the current study, the audiotapes from three hostage-taking situations were analyzed. Hostage negotiator requests to the hostage taker were characterized as either high or low probability. The results suggested that hostage-taker compliance to a hostage negotiator's low-probability request was more likely when a series of complied-with high-probability requests preceded the low-probability request. However, two of the three hostage-taking situations ended violently; therefore, the implications of the high-probability request sequence for hostage-taking situations should be assessed in future research. PMID:19949541

  9. Key Future Engineering Capabilities for Human Capital Retention

    NASA Astrophysics Data System (ADS)

    Sivich, Lorrie

    Record retirements of Baby Boomer-generation engineers are predicted to result in significant losses of mission-critical knowledge in space, national security, and future scientific ventures vital to high-technology corporations. No comprehensive review or analysis of engineering capabilities has been performed to identify the threats, related to the specific loss of mission-critical knowledge, posed by the increasing retirement of tenured engineers. Archival data from a single diversified Fortune 500 aerospace manufacturing engineering company's engineering career database were analyzed to ascertain whether relationships linking future engineering capabilities, engineering disciplines, and years of engineering experience could be identified to define critical knowledge transfer models. Chi-square, logistic, and linear regression analyses were used to map patterns of discipline-specific, mission-critical knowledge using archival data on engineers' perceptions of engineering capabilities, key developmental experiences, and knowledge learned from their engineering careers. The results from the study were used to document key future engineering capabilities. The results were then used to develop a proposed human capital retention plan to address specific key knowledge gaps among younger engineers as veteran engineers retire. The potential for social change from this study involves informing leaders of aerospace engineering corporations on how to build better quality mentoring or succession plans to fill the void of lost knowledge from retiring engineers. This plan can secure mission-critical knowledge for younger engineers for current and future product development and increased global competitiveness in the technology market.

  10. Directed Design of Experiments (DOE) for Determining Probability of Detection (POD) Capability of NDE Systems (DOEPOD)

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2007-01-01

    This viewgraph presentation reviews some of the problems that designers of Non-Destructive Examination (NDE) systems encounter in determining the probability of detection. According to the author, "[the] NDE community should not blindly accept statistical results due to lack of knowledge." The presentation is an attempt to bridge the gap between practitioners of NDE and statisticians.

  11. Defence Technology Strategy for the Demands of the 21st Century

    DTIC Science & Technology

    2006-10-01

    understanding of human capability in the CBM role. Ownership of the intellectual property behind algorithms may be sovereign, but implementation will...synchronisation schemes; coding schemes; modulation techniques; access schemes; smart spectrum usage; low probability of intercept; implementation...Spectrum and bandwidth management: cross-layer technologies to

  12. Probabilistic assessment of precipitation-triggered landslides using historical records of landslide occurrence, Seattle, Washington

    USGS Publications Warehouse

    Coe, J.A.; Michael, J.A.; Crovelli, R.A.; Savage, W.Z.; Laprade, W.T.; Nashem, W.D.

    2004-01-01

    Ninety years of historical landslide records were used as input to the Poisson and binomial probability models. Results from these models show that, for precipitation-triggered landslides, approximately 9 percent of the area of Seattle has annual exceedance probabilities of 1 percent or greater. Application of the Poisson model for estimating the future occurrence of individual landslides results in a worst-case scenario map, with a maximum annual exceedance probability of 25 percent on a hillslope near Duwamish Head in West Seattle. Application of the binomial model for estimating the future occurrence of a year with one or more landslides results in a map with a maximum annual exceedance probability of 17 percent (also near Duwamish Head). Slope and geology both play a role in localizing the occurrence of landslides in Seattle. A positive correlation exists between slope and mean exceedance probability, with probability tending to increase as slope increases. Sixty-four percent of all historical landslide locations are within 150 m (500 ft, horizontal distance) of the Esperance Sand/Lawton Clay contact, but within this zone, no positive or negative correlation exists between exceedance probability and distance to the contact.
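
    Both models reduce to one-line estimates once a cell's historical record is tallied; the counts below are hypothetical, not Seattle's. The Poisson model converts an event rate into the chance of at least one landslide next year, while the binomial model uses the fraction of years that had any landslide.

    ```python
    import math

    # Hypothetical grid-cell tallies from a 90-year landslide catalogue.
    years = 90
    n_events = 12        # landslides recorded in this cell
    n_years_with = 10    # years with at least one landslide

    lam = n_events / years                 # Poisson rate (events/year)
    p_poisson = 1.0 - math.exp(-lam)       # P(>=1 event next year)
    p_binomial = n_years_with / years      # P(a year with >=1 event)

    print(f"Poisson annual exceedance:  {p_poisson:.3f}")
    print(f"Binomial annual exceedance: {p_binomial:.3f}")
    ```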

  13. Methods, apparatus and system for notification of predictable memory failure

    DOEpatents

    Cher, Chen-Yong; Andrade Costa, Carlos H.; Park, Yoonho; Rosenburg, Bryan S.; Ryu, Kyung D.

    2017-01-03

    A method for providing notification of a predictable memory failure includes the steps of: obtaining information regarding at least one condition associated with a memory; calculating a memory failure probability as a function of the obtained information; calculating a failure probability threshold; and generating a signal when the memory failure probability exceeds the failure probability threshold, the signal being indicative of a predicted future memory failure.
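
    The claimed steps map directly onto a small monitor loop; everything below (the condition-to-probability mapping, the threshold, the field names) is a hypothetical sketch, not the patented method.

    ```python
    import math

    def failure_probability(temp_c, corrected_errors, hours_on):
        """Hypothetical mapping from monitored conditions to failure risk."""
        z = (-10.0 + 0.05 * temp_c
             + 0.3 * math.log1p(corrected_errors)
             + 1e-4 * hours_on)
        return 1.0 / (1.0 + math.exp(-z))

    def check_memory(conditions, threshold=0.5):
        """Obtain conditions, compute the failure probability, compare it
        to a threshold, and emit a signal when the threshold is exceeded."""
        p = failure_probability(**conditions)
        if p > threshold:
            return ("PREDICTED_FAILURE", p)  # e.g., trigger checkpointing
        return ("OK", p)

    print(check_memory({"temp_c": 85, "corrected_errors": 5000,
                        "hours_on": 40000}))
    ```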

  14. Brain injury prediction: assessing the combined probability of concussion using linear and rotational head acceleration.

    PubMed

    Rowson, Steven; Duma, Stefan M

    2013-05-01

    Recent research has suggested possible long-term effects due to repetitive concussions, highlighting the importance of developing methods to accurately quantify concussion risk. This study introduces a new injury metric, the combined probability of concussion, which computes the overall risk of concussion based on the peak linear and rotational accelerations experienced by the head during impact. The combined probability of concussion is unique in that it determines the likelihood of sustaining a concussion for a given impact, regardless of whether the injury would be reported or not. The risk curve was derived from data collected from instrumented football players (63,011 impacts including 37 concussions), which were adjusted to account for the underreporting of concussion. The predictive capability of this new metric is compared to that of single biomechanical parameters. The capabilities of these parameters to accurately predict concussion incidence were evaluated using two separate datasets: the Head Impact Telemetry System (HITS) data and National Football League (NFL) data collected from impact reconstructions using dummies (58 impacts including 25 concussions). Receiver operating characteristic curves were generated, and all parameters were significantly better at predicting injury than random guessing. The combined probability of concussion had the greatest area under the curve for all datasets. In the HITS dataset, the combined probability of concussion and linear acceleration were significantly better predictors of concussion than rotational acceleration alone, but not different from each other. In the NFL dataset, there were no significant differences between parameters. The combined probability of concussion is a valuable method to assess concussion risk in a laboratory setting for evaluating product safety.
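
    The combined metric has a simple logistic form in the peak linear and rotational accelerations (with an interaction term); the coefficients below are placeholders of a plausible magnitude, not the published fit.

    ```python
    import math

    def combined_probability(lin_acc_g, rot_acc_rads2):
        """Logistic combination of peak linear (g) and rotational
        (rad/s^2) head acceleration; coefficients are illustrative."""
        b0, b1, b2, b3 = -10.2, 0.0433, 8.7e-4, -9.2e-7
        z = (b0 + b1 * lin_acc_g + b2 * rot_acc_rads2
             + b3 * lin_acc_g * rot_acc_rads2)
        return 1.0 / (1.0 + math.exp(-z))

    print(combined_probability(60, 2500))    # moderate impact -> low risk
    print(combined_probability(120, 7000))   # severe impact -> high risk
    ```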

  15. The Probabilities of Unique Events

    PubMed Central

    Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224

  16. Perceptions of Present and Future Capability among a Sample of Rural British Columbia Youth Perceptions

    ERIC Educational Resources Information Center

    Kapil, Meg E.; Shepard, Blythe C.

    2011-01-01

    A cross-sectional survey explored 96 rural adolescents' perceptions of their rural context and how their self-concept is related to perceptions of capability regarding hopes and fears for the future. The youth surveyed, from the Kootenay Boundary region of British Columbia, indicated ambivalence about staying in their communities after leaving…

  17. Handbook for Conducting Future Studies in Education.

    ERIC Educational Resources Information Center

    Phi Delta Kappa, Bloomington, IN.

    This handbook is designed to aid school administrators, policy-makers, and teachers in bringing a "futures orientation" to their schools. The first part of the book describes a "futuring process" developed as a tool for examining alternative future probabilities. It consists of a series of diverging and converging techniques that alternately…

  18. Risk Quantification of Systems Engineering Documents Improves Probability of DOD Project Success

    DTIC Science & Technology

    2009-09-01

    comprehensive risk model for DoD milestone review documentation as well as recommended changes to the Capability Maturity Model Integration (CMMI) Project...Milestone Documentation, Project Planning, Rational Frame, Political Frame, CMMI Project Planning Process Area, CMMI Risk Management Process Area...well as recommended changes to the Capability Maturity Model Integration (CMMI) Project Planning and Risk Management process areas. The intent is to

  19. A portable life support system for use in mines

    NASA Technical Reports Server (NTRS)

    Zeller, S. S.

    1972-01-01

    The portable life support system described in this paper represents a potential increase in the probability of survival for miners who are trapped underground by a fire or explosion. The habitability and life support capability of the prototype shelter have proved excellent. Development of survival chamber life support systems for wide use in coal mines is definitely within the capabilities of current technology.

  20. A blueprint for demonstrating quantum supremacy with superconducting qubits

    NASA Astrophysics Data System (ADS)

    Neill, C.; Roushan, P.; Kechedzhi, K.; Boixo, S.; Isakov, S. V.; Smelyanskiy, V.; Megrant, A.; Chiaro, B.; Dunsworth, A.; Arya, K.; Barends, R.; Burkett, B.; Chen, Y.; Chen, Z.; Fowler, A.; Foxen, B.; Giustina, M.; Graff, R.; Jeffrey, E.; Huang, T.; Kelly, J.; Klimov, P.; Lucero, E.; Mutus, J.; Neeley, M.; Quintana, C.; Sank, D.; Vainsencher, A.; Wenner, J.; White, T. C.; Neven, H.; Martinis, J. M.

    2018-04-01

    A key step toward demonstrating a quantum system that can address difficult problems in physics and chemistry will be performing a computation beyond the capabilities of any classical computer, thus achieving so-called quantum supremacy. In this study, we used nine superconducting qubits to demonstrate a promising path toward quantum supremacy. By individually tuning the qubit parameters, we were able to generate thousands of distinct Hamiltonian evolutions and probe the output probabilities. The measured probabilities obey a universal distribution, consistent with uniformly sampling the full Hilbert space. As the number of qubits increases, the system continues to explore the exponentially growing number of states. Extending these results to a system of 50 qubits has the potential to address scientific questions that are beyond the capabilities of any classical computer.
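
    The "universal distribution" referred to is the Porter-Thomas (exponential) law: for a state sampled uniformly from an N-dimensional Hilbert space, the rescaled output probabilities N*p are approximately Exp(1) distributed. A quick numerical check with a random state (a stand-in for the measured evolutions):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_qubits = 9
    N = 2 ** n_qubits

    # Random complex state as a proxy for a chaotic evolution's output.
    psi = rng.normal(size=N) + 1j * rng.normal(size=N)
    psi /= np.linalg.norm(psi)
    p = np.abs(psi) ** 2

    # Porter-Thomas: N*p ~ Exp(1), whose first two moments are 1 and 2.
    x = N * p
    print(x.mean(), (x ** 2).mean())
    ```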

  1. A tradeoff study to determine the optimum approach to a wash/rinse capability to support future space flight

    NASA Technical Reports Server (NTRS)

    Wilson, D. A.

    1976-01-01

    Specific requirements for a wash/rinse capability to support Spacelab biological experimentation were determined, and various concepts for achieving this capability were identified. This included the examination of current state-of-the-art and emerging technology designs that would meet the wash/rinse requirements. Once several concepts were identified, including disposable utensils, tools, and gloves or other possible alternatives, a tradeoff analysis involving system cost, weight, volume utilization, functional performance, maintainability, reliability, power utilization, safety, complexity, etc., was performed to determine an optimum approach for achieving a wash/rinse capability to support future space flights. Missions of varying crew sizes and durations were considered.

  2. The Simulation Heuristic.

    DTIC Science & Technology

    1981-05-15

    Crane. is capable of imagining unicorns -- and we expect he is -- why does he find it relatively difficult to imagine himself avoiding a 30 minute...probability that the plan will succeed and to evaluate the risk of various causes of failure. We have suggested that the construction of scenarios is...expect that events will unfold as planned. However, the cumulative probability of at least one fatal failure could be overwhelmingly high even when

  3. Incorporating population viability models into species status assessment and listing decisions under the U.S. Endangered Species Act

    USGS Publications Warehouse

    McGowan, Conor P.; Allan, Nathan; Servoss, Jeff; Hedwall, Shaula J.; Wooldridge, Brian

    2017-01-01

    Assessment of a species' status is a key part of management decision making for endangered and threatened species under the U.S. Endangered Species Act. Predicting the future state of the species is an essential part of species status assessment, and projection models can play an important role in developing predictions. We built a stochastic simulation model that incorporated parametric and environmental uncertainty to predict the probable future status of the Sonoran desert tortoise in the southwestern United States and north-central Mexico. The Sonoran desert tortoise was a Candidate species for listing under the Endangered Species Act, and decision makers wanted to use model predictions in their decision-making process. The model accounted for future habitat loss and possible effects of climate change induced droughts to predict future population growth rates, abundances, and quasi-extinction probabilities. Our model predicts that the population will likely decline over the next few decades, but there is a very low probability of quasi-extinction less than 75 years into the future. Increases in drought frequency and intensity may increase extinction risk for the species. Our model helped decision makers predict and characterize uncertainty about the future status of the species in their listing decision. We incorporated complex ecological processes (e.g., climate change effects on tortoises) in transparent and explicit ways tailored to support decision making processes related to endangered species.
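
    A stripped-down version of such a projection model, with all rates and thresholds invented for illustration: parametric uncertainty enters as a growth rate drawn once per replicate, environmental uncertainty as yearly noise plus random drought shocks, and the quasi-extinction probability is the fraction of replicates that fall below a threshold.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_runs, horizon = 5000, 75
    n0, quasi_ext = 2000, 200        # hypothetical abundance and threshold

    ext = 0
    for _ in range(n_runs):
        r_mean = rng.normal(-0.005, 0.01)    # uncertain long-run trend
        n = float(n0)
        for _ in range(horizon):
            drought = rng.random() < 0.05    # assumed drought frequency
            r = rng.normal(r_mean - (0.10 if drought else 0.0), 0.05)
            n *= np.exp(r)
            if n < quasi_ext:
                ext += 1
                break
    print(f"P(quasi-extinction within {horizon} y) = {ext / n_runs:.3f}")
    ```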

  4. Earthquake outlook for the San Francisco Bay region 2014–2043

    USGS Publications Warehouse

    Aagaard, Brad T.; Blair, James Luke; Boatwright, John; Garcia, Susan H.; Harris, Ruth A.; Michael, Andrew J.; Schwartz, David P.; DiLeo, Jeanne S.; Jacques, Kate; Donlin, Carolyn

    2016-06-13

    Using information from recent earthquakes, improved mapping of active faults, and a new model for estimating earthquake probabilities, the 2014 Working Group on California Earthquake Probabilities updated the 30-year earthquake forecast for California. They concluded that there is a 72 percent probability (or likelihood) of at least one earthquake of magnitude 6.7 or greater striking somewhere in the San Francisco Bay region before 2043. Earthquakes this large are capable of causing widespread damage; therefore, communities in the region should take simple steps to help reduce injuries, damage, and disruption, as well as accelerate recovery from these earthquakes.

  5. Binomial Test Method for Determining Probability of Detection Capability for Fracture Critical Applications

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2011-01-01

    The capability of an inspection system is established by application of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that, for a minimum flaw size and all greater flaw sizes, there is 0.90 probability of detection with 95% confidence (90/95 POD). Directed design of experiments for probability of detection (DOEPOD) has been developed to provide an efficient and accurate methodology that yields estimates of POD and confidence bounds for both Hit-Miss and signal-amplitude testing, where signal amplitudes are reduced to Hit-Miss by using a signal threshold. Directed DOEPOD uses a nonparametric approach for the analysis of inspection data that does not require any assumptions about the particular functional form of a POD function. The DOEPOD procedure identifies, for a given sample set, whether or not the minimum requirement of 0.90 probability of detection with 95% confidence is demonstrated for a minimum flaw size and for all greater flaw sizes (90/95 POD). The DOEPOD procedures are executed sequentially in order to minimize the number of samples needed to demonstrate that there is a 90/95 POD lower confidence bound at a given flaw size and that the POD is monotonic for flaw sizes exceeding that 90/95 POD flaw size. The conservativeness of the DOEPOD methodology results is discussed. Validated guidelines for binomial estimation of POD for fracture-critical inspection are established.
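
    The binomial arithmetic behind the familiar "29/29" demonstration is short enough to verify directly: if the true POD were only 0.90, the chance of a miss-free run of n inspections is 0.9**n, and it first drops below 5% at n = 29, which is what establishes 90/95 POD.

    ```python
    # Probability of n hits in n trials if the true POD were 0.90.
    for n in (28, 29, 30):
        print(n, round(0.90 ** n, 4))   # crosses below 0.05 at n = 29
    ```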

  6. Microgrid Optimal Scheduling With Chance-Constrained Islanding Capability

    DOE PAGES

    Liu, Guodong; Starke, Michael R.; Xiao, B.; ...

    2017-01-13

    To facilitate the integration of variable renewable generation and improve the resilience of electricity supply in a microgrid, this paper proposes an optimal scheduling strategy for microgrid operation considering constraints of islanding capability. A new concept, the probability of successful islanding (PSI), is developed, indicating the probability that a microgrid maintains enough spinning reserve (both up and down) to meet local demand and accommodate local renewable generation after instantaneously islanding from the main grid. The PSI is formulated as a mixed-integer linear program using a multi-interval approximation, taking into account the probability distributions of forecast errors of wind, PV, and load. With the goal of minimizing the total operating cost while preserving a user-specified PSI, a chance-constrained optimization problem is formulated for the optimal scheduling of microgrids and solved by mixed-integer linear programming (MILP). Numerical simulations on a microgrid consisting of a wind turbine, a PV panel, a fuel cell, a micro-turbine, a diesel generator, and a battery demonstrate the effectiveness of the proposed scheduling strategy. Lastly, we verify the relationship between PSI and various factors.
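
    A minimal sketch of the PSI idea only (the paper's multi-interval MILP formulation is not reproduced): under a Gaussian net-load forecast error, PSI is the probability that the error at the islanding instant falls within the available up and down spinning reserve.

    ```python
    from math import erf, sqrt

    def psi(up_reserve_kw, down_reserve_kw, sigma_kw):
        """P(successful islanding) for a Gaussian net-load forecast error."""
        Phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
        return Phi(up_reserve_kw / sigma_kw) - Phi(-down_reserve_kw / sigma_kw)

    # 800 kW up, 500 kW down reserve, 300 kW error std (hypothetical):
    print(f"PSI = {psi(800, 500, 300):.3f}")
    ```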

  7. Modeling Aircraft Position and Conservatively Calculating Airspace Violations for an Autonomous Collision Awareness System for Unmanned Aerial Systems

    NASA Astrophysics Data System (ADS)

    Ueunten, Kevin K.

    With the scheduled 30 September 2015 integration of Unmanned Aerial Systems (UAS) into the national airspace, the Federal Aviation Administration (FAA) is concerned with UAS capabilities to sense and avoid conflicts. Since the operator is outside the cockpit, the proposed collision awareness plugin (CAPlugin), based on probability and error propagation, conservatively predicts potential conflicts with other aircraft and airspaces, thus increasing the operator's situational awareness. The conflict predictions are calculated using a forward state estimator (FSE) and a conflict calculator. Predicting an aircraft's position, modeled as a mixed Gaussian distribution, is the FSE's responsibility. The FSE supports aircraft engaged in three flight modes: free flight, flight-path following, and orbits. The conflict calculator uses the FSE result to calculate the conflict probability between an aircraft and an airspace or another aircraft. Finally, the CAPlugin determines the highest conflict probability and warns the operator. In addition to discussing FSE free flight, FSE orbits, and the airspace conflict calculator, this thesis describes how each algorithm is implemented and tested. Lastly, two simulations demonstrate the CAPlugin's capabilities.
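
    A simple stand-in for the FSE/conflict-calculator pair, assuming Gaussian predicted positions (the thesis models mixtures of Gaussians; one component each suffices to show the mechanics): propagate both aircraft distributions and estimate the probability that separation drops below a minimum.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def conflict_probability(mu_a, cov_a, mu_b, cov_b,
                             sep_m=500.0, n=100_000):
        """Monte Carlo estimate of P(separation < sep_m) for two
        aircraft with Gaussian predicted 2-D positions."""
        a = rng.multivariate_normal(mu_a, cov_a, size=n)
        b = rng.multivariate_normal(mu_b, cov_b, size=n)
        d = np.linalg.norm(a - b, axis=1)
        return float(np.mean(d < sep_m))

    # Predicted 600 m apart, 200 m position std each (hypothetical):
    cov = np.diag([200.0 ** 2, 200.0 ** 2])
    print(conflict_probability([0, 0], cov, [600, 0], cov))
    ```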

  8. Probabilistic structural analysis methods for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project, developed at the Southwest Research Institute, integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced, efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures, including an expert system, a probabilistic finite element code, a probabilistic boundary element code, and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive, menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library are present.

  9. A computer program for uncertainty analysis integrating regression and Bayesian methods

    USGS Publications Warehouse

    Lu, Dan; Ye, Ming; Hill, Mary C.; Poeter, Eileen P.; Curtis, Gary

    2014-01-01

    This work develops a new functionality in UCODE_2014 to evaluate Bayesian credible intervals using the Markov Chain Monte Carlo (MCMC) method. The MCMC capability in UCODE_2014 is based on the FORTRAN version of the differential evolution adaptive Metropolis (DREAM) algorithm of Vrugt et al. (2009), which estimates the posterior probability density function of model parameters in high-dimensional and multimodal sampling problems. The UCODE MCMC capability provides eleven prior probability distributions and three ways to initialize the sampling process. It evaluates parametric and predictive uncertainties and it has parallel computing capability based on multiple chains to accelerate the sampling process. This paper tests and demonstrates the MCMC capability using a 10-dimensional multimodal mathematical function, a 100-dimensional Gaussian function, and a groundwater reactive transport model. The use of the MCMC capability is made straightforward and flexible by adopting the JUPITER API protocol. With the new MCMC capability, UCODE_2014 can be used to calculate three types of uncertainty intervals, which all can account for prior information: (1) linear confidence intervals which require linearity and Gaussian error assumptions and typically 10s–100s of highly parallelizable model runs after optimization, (2) nonlinear confidence intervals which require a smooth objective function surface and Gaussian observation error assumptions and typically 100s–1,000s of partially parallelizable model runs after optimization, and (3) MCMC Bayesian credible intervals which require few assumptions and commonly 10,000s–100,000s or more partially parallelizable model runs. Ready access allows users to select methods best suited to their work, and to compare methods in many circumstances.
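
    For orientation, here is what an MCMC credible interval looks like on a toy inverse problem; a plain random-walk Metropolis chain stands in for DREAM (which runs multiple adaptive chains), and the model and noise level are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Toy problem: infer a decay rate k from noisy observations.
    t = np.linspace(0, 10, 20)
    y_obs = np.exp(-0.3 * t) + rng.normal(0, 0.05, t.size)

    def log_post(k):
        if k <= 0:
            return -np.inf                    # flat prior on k > 0
        resid = y_obs - np.exp(-k * t)
        return -0.5 * np.sum((resid / 0.05) ** 2)

    k, lp, samples = 0.5, log_post(0.5), []
    for _ in range(20_000):
        k_new = k + rng.normal(0, 0.02)       # random-walk proposal
        lp_new = log_post(k_new)
        if np.log(rng.random()) < lp_new - lp:
            k, lp = k_new, lp_new
        samples.append(k)

    # 95% Bayesian credible interval after burn-in:
    print(np.percentile(samples[5000:], [2.5, 97.5]))
    ```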

  10. Convergence of Transition Probability Matrix in CLV-Markov Models

    NASA Astrophysics Data System (ADS)

    Permana, D.; Pasaribu, U. S.; Indratno, S. W.; Suprayogi, S.

    2018-04-01

    A transition probability matrix is an arrangement of the transition probabilities from one state to another in a Markov chain model (MCM). One interesting aspect of an MCM is its behavior far into the future, which is derived from a property of the n-step transition probability matrix: its convergence as n tends to infinity. Mathematically, finding the convergence of the transition probability matrix means finding the limit of the matrix raised to the power n as n tends to infinity. The convergent form of the transition probability matrix is of particular interest because it brings the matrix to its stationary form, which is useful for predicting the probabilities of transitions between states in the future. The method usually used to find this convergence is the limiting-distribution process. In this paper, the convergence of the transition probability matrix is instead found using a simple concept of linear algebra: diagonalizing the matrix. This method has a higher level of complexity because it must perform the diagonalization of the matrix, but it has the advantage of yielding a general form for the nth power of the transition probability matrix, which is useful for inspecting the matrix before it becomes stationary. Example cases are taken from a customer lifetime value (CLV) model built on an MCM, called the CLV-Markov model. Several such models are examined through their transition probability matrices to find their convergent forms. The result is that the convergence of the transition probability matrix obtained through diagonalization agrees with the convergence obtained by the commonly used limiting-distribution method.
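
    The diagonalization route is a few lines of linear algebra: write P = V diag(w) V^-1, so P^n = V diag(w^n) V^-1; for a regular chain every eigenvalue except 1 has modulus below 1, so P^n converges to a matrix whose identical rows are the stationary distribution. The 3-state matrix below is hypothetical.

    ```python
    import numpy as np

    # Hypothetical transition probability matrix (e.g., CLV states).
    P = np.array([[0.7, 0.2, 0.1],
                  [0.3, 0.5, 0.2],
                  [0.2, 0.3, 0.5]])

    w, V = np.linalg.eig(P)          # P = V diag(w) V^-1
    Vinv = np.linalg.inv(V)

    def P_power(n):
        """General form of P**n from the eigendecomposition."""
        return (V @ np.diag(w ** n) @ Vinv).real

    print(np.round(P_power(50), 6))
    print(np.round(P_power(100), 6))  # unchanged: stationary form reached
    ```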

  11. Identifying management competencies for health care executives: review of a series of Delphi studies.

    PubMed

    Hudak, R P; Brooke, P P; Finstuen, K

    2000-01-01

    This analysis reviews a selected body of research that identifies the essential areas of management expertise required of future health care executives. To ensure consistency, six studies are analyzed, utilizing the Delphi technique, to query a broad spectrum of experts in different fields and sites of health care management. The analysis identifies a number of management competencies, i.e., managerial capabilities, which current and aspiring health care executives, in various settings and with differing educational backgrounds, should possess to enhance the probability of their success in current and future positions of responsibility. In addition, this review identifies the skills (technical expertise), knowledge (facts and principles) and abilities (physical, mental or legal power) required to support achievement of these competencies. Leadership and resource management, including cost and finance dimensions, are the highest-rated requisite management competencies. The dominant skills, knowledge and abilities (SKAs) are related to interpersonal skills. The lowest-rated SKAs are related to job-specific, technical skills. Recommendations include the review of this research by formal and continuing education programs to determine the content of their courses and areas for future research. Similarly, current health care executives should assess this research to assist in identifying competency gaps. Lastly, this analysis recommends that the Delphi technique, as a valid and replicable methodology, be applied toward the study of non-executive health care managers, e.g., students, clinicians, mid-level managers and integrated systems administrators, to determine their requisite management competencies and SKAs.

  12. 10 CFR 626.7 - Royalty transfer and exchange.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) of this section, DOE determines there is a high probability that the cost to the Government can be... supply or refining capability, logistical problems for moving petroleum products, macroeconomic factors...

  13. A 30-year history of earthquake crisis communication in California and lessons for the future

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2015-12-01

    The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985, about the probability (approximately 5%) that a M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the 30 years since, publication of aftershock advisories has become routine, and formal statements about the probability of a larger event have been developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes. All of the previous formal advisories were written during the earthquake crisis. The time needed to create and release a statement became shorter with experience after the first public advisory (for the 1988 Lake Elsman earthquake), which was released 18 hours after the triggering event, but the process was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (the National Earthquake Prediction Evaluation Council) so that statements can be sent to the public automatically. This talk reviews the advisories, the variations in wording, and the public response, and compares them with social science research about successful crisis communication, to create recommendations for future advisories.

  14. Orbiter Autoland reliability analysis

    NASA Technical Reports Server (NTRS)

    Welch, D. Phillip

    1993-01-01

    The Space Shuttle Orbiter is the only space reentry vehicle in which the crew is seated upright. This position presents some physiological effects requiring countermeasures to prevent a crewmember from becoming incapacitated. This also introduces a potential need for automated vehicle landing capability. Autoland is a primary procedure that was identified as a requirement for landing following an extended duration orbiter mission. This report documents the results of the reliability analysis performed on the hardware required for an automated landing. A reliability block diagram was used to evaluate system reliability. The analysis considers the manual and automated landing modes currently available on the Orbiter. (Autoland is presently a backup system only.) Results of this study indicate a +/- 36 percent probability of successfully extending a nominal mission to 30 days. Enough variations were evaluated to verify that the reliability could be altered with mission planning and procedures. If the crew is modeled as being fully capable after 30 days, the probability of a successful manual landing is comparable to that of Autoland, because much of the hardware is used for both manual and automated landing modes. The analysis indicates that the reliability for the manual mode is limited by the hardware and depends greatly on crew capability. Crew capability for a successful landing after 30 days has not yet been determined.

  15. Use of the gamma distribution to represent monthly rainfall in Africa for drought monitoring applications

    USGS Publications Warehouse

    Husak, Gregory J.; Michaelsen, Joel C.; Funk, Christopher C.

    2007-01-01

    Evaluating a range of scenarios that accurately reflect precipitation variability is critical for water resource applications. Inputs to these applications can be provided using location- and interval-specific probability distributions. These distributions make it possible to estimate the likelihood of rainfall being within a specified range. In this paper, we demonstrate the feasibility of fitting cell-by-cell probability distributions to grids of monthly interpolated, continent-wide data. Future work will then detail applications of these grids to improved satellite remote sensing of drought and interpretations of probabilistic climate outlook forum forecasts. The gamma distribution is well suited to these applications because it is fairly familiar to African scientists and capable of representing a variety of distribution shapes. This study tests the goodness-of-fit using the Kolmogorov-Smirnov (KS) test and compares the results against another distribution commonly used to model rainfall, the Weibull. The gamma distribution is suitable for roughly 98% of the locations over all months. The techniques and results presented in this study provide a foundation for use of the gamma distribution to generate drivers for various rain-related models. These models are used as decision support tools for the management of water and agricultural resources as well as food reserves, providing decision makers with ways to evaluate the likelihood of various rainfall accumulations and assess different scenarios in Africa.
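
    A minimal sketch of the cell-level fitting step, with synthetic rainfall standing in for one grid cell of the interpolated data and SciPy's stock gamma-fit and KS-test routines:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Stand-in for one cell's monthly rainfall record (mm); the real inputs
        # are the interpolated, continent-wide grids described above.
        rain = rng.gamma(shape=2.0, scale=45.0, size=40)

        # Maximum-likelihood gamma fit with the location pinned at zero.
        shape, loc, scale = stats.gamma.fit(rain, floc=0)

        # Kolmogorov-Smirnov goodness-of-fit against the fitted distribution.
        ks_stat, p_value = stats.kstest(rain, 'gamma', args=(shape, loc, scale))

        # Likelihood of rainfall below an (illustrative) drought threshold.
        p_below_50mm = stats.gamma.cdf(50.0, shape, loc=loc, scale=scale)
        print(shape, scale, ks_stat, p_value, p_below_50mm)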

  16. What is a missing link among wireless persistent surveillance?

    NASA Astrophysics Data System (ADS)

    Hsu, Charles; Szu, Harold

    2011-06-01

    The next generation of surveillance systems will be equipped with versatile sensor devices and information fusion capabilities, able to conduct regular and irregular surveillance in security environments worldwide. The persistent surveillance community must invest its limited energy and money effectively in enabling technologies such as nanotechnology, wireless networks, and micro-electromechanical systems (MEMS) to develop persistent surveillance applications for the future. Wireless sensor networks can be used by the military for a number of purposes, such as monitoring militant activity in remote areas and force protection. Equipped with appropriate sensors, these networks can enable detection of enemy movement, identification of enemy forces, and analysis of their movement and progress. Among these sensor network technologies, covert communication is one of the most challenging tasks in persistent surveillance, because secured sensor nodes and links are in high demand to guard against deliberate sabotage. Owing to mature VLSI/DSP technologies, affordable COTS ultra-wideband (UWB) technology using noise-like direct-sequence (DS) time-domain pulses is a potential solution for low-probability-of-intercept and low-probability-of-detection (LPI/LPD) data communication and transmission. This paper describes a number of technical challenges in wireless persistent surveillance development, including covert communication, network control and routing, and collaborative signal and information processing. The paper concludes by presenting Hermitian Wavelets to enhance SNR in support of secured communication.

  17. Performance analysis of the word synchronization properties of the outer code in a TDRSS decoder

    NASA Technical Reports Server (NTRS)

    Costello, D. J., Jr.; Lin, S.

    1984-01-01

    A self-synchronizing coding scheme for NASA's TDRSS satellite system is a concatenation of a (2,1,7) inner convolutional code with a (255,223) Reed-Solomon outer code. Both symbol and word synchronization are achieved without requiring that any additional symbols be transmitted. An important parameter that determines the performance of the word sync procedure is the ratio of the decoding failure probability to the undetected error probability. Ideally, the former should be as small as possible compared to the latter when the error correcting capability of the code is exceeded. A computer simulation of a (255,223) Reed-Solomon code was carried out. Results for decoding failure probability and for undetected error probability are tabulated and compared.
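
    The order of magnitude of the miscorrection side of that ratio can be sketched with a standard sphere-packing approximation (in the spirit of the McEliece-Swanson bound), not the paper's simulation: a received word lying far beyond the error-correcting capability is miscorrected (an undetected error) roughly when it falls inside some decoding sphere, and otherwise produces a detectable decoding failure.

        from math import comb

        # (n, k) Reed-Solomon code over GF(q); t correctable symbol errors.
        n, k, q = 255, 223, 256
        t = (n - k) // 2

        # Volume of a Hamming sphere of radius t in GF(q)^n.
        V_t = sum(comb(n, i) * (q - 1) ** i for i in range(t + 1))

        # Fraction of the space covered by decoding spheres ~ q^k * V_t / q^n:
        # the approximate probability that a random word decodes to *some*
        # codeword (undetected error) instead of causing a decoding failure.
        p_undetected = V_t / q ** (n - k)
        print(p_undetected, (1 - p_undetected) / p_undetected)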

  18. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Technical Reports Server (NTRS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability provides a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. The code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and on a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and illustrates the stepwise process the interface uses with an example.

  19. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Astrophysics Data System (ADS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability provides a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. The code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and on a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and illustrates the stepwise process the interface uses with an example.

  20. Potential for Zika Virus to Establish a Sylvatic Transmission Cycle in the Americas

    PubMed Central

    Althouse, Benjamin M.; Vasilakis, Nikos; Sall, Amadou A.; Diallo, Mawlouth; Weaver, Scott C.; Hanley, Kathryn A.

    2016-01-01

    Zika virus (ZIKV) originated and continues to circulate in a sylvatic transmission cycle between non-human primate hosts and arboreal mosquitoes in tropical Africa. Recently ZIKV invaded the Americas, where it poses a threat to human health, especially to pregnant women and their infants. Here we examine the risk that ZIKV will establish a sylvatic cycle in the Americas, focusing on Brazil. We review the natural history of sylvatic ZIKV and present a mathematical dynamic transmission model to assess the probability of establishment of a sylvatic ZIKV transmission cycle in non-human primates and/or other mammals and arboreal mosquito vectors in Brazil. Brazil is home to multiple species of primates and mosquitoes potentially capable of ZIKV transmission, though direct assessment of host competence (ability to mount viremia sufficient to infect a feeding mosquito) and vector competence (ability to become infected with ZIKV and disseminate and transmit upon subsequent feedings) of New World species is lacking. Modeling reveals a high probability of establishment of sylvatic ZIKV across a large range of biologically plausible parameters. Probability of establishment is dependent on host and vector population sizes, host birthrates, and ZIKV force of infection. Research on the host competence of New World monkeys or other small mammals to ZIKV, on vector competence of New World Aedes, Sabethes, and Haemagogus mosquitoes for ZIKV, and on the geographic range of potential New World hosts and vectors is urgently needed. A sylvatic cycle of ZIKV would make future elimination efforts in the Americas practically impossible, and paints a dire picture for the epidemiology of ZIKV and our ability to end the ongoing outbreak of congenital Zika syndrome. PMID:27977671
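
    A toy branching-process sketch, not the paper's transmission model (R0 values, introduction sizes, and cutoffs are illustrative only), shows the qualitative result: establishment probability rises steeply once each infection produces more than one secondary infection on average.

        import numpy as np

        rng = np.random.default_rng(1)

        def establishment_probability(R0, n_intro, trials=20_000):
            """Each infection spawns Poisson(R0) secondary infections;
            'establishment' means the chain escapes early extinction."""
            established = 0
            for _ in range(trials):
                infected = n_intro
                for _ in range(60):                  # generations simulated
                    if infected == 0 or infected > 1_000:
                        break
                    # Sum of k iid Poisson(R0) draws is Poisson(k * R0).
                    infected = rng.poisson(R0 * infected)
                established += infected > 0
            return established / trials

        for R0 in (0.8, 1.2, 2.0):
            print(R0, establishment_probability(R0, n_intro=5))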

  1. A Bayesian network to predict coastal vulnerability to sea level rise

    USGS Publications Warehouse

    Gutierrez, B.T.; Plant, N.G.; Thieler, E.R.

    2011-01-01

    Sea level rise during the 21st century will have a wide range of effects on coastal environments, human development, and infrastructure in coastal areas. The broad range of complex factors influencing coastal systems contributes to large uncertainties in predicting long-term sea level rise impacts. Here we explore and demonstrate the capabilities of a Bayesian network (BN) to predict long-term shoreline change associated with sea level rise and make quantitative assessments of prediction uncertainty. A BN is used to define relationships between driving forces, geologic constraints, and coastal response for the U.S. Atlantic coast that include observations of local rates of relative sea level rise, wave height, tide range, geomorphic classification, coastal slope, and shoreline change rate. The BN is used to make probabilistic predictions of shoreline retreat in response to different future sea level rise rates. Results demonstrate that the probability of shoreline retreat increases with higher rates of sea level rise. Where more specific information is included, the probability of shoreline change increases in a number of cases, indicating more confident predictions. A hindcast evaluation of the BN indicates that the network correctly predicts 71% of the cases. Evaluation of the results using Brier skill and log likelihood ratio scores indicates that the network provides shoreline change predictions that are better than the prior probability. Shoreline change outcomes indicating stability (rates between -1 and 1 m/yr) were not well predicted. We find that BNs can assimilate important factors contributing to coastal change in response to sea level rise and can make quantitative, probabilistic predictions that can be applied to coastal management decisions. Copyright © 2011 by the American Geophysical Union.
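
    A toy two-parent network with invented conditional probability tables illustrates the style of probabilistic prediction described, including how conditioning on more specific information sharpens the retreat probability:

        import numpy as np

        # Binary toy variables: S = SLR rate, C = coastal slope, R = retreat.
        P_S = np.array([0.7, 0.3])        # P(S): [low, high] SLR rate
        P_C = np.array([0.5, 0.5])        # P(C): [gentle, steep] slope
        # P(R=1 | S, C): retreat most likely with high SLR and gentle slope.
        P_R1 = np.array([[0.20, 0.05],    # S = low:  [gentle, steep]
                         [0.80, 0.45]])   # S = high: [gentle, steep]

        # Marginal retreat probability by enumerating the joint distribution.
        p_retreat = sum(P_S[s] * P_C[c] * P_R1[s, c]
                        for s in (0, 1) for c in (0, 1))

        # Conditioning on more specific information (observed high SLR rate)
        # sharpens the prediction, as in the hindcast discussion above.
        p_retreat_high_slr = sum(P_C[c] * P_R1[1, c] for c in (0, 1))
        print(p_retreat, p_retreat_high_slr)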

  2. Potential for Zika Virus to Establish a Sylvatic Transmission Cycle in the Americas.

    PubMed

    Althouse, Benjamin M; Vasilakis, Nikos; Sall, Amadou A; Diallo, Mawlouth; Weaver, Scott C; Hanley, Kathryn A

    2016-12-01

    Zika virus (ZIKV) originated and continues to circulate in a sylvatic transmission cycle between non-human primate hosts and arboreal mosquitoes in tropical Africa. Recently ZIKV invaded the Americas, where it poses a threat to human health, especially to pregnant women and their infants. Here we examine the risk that ZIKV will establish a sylvatic cycle in the Americas, focusing on Brazil. We review the natural history of sylvatic ZIKV and present a mathematical dynamic transmission model to assess the probability of establishment of a sylvatic ZIKV transmission cycle in non-human primates and/or other mammals and arboreal mosquito vectors in Brazil. Brazil is home to multiple species of primates and mosquitoes potentially capable of ZIKV transmission, though direct assessment of host competence (ability to mount viremia sufficient to infect a feeding mosquito) and vector competence (ability to become infected with ZIKV and disseminate and transmit upon subsequent feedings) of New World species is lacking. Modeling reveals a high probability of establishment of sylvatic ZIKV across a large range of biologically plausible parameters. Probability of establishment is dependent on host and vector population sizes, host birthrates, and ZIKV force of infection. Research on the host competence of New World monkeys or other small mammals to ZIKV, on vector competence of New World Aedes, Sabethes, and Haemagogus mosquitoes for ZIKV, and on the geographic range of potential New World hosts and vectors is urgently needed. A sylvatic cycle of ZIKV would make future elimination efforts in the Americas practically impossible, and paints a dire picture for the epidemiology of ZIKV and our ability to end the ongoing outbreak of congenital Zika syndrome.

  3. Bioterrorism: An Assessment of Medical Response Capabilities at Ben Taub General Hospital

    DTIC Science & Technology

    2002-08-01

    themselves, probably in order to prevent contracting the disease. The substance was tested and determined not to be anthrax but a hoax (Hendricks, 1999)...are prepared for a biological incident. As pointed out in the incident noted by Hendricks, decontaminating people is the proper procedure to follow...was controlled at the site. Hendricks pointed out the risk in this incident would come from inhaling the organism, probably not from skin

  4. Aerosol Polarimetry Sensor (APS): Design Summary, Performance and Potential Modifications

    NASA Technical Reports Server (NTRS)

    Cairns, Brian

    2014-01-01

    APS is a mature design that has already been built and has a TRL of 7. Algorithmic and retrieval capabilities continue to improve and make better and more sophisticated use of the data. Adjoint solutions, in both one and three dimensions, are computationally efficient and should be the preferred implementation for the calculation of Jacobians (one-dimensional) or cost-function gradients (three-dimensional). Adjoint solutions necessarily provide resolution of internal fields and simplify the incorporation of active measurements in retrievals, which will be necessary for a future ACE mission. It is best to test these capabilities when the answer is known: OSSEs that are well constrained observationally provide the best place to test future multi-instrument platform capabilities and ensure those capabilities will meet scientific needs.

  5. Fallacies Leading to the Marginalization of Future CBRN Capabilities

    DTIC Science & Technology

    2013-05-23

    Leading to the Marginalization of Future CBRN Capabilities...Tammy R...word monograph. At the time, this hurdle appeared daunting for this fearful soul. Through the support, guidance and faith of certain individuals the...Sampling and Identification of Biological, Chemical, and Radiological Agents; SSE Sensitive Site Exploitation; STANAG Standardization Agreement; SUPCOM

  6. Proceedings of the Air Transportation Management Workshop

    NASA Technical Reports Server (NTRS)

    Tobias, Leonard (Editor); Tashker, Michael G. (Editor); Boyle, Angela M. (Editor)

    1995-01-01

    The Air Transportation Management (ATM) Workshop was held 31 Jan. - 1 Feb. 1995 at NASA Ames Research Center. The purpose of the workshop was to develop an initial understanding of user concerns and requirements for future ATM capabilities and to initiate discussions of alternative means and technologies for achieving more effective ATM capabilities. The topics for the sessions were as follows: viewpoints of future ATM capabilities, user requirements, lessons learned, and technologies for ATM. In addition, two panel sessions discussed priorities for ATM, and potential contributions of NASA to ATM. The proceedings contain transcriptions of all sessions.

  7. The living Drake equation of the Tau Zero Foundation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2011-03-01

    The living Drake equation is our statistical generalization of the Drake equation such that it can take into account any number of factors. This new result opens up the possibility to enrich the equation by inserting more new factors as scientific knowledge increases. The adjective "Living" refers to this continuous enrichment of the Drake equation, which is the goal of a new research project that the Tau Zero Foundation has entrusted to this author as the discoverer of the statistical Drake equation described hereafter. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be arbitrarily distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov form of the CLT, or the Lindeberg form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that the new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the lognormal distribution. The mean value, standard deviation, mode, median, and all the moments of this lognormal N can then be derived from the means and standard deviations of the seven input random variables. In fact, the seven factors in the ordinary Drake equation now become seven independent positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (neither of which assumes the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful. An application of our statistical Drake equation then follows. The (average) distance between any two neighbouring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. This distance now becomes a new random variable. We derive the relevant probability density function, apparently previously unknown (dubbed the "Maccone distribution" by Paul Davies). Data Enrichment Principle: it should be noticed that any positive number of random variables in the statistical Drake equation is compatible with the CLT, so our generalization allows for many more factors to be added in the future as more refined scientific knowledge about each factor becomes known. This capability to make room for more future factors in the statistical Drake equation we call the "Data Enrichment Principle", and regard it as the key to more profound future results in Astrobiology and SETI.
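
    A short Monte Carlo sketch with arbitrary placeholder distributions for the seven factors illustrates the central claim: because log N is a sum of independent terms, the CLT pushes N toward a lognormal distribution.

        import numpy as np

        rng = np.random.default_rng(42)
        samples = 200_000

        # Seven independent positive random variables standing in for the
        # Drake factors; ranges and shapes here are illustrative only.
        bounds = [(1, 10), (0.2, 1), (0.5, 5), (1e-3, 1), (1e-3, 1),
                  (1e-2, 1), (1e3, 1e5)]
        factors = [rng.uniform(lo, hi, samples) for lo, hi in bounds]

        N = np.prod(factors, axis=0)   # number of communicating civilizations

        # log N is a sum of seven independent terms, so it should be nearly
        # Gaussian, i.e. N nearly lognormal; near-zero skewness supports this.
        log_N = np.log(N)
        skew = ((log_N - log_N.mean()) ** 3).mean() / log_N.std() ** 3
        print(log_N.mean(), log_N.std(), skew)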

  8. Eco-hydrological Modeling in the Framework of Climate Change

    NASA Astrophysics Data System (ADS)

    Fatichi, Simone; Ivanov, Valeriy Y.; Caporali, Enrica

    2010-05-01

    A blueprint methodology for studying climate change impacts, as inferred from climate models, on eco-hydrological dynamics at the plot and small catchment scale is presented. Input hydro-meteorological variables for hydrological and eco-hydrological models under present and future climates are reproduced using a stochastic downscaling technique and a weather generator, "AWE-GEN". The generated time series of meteorological variables for the present climate and an ensemble of possible future climates serve as input to a newly developed physically-based eco-hydrological model, "Tethys-Chloris". An application of the proposed methodology is realized by reproducing the current (1961-2000) and multiple future (2081-2100) climates for the location of Tucson (Arizona). A general reduction of precipitation and a significant increase of air temperature are inferred. The eco-hydrological model is subsequently applied to detect changes in water recharge and vegetation dynamics for a desert shrub ecosystem typical of the semi-arid climate of southern Arizona. Results for the future climate account for uncertainties in the downscaling and are produced in terms of probability density functions. A comparison of control and future scenarios is discussed in terms of changes in the hydrological balance components, energy fluxes, and indices of vegetation productivity. An appreciable effect of climate change can be observed in metrics of vegetation performance. The negative impact on vegetation due to amplification of water stress in a warmer and drier climate is offset by a positive effect of increased carbon dioxide. This implies a positive shift in plant capabilities to exploit water; consequently, plant water use efficiency and rain use efficiency are expected to increase. Interesting differences in long-term vegetation productivity are also observed across the ensemble of future climates. The reduction of precipitation and the substantial maintenance of vegetation cover ultimately lead to the depletion of soil moisture and of recharge to deeper layers. Such an outcome can affect long-term water availability in semi-arid systems and expose plants to more severe and frequent periods of stress.

  9. Operational Protection from Unmanned Aerial Systems

    DTIC Science & Technology

    2015-05-15

    future threat posed by adversary UAS to U.S. forces, both in the form of system capabilities and methods of employment. It also addresses present counter UAS capabilities and recommends ways and means to provide better operational protection.

  10. The NASA automation and robotics technology program

    NASA Technical Reports Server (NTRS)

    Holcomb, Lee B.; Montemerlo, Melvin D.

    1986-01-01

    The development and objectives of the NASA automation and robotics technology program are reviewed. The objectives of the program are to utilize AI and robotics to increase the probability of mission success; decrease the cost of ground control; and increase the capability and flexibility of space operations. There is a need for real-time computational capability; an effective man-machine interface; and techniques to validate automated systems. Current programs in the areas of sensing and perception, task planning and reasoning, control execution, operator interface, and system architecture and integration are described. Programs aimed at demonstrating the capabilities of telerobotics and system autonomy are discussed.

  11. A bottom-up robust optimization framework for identifying river basin development pathways under deep climate uncertainty

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Ray, P.; Brown, C.

    2016-12-01

    Hydroclimatic nonstationarity due to climate change poses challenges for long-term water infrastructure planning in river basin systems. While designing flexible or adaptive strategies holds intuitive appeal, development of well-performing strategies requires rigorous quantitative analysis that addresses uncertainties directly while making the best use of scientific information on the expected evolution of future climate. Multi-stage robust optimization (RO) offers a potentially effective and efficient technique for staged basin-level planning under climate change; however, the necessity of assigning probabilities to future climate states or scenarios is an obstacle to implementation, given that methods to reliably assign such probabilities are not well developed. We present a method that overcomes this challenge by creating a bottom-up RO-based framework that decreases the dependency on probability distributions of future climate and instead employs them after optimization to aid selection among competing alternatives. The iterative process yields a vector of 'optimal' decision pathways, each under an associated set of probabilistic assumptions. In the final phase, the vector of optimal decision pathways is evaluated to identify the solutions that are least sensitive to the scenario probabilities and most likely given the climate information. The framework is illustrated for the planning of new dam and hydro-agricultural expansion projects in the Niger River Basin over a 45-year planning period from 2015 to 2060.
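
    A stripped-down sketch of the two-stage logic, with an invented pathway-by-scenario performance table: first find the 'optimal' pathway under each candidate probability vector, then prefer the pathway whose expected performance is least sensitive to those assumptions.

        import numpy as np

        # Rows: candidate decision pathways; columns: climate scenarios.
        # Entries: performance of a pathway under a scenario (illustrative).
        performance = np.array([[8.0, 6.0, 2.0],   # large dam built early
                                [6.0, 6.0, 5.0],   # staged expansion
                                [5.0, 5.0, 5.5]])  # small, flexible projects

        # Candidate probability vectors over scenarios (none taken as true).
        prob_sets = [np.array([0.6, 0.3, 0.1]),
                     np.array([1 / 3, 1 / 3, 1 / 3]),
                     np.array([0.1, 0.3, 0.6])]

        # Stage 1: the optimal pathway under each probabilistic assumption.
        optima = [int(np.argmax(performance @ p)) for p in prob_sets]

        # Stage 2: spread of expected performance across probability sets;
        # the least sensitive pathway has the smallest spread.
        expected = np.array([[perf @ p for p in prob_sets]
                             for perf in performance])
        spread = expected.max(axis=1) - expected.min(axis=1)
        print(optima, spread, int(np.argmin(spread)))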

  12. The Radiation, Interplanetary Shocks, and Coronal Sources (RISCS) Toolset

    NASA Technical Reports Server (NTRS)

    Zank, G. P.; Spann, J.

    2014-01-01

    We outline a plan to develop a physics-based predictive toolset, RISCS, to describe the interplanetary energetic particle and radiation environment throughout the inner heliosphere, including at the Earth. Forecasting and "nowcasting" the radiation environment requires the fusing of three components: 1) the ability to provide probabilities for incipient solar activity; 2) the use of these probabilities and daily coronal and solar wind observations to model the 3D spatial and temporal heliosphere, including magnetic field structure and transients, within 10 AU; and 3) the ability to model the acceleration and transport of energetic particles based on current and anticipated coronal and heliospheric conditions. We describe how to address 1) - 3) based on our existing, well developed, and validated codes and models. The goal of the RISCS toolset is to provide an operational forecast and "nowcast" capability that will predict a) solar energetic particle (SEP) intensities; b) spectra for protons and heavy ions; c) maximum energies and their duration; d) SEP composition; e) cosmic ray intensities; and f) plasma parameters, including shock arrival times, strength, and obliquity, at any given heliospheric location and time. The toolset would have a 72-hour predictive capability, with associated probabilistic bounds, updated hourly thereafter to improve the predicted event(s) and reduce the associated probability bounds. The RISCS toolset would be highly adaptable and portable, capable of running on a variety of platforms to accommodate various operational needs and requirements.

  13. Future leisure environments

    Treesearch

    Elwood L. Shafer; George H. Moeller; Russell E. Getty

    1974-01-01

    As an aid to policy- and decision-making about future environmental problems, a panel of experts was asked to predict the probabilities of future events associated with natural-resource management, wildland-recreation management, environmental pollution, population-workforce-leisure, and urban environments. Though some of the predictions projected to the year 2050 may...

  14. Maximal Predictability Approach for Identifying the Right Descriptors for Electrocatalytic Reactions.

    PubMed

    Krishnamurthy, Dilip; Sumaria, Vaidish; Viswanathan, Venkatasubramanian

    2018-02-01

    Density functional theory (DFT) calculations are being routinely used to identify new material candidates that approach activity near fundamental limits imposed by thermodynamics or scaling relations. DFT calculations are associated with inherent uncertainty, which limits the ability to delineate materials (distinguishability) that possess high activity. Development of error-estimation capabilities in DFT has enabled uncertainty propagation through activity-prediction models. In this work, we demonstrate an approach to propagating uncertainty through thermodynamic activity models, leading to a probability distribution of the computed activity and thereby its expectation value. A new metric, prediction efficiency, is defined, which provides a quantitative measure of the ability to distinguish the activity of materials and can be used to identify the optimal descriptor(s) ΔG_opt. We demonstrate the framework for four important electrochemical reactions: hydrogen evolution, chlorine evolution, oxygen reduction, and oxygen evolution. Future studies could utilize expected activity and prediction efficiency to significantly improve the prediction accuracy of highly active material candidates.
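
    A minimal sketch of the propagation idea, assuming a toy volcano-shaped activity relation and a Gaussian spread for the DFT error (all numbers illustrative): sampling the descriptor produces a distribution of activities, its expectation value, and a distinguishability probability of the kind the prediction-efficiency metric is meant to summarize.

        import numpy as np

        rng = np.random.default_rng(7)

        def activity(dG, dG_opt=0.0, slope=1.0):
            """Toy volcano: activity peaks at the optimal descriptor value."""
            return -slope * np.abs(dG - dG_opt)

        # Computed descriptor values (eV) for two candidates, with the DFT
        # error estimate treated as a Gaussian width (values illustrative).
        mu_A, mu_B, sigma_dft = 0.10, 0.25, 0.20

        act_A = activity(rng.normal(mu_A, sigma_dft, 100_000))
        act_B = activity(rng.normal(mu_B, sigma_dft, 100_000))

        # Expectation values of the activity under uncertainty, and the
        # probability that A truly out-performs B (distinguishability).
        print(act_A.mean(), act_B.mean(), (act_A > act_B).mean())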

  15. When the Sun's Away, N2O5 Comes Out to Play: An Updated Analysis of Ambient N2O5 Heterogeneous Chemistry

    NASA Astrophysics Data System (ADS)

    McDuffie, E. E.; Brown, S. S.

    2017-12-01

    The heterogeneous chemistry of N2O5 impacts the budget of tropospheric oxidants, which directly controls air quality at Earth's surface. The reaction between gas-phase N2O5 and aerosol particles occurs largely at night and is therefore more important during the less-intensively-studied winter season. Though N2O5-aerosol interactions are vital for the accurate understanding and simulation of tropospheric chemistry and air quality, many uncertainties persist in our understanding of how various environmental factors influence the reaction rate and probability. Quantitative and accurate evaluation of these factors directly improves the predictive capabilities of atmospheric models used to inform mitigation strategies for wintertime air pollution. In an update to last year's presentation, "The Wintertime Fate of N2O5: Observations and Box Model Analysis for the 2015 WINTER Aircraft Campaign," this presentation will focus on recent field results regarding new information about N2O5 heterogeneous chemistry and future research directions.

  16. Propulsion Ground Testing: Planning for the Future

    NASA Technical Reports Server (NTRS)

    Bruce, Robert

    2003-01-01

    Advanced planners are constantly being asked to plan for the provision of future test capability. Historically, this capability is provided either by substantial investment in new test facility capabilities or by substantial investment in the modification of pre-existing test capabilities. The key words in the previous sentence are "substantial investment." In an environment of increasingly constrained resources, how is an advanced planner to plan for the provision of such capabilities? Additionally, a conundrum exists: program formulation decisions are based on life cycle cost considerations, yet the more immediate challenge of "front-end" capital investment is often the linchpin upon which early decisions are made. In such an environment, how are plans and decisions made? This paper cites examples of past decisions in the areas of both major test facility upgrades and major new test facility investment.

  17. Building Airport Surface HITL Simulation Capability

    NASA Technical Reports Server (NTRS)

    Chinn, Fay Cherie

    2016-01-01

    FutureFlight Central (FFC) is a high fidelity, real-time simulator designed to study surface operations and automation. As an air traffic control tower simulator, FFC allows stakeholders such as the FAA, controllers, pilots, airports, and airlines to develop and test advanced surface and terminal area concepts and automation, including NextGen and beyond automation concepts and tools. These technologies will address the safety, capacity, and environmental issues facing the National Airspace System. FFC also has extensive video streaming capabilities, which, combined with its 3-D database capability, make the facility ideal for any research needing an immersive virtual and/or video environment. FutureFlight Central allows human-in-the-loop testing, which accommodates human interactions and errors, giving a more complete picture than fast-time simulations. This presentation describes FFC's capabilities and the components necessary to build an airport surface human-in-the-loop simulation capability.

  18. Evaluation of artillery equipment maintenance support capability based on grey clustering

    NASA Astrophysics Data System (ADS)

    Zhai, Mei-jie; Gao, Peng

    2017-12-01

    In this paper, the theory and methods of evaluating equipment maintenance support capability in China and abroad are studied from the point of view of the combat tasks of artillery troops and their strategic importance in future military struggle. The paper establishes a framework for the evaluation index system of artillery equipment maintenance support capability, applies the grey clustering method to the evaluation of that capability, and finally evaluates the equipment maintenance and support capability of an artillery brigade as an example, analyzing the evaluation results. The paper identifies outstanding problems in military equipment maintenance and support and puts forward constructive suggestions to improve the status of equipment maintenance and support and raise the level of future equipment maintenance.
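
    A minimal fixed-weight grey clustering sketch (index values, weights, grey classes, and triangular whitenization functions are all invented for illustration) shows the mechanics of assigning a unit to a capability grade:

        import numpy as np

        # One unit's scores on three maintenance-support indices (0-10 scale)
        # and the index weights from the evaluation framework (illustrative).
        x = np.array([7.5, 4.0, 6.0])
        w = np.array([0.5, 0.3, 0.2])

        def tri(val, lo, peak, hi):
            """Triangular whitenization weight function for one grey class."""
            if val <= lo or val >= hi:
                return 0.0
            return (val - lo) / (peak - lo) if val <= peak else (hi - val) / (hi - peak)

        # Grey classes: poor, average, good (parameters illustrative).
        classes = {'poor': (-0.001, 0.0, 5.0),
                   'average': (2.0, 5.0, 8.0),
                   'good': (5.0, 10.0, 10.001)}

        # Fixed-weight clustering coefficient per class; assign to the max.
        sigma = {name: sum(w[j] * tri(x[j], *params) for j in range(len(x)))
                 for name, params in classes.items()}
        print(sigma, max(sigma, key=sigma.get))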

  19. Counterforce Targeting Capabilities and Challenges

    DTIC Science & Technology

    2004-08-01

    COUNTERFORCE TARGETING CAPABILITIES AND CHALLENGES by Barry R. Schneider. The Counterproliferation Papers, Future Warfare Series No. 22, USAF...Barry R. Schneider, August 2004, The Counterproliferation

  20. Mount St. Helens a decade after the 1980 eruptions: magmatic models, chemical cycles, and a revised hazards assessment

    USGS Publications Warehouse

    Pallister, J.S.; Hoblitt, R.P.; Crandell, D.R.; Mullineaux, D.R.

    1992-01-01

    Available geophysical and geologic data provide a simplified model of the current magmatic plumbing system of Mount St. Helens (MSH). This model and new geochemical data are the basis for the revised hazards assessment presented here. The assessment is weighted by the style of eruptions and the chemistry of magmas erupted during the past 500 years, the interval for which the most detailed stratigraphic and geochemical data are available. This interval includes the Kalama (A.D. 1480-1770s?), Goat Rocks (A.D. 1800-1857), and current eruptive periods. In each of these periods, silica content decreased, then increased. The Kalama is a large amplitude chemical cycle (SiO2: 57%-67%), produced by mixing of arc dacite, which is depleted in high field-strength and incompatible elements, with enriched (OIB-like) basalt. The Goat Rocks and current cycles are of small amplitude (SiO2: 61%-64% and 62%-65%) and are related to the fluid dynamics of magma withdrawal from a zoned reservoir. The cyclic behavior is used to forecast future activity. The 1980-1986 chemical cycle, and consequently the current eruptive period, appears to be virtually complete. This inference is supported by the progressively decreasing volumes and volatile contents of magma erupted since 1980, both changes that suggest a decreasing potential for a major explosive eruption in the near future. However, recent changes in seismicity and a series of small gas-release explosions (beginning in late 1989 and accompanied by eruption of a minor fraction of relatively low-silica tephra on 6 January and 5 November 1990) suggest that the current eruptive period may continue to produce small explosions and that a small amount of magma may still be present within the conduit. The gas-release explosions occur without warning and pose a continuing hazard, especially in the crater area. An eruption as large or larger than that of 18 May 1980 (~0.5 km3 dense-rock equivalent) probably will occur only if magma rises from an inferred deep (~7 km), relatively large (5-7 km3) reservoir. A conservative approach to hazard assessment is to assume that this deep magma is rich in volatiles and capable of erupting explosively to produce voluminous fall deposits and pyroclastic flows. Warning of such an eruption is expectable, however, because magma ascent would probably be accompanied by shallow seismicity that could be detected by the existing seismic-monitoring system. A future large-volume eruption (~0.1 km3) is virtually certain; the eruptive history of the past 500 years indicates the probability of a large explosive eruption is at least 1% annually. Intervals between large eruptions at Mount St. Helens have varied widely; consequently, we cannot confidently forecast whether the next large eruption will be years, decades, or farther in the future. However, we can forecast the types of hazards, and the areas that will be most affected by future large-volume eruptions, as well as hazards associated with the approaching end of the current eruptive period. © 1992 Springer-Verlag.
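
    The quoted annual probability converts directly into multi-year odds by taking the complement of "no large eruption in any year"; a two-line computation (assuming year-to-year independence, which the abstract itself does not assert) makes the point:

        # At >= 1% per year, P(at least one large eruption in n years):
        p_annual = 0.01
        for years in (10, 30, 100):
            print(years, round(1 - (1 - p_annual) ** years, 3))
        # roughly 10% in a decade, 26% in 30 years, 63% in a century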

  1. Psychophysics of the probability weighting function

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have remained unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (with 0 < α < 1, w(0) = 0, w(1/e) = 1/e, and w(1) = 1), which has been extensively studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. The relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are also derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
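
    For reference, a small numerical sketch of the Prelec function verifies its characteristic properties: the fixed point at 1/e, and the overweighting of small probabilities with underweighting of large ones for 0 < α < 1 (α = 0.65 is an arbitrary choice).

        import numpy as np

        def prelec_w(p, alpha=0.65):
            """Prelec (1998) weighting: w(p) = exp(-(-ln p)**alpha)."""
            p = np.asarray(p, dtype=float)
            return np.exp(-(-np.log(p)) ** alpha)

        print(prelec_w([0.01, 0.1, 1 / np.e, 0.5, 0.9, 0.99]))

        # Fixed point at 1/e holds for every alpha; small p overweighted,
        # large p underweighted, as prospect theory requires.
        assert abs(prelec_w(1 / np.e) - 1 / np.e) < 1e-12
        assert prelec_w(0.01) > 0.01 and prelec_w(0.9) < 0.9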

  2. Monitoring and Evaluation of Cultivated Land Irrigation Guarantee Capability with Remote Sensing

    NASA Astrophysics Data System (ADS)

    Zhang, C., Sr.; Huang, J.; Li, L.; Wang, H.; Zhu, D.

    2015-12-01

    Cultivated land quality grade monitoring and evaluation is an important way to improve land production capability and ensure national food safety. Irrigation guarantee capability is one of the important aspects of cultivated land quality monitoring and evaluation. In the current field-survey-based monitoring process, determining the irrigation rate requires a large investment of human resources in a long investigation process. This study chose the Beijing-Tianjin-Hebei region as the study area, taking as the study object 1 km × 1 km grid cells of cultivated land under a winter wheat-summer maize double cropping system. A new irrigation capability evaluation index based on the ratio of the annual irrigation requirement retrieved from MODIS data to the actual quantity of irrigation was proposed. Using several years of monitoring results, the irrigation guarantee capability of the study area was evaluated comprehensively. The trend of the irrigation guarantee capability index (IGCI) was generally consistent with the agricultural drought disaster area reported in the rural statistical yearbooks of the Beijing-Tianjin-Hebei area. The average IGCI value, the probability of an irrigation-guaranteed year, and a weighted average controlled by the irrigation demand index were used and compared in this paper. The experimental results indicate that the classification result from the present method is close to that from irrigation probability in the 2012 gradation of agricultural land quality, with 73% of units overlapping. The monitoring and evaluation method proposed in this paper has potential for cultivated land quality monitoring and evaluation in China.

  3. Constructing alternative futures

    Treesearch

    David N. Wear; Robert Huggett; John G. Greis

    2013-01-01

    The desired product of the Southern Forest Futures Project is a mechanism that will help southerners think about and prepare for future changes in their forests and the benefits they provide. Because any single projection of the world’s (or a region’s) biological, physical, and social systems has a high probability of being incorrect, the Futures Project instead...

  4. Development of Modern Performance Assessment Tools and Capabilities for Underground Disposal of Transuranic Waste at WIPP

    NASA Astrophysics Data System (ADS)

    Zeitler, T.; Kirchner, T. B.; Hammond, G. E.; Park, H.

    2014-12-01

    The Waste Isolation Pilot Plant (WIPP) has been developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. Containment of TRU waste at the WIPP is regulated by the U.S. Environmental Protection Agency (EPA). The DOE demonstrates compliance with the containment requirements by means of performance assessment (PA) calculations. WIPP PA calculations estimate the probability and consequence of potential radionuclide releases from the repository to the accessible environment for a regulatory period of 10,000 years after facility closure. The long-term performance of the repository is assessed using a suite of sophisticated computational codes. In a broad modernization effort, the DOE has overseen the transfer of these codes to modern hardware and software platforms. Additionally, there is a current effort to establish new performance assessment capabilities through the further development of the PFLOTRAN software, a state-of-the-art massively parallel subsurface flow and reactive transport code. Improvements to the current computational environment will result in greater detail in the final models due to the parallelization afforded by the modern code. Parallelization will allow for relatively faster calculations, as well as a move from a two-dimensional calculation grid to a three-dimensional grid. The result of the modernization effort will be a state-of-the-art subsurface flow and transport capability that will serve WIPP PA into the future. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S Department of Energy.

  5. NASA's Space Launch System: Deep-Space Delivery for SmallSats

    NASA Technical Reports Server (NTRS)

    Robinson, Kimberly F.; Norris, George

    2017-01-01

    Designed for human exploration missions into deep space, NASA's Space Launch System (SLS) represents a new spaceflight infrastructure asset, enabling a wide variety of unique utilization opportunities. While primarily focused on launching the large systems needed for crewed spaceflight beyond Earth orbit, SLS also offers a game-changing capability for the deployment of small satellites to deep-space destinations, beginning with its first flight. Currently, SLS is making rapid progress toward readiness for its first launch in two years, using the initial configuration of the vehicle, which is capable of delivering more than 70 metric tons (t) to Low Earth Orbit (LEO). Planning is underway for smallsat accommodations on future configurations of the vehicle, which will present additional opportunities. This paper will include an overview of the SLS vehicle and its capabilities, including the current status of progress toward first launch. It will also explain the current and future opportunities the vehicle offers for small satellites, including an overview of the CubeSat manifest for Exploration Mission-1 in 2018 and a discussion of future capabilities.

  6. Cyclone: A close air support aircraft for tomorrow

    NASA Technical Reports Server (NTRS)

    Cox, George; Croulet, Donald; Dunn, James; Graham, Michael; Ip, Phillip; Low, Scott; Vance, Gregg; Volckaert, Eric

    1991-01-01

    To meet the threat of the battlefield of the future, U.S. ground forces will require reliable air support. To provide this support, future aircrews demand a versatile close air support aircraft capable of delivering ordnance during the day, at night, or in adverse weather with pinpoint accuracy. The Cyclone aircraft meets these requirements, packing the 'punch' necessary to clear the way for effective ground operations. Possessing anti-armor, missile, and precision bombing capability, the Cyclone will counter the threat into the 21st century. Here, it is shown that the Cyclone is a realistic, economical answer to the demand for a capable close air support aircraft.

  7. Summary of intrinsic and extrinsic factors affecting detection probability of marsh birds

    USGS Publications Warehouse

    Conway, C.J.; Gibbs, J.P.

    2011-01-01

    Many species of marsh birds (rails, bitterns, grebes, etc.) rely exclusively on emergent marsh vegetation for all phases of their life cycle, and many organizations have become concerned about the status and persistence of this group of birds. Yet, marsh birds are notoriously difficult to monitor due to their secretive habits. We synthesized the published and unpublished literature and summarized the factors that influence detection probability of secretive marsh birds in North America. Marsh birds are more likely to respond to conspecific than heterospecific calls, and seasonal peak in vocalization probability varies among co-existing species. The effectiveness of morning versus evening surveys varies among species and locations. Vocalization probability appears to be positively correlated with density in breeding Virginia Rails (Rallus limicola), Soras (Porzana carolina), and Clapper Rails (Rallus longirostris). Movement of birds toward the broadcast source creates biases when using count data from call-broadcast surveys to estimate population density. Ambient temperature, wind speed, cloud cover, and moon phase affected detection probability in some, but not all, studies. Better estimates of detection probability are needed. We provide recommendations that would help improve future marsh bird survey efforts and a list of 14 priority information and research needs that represent gaps in our current knowledge where future resources are best directed. © Society of Wetland Scientists 2011.

  8. China Naval Modernization: Implications for U.S. Navy Capabilities - Background and Issues for Congress

    DTIC Science & Technology

    2010-04-09

    blunting the operations of those forces that do deploy forward. North Korea and Iran, as part of their defiance of international norms, are actively...section ‘Deter and Defeat Aggression in Anti-Access Environments,’ those are primarily capabilities needed for defeating China, not Iran, North Korea or...North Korea and Iran. John J. Tkacik, a former State Department China specialist, said the changes were probably ordered by the White House. "By

  9. Utilizing an Intelligent Tutoring System in Tactical Action Officer Sandbox

    DTIC Science & Technology

    2014-06-01

    The Office of Naval Research Future Naval Capabilities, the Defense Technology Area Plan from 2005, and the Department of Defense Science and Technology Priorities for FY13-17 all share a focus on systems to promote warfighter performance. The goal of these systems is

  10. Defining an Approach for Future Close Air Support Capability

    DTIC Science & Technology

    2017-01-01

    may take on the form of a force-mix study that considers multiple joint scenarios and missions...The authors would like to thank...the Army and other services on an approach for defining future CAS capability...unit; one British soldier was killed, and five others were wounded. Only one A-10 was shot down during all of OIF and OEF. However, it should be

  11. Bullet Impact Safety Study of PBX-9502

    NASA Astrophysics Data System (ADS)

    Ferranti, Louis

    2013-06-01

    A new small arms capability for performing bullet impact testing into energetic materials has recently been activated at Lawrence Livermore National Laboratory, located in the High Explosives Applications Facility (HEAF). The initial capability includes 0.223, 0.30, and 0.50 caliber testing, with the flexibility to add other barrels in the near future. An initial test series has been performed using the 0.50 caliber barrel, shooting bullets into targets of the TATB-based explosive PBX-9502, and shows the expected non-violent reaction. Future experiments to evaluate the safety of new explosive formulations against bullet impact are planned. A highlight of the new capability, along with discussion of the initial experiments to date, will be presented, including future areas of research. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  12. Comonotonic bounds on the survival probabilities in the Lee-Carter model for mortality projection

    NASA Astrophysics Data System (ADS)

    Denuit, Michel; Dhaene, Jan

    2007-06-01

    In the Lee-Carter framework, future survival probabilities are random variables with an intricate distribution function. In large homogeneous portfolios of life annuities, value-at-risk or conditional tail expectation of the total yearly payout of the company are approximately equal to the corresponding quantities involving random survival probabilities. This paper aims to derive some bounds in the increasing convex (or stop-loss) sense on these random survival probabilities. These bounds are obtained with the help of comonotonic upper and lower bounds on sums of correlated random variables.
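
    A Monte Carlo sketch with invented single-age Lee-Carter parameters shows how the random survival probabilities arise: simulate the mortality index k_t as a random walk with drift, map it to mortality rates, and summarize the tail relevant to an annuity writer.

        import numpy as np

        rng = np.random.default_rng(3)

        # Lee-Carter for one age x: ln m(x,t) = a_x + b_x * k_t, with k_t a
        # random walk with drift (all parameter values illustrative).
        a_x, b_x = -4.5, 0.04
        drift, vol = -0.5, 0.6
        horizon, sims = 20, 50_000

        k = np.cumsum(rng.normal(drift, vol, (sims, horizon)), axis=1)
        m = np.exp(a_x + b_x * k)            # mortality rates per path/year
        p_year = np.exp(-m)                  # random one-year survival probs

        # Whole-horizon survival per path, and an adverse upper quantile
        # (unexpectedly high survival, the costly tail for an annuity).
        p_horizon = p_year.prod(axis=1)
        print(p_horizon.mean(), np.quantile(p_horizon, 0.995))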

  13. A blueprint for demonstrating quantum supremacy with superconducting qubits.

    PubMed

    Neill, C; Roushan, P; Kechedzhi, K; Boixo, S; Isakov, S V; Smelyanskiy, V; Megrant, A; Chiaro, B; Dunsworth, A; Arya, K; Barends, R; Burkett, B; Chen, Y; Chen, Z; Fowler, A; Foxen, B; Giustina, M; Graff, R; Jeffrey, E; Huang, T; Kelly, J; Klimov, P; Lucero, E; Mutus, J; Neeley, M; Quintana, C; Sank, D; Vainsencher, A; Wenner, J; White, T C; Neven, H; Martinis, J M

    2018-04-13

    A key step toward demonstrating a quantum system that can address difficult problems in physics and chemistry will be performing a computation beyond the capabilities of any classical computer, thus achieving so-called quantum supremacy. In this study, we used nine superconducting qubits to demonstrate a promising path toward quantum supremacy. By individually tuning the qubit parameters, we were able to generate thousands of distinct Hamiltonian evolutions and probe the output probabilities. The measured probabilities obey a universal distribution, consistent with uniformly sampling the full Hilbert space. As the number of qubits increases, the system continues to explore the exponentially growing number of states. Extending these results to a system of 50 qubits has the potential to address scientific questions that are beyond the capabilities of any classical computer. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
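
    The universal distribution of output probabilities can be illustrated with a Haar-random state standing in for a random-circuit output (a deliberate simplification of the experiment): the rescaled probabilities N·p follow the exponential Porter-Thomas law.

        import numpy as np

        rng = np.random.default_rng(5)
        dim = 2 ** 9                     # nine qubits, as in the experiment

        # Haar-random pure state: a normalized complex Gaussian vector.
        z = rng.normal(size=dim) + 1j * rng.normal(size=dim)
        state = z / np.linalg.norm(z)

        probs = np.abs(state) ** 2       # output probabilities

        # Porter-Thomas: N*p is ~exponential, so Pr(N*p > x) ~ exp(-x).
        x = dim * probs
        print("mean of N*p (expect ~1):", x.mean())
        print("P(N*p > 1) (expect ~exp(-1) ~ 0.37):", (x > 1).mean())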

  14. Metrics for Business Process Models

    NASA Astrophysics Data System (ADS)

    Mendling, Jan

    Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument implies that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not make in a small model. A recent study by Mendling et al. [275] explores to what extent certain complexity metrics of business process models can serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.

  15. Idaho's Library Future.

    ERIC Educational Resources Information Center

    Idaho State Library, Boise.

    In 1998, Idahoans gathered in a series of six Regional Futures Conferences to identify what they thought was probable during the next ten years, what was possible for libraries to do and be, and what a preferred future of Idaho libraries might be. Participants from the regional conferences then convened to refine and focus descriptions of the…

  16. The Future Outlook for School Facilities Planning and Design.

    ERIC Educational Resources Information Center

    Brubaker, C. William

    School design is influenced by four major factors: the education program, the community site, education technology, and building technology. Schools of the future are discussed in relation to the factors affecting school design. It is probable that future schools will be involved in a broader spectrum of programs and will serve a more diverse…

  17. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, and assessment of the statistical stability and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, the critical quality attributes, sigma process capability, and stability of the process were evaluated. The overall study contributes an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to areas where quality improvement and quality risk assessment principles can be applied to achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determining the quality of any production process, and it is new to the pharmaceutical tablet production process, where the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among the quality control parameters. Sequential application of normality tests, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable process, which is liable for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
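
    To make the capability arithmetic concrete, here is a minimal sketch (the attribute name, data, and specification limits are invented for illustration, not taken from the study): Cp and Cpk follow from the sample mean and standard deviation against two-sided specification limits, and a common convention reads the short-term sigma level as roughly 3·Cpk.

    ```python
    import numpy as np

    def capability(samples, lsl, usl):
        """Process capability indices for a two-sided specification."""
        mu, sigma = np.mean(samples), np.std(samples, ddof=1)
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)
        return cp, cpk

    rng = np.random.default_rng(1)
    assay = rng.normal(loc=100.0, scale=1.2, size=200)  # simulated assay (% label claim)
    cp, cpk = capability(assay, lsl=95.0, usl=105.0)

    # Common convention: short-term sigma level ~ 3 * Cpk.
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, sigma level ~ {3 * cpk:.1f}")
    ```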

  18. Projected status of the Pacific walrus (Odobenus rosmarus divergens) in the twenty-first century

    USGS Publications Warehouse

    Jay, Chadwick V.; Marcot, Bruce G.; Douglas, David C.

    2011-01-01

    Extensive and rapid losses of sea ice in the Arctic have raised conservation concerns for the Pacific walrus (Odobenus rosmarus divergens), a large pinniped inhabiting arctic and subarctic continental shelf waters of the Chukchi and Bering seas. We developed a Bayesian network model to integrate potential effects of changing environmental conditions and anthropogenic stressors on the future status of the Pacific walrus population at four periods through the twenty-first century. The model framework allowed for inclusion of various sources and levels of knowledge, and representation of structural and parameter uncertainties. Walrus outcome probabilities through the century reflected a clear trend of worsening conditions for the subspecies. From the current observation period to the end of century, the greatest change in walrus outcome probabilities was a progressive decrease in the outcome state of robust and a concomitant increase in the outcome state of vulnerable. The probabilities of rare and extirpated states each progressively increased but remained <10% through the end of the century. The summed probabilities of vulnerable, rare, and extirpated (P(v,r,e)) increased from a current level of 10% in 2004 to 22% by 2050 and 40% by 2095. The degree of uncertainty in walrus outcomes increased monotonically over future periods. In the model, sea ice habitat (particularly for summer/fall) and harvest levels had the greatest influence on future population outcomes. Other potential stressors had much smaller influences on walrus outcomes, mostly because of uncertainty in their future states and our current poor understanding of their mechanistic influence on walrus abundance.

  19. Space shuttle solid rocket booster recovery system definition, volume 1

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The performance requirements, preliminary designs, and development program plans for an airborne recovery system for the space shuttle solid rocket booster are discussed. The analyses performed during the study phase of the program are presented. The basic considerations which established the system configuration are defined. A Monte Carlo statistical technique using random sampling of the probability distribution for the critical water impact parameters was used to determine the failure probability of each solid rocket booster component as functions of impact velocity and component strength capability.
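
    A minimal sketch of the stress-strength Monte Carlo idea (the load model, distributions, and parameters below are invented for illustration, not the study's): sample impact conditions and component strength, then estimate failure probability as the fraction of trials in which the induced load exceeds the strength capability.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_trials = 100_000

    # Illustrative stand-ins: impact velocity at splashdown and the load it induces.
    impact_velocity = rng.normal(30.0, 3.0, n_trials)   # m/s
    induced_load = 0.5 * impact_velocity ** 2           # toy kinetic-term load model
    strength = rng.normal(550.0, 60.0, n_trials)        # component strength capability

    # Failure occurs whenever the induced load exceeds the sampled strength.
    p_fail = np.mean(induced_load > strength)
    print(f"estimated failure probability: {p_fail:.4f}")
    ```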

  20. Analytical and Radiochemistry for Nuclear Forensics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steiner, Robert Ernest; Dry, Donald E.; Kinman, William Scott

    An overview of nonproliferation nuclear forensics, covering forensics activities at Los Alamos National Laboratory, radioanalytical work at LANL, radiochemical characterization capabilities, bulk chemical and materials analysis capabilities, and future interests in forensics interactions.

  1. Developing a probability-based model of aquifer vulnerability in an agricultural region

    NASA Astrophysics Data System (ADS)

    Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei

    2013-04-01

    Hydrogeological settings of aquifers strongly influence regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging, and determined various risk categories of contamination potential based on estimated vulnerability indexes. Categories and ratings of the six parameters in the probability-based DRASTIC model were probabilistically characterized according to two parameter classification methods: selecting the maximum estimation probability and calculating an expected value. Moreover, the probability-based estimation and assessment gave excellent insight into propagating the uncertainty of parameters due to limited observation data. To examine the developed model's capacity to predict pollution, the medium, high, and very high risk categories of contamination potential were compared with observed nitrate-N concentrations exceeding 0.5 mg/L, which indicate anthropogenic groundwater pollution. The results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and characterizing parameter uncertainty via the probability estimation processes.
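
    To illustrate the expected-value classification (a minimal sketch: the cell, its category probabilities, and ratings are invented, and the classic seven-parameter DRASTIC weights are shown even though the study uses a six-parameter variant): each parameter's uncertain rating is replaced by its expectation under the kriging-derived category probabilities before taking the weighted sum.

    ```python
    import numpy as np

    # Classic DRASTIC weights (the study uses a six-parameter variant).
    weights = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

    # One grid cell: (category probabilities, category ratings) per parameter.
    cell = {
        "D": ([0.2, 0.8], [5, 7]),
        "R": ([0.6, 0.4], [6, 8]),
        "A": ([1.0], [6]),
        "S": ([0.5, 0.5], [4, 6]),
        "T": ([1.0], [9]),
        "I": ([0.3, 0.7], [4, 6]),
        "C": ([1.0], [4]),
    }

    # Expected rating per parameter, then the weighted vulnerability index.
    index = sum(w * np.dot(*cell[p]) for p, w in weights.items())
    print(f"expected vulnerability index: {index:.1f}")
    ```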

  2. Ares V Launch Capability Enables Future Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2007-01-01

    NASA's Ares V cargo launch vehicle offers the potential to completely change the paradigm of future space science mission architectures. A major finding of the NASA Advanced Telescope and Observatory Capability Roadmap Study was that current launch vehicle mass and volume constraints severely limit future space science missions, and thus that significant technology development is required to package increasingly larger collecting apertures into existing launch shrouds. The Ares V greatly relaxes these constraints. For example, while a Delta IV can launch approximately a 4.5 meter diameter payload with a mass of 13,000 kg to L2, the Ares V is projected to be able to launch an 8 to 12 meter diameter payload with a mass of 60,000 kg to L2 and 130,000 kg to low Earth orbit. This paper summarizes the Ares V payload launch capability and introduces how it might enable new classes of future space telescopes, such as 6 to 8 meter class monolithic primary mirror observatories, 15 meter class segmented telescopes, 6 to 8 meter class x-ray telescopes, or high-energy particle calorimeters.

  3. Host-finding behaviour and navigation capabilities of symbiotic zooxanthellae

    NASA Astrophysics Data System (ADS)

    Pasternak, Zohar; Blasius, Bernd; Abelson, Avigdor; Achituv, Yair

    2006-05-01

    Past studies have shown that the initiation of symbiosis between the Red-Sea soft coral Heteroxenia fuscescens and its symbiotic dinoflagellates occurs due to the chemical attraction of the motile algal cells to substances emanating from the coral polyps. However, the resulting swimming patterns of zooxanthellae have not been previously studied. This work examined algal swimming behaviour, host location and navigation capabilities under four conditions: (1) still water, (2) still water with waterborne host attractants, (3) flowing water, and (4) flow with host attractants. Algae were capable of actively and effectively locating their host in still water as well as in flow. When in water containing host attractants, swimming became slower, motion patterns straighter, and the direction of motion was mainly towards the host, even if this meant advancing upstream against flow velocities of up to 0.5 mm s⁻¹. Coral-algae encounter probability decreased the further downstream of the host the algae were located, probably due to diffusion of the chemical signal. The results show how the chemoreceptive zooxanthellae modify their swimming pattern, direction, velocity, circuity and turning rate to accommodate efficient navigation in changing environmental conditions.

  4. Identifying 21st Century Capabilities

    ERIC Educational Resources Information Center

    Stevens, Robert

    2012-01-01

    What are the capabilities necessary to meet 21st century challenges? Much of the literature on 21st century skills focuses on skills necessary to meet those challenges associated with future work in a globalised world. The result is a limited characterisation of those capabilities necessary to address 21st century social, health and particularly…

  5. XRCF Testing Capabilities

    NASA Technical Reports Server (NTRS)

    Reily, Cary; Kegely, Jeff; Burdine, Robert (Technical Monitor)

    2001-01-01

    The Space Optics Manufacturing Technology Center's X-ray Calibration Facility has been recently modified to test Next Generation Space Telescope (NGST) developmental mirrors at cryogenic temperatures (35 degrees Kelvin) while maintaining capability for performance testing of x-ray optics and detectors. The facility's current cryo-optical testing capability and potential modifications for future support of NGST will be presented.

  6. Future Interagency Range and Spaceport Technologies (FIRST) Formulation Products: 1. Transformational Spaceport and Range Concept of Operations. 2. F.I.R.S.T. Business Case Analysis

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The Baseline Report captures range and spaceport capabilities at five sites: KSC, CCAFS, VAFB, Wallops, and Kodiak. The Baseline depicts a future state that relies on existing technology, planned upgrades, and straight-line recapitalization at these sites projected through 2030. The report presents an inventory of current spaceport and range capabilities at these five sites. The baseline is the first part of analyzing a business case for a set of capabilities designed to transform U.S. ground and space launch operations toward a single, integrated national "system" of space transportation systems. The second part of the business case compares current capabilities with technologies needed to support the integrated national "system". The final part, a return on investment analysis, identifies the technologies that best lead to the integrated national system and reduce recurring costs. Numerous data sources were used to define and describe the baseline spaceport and range by identifying major systems and elements and describing their capabilities and limitations.

  7. Initial Multidisciplinary Design and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Ozoroski, L. P.; Geiselhart, K. A.; Padula, S. L.; Li, W.; Olson, E. D.; Campbell, R. L.; Shields, E. W.; Berton, J. J.; Gray, J. S.; Jones, S. M.; et al.

    2010-01-01

    Within the Supersonics (SUP) Project of the Fundamental Aeronautics Program (FAP), an initial multidisciplinary design & analysis framework has been developed. A set of low- and intermediate-fidelity discipline design and analysis codes were integrated within a multidisciplinary design and analysis framework and demonstrated on two challenging test cases. The first test case demonstrates an initial capability to design for low boom and performance. The second test case demonstrates rapid assessment of a well-characterized design. The current system has been shown to greatly increase the design and analysis speed and capability, and many future areas for development were identified. This work has established a state-of-the-art capability for immediate use by supersonic concept designers and systems analysts at NASA, while also providing a strong base to build upon for future releases as more multifidelity capabilities are developed and integrated.

  8. Development of a New Data Tool for Computing Launch and Landing Availability with Respect to Surface Weather

    NASA Technical Reports Server (NTRS)

    Burns, K. Lee; Altino, Karen

    2008-01-01

    The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Their existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability given an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to provide the Space Shuttle program the ability to estimate the impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing site weather to produce landing availability, and concurrent analysis of multiple sites, to assist in operational landing site selection. In addition, the Constellation program has also expressed interest in the APRA tool and has requested several additional capabilities to address some Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun on a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the existing tool.
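
    A minimal sketch of the climatological-exceedance idea behind this kind of availability tool (constraints and distributions are invented for illustration, not APRA's): availability for one season/hour bin is simply the fraction of historical observations that satisfy every weather constraint simultaneously.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_obs = 10_000                                  # observations in one climatology bin

    # Illustrative stand-ins for archived surface weather observations.
    peak_wind = rng.weibull(2.0, n_obs) * 12.0      # knots
    cloud_ceiling = rng.lognormal(8.0, 1.0, n_obs)  # feet

    # Launch availability = climatological probability of meeting all constraints.
    constraints_met = (peak_wind <= 24.0) & (cloud_ceiling >= 8000.0)
    print(f"launch availability: {constraints_met.mean():.1%}")
    ```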

  9. Frequency analysis of uncertain structures using imprecise probability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Modares, Mehdi; Bergerson, Joshua

    2015-01-01

    Two new methods for finite element based frequency analysis of a structure with uncertainty are developed. An imprecise probability formulation based on enveloping p-boxes is used to quantify the uncertainty present in the mechanical characteristics of the structure. For each element, independent variations are considered. Using the two developed methods, P-box Frequency Analysis (PFA) and Interval Monte-Carlo Frequency Analysis (IMFA), sharp bounds on natural circular frequencies at different probability levels are obtained. These methods establish a framework for handling incomplete information in structural dynamics. Numerical example problems are presented that illustrate the capabilities of the new methods, along with discussions on their computational efficiency.
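
    A minimal sketch of the interval Monte-Carlo idea on a single-degree-of-freedom stand-in for the finite element problem (all numbers, and the ±5% epistemic interval, are invented): each draw yields an interval of stiffness values and hence an interval on ω = √(k/m), and percentiles of the interval endpoints bound the frequency at each probability level.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_samples, mass = 10_000, 2.0

    # Each Monte-Carlo draw is an interval [k_lo, k_hi] taken from the p-box.
    k_mid = rng.normal(800.0, 40.0, n_samples)   # aleatory sample (N/m)
    k_lo, k_hi = 0.95 * k_mid, 1.05 * k_mid      # +/-5% epistemic interval

    # Interval arithmetic for omega = sqrt(k/m), which is monotone in k.
    omega_lo, omega_hi = np.sqrt(k_lo / mass), np.sqrt(k_hi / mass)
    for q in (5, 50, 95):
        print(f"{q}th percentile bounds on omega: "
              f"[{np.percentile(omega_lo, q):.2f}, {np.percentile(omega_hi, q):.2f}] rad/s")
    ```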

  10. The Relevance and Future of Joint Logistics Over the Shore (JLOTS) Operations

    DTIC Science & Technology

    2013-04-01

    …whether commercial or emerging technology, to ensure viable capability as older platforms and capabilities reach their economical useful life. … and in the Falklands, of course, out of the question. Therefore, when the military junta of Argentina invaded the Falklands on 2 April 1982…

  11. Focus in High School Mathematics: Statistics and Probability

    ERIC Educational Resources Information Center

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  12. Assessing future vent opening locations at the Somma-Vesuvio volcanic complex: 2. Probability maps of the caldera for a future Plinian/sub-Plinian event with uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Tadini, A.; Bevilacqua, A.; Neri, A.; Cioni, R.; Aspinall, W. P.; Bisson, M.; Isaia, R.; Mazzarini, F.; Valentine, G. A.; Vitale, S.; Baxter, P. J.; Bertagnini, A.; Cerminara, M.; de Michieli Vitturi, M.; Di Roberto, A.; Engwell, S.; Esposti Ongaro, T.; Flandoli, F.; Pistolesi, M.

    2017-06-01

    In this study, we combine reconstructions of volcanological data sets and inputs from a structured expert judgment to produce a first long-term probability map for vent opening location for the next Plinian or sub-Plinian eruption of Somma-Vesuvio. In the past, the volcano has exhibited significant spatial variability in vent location; this can exert a significant control on where hazards materialize (particularly of pyroclastic density currents). The new vent opening probability mapping has been performed through (i) development of spatial probability density maps with Gaussian kernel functions for different data sets and (ii) weighted linear combination of these spatial density maps. The epistemic uncertainties affecting these data sets were quantified explicitly with expert judgments and implemented following a doubly stochastic approach. Various elicitation pooling metrics and subgroupings of experts and target questions were tested to evaluate the robustness of outcomes. Our findings indicate that (a) Somma-Vesuvio vent opening probabilities are distributed inside the whole caldera, with a peak corresponding to the area of the present crater, but with more than 50% probability that the next vent could open elsewhere within the caldera; (b) there is a mean probability of about 30% that the next vent will open west of the present edifice; (c) there is a mean probability of about 9.5% that the next medium-large eruption will enlarge the present Somma-Vesuvio caldera, and (d) there is a nonnegligible probability (mean value of 6-10%) that the next Plinian or sub-Plinian eruption will have its initial vent opening outside the present Somma-Vesuvio caldera.
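
    A minimal sketch of the two-step mapping recipe (coordinates, data sets, and weights are invented; the actual study combines several data sets with expert-elicited weights and a doubly stochastic treatment of uncertainty): build a Gaussian kernel density per data set, then take their weighted linear combination as the vent-opening probability map.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(5)
    # Illustrative point data sets (x, y in km): past vents and structural features.
    vents = rng.normal([0.0, 0.0], 1.0, size=(30, 2)).T
    fractures = rng.normal([1.0, 0.5], 2.0, size=(80, 2)).T

    kde_vents, kde_frac = gaussian_kde(vents), gaussian_kde(fractures)
    w_vents, w_frac = 0.7, 0.3        # illustrative elicited weights, summing to 1

    # Weighted linear combination of the kernel densities on a regular grid.
    grid = np.mgrid[-4:4:81j, -4:4:81j].reshape(2, -1)
    density = w_vents * kde_vents(grid) + w_frac * kde_frac(grid)
    density /= density.sum()          # normalize to a per-cell probability map
    print("peak cell probability:", density.max())
    ```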

  13. A blueprint for using climate change predictions in an eco-hydrological study

    NASA Astrophysics Data System (ADS)

    Caporali, E.; Fatichi, S.; Ivanov, V. Y.

    2009-12-01

    There is a growing interest to extend climate change predictions to smaller, catchment-size scales and identify their implications on hydrological and ecological processes. Small scale processes are, in fact, expected to mediate climate changes, producing local effects and feedbacks that can interact with the principal consequences of the change. This is particularly applicable when a complex interaction, such as the inter-relationship between the hydrological cycle and vegetation dynamics, is considered. This study presents a blueprint methodology for studying climate change impacts, as inferred from climate models, on eco-hydrological dynamics at the catchment scale. Climate conditions, present or future, are imposed through input hydrometeorological variables for hydrological and eco-hydrological models. These variables are simulated with an hourly weather generator as an outcome of a stochastic downscaling technique. The generator is parameterized to reproduce the climate of southwestern Arizona for present (1961-2000) and future (2081-2100) conditions. The methodology provides the capability to generate ensemble realizations for the future that take into account the heterogeneous nature of climate predictions from different models. The generated time series of meteorological variables for the two scenarios corresponding to the current and mean expected future serve as input to a coupled hydrological and vegetation dynamics model, “Tethys-Chloris”. The hydrological model reproduces essential components of the land-surface hydrological cycle, solving the mass and energy budget equations. The vegetation model parsimoniously parameterizes essential plant life-cycle processes, including photosynthesis, phenology, carbon allocation, and tissue turnover. The results for the two mean scenarios are compared and discussed in terms of changes in the hydrological balance components, energy fluxes, and indices of vegetation productivity. The need to account for uncertainties in projections of future climate is discussed, and a methodology for propagating these uncertainties into the probability density functions of changes in eco-hydrological variables is presented.

  14. Stochastic Modeling of Past Volcanic Crises

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2018-01-01

    The statistical foundation of disaster risk analysis is past experience. From a scientific perspective, history is just one realization of what might have happened, given the randomness and chaotic dynamics of Nature. Stochastic analysis of the past is an exploratory exercise in counterfactual history, considering alternative possible scenarios. In particular, the dynamic perturbations that might have transitioned a volcano from an unrest to an eruptive state need to be considered. The stochastic modeling of past volcanic crises leads to estimates of eruption probability that can illuminate historical volcanic crisis decisions. It can also inform future economic risk management decisions in regions where there has been some volcanic unrest, but no actual eruption for at least hundreds of years. Furthermore, the availability of a library of past eruption probabilities would provide benchmark support for estimates of eruption probability in future volcanic crises.

  15. Potential Future Igneous Activity at Yucca Mountain, Nevada

    NASA Astrophysics Data System (ADS)

    Cline, M.; Perry, F. V.; Valentine, G. A.; Smistad, E.

    2005-12-01

    Location, timing, and volumes of post-Miocene volcanic activity, along with expert judgement, provide the basis for assessing the probability of future volcanism intersecting a proposed repository for nuclear waste at Yucca Mountain, Nevada. Analog studies of eruptive centers in the region that may represent the style and extent of possible future igneous activity at Yucca Mountain have aided in defining the consequence scenarios for intrusion into, and eruption through, a proposed repository. Modeling of magmatic processes related to magma/proposed repository interactions has been used to assess the potential consequences of a future igneous event through a proposed repository at Yucca Mountain. Results of work to date indicate future igneous activity in the Yucca Mountain region has a very low probability of intersecting the proposed repository; the probability of a future event intersecting a proposed repository at Yucca Mountain is approximately 1.7 × 10⁻⁸ per year. Since completion of the Probabilistic Volcanic Hazard Assessment (PVHA) in 1996, anomalies representing potential buried volcanic centers have been identified from aeromagnetic surveys. A re-assessment of the hazard is currently underway to evaluate the probability of intersection in light of the new information and to estimate the probability of one or more volcanic conduits forming in the proposed repository along a dike that intersects it. U.S. Nuclear Regulatory Commission regulations for siting and licensing a proposed repository require that the consequences of a disruptive (igneous) event with annual probability greater than 1 × 10⁻⁸ be evaluated. Two consequence scenarios are considered: (1) an igneous intrusion-groundwater transport case and (2) a volcanic eruptive case. These scenarios equate to a dike or dike swarm intersecting repository drifts containing waste packages, formation of a conduit leading to a volcanic eruption through the repository that carries the contents of the waste packages into the atmosphere, deposition of a tephra sheet, and redistribution of the contaminated ash. In both cases radioactive material is released to the accessible environment, either through groundwater transport or through atmospheric dispersal and deposition. Six Quaternary volcanic centers exist within 20 km of Yucca Mountain. Lathrop Wells cone (LWC), the youngest (approximately 75,000 years), is a well-preserved cinder cone with associated flows and a tephra sheet that provides an excellent analogue for consequence studies related to future volcanism. Cone, lavas, hydrovolcanic ash, and ash-fall tephra have been examined to estimate eruptive volume and eruption type. LWC ejecta volumes suggest basaltic volcanism may be waning in the Yucca Mountain region. The eruptive products indicate a sequence of initial fissure fountaining, early Strombolian ash and lapilli deposition forming the scoria cone, a brief hydrovolcanic pulse (possibly limited to the NW sector), and a violent Strombolian phase. Mathematical models have been developed to represent magmatic processes and their consequences for proposed repository performance. These models address dike propagation, magma interaction and flow into drifts, eruption through the proposed repository, and post-intrusion/eruption effects. They continue to be refined to reduce the uncertainty associated with the consequences of a possible future igneous event.

  16. Postflight analysis of the single-axis acoustic system on SPAR VI and recommendations for future flights

    NASA Technical Reports Server (NTRS)

    Naumann, R. J.; Oran, W. A.; Whymark, R. R.; Rey, C.

    1981-01-01

    The single axis acoustic levitator that was flown on SPAR VI malfunctioned. The results of a series of tests, analyses, and investigation of hypotheses that were undertaken to determine the probable cause of failure are presented, together with recommendations for future flights of the apparatus. The most probable causes of the SPAR VI failure were lower than expected sound intensity due to mechanical degradation of the sound source, and an unexpected external force that caused the experiment sample to move radially and eventually be lost from the acoustic energy well.

  17. Neutron detection in a high gamma-ray background with EJ-301 and EJ-309 liquid scintillators

    NASA Astrophysics Data System (ADS)

    Stevanato, L.; Cester, D.; Nebbia, G.; Viesti, G.

    2012-10-01

    Using a fast digitizer, the neutron-gamma discrimination capability of the new liquid scintillator EJ-309 is compared with that obtained using standard EJ-301. Moreover, the capability of both scintillation detectors to identify a weak neutron source in a high gamma-ray background is demonstrated. The probability of neutron detection is P_D = 95% at the 95% confidence level for a gamma-ray background corresponding to a dose rate of 100 μSv/h.

  18. Set-up and validation of a Delft-FEWS based coastal hazard forecasting system

    NASA Astrophysics Data System (ADS)

    Valchev, Nikolay; Eftimova, Petya; Andreeva, Nataliya

    2017-04-01

    European coasts are increasingly threatened by hazards related to low-probability, high-impact hydro-meteorological events. Uncertainties in hazard prediction and in capabilities to cope with their impact lie in both future storm patterns and increasing coastal development. Therefore, adaptation to future conditions requires a re-evaluation of coastal disaster risk reduction (DRR) strategies and introduction of a more efficient mix of prevention, mitigation, and preparedness measures. The latter presumes that development of tools that can manage the complex process of merging data and models and generate products on the current and expected hydro- and morpho-dynamic states of the coasts, such as a forecasting system for flooding and erosion hazards at vulnerable coastal locations (hotspots), is of vital importance. The output of such a system can be of utmost value to coastal stakeholders and the entire coastal community. In response to these challenges, Delft-FEWS provides a state-of-the-art framework for implementation of such a system, with vast capabilities to trigger the early warning process. In addition, this framework is highly customizable to the specific requirements of any individual coastal hotspot. Since its release, many Delft-FEWS based forecasting systems related to inland flooding have been developed, but only a limited number of coastal applications have been implemented. In this paper, the set-up of a Delft-FEWS based forecasting system for Varna Bay (Bulgaria) and a coastal hotspot, which includes a sandy beach and port infrastructure, is presented. It was implemented in the frame of the RISC-KIT project (Resilience-Increasing Strategies for Coasts - toolKIT). The system output generated in hindcast mode is validated against available observations of surge levels and wave and morphodynamic parameters for a sequence of three short-duration and relatively weak storm events that occurred during February 4-12, 2015. Overall, the models' performance is considered very good, and the results obtained are quite promising for reliable prediction of both boundary conditions and coastal hazards, giving a good basis for estimation of onshore impact.

  19. Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Keyes, Gilbert

    1991-01-01

    Information is given in viewgraph form on Space Station Freedom. Topics covered include future evolution, man-tended capability, permanently manned capability, standard payload rack dimensions, the Crystals by Vapor Transport Experiment (CVTE), commercial space projects interfaces, and pricing policy.

  20. Precision Landing and Hazard Avoidance Domain

    NASA Technical Reports Server (NTRS)

    Robertson, Edward A.; Carson, John M., III

    2016-01-01

    The Precision Landing and Hazard Avoidance (PL&HA) domain addresses the development, integration, testing, and spaceflight infusion of sensing, processing, and GN&C functions critical to the success and safety of future human and robotic exploration missions. PL&HA sensors also have applications to other mission events, such as rendezvous and docking. Autonomous PL&HA builds upon the core GN&C capabilities developed to enable soft, controlled landings on the Moon, Mars, and other solar system bodies. Through the addition of a Terrain Relative Navigation (TRN) function, precision landing within tens of meters of a map-based target is possible. The addition of a 3-D terrain mapping lidar sensor improves the probability of a safe landing via autonomous, real-time Hazard Detection and Avoidance (HDA). PL&HA significantly improves the probability of mission success and enhances access to sites of scientific interest located in challenging terrain. PL&HA can also utilize external navigation aids, such as navigation satellites and surface beacons. Advanced lidar sensors provide high-precision ranging, velocimetry, and 3-D terrain mapping. TRN compares onboard reconnaissance data with real-time terrain imaging data to update the spacecraft position estimate. HDA generates a high-resolution, 3-D terrain map in real time during the approach trajectory to identify safe landing targets. During terminal descent, high-precision surface-relative sensors enable accurate inertial navigation and a tightly controlled touchdown within meters of the selected safe landing target.

  1. Effective bandwidth guaranteed routing schemes for MPLS traffic engineering

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Jain, Nidhi

    2001-07-01

    In this work, we present online algorithms for dynamically routing bandwidth-guaranteed label switched paths (LSPs), where LSP set-up requests (specified by a pair of ingress and egress routers and a bandwidth requirement) arrive one by one and there is no a priori knowledge regarding future LSP set-up requests. In addition, we consider rerouting of LSPs, which has not been well studied in previous work on LSP routing. The need for LSP rerouting arises in a number of ways: occurrence of faults (link and/or node failures), re-optimization of existing LSPs' routes to accommodate traffic fluctuation, requests with higher priorities, and so on. We formulate bandwidth-guaranteed LSP routing with rerouting capability as a multi-commodity flow problem, whose solution is used as the benchmark for comparing the other, computationally less costly algorithms studied in this paper. Furthermore, to utilize network resources more efficiently, we propose online routing algorithms that route bandwidth demands over multiple paths at the ingress router to satisfy customer requests while providing better service survivability. Traffic splitting and distribution over the multiple paths are handled using table-based hashing schemes while the order of packets within a flow is preserved. Preliminary simulations show the performance of different design choices and the effectiveness of the rerouting and multi-path routing algorithms in terms of LSP set-up request rejection probability and bandwidth blocking probability.
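
    A minimal sketch of table-based hashing for order-preserving traffic splitting (names and the 5-tuple are illustrative): hashing the flow identifier pins every packet of a flow to the same LSP, so intra-flow packet order is preserved, while the slot allocation of the table sets the split ratios across parallel LSPs.

    ```python
    import hashlib

    def build_table(paths, ratios, slots=16):
        """Allocate hash-table slots to paths in proportion to the split ratios."""
        table = []
        for path, r in zip(paths, ratios):
            table += [path] * round(r * slots)
        return table[:slots]

    def select_path(table, flow_tuple):
        """Deterministically map a flow's 5-tuple to one path via its hash."""
        digest = hashlib.sha256(repr(flow_tuple).encode()).digest()
        return table[digest[0] % len(table)]

    table = build_table(["LSP-A", "LSP-B"], [0.75, 0.25])
    flow = ("10.0.0.1", "10.0.0.2", 5000, 80, "tcp")
    print(select_path(table, flow))   # the same flow always maps to the same LSP
    ```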

  2. Exact sampling hardness of Ising spin models

    NASA Astrophysics Data System (ADS)

    Fefferman, B.; Foss-Feig, M.; Gorshkov, A. V.

    2017-09-01

    We study the complexity of classically sampling from the output distribution of an Ising spin model, which can be implemented naturally in a variety of atomic, molecular, and optical systems. In particular, we construct a specific example of an Ising Hamiltonian that, after time evolution starting from a trivial initial state, produces a particular output configuration with probability very nearly proportional to the square of the permanent of a matrix with arbitrary integer entries. In a similar spirit to boson sampling, the ability to sample classically from the probability distribution induced by time evolution under this Hamiltonian would imply unlikely complexity theoretic consequences, suggesting that the dynamics of such a spin model cannot be efficiently simulated with a classical computer. Physical Ising spin systems capable of achieving problem-size instances (i.e., qubit numbers) large enough so that classical sampling of the output distribution is classically difficult in practice may be achievable in the near future. Unlike boson sampling, our current results only imply hardness of exact classical sampling, leaving open the important question of whether a much stronger approximate-sampling hardness result holds in this context. The latter is most likely necessary to enable a convincing experimental demonstration of quantum supremacy. As referenced in a recent paper [A. Bouland, L. Mancinska, and X. Zhang, in Proceedings of the 31st Conference on Computational Complexity (CCC 2016), Leibniz International Proceedings in Informatics (Schloss Dagstuhl-Leibniz-Zentrum für Informatik, Dagstuhl, 2016)], our result completes the sampling hardness classification of two-qubit commuting Hamiltonians.
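
    For context on the central quantity (a standalone sketch of the classical cost, not the sampling experiment itself): Ryser's inclusion-exclusion formula evaluates the matrix permanent in time exponential in n, which is what makes output probabilities proportional to squared permanents classically expensive to compute.

    ```python
    from itertools import combinations
    import numpy as np

    def permanent(A):
        """Ryser's inclusion-exclusion formula; exponential time in n."""
        n = len(A)
        total = 0.0
        for k in range(1, n + 1):
            for cols in combinations(range(n), k):
                rowsums = A[:, list(cols)].sum(axis=1)
                total += (-1) ** k * np.prod(rowsums)
        return (-1) ** n * total

    A = np.array([[1, 2], [3, 4]], dtype=float)
    print(permanent(A))   # perm = 1*4 + 2*3 = 10
    ```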

  3. Integrating Future Information through Scenarios. AIR 1985 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Zentner, Rene D.

    The way that higher education planners can take into account changes in the post-industrial society is discussed. The scenario method is proposed as a method of integrating futures information. The planner can be provided with several probable futures, each of which can be incorporated in a scenario. An effective scenario provides the planner…

  4. Global mega forces: Implications for the future of natural resources

    Treesearch

    George H. Kubik

    2012-01-01

    The purpose of this paper is to provide an overview of leading global mega forces and their importance to the future of natural resource decisionmaking, policy development, and operation. Global mega forces are defined as a combination of major trends, preferences, and probabilities that come together to produce the potential for future high-impact outcomes. These...

  5. Future-Orientated Approaches to Curriculum Development: Fictive Scripting

    ERIC Educational Resources Information Center

    Garraway, James

    2017-01-01

    Though the future cannot be accurately predicted, it is possible to envisage a number of probable developments which can promote thinking about the future and so promote a more informed stance about what should or should not be done. Studies in technology and society have claimed that the use of a type of forecasting using plausible but imaginary…

  6. Technology Needs to Support Future Mars Exploration

    NASA Technical Reports Server (NTRS)

    Nilsen, Erik N.; Baker, John; Lillard, Randolph P.

    2013-01-01

    The Mars Program Planning Group (MPPG), under the direction of Dr. Orlando Figueroa, was chartered to develop options for a program-level architecture for robotic exploration of Mars consistent with the objective of sending humans to Mars in the 2030's. Scientific pathways were defined for future exploration, and multiple architectural options were developed that meet current science goals and support the future human exploration objectives. Integral to the process was the identification of critical technologies that enable the future scientific and human exploration goals. This paper describes the process for identifying technology capabilities and examines the critical capability needs identified in the MPPG process. Several critical enabling technologies have been identified that support the robotic exploration goals and have potential feed-forward application to human exploration. Potential roadmaps for the development and validation of these technologies are discussed, including options for subscale technology demonstrations of future human exploration technologies on robotic missions.

  7. Autonomous RPOD Technology Challenges for the Coming Decade

    NASA Technical Reports Server (NTRS)

    Naasz, Bo J.; Moreau, Michael C.

    2012-01-01

    Rendezvous Proximity Operations and Docking (RPOD) technologies are important to a wide range of future space endeavors. This paper will review some of the recent and ongoing activities related to autonomous RPOD capabilities and summarize the current state of the art. Gaps are identified where future investments are necessary to successfully execute some of the missions likely to be conducted within the next ten years. A proposed RPOD technology roadmap that meets the broad needs of NASA's future missions is outlined, and ongoing activities at GSFC in support of a future satellite servicing mission are presented. The case presented shows that an evolutionary, stair-step technology development program, including a robust campaign of coordinated ground tests and space-based system-level technology demonstration missions, will ultimately yield a multi-use, main-stream autonomous RPOD capability suite with cross-cutting benefits across a wide range of future applications.

  8. Neurons That Update Representations of the Future.

    PubMed

    Seriès, Peggy

    2018-06-11

    A recent article shows that the brain automatically estimates the probabilities of possible future actions before it has even received all the information necessary to decide what to do next. Crown Copyright © 2018. Published by Elsevier Ltd. All rights reserved.

  9. Guidance, Navigation, and Control Technology Assessment for Future Planetary Science Missions

    NASA Technical Reports Server (NTRS)

    Beauchamp, Pat; Cutts, James; Quadrelli, Marco B.; Wood, Lincoln J.; Riedel, Joseph E.; McHenry, Mike; Aung, MiMi; Cangahuala, Laureano A.; Volpe, Rich

    2013-01-01

    Future planetary explorations envisioned by the National Research Council's (NRC's) report titled Vision and Voyages for Planetary Science in the Decade 2013-2022, developed for NASA Science Mission Directorate (SMD) Planetary Science Division (PSD), seek to reach targets of broad scientific interest across the solar system. This goal requires new capabilities such as innovative interplanetary trajectories, precision landing, operation in close proximity to targets, precision pointing, multiple collaborating spacecraft, multiple target tours, and advanced robotic surface exploration. Advancements in Guidance, Navigation, and Control (GN&C) and Mission Design in the areas of software, algorithm development and sensors will be necessary to accomplish these future missions. This paper summarizes the key GN&C and mission design capabilities and technologies needed for future missions pursuing SMD PSD's scientific goals.

  10. General aviation components. [performance and capabilities of general aviation aircraft

    NASA Technical Reports Server (NTRS)

    1975-01-01

    An overview is presented of selected aviation vehicles. The capabilities and performance of these vehicles are first presented, followed by a discussion of the aerodynamics, structures and materials, propulsion systems, noise, and configurations of fixed-wing aircraft. Finally the discussion focuses on the history, status, and future of attempts to provide vehicles capable of short-field operations.

  11. Assessing Graduate Sustainability Capability Post-Degree Completion: Why Is It Important and What Are the Challenges?

    ERIC Educational Resources Information Center

    Sandri, Orana; Holdsworth, Sarah; Thomas, Ian

    2018-01-01

    Purpose: The purpose of this paper is to highlight both the need for measurement of graduate capabilities post-degree completion and the challenges posed by such a task. Higher education institutions provide an important site of learning that can equip future professionals with capabilities to manage and respond to complex sustainability…

  12. The Future Is Now for Renewed Commitment

    ERIC Educational Resources Information Center

    Campolo, Anthony

    2012-01-01

    The young people who are coming into school are not lacking in intelligence, are not lacking in intellectual capability, but are lacking in the capability of paying attention, and the capability of reading. And to a large degree, they are inadequately taught because in many instances they are taught by teachers who themselves leave much to be…

  13. Treatment Strategies for Lead in Drinking Water

    EPA Science Inventory

    Lead pipes are capable of lasting hundreds of years. Conservatively, there are over 12 million still serving drinking water in the US, and this is probably a substantial underestimate. Leaded solder joining copper pipe abounds. Leaded brasses have dominated the materials used for...

  14. Predicting critical transitions in dynamical systems from time series using nonstationary probability density modeling.

    PubMed

    Kwasniok, Frank

    2013-11-01

    A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. A full systematic account of parameter uncertainty is taken. The technique is generic, independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to prediction of Arctic sea-ice extent.
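
    A minimal sketch of the idea (the parametric form, a Gaussian with linearly drifting mean, is chosen for illustration): fit the nonstationary density by maximum likelihood over the observation window, then extrapolate the fitted parameters forward to forecast the future density.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(6)
    t = np.linspace(0.0, 1.0, 400)
    x = 2.0 - 3.0 * t + rng.normal(0.0, 0.5, t.size)   # synthetic drifting series

    def neg_log_lik(theta):
        """Negative log-likelihood of a Gaussian with mean a + b*t (constants dropped)."""
        a, b, log_s = theta
        mu, s = a + b * t, np.exp(log_s)
        return 0.5 * np.sum(((x - mu) / s) ** 2) + t.size * log_s

    fit = minimize(neg_log_lik, x0=[0.0, 0.0, 0.0])
    a, b, log_s = fit.x

    # Extrapolate the fitted density beyond the observation window (t in [0, 1]).
    t_future = 1.5
    print(f"forecast mean at t={t_future}: {a + b * t_future:.2f}, "
          f"sigma: {np.exp(log_s):.2f}")
    ```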

  15. A Review of State-of-the-Art Separator Materials for Advanced Lithium-Based Batteries for Future Aerospace Missions

    NASA Technical Reports Server (NTRS)

    Baldwin, Richard S.

    2009-01-01

    As NASA embarks on a renewed human presence in space, safe, human-rated, electrical energy storage and power generation technologies, which will be capable of demonstrating reliable performance in a variety of unique mission environments, will be required. To address the future performance and safety requirements for the energy storage technologies that will enhance and enable future NASA Constellation Program elements and other future aerospace missions, advanced rechargeable, lithium-ion battery technology development is being pursued with an emphasis on addressing performance technology gaps between state-of-the-art capabilities and critical future mission requirements. The material attributes and related performance of a lithium-ion cell's internal separator component are critical for achieving overall optimal performance, safety and reliability. This review provides an overview of the general types, material properties and the performance and safety characteristics of current separator materials employed in lithium-ion batteries, such as those materials that are being assessed and developed for future aerospace missions.

  16. National Facilities study

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This study provides a set of recommendations for improving the effectiveness of our nation's aeronautics and space facilities. The study plan considers current and future government and commercial needs as well as DOD and NASA mission requirements through the year 2023. It addresses shortfalls in existing capabilities, new facility requirements, upgrades, consolidations, and phase-out of existing facilities. If the recommendations are implemented, they will provide world-class capability where it is vital to our country's needs and make us more efficient in meeting future needs.

  17. Next-Generation X-Ray Astronomy

    NASA Technical Reports Server (NTRS)

    White, Nicholas E.

    2011-01-01

    The future timing capabilities in X-ray astronomy will be reviewed. This will include reviewing the missions in implementation: Astro-H, GEMS, SRG, and ASTROSAT; those under study: currently ATHENA and LOFT; and new technologies that may enable future missions e.g. Lobster eye optics. These missions and technologies will bring exciting new capabilities across the entire time spectrum from micro-seconds to years that e.g. will allow us to probe close to the event horizon of black holes and constrain the equation of state of neutron stars.

  18. Drop Tower and Aircraft Capabilities

    NASA Technical Reports Server (NTRS)

    Urban, David L.

    2015-01-01

    This presentation is a brief introduction to existing capabilities in drop towers and low-gravity aircraft that will be presented as part of a Symposium: Microgravity Platforms Other Than the ISS, From Users to Suppliers which will be a half day program to bring together the international community of gravity-dependent scientists, program officials and technologists with the suppliers of low gravity platforms (current and future) to focus on the future requirements and use of platforms other than the International Space Station (ISS).

  19. The Role of X-Rays in Future Space Navigation and Communication

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke M. B.; Gendreau, Keith C.; Hasouneh, Monther A.; Mitchell, Jason W.; Fong, Wai H.; Lee, Wing-Tsz; Gavriil, Fotis; Arzoumanian, Zaven

    2013-01-01

    In the near future, applications using X-rays will enable autonomous navigation and time distribution throughout the solar system, high capacity and low-power space data links, highly accurate attitude sensing, and extremely high-precision formation flying capabilities. Each of these applications alone has the potential to revolutionize mission capabilities, particularly beyond Earth orbit. This paper will outline the NASA Goddard Space Flight Center vision and efforts toward realizing the full potential of X-ray navigation and communications.

  20. Unmanned Warfare: Second and Third Order Effects Stemming from the Afghan Operational Environment between 2001 and 2010

    DTIC Science & Technology

    2011-06-10

    …the very nature of warfare took a dramatic step into the future. With new assets capable of remaining airborne for nearly 24 hours and live video feeds streaming to…shape the battlefield during protracted combat operations. From the real-time video feeds to the 24-hour coverage of an area of interest, tangible…

  1. Propulsion Ground Testing: Planning for the Future

    NASA Technical Reports Server (NTRS)

    Bruce, Robert

    2003-01-01

    Advanced planners are constantly being asked to plan for the provision of future test capability. Historically, this capability is provided either by substantial investment in new test facilities or by substantial investment in the modification of pre-existing test facilities. The key words in the previous sentence are 'substantial investment.' In an environment of increasingly constrained resources, how is an advanced planner to plan for the provision of such capabilities? Additionally, a conundrum exists: program formulation decisions are nominally based on life cycle cost, yet the more immediate challenge of front-end capital investment is often the linchpin upon which early decisions are made. In such an environment, how are plans and decisions made? This paper cites examples of past decisions in the areas of both major test facility upgrades and major new test facility investment.

  2. Durability Challenges for Next Generation of Gas Turbine Engine Materials

    NASA Technical Reports Server (NTRS)

    Misra, Ajay K.

    2012-01-01

    Aggressive fuel burn and carbon dioxide emission reduction goals for future gas turbine engines will require higher overall pressure ratio and a significant increase in turbine inlet temperature. These goals can be achieved by increasing the temperature capability of turbine engine hot section materials and decreasing the weight of the fan section of the engine. NASA is currently developing several advanced hot section materials for increasing the temperature capability of future gas turbine engines. The materials of interest include ceramic matrix composites with 1482-1648 °C temperature capability, advanced disk alloys with 815 °C capability, and low-conductivity thermal barrier coatings with erosion resistance. The presentation will provide an overview of durability challenges, with emphasis on the environmental factors affecting durability for the next generation of gas turbine engine materials. The environmental factors include the gaseous atmosphere in gas turbine engines, molten salt and glass deposits from airborne contaminants, impact from foreign object damage, and erosion from ingestion of small particles.

  3. Past, Present, and Future Capabilities of the Transonic Dynamics Tunnel from an Aeroelasticity Perspective

    NASA Technical Reports Server (NTRS)

    Cole, Stanley R.; Garcia, Jerry L.

    2000-01-01

    The NASA Langley Transonic Dynamics Tunnel (TDT) has provided a unique capability for aeroelastic testing for forty years. The facility has a rich history of significant contributions to the design of many United States commercial transports, military aircraft, launch vehicles, and spacecraft. The facility has many features that contribute to its uniqueness for aeroelasticity testing, perhaps the most important being the use of a heavy gas test medium to achieve higher test densities. Higher test medium densities substantially ease model-building requirements and therefore simplify the fabrication process for building aeroelastically scaled wind tunnel models. Aeroelastic scaling for the heavy gas results in lower model structural frequencies, and lower model frequencies tend to make aeroelastic testing safer. This paper will describe major developments in the testing capabilities at the TDT throughout its history, the current status of the facility, and planned additions and improvements to its capabilities in the near future.

  4. Communicating Environmental Risks: Clarifying the Severity Effect in Interpretations of Verbal Probability Expressions

    ERIC Educational Resources Information Center

    Harris, Adam J. L.; Corner, Adam

    2011-01-01

    Verbal probability expressions are frequently used to communicate risk and uncertainty. The Intergovernmental Panel on Climate Change (IPCC), for example, uses them to convey risks associated with climate change. Given the potential for human action to mitigate future environmental risks, it is important to understand how people respond to these…

  5. A new algorithm for finding survival coefficients employed in reliability equations

    NASA Technical Reports Server (NTRS)

    Bouricius, W. G.; Flehinger, B. J.

    1973-01-01

    Product reliabilities are predicted from past failure rates and a reasonable estimate of future failure rates. The algorithm is used to calculate the probability that the product will function correctly. It sums, over all possible ways in which the product can survive, the probability of each survival pattern multiplied by the number of permutations for that pattern.
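
    Read this way, the computation is the familiar pattern-summing reliability formula. A minimal sketch for the k-out-of-n special case (a stand-in chosen for illustration, not necessarily the flown configuration): the product survives if at least k of its n identical units survive, so each surviving count j contributes its pattern probability times the number of permutations realizing it.

    ```python
    from math import comb

    def reliability(n, k, p):
        """Probability that at least k of n independent units (each surviving
        with probability p) survive: sum of pattern probability x permutations."""
        return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

    print(f"2-out-of-3 system at p = 0.9: {reliability(3, 2, 0.9):.4f}")   # 0.9720
    ```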

  6. 32 CFR 154.40 - General.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Department of Defense OFFICE OF THE SECRETARY OF DEFENSE SECURITY DEPARTMENT OF DEFENSE PERSONNEL SECURITY... effort to assess the probability of future behavior which could have an effect adverse to the national... the past but necessarily anticipating the future. Rarely is proof of trustworthiness and reliability...

  7. 78 FR 54509 - Tenth Meeting: RTCA Next Gen Advisory Committee (NAC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-04

    ... Capabilities Work Group. Recommendation for Future Metroplex Optimization Activity. Recommendation for Future Use of Optimization of Airspace and Procedures in the Metroplex (OAPM) developed by the...

  8. National Institute for Rocket Propulsion Systems (NIRPS): Solutions Facilitator

    NASA Technical Reports Server (NTRS)

    Brown, Tom

    2011-01-01

    National Institute for Rocket Propulsion Systems (NIRPS) "Solutions" plans to enable our nation's future in rocket propulsion systems by leveraging existing skills and capabilities to support industry's future needs.

  9. NASA DOEPOD NDE Capabilities Data Book

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2015-01-01

    This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC Nondestructive Evaluation (NDE) Capabilities Data Book. DOEPOD is designed as a decision support system to validate that inspection systems, personnel, and protocols demonstrate 0.90 POD with 95% confidence at critical flaw sizes (a90/95). Although 0.90 POD with 95% confidence at critical flaw sizes is often stated as an inspection requirement in inspection documents, including NASA Standards, NASA critical aerospace applications have historically accepted only 0.978 POD or better, with a 95% one-sided lower confidence bound exceeding 0.90, at critical flaw sizes (a90/95).
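
    The binomial arithmetic behind demonstration statements like "0.90 POD with 95% confidence" can be sketched as follows (a minimal illustration, not the DOEPOD procedure itself, which adds directed design-of-experiments logic):

        def min_hits_for_demonstration(pod=0.90, confidence=0.95):
            # Smallest number of consecutive hits (zero misses) such that a
            # true POD below `pod` would yield all hits with probability
            # less than 1 - confidence.
            n = 1
            while pod**n > 1 - confidence:
                n += 1
            return n

        print(min_hits_for_demonstration())   # 29 -> the familiar 29-of-29 rule
        print(0.90**29)                        # ~0.047 < 0.05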

  10. Multiscale Characterization of the Probability Density Functions of Velocity and Temperature Increment Fields

    NASA Astrophysics Data System (ADS)

    DeMarco, Adam Ward

    The turbulent motions within the atmospheric boundary layer exist over a wide range of spatial and temporal scales and are very difficult to characterize. Thus, to explore the behavior of such complex flow environments, it is customary to examine their properties from a statistical perspective. Utilizing the probability density functions of velocity and temperature increments, Δu and ΔT, respectively, this work investigates their multiscale behavior to uncover unique traits that have yet to be thoroughly studied. Utilizing diverse datasets, including idealized wind tunnel experiments, atmospheric turbulence field measurements, multi-year ABL tower observations, and mesoscale model simulations, this study reveals remarkable similarities (and some differences) between the small- and large-scale components of the increment probability density functions. This comprehensive analysis also utilizes a set of statistical distributions to showcase their ability to capture features of the velocity and temperature increments' probability density functions (pdfs) across multiscale atmospheric motions. An approach is proposed for estimating these pdfs utilizing the maximum likelihood estimation (MLE) technique, which has not previously been applied to atmospheric data of this kind. Using this technique, we reveal the ability to estimate higher-order moments accurately with a limited sample size, which has been a persistent concern for atmospheric turbulence research. With the use of robust goodness-of-fit (GoF) metrics, we quantitatively assess the accuracy of the distributions against the diverse datasets. Through this analysis, it is shown that the normal inverse Gaussian (NIG) distribution is a prime candidate for estimating the increment pdf fields. Therefore, using the NIG model and its parameters, we display the variations in the increments over a range of scales, revealing unique scale-dependent qualities under various stability and flow conditions. This novel approach can characterize increment fields with the sole use of only four pdf parameters. Also, we investigate the capability of current state-of-the-art mesoscale atmospheric models to predict these features and highlight the potential for use in future model development. With the knowledge gained in this study, a number of applications can benefit from our methodology, including the wind energy and optical wave propagation fields.
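
    A minimal sketch of the MLE fit described above, assuming SciPy's norminvgauss parameterization (shape a, skew b, location, scale) and synthetic data standing in for a measured increment sample:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Stand-in for a velocity-increment sample; real data would come
        # from tower, wind tunnel, or model output.
        sample = stats.norminvgauss.rvs(a=2.0, b=0.5, loc=0.0, scale=1.0,
                                        size=5000, random_state=rng)

        # Maximum likelihood estimate of the four NIG parameters
        a, b, loc, scale = stats.norminvgauss.fit(sample)

        # Higher-order moments implied by the fitted parameters
        mean, var, skew, kurt = stats.norminvgauss.stats(
            a, b, loc=loc, scale=scale, moments='mvsk')
        print(a, b, loc, scale, skew, kurt)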

  11. Deep Recurrent Neural Networks for Supernovae Classification

    NASA Astrophysics Data System (ADS)

    Charnock, Tom; Moss, Adam

    2017-03-01

    We apply deep recurrent neural networks, which are capable of learning complex sequential information, to classify supernovae (code available at https://github.com/adammoss/supernovae). The observational time and filter fluxes are used as inputs to the network, but since the inputs are agnostic, additional data such as host galaxy information can also be included. Using the Supernovae Photometric Classification Challenge (SPCC) data, we find that deep networks are capable of learning about light curves; however, the performance of the network is highly sensitive to the amount of training data. For a training size of 50% of the representational SPCC data set (around 10^4 supernovae) we obtain a type-Ia versus non-type-Ia classification accuracy of 94.7%, an area under the Receiver Operating Characteristic curve (AUC) of 0.986, and an SPCC figure of merit F1 = 0.64. When using only the data for the early-epoch challenge defined by the SPCC, we achieve a classification accuracy of 93.1%, AUC of 0.977, and F1 = 0.58, results almost as good as with the whole light curve. By employing bidirectional neural networks, we can acquire impressive classification results between supernovae types I, II, and III, with an accuracy of 90.4% and AUC of 0.974. We also apply a pre-trained model to obtain classification probabilities as a function of time and show that it can give early indications of supernova type. Our method is competitive with existing algorithms and has applications for future large-scale photometric surveys.
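
    For readers unfamiliar with the architecture class, a minimal bidirectional LSTM light-curve classifier might look like the following PyTorch sketch (illustrative only; layer sizes and features are assumptions, not the authors' configuration):

        import torch
        import torch.nn as nn

        class LightCurveRNN(nn.Module):
            # Bidirectional LSTM over sequences of (time, filter fluxes).
            def __init__(self, n_features=5, hidden=16, n_classes=3):
                super().__init__()
                self.lstm = nn.LSTM(n_features, hidden, batch_first=True,
                                    bidirectional=True)
                self.head = nn.Linear(2 * hidden, n_classes)

            def forward(self, x):              # x: (batch, epochs, features)
                out, _ = self.lstm(x)
                return self.head(out[:, -1])   # class logits

        model = LightCurveRNN()
        batch = torch.randn(8, 40, 5)          # 8 light curves, 40 epochs each
        probs = torch.softmax(model(batch), dim=1)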

  12. Analysis of blocking probability for OFDM-based variable bandwidth optical network

    NASA Astrophysics Data System (ADS)

    Gong, Lei; Zhang, Jie; Zhao, Yongli; Lin, Xuefeng; Wu, Yuyao; Gu, Wanyi

    2011-12-01

    Orthogonal Frequency Division Multiplexing (OFDM) has recently been proposed as a modulation technique for optical networks. Because of its good spectral efficiency, flexibility, and tolerance to impairments, optical OFDM is much more flexible than traditional WDM systems and enables elastic bandwidth transmission, making it a likely direction for the future development of optical networking. In OFDM-based optical networks, analysis of the blocking rate is very important for network assessment. Current research for WDM networks is largely based on fixed bandwidth; to accommodate future traffic and the fast-changing development of optical networks, our study addresses variable-bandwidth OFDM-based optical networks. Building on existing theory and algorithms, we apply mathematical analysis and theoretical derivation to study the blocking probability of variable-bandwidth optical networks, and then build a model for the blocking probability.
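
    As a point of reference for such models, the fixed-bandwidth baseline is the classical Erlang B formula (an assumption for illustration; the paper's variable-bandwidth analysis generalizes beyond it), computed with the standard stable recursion:

        def erlang_b(servers, offered_load):
            # Erlang B blocking probability:
            # B(0) = 1;  B(m) = a*B(m-1) / (m + a*B(m-1))
            b = 1.0
            for m in range(1, servers + 1):
                b = offered_load * b / (m + offered_load * b)
            return b

        # e.g. 10 spectrum slots offered 7 Erlangs of traffic
        print(erlang_b(10, 7.0))   # blocking probability ~0.08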

  13. Lightweight Tactical Client: A Capability-Based Approach to Command Post Computing

    DTIC Science & Technology

    2015-12-01

    bundles these capabilities together is proposed: a lightweight tactical client. In order to avoid miscommunication in the future, it is... solutions and almost definitely rules out most terminal-based thin clients.

  14. Transport implementation of the Bernstein-Vazirani algorithm with ion qubits

    NASA Astrophysics Data System (ADS)

    Fallek, S. D.; Herold, C. D.; McMahon, B. J.; Maller, K. M.; Brown, K. R.; Amini, J. M.

    2016-08-01

    Using trapped ion quantum bits in a scalable microfabricated surface trap, we perform the Bernstein-Vazirani algorithm. Our architecture takes advantage of the ion transport capabilities of such a trap. The algorithm is demonstrated using two- and three-ion chains. For three ions, an improvement is achieved compared to a classical system using the same number of oracle queries. For two ions and one query, we correctly determine an unknown bit string with probability 97.6(8)%. For three ions, we succeed with probability 80.9(3)%.
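
    The ideal (noise-free) behavior of the algorithm can be reproduced with a small statevector simulation (a sketch of the textbook phase-oracle form, not of the authors' trapped-ion implementation):

        import numpy as np

        def bernstein_vazirani(s):
            # H^n |0..0>, one phase-oracle query applying (-1)^(s.x),
            # then H^n again concentrates all amplitude on the hidden
            # bit string s.
            n, dim = len(s), 2 ** len(s)
            state = np.full(dim, 1.0 / np.sqrt(dim))       # H^n on |0..0>
            s_int = int(s, 2)
            signs = np.array([(-1) ** bin(x & s_int).count("1")
                              for x in range(dim)])
            state = signs * state                          # one oracle query
            H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
            Hn = H
            for _ in range(n - 1):
                Hn = np.kron(Hn, H)
            state = Hn @ state                             # H^n again
            probs = state ** 2
            return format(int(np.argmax(probs)), f"0{n}b"), probs.max()

        print(bernstein_vazirani("101"))   # ('101', ~1.0) for an ideal device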

  15. Correlation signatures of wet soils and snows. [algorithm development and computer programming

    NASA Technical Reports Server (NTRS)

    Phillips, M. R.

    1972-01-01

    Interpretation, analysis, and development of algorithms have provided the necessary computational programming tools for soil data processing, data handling, and analysis. The algorithms developed thus far are adequate and have proven successful for several preliminary and fundamental applications, such as software interfacing capabilities, probability distributions, grey-level print plotting, contour plotting, isometric data displays, joint probability distributions, boundary mapping, channel registration, and ground scene classification. A description of an Earth Resources Flight Data Processor (ERFDP), which handles and processes earth resources data under a user's control, is provided.

  16. Stochastic Modelling of Past Volcanic Crises

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2017-04-01

    It is customary to have continuous monitoring of volcanoes showing signs of unrest that might lead to an eruption threatening local populations. Despite scientific progress in estimating the probability of an eruption occurring, the concept of continuously tracking eruption probability remains a future aspiration for volcano risk analysts. During some recent major volcanic crises, attempts have been made to estimate the eruption probability in real time to support government decision-making. These include the possibility of an eruption of Katla linked with the eruption of Eyjafjallajökull in 2010, and the Santorini crisis of 2011-2012. However, once a crisis fades, interest in analyzing the probability that there might have been an eruption tends to wane. There is an inherent outcome bias well known to psychologists: if disaster was avoided, there is perceived to be little purpose in exploring scenarios where a disaster might have happened. Yet the better that previous periods of unrest are understood and modelled, the better that the risk associated with future periods of unrest will be quantified. Scenarios are counterfactual histories of the future. The task of quantifying the probability of an eruption for a past period of unrest should not be merely a statistical calculation, but should serve to elucidate and refine geophysical models of the eruptive processes. This is achieved by using a Bayesian Belief Network approach, in which monitoring observations are used to draw inferences on the underlying causal factors. Specifically, risk analysts are interested in identifying what dynamical perturbations might have tipped an unrest period in history over towards an eruption, and assessing what was the likelihood of such perturbations. Furthermore, in what ways might a historical volcano crisis have turned for the worse? Such important counterfactual questions are addressed in this paper.

  17. A Model-Free Machine Learning Method for Risk Classification and Survival Probability Prediction.

    PubMed

    Geng, Yuan; Lu, Wenbin; Zhang, Hao Helen

    2014-01-01

    Risk classification and survival probability prediction are two major goals in survival data analysis, since they play an important role in patients' risk stratification, long-term diagnosis, and treatment selection. In this article, we propose a new model-free machine learning framework for risk classification and survival probability prediction based on weighted support vector machines. The new procedure does not require any specific parametric or semiparametric model assumption on the data and is therefore capable of capturing nonlinear covariate effects. We use numerous simulation examples to demonstrate the finite sample performance of the proposed method under various settings. Applications to a glioma tumor data set and a breast cancer gene expression survival data set illustrate the new methodology in real data analysis.
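
    The point at which weights enter such a procedure can be sketched with scikit-learn's support vector classifier (a toy stand-in; the article's weighting scheme and tuning are not reproduced here):

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 5))                # covariates
        risk = X[:, 0] ** 2 + X[:, 1]                # nonlinear "true" risk
        y = (risk > np.median(risk)).astype(int)     # high/low risk label
        w = rng.uniform(0.5, 1.5, size=len(y))       # placeholder weights

        clf = SVC(kernel="rbf", probability=True)
        clf.fit(X, y, sample_weight=w)               # weighted SVM fit
        print(clf.predict_proba(X[:3]))              # class probabilities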

  18. Deployment Design of Wireless Sensor Network for Simple Multi-Point Surveillance of a Moving Target

    PubMed Central

    Tsukamoto, Kazuya; Ueda, Hirofumi; Tamura, Hitomi; Kawahara, Kenji; Oie, Yuji

    2009-01-01

    In this paper, we focus on the problem of tracking a moving target in a wireless sensor network (WSN) in which the capability of each sensor is relatively limited, so that large-scale WSNs can be constructed at a reasonable cost. We first propose two simple multi-point surveillance schemes for a moving target in a WSN and demonstrate that one of the schemes can achieve high tracking probability with low power consumption. In addition, we examine the relationship between tracking probability and sensor density through simulations, and then derive an approximate expression representing the relationship. As a result, we present guidelines for sensor density, tracking probability, and the number of monitoring sensors that satisfy a variety of application demands. PMID:22412326
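
    One simple way to relate tracking probability to sensor density (an illustrative Poisson-field approximation, not the paper's derived expression) is shown below:

        import math

        def tracking_probability(density, radius, m):
            # P(at least m sensors lie within sensing radius of the target),
            # for sensors deployed as a 2-D Poisson field.
            lam = density * math.pi * radius ** 2   # expected sensors in range
            return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i)
                             for i in range(m))

        # e.g. 0.002 sensors/m^2, 20 m sensing range, 3 monitoring sensors
        print(tracking_probability(0.002, 20.0, 3))   # ~0.46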

  19. Coming of Age on a Shoestring Budget: Financial Capability and Financial Behaviors of Lower-Income Millennials.

    PubMed

    West, Stacia; Friedline, Terri

    2016-10-01

    Lower-income millennials make important financial decisions that may affect their future financial well-being. With limited resources, this population is at risk for acquiring too much debt or being unprepared for a financial emergency that can send them further into poverty and constrain their ability to leverage resources for future economic mobility. A financial capability approach, an intervention that combines financial education with financial inclusion through the use of a savings account, may correlate with millennials’ healthy financial behaviors. This study used data from the 2012 National Financial Capability Study to examine the relationship between financial capability and the financial behaviors of lower-income millennials between the ages of 18 and 34 years (N = 2,578). Compared with those lower-income millennials who were financially excluded, those who were financially capable were also 171 percent more likely to afford an unexpected expense, 182 percent more likely to save for emergencies, and 34 percent less likely to carry too much debt, relating to their greater overall financial satisfaction. The findings of this study indicate that interventions that develop lower-income millennials’ financial capability may be effective for promoting healthy financial behaviors.

  20. History and future of remote sensing technology and education

    NASA Technical Reports Server (NTRS)

    Colwell, R. N.

    1980-01-01

    A historical overview of the discovery and development of photography, related sciences, and remote sensing technology is presented. The role of education to date in the development of remote sensing is discussed. The probable future and potential of remote sensing and training is described.

  1. The role of magical thinking in forecasting the future.

    PubMed

    Stavrova, Olga; Meckel, Andrea

    2017-02-01

    This article explores the role of magical thinking in the subjective probabilities of future chance events. In five experiments, we show that individuals tend to predict a more lucky future (reflected in probability judgements of lucky and unfortunate chance events) for someone who happened to purchase a product associated with a highly moral person than for someone who unknowingly purchased a product associated with a highly immoral person. In the former case, positive events were considered more likely than negative events, whereas in the latter case, the difference in the likelihood judgement of positive and negative events disappeared or even reversed. Our results indicate that this effect is unlikely to be driven by participants' immanent justice beliefs, the availability heuristic, or experimenter demand. Finally, we show that individuals rely more heavily on magical thinking when their need for control is threatened, thus suggesting that lack of control represents a factor in driving magical thinking in making predictions about the future. © 2016 The British Psychological Society.

  2. Probability of criminal acts of violence: a test of jury predictive accuracy.

    PubMed

    Reidy, Thomas J; Sorensen, Jon R; Cunningham, Mark D

    2013-01-01

    The ability of capital juries to accurately predict future prison violence at the sentencing phase of aggravated murder trials was examined through retrospective review of the disciplinary records of 115 male inmates sentenced to either life (n = 65) or death (n = 50) in Oregon from 1985 through 2008, with a mean post-conviction time at risk of 15.3 years. Violent prison behavior was completely unrelated to predictions made by capital jurors, with bidirectional accuracy simply reflecting the base rate of assaultive misconduct in the group. Rejection of the special issue predicting future violence enjoyed 90% accuracy. Conversely, predictions that future violence was probable had 90% error rates. More than 90% of the assaultive rule violations committed by these offenders resulted in no harm or only minor injuries. Copyright © 2013 John Wiley & Sons, Ltd.

  3. Future Plans for NASA's Deep Space Network

    NASA Technical Reports Server (NTRS)

    Deutsch, Leslie J.; Preston, Robert A.; Geldzahler, Barry J.

    2008-01-01

    This slide presentation reviews the importance of NASA's Deep Space Network (DSN) to space exploration and the planned future improvements to the communication capabilities that the network provides, in terms of precision and communication power.

  4. Development and Transition of the Radiation, Interplanetary Shocks, and Coronal Sources (RISCS) Toolset

    NASA Technical Reports Server (NTRS)

    Spann, James F.; Zank, G.

    2014-01-01

    We outline a plan to develop and transition a physics-based predictive toolset called Radiation, Interplanetary Shocks, and Coronal Sources (RISCS) to describe the interplanetary energetic particle and radiation environment throughout the inner heliosphere, including at the Earth. To forecast and "nowcast" the radiation environment requires the fusing of three components: 1) the ability to provide probabilities for incipient solar activity; 2) the use of these probabilities and daily coronal and solar wind observations to model the 3D spatial and temporal heliosphere, including magnetic field structure and transients, within 10 Astronomical Units; and 3) the ability to model the acceleration and transport of energetic particles based on current and anticipated coronal and heliospheric conditions. We describe how to address 1)-3) based on our existing, well-developed, and validated codes and models. The goal of the RISCS toolset is to provide an operational forecast and "nowcast" capability that will predict a) solar energetic particle (SEP) intensities; b) spectra for protons and heavy ions; c) maximum energies and their duration; d) SEP composition; e) cosmic ray intensities; and f) plasma parameters, including shock arrival times, strength, and obliquity at any given heliospheric location and time. The toolset would have a 72-hour predictive capability, with associated probabilistic bounds, updated hourly thereafter to improve the predicted event(s) and reduce the associated probability bounds. The RISCS toolset would be highly adaptable and portable, capable of running on a variety of platforms to accommodate various operational needs and requirements. The described transition plan is based on a well-established approach developed in the Earth Science discipline that ensures that the customer has a tool that meets their needs.

  5. The Integrated Mission Design Center (IMDC) at NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Karpati, Gabriel; Martin, John; Steiner, Mark; Reinhardt, K.

    2002-01-01

    NASA Goddard has used its Integrated Mission Design Center (IMDC) to perform more than 150 mission concept studies. The IMDC performs rapid development of high-level, end-to-end mission concepts, typically in just 4 days. The approach to the studies varies, depending on whether the proposed mission is near-future using existing technology, mid-future using new technology being actively developed, or far-future using technology which may not yet be clearly defined. The emphasis and level of detail developed during any particular study depends on which timeframe (near-, mid-, or far-future) is involved and the specific needs of the study client. The most effective mission studies are those where mission capabilities required and emerging technology developments can synergistically work together; thus both enhancing mission capabilities and providing impetus for ongoing technology development.

  6. Changing Knowledge, Changing Technology: Implications for Teacher Education Futures

    ERIC Educational Resources Information Center

    Burden, Kevin; Aubusson, Peter; Brindley, Sue; Schuck, Sandy

    2016-01-01

    Recent research in teacher education futures has identified two themes that require further study: the changing nature of knowledge and the changing capabilities of technologies. This article examines the intersection of these two themes and their implications for teacher education. The research employed futures methodologies based on scenario…

  7. A methodological framework to assess PMP and PMF in snow-dominated watersheds under changing climate conditions - A case study of three watersheds in Québec (Canada)

    NASA Astrophysics Data System (ADS)

    Rouhani, Hassan; Leconte, Robert

    2018-06-01

    Climate change will affect precipitation and flood regimes. It is anticipated that the Probable Maximum Precipitation (PMP) and Probable Maximum Flood (PMF) will be modified in a changing climate. This paper aims to quantify and analyze climate change influences on PMP and PMF in three watersheds with different climatic conditions across the province of Québec, Canada. Output data from the Canadian Regional Climate Model (CRCM) was used to estimate PMP and Probable Maximum Snow Accumulation (PMSA) in future climate projections, which was then used to force the SWAT hydrological model to estimate PMF. PMP and PMF values were estimated for two time horizons each spanning 30 years: 1961-1990 (recent past) and 2041-2070 (future). PMP and PMF were separately analyzed for two seasons: summer-fall and spring. Results show that PMF in the watershed located in southern Québec would remain unchanged in the future horizon, but the trend for the watersheds located in the northeastern and northern areas of the province is an increase of up to 11%.

  8. Toward a Climate OSSE for NASA Earth Sciences

    NASA Astrophysics Data System (ADS)

    Leroy, S. S.; Collins, W. D.; Feldman, D.; Field, R. D.; Ming, Y.; Pawson, S.; Sanderson, B.; Schmidt, G. A.

    2016-12-01

    In the Continuity Study, the National Academy of Sciences advised that future space missions be rated according to five categories: the importance of a well-defined scientific objective, the utility of the observation in addressing the scientific objective, the quality with which the observation can be made, the probability of the mission's success, and the mission's affordability. The importance, probability, and affordability are evaluated subjectively by scientific consensus, by engineering review panels, and by cost models; however, the utility and quality can be evaluated objectively by a climate observation system simulation experiment (COSSE). A discussion of the philosophical underpinnings of a COSSE for NASA Earth Sciences will be presented. A COSSE is built upon a perturbed-physics ensemble of a sophisticated climate model that can simulate a mission's prospective observations and its well-defined quantitative scientific objective, and that can capture the uncertainty associated with each. A strong correlation between observation and scientific objective after consideration of physical uncertainty leads to high quality. Persistence of a high correlation after inclusion of the proposed measurement error leads to high utility. Five criteria govern the nature of a particular COSSE: (1) whether the mission's scientific objective is one of hypothesis testing or climate prediction, (2) whether the mission is empirical or inferential, (3) whether the core climate model captures essential physical uncertainties, (4) the level of detail of the simulated observations, and (5) whether complementarity or redundancy of information is to be valued. Computation of the quality and utility is done using Bayesian statistics, as has been done previously for multi-decadal climate prediction conditioned on existing data. We advocate for a new program within NASA Earth Sciences to establish a COSSE capability. Creation of such a program will require answers from the climate research community to basic questions, such as whether a COSSE capability should be centralized or decentralized. Most importantly, the quantified scientific objective of a proposed mission must be defined with extreme specificity for a COSSE to be applied.

  9. Under the hood of statistical learning: A statistical MMN reflects the magnitude of transitional probabilities in auditory sequences.

    PubMed

    Koelsch, Stefan; Busch, Tobias; Jentschke, Sebastian; Rohrmeier, Martin

    2016-02-02

    Within the framework of statistical learning, many behavioural studies investigated the processing of unpredicted events. However, surprisingly few neurophysiological studies are available on this topic, and no statistical learning experiment has investigated electroencephalographic (EEG) correlates of processing events with different transition probabilities. We carried out an EEG study with a novel variant of the established statistical learning paradigm. Timbres were presented in isochronous sequences of triplets. The first two sounds of all triplets were equiprobable, while the third sound occurred with either low (10%), intermediate (30%), or high (60%) probability. Thus, the occurrence probability of the third item of each triplet (given the first two items) was varied. Compared to high-probability triplet endings, endings with low and intermediate probability elicited an early anterior negativity that had an onset around 100 ms and was maximal at around 180 ms. This effect was larger for events with low than for events with intermediate probability. Our results reveal that, when predictions are based on statistical learning, events that do not match a prediction evoke an early anterior negativity, with the amplitude of this mismatch response being inversely related to the probability of such events. Thus, we report a statistical mismatch negativity (sMMN) that reflects statistical learning of transitional probability distributions that go beyond auditory sensory memory capabilities.

  10. Potential economic benefits of adapting agricultural production systems to future climate change

    USGS Publications Warehouse

    Fagre, Daniel B.; Pederson, Gregory; Bengtson, Lindsey E.; Prato, Tony; Qui, Zeyuan; Williams, Jimmie R.

    2010-01-01

    Potential economic impacts of future climate change on crop enterprise net returns and annual net farm income (NFI) are evaluated for small and large representative farms in Flathead Valley in Northwest Montana. Crop enterprise net returns and NFI in an historical climate period (1960–2005) and future climate period (2006–2050) are compared when agricultural production systems (APSs) are adapted to future climate change. Climate conditions in the future climate period are based on the A1B, B1, and A2 CO2 emission scenarios from the Intergovernmental Panel on Climate Change Fourth Assessment Report. Steps in the evaluation include: (1) specifying crop enterprises and APSs (i.e., combinations of crop enterprises) in consultation with local producers; (2) simulating crop yields for two soils, crop prices, crop enterprise costs, and NFIs for APSs; (3) determining the dominant APS in the historical and future climate periods in terms of NFI; and (4) determining whether NFI for the dominant APS in the historical climate period is superior to NFI for the dominant APS in the future climate period. Crop yields are simulated using the Environmental/Policy Integrated Climate (EPIC) model, and dominance comparisons for NFI are based on the stochastic efficiency with respect to a function (SERF) criterion. Probability distributions that best fit the EPIC-simulated crop yields are used to simulate 100 values for crop yields for the two soils in the historical and future climate periods. Best-fitting probability distributions for historical inflation-adjusted crop prices and specified triangular probability distributions for crop enterprise costs are used to simulate 100 values for crop prices and crop enterprise costs. Averaged over all crop enterprises, farm sizes, and soil types, simulated net return per ha decreased 24% and simulated mean NFI for APSs decreased 57% between the historical and future climate periods. Although adapting APSs to future climate change is advantageous (i.e., NFI with adaptation is superior to NFI without adaptation based on SERF), in six of the nine cases in which adaptation is advantageous, NFI with adaptation in the future climate period is inferior to NFI in the historical climate period. Therefore, adaptation of APSs to future climate change in Flathead Valley is insufficient to offset the adverse impacts on NFI of such change.

  11. Potential economic benefits of adapting agricultural production systems to future climate change.

    PubMed

    Prato, Tony; Zeyuan, Qiu; Pederson, Gregory; Fagre, Dan; Bengtson, Lindsey E; Williams, Jimmy R

    2010-03-01

    Potential economic impacts of future climate change on crop enterprise net returns and annual net farm income (NFI) are evaluated for small and large representative farms in Flathead Valley in Northwest Montana. Crop enterprise net returns and NFI in an historical climate period (1960-2005) and future climate period (2006-2050) are compared when agricultural production systems (APSs) are adapted to future climate change. Climate conditions in the future climate period are based on the A1B, B1, and A2 CO2 emission scenarios from the Intergovernmental Panel on Climate Change Fourth Assessment Report. Steps in the evaluation include: (1) specifying crop enterprises and APSs (i.e., combinations of crop enterprises) in consultation with local producers; (2) simulating crop yields for two soils, crop prices, crop enterprise costs, and NFIs for APSs; (3) determining the dominant APS in the historical and future climate periods in terms of NFI; and (4) determining whether NFI for the dominant APS in the historical climate period is superior to NFI for the dominant APS in the future climate period. Crop yields are simulated using the Environmental/Policy Integrated Climate (EPIC) model, and dominance comparisons for NFI are based on the stochastic efficiency with respect to a function (SERF) criterion. Probability distributions that best fit the EPIC-simulated crop yields are used to simulate 100 values for crop yields for the two soils in the historical and future climate periods. Best-fitting probability distributions for historical inflation-adjusted crop prices and specified triangular probability distributions for crop enterprise costs are used to simulate 100 values for crop prices and crop enterprise costs. Averaged over all crop enterprises, farm sizes, and soil types, simulated net return per ha decreased 24% and simulated mean NFI for APSs decreased 57% between the historical and future climate periods. Although adapting APSs to future climate change is advantageous (i.e., NFI with adaptation is superior to NFI without adaptation based on SERF), in six of the nine cases in which adaptation is advantageous, NFI with adaptation in the future climate period is inferior to NFI in the historical climate period. Therefore, adaptation of APSs to future climate change in Flathead Valley is insufficient to offset the adverse impacts on NFI of such change.

  12. Biodegradation potential of cyano-based ionic liquid anions in a culture of Cupriavidus spp. and their in vitro enzymatic hydrolysis by nitrile hydratase.

    PubMed

    Neumann, Jennifer; Pawlik, Magdalena; Bryniok, Dieter; Thöming, Jorg; Stolte, Stefan

    2014-01-01

    Biodegradation tests with bacteria from activated sludge revealed the probable persistence of cyano-based ionic liquid anions when these leave waste water treatment plants. A possible biological treatment using bacteria capable of biodegrading similar compounds, namely cyanide and cyano-complexes, was therefore examined. With these bacteria from the genera Cupriavidus, the ionic liquid anions B(CN)₄(-), C(CN)₃(-), N(CN)₂(-) combined with alkaline cations were tested in different growth media using ion chromatography for the examination of their primary biodegradability. However, no enhanced biodegradability of the tested cyano-based ionic liquids was observed. Therefore, an in vitro enzymatic hydrolysis test was additionally run showing that all tested ionic liquid (IL) anions can be hydrolysed to their corresponding amides by nitrile hydratase, but not by nitrilase under the experimental conditions. The biological stability of the cyano-based anions is an advantage in technological application, but the occurrence of enzymes that are able to hydrolyse the parent compound gives a new perspective on future cyano-based IL anion treatment.

  13. Near-Earth Object Interception Using Nuclear Thermal Rock Propulsion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    X-L. Zhang; E. Ball; L. Kochmanski

    Planetary defense has drawn wide study: despite the low probability of a large-scale impact, its consequences would be disastrous. The study presented here evaluates available protection strategies to identify bottlenecks limiting the scale of near-Earth object that could be deflected, using cutting-edge and near-future technologies. It discusses the use of a nuclear thermal rocket (NTR) as a propulsion device for delivery of thermonuclear payloads to deflect or destroy a long-period comet on a collision course with Earth. A 'worst plausible scenario' for the available warning time (10 months) and comet approach trajectory are determined, and empirical data are used to make an estimate of the payload necessary to deflect such a comet. Optimizing the tradeoff between early interception and large deflection payload establishes the ideal trajectory for an interception mission to follow. The study also examines the potential for multiple rocket launch dates. Comparison of propulsion technologies for this mission shows that NTR outperforms other options substantially. The discussion concludes with an estimate of the comet size (5 km) that could be deflected using NTR propulsion, given current launch capabilities.

  14. A Risk Analysis of the Molybdenum-99 Supply Chain Using Bayesian Networks

    NASA Astrophysics Data System (ADS)

    Liang, Jeffrey Ryan

    The production of Molybdenum-99 (99Mo) is critical to the field of nuclear medicine, where it is utilized in roughly 80% of all nuclear imaging procedures. In October of 2016, the National Research Universal (NRU) reactor in Canada, which historically had the highest 99Mo production capability worldwide, ceased routine production and will be permanently shut down in 2018. This loss of capacity has led to widespread concern over the ability of the 99Mo supply chain to meet demand. There is significant disagreement among analyses from trade groups, governments, and other researchers, predicting everything from no significant impact to major worldwide shortages. Using Bayesian networks, this research focused on modeling the 99Mo supply chain to quantify how a disrupting event, such as the unscheduled downtime of a reactor, will impact the global supply. This includes not only quantifying the probability of a shortage occurring, but also identifying which nodes in the supply chain introduce the most risk, to better inform decision makers on where future facilities or other risk mitigation techniques should be applied.

  15. Nonlinear problems in data-assimilation : Can synchronization help?

    NASA Astrophysics Data System (ADS)

    Tribbia, J. J.; Duane, G. S.

    2009-12-01

    Over the past several years, operational weather centers have initiated ensemble prediction and assimilation techniques to estimate the error covariance of forecasts in the short and the medium range. The ensemble techniques used are based on linear methods. This technique has been shown to be a useful indicator of skill in the linear range, where forecast errors are small relative to climatological variance. While this advance has been impressive, there are still ad hoc aspects of its use in practice, like the need for covariance inflation, which are troubling. Furthermore, to be of utility in the nonlinear range, an ensemble assimilation and prediction method must be capable of giving probabilistic information for the situation where a probability density forecast becomes multi-modal. A prototypical, simplest example of such a situation is the planetary-wave regime transition, where the pdf is bimodal. Our recent research shows how the inconsistencies and extensions of linear methodology can be consistently treated using the paradigm of synchronization, which views the problems of assimilation and forecasting as that of optimizing the forecast model state with respect to the future evolution of the atmosphere.

  16. Psychoactive substances of the South Seas: betel, kava and pituri.

    PubMed

    Cawte, J

    1985-03-01

    Before white man brought his alcohol to the South Pacific, the indigenes were using many wild plants possessing psychoactive properties. The most prominent were betel in much of Melanesia, kava in much of Polynesia, and pituri in much of Australia. The use of each of these three drugs was widespread, institutionalised as a ritual and the occasion for extensive trade. Each was valued for its effect in reducing tension or in producing altered states of consciousness. Each was also capable of inducing intoxication. Since few physicians nowadays have had my opportunity to observe the use of all three of these substances, their main features are recalled here. Attention is paid to their traditional use and probable future use, to their pharmacological and clinical properties, and to their place in the zeitgeist of people and period. There is no indication that these substances will be espoused by the drug enthusiasts of the West as avidly as other ethno-psychopharmacological agents such as Peruvian coca leaf, the Indian hemp, the Asian poppy, or the American tobacco. The possibility, however, of some use in the West cannot be discounted.

  17. Technology development plan: Geotechnical survey systems for OTEC (Ocean Thermal Energy Conversion) cold water pipes

    NASA Astrophysics Data System (ADS)

    Valent, Philip J.; Riggins, Michael

    1989-04-01

    An overview is given of current and developing technologies and techniques for performing geotechnical investigations for siting and designing Cold Water Pipes (CWP) for shelf-resting Ocean Thermal Energy Conversion (OTEC) power plants. The geotechnical in situ tools used to measure the required parameters and the equipment/systems used to deploy these tools are identified. The capabilities of these geotechnical tools and deployment systems are compared to the data requirements for the CWP foundation/anchor design, and shortfalls are identified. For the last phase of geotechnical data gathering for design, a drillship will be required to perform soil boring work, to obtain required high quality sediment samples for laboratory dynamic testing, and to perform deep penetration in situ tests. To remedy shortfalls and to reduce the future OTEC CWP geotechnical survey costs, it is recommended that a seafloor resting machine be developed to advance the friction cone penetrometer, and also probably a pressuremeter, to provide geotechnical parameters to shallow subseafloor penetrations on slopes of 35 deg and in water depths to 1300 m.

  18. SETI Observations of Low Mass Stars at the SETI Institute

    NASA Astrophysics Data System (ADS)

    Harp, Gerald R.

    2017-05-01

    Are planets orbiting low-mass stars suitable for the development of life? Observations in the near future, including radio, will help to assess whether atmospheres persist over long timescales for planets orbiting nearby M dwarfs, and will clarify the nature of the radiation that penetrates to the surface of these planets. These are important ingredients for assessing planetary habitability, yet the question of habitability can be answered only with the positive measurement of an unambiguous biosignature. Radio and optical SETI observations capable of detecting technological activities of intelligent inhabitants could provide the most compelling evidence for the habitability of exoplanets orbiting M dwarfs. In this presentation we consider what information can be gleaned from our observations so far. The SETI Institute is currently undertaking a large survey of 20,000 low-mass stars that is now about 30% complete. The frequency coverage on each star is about 450 MHz of bandwidth over a range of selected frequencies from 1-10 GHz. From these observations we derive quantitative results relating to the probability that M dwarfs are actually inhabited.

  19. A Summary of Actinide Enrichment Technologies and Capability Gaps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, Bradley D.; Robinson, Sharon M.

    2017-01-01

    The evaluation performed in this study indicates that a new program is needed to efficiently provide a national actinide radioisotope enrichment capability to produce milligram-to-gram quantities of unique materials for user communities. This program should leverage past actinide enrichment, the recent advances in stable isotope enrichment, and assessments of the future requirements to cost effectively develop this capability while establishing an experience base for a new generation of researchers in this vital area. Preliminary evaluations indicate that an electromagnetic isotope separation (EMIS) device would have the capability to meet the future needs of the user community for enriched actinides. The EMIS technology could potentially be coupled with other enrichment technologies, such as irradiation, as pre-enrichment and/or post-enrichment systems to increase the throughput, reduce losses of material, and/or reduce operational costs of the base EMIS system. Past actinide enrichment experience and advances in the EMIS technology applied in stable isotope separations should be leveraged with this new evaluation information to assist in the establishment of a domestic actinide radioisotope enrichment capability.

  20. Space Station Freedom extravehicular activity systems evolution study

    NASA Technical Reports Server (NTRS)

    Rouen, Michael

    1990-01-01

    Evaluation of Space Station Freedom (SSF) support of manned exploration is in progress to identify SSF extravehicular activity (EVA) system evolution requirements and capabilities. The output from these studies will provide data to support the preliminary design process to ensure that Space Station EVA system requirements for future missions (including the transportation node) are adequately considered and reflected in the baseline design. The study considers SSF support of future missions and the EVA system baseline to determine adequacy of EVA requirements and capabilities and to identify additional requirements, capabilities, and necessary technology upgrades. The EVA demands levied by formal requirements and indicated by evolutionary mission scenarios are high for the out-years of Space Station Freedom. An EVA system designed to meet the baseline requirements can easily evolve to meet evolution demands with few exceptions. Results to date indicate that upgrades or modifications to the EVA system may be necessary to meet the full range of EVA thermal environments associated with the transportation node. Work continues to quantify the EVA capability in this regard. Evolution mission scenarios with EVA and ground unshielded nuclear propulsion engines are inconsistent with anthropomorphic EVA capabilities.

  1. Recent Investments by NASA's National Force Measurement Technology Capability

    NASA Technical Reports Server (NTRS)

    Commo, Sean A.; Ponder, Jonathan D.

    2016-01-01

    The National Force Measurement Technology Capability (NFMTC) is a nationwide partnership established in 2008 and sponsored by NASA's Aeronautics Evaluation and Test Capabilities (AETC) project to maintain and further develop force measurement capabilities. The NFMTC focuses on force measurement in wind tunnels and provides operational support in addition to conducting balance research. Based on force measurement capability challenges, strategic investments in research tasks are designed to meet the experimental requirements of current and future aerospace research programs and projects. This paper highlights recent force measurement investments in several areas, including recapitalizing the strain-gage balance inventory, developing balance best practices, improving calibration and facility capabilities, and researching potential technologies to advance balance capabilities.

  2. Representation of Probability Density Functions from Orbit Determination using the Particle Filter

    NASA Technical Reports Server (NTRS)

    Mashiku, Alinda K.; Garrison, James; Carpenter, J. Russell

    2012-01-01

    Statistical orbit determination enables us to obtain estimates of the state and the statistical information of its region of uncertainty. In order to obtain an accurate representation of the probability density function (PDF) that incorporates higher-order statistical information, we propose the use of nonlinear estimation methods such as the Particle Filter. The Particle Filter (PF) is capable of providing a PDF representation of the state estimates whose accuracy is dependent on the number of particles or samples used. For this method to be applicable to real case scenarios, we need a way of accurately representing the PDF in a compressed manner with little information loss. Hence we propose using Independent Component Analysis (ICA) as a non-Gaussian dimensional reduction method that is capable of maintaining the higher-order statistical information obtained using the PF. Methods such as Principal Component Analysis (PCA) are based on utilizing up to second-order statistics and hence will not suffice in maintaining maximum information content. Both the PCA and the ICA are applied to two scenarios, a highly eccentric orbit with a lower a priori uncertainty covariance and a less eccentric orbit with a higher a priori uncertainty covariance, to illustrate the capability of the ICA in relation to the PCA.
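
    A bootstrap particle filter on a toy scalar system (a sketch only; orbit determination uses full dynamics and measurement models) shows how the weighted ensemble carries a possibly non-Gaussian PDF:

        import numpy as np

        rng = np.random.default_rng(2)
        n, steps, x_true = 2000, 25, 0.5
        particles = rng.normal(0.0, 1.0, n)           # prior ensemble
        weights = np.full(n, 1.0 / n)

        for _ in range(steps):
            x_true = x_true + np.sin(x_true) + rng.normal(0, 0.1)
            z = x_true + rng.normal(0, 0.2)           # noisy measurement
            particles = particles + np.sin(particles) \
                        + rng.normal(0, 0.1, n)       # predict
            weights *= np.exp(-0.5 * ((z - particles) / 0.2) ** 2)  # update
            weights /= weights.sum()
            if 1.0 / np.sum(weights**2) < n / 2:      # resample if degenerate
                idx = rng.choice(n, size=n, p=weights)
                particles = particles[idx]
                weights = np.full(n, 1.0 / n)

        # The weighted particles approximate the full posterior PDF.
        print(np.average(particles, weights=weights), x_true)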

  3. Interrelationships Between Receiver/Relative Operating Characteristics Display, Binomial, Logit, and Bayes' Rule Probability of Detection Methodologies

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2014-01-01

    Unknown risks are introduced into failure critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways and therefore, the very existence of the statements identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of adequacy of the POD estimates. Risks are further enhanced due to the wide range of statistical methodologies used for determining the POD capability. Receiver/Relative Operating Characteristics (ROC) Display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on Hit-Miss data to reveal the framework of the interrelationships between Receiver/Relative Operating Characteristics Display, simple binomial, logistic regression, and Bayes' Rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural and validation requirements for establishing credible POD estimates.
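
    Of the methodologies named above, the logistic-regression POD fit is easy to sketch on simulated hit/miss data (illustrative only; a90/95 additionally requires a one-sided lower confidence bound, which this fragment does not compute):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(3)
        flaw = rng.uniform(0.5, 5.0, 300)             # flaw sizes, e.g. mm
        true_pod = 1 / (1 + np.exp(-(np.log(flaw) - np.log(1.5)) / 0.25))
        hit = (rng.uniform(size=300) < true_pod).astype(int)

        model = LogisticRegression()
        model.fit(np.log(flaw).reshape(-1, 1), hit)   # POD vs log flaw size

        a_grid = np.linspace(0.5, 5.0, 500)
        pod = model.predict_proba(np.log(a_grid).reshape(-1, 1))[:, 1]
        print(a_grid[np.argmax(pod >= 0.90)])         # rough a90 estimate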

  4. Toward an Objectivistic Theory of Probability

    DTIC Science & Technology

    1956-01-01

    ...three potential acts: the individual may choose an apple, an orange or a banana. Each of these acts corresponds to a point...its veneer having begun to peel at one corner, etc., etc. Its future there-ness lies in that it may have its legs gnawed at by the new puppy in the

  5. Recent research on the high-probability instructional sequence: A brief review.

    PubMed

    Lipschultz, Joshua; Wilder, David A

    2017-04-01

    The high-probability (high-p) instructional sequence consists of the delivery of a series of high-probability instructions immediately before delivery of a low-probability or target instruction. It is commonly used to increase compliance in a variety of populations. Recent research has described variations of the high-p instructional sequence and examined the conditions under which the sequence is most effective. This manuscript reviews the most recent research on the sequence and identifies directions for future research. Recommendations for practitioners regarding the use of the high-p instructional sequence are also provided. © 2017 Society for the Experimental Analysis of Behavior.

  6. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    USGS Publications Warehouse

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4 x 10^-4 for the exponential distribution and 2.3 x 10^-4 for the mixed-exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5 x 10^-4, but the mixed-exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (>~5 km^3 in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes >~5 km^3, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6 x 10^-4. For erupted volumes >=10 km^3, the rate of occurrence has been reasonably constant from 630 ka to the present, giving more confidence in the estimate, and we use those data to calculate the annual probability of a large eruption at 1.4 x 10^-5.
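
    For a sense of the arithmetic, the exponential (Poisson-process) model gives P(eruption within t years) = 1 - exp(-t/mu) for mean repose time mu; a mean interval of roughly 7,000 years, chosen here purely for illustration, reproduces an annual probability near 1.4 x 10^-4:

        import math

        def eruption_probability(mu_years, t_years=1.0):
            # Exponential model: P(eruption within t years), mean repose mu.
            return 1.0 - math.exp(-t_years / mu_years)

        print(eruption_probability(7000.0))   # ~1.4e-4 per year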

  7. Retrieval Lesson Learned from NAST-I Hyperspectral Data

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L.; Liu, Xu; Larar, Allen M.; Mango, Stephen A.

    2007-01-01

    Retrieval lessons learned are important to many current and future hyperspectral remote sensors. Validated retrieval algorithms demonstrate the advancement of hyperspectral remote sensing capabilities to be achieved with current and future satellite instruments.

  8. Importance biasing scheme implemented in the PRIZMA code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kandiev, I.Z.; Malyshkin, G.N.

    1997-12-31

    The PRIZMA code is intended for Monte Carlo calculations of linear radiation transport problems. The code has wide capabilities to describe geometry, sources, and material composition, and to obtain parameters specified by the user. There is a capability to calculate the path of a particle cascade (including neutrons, photons, electrons, positrons, and heavy charged particles), taking into account possible transmutations. An importance biasing scheme was implemented to solve problems which require calculation of functionals related to small probabilities (for example, problems of protection against radiation, problems of detection, etc.). The scheme enables the trajectory-building algorithm to be adapted to the peculiarities of the problem.
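
    The idea of importance biasing, sampling from a distribution shifted toward the rare region and reweighting by the likelihood ratio, can be shown on a toy small-probability problem (a generic illustration, not PRIZMA's transport-specific scheme):

        import numpy as np

        rng = np.random.default_rng(4)
        n = 100_000

        # Target: P(X > 4) for X ~ N(0,1), about 3.17e-5.
        x_naive = rng.normal(0.0, 1.0, n)
        p_naive = np.mean(x_naive > 4)            # usually 0 at this n

        shift = 4.0
        x_is = rng.normal(shift, 1.0, n)          # biased proposal N(4,1)
        # Likelihood ratio N(0,1)/N(4,1) = exp(-shift*x + shift^2/2)
        ratio = np.exp(-shift * x_is + shift**2 / 2)
        p_is = np.mean((x_is > 4) * ratio)
        print(p_naive, p_is)                      # IS estimate ~3.2e-5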

  9. Future Directions in Navy Electronic System Reliability and Survivability.

    DTIC Science & Technology

    1981-06-01

    ...maintenance policy is proposed as one remedy to these problems. To implement this policy, electronic systems which are very reliable and which include health... distribute vital data, data-processing capability, and communication capability through the use of intraship and intership networks. The capability to...

  10. Utilization of extended bayesian networks in decision making under uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Eeckhout, Edward M; Leishman, Deborah A; Gibson, William L

    2009-01-01

    A Bayesian network tool (called IKE, for Integrated Knowledge Engine) has been developed to assess the probability of undesirable events. The tool allows indications and observables from sensors and/or intelligence to feed directly into hypotheses of interest, thus allowing one to quantify the probability and uncertainty of these events resulting from very disparate evidence. For example, the probability that a facility is processing nuclear fuel or assembling a weapon can be assessed by examining the processes required, establishing the observables that should be present, then assembling information from intelligence, sensors, and other information sources related to the observables. IKE also has the capability to determine tasking plans, that is, to prioritize which observable should be collected next to most quickly ascertain the 'true' state and drive the probability toward 'zero' or 'one.' This optimization capability is called 'evidence marshaling.' One example to be discussed is a denied-facility monitoring situation: there is concern that certain processes are being executed at the site (due to some intelligence or other data). We will show how additional pieces of evidence ascertain, with some degree of certainty, the likelihood of these processes as each piece of evidence is obtained. This example shows how both intelligence and sensor data can be incorporated into the analysis. A second example involves real-time perimeter security. For this demonstration we used seismic, acoustic, and optical sensors linked back to IKE. We show how these sensors identified and assessed the likelihood of 'intruder' versus friendly vehicles.
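
    The evidence-accumulation behavior described above follows from sequential Bayesian updating, sketched here with invented observables and likelihoods (the real IKE networks and numbers are not given in this abstract):

        import numpy as np

        prior = np.array([0.5, 0.5])       # [P(H), P(not H)] for hypothesis H
        likelihoods = {                    # P(evidence | H), P(evidence | not H)
            "thermal_signature": (0.80, 0.10),
            "material_shipment": (0.60, 0.20),
            "effluent_detected": (0.70, 0.05),
        }

        posterior = prior.copy()
        for name, (p_h, p_not_h) in likelihoods.items():
            posterior *= np.array([p_h, p_not_h])   # fold in the evidence
            posterior /= posterior.sum()            # renormalize
            print(f"after {name}: P(H) = {posterior[0]:.3f}")
        # Each consistent observation drives P(H) toward one (here ~0.997).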

  11. Policing and COIN Operations: Lessons Learned, Strategies, and Future Directions

    DTIC Science & Technology

    2011-01-01

    from the U.S. Reserve forces). In other instances, such as the efforts in Southeast Asia, “medical (and veterinarian) capabilities were very...The analysis found significant shortages and vulnerabilities in the following areas: equipment; immature logistics capability and

  12. Probability Forecasting Using Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Duncan, M.; Frisbee, J.; Wysack, J.

    2014-09-01

    Space Situational Awareness (SSA) is defined as the knowledge and characterization of all aspects of space. SSA is now a fundamental and critical component of space operations. Increased dependence on our space assets has in turn led to a greater need for accurate, near real-time knowledge of all space activities. With the growth of the orbital debris population, satellite operators are performing collision avoidance maneuvers more frequently. Frequent maneuver execution expends fuel and reduces the operational lifetime of the spacecraft. Thus new, more sophisticated collision threat characterization methods are needed. The collision probability metric is used operationally to quantify the collision risk. The collision probability is typically calculated days into the future, so that high-risk and potentially high-risk conjunction events are identified early enough to develop an appropriate course of action. As the time horizon to the conjunction event is reduced, the collision probability changes. A significant change in the collision probability will change the satellite mission stakeholder's course of action. Constructing a method for estimating how the collision probability will evolve therefore improves operations by providing satellite operators with a new piece of information, namely an estimate or 'forecast' of how the risk will change as the time to the event is reduced. Collision probability forecasting is a predictive process in which the future risk of a conjunction event is estimated. The method utilizes a Monte Carlo simulation that produces a likelihood distribution for a given collision threshold. Using known state and state uncertainty information, the simulation generates a set of possible trajectories for a given space object pair. Each new trajectory produces a unique event geometry at the time of close approach. Given state uncertainty information for both objects, a collision probability value can be computed for every trial. This yields a collision probability distribution given known, predicted uncertainty. This paper presents the details of the collision probability forecasting method. We examine various conjunction event scenarios and numerically demonstrate the utility of this approach in typical event scenarios. We explore the utility of a probability-based track scenario simulation that models expected tracking data frequency as the tasking levels are increased. The resulting orbital uncertainty is subsequently used in the forecasting algorithm.
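
    A hedged sketch of the forecasting idea under strong simplifications (a 2-D encounter plane, Gaussian uncertainties, and invented numbers): each trial draws one possible refined miss vector consistent with today's uncertainty, evaluates the collision probability with the smaller covariance expected after further tracking, and the ensemble of values forms the forecast distribution.

```python
import numpy as np

rng = np.random.default_rng(1)

def pc_2d(miss, cov, radius, n=20_000):
    """Collision probability: mass of a 2-D Gaussian centred on the
    predicted miss vector that falls inside the combined hard-body circle."""
    pts = rng.multivariate_normal(miss, cov, size=n)
    return np.mean(np.linalg.norm(pts, axis=1) < radius)

# Hypothetical encounter: 300 m predicted miss, sizeable state uncertainty.
miss_now   = np.array([300.0, 0.0])         # metres, encounter plane
cov_now    = np.diag([200.0**2, 120.0**2])  # current miss-vector covariance
cov_future = np.diag([60.0**2, 40.0**2])    # expected after more tracking
hard_body  = 20.0                           # combined object radius, m

# Forecast: each trial is one possible refined miss estimate drawn from the
# *reduction* in uncertainty; Pc is then evaluated with the smaller,
# anticipated future covariance.  The result is a distribution of Pc.
trials = rng.multivariate_normal(miss_now, cov_now - cov_future, size=500)
pc_samples = np.array([pc_2d(m, cov_future, hard_body, n=4000) for m in trials])

print("Pc today:", pc_2d(miss_now, cov_now, hard_body))
print("forecast Pc percentiles (5/50/95):",
      np.percentile(pc_samples, [5, 50, 95]))
```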

  13. Flight Dynamics and GN&C for Spacecraft Servicing Missions

    NASA Technical Reports Server (NTRS)

    Naasz, Bo; Zimpfer, Doug; Barrington, Ray; Mulder, Tom

    2010-01-01

    Future human exploration missions and commercial opportunities will be enabled through In-space assembly and satellite servicing. Several recent efforts have developed technologies and capabilities to support these exciting future missions, including advances in flight dynamics and Guidance, Navigation and Control. The Space Shuttle has demonstrated significant capabilities for crewed servicing of the Hubble Space Telescope (HST) and assembly of the International Space Station (ISS). Following the Columbia disaster NASA made significant progress in developing a robotic mission to service the HST. The DARPA Orbital Express mission demonstrated automated rendezvous and capture, In-space propellant transfer, and commodity replacement. This paper will provide a summary of the recent technology developments and lessons learned, and provide a focus for potential future missions.

  14. Adult Services in the Third Millennium.

    ERIC Educational Resources Information Center

    Monroe, Margaret E.

    1979-01-01

    Presents a four-step model for "planning" or "forecasting" the future of adult services in public libraries: (1) identification of forces at work; (2) analysis of the probable impacts of one force upon another; (3) identification of preferred (and rejected) elements of the future, with the forces that control those elements; and (4) strategies to be…

  15. Error protection capability of space shuttle data bus designs

    NASA Technical Reports Server (NTRS)

    Proch, G. E.

    1974-01-01

    The role of error protection in assuring the reliability of digital data communications is discussed. The need for error protection on the space shuttle data bus system has been recognized and specified as a hardware requirement. The error protection techniques of particular concern are those designed into the Shuttle Main Engine Interface (MEI) and the Orbiter Multiplex Interface Adapter (MIA). The techniques and circuit design details proposed for this hardware are analyzed in this report to determine their error protection capability. The capability is calculated in terms of the probability of an undetected word error. Calculated results are reported for a noise environment that ranges from the nominal noise level stated in the hardware specifications to burst levels which may occur in extreme or anomalous conditions.
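
    As a worked illustration of the "probability of an undetected word error" metric (not the MEI/MIA circuitry itself), consider a word protected by a single even-parity bit on a binary symmetric channel: an error goes undetected exactly when an even, non-zero number of bits flip. The 17-bit word length and the bit error rates are invented examples.

```python
from math import comb

def p_undetected(n, p):
    """Probability of an undetected word error for an n-bit word carrying
    a single even-parity bit on a binary symmetric channel with bit error
    probability p: exactly the even, non-zero numbers of flips pass."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(2, n + 1, 2))

# Closed form used as a sanity check: (1 + (1-2p)^n)/2 - (1-p)^n.
def p_undetected_closed(n, p):
    return (1 + (1 - 2*p)**n) / 2 - (1 - p)**n

for p in (1e-4, 1e-3, 1e-2, 1e-1):   # nominal through burst-like noise
    print(f"p={p:g}  P(undetected)={p_undetected(17, p):.3e}")
```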

  16. Non-contact temperature measurement requirements for electronic materials processing

    NASA Technical Reports Server (NTRS)

    Lehoczky, S. L.; Szofran, F. R.

    1988-01-01

    The requirements for non-contact temperature measurement capabilities for electronic materials processing in space are assessed. Non-contact methods are probably incapable of sufficient accuracy for the actual absolute measurement of temperatures in most such applications but would be useful for imaging in some applications.

  17. Anti-Submarine Warfare (ASW) Capability Transformation: Strategy of Response to Effects Based Warfare

    DTIC Science & Technology

    2011-06-01

    the ROK ship Cheonan by a probable PRK submarine. Despite the authoritative findings of an international team that forensically examined the...evidence of the sinking implicating PRK, North Korea continues to maintain its innocence and deny any involvement, especially since there is

  18. Atomic Spectra Database (ASD)

    National Institute of Standards and Technology Data Gateway

    SRD 78 NIST Atomic Spectra Database (ASD) (Web, free access)   This database provides access and search capability for NIST critically evaluated data on atomic energy levels, wavelengths, and transition probabilities that are reasonably up-to-date. The NIST Atomic Spectroscopy Data Center has carried out these critical compilations.

  19. US Navy SHF SATCOM: Past, present and future

    NASA Astrophysics Data System (ADS)

    Bushnell, Christopher J.

    1994-06-01

    This thesis discusses the Navy's Super High Frequency Satellite Communications (SHF SATCOM) capabilities prior to Desert Shield/Desert Storm, and the requirements for future systems that were generated due to Navy SATCOM shortcomings during the Gulf War. The four-phased evolutionary approach the Navy has designed (based on post-war requirements) to provide itself with a medium for SHF SATCOM into the 21st Century, as well as the Defense Satellite Communications Systems (DSCS), are examined in detail. Decreasing defense budgets have begun to have a significant impact on future military satellite communication (MILSATCOM) systems. A cost comparison between utilization of DSCS III satellites and the INMARSAT commercial SATCOM system is presented. Recommended improvements to current MILSATCOM procedures and training practices are proposed that could improve operational C4I capabilities. Finally, this study determines that future SATCOM architectures should include a mixture of commercial systems and MILSATCOM systems to provide both cost savings and command and control protection.

  20. Computer-aided mathematical analysis of probability of intercept for ground-based communication intercept system

    NASA Astrophysics Data System (ADS)

    Park, Sang Chul

    1989-09-01

    We develop a mathematical analysis model to calculate the probability of intercept (POI) for a ground-based communication intercept (COMINT) system. The POI is a measure of the effectiveness of the intercept system. We define the POI as the product of the probability of detection and the probability of coincidence. The probability of detection is a measure of the receiver's capability to detect a signal in the presence of noise. The probability of coincidence is the probability that an intercept system is available, actively listening in the proper frequency band, in the right direction, and at the same time that the signal is received. We investigate the behavior of the POI with respect to the observation time, the separation distance, antenna elevations, the frequency of the signal, and the receiver bandwidths. We observe that the coincidence characteristic between the receiver scanning parameters and the signal parameters is the key factor in determining the time needed to obtain a given POI. This model can be used to find the optimal parameter combination to maximize the POI in a given scenario. We also extend the model to multiple systems. The analysis is conducted on a personal computer for portability, and the model is flexible and can be easily implemented under different situations.
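
    A minimal sketch of the POI-as-a-product idea under a simple coincidence model (not the paper's full model): per scan, a randomly timed signal overlaps the receiver dwell on its channel with some probability, detection then occurs with probability Pd, and scans are treated as independent. All numbers are invented.

```python
def poi(p_d, dwell, revisit, sig_dur, n_scans):
    """Probability of intercept for a scanning receiver, assuming a
    randomly timed signal of duration sig_dur on one monitored channel.
    Per scan, the signal overlaps the receiver dwell with probability
    (dwell + sig_dur) / revisit (clipped to 1); detection then occurs
    with probability p_d, and scans are treated as independent."""
    p_coinc = min(1.0, (dwell + sig_dur) / revisit)
    p_single = p_d * p_coinc
    return 1.0 - (1.0 - p_single) ** n_scans

# Hypothetical numbers: 90% detection given coincidence, 10 ms dwell,
# 1 s to sweep the band, 50 ms transmissions, 30 s observation = 30 scans.
print(poi(p_d=0.9, dwell=0.010, revisit=1.0, sig_dur=0.050, n_scans=30))
# ~0.81 -- coincidence, not receiver sensitivity, dominates the timeline.
```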

  1. A Vision of Quantitative Imaging Technology for Validation of Advanced Flight Technologies

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas J.; Kerns, Robert V.; Jones, Kenneth M.; Grinstead, Jay H.; Schwartz, Richard J.; Gibson, David M.; Taylor, Jeff C.; Tack, Steve; Dantowitz, Ronald F.

    2011-01-01

    Flight-testing is traditionally an expensive but critical element in the development and ultimate validation and certification of technologies destined for future operational capabilities. Measurements obtained in relevant flight environments also provide unique opportunities to observe flow phenomena that are often beyond the capabilities of ground testing facilities and computational tools to simulate or duplicate. However, the challenges of minimizing vehicle weight and internal complexity as well as instrumentation bandwidth limitations often restrict the ability to make high-density, in-situ measurements with discrete sensors. Remote imaging offers a potential opportunity to noninvasively obtain such flight data in a complementary fashion. The NASA Hypersonic Thermodynamic Infrared Measurements Project has demonstrated such a capability to obtain calibrated thermal imagery on a hypersonic vehicle in flight. Through the application of existing and accessible technologies, the acreage surface temperature of the Shuttle lower surface was measured during reentry. Future hypersonic cruise vehicles, launcher configurations and reentry vehicles will, however, challenge current remote imaging capability. As NASA embarks on the design and deployment of a new Space Launch System architecture for access beyond Earth orbit (and the commercial sector focused on low Earth orbit), an opportunity exists to implement an imagery system and its supporting infrastructure that provides sufficient flexibility to incorporate changing technology to address the future needs of the flight test community. A long-term vision is offered that supports the application of advanced multi-waveband sensing technology to aid in the development of future aerospace systems and critical technologies to enable highly responsive vehicle operations across the aerospace continuum, spanning launch, reusable space access and global reach. Motivations for development of an Agency-level imagery-based measurement capability to support cross-cutting applications that span the Agency mission directorates as well as meeting potential needs of the commercial sector and national interests of the Intelligence, Surveillance and Reconnaissance community are explored. A recommendation is made for an assessment study to baseline current imaging technology including the identification of future mission requirements. Development of requirements fostered by the applications suggested in this paper would be used to identify technology gaps and direct roadmapping for implementation of an affordable and sustainable next generation sensor/platform system.

  2. 77 FR 19169 - Fisheries of the Caribbean, Gulf of Mexico, and South Atlantic; Snapper-Grouper Fishery Off the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    ... for ABC is the projected yield stream with a 70 percent probability of rebuilding success. The Council... to have an 81 percent chance of rebuilding in 10 years, greater than the 70 percent probability... AM applications. Should this ACT be used in the future to trigger AMs, then it may be expected to...

  3. Lightning Characteristics and Lightning Strike Peak Current Probabilities as Related to Aerospace Vehicle Operations

    NASA Technical Reports Server (NTRS)

    Johnson, Dale L.; Vaughan, William W.

    1998-01-01

    A summary is presented of basic lightning characteristics and criteria for current and future NASA aerospace vehicles. The paper estimates the probability of occurrence of a 200 kA peak lightning return current, should lightning strike an aerospace vehicle in its various operational phases, i.e., roll-out, on-pad, launch, reentry/landing, and return to launch site. A literature search was conducted for previous work concerning the occurrence and measurement of peak lightning currents, modeling, and estimating the probabilities of launch vehicles/objects being struck by lightning. This paper presents these results.
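
    For a sense of the magnitudes involved, a commonly cited IEEE approximation for the distribution of negative first-stroke peak currents, P(I > i) = 1 / (1 + (i/31)^2.6) with i in kA, can be evaluated at 200 kA. This generic curve is a stand-in for illustration, not the paper's own estimate.

```python
def p_exceed_peak_current(i_ka):
    """P(first return-stroke peak current > i_ka), using the widely cited
    IEEE/Anderson-Eriksson curve P = 1 / (1 + (i/31)**2.6), i in kA."""
    return 1.0 / (1.0 + (i_ka / 31.0) ** 2.6)

print(p_exceed_peak_current(200.0))   # ~0.008: roughly 1 strike in 130
```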

  4. Investigating prior probabilities in a multiple hypothesis test for use in space domain awareness

    NASA Astrophysics Data System (ADS)

    Hardy, Tyler J.; Cain, Stephen C.

    2016-05-01

    The goal of this research effort is to improve Space Domain Awareness (SDA) capabilities of current telescope systems through improved detection algorithms. Ground-based optical SDA telescopes are often spatially under-sampled, or aliased. This fact negatively impacts the detection performance of traditionally proposed binary and correlation-based detection algorithms. A Multiple Hypothesis Test (MHT) algorithm has previously been developed to mitigate the effects of spatial aliasing. This is done by testing potential Resident Space Objects (RSOs) against several sub-pixel-shifted Point Spread Functions (PSFs). An MHT has been shown to increase detection performance for the same false alarm rate. In this paper, the assumption of a priori probability used in an MHT algorithm is investigated. First, an analysis of the pixel decision space is completed to determine alternative-hypothesis prior probabilities. These probabilities are then implemented in an MHT algorithm, which is tested against previous MHT algorithms using simulated RSO data. Results are reported with Receiver Operating Characteristic (ROC) curves and probability of detection, Pd, analysis.
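
    A compact sketch of the MHT idea described above, not the authors' algorithm: a frame is tested against several sub-pixel-shifted, pixel-integrated PSF templates plus a noise-only hypothesis, with priors folded into the posterior. The PSF model, amplitude, noise level, and priors are all invented.

```python
import numpy as np

rng = np.random.default_rng(2)

def binned_psf(shift, width=1.2, npix=7, oversample=16):
    """1-D Gaussian PSF (width in pixels), centred at a sub-pixel shift,
    integrated onto a coarse (aliased) detector grid."""
    fine = np.linspace(-npix / 2, npix / 2, npix * oversample, endpoint=False)
    psf = np.exp(-0.5 * ((fine - shift) / width) ** 2)
    return psf.reshape(npix, oversample).sum(axis=1) / psf.sum()

SHIFTS = [-0.375, -0.125, 0.125, 0.375]          # tested sub-pixel offsets
TEMPLATES = np.array([binned_psf(s) for s in SHIFTS])
P_H0 = 0.5                                       # prior on noise-only
P_ALT = (1.0 - P_H0) * np.full(len(SHIFTS), 1.0 / len(SHIFTS))

def mht_detect(frame, amp=50.0, sigma=5.0):
    """Posterior probability that an RSO is present, marginalized over
    the shift hypotheses, under additive Gaussian noise of known sigma."""
    log_lk = np.array([-0.5 * np.sum((frame - amp * t) ** 2) / sigma**2
                       for t in TEMPLATES])
    log_l0 = -0.5 * np.sum(frame ** 2) / sigma**2
    m = max(log_lk.max(), log_l0)                # stabilize the exponentials
    num = np.sum(P_ALT * np.exp(log_lk - m))
    den = num + P_H0 * np.exp(log_l0 - m)
    return num / den, SHIFTS[int(np.argmax(log_lk))]

frame = 50.0 * binned_psf(0.2) + rng.normal(0.0, 5.0, 7)
print(mht_detect(frame))      # (P(RSO | frame), best-matching shift)
```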

  5. More than words: Adults learn probabilities over categories and relationships between them.

    PubMed

    Hudson Kam, Carla L

    2009-04-01

    This study examines whether human learners can acquire statistics over abstract categories and their relationships to each other. Adult learners were exposed to miniature artificial languages containing variation in the ordering of the Subject, Object, and Verb constituents. Different orders (e.g. SOV, VSO) occurred in the input with different frequencies, but the occurrence of one order versus another was not predictable. Importantly, the language was constructed such that participants could only match the overall input probabilities if they were tracking statistics over abstract categories, not over individual words. At test, participants reproduced the probabilities present in the input with a high degree of accuracy. Closer examination revealed that learners were matching the probabilities associated with individual verbs rather than the category as a whole. However, individual nouns had no impact on word orders produced. Thus, participants learned the probabilities of a particular ordering of the abstract grammatical categories Subject and Object associated with each verb. Results suggest that statistical learning mechanisms are capable of tracking relationships between abstract linguistic categories in addition to individual items.

  6. The search for life's origins: Progress and future directions in planetary biology and chemical evolution

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The current state of the study of chemical evolution and planetary biology is reviewed, and the probable near-term future of the field is discussed. To this end, the report lists the goals and objectives of future research and makes detailed, comprehensive recommendations for accomplishing them, emphasizing those issues that were inadequately discussed in earlier Space Studies Board reports.

  7. Assessment of potential future hydrogen markets in the U.S.

    NASA Technical Reports Server (NTRS)

    Kashani, A. K.

    1980-01-01

    Potential future hydrogen markets in the United States are assessed. Future hydrogen markets for various use sectors are projected, the probable range of hydrogen production costs from various alternatives is estimated, stimuli and barriers to the development of hydrogen markets are discussed, an overview of the status of technologies for the production and utilization of hydrogen is presented, and, finally, societal aspects of hydrogen production and utilization are discussed.

  8. Transition probabilities of health states for workers in Malaysia using a Markov chain model

    NASA Astrophysics Data System (ADS)

    Samsuddin, Shamshimah; Ismail, Noriszura

    2017-04-01

    The aim of our study is to estimate the transition probabilities of health states for workers in Malaysia who contribute to the Employment Injury Scheme under the Social Security Organization Malaysia using a Markov chain model. Our study uses four states of health (active, temporary disability, permanent disability and death) based on data collected from longitudinal studies of workers in Malaysia over 5 years. The transition probabilities vary by health state, age and gender. The results show that male employees are more likely to have higher transition probabilities to any health state compared to female employees. The transition probabilities can be used to predict the future health of workers as a function of current age, gender and health state.
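
    A minimal sketch of the Markov chain machinery, with an invented one-year transition matrix for a single age/gender cell (the study estimates these from the longitudinal data): transition probabilities are maximum-likelihood row-normalized counts, and powers of the matrix give the health-state distribution several years ahead.

```python
import numpy as np

states = ["active", "temp_disability", "perm_disability", "death"]

# Invented one-year transition matrix for a single age/gender cell;
# each row sums to 1 and death is absorbing.
P = np.array([
    [0.93, 0.05, 0.015, 0.005],   # from active
    [0.60, 0.30, 0.07,  0.03 ],   # from temporary disability
    [0.00, 0.00, 0.95,  0.05 ],   # from permanent disability
    [0.00, 0.00, 0.00,  1.00 ],   # from death (absorbing)
])

def estimate_P(transitions, k=4):
    """Maximum-likelihood estimate from observed (from, to) state pairs:
    row-normalized transition counts."""
    counts = np.zeros((k, k))
    for i, j in transitions:
        counts[i, j] += 1
    row = counts.sum(axis=1, keepdims=True)
    return counts / np.where(row == 0, 1.0, row)

# Health-state distribution 5 years ahead for a currently active worker.
pi0 = np.array([1.0, 0.0, 0.0, 0.0])
pi5 = pi0 @ np.linalg.matrix_power(P, 5)
print(dict(zip(states, np.round(pi5, 4))))
```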

  9. Modeling nonbreeding distributions of shorebirds and waterfowl in response to climate change

    USGS Publications Warehouse

    Reese, Gordon; Skagen, Susan K.

    2017-01-01

    To identify areas on the landscape that may contribute to a robust network of conservation areas, we modeled the probabilities of occurrence of several en route migratory shorebirds and wintering waterfowl in the southern Great Plains of North America, including responses to changing climate. We predominantly used data from the eBird citizen-science project to model probabilities of occurrence relative to land-use patterns, spatial distribution of wetlands, and climate. We projected models to potential future climate conditions using five representative general circulation models of the Coupled Model Intercomparison Project 5 (CMIP5). We used Random Forests to model probabilities of occurrence and compared the time periods 1981–2010 (hindcast) and 2041–2070 (forecast) in “model space.” Projected changes in shorebird probabilities of occurrence varied with species-specific general distribution pattern, migration distance, and spatial extent. Species using the western and northern portion of the study area exhibited the greatest likelihoods of decline, whereas species with more easterly occurrences, mostly long-distance migrants, had the greatest projected increases in probability of occurrence. At an ecoregional extent, differences in probabilities of shorebird occurrence ranged from −0.015 to 0.045 when averaged across climate models, with the largest increases occurring early in migration. Spatial shifts are predicted for several shorebird species. Probabilities of occurrence of wintering Mallards and Northern Pintail are predicted to increase by 0.046 and 0.061, respectively, with northward shifts projected for both species. When incorporated into partner land management decision tools, results at ecoregional extents can be used to identify wetland complexes with the greatest potential to support birds in the nonbreeding season under a wide range of future climate scenarios.
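
    A hedged sketch of the modeling step using scikit-learn's RandomForestClassifier on synthetic stand-ins for the covariates; the column names, the toy occurrence model, and the assumed +2 °C / -5% precipitation shift are all invented, not the paper's data or scenarios.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(6)
n = 2000
# Synthetic stand-ins for eBird-derived covariates; names are invented.
df = pd.DataFrame({
    "pct_cropland":    rng.uniform(0, 100, n),
    "wetland_density": rng.exponential(1.0, n),
    "tmin_winter":     rng.normal(-5, 4, n),
    "precip_spring":   rng.gamma(4, 25, n),
})
# Occurrence more likely in wet, mild cells (toy truth model).
logit = -2 + 1.5 * df.wetland_density + 0.15 * df.tmin_winter
df["occurred"] = rng.random(n) < 1 / (1 + np.exp(-logit))

features = ["pct_cropland", "wetland_density", "tmin_winter", "precip_spring"]
rf = RandomForestClassifier(n_estimators=300, random_state=0)
rf.fit(df[features], df["occurred"])

# Hindcast vs forecast "in model space": score the same cells under the
# historical climate and under one assumed future climate shift.
hist = df[features]
future = hist.assign(tmin_winter=hist.tmin_winter + 2.0,
                     precip_spring=hist.precip_spring * 0.95)
delta = rf.predict_proba(future)[:, 1] - rf.predict_proba(hist)[:, 1]
print("mean change in probability of occurrence:", round(delta.mean(), 4))
```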

  10. Creating Communications, Computing, and Networking Technology Development Road Maps for Future NASA Human and Robotic Missions

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Hayden, Jeffrey L.

    2005-01-01

    For human and robotic exploration missions in the Vision for Exploration, roadmaps are needed for capability development and investments based on advanced technology developments. A roadmap development process was undertaken for the needed communications and networking capabilities and technologies for future human and robotic missions. The underlying processes are derived from work carried out during development of the future space communications architecture, and NASA's Space Architect Office (SAO) defined the formats and structures for accumulating data. Interrelationships were established among emerging requirements, the capability analysis and technology status, and performance data. After developing an architectural communications and networking framework structured around the assumed needs for human and robotic exploration in the vicinity of Earth, on the Moon, along the path to Mars, and in the vicinity of Mars, information was gathered from expert participants. This information was used to identify the capabilities expected from the new infrastructure and the technological gaps in the way of obtaining them. We define realistic, long-term space communication architectures based on emerging needs and translate the needs into the interfaces, functions, and computer processing that will be required. In developing our roadmapping process, we defined requirements for achieving end-to-end activities that will be carried out by future NASA human and robotic missions. This paper describes: 1) the architectural framework developed for analysis; 2) our approach to gathering and analyzing data from NASA, industry, and academia; 3) an outline of the technology research to be done, including milestones for technology research and demonstrations with timelines; and 4) the technology roadmaps themselves.

  11. Managing Computer Systems Development: Understanding the Human and Technological Imperatives.

    DTIC Science & Technology

    1985-06-01

    for their organization's use? How can they predict the impact of future systems on their management control capabilities? Of equal importance is the...commercial organizations discovered that there was only a limited capability of interaction between various types of computers. These organizations were...Viewed together, these three interrelated subsystems, EDP, MIS, and DSS, establish the framework of an overall systems capability known as a Computer

  12. Capabilities for Constrained Military Operations

    DTIC Science & Technology

    2016-12-01

    capabilities that have low technology risk and accomplish all of this on a short timeline. I fully endorse all of the recommendations contained in...for the U.S. to address such conflicts. The good news is that the DoD can prevail with inexpensive capabilities that have low technology risk and on a...future actions. The Study took a three-pronged approach to countering potential adversaries' strategies for waging long-term campaigns for

  13. Graphical Visualization of Human Exploration Capabilities

    NASA Technical Reports Server (NTRS)

    Rodgers, Erica M.; Williams-Byrd, Julie; Arney, Dale C.; Simon, Matthew A.; Williams, Phillip A.; Barsoum, Christopher; Cowan, Tyler; Larman, Kevin T.; Hay, Jason; Burg, Alex

    2016-01-01

    NASA's pioneering space strategy will require advanced capabilities to expand the boundaries of human exploration on the Journey to Mars (J2M). The Evolvable Mars Campaign (EMC) architecture serves as a framework to identify critical capabilities that need to be developed and tested in order to enable a range of human exploration destinations and missions. Agency-wide System Maturation Teams (SMT) are responsible for the maturation of these critical exploration capabilities and help formulate, guide and resolve performance gaps associated with the EMC-identified capabilities. Systems Capability Organization Reporting Engine boards (SCOREboards) were developed to integrate the SMT data sets into cohesive human exploration capability stories that can be used to promote dialog and communicate NASA's exploration investments. Each SCOREboard provides a graphical visualization of SMT capability development needs that enable exploration missions, and presents a comprehensive overview of data that outlines a roadmap of system maturation needs critical for the J2M. SCOREboards are generated by a computer program that extracts data from a main repository, sorts the data based on a tiered data reduction structure, and then plots the data according to specified user inputs. The ability to sort and plot varying data categories provides the flexibility to present specific SCOREboard capability roadmaps based on customer requests. This paper presents the development of the SCOREboard computer program and shows multiple complementary, yet different datasets through a unified format designed to facilitate comparison between datasets. Example SCOREboard capability roadmaps are presented followed by a discussion of how the roadmaps are used to: 1) communicate capability developments and readiness of systems for future missions, and 2) influence the definition of NASA's human exploration investment portfolio through capability-driven processes. The paper concludes with a description of planned future work to modify the computer program to include additional data and of alternate capability roadmap formats currently under consideration.

  14. Extravehicular Activity Operations Concepts Under Communication Latency and Bandwidth Constraints

    NASA Technical Reports Server (NTRS)

    Beaton, Kara H.; Chappell, Steven P.; Abercromby, Andrew F. J.; Miller, Matthew J.; Nawotniak, Shannon Kobs; Hughes, Scott; Brady, Allyson; Lim, Darlene S. S.

    2017-01-01

    The Biologic Analog Science Associated with Lava Terrains (BASALT) project is a multi-year program dedicated to iteratively develop, implement, and evaluate concepts of operations (ConOps) and supporting capabilities intended to enable and enhance human scientific exploration of Mars. This paper describes the planning, execution, and initial results from the first field deployment, referred to as BASALT-1, which consisted of a series of 10 simulated extravehicular activities (EVAs) on volcanic flows in Idaho's Craters of the Moon (COTM) National Monument. The ConOps and capabilities deployed and tested during BASALT-1 were based on previous NASA trade studies and analog testing. Our primary research question was whether those ConOps and capabilities work acceptably when performing real (non-simulated) biological and geological scientific exploration under 4 different Mars-to-Earth communication conditions: 5 and 15 min one-way light time (OWLT) communication latencies and low (0.512 Mb/s uplink, 1.54 Mb/s downlink) and high (5.0 Mb/s uplink, 10.0 Mb/s downlink) bandwidth conditions representing the lower and higher limits of technical communication capabilities currently proposed for future human exploration missions. The synthesized results of BASALT-1 with respect to the ConOps and capabilities assessment were derived from a variety of sources, including EVA task timing data, network analytic data, and subjective ratings and comments regarding the scientific and operational acceptability of the ConOps and the extent to which specific capabilities were enabling and enhancing, and are presented here. BASALT-1 established preliminary findings that the baseline ConOps, software systems, and communication protocols were scientifically and operationally acceptable, with minor improvements desired, for the "Mars" extravehicular (EV) and intravehicular (IV) crewmembers, but unacceptable, with improvements required, for the "Earth" Mission Support Center. These data will provide a basis for guiding and prioritizing capability development for future BASALT deployments and, ultimately, future human exploration missions.

  15. Integrated Medical Model (IMM) 4.0 Enhanced Functionalities

    NASA Technical Reports Server (NTRS)

    Young, M.; Keenan, A. B.; Saile, L.; Boley, L. A.; Walton, M. E.; Shah, R. V.; Kerstman, E. L.; Myers, J. G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic simulation model that uses input data on 100 medical conditions to simulate expected medical events, the resources required to treat them, and the resulting impact to the mission for specific crew and mission characteristics. The newest development version, IMM v4.0, adds capabilities that remove some of the conservative assumptions that underlie the current operational version, IMM v3. While IMM v3 provides the framework to simulate whether a medical event occurred, IMM v4.0 also simulates when the event occurred during a mission timeline. This allows for more accurate estimation of mission time lost and resource utilization. In addition to the mission timeline, IMM v4.0 features two enhancements that address IMM v3 assumptions regarding medical event treatment. Medical events in IMM v3 are assigned the untreated outcome if any resource required to treat the event is unavailable. IMM v4.0 allows for partially treated outcomes that are proportional to the amount of required resources available, thus removing the dichotomous treatment assumption. An additional capability of IMM v4.0 is to use an alternative medical resource when the primary resource assigned to the condition is depleted, more accurately reflecting the real-world system. The additional capabilities defining IMM v4.0 (the mission timeline, partial treatment, and alternative resources) result in more realistic predicted mission outcomes. The primary model outcomes of IMM v4.0 for the ISS6 mission, including mission time lost, probability of evacuation, and probability of loss of crew life, are compared to those produced by the current operational version of IMM to showcase the enhanced prediction capabilities.

  16. Investigations of turbulent scalar fields using probability density function approach

    NASA Technical Reports Server (NTRS)

    Gao, Feng

    1991-01-01

    Scalar fields undergoing random advection have attracted much attention from researchers in both the theoretical and practical sectors. Research interest spans from the study of the small-scale structures of turbulent scalar fields to the modeling and simulation of turbulent reacting flows. The probability density function (PDF) method is an effective tool in the study of turbulent scalar fields, especially those which involve chemical reactions. It has been argued that a one-point, joint PDF approach is the one to choose from among many simulation and closure methods for turbulent combustion and chemically reacting flows, based on its practical feasibility in the foreseeable future for multiple reactants. Instead of the multi-point PDF, the joint PDF of a scalar and its gradient, which represents the roles of both the scalar and scalar diffusion, is introduced. A proper closure model for the molecular diffusion term in the PDF equation is investigated. Another direction in this research is to study the mapping closure method that has recently been proposed to deal with the PDFs in turbulent fields. This method seems to have captured the physics correctly when applied to diffusion problems. However, if turbulent stretching is included, the amplitude mapping has to be supplemented either by adjusting the parameters representing turbulent stretching at each time step or by introducing a coordinate mapping. This technique is still under development and seems to be quite promising. The final objective of this project is to understand some fundamental properties of turbulent scalar fields and to develop practical numerical schemes that are capable of handling turbulent reacting flows.

  17. Multistage variable probability forest volume inventory. [the Defiance Unit of the Navajo Nation

    NASA Technical Reports Server (NTRS)

    Anderson, J. E. (Principal Investigator)

    1979-01-01

    An inventory scheme based on the use of computer-processed LANDSAT MSS data was developed. Output from the inventory scheme provides an estimate of the standing net sawtimber volume of a major timber species on a selected forested area of the Navajo Nation. Such estimates are based on the values of parameters currently used for scaled sawlog conversion to mill output. Multistage variable probability sampling appears capable of producing estimates which compare favorably with those produced using conventional techniques. In addition, the reduction in time, manpower, and overall costs lends it to numerous applications.
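
    The variable-probability idea can be sketched with a Hansen-Hurwitz estimator for sampling with probability proportional to size (PPS), one standard single-stage building block of such designs; the sizes and volumes below are invented stand-ins for LANDSAT-derived strata.

```python
import numpy as np

rng = np.random.default_rng(5)

# Invented stand population: an auxiliary size measure (e.g., a LANDSAT-
# derived crown-cover proxy) roughly proportional to sawtimber volume.
size = rng.integers(10, 200, 50).astype(float)    # 50 primary units
volume = size * 12.0 + rng.normal(0.0, 60.0, 50)  # true volumes (unknown)
p = size / size.sum()                             # PPS selection probabilities

n = 8                                             # units visited in the field
idx = rng.choice(50, size=n, replace=True, p=p)
estimate = np.mean(volume[idx] / p[idx])          # Hansen-Hurwitz total
print(f"estimated total: {estimate:,.0f}   true total: {volume.sum():,.0f}")
```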

  18. Continuation of probability density functions using a generalized Lyapunov approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baars, S., E-mail: s.baars@rug.nl; Viebahn, J.P., E-mail: viebahn@cwi.nl; Mulder, T.E., E-mail: t.e.mulder@uu.nl

    Techniques from numerical bifurcation theory are very useful for studying transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations near fixed points, under a small-noise approximation. The key innovation is the efficient solution of a generalized Lyapunov equation using an iterative method involving low-rank approximations. We apply and illustrate the capabilities of the method using a problem in physical oceanography, i.e., the occurrence of multiple steady states of the Atlantic Ocean circulation.
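
    Under the small-noise approximation named above, the stationary density near a stable fixed point of the linearized SDE dx = Ax dt + B dW is Gaussian with covariance C solving the Lyapunov equation A C + C A^T + B B^T = 0. A dense toy example follows (the paper's contribution is the low-rank iterative solver needed when A comes from a discretized PDE and dense methods are infeasible); the matrices are invented.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[-1.0, 0.5],
              [ 0.0, -2.0]])          # stable Jacobian (toy example)
B = np.array([[0.3], [0.1]])          # noise forcing

# Solve A C + C A^T = -B B^T for the stationary covariance C;
# the stationary PDF is then the Gaussian N(0, C).
C = solve_continuous_lyapunov(A, -B @ B.T)
print("stationary covariance:\n", C)

# Continuation in a model parameter amounts to re-solving this equation
# along the branch of fixed points as the parameter varies.
```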

  19. Controlling quantum interference in phase space with amplitude.

    PubMed

    Xue, Yinghong; Li, Tingyu; Kasai, Katsuyuki; Okada-Shudo, Yoshiko; Watanabe, Masayoshi; Zhang, Yun

    2017-05-23

    We experimentally show a quantum interference in phase space by interrogating photon number probabilities (n = 2, 3, and 4) of a displaced squeezed state, which is generated by an optical parametric amplifier and whose displacement is controlled by amplitude of injected coherent light. It is found that the probabilities exhibit oscillations of interference effect depending upon the amplitude of the controlling light field. This phenomenon is attributed to quantum interference in phase space and indicates the capability of controlling quantum interference using amplitude. This remarkably contrasts with the oscillations of interference effects being usually controlled by relative phase in classical optics.

  20. History of intelligent transportation systems.

    DOT National Transportation Integrated Search

    2016-05-01

    ITS capabilities have matured significantly over the past 25 years since the ITS Joint Program Office was created, and this document celebrates the advances in this field and explores its exciting future, while also serving as a guide for future ITS r...

  1. Extending NASA's SPICE ancillary information system to meet future mission needs

    NASA Technical Reports Server (NTRS)

    Acton, C.; Bachman, N.; Elson, L.; Semenov, B.; Turner, F.; Wright, E.

    2002-01-01

    This paper summarizes the architecture, capabilities, characteristics and uses of the current SPICE ancillary information system, and then outlines plans and ideas for how this system can be extended to meet future space mission requirements.

  2. On the Development and Application of High Data Rate Architecture (HiDRA) in Future Space Networks

    NASA Technical Reports Server (NTRS)

    Hylton, Alan; Raible, Daniel; Clark, Gilbert

    2017-01-01

    Historically, space missions have been severely constrained by their ability to downlink the data they have collected. These constraints are a result of relatively low link rates on the spacecraft as well as limitations on the time during which data can be sent. As part of a coherent strategy to address existing limitations and get more data to the ground more quickly, the Space Communications and Navigation (SCaN) program has been developing an architecture for a future solar system Internet. The High Data Rate Architecture (HiDRA) project is designed to fit into such a future SCaN network. HiDRA's goal is to describe a general packet-based networking capability which can be used to provide assets with efficient networking capabilities while simultaneously reducing the capital costs and operational costs of developing and flying future space systems. Along these lines, this paper begins by reviewing various characteristics of modern satellite design as well as relevant characteristics of emerging technologies (such as free-space optical links capable of working at 100+ Gbps). Next, the paper describes HiDRA's design, and how the system is able to both integrate and support the operation of not only today's high-rate systems, but also the high-rate systems likely to be found in the future. This section also explores both existing and future networking technologies, such as the Delay Tolerant Networking (DTN) protocol (RFC 4838, RFC 5050), and explains how HiDRA supports them. Additionally, this section explores how HiDRA is used for scheduling data movement through both proactive and reactive link management. After this, the paper moves on to explore a reference implementation of HiDRA. This implementation is currently being realized based on a Field Programmable Gate Array (FPGA) memory and interface controller that is itself controlled by a local computer running DTN software. Next, this paper explores HiDRA's natural evolution, which includes an integration path for software-defined networking (SDN) switches. This section also describes considerations for both near-Earth and deep-space instantiations of HiDRA, describing how differences in latencies between the environments will necessarily influence how the system is configured and the networks operate. Finally, this paper describes future work. This section includes a description of a potential ISS implementation which will allow rapid advancement through the technology readiness levels (TRL). This section also explores work being done to support HiDRA's successful implementation and operation in a heterogeneous network: such a network could include communications equipment spanning many vintages and capabilities, and one significant aspect of HiDRA's future development involves balancing compatibility with capability.

  3. Capability and dependency in the Newcastle 85+ cohort study. Projections of future care needs.

    PubMed

    Jagger, Carol; Collerton, Joanna C; Davies, Karen; Kingston, Andrew; Robinson, Louise A; Eccles, Martin P; von Zglinicki, Thomas; Martin-Ruiz, Carmen; James, Oliver F W; Kirkwood, Tom B L; Bond, John

    2011-05-04

    Little is known of the capabilities of the oldest old, the fastest growing age group in the population. We aimed to estimate capability and dependency in a cohort of 85 year olds and to project future demand for care. Structured interviews at age 85 with 841 people born in 1921 and living in Newcastle and North Tyneside, UK, who were permanently registered with participating general practices. Measures of capability included were self-reported activities of daily living (ADL), timed up and go test (TUG), standardised mini-mental state examination (SMMSE), and assessment of urinary continence in order to classify interval-need dependency. To project future demand for care the proportion needing 24-hour care was applied to the 2008 England and Wales population projections of those aged 80 years and over by gender. Of participants, 62% (522/841) were women, 77% (651/841) lived in standard housing, 13% (106/841) in sheltered housing and 10% (84/841) in a care home. Overall, 20% (165/841) reported no difficulty with any of the ADLs. Men were more capable in performing ADLs and more independent than women. TUG validated self-reported ADLs. When classified by 'interval of need' 41% (332/810) were independent, 39% (317/810) required help less often than daily, 12% (94/810) required help at regular times of the day and 8% (67/810) required 24-hour care. Of care-home residents, 94% (77/82) required daily help or 24-hour care. Future need for 24-hour care for people aged 80 years or over in England and Wales is projected to increase by 82% from 2010 to 2030, with a demand for 630,000 care-home places by 2030. This analysis highlights the diversity of capability and levels of dependency in this cohort. A remarkably high proportion remain independent, particularly men. However, a significant proportion of this population require 24-hour care at home or in care homes. Projections for the next 20 years suggest substantial increases in the number requiring 24-hour care due to population ageing and a proportionate increase in demand for care-home places unless innovative health and social care interventions are found.

  4. CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences

    NASA Technical Reports Server (NTRS)

    Slotnick, Jeffrey; Khodadoust, Abdollah; Alonso, Juan; Darmofal, David; Gropp, William; Lurie, Elizabeth; Mavriplis, Dimitri

    2014-01-01

    This report documents the results of a study to address the long range, strategic planning required by NASA's Revolutionary Computational Aerosciences (RCA) program in the area of computational fluid dynamics (CFD), including future software and hardware requirements for High Performance Computing (HPC). Specifically, the "Vision 2030" CFD study is to provide a knowledge-based forecast of the future computational capabilities required for turbulent, transitional, and reacting flow simulations across a broad Mach number regime, and to lay the foundation for the development of a future framework and/or environment where physics-based, accurate predictions of complex turbulent flows, including flow separation, can be accomplished routinely and efficiently in cooperation with other physics-based simulations to enable multi-physics analysis and design. Specific technical requirements from the aerospace industrial and scientific communities were obtained to determine critical capability gaps, anticipated technical challenges, and impediments to achieving the target CFD capability in 2030. A preliminary development plan and roadmap were created to help focus investments in technology development to help achieve the CFD vision in 2030.

  5. iPAS: AES Flight System Technology Maturation for Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Othon, William L.

    2014-01-01

    In order to realize the vision of expanding human presence in space, NASA will develop new technologies that can enable future crewed spacecraft to go far beyond Earth orbit. These technologies must be matured to the point that future project managers can accept the risk of incorporating them safely and effectively within integrated spacecraft systems, to satisfy very challenging mission requirements. The technologies must also be applied and managed within an operational context that includes both on-board crew and mission support on Earth. The Advanced Exploration Systems (AES) Program is one part of the NASA strategy to identify and develop key capabilities for human spaceflight, and mature them for future use. To support this initiative, the Integrated Power Avionics and Software (iPAS) environment has been developed that allows engineers, crew, and flight operators to mature promising technologies into applicable capabilities, and to assess the value of these capabilities within a space mission context. This paper describes the development of the integration environment to support technology maturation and risk reduction, and offers examples of technology and mission demonstrations executed to date.

  6. Self-Sensitivity in Fetal Development

    ERIC Educational Resources Information Center

    Alberts, Jeffrey R.

    2008-01-01

    In mammalian species, behavior begins in utero, hidden within the mother's body. This biological fact has made it difficult to observe or to access fetuses, leaving the beginnings of behavior to the imagination or allowing it to be forgotten or ignored. Such truncation of perspective probably helped many to consider behavioral capabilities first…

  7. Beyond the Horizon: Developing Future Airpower Strategy

    DTIC Science & Technology

    2014-01-01

    is or becomes more capable, then it is further evidence the USAF failed to proactively usher in these emerging and vital airpower capabilities...USAF airpower. 9. USAF chief of staff, Gen Norton Schwartz, 2009, offered in numerous speeches. 10. Carl H. Builder, The Icarus Syndrome: The Role of

  8. The Future Institutional Research Office: Brave New Workplace or Electronic Sweatshop? AIR 1989 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Liebmann, Jeffrey D.

    Information technology is changing the workplace. Forecasts range from wondrous visions of future capabilities to dark scenarios of employment loss and dehumanization. Some predict revolutionary impacts, while others conclude that the way we do business will change only gradually if much at all. The less positive visions of the future workplace…

  9. Changes in the probability of co-occurring extreme climate events

    NASA Astrophysics Data System (ADS)

    Diffenbaugh, N. S.

    2017-12-01

    Extreme climate events such as floods, droughts, heatwaves, and severe storms exert acute stresses on natural and human systems. When multiple extreme events co-occur, either in space or time, the impacts can be substantially compounded. A diverse set of human interests - including supply chains, agricultural commodities markets, reinsurance, and deployment of humanitarian aid - have historically relied on the rarity of extreme events to provide a geographic hedge against the compounded impacts of co-occurring extremes. However, changes in the frequency of extreme events in recent decades imply that the probability of co-occurring extremes is also changing, and is likely to continue to change in the future in response to additional global warming. This presentation will review the evidence for historical changes in extreme climate events and the response of extreme events to continued global warming, and will provide some perspective on methods for quantifying changes in the probability of co-occurring extremes in the past and future.
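
    A small worked example of why the geographic hedge erodes, assuming independent regions and invented probabilities: if warming raises the per-region seasonal probability of an extreme from 0.05 to 0.10, the chance that three or more of ten regions are hit in the same season rises roughly sixfold.

```python
from math import comb

def p_at_least_k(p, n, k):
    """P(at least k of n independent regions see an extreme in a season)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Doubling the per-region probability (0.05 -> 0.10, invented numbers)
# raises P(3+ co-occurring extremes among 10 regions) from ~1.2% to ~7%.
print(p_at_least_k(0.05, 10, 3), p_at_least_k(0.10, 10, 3))
```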

  10. Portability scenarios for intelligent robotic control agent software

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-06-01

    Portability scenarios are critical in ensuring that a piece of AI control software will run effectively across the collection of craft that it is required to control. This paper presents scenarios for control software that is designed to control multiple craft with heterogeneous movement and functional characteristics. For each prospective target-craft type, its capabilities, mission function, location, communications capabilities and power profile are presented and performance characteristics are reviewed. This work will inform future decision making related to software capabilities, hardware control capabilities and processing requirements.

  11. Aviation Safety Program Atmospheric Environment Safety Technologies (AEST) Project

    NASA Technical Reports Server (NTRS)

    Colantonio, Ron

    2011-01-01

    Engine Icing: Characterization and Simulation Capability: develop knowledge bases, analysis methods, and simulation tools needed to address the problem of engine icing, in particular ice-crystal icing. Airframe Icing Simulation and Engineering Tool Capability: develop and demonstrate a 3-D capability to simulate and model airframe ice accretion and the related aerodynamic performance degradation for current and future aircraft configurations in an expanded icing environment that includes freezing drizzle/rain. Atmospheric Hazard Sensing and Mitigation Technology Capability: improve and expand remote sensing and mitigation of hazardous atmospheric environments and phenomena.

  12. Trajectory Design for the Europa Clipper Mission Concept

    NASA Technical Reports Server (NTRS)

    Buffington, Brent

    2014-01-01

    Europa is one of the most scientifically intriguing targets in planetary science due to its potential suitability for extant life. As such, NASA has funded the California Institute of Technology Jet Propulsion Laboratory and the Johns Hopkins University Applied Physics Laboratory to jointly determine and develop the best mission concept to explore Europa in the near future. The result of nearly 4 years of work--the Europa Clipper mission concept--is a multiple Europa flyby mission that could efficiently execute a number of high-caliber science investigations to meet Europa science priorities specified in the 2011 NRC Decadal Survey, and is capable of providing reconnaissance data to maximize the probability of both a safe landing and access to surface material of high scientific value for a future Europa lander. This paper will focus on the major enabling component for this mission concept--the trajectory. A representative trajectory, referred to as 13F7-A21, would obtain global-regional coverage of Europa via a complex network of 45 flybys over the course of 3.5 years while also mitigating the effects of the harsh Jovian radiation environment. In addition, 5 Ganymede and 9 Callisto flybys would be used to manipulate the trajectory relative to Europa. The tour would reach a maximum Jovicentric inclination of 20.1 deg, have a deterministic ΔV of 164 m/s (post periapsis raise maneuver), and a total ionizing dose of 2.8 Mrad (Si).

  13. A Serendipitous MWA Search for Narrowband Signals from ‘Oumuamua

    NASA Astrophysics Data System (ADS)

    Tingay, S. J.; Kaplan, D. L.; Lenc, E.; Croft, S.; McKinley, B.; Beardsley, A.; Crosse, B.; Emrich, D.; Franzen, T. M. O.; Gaensler, B. M.; Horsley, L.; Johnston-Hollitt, M.; Kenney, D.; Morales, M. F.; Pallot, D.; Steele, K.; Trott, C. M.; Walker, M.; Wayth, R. B.; Williams, A.; Wu, C.

    2018-04-01

    We examine data from the Murchison Widefield Array (MWA) in the frequency range 72–102 MHz for a field of view that serendipitously contained the interstellar object ‘Oumuamua on 2017 November 28. Observations took place with a time resolution of 0.5 s and a frequency resolution of 10 kHz. Based on the interesting but highly unlikely suggestion that ‘Oumuamua is an interstellar spacecraft, due to some unusual orbital and morphological characteristics, we examine our data for signals that might indicate the presence of intelligent life associated with ‘Oumuamua. We searched our radio data for (1) impulsive narrowband signals, (2) persistent narrowband signals, and (3) impulsive broadband signals. We found no such signals with nonterrestrial origins and make estimates of the upper limits on equivalent isotropic radiated power (EIRP) for these three cases of approximately 7 kW, 840 W, and 100 kW, respectively. These transmitter powers are well within the capabilities of human technologies, and are therefore plausible for alien civilizations. While the chances of positive detection in any given search for extraterrestrial intelligence (SETI) experiment are vanishingly small, the characteristics of new generation telescopes such as the MWA (and, in the future, the Square Kilometre Array) make certain classes of SETI experiments easy, or even a trivial by-product of astrophysical observations. This means that the future costs of SETI experiments are very low, allowing large target lists to partially balance the low probability of a positive detection.

  14. Robust Bayesian Experimental Design for Conceptual Model Discrimination

    NASA Astrophysics Data System (ADS)

    Pham, H. V.; Tsai, F. T. C.

    2015-12-01

    A robust Bayesian optimal experimental design under uncertainty is presented to provide firm information for model discrimination with the least number of pumping and observation wells. Firm information is the maximum information about a system that can be guaranteed from an experimental design. The design is based on the Box-Hill expected entropy decrease (EED) before and after the experiment and on the Bayesian model averaging (BMA) framework. A max-min programming problem is introduced to choose the robust design that maximizes the minimal Box-Hill EED, subject to the constraint that the highest expected posterior model probability satisfies a desired probability threshold. The EED is calculated by Gauss-Hermite quadrature. The BMA method is used to predict future observations and to quantify the future observation uncertainty arising from conceptual and parametric uncertainties when calculating the EED. A Monte Carlo approach is adopted to quantify the uncertainty in the posterior model probabilities. The optimal experimental design is tested on a synthetic 5-layer anisotropic confined aquifer. Nine conceptual groundwater models are constructed due to uncertain geological architecture and boundary conditions. High-performance computing is used to enumerate all possible design solutions in order to identify the most plausible groundwater model. Results highlight the impacts of scedasticity in future observation data as well as uncertainty sources on potential pumping and observation locations.
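
    A minimal sketch of the max-min selection step only, assuming the Box-Hill EED has already been evaluated (e.g., by Gauss-Hermite quadrature) for each candidate design under each Monte Carlo realization of the uncertain posterior model probabilities; all arrays below are random placeholders, not aquifer results.

```python
import numpy as np

rng = np.random.default_rng(3)
n_designs, n_realizations = 6, 200

# Placeholder evaluations: eed[d, r] is the Box-Hill expected entropy
# decrease of candidate design d under Monte Carlo realization r of the
# uncertain posterior model probabilities; p_best[d, r] is the matching
# highest expected posterior model probability.
eed = rng.gamma(2.0, 0.1, size=(n_designs, n_realizations))
base = rng.uniform(0.6, 0.95, n_designs)
p_best = np.clip(base[:, None] + rng.normal(0.0, 0.05,
                 (n_designs, n_realizations)), 0.0, 1.0)

threshold = 0.8                                # desired discrimination level
feasible = p_best.mean(axis=1) >= threshold    # probability constraint
worst_case = eed.min(axis=1)                   # min over the uncertainty
worst_case[~feasible] = -np.inf                # discard infeasible designs
robust = int(np.argmax(worst_case))            # max-min choice
print("robust design index:", robust)
```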

  15. HMM for hyperspectral spectrum representation and classification with endmember entropy vectors

    NASA Astrophysics Data System (ADS)

    Arabi, Samir Y. W.; Fernandes, David; Pizarro, Marco A.

    2015-10-01

    Hyperspectral images, thanks to their good spectral resolution, are extensively used for classification, but their high number of bands requires greater transmission bandwidth, greater data storage capability, and greater computational capability in processing systems. This work presents a new methodology for hyperspectral data classification that can work with a reduced number of spectral bands and achieve good results, comparable with processing methods that require all hyperspectral bands. The proposed method for hyperspectral spectrum classification is based on a Hidden Markov Model (HMM) associated with each endmember (EM) of a scene and on the conditional probabilities that each EM belongs to each of the other EMs. The EM conditional probabilities are transformed into EM entropy vectors, and those vectors are used as reference vectors for the classes in the scene. The conditional probabilities of a spectrum to be classified are likewise transformed into a spectrum entropy vector, which is assigned to a class by the minimum Euclidean distance between it and the EM entropy vectors. The methodology was tested with good results using AVIRIS spectra of a scene with 13 EMs, considering the full 209 bands and reduced sets of 128, 64 and 32 spectral bands. For the test area, the results show that only 32 spectral bands can be used instead of the original 209 without significant loss in the classification process.
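
    A hedged sketch of the entropy-vector classification step: the conditional probabilities here come from a simple softmax over spectral distances as a stand-in for the per-EM hidden Markov models the paper actually uses, and all spectra are synthetic.

```python
import numpy as np

def entropy_vector(p):
    """Element-wise entropy transform of a conditional-probability vector;
    one stand-in reading of the paper's 'entropy vector' construction."""
    p = np.clip(p, 1e-12, 1.0)
    return -p * np.log2(p)

def conditional_probs(spectrum, endmembers):
    """Stand-in conditional probabilities of a spectrum against each EM:
    a softmax over negative spectral distances (the paper derives these
    from per-EM HMMs instead)."""
    d = np.array([np.linalg.norm(spectrum - em) for em in endmembers])
    w = np.exp(-d / d.mean())
    return w / w.sum()

def classify(spectrum, endmembers, references):
    """Minimum Euclidean distance between the spectrum's entropy vector
    and each EM's reference entropy vector."""
    h = entropy_vector(conditional_probs(spectrum, endmembers))
    dists = [np.linalg.norm(h - ref) for ref in references]
    return int(np.argmin(dists))

rng = np.random.default_rng(4)
ems = rng.random((13, 32))                     # 13 EMs, 32 retained bands
refs = [entropy_vector(conditional_probs(em, ems)) for em in ems]
test = ems[5] + rng.normal(0.0, 0.02, 32)      # noisy copy of EM 5
print(classify(test, ems, refs))               # expected class: 5
```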

  16. Interfutures: Facing the Future, Mastering the Probable and Managing the Unpredictable.

    ERIC Educational Resources Information Center

    Organisation for Economic Cooperation and Development, Paris (France).

    This report discusses the findings of the three year Interfutures Project which studied the future development of advanced industrial societies and the relations between these countries and the developing countries. The major emphasis of the project was to analyze economic problems. However, political and social elements were also studied. The…

  17. Lunar Exploration and Science in ESA

    NASA Astrophysics Data System (ADS)

    Carpenter, James; Houdou, Bérengère; Fisackerly, Richard; De Rosa, Diego; Patti, Bernardo; Schiemann, Jens; Hufenbach, Bernhard; Foing, Bernard

    2015-04-01

    ESA seeks to provide Europe with access to the lunar surface, and allow Europeans to benefit from the opening up of this new frontier, as part of a global endeavor. This will be best achieved through an exploration programme which combines the strengths and capabilities of both robotic and human explorers. ESA is preparing for future participation in lunar exploration through a combination of human and robotic activities, in cooperation with international partners. Future planned activities include the contribution of key technological capabilities to the Russian-led robotic missions Luna-Glob, Luna-Resurs orbiter, and Luna-Resurs lander. For the Luna-Resurs lander, ESA will provide analytical capabilities to complement the Russian-led science payload, focusing on developing and characterising the resource opportunities offered at the lunar surface. This should be followed by contributions at the level of mission elements to a Lunar Polar Sample Return mission. These robotic activities are being performed with a view to enabling a future more comprehensive programme in which robotic and human activities are integrated to provide the maximum benefits from lunar surface access. Activities on the ISS and ESA participation in the US-led Multi-Purpose Crew Vehicle, which is planned for a first unmanned lunar flight in 2017, are also important steps towards achieving this. In the frame of a broader future international programme under discussion through the International Space Exploration Coordination Group (ISECG), future missions are under investigation that would provide access to the lunar surface through international cooperation and human-robotic partnerships.

  18. Communications for unattended sensor networks

    NASA Astrophysics Data System (ADS)

    Nemeroff, Jay L.; Angelini, Paul; Orpilla, Mont; Garcia, Luis; DiPierro, Stefano

    2004-07-01

    The future model of the US Army's Future Combat Systems (FCS) and the Future Force reflects a combat force that utilizes lighter armor protection than the current standard. Survival on the future battlefield will be increased by the use of advanced situational awareness provided by unattended tactical and urban sensors that detect, identify, and track enemy targets and threats. Successful implementation of these critical sensor fields requires the development of advanced sensors, sensor and data-fusion processors, and a specialized communications network. To ensure warfighter and asset survivability, the communications must be capable of near-real-time dissemination of the sensor data using robust, secure, stealthy, and jam-resistant links so that proper and decisive action can be taken. Communications will be provided to a wide array of mission-specific sensors that are capable of processing data from acoustic, magnetic, seismic, and/or Chemical, Biological, Radiological, and Nuclear (CBRN) sensors. Other, more powerful sensor node configurations will be capable of fusing sensor data and intelligently collecting and processing image data from infrared or visual imaging cameras. The radio waveform and networking protocols being developed under the Soldier Level Integrated Communications Environment (SLICE) Soldier Radio Waveform (SRW) and the Networked Sensors for the Future Force Advanced Technology Demonstration are part of an effort to develop a common waveform family which will operate across multiple tactical domains, including dismounted soldiers, ground sensors, munitions, missiles, and robotics. These waveform technologies will ultimately be transitioned to the JTRS library, specifically the Cluster 5 requirement.

  19. A method of decision analysis quantifying the effects of age and comorbidities on the probability of deriving significant benefit from medical treatments

    PubMed Central

    Bean, Nigel G.; Ruberu, Ravi P.

    2017-01-01

    Background The external validity, or generalizability, of trials and guidelines has been considered poor in the context of multiple morbidity. How multiple morbidity might affect the magnitude of benefit of a given treatment, and thereby external validity, has had little study. Objective To provide a method of decision analysis to quantify the effects of age and comorbidity on the probability of deriving a given magnitude of treatment benefit. Design We developed a method to calculate probabilistically the effect of all of a patient’s comorbidities on their underlying utility, or well-being, at a future time point. From this, we derived a distribution of possible magnitudes of treatment benefit at that future time point. We then expressed this distribution as the probability of deriving at least a given magnitude of treatment benefit. To demonstrate the applicability of this method of decision analysis, we applied it to the treatment of hypercholesterolaemia in a geriatric population of 50 individuals. We highlighted the results of four of these individuals. Results This method of analysis provided individualized quantifications of the effect of age and comorbidity on the probability of treatment benefit. The average probability of deriving a benefit, of at least 50% of the magnitude of benefit available to an individual without comorbidity, was only 0.8%. Conclusion The effects of age and comorbidity on the probability of deriving significant treatment benefits can be quantified for any individual. Even without consideration of other factors affecting external validity, these effects may be sufficient to guide decision-making. PMID:29090189
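
    A hedged Monte Carlo sketch of the core idea: each hypothetical comorbidity may degrade a patient's future utility, treatment benefit is assumed to scale with retained utility, and the output is the probability of deriving at least 50% of the comorbidity-free benefit. The comorbidity list and benefit value are illustrative placeholders, not the study's data or exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical comorbidities: (probability the condition is limiting at the
# future time point, fraction of utility lost if it is).
comorbidities = [(0.30, 0.40), (0.20, 0.25), (0.10, 0.60)]
full_benefit = 0.10   # assumed utility gain for a patient without comorbidity

def simulate_benefit(n=100_000):
    utility = np.ones(n)
    for p_event, loss in comorbidities:
        hit = rng.random(n) < p_event
        utility[hit] *= (1.0 - loss)
    # Benefit is assumed to scale with the utility the patient retains.
    return full_benefit * utility

benefit = simulate_benefit()
threshold = 0.5 * full_benefit
print("P(benefit >= 50% of comorbidity-free benefit):",
      np.mean(benefit >= threshold))
```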

  20. Real-Time Interactive Facilities Associated With A 3-D Medical Workstation

    NASA Astrophysics Data System (ADS)

    Goldwasser, S. M.; Reynolds, R. A.; Talton, D.; Walsh, E.

    1986-06-01

    Biomedical workstations of the future will incorporate three-dimensional interactive capabilities which provide real-time response to most common operator requests. Such systems will find application in many areas of medicine including clinical diagnosis, surgical and radiation therapy planning, biomedical research based on functional imaging, and medical education. This paper considers the requirements of these future systems in terms of image quality, performance, and the interactive environment, and examines the relationship of workstation capabilities to specific medical applications. We describe a prototype physician's workstation that we have designed and built to meet many of these requirements (using conventional graphics technology in conjunction with a custom real-time 3-D processor), and give an account of the remaining issues and challenges that future designers of such systems will have to address.

  1. The Statistical Drake Equation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2010-12-01

    We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of a large number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (neither of which assumes the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density function, apparently previously unknown and dubbed the "Maccone distribution" by Paul Davies. DATA ENRICHMENT PRINCIPLE. It should be noticed that ANY positive number of random variables in the Statistical Drake Equation is compatible with the CLT. So, our generalization allows for many more factors to be added in the future as more refined scientific knowledge about each factor becomes available. This capability to make room for more future factors in the statistical Drake equation we call the "Data Enrichment Principle," and we regard it as the key to more profound future results in the fields of Astrobiology and SETI. Finally, a practical example is given of how our statistical Drake equation works numerically. We work out in detail the case where each of the seven random variables is uniformly distributed around its own mean value and has a given standard deviation. For instance, the number of stars in the Galaxy is assumed to be uniformly distributed around (say) 350 billion with a standard deviation of (say) 1 billion. Then, the resulting lognormal distribution of N is computed numerically by virtue of a MathCad file that the author has written. This shows that the mean value of the lognormal random variable N is actually of the same order as the classical N given by the ordinary Drake equation, as one might expect from a good statistical generalization.
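
    A short simulation of the paper's worked example under assumed placeholder values: each factor is drawn uniformly around its mean, the product N is formed, and the near-normality of log N (hence the approximate lognormality of N) and the agreement of the statistical mean with the classical product can be checked numerically.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Each Drake factor as a uniform random variable around an assumed mean,
# in the spirit of the paper's worked example (values are placeholders).
factors = [
    (350e9, 1e9),    # number of stars in the Galaxy
    (0.5,   0.1),    # fraction with planetary systems
    (1.0,   0.2),    # habitable planets per system
    (0.5,   0.1),    # fraction where life develops
    (0.2,   0.05),   # fraction evolving intelligence
    (0.2,   0.05),   # fraction that communicate
    (1e-4,  2e-5),   # fraction of stellar lifetime in communicative phase
]

N = np.ones(n)
for mean, sd in factors:
    half = np.sqrt(3.0) * sd          # uniform on [mean-half, mean+half]
    N *= rng.uniform(mean - half, mean + half, n)

logN = np.log(N)
print("mean of N (statistical):", N.mean())
print("classical Drake product:", np.prod([m for m, _ in factors]))
print("log N mean/std:", logN.mean(), logN.std())   # approx normal by the CLT
```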

  2. The Royal Naval Medical Services: delivering medical operational capability. the 'black art' of Medical Operational Planning.

    PubMed

    Faye, M

    2013-01-01

    This article looks to dispel the mysteries of the 'black art' of Medical Operational Planning whilst giving an overview of activity within the Medical Operational Capability area of Medical Division (Med Div) within Navy Command Headquarters (NCHQ) during a period when the Royal Naval Medical Services (RNMS) have been preparing and reconfiguring medical capability for the future contingent battle spaces. The rolling exercise program has been used to illustrate the ongoing preparations taken by the Medical Operational Capability (Med Op Cap) and the Medical Force Elements to deliver medical capability in the littoral and maritime environments.

  3. Mars Network: Strategies for Deploying Enabling Telecommunications Capabilities in Support of Mars Exploration

    NASA Technical Reports Server (NTRS)

    Edwards, C. D.; Adams, J. T.; Agre, J. R.; Bell, D. J.; Clare, L. P.; Durning, J. F.; Ely, T. A.; Hemmati, H.; Leung, R. Y.; McGraw, C. A.

    2000-01-01

    The coming decade of Mars exploration will involve a diverse set of robotic science missions, including in situ and sample return investigations, and ultimately moving towards sustained robotic presence on the Martian surface. In supporting this mission set, NASA must establish a robust telecommunications architecture that meets the specific science needs of near-term missions while enabling new methods of future exploration. This paper will assess the anticipated telecommunications needs of future Mars exploration, examine specific options for deploying capabilities, and quantify the performance of these options in terms of key figures of merit.

  4. DNA origami nanopores: developments, challenges and perspectives

    NASA Astrophysics Data System (ADS)

    Hernández-Ainsa, Silvia; Keyser, Ulrich F.

    2014-11-01

    DNA nanotechnology has enabled the construction of DNA origami nanopores; synthetic nanopores that present improved capabilities for the area of single molecule detection. Their extraordinary versatility makes them a new and powerful tool in nanobiotechnology for a wide range of important applications beyond molecular sensing. In this review, we briefly present the recent developments in this emerging field of research. We discuss the current challenges and possible solutions that would enhance the sensing capabilities of DNA origami nanopores. Finally, we anticipate novel avenues for future research and highlight a range of exciting ideas and applications that could be explored in the near future.

  5. Quantification of effective exoelectrogens by most probable number (MPN) in a microbial fuel cell.

    PubMed

    Heidrich, Elizabeth S; Curtis, Thomas P; Woodcock, Stephen; Dolfing, Jan

    2016-10-01

    The objective of this work was to quantify the number of exoelectrogens in wastewater capable of producing current in a microbial fuel cell by adapting the classical most probable number (MPN) methodology using current production as end point. Inoculating a series of microbial fuel cells with various dilutions of domestic wastewater and with acetate as test substrate yielded an apparent number of exoelectrogens of 17 per ml. Using current as a proxy for activity, the apparent exoelectrogen growth rate was 0.03 h(-1). With starch or wastewater as more complex test substrates similar apparent growth rates were obtained, but the apparent MPN based numbers of exoelectrogens in wastewater were significantly lower, probably because in contrast to acetate, complex substrates require complex food chains to deliver the electrons to the electrodes. Consequently, the apparent MPN is a function of the combined probabilities of members of the food chain being present. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
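
    A sketch of the underlying MPN calculation, assuming the classical Poisson dilution model with "current produced" as the positive endpoint; the dilution series, cell counts, and positives below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# MPN estimate from a dilution series of fuel cells.
# volumes: ml of original wastewater per cell; n_cells: cells per dilution;
# positive: cells that produced current.
volumes = np.array([1.0, 0.1, 0.01])     # hypothetical dilution series
n_cells = np.array([3, 3, 3])
positive = np.array([3, 2, 0])

def neg_log_lik(lam):
    """Poisson MPN likelihood: a cell is positive iff it received at least
    one culturable exoelectrogen, so P(positive) = 1 - exp(-lam * v)."""
    p = 1.0 - np.exp(-lam * volumes)
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(positive * np.log(p)
                   + (n_cells - positive) * np.log(1.0 - p))

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 1e4), method="bounded")
print(f"MPN estimate: {res.x:.2f} exoelectrogens per ml")
```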

  6. Options for the Navy’s Future Fleet

    DTIC Science & Technology

    2006-05-01

    capability than most of those options by other than for inflation, how big and how capable can the most measures of capability. But even under the Navy’s...The most prominent of those vessels are fast combat support ships, which operate with carrier strike groups to resupply them with fuel, dry Supply...categories of ships-submarines and large surface Similarly, the ship construction schedule for large surface combatants-are responsible for most of the

  7. Smart instruments and the national collaboratory

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M. (Editor)

    1989-01-01

    Here, we explore the process of scientific experimental investigation and ask what capabilities are required of the collaboratory to support such investigations. We first look at a number of examples of scientific research being conducted using remote instruments. We then examine the process of such research, asking at each stage what are the required capabilities. We finally integrate these results into a statement of the required set of capabilities needed to support scientific research in the future.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rich, B.L.

    Measurement of beta and/or nonpenetrating exposure is complicated, and past techniques and capabilities have resulted in significant inaccuracies in recorded results. Recent developments have increased capabilities, making the results more accurate, and should lead to less total exposure for the workforce. Continued development of work in progress should provide equivalent future improvements.

  9. Leadership Values, Trust and Negative Capability: Managing the Uncertainties of Future English Higher Education

    ERIC Educational Resources Information Center

    Jameson, Jill

    2012-01-01

    The complex leadership attribute of "negative capability" in managing uncertainty and engendering trust may be amongst the qualities enabling institutions to cope with multiple recent government policy challenges affecting English higher education, including significant increases in student fees. Research findings are reported on changes…

  10. Ground Robotics Capabilities Conference and Exhibition Held in Miami, Florida on March 16-18, 2010

    DTIC Science & Technology

    2010-03-18

    Underwater Vehicle Environmentally Non-Disturbing Under- ice Robotic Antarctic Explorer (ENDURANCE) 4/10/07 Elachi ASU 23 Possible future submersible...seeking liquid water on Europa or Enceladus 1 Ground Robotics Capability Conference and Exhibit Mr. George Solhan Office of Naval Research Code 30

  11. Moxie matters: associations of future orientation with active life expectancy.

    PubMed

    Laditka, Sarah B; Laditka, James N

    2017-10-01

    Being oriented toward the future has been associated with better future health. We studied associations of future orientation with life expectancy and the percentage of life with disability. We used the Panel Study of Income Dynamics (n = 5249). Participants' average age in 1968 was 33.0. Six questions repeatedly measured future orientation, 1968-1976. Seven waves (1999-2011, 33,331 person-years) measured disability in activities of daily living for the same individuals, whose average age in 1999 was 64.0. We estimated monthly probabilities of disability and death with multinomial logistic Markov models adjusted for age, sex, race/ethnicity, childhood health, and education. Using the probabilities, we created large populations with microsimulation, measuring disability in each month for each individual, age 55 through death. Life expectancy from age 55 for white men with high future orientation was age 77.6 (95% confidence interval 75.5-79.0), 6.9% (4.9-7.2) of those years with disability; results with low future orientation were 73.6 (72.2-75.4) and 9.6% (7.7-10.7). Comparable results for African American men were 74.8 (72.9-75.3), 8.1 (5.6-9.3), 71.0 (69.6-72.8), and 11.3 (9.1-11.7). For women, there were no significant differences associated with levels of future orientation for life expectancy. For white women with high future orientation 9.1% of remaining life from age 55 was disabled (6.3-9.9), compared to 12.4% (10.2-13.2) with low future orientation. Disability results for African American women were similar but statistically significant only at age 80 and over. High future orientation during early to middle adult ages may be associated with better health in older age.
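
    A compact sketch of the microsimulation step, assuming an illustrative monthly transition matrix for a single covariate pattern (in the study, these probabilities came from fitted multinomial logistic Markov models, and separate matrices would be used per group).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monthly transition probabilities among three states;
# keys are from-states, values give P(to-state). Rows sum to 1.
P = {
    "active":   {"active": 0.9945, "disabled": 0.0025, "dead": 0.0030},
    "disabled": {"active": 0.0100, "disabled": 0.9820, "dead": 0.0080},
}
states = ["active", "disabled", "dead"]

def simulate_life(start_age=55.0, max_age=110.0):
    state, months_alive, months_disabled = "active", 0, 0
    age = start_age
    while state != "dead" and age < max_age:
        probs = [P[state][s] for s in states]
        state = rng.choice(states, p=probs)
        if state != "dead":
            months_alive += 1
            months_disabled += (state == "disabled")
        age += 1.0 / 12.0
    return months_alive, months_disabled

sims = [simulate_life() for _ in range(5_000)]
alive = np.array([a for a, _ in sims])
dis = np.array([d for _, d in sims])
print("life expectancy from 55:", 55 + alive.mean() / 12)
print("% of remaining life with disability:", 100 * dis.sum() / alive.sum())
```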

  12. Battlefield Air Interdiction: Airpower for the Future

    DTIC Science & Technology

    1980-01-01

    recommendations for the effective use of airpower for this purpose are made. A future war will probably be against the Soviet Union or one of its...emphasis will be placed upon the Soviet forces since it is likely that any future belligerence will be against the Soviet Union or one of its...offensive operations (see figure 3) stress rapid, continuous movement. Objectives are established which demand high rates of advance. A regiment, for

  13. Assessment of source probabilities for potential tsunamis affecting the U.S. Atlantic coast

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2009-01-01

    Estimating the likelihood of tsunamis occurring along the U.S. Atlantic coast critically depends on knowledge of tsunami source probability. We review available information on both earthquake and landslide probabilities from potential sources that could generate local and transoceanic tsunamis. Estimating source probability includes defining both size and recurrence distributions for earthquakes and landslides. For the former distribution, source sizes are often distributed according to a truncated or tapered power-law relationship. For the latter distribution, sources are often assumed to occur in time according to a Poisson process, simplifying the way tsunami probabilities from individual sources can be aggregated. For the U.S. Atlantic coast, earthquake tsunami sources primarily occur at transoceanic distances along plate boundary faults. Probabilities for these sources are constrained from previous statistical studies of global seismicity for similar plate boundary types. In contrast, there is presently little information constraining landslide probabilities that may generate local tsunamis. Though there is significant uncertainty in tsunami source probabilities for the Atlantic, results from this study yield a comparative analysis of tsunami source recurrence rates that can form the basis for future probabilistic analyses.
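
    A minimal sketch of how such source probabilities aggregate, assuming independent Poisson sources with illustrative annual rates (the rates below are placeholders, not the study's estimates).

```python
import numpy as np

# Hypothetical annual rates of tsunamigenic events exceeding a damaging
# size threshold, one entry per source zone (earthquake or landslide).
rates = {
    "plate-boundary EQ, zone A": 1.0e-3,
    "plate-boundary EQ, zone B": 4.0e-4,
    "submarine landslide, slope C": 1.0e-4,
}

T = 50.0  # exposure window in years

# Independent Poisson sources aggregate by summing rates, so
# P(at least one event in T years) = 1 - exp(-T * sum of rates).
total_rate = sum(rates.values())
p_any = 1.0 - np.exp(-total_rate * T)
print(f"aggregate rate: {total_rate:.2e} / yr")
print(f"P(>=1 damaging tsunami in {T:.0f} yr): {p_any:.3f}")

# Per-source contribution for a comparative recurrence analysis.
for name, lam in rates.items():
    print(f"{name}: P = {1 - np.exp(-lam * T):.4f}")
```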

  14. Probabilistic evaluation of on-line checks in fault-tolerant multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Nair, V. S. S.; Hoskote, Yatin V.; Abraham, Jacob A.

    1992-01-01

    The analysis of fault-tolerant multiprocessor systems that use concurrent error detection (CED) schemes is much more difficult than the analysis of conventional fault-tolerant architectures. Various analytical techniques have been proposed to evaluate CED schemes deterministically. However, these approaches are based on worst-case assumptions related to the failure of system components. Often, the evaluation results do not reflect the actual fault tolerance capabilities of the system. A probabilistic approach to evaluate the fault detecting and locating capabilities of on-line checks in a system is developed. The various probabilities associated with the checking schemes are identified and used in the framework of the matrix-based model. Based on these probabilistic matrices, estimates for the fault tolerance capabilities of various systems are derived analytically.
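
    A toy instance of the probabilistic matrix-based evaluation: the detection probabilities per (fault, check) pair are assumed values, and per-fault coverage follows by assuming the checks miss independently.

```python
import numpy as np

# Hypothetical detection matrix D: D[i, j] is the probability that on-line
# check j detects fault i (a worst-case deterministic model would use 0/1).
D = np.array([
    [0.90, 0.10, 0.00],
    [0.30, 0.80, 0.20],
    [0.00, 0.40, 0.70],
])

# Assuming checks miss independently, the probability that fault i escapes
# every check is the product of the misses; detection is the complement.
p_detect = 1.0 - np.prod(1.0 - D, axis=1)
print("per-fault detection probability:", np.round(p_detect, 4))

# A probabilistic system-level estimate weights faults by how likely
# each is to occur.
fault_prob = np.array([0.5, 0.3, 0.2])
print("expected coverage:", float(fault_prob @ p_detect))
```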

  15. Stimulus probability effects in absolute identification.

    PubMed

    Kent, Christopher; Lamberts, Koen

    2016-05-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  16. The Joint Distribution Process Analysis Center (JDPAC): Background and Current Capability

    DTIC Science & Technology

    2007-06-12

    Systems Integration and Data Management JDDE Analysis/Global Distribution Performance Assessment Futures/Transformation Analysis Balancing Operational Art ... Science JDPAC “101” USTRANSCOM Future Operations Center SDDC – TEA Army SES (Dual Hat) • Transportability Engineering • Other Title 10

  17. Current capabilities and future directions in computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    1986-01-01

    A summary of significant findings is given, followed by specific recommendations for future directions of emphasis for computational fluid dynamics development. The discussion is organized into three application areas: external aerodynamics, hypersonics, and propulsion - and followed by a turbulence modeling synopsis.

  18. A review of the theory of interstellar communication

    NASA Technical Reports Server (NTRS)

    Billingham, J.; Wolfe, J. H.; Oliver, B. M.

    1975-01-01

    The probability is analyzed that intelligent civilizations capable of interstellar communication exist in the galaxy. Drake's (1960) equation for the prevalence of communicative civilization is used in the calculations, and attempts are made to place limits on the search range that must be covered to contact other civilizations, the longevity of the communicative phase of such civilizations, and the possible number of two-way exchanges between civilizations in contact with each other. The minimum estimates indicate that some 100,000 civilizations probably coexist within several tens of astronomical units of each other and that some 1,000,000 probably coexist within 10 light years of each other. Attempts to detect coherent signals characteristic of intelligent life are briefly noted, including Projects Ozma and Cyclops as well as some Soviet attempts. Recently proposed American and Soviet programs for interstellar communication are outlined.

  19. What kind of students should be developed through aeronautical engineering education?

    NASA Technical Reports Server (NTRS)

    Holloway, R. B.

    1975-01-01

    The educational requirements for future aeronautical engineering students are postulated. The change in aeronautical engineering from increasing aircraft performance without regard to cost is compared with the cost effective aspects of future research. The capabilities of future engineers are discussed with respect to the following areas: (1) problem solving, (2) planning and organizing, (3) communication, and (4) professionalism.

  20. New Millenium Program Serving Earth and Space Sciences

    NASA Technical Reports Server (NTRS)

    Li, Fuk

    1999-01-01

    A cross-Enterprise program identifies and validates breakthrough flight technologies that will significantly benefit future space science and Earth science missions. The breakthrough technologies enable new capabilities to meet Earth and space science needs and reduce the costs of future missions. The flight validation mitigates risks to first users and enables rapid technology infusion into future missions.

  1. Review of the use of pretest probability for molecular testing in non-small cell lung cancer and overview of new mutations that may affect clinical practice.

    PubMed

    Martin, Petra; Leighl, Natasha B

    2017-06-01

    This article considers the use of pretest probability in non-small cell lung cancer (NSCLC) and how its use in EGFR testing has helped establish clinical guidelines on selecting patients for EGFR testing. With an ever-increasing number of molecular abnormalities being identified and often limited tissue available for testing, the use of pretest probability will need to be increasingly considered in the future for selecting investigations and treatments in patients. In addition we review new mutations that have the potential to affect clinical practice.

  2. Lightning Strike Peak Current Probabilities as Related to Space Shuttle Operations

    NASA Technical Reports Server (NTRS)

    Johnson, Dale L.; Vaughan, William W.

    2000-01-01

    A summary is presented of basic lightning characteristics/criteria applicable to current and future aerospace vehicles. The paper provides estimates of the probability of occurrence of a 200 kA peak lightning return current, should lightning strike an aerospace vehicle in various operational phases, i.e., roll-out, on-pad, launch, reentry/landing, and return to launch site. A literature search was conducted for previous work concerning the occurrence and measurement of peak lightning currents, modeling, and estimating the probabilities of launch vehicles/objects being struck by lightning. This paper presents a summary of these results.

  3. Lunar Exploration and Science in ESA

    NASA Astrophysics Data System (ADS)

    Carpenter, James; Houdou, Bérengère; Fisackerly, Richard; De Rosa, Diego; Patti, Bernardo; Schiemann, Jens; Hufenbach, Bernhard; Foing, Bernard

    2014-05-01

    ESA seeks to provide Europe with access to the lunar surface, and allow Europeans to benefit from the opening up of this new frontier, as part of a global endeavor. This will be best achieved through an exploration programme which combines the strengths and capabilities of both robotic and human explorers. ESA is preparing for future participation in lunar exploration through a combination of human and robotic activities, in cooperation with international partners. Future planned activities include the contribution of key technological capabilities to the Russian-led robotic missions Luna-Glob, Luna-Resurs orbiter, and Luna-Resurs lander. For the Luna-Resurs lander, ESA will provide analytical capabilities to complement the already selected Russian-led payload, focusing on the composition and isotopic abundances of lunar volatiles in polar regions. This should be followed by contributions at the level of mission elements to a Lunar Polar Sample Return mission. This partnership will provide access for European investigators to the opportunities offered by the Russian-led instruments on the missions, as well as providing Europe with a unique opportunity to characterize and utilize polar volatile populations. Ultimately, samples of high scientific value, from as yet unexplored and unsampled locations, shall be made available to the scientific community. These robotic activities are being performed with a view to enabling a future more comprehensive programme in which robotic and human activities are integrated to provide the maximum benefits from lunar surface access. Activities on the ISS and ESA participation in the US-led Multi-Purpose Crew Vehicle, which is planned for a first unmanned lunar flight in 2017, are also important steps towards achieving this. All of these activities are performed with a view to generating the technologies, capabilities, knowledge and heritage that will make Europe an indispensable partner in the exploration missions of the future.

  4. Process Approach to Determining Quality Inspection Deployment

    DTIC Science & Technology

    2015-06-08

    27 B.1 The Deming Rule...k1/k2? [5] At this stage it is assumed that the manufacturing process is capable and that inspection is effective. The Deming rule is explained in...justify reducing inspectors. (See Appendix B for Deming rule discussion.) Three quantities must be determined: p, the probability of a nonconformity

  5. Validating Teacher Performativity through Lifelong School-University Collaboration

    ERIC Educational Resources Information Center

    Lewis, Theodore

    2013-01-01

    The main point of this article is that more credence should be given in teacher education to performative dimensions of teaching. I agree with David Carr (1999) that the requisite capabilities are probably best learned in actual schools. I employ Turnbull's (2000) conception of performativity, which speaks of tacit cultural learning. Following…

  6. Sensing Surveillance & Navigation

    DTIC Science & Technology

    2012-03-07

    Removing Atmospheric Turbulence Goal: to restore a single high quality image from the observed sequence Prof. Peyman...Computer Sciences – Higher wavelet studies , time-scale, time-frequency transformations, Reduced Signature Targets, Low Probability of Intercept...Range Dependent Beam -patterns •Electronic Steering with Frequency Offsets •Inherent Countermeasure Capability Why? W1(t) W2(t) W3

  7. Bayesian superresolution

    NASA Astrophysics Data System (ADS)

    Isakson, Steve Wesley

    2001-12-01

    Well-known principles of physics explain why resolution restrictions occur in images produced by optical diffraction-limited systems. The limitations involved are present in all diffraction-limited imaging systems, including acoustical and microwave. In most circumstances, however, prior knowledge about the object and the imaging system can lead to resolution improvements. In this dissertation I outline a method to incorporate prior information into the process of reconstructing images to superresolve the object beyond the above limitations. This dissertation research develops the details of this methodology. The approach can provide the most-probable global solution employing a finite number of steps in both far-field and near-field images. In addition, in order to overcome the effects of noise present in any imaging system, this technique provides a weighted image that quantifies the likelihood of various imaging solutions. By utilizing Bayesian probability, the procedure is capable of incorporating prior information about both the object and the noise to overcome the resolution limitation present in many imaging systems. Finally I will present an imaging system capable of detecting the evanescent waves missing from far-field systems, thus improving the resolution further.

  8. Bayesian inference for heterogeneous caprock permeability based on above zone pressure monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Namhata, Argha; Small, Mitchell J.; Dilmore, Rober

    The presence of faults/fractures or highly permeable zones in the primary sealing caprock of a CO2 storage reservoir can result in leakage of CO2. Monitoring of leakage requires the capability to detect and resolve the onset, location, and volume of leakage in a systematic and timely manner. Pressure-based monitoring possesses such capabilities. This study demonstrates a basis for monitoring network design based on the characterization of CO2 leakage scenarios through an assessment of the integrity and permeability of the caprock inferred from above zone pressure measurements. Four representative heterogeneous fractured seal types are characterized to demonstrate seal permeability ranging from highly permeable to impermeable. Based on Bayesian classification theory, the probability of each fractured caprock scenario given above zone pressure measurements with measurement error is inferred. The sensitivity to injection rate and caprock thickness is also evaluated and the probability of proper classification is calculated. The time required to distinguish between above zone pressure outcomes and the associated leakage scenarios is also computed.
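
    A hedged sketch of the Bayesian classification step, with four assumed scenario predictions and a Gaussian measurement-error model; the pressure values and error scale are placeholders, not the study's simulation results.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical above-zone pressure buildups (MPa) predicted by four seal
# scenarios, and a Gaussian measurement-error model.
scenarios = ["impermeable", "low-k", "moderate-k", "highly permeable"]
predicted_dp = np.array([0.02, 0.15, 0.60, 1.50])
meas_sigma = 0.10
prior = np.full(4, 0.25)

def posterior(observed_dp):
    """Bayes classification of the seal scenario from one measurement."""
    like = norm.pdf(observed_dp, loc=predicted_dp, scale=meas_sigma)
    post = prior * like
    return post / post.sum()

post = posterior(observed_dp=0.55)
for s, p in zip(scenarios, post):
    print(f"P({s} | data) = {p:.3f}")
```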

  9. The Ensemble Space Weather Modeling System (eSWMS): Status, Capabilities and Challenges

    NASA Astrophysics Data System (ADS)

    Fry, C. D.; Eccles, J. V.; Reich, J. P.

    2010-12-01

    Marking a milestone in space weather forecasting, the Space Weather Modeling System (SWMS) successfully completed validation testing in advance of operational testing at Air Force Weather Agency’s primary space weather production center. This is the first coupling of stand-alone, physics-based space weather models that are currently in operations at AFWA supporting the warfighter. Significant development effort went into ensuring the component models were portable and scalable while maintaining consistent results across diverse high performance computing platforms. Coupling was accomplished under the Earth System Modeling Framework (ESMF). The coupled space weather models are the Hakamada-Akasofu-Fry version 2 (HAFv2) solar wind model and GAIM1, the ionospheric forecast component of the Global Assimilation of Ionospheric Measurements (GAIM) model. The SWMS was developed by team members from AFWA, Explorations Physics International, Inc. (EXPI) and Space Environment Corporation (SEC). The successful development of the SWMS provides new capabilities beyond enabling extended lead-time, data-driven ionospheric forecasts. These include ingesting diverse data sets at higher resolution, incorporating denser computational grids at finer time steps, and performing probability-based ensemble forecasts. Work of the SWMS development team now focuses on implementing the ensemble-based probability forecast capability by feeding multiple scenarios of 5 days of solar wind forecasts to the GAIM1 model based on the variation of the input fields to the HAFv2 model. The ensemble SWMS (eSWMS) will provide the most-likely space weather scenario with uncertainty estimates for important forecast fields. The eSWMS will allow DoD mission planners to consider the effects of space weather on their systems with more advance warning than is currently possible. The payoff is enhanced, tailored support to the warfighter with improved capabilities, such as point-to-point HF propagation forecasts, single-frequency GPS error corrections, and high cadence, high-resolution Space Situational Awareness (SSA) products. We present the current status of eSWMS, its capabilities, limitations and path of transition to operational use.

  10. Mammalian Toxicology Testing: Problem Definition Study. Capability Modules.

    DTIC Science & Technology

    1981-04-01

    ...immediate to ten or more years in the future. Of the many assumptions used, a major one was not to design the Facility for a specific capability, capacity or... design flexibility and to establish capability option. This will enable the Army's decision-makers the greatest latitude in selecting the final

  11. The Army’s Future Combat System (FCS): Background and Issues for Congress

    DTIC Science & Technology

    2008-05-12

    Operational Capability 2017 The full attainment of the capability to employ the system, including a fully manned, equipped, trained, and logistically...readiness crisis ,” and that “how FCS funding fits into that equation is yet to be determined.”24 Representative Murtha, Chairman of the House Appropriations...Oversight,” InsideDefense.com, May 7, 2008. 52 Ibid. 53 Ann Roosevelt, “FCS Incurs Schedule Breach, Operational Capability Slips to 2017 ,” Defense Daily

  12. Temperature and tree growth [editorial

    Treesearch

    Michael G. Ryan

    2010-01-01

    Tree growth helps US forests take up 12% of the fossil fuels emitted in the USA (Woodbury et al. 2007), so predicting tree growth for future climates matters. Predicting future climates themselves is uncertain, but climate scientists probably have the most confidence in predictions for temperature. Temperatures are projected to rise by 0.2 °C in the next two decades,...

  13. 20 Years of Research into Violence and Trauma: Past and Future Developments

    ERIC Educational Resources Information Center

    Kamphuis, Jan H.; Emmelkamp, Paul M. G.

    2005-01-01

    This reflection on major developments in the past, present, and future of the wider field of violence and trauma is a personal (and probably biased) sampling of what the authors hold to be important. The authors reviewed advances for victims and perpetrators of violence separately. For victims, the authors note that empirical research has…

  14. Deciphering the MSSM Higgs mass at future hadron colliders

    DOE PAGES

    Agrawal, Prateek; Fan, JiJi; Reece, Matthew; ...

    2017-06-06

    Here, future hadron colliders will have a remarkable capacity to discover massive new particles, but their capabilities for precision measurements of couplings that can reveal underlying mechanisms have received less study. In this work we study the capability of future hadron colliders to shed light on a precise, focused question: is the Higgs mass of 125 GeV explained by the MSSM? If supersymmetry is realized near the TeV scale, a future hadron collider could produce huge numbers of gluinos and electroweakinos. We explore whether precision measurements of their properties could allow inference of the scalar masses and tan β with sufficient accuracy to test whether physics beyond the MSSM is needed to explain the Higgs mass. We also discuss dark matter direct detection and precision Higgs physics as complementary probes of tan β. For concreteness, we focus on the mini-split regime of MSSM parameter space at a 100 TeV pp collider, with scalar masses ranging from 10s to about 1000 TeV.

  15. A Capabilities Approach to Higher Education: Geocapabilities and Implications for Geography Curricula

    ERIC Educational Resources Information Center

    Walkington, Helen; Dyer, Sarah; Solem, Michael; Haigh, Martin; Waddington, Shelagh

    2018-01-01

    A geographical education offers more than skills, subject knowledge and generic attributes. It also develops a set of discipline-specific capabilities that contribute to a graduate's future learning and experience, granting them special ways of thinking for lifelong development and for contributing to the welfare of themselves, their community and…

  16. Skills, Capabilities and Inequalities at School Entry in a Disadvantaged Community

    ERIC Educational Resources Information Center

    Doyle, Orla; McEntee, Louise; McNamara, Kelly A.

    2012-01-01

    Socioeconomic inequalities in children's skills and capabilities begin early in life and can have detrimental effects on future success in school. The present study examined the relationships between school readiness and socioeconomic (SES) inequalities using teacher reports of the Short Early Development Instrument (Janus et al. 2005) in a…

  17. Gap Analysis: Rethinking the Conceptual Foundations

    DTIC Science & Technology

    2008-01-30

    Gap Analysis Background ... Research ...which we are intending, then there could exist a basis for gap in capability and therefore a desire to close the capability gap. What you desire...future reality that can be formulated, definitized, and established or constructed. But, Gap Analysis is not intended to close the space between the

  18. Human Capability, Mild Perfectionism and Thickened Educational Praxis

    ERIC Educational Resources Information Center

    Walker, Melanie

    2008-01-01

    This paper argues for a mild perfectionism in applying Amartya Sen's capability approach for an education transformative of student agency and well-being. Key to the paper is the significance of education as a process of being and becoming in the future, and education's fundamental objective of a positively changed human being. The capability…

  19. Investments by NASA to build planetary protection capability

    NASA Astrophysics Data System (ADS)

    Buxbaum, Karen; Conley, Catharine; Lin, Ying; Hayati, Samad

    NASA continues to invest in capabilities that will enable or enhance planetary protection planning and implementation for future missions. These investments are critical to the Mars Exploration Program and will be increasingly important as missions are planned for exploration of the outer planets and their icy moons. Since the last COSPAR Congress, there has been an opportunity to respond to the advice of NRC-PREVCOM and the analysis of the MEPAG Special Regions Science Analysis Group. This stimulated research into such things as expanded bioburden reduction options, modern molecular assays and genetic inventory capability, and approaches to understand or avoid recontamination of spacecraft parts and samples. Within NASA, a portfolio of PP research efforts has been supported through the NASA Office of Planetary Protection, the Mars Technology Program, and the Mars Program Office. The investment strategy focuses on technology investments designed to enable future missions and reduce their costs. In this presentation we will provide an update on research and development supported by NASA to enhance planetary protection capability. Copyright 2008 California Institute of Technology. Government sponsorship acknowledged.

  20. Power system voltage stability and agent based distribution automation in smart grid

    NASA Astrophysics Data System (ADS)

    Nguyen, Cuong Phuc

    2011-12-01

    Our interconnected electric power system is presently facing many challenges that it was not originally designed and engineered to handle. The increased inter-area power transfers, aging infrastructure, and old technologies, have caused many problems including voltage instability, widespread blackouts, slow control response, among others. These problems have created an urgent need to transform the present electric power system to a highly stable, reliable, efficient, and self-healing electric power system of the future, which has been termed "smart grid". This dissertation begins with an investigation of voltage stability in bulk transmission networks. A new continuation power flow tool for studying the impacts of generator merit order based dispatch on inter-area transfer capability and static voltage stability is presented. The load demands are represented by lumped load models on the transmission system. While this representation is acceptable in traditional power system analysis, it may not be valid in the future smart grid where the distribution system will be integrated with intelligent and quick control capabilities to mitigate voltage problems before they propagate into the entire system. Therefore, before analyzing the operation of the whole smart grid, it is important to understand the distribution system first. The second part of this dissertation presents a new platform for studying and testing emerging technologies in advanced Distribution Automation (DA) within smart grids. Due to the key benefits over the traditional centralized approach, namely flexible deployment, scalability, and avoidance of single-point-of-failure, a new distributed approach is employed to design and develop all elements of the platform. A multi-agent system (MAS), which has the three key characteristics of autonomy, local view, and decentralization, is selected to implement the advanced DA functions. The intelligent agents utilize a communication network for cooperation and negotiation. Communication latency is modeled using a user-defined probability density function. Failure-tolerant communication strategies are developed for agent communications. Major elements of advanced DA are developed in a completely distributed way and successfully tested for several IEEE standard systems, including: Fault Detection, Location, Isolation, and Service Restoration (FLISR); Coordination of Distributed Energy Storage Systems (DES); Distributed Power Flow (DPF); Volt-VAR Control (VVC); and Loss Reduction (LR).
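
    A tiny sketch of two of the communication ideas mentioned above, with every name and number a placeholder: agent message latency drawn from a user-defined distribution and a simple failure-tolerant retry loop.

```python
import random

def latency_sampler():
    """User-defined latency model: lognormal delay in milliseconds
    (an assumed stand-in for the dissertation's user-supplied PDF)."""
    return random.lognormvariate(mu=3.0, sigma=0.5)   # median ~20 ms

def send_with_retries(deliver_prob=0.9, max_retries=3, timeout_ms=100.0):
    """Retry until a message arrives within the timeout or retries run out."""
    for attempt in range(1, max_retries + 1):
        delay = latency_sampler()
        delivered = random.random() < deliver_prob
        if delivered and delay <= timeout_ms:
            return attempt, delay
    return None, None   # escalate, e.g. reroute via a neighbouring agent

random.seed(7)
attempt, delay = send_with_retries()
print("delivered on attempt", attempt, "after",
      None if delay is None else round(delay, 1), "ms")
```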

  1. Advanced E-O test capability for Army Next-Generation Automated Test System (NGATS)

    NASA Astrophysics Data System (ADS)

    Errea, S.; Grigor, J.; King, D. F.; Matis, G.; McHugh, S.; McKechnie, J.; Nehring, B.

    2015-05-01

    The Future E-O (FEO) program was established to develop a flexible, modular, automated test capability as part of the Next Generation Automatic Test System (NGATS) program to support the test and diagnostic needs of currently fielded U.S. Army electro-optical (E-O) devices, while being expandable to address the requirements of future Navy, Marine Corps, and Air Force E-O systems. Santa Barbara Infrared (SBIR) has designed, fabricated, and delivered three (3) prototype FEO systems for engineering and logistics evaluation prior to anticipated full-scale production beginning in 2016. In addition to presenting a detailed overview of the FEO system hardware design, features, and testing capabilities, the integration of SBIR's EO-IR sensor and laser test software package, IRWindows 4™, into FEO to automate test execution, data collection and analysis, and the archiving and reporting of results is also described.

  2. Advances in computer-aided well-test interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horne, R.N.

    1994-07-01

    Despite the feeling expressed several times over the past 40 years that well-test analysis had reached its peak development, an examination of recent advances shows continuous expansion in capability, with future improvement likely. The expansion in interpretation capability over the past decade arose mainly from the development of computer-aided techniques, which, although introduced 20 years ago, have come into use only recently. The broad application of computer-aided interpretation originated with the improvement of the methodologies and continued with the expansion in computer access and capability that accompanied the explosive development of the microcomputer industry. This paper focuses on the different pieces of the methodology that combine to constitute a computer-aided interpretation and attempts to compare some of the approaches currently used. Future directions of the approach are also discussed. The separate areas discussed are deconvolution, pressure derivatives, model recognition, nonlinear regression, and confidence intervals.
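
    As an example of one building block named above, the sketch below computes a Bourdet-style pressure derivative dp/d ln t, the standard diagnostic transform behind derivative analysis; the drawdown data are synthetic and the implementation is a simplified illustration, not the paper's software.

```python
import numpy as np

def bourdet_derivative(t, p):
    """Return dp/d ln(t) at interior points of (t, p) drawdown data,
    using the Bourdet three-point weighted difference in log time."""
    lnt = np.log(t)
    dp = np.empty(len(t) - 2)
    for i in range(1, len(t) - 1):
        dl, dr = lnt[i] - lnt[i - 1], lnt[i + 1] - lnt[i]
        left = (p[i] - p[i - 1]) / dl
        right = (p[i + 1] - p[i]) / dr
        dp[i - 1] = (left * dr + right * dl) / (dl + dr)
    return t[1:-1], dp

# Synthetic infinite-acting radial flow: p = m*ln(t) + c gives a flat
# derivative equal to m, the classic diagnostic signature.
t = np.logspace(-2, 2, 40)
p = 5.0 * np.log(t) + 100.0
_, dp = bourdet_derivative(t, p)
print("derivative plateau ~", dp.mean())   # close to 5.0
```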

  3. Workshop on the Use of Future Multispectral Imaging Capabilities for Lithologic Mapping: Workshop summary

    NASA Technical Reports Server (NTRS)

    Settle, M.; Adams, J.

    1982-01-01

    Improved orbital imaging capabilities from the standpoint of different scientific disciplines, such as geology, botany, hydrology, and geography, were evaluated. A discussion on how geologists might exploit the anticipated measurement capabilities of future orbital imaging systems to discriminate and characterize different types of geologic materials exposed at the Earth's surface is presented. Principal objectives are to summarize past accomplishments in the use of multispectral imaging techniques for lithologic mapping; to identify critical gaps in earlier research efforts that currently limit the ability to extract useful information about the physical and chemical characteristics of geological materials from orbital multispectral surveys; and to define major thresholds of resolution and sensitivity within the visible and infrared portions of the electromagnetic spectrum which, if achieved, would result in significant improvement in our ability to discriminate and characterize different geological materials exposed at the Earth's surface.

  4. Bird Radar Validation in the Field by Time-Referencing Line-Transect Surveys

    PubMed Central

    Dokter, Adriaan M.; Baptist, Martin J.; Ens, Bruno J.; Krijgsveld, Karen L.; van Loon, E. Emiel

    2013-01-01

    Track-while-scan bird radars are widely used in ornithological studies, but often the precise detection capabilities of these systems are unknown. Quantification of radar performance is essential to avoid observational biases, which requires practical methods for validating a radar’s detection capability in specific field settings. In this study a method to quantify the detection capability of a bird radar is presented, as well a demonstration of this method in a case study. By time-referencing line-transect surveys, visually identified birds were automatically linked to individual tracks using their transect crossing time. Detection probabilities were determined as the fraction of the total set of visual observations that could be linked to radar tracks. To avoid ambiguities in assigning radar tracks to visual observations, the observer’s accuracy in determining a bird’s transect crossing time was taken into account. The accuracy was determined by examining the effect of a time lag applied to the visual observations on the number of matches found with radar tracks. Effects of flight altitude, distance, surface substrate and species size on the detection probability by the radar were quantified in a marine intertidal study area. Detection probability varied strongly with all these factors, as well as species-specific flight behaviour. The effective detection range for single birds flying at low altitude for an X-band marine radar based system was estimated at ∼1.5 km. Within this range the fraction of individual flying birds that were detected by the radar was 0.50±0.06 with a detection bias towards higher flight altitudes, larger birds and high tide situations. Besides radar validation, which we consider essential when quantification of bird numbers is important, our method of linking radar tracks to ground-truthed field observations can facilitate species-specific studies using surveillance radars. The methodology may prove equally useful for optimising tracking algorithms. PMID:24066103

  5. Bird radar validation in the field by time-referencing line-transect surveys.

    PubMed

    Dokter, Adriaan M; Baptist, Martin J; Ens, Bruno J; Krijgsveld, Karen L; van Loon, E Emiel

    2013-01-01

    Track-while-scan bird radars are widely used in ornithological studies, but often the precise detection capabilities of these systems are unknown. Quantification of radar performance is essential to avoid observational biases, which requires practical methods for validating a radar's detection capability in specific field settings. In this study a method to quantify the detection capability of a bird radar is presented, as well a demonstration of this method in a case study. By time-referencing line-transect surveys, visually identified birds were automatically linked to individual tracks using their transect crossing time. Detection probabilities were determined as the fraction of the total set of visual observations that could be linked to radar tracks. To avoid ambiguities in assigning radar tracks to visual observations, the observer's accuracy in determining a bird's transect crossing time was taken into account. The accuracy was determined by examining the effect of a time lag applied to the visual observations on the number of matches found with radar tracks. Effects of flight altitude, distance, surface substrate and species size on the detection probability by the radar were quantified in a marine intertidal study area. Detection probability varied strongly with all these factors, as well as species-specific flight behaviour. The effective detection range for single birds flying at low altitude for an X-band marine radar based system was estimated at ~1.5 km. Within this range the fraction of individual flying birds that were detected by the radar was 0.50 ± 0.06 with a detection bias towards higher flight altitudes, larger birds and high tide situations. Besides radar validation, which we consider essential when quantification of bird numbers is important, our method of linking radar tracks to ground-truthed field observations can facilitate species-specific studies using surveillance radars. The methodology may prove equally useful for optimising tracking algorithms.
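
    A simplified sketch of the time-referencing match on synthetic data: visual crossing times are linked to radar tracks within a tolerance, the detection probability is the matched fraction, and scanning a clock lag mimics the paper's accuracy check. All data below are simulated, not field observations.

```python
import numpy as np

def detection_probability(visual_times, radar_times, tol_s=10.0):
    """Fraction of visual observations with a radar track crossing
    within tol_s seconds of the recorded transect-crossing time."""
    radar = np.sort(np.asarray(radar_times))
    matched = 0
    for t in visual_times:
        i = np.searchsorted(radar, t)
        nearest = min(abs(radar[j] - t)
                      for j in (i - 1, i) if 0 <= j < len(radar))
        matched += nearest <= tol_s
    return matched / len(visual_times)

rng = np.random.default_rng(3)
visual = np.sort(rng.uniform(0, 3600, 80))        # 80 sightings in one hour
detected = rng.random(80) < 0.5                   # radar sees about half
radar = visual[detected] + rng.normal(0, 3, detected.sum())
print("estimated detection probability:",
      detection_probability(visual, radar))

# Scanning a time lag applied to the visual clock reveals the observers'
# timing accuracy as the lag that maximises the number of matches.
lags = np.arange(-30, 31, 5)
print([round(detection_probability(visual + lag, radar), 2) for lag in lags])
```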

  6. Identification, prediction, and mitigation of sinkhole hazards in evaporite karst areas

    USGS Publications Warehouse

    Gutierrez, F.; Cooper, A.H.; Johnson, K.S.

    2008-01-01

    Sinkholes usually have a higher probability of occurrence and a greater genetic diversity in evaporite terrains than in carbonate karst areas. This is because evaporites have a higher solubility and, commonly, a lower mechanical strength. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are only well understood in a few areas. To deal with these hazards, a phased approach is needed for sinkhole identification, investigation, prediction, and mitigation. Identification techniques include field surveys and geomorphological mapping combined with accounts from local people and historical sources. Detailed sinkhole maps can be constructed from sequential historical maps, recent topographical maps, and digital elevation models (DEMs) complemented with building-damage surveying, remote sensing, and high-resolution geodetic surveys. On a more detailed level, information from exposed paleosubsidence features (paleokarst), speleological explorations, geophysical investigations, trenching, dating techniques, and boreholes may help in investigating dissolution and subsidence features. Information on the hydrogeological pathways including caves, springs, and swallow holes are particularly important especially when corroborated by tracer tests. These diverse data sources make a valuable database: the karst inventory. From this dataset, sinkhole susceptibility zonations (relative probability) may be produced based on the spatial distribution of the features and good knowledge of the local geology. Sinkhole distribution can be investigated by spatial distribution analysis techniques including studies of preferential elongation, alignment, and nearest neighbor analysis. More objective susceptibility models may be obtained by analyzing the statistical relationships between the known sinkholes and the conditioning factors. Chronological information on sinkhole formation is required to estimate the probability of occurrence of sinkholes (number of sinkholes/km²·year). Such spatial and temporal predictions, frequently derived from limited records and based on the assumption that past sinkhole activity may be extrapolated to the future, are non-corroborated hypotheses. Validation methods allow us to assess the predictive capability of the susceptibility maps and to transform them into probability maps. Avoiding the most hazardous areas by preventive planning is the safest strategy for development in sinkhole-prone areas. Corrective measures could be applied to reduce the dissolution activity and subsidence processes. A more practical solution for safe development is to reduce the vulnerability of the structures by using subsidence-proof designs. © 2007 Springer-Verlag.

  7. No food for thought: moderating effects of delay discounting and future time perspective on the relation between income and food insecurity.

    PubMed

    Epstein, Leonard H; Jankowiak, Noelle; Lin, Henry; Paluch, Rocco; Koffarnus, Mikhail N; Bickel, Warren K

    2014-09-01

    Low income is related to food insecurity, and research has suggested that a scarcity of resources associated with low income can shift attention to the present, thereby discounting the future. We tested whether attending to the present and discounting the future may moderate the influence of income on food insecurity. Delay discounting and measures of future time perspective (Zimbardo Time Perspective Inventory, Consideration of Future Consequences Scale, time period of financial planning, and subjective probability of living to age 75 y) were studied as moderators of the relation between income and food insecurity in a diverse sample of 975 adults, 31.8% of whom experienced some degree of food insecurity. Income, financial planning, subjective probability of living to age 75 y, and delay discounting predicted both food insecurity and high food insecurity. Three-way interactions showed that delay discounting interacted with financial planning and income to predict food insecurity (P = 0.003). At lower levels of income, food insecurity was lowest for subjects who had good financial planning skills and did not discount the future, whereas having good financial skills and discounting the future had minimal influence on food insecurity. The same 3-way interaction was observed when high food insecurity was predicted (P = 0.008). Because of the role of scarce resources in narrowing attention and reducing prospective thinking, research should address whether modifying future orientation may reduce food insecurity even in the face of diminishing financial resources. © 2014 American Society for Nutrition.
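
    The moderation analysis is a three-way interaction in a regression on food insecurity. A minimal sketch of such a model (simulated data and hypothetical variable names, not the study's dataset) using Python and statsmodels:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 975
        df = pd.DataFrame({
            "income": rng.normal(size=n),       # standardized income
            "planning": rng.normal(size=n),     # financial planning skill
            "discounting": rng.normal(size=n),  # delay discounting rate
        })
        # Simulate a binary food-insecurity outcome with a 3-way interaction
        logit_p = -1 - df.income + 0.5 * df.income * df.planning * df.discounting
        df["insecure"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        # 'a * b * c' expands to all main effects plus 2- and 3-way interactions
        model = smf.logit("insecure ~ income * planning * discounting", data=df).fit()
        print(model.summary())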

  8. Risk-based water resources planning: Incorporating probabilistic nonstationary climate uncertainties

    NASA Astrophysics Data System (ADS)

    Borgomeo, Edoardo; Hall, Jim W.; Fung, Fai; Watts, Glenn; Colquhoun, Keith; Lambert, Chris

    2014-08-01

    We present a risk-based approach for incorporating nonstationary probabilistic climate projections into long-term water resources planning. The proposed methodology uses nonstationary synthetic time series of future climates obtained via a stochastic weather generator based on the UK Climate Projections (UKCP09) to construct a probability distribution of the frequency of water shortages in the future. The UKCP09 projections extend well beyond the range of current hydrological variability, providing the basis for testing the robustness of water resources management plans to future climate-related uncertainties. The nonstationary nature of the projections combined with the stochastic simulation approach allows for extensive sampling of climatic variability conditioned on climate model outputs. The probability of exceeding planned frequencies of water shortages of varying severity (defined as Levels of Service for the water supply utility company) is used as a risk metric for water resources planning. Different sources of uncertainty, including demand-side uncertainties, are considered simultaneously and their impact on the risk metric is evaluated. Supply-side and demand-side management strategies can be compared based on how cost-effective they are at reducing risks to acceptable levels. A case study based on a water supply system in London (UK) is presented to illustrate the methodology. Results indicate an increase in the probability of exceeding the planned Levels of Service across the planning horizon. Under a 1% per annum population growth scenario, the probability of exceeding the planned Levels of Service is as high as 0.5 by 2040. The case study also illustrates how a combination of supply and demand management options may be required to reduce the risk of water shortages.
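
    The risk metric reduces to the probability that the simulated shortage frequency exceeds the planned Level of Service. A minimal Monte Carlo sketch in Python (with a stand-in simulator and an assumed threshold; the study drives a full water-resource system model with UKCP09-based synthetic weather series):

        import numpy as np

        rng = np.random.default_rng(42)

        def shortage_frequency(demand_growth, rng):
            # Stand-in for one run of the supply system over the planning horizon
            # under one synthetic nonstationary climate series: returns the
            # simulated frequency of shortage events per decade (assumed units).
            climate_stress = rng.gamma(shape=2.0, scale=0.5)
            return climate_stress * (1 + demand_growth) ** 25

        planned_los = 2.0  # planned shortages per decade (assumed)
        runs = [shortage_frequency(0.01, rng) for _ in range(10_000)]
        risk = np.mean([f > planned_los for f in runs])
        print(f"P(exceeding planned Level of Service) ≈ {risk:.2f}")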

  9. Micro- and nano-NDE systems for aircraft: great things in small packages

    NASA Astrophysics Data System (ADS)

    Malas, James C.; Kropas-Hughes, Claudia V.; Blackshire, James L.; Moran, Thomas; Peeler, Deborah; Frazier, W. G.; Parker, Danny

    2003-07-01

    Recent advancements in small, microscopic NDE sensor technologies will revolutionize how aircraft maintenance is done, and will significantly improve the reliability and airworthiness of current and future aircraft systems. A variety of micro/nano systems and concepts are being developed that will enable whole new capabilities for detecting and tracking structural integrity damage. For aging aircraft systems, the impact of micro-NDE sensor technologies will be felt immediately, with dramatic reductions in labor for maintenance, and extended usable life of critical components being two of the primary benefits. For the fleet management of future aircraft systems, a comprehensive evaluation and tracking of vehicle health throughout its entire life cycle will be needed. Indeed, micro/nano NDE systems will be instrumental in realizing this futuristic vision. Several major challenges will need to be addressed, however, before micro- and nano-NDE systems can effectively be implemented, and this will require interdisciplinary research approaches, and a systematic engineering integration of the new technologies into real systems. Future research will need to emphasize systems engineering approaches for designing materials and structures with in-situ inspection and prognostic capabilities. Recent advances in 1) embedded/add-on micro-sensors, 2) computer modeling of nondestructive evaluation responses, and 3) wireless communications are important steps toward this goal, and will ultimately provide previously unimagined opportunities for realizing whole new integrated vehicle health monitoring capabilities. The future use of micro/nano NDE technologies as vehicle health monitoring tools will have profound implications, and will provide a revolutionary way of doing NDE in the near and distant future.

  10. Multimission airborne radar for the 1990s

    NASA Astrophysics Data System (ADS)

    Robinson, Thomas H.

    1986-07-01

    The continuing trend towards the development and production of aircraft capable of multiple missions indicates that future airborne radars must provide a broad spectrum of air-to-air and air-to-ground modes. This paper investigates the modal and functional requirements of a multimode radar projected for the mid-1990s period. The paper is divided into two sections. In the first, the multimission capabilities of current radars are presented to establish trends and capabilities. In the second, the requirements of the next generation system are established. Current multimode radars lay the basis for future systems. The experience gained on the APG-65 and APG-63/70 radars is presented and conclusions are drawn regarding their impact on future system requirements. Not only are modes and performance reviewed for these radars but also their system architecture. The discussion starts with the APG-65 radar, which is the first true multimission radar with programmable signal and data processing. Following this, the evolution of the APG-63 radar, culminating with the most recent upgrading resulting in redesignation as APG-70, is presented. The incorporation of air-to-ground capabilities in the APG-70, resulting from the Dual Role Fighter program, is reviewed. Results from the Advanced Fighter Capabilities Demonstration program are presented showing how high resolution SAR was incorporated into a full weapon delivery solution. The specific radar requirements for the radar system of the next decade are then developed. This development is done in two parts. First, mode requirements are synthesized for air superiority, navigation and strike/interdiction operation. This includes low altitude penetration requirements and a review of the radar timeline constraints which arise. Second, the fundamental functional requirements needed to implement the mode requirements are explored. Architectural issues and their impact on reliability and sustainability are also considered.

  11. Variable/Multispeed Rotorcraft Drive System Concepts

    NASA Technical Reports Server (NTRS)

    Stevens, Mark A.; Handschuh, Robert F.; Lewicki, David G.

    2009-01-01

    Several recent studies for advanced rotorcraft have identified the need for variable, or multispeed-capable rotors. A speed change of up to 50 percent has been proposed for future rotorcraft to improve vehicle performance. Varying rotor speed during flight not only requires a rotor capable of performing effectively over the extended operation speed and load range, but also requires an advanced propulsion system to provide the required speed changes. A study has been completed which investigated possible drive system arrangements that can accommodate up to a 50 percent speed change. These concepts are presented. The most promising configurations are identified and will be developed for future validation testing.

  12. Sustaining PICA for Future NASA Robotic Science Missions Including NF-4 and Discovery

    NASA Technical Reports Server (NTRS)

    Stackpoole, Mairead; Venkatapathy, Ethiraj; Violette, Steve

    2018-01-01

    Phenolic Impregnated Carbon Ablator (PICA), invented in the mid-1990s, is a low-density ablative thermal protection material proven capable of meeting sample return mission needs from the moon, asteroids, comets and other unrestricted class V destinations, as well as for Mars. Its low density and efficient performance characteristics have proven effective for use from Discovery- to Flagship-class missions. It is important that NASA maintain this thermal protection material capability and ensure its availability for future NASA use. The rayon-based carbon precursor raw material used in PICA preform manufacturing has experienced multiple supply chain issues and required replacement and requalification at least twice in the past 25 years; a third substitution is now needed. The carbon precursor replacement challenge is twofold: the first is finding a long-term replacement for the current rayon, and the second is assessing its future availability periodically to ensure it is sustainable and to be alerted if additional replacement efforts need to be initiated. This paper reviews current PICA sustainability activities to identify a rayon replacement and to establish that the capability of the new PICA derived from an alternative precursor is in family with previous versions.

  13. Optimization of Second Fault Detection Thresholds to Maximize Mission POS

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan

    2018-01-01

    In order to support manned spaceflight safety requirements, the Space Launch System (SLS) has defined program-level requirements for key systems to ensure successful operation under single fault conditions. To accommodate this with regard to navigation, the SLS utilizes an internally redundant Inertial Navigation System (INS) with built-in capability to detect, isolate, and recover from first failure conditions and still maintain adherence to performance requirements. The unit utilizes multiple hardware- and software-level techniques to enable detection, isolation, and recovery from these events in terms of its built-in Fault Detection, Isolation, and Recovery (FDIR) algorithms. Successful operation is defined in terms of sufficient navigation accuracy at insertion while operating under worst-case single sensor outages (gyroscope and accelerometer faults at launch). In addition to first fault detection and recovery, the SLS program has also levied requirements relating to the capability of the INS to detect a second fault, tracking any unacceptable uncertainty in knowledge of the vehicle's state. This detection functionality is required in order to feed abort analysis and ensure crew safety. Increases in navigation state error and sensor faults can drive the vehicle outside of its operational as-designed environments and outside of its performance envelope, causing loss of mission or, worse, loss of crew. The criteria for operation under second faults allow for a larger set of achievable missions in terms of potential fault conditions, due to the INS operating at the edge of its capability. As this performance is defined and controlled at the vehicle level, it allows for the use of system-level margins to increase probability of mission success on the operational edges of the design space. Due to the implications of the vehicle response to abort conditions (such as a potentially failed INS), it is important to consider a wide range of failure scenarios in terms of both magnitude and time. As such, the Navigation team is taking advantage of the INS's capability to schedule and change fault detection thresholds in flight. These values are optimized along a nominal trajectory in order to maximize probability of mission success and to reduce the probability of false positives (defined as cases where the INS would report a second fault condition, resulting in loss of mission, even though the vehicle would still meet insertion requirements within system-level margins). This paper describes an optimization approach using Genetic Algorithms to tune the threshold parameters to maximize vehicle resilience to second fault events as a function of potential fault magnitude and time of fault over an ascent mission profile. The analysis approach and performance assessment of the results are presented to demonstrate the applicability of this process to second fault detection and to maximizing mission probability of success.
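
    The structure of the threshold-tuning problem can be sketched as follows (hypothetical fitness model and parameters; the flight analysis relies on full trajectory and fault-injection simulations, not these stand-in curves): a genetic algorithm searches for per-phase detection thresholds that trade missed second faults against false positives.

        import random

        PHASES = 4  # flight phases with independently scheduled thresholds (assumed)

        def fitness(thresholds):
            # Stand-in objective: low thresholds catch real faults but raise the
            # false-positive rate; high thresholds do the reverse (assumed shapes).
            p_missed = sum(t / (t + 1.0) for t in thresholds) / PHASES
            p_false = sum(1.0 / (1.0 + 5.0 * t) for t in thresholds) / PHASES
            return 1.0 - (0.5 * p_missed + 0.5 * p_false)  # proxy for P(success)

        def evolve(pop_size=50, generations=100):
            pop = [[random.uniform(0.1, 5.0) for _ in range(PHASES)]
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                parents = pop[: pop_size // 2]          # elitist selection
                children = []
                for _ in range(pop_size - len(parents)):
                    a, b = random.sample(parents, 2)
                    child = [random.choice(pair) for pair in zip(a, b)]  # crossover
                    i = random.randrange(PHASES)
                    child[i] *= random.uniform(0.8, 1.2)                 # mutation
                    children.append(child)
                pop = parents + children
            return max(pop, key=fitness)

        best = evolve()
        print("best thresholds:", [round(t, 2) for t in best])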

  14. More than Just Convenient: The Scientific Merits of Homogeneous Convenience Samples

    PubMed Central

    Jager, Justin; Putnick, Diane L.; Bornstein, Marc H.

    2017-01-01

    Despite their disadvantaged generalizability relative to probability samples, non-probability convenience samples are the standard within developmental science, and likely will remain so because probability samples are cost-prohibitive and most available probability samples are ill-suited to examine developmental questions. In lieu of focusing on how to eliminate or sharply reduce reliance on convenience samples within developmental science, here we propose how to augment their advantages when it comes to understanding population effects as well as subpopulation differences. Although all convenience samples have less clear generalizability than probability samples, we argue that homogeneous convenience samples have clearer generalizability relative to conventional convenience samples. Therefore, when researchers are limited to convenience samples, they should consider homogeneous convenience samples as a positive alternative to conventional (or heterogeneous) convenience samples. We discuss future directions as well as potential obstacles to expanding the use of homogeneous convenience samples in developmental science. PMID:28475254

  15. Collaborative Learning Environments for Management Education.

    ERIC Educational Resources Information Center

    Seufert, Sabine; Seufert, Andreas

    Confronted with the pressure of a rapidly changing environment, organizations demand new skills and capabilities of future managers. These demands and the findings of learning theory necessitate a corresponding change in education of tomorrow's managers. Future management education requires a balance between the imparting of knowledge to the…

  16. The Cheshire Jet: Harnessing Metamaterials to Achieve an Optical Stealth Capability

    DTIC Science & Technology

    2010-07-01

    Prometheus Books, 2005. Hamilton, Dave. Address. 23rd Annual National Test and Evaluation Conference, Hilton Head Island, SC, 15 March 2007. http...ment 2008: Challenges and Implications for the Future Joint Force. Norfolk, VA: Center for Joint Futures, 25 November 2008. Valentine, Jason, Shuang

  17. Quantifying the tracking capability of space-based AIS systems

    NASA Astrophysics Data System (ADS)

    Skauen, Andreas Nordmo

    2016-01-01

    The Norwegian Defence Research Establishment (FFI) has operated three Automatic Identification System (AIS) receivers in space. Two are on dedicated nano-satellites, AISSat-1 and AISSat-2. The third, the NORAIS Receiver, was installed on the International Space Station. A general method for calculating the upper bound on the tracking capability of a space-based AIS system has been developed, and results are presented for AISSat-1 and the NORAIS Receiver individually, as well as for a constellation of AISSat-1 and AISSat-2. The tracking capability is defined as the probability of re-detecting ships as they move around the globe and represents an upper bound on space-based AIS system performance. AISSat-1 and AISSat-2 operate on the nominal AIS1 and AIS2 channels, while the NORAIS Receiver data used are from operations on the dedicated space AIS channels, AIS3 and AIS4. The improved tracking capability of operations on the space AIS channels is presented.
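
    A toy model of why re-detection probability is an upper bound (assumed per-pass detection probability and pass count, not FFI's actual coverage data): the probability of re-detecting a ship within a time window is one minus the probability that every satellite pass in that window misses it.

        def redetection_probability(p_detect_per_pass, passes_in_window):
            # Upper bound: assumes the ship stays inside the sensor footprint on
            # every pass and that pass outcomes are independent.
            return 1.0 - (1.0 - p_detect_per_pass) ** passes_in_window

        # e.g. 0.6 detection probability per pass, 4 passes in 24 h (illustrative)
        print(redetection_probability(0.6, 4))  # ≈ 0.974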

  18. Neural response to reward anticipation under risk is nonlinear in probabilities.

    PubMed

    Hsu, Ming; Krajbich, Ian; Zhao, Chen; Camerer, Colin F

    2009-02-18

    A widely observed phenomenon in decision making under risk is the apparent overweighting of unlikely events and the underweighting of nearly certain events. This violates standard assumptions in expected utility theory, which requires that expected utility be linear (objective) in probabilities. Models such as prospect theory have relaxed this assumption and introduced the notion of a "probability weighting function," which captures the key properties found in experimental data. This study reports functional magnetic resonance imaging (fMRI) data showing that neural response to expected reward is nonlinear in probabilities. Specifically, we found that activity in the striatum during valuation of monetary gambles is nonlinear in probabilities in the pattern predicted by prospect theory, suggesting that probability distortion is reflected at the level of the reward encoding process. The degree of nonlinearity reflected in individual subjects' decisions is also correlated with striatal activity across subjects. Our results shed light on the neural mechanisms of reward processing, and have implications for future neuroscientific studies of decision making involving extreme tails of the distribution, where probability weighting provides an explanation for commonly observed behavioral anomalies.
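
    The one-parameter Tversky-Kahneman weighting function is the standard form of the nonlinearity described here; a small Python illustration (the γ value below is illustrative, not the study's fitted estimate):

        def probability_weight(p, gamma=0.61):
            # w(p) = p^γ / (p^γ + (1 − p)^γ)^(1/γ)
            # γ < 1 overweights small probabilities, underweights large ones.
            return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

        for p in (0.01, 0.10, 0.50, 0.90, 0.99):
            print(f"w({p:.2f}) = {probability_weight(p):.3f}")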

  19. Competitive assessment of aerospace systems using system dynamics

    NASA Astrophysics Data System (ADS)

    Pfaender, Jens Holger

    Aircraft design has recently experienced a trend away from performance-centric design towards a more balanced approach with increased emphasis on engineering an economically successful system. This approach focuses on bringing forward a comprehensive economic and life-cycle cost analysis. Since the success of any system also depends on many external factors outside of the control of the designer, these factors traditionally have been modeled as noise affecting the uncertainty of the design. However, this approach currently lacks a strategic treatment of necessary early decisions affecting the probability of success of a given concept in a dynamic environment. This suggests that the introduction of a dynamic method into a life-cycle cost analysis should allow the analysis of the future attractiveness of such a concept in the presence of uncertainty. One way of addressing this is through the use of a competitive market model. However, existing market models do not focus on the dynamics of the market. Instead, they focus on modeling and predicting market share through logit regression models. The resulting models exhibit relatively poor predictive capabilities. The method proposed here focuses on a top-down approach that integrates a competitive model, based on work in the field of system dynamics, into the aircraft design process. Demonstrating such integration, which had not previously been shown, is one of the primary contributions of this work. This integration is achieved through the use of surrogate models, in this case neural networks. This enabled not only the practical integration of analysis techniques, but also reduced the computational requirements so that interactive exploration as envisioned was actually possible. The example demonstration of this integration is built on the competition in the 250-seat large commercial aircraft market exemplified by the Boeing 767-400ER and the Airbus A330-200. Both aircraft models were calibrated to existing performance and certification data and then integrated into the system dynamics market model. The market model was then calibrated with historical market data. This calibration showed a much improved predictive capability as compared to the conventional logit regression models. An additional advantage of this dynamic model is that no additional explanatory variables were required to realize this improved capability. Furthermore, the resulting market model was then integrated into a prediction profiler environment with a time-variant Monte Carlo analysis, resulting in a unique trade-off environment. This environment was shown to allow interactive trade-offs between aircraft design decisions and economic considerations while allowing the exploration of potential market success in the light of varying external market conditions and scenarios. The resulting method is capable of reducing decision-support uncertainty and of identifying robust design decisions in future scenarios with a high likelihood of occurrence, with special focus on the path-dependent nature of future implications of decisions. Furthermore, it was possible to demonstrate the increased importance of design and technology choices on competitiveness in scenarios with drastic increases in commodity prices during the time period modeled. Another use of the existing outputs of the Monte Carlo analysis was then realized by showing them on a multivariate scatter plot.
By appropriate grouping of variables, this plot was shown to enable the top-down definition of an aircraft design, also known as inverse design. In other words, this enables the designer to define strategic market and return-on-investment goals for a number of scenarios, for example the development of fuel prices, and then directly see which specific aircraft designs meet these goals.
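
    A minimal system-dynamics flavoured sketch of such a duopoly market model (illustrative stock-and-flow structure and coefficients, not the dissertation's calibrated model): two competing aircraft split new orders according to relative attractiveness, and the installed base feeds back into attractiveness, which produces the path dependence discussed above.

        def simulate_market(attr_a, attr_b, years=20, orders_per_year=100,
                            commonality=0.005):
            base_a = base_b = 0.0
            for _ in range(years):
                # attractiveness grows with installed base (fleet-commonality feedback)
                score_a = attr_a + commonality * base_a
                score_b = attr_b + commonality * base_b
                share_a = score_a / (score_a + score_b)
                base_a += orders_per_year * share_a
                base_b += orders_per_year * (1 - share_a)
            return base_a, base_b

        # A small initial advantage compounds over time through the feedback loop
        print(simulate_market(attr_a=1.05, attr_b=1.00))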

  20. NASA Capability Roadmaps Executive Summary

    NASA Technical Reports Server (NTRS)

    Willcoxon, Rita; Thronson, Harley; Varsi, Guilio; Mueller, Robert; Regenie, Victoria; Inman, Tom; Crooke, Julie; Coulter, Dan

    2005-01-01

    This document is the result of eight months of hard work and dedication from NASA, industry, other government agencies, and academic experts from across the nation. It provides a summary of the capabilities necessary to execute the Vision for Space Exploration and the key architecture decisions that drive the direction for those capabilities. This report is being provided to the Exploration Systems Architecture Study (ESAS) team for consideration in development of an architecture approach and investment strategy to support NASA's future missions, programs, and budget requests. In addition, it will be an excellent reference for NASA's strategic planning. A more detailed set of roadmaps at the technology and sub-capability levels are available on CD. These detailed products include key driving assumptions, capability maturation assessments, and technology and capability development roadmaps.

  1. Qualitative Exploration of the Suitability of Capability Based Instruments to Measure Quality of Life in Family Carers of People with Dementia

    PubMed Central

    Jones, Carys; Edwards, Rhiannon Tudor; Hounsome, Barry

    2014-01-01

    Background. In an ageing population, many individuals find themselves becoming a carer for an elderly relative. This qualitative study explores aspects of quality of life affected by caring for a person with dementia, with the aim of identifying whether capability based questionnaires are suitable for measuring carer quality of life. Methods. Semistructured interviews lasting up to an hour were conducted, November 2010–July 2011, with eight family carers of people with dementia. Interviews typically took place at the participants' homes and were recorded and transcribed verbatim. Framework analysis was used to code and analyse data. Domains from three capability based questionnaires (ICECAP-O, Carer Experience Scale, and ASCOT) were used as initial codes. Similar codes were grouped into categories, and broader themes were developed from these categories. Results. Four themes were identified: social network and relationships; interactions with agencies; recognition of role; and time for oneself. Conclusions. By identifying what affects carers' quality of life, an appropriate choice can be made when selecting instruments for future carer research. The themes identified had a high degree of overlap with the capability instruments, suggesting that the capabilities approach would be suitable for future research involving carers of people with dementia. PMID:24967332

  2. Candidate control design metrics for an agile fighter

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Bailey, Melvin L.; Ostroff, Aaron J.

    1991-01-01

    Success in the fighter combat environment of the future will certainly demand increasing capability from aircraft technology. These advanced capabilities in the form of superagility and supermaneuverability will require special design techniques which translate advanced air combat maneuvering requirements into design criteria. Control design metrics can provide some of these techniques for the control designer. This study presents an overview of control design metrics and investigates metrics for advanced fighter agility. The objectives of various metric users, such as airframe designers and pilots, are differentiated from the objectives of the control designer. Using an advanced fighter model, metric values are documented over a portion of the flight envelope through piloted simulation. These metric values provide a baseline against which future control system improvements can be compared and against which a control design methodology can be developed. Agility is measured for the axial, pitch, and roll axes. Axial metrics highlight acceleration and deceleration capabilities under different flight loads and include specific excess power measurements to characterize energy maneuverability. Pitch metrics cover both body-axis and wind-axis pitch rates and accelerations. Included in the pitch metrics are nose pointing metrics which highlight displacement capability between the nose and the velocity vector. Roll metrics (or torsion metrics) focus on rotational capability about the wind axis.
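
    Specific excess power, the axial energy-maneuverability metric mentioned above, has a standard closed form, P_s = V(T - D)/W. A small Python illustration (the numbers are illustrative only, not from the cited simulation):

        def specific_excess_power(thrust_n, drag_n, weight_n, velocity_ms):
            # P_s = V (T − D) / W, the rate of change of energy height (m/s)
            return velocity_ms * (thrust_n - drag_n) / weight_n

        # e.g. a fighter at 250 m/s with 110 kN thrust, 70 kN drag, 160 kN weight
        print(specific_excess_power(110e3, 70e3, 160e3, 250.0))  # 62.5 m/s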

  3. Future prospects for high resolution X-ray spectrometers

    NASA Technical Reports Server (NTRS)

    Canizares, C. R.

    1981-01-01

    Capabilities of the X-ray spectroscopy payloads were compared, and a comparison of the capabilities of AXAF in the context of the science to be achieved is reported. The Einstein Observatory demonstrated the tremendous scientific power of spectroscopy to probe deeply the astrophysics of all types of celestial X-ray sources. However, it had limitations in sensitivity and resolution. Each of the straw man instruments has a sensitivity that is at least an order of magnitude better than that of the Einstein FPSC. The AXAF promises powerful spectral capability.

  4. The FAA Plans and Programs for the Future Airport and Air Traffic control System.

    DTIC Science & Technology

    1980-11-13

    for system improvements and innovation so as to permit evolution to new capabilities in a timely fashion. o To reduce negative impacts of aviation on...demand pattern which tends to bunch arrivals and departures in blocks providing the capability to interchange connecting passengers in a high level of...phased in over the next 5 to 7 years. It will introduce a number of new capabilities which will ultimately provide specialists with rapid access to

  5. NASA Capabilities That Could Impact Terrestrial Smart Grids of the Future

    NASA Technical Reports Server (NTRS)

    Beach, Raymond F.

    2015-01-01

    Incremental steps to steadily build, test, refine, and qualify capabilities that lead to affordable flight elements and a deep space capability. Potential Deep Space Vehicle power system characteristics: power 10 kilowatts average; two independent power channels with multi-level cross-strapping; solar array power 24 plus kilowatts; multi-junction arrays; lithium-ion battery storage 200 plus ampere-hours; sized for deep space or low lunar orbit operation; distribution 120 volts secondary (SAE AS 5698); 2 kilowatt power transfer between vehicles.

  6. Useful Sensor Web Capabilities to Enable Progressive Mission Autonomy

    NASA Technical Reports Server (NTRS)

    Mandl, Dan

    2007-01-01

    This viewgraph presentation reviews the use of Sensor Web capabilities as an enabling technology to allow for progressive autonomy of NASA space missions. The presentation reviews technical challenges for future missions, and some of the capabilities that exist to meet those challenges. To establish the ability of the technology to meet the challenges, experiments were conducted on three missions: Earth Observing 1 (EO-1), Cosmic Hot Interstellar Plasma Spectrometer (CHIPS) and Space Technology 5 (ST-5). These experiments are reviewed.

  7. Percept User Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carnes, Brian; Kennon, Stephen Ray

    2017-05-01

    This document is the main user guide for the Sierra/Percept capabilities including the mesh_adapt and mesh_transfer tools. Basic capabilities for uniform mesh refinement (UMR) and mesh transfers are discussed. Examples are used to provide illustration. Future versions of this manual will include more advanced features such as geometry and mesh smoothing. Additionally, all the options for the mesh_adapt code will be described in detail. Capabilities for local adaptivity in the context of offline adaptivity will also be included.

  8. Capability and dependency in the Newcastle 85+ cohort study. Projections of future care needs

    PubMed Central

    2011-01-01

    Background Little is known of the capabilities of the oldest old, the fastest growing age group in the population. We aimed to estimate capability and dependency in a cohort of 85 year olds and to project future demand for care. Methods Structured interviews at age 85 with 841 people born in 1921 and living in Newcastle and North Tyneside, UK, who were permanently registered with participating general practices. Measures of capability included were self-reported activities of daily living (ADL), timed up and go test (TUG), standardised mini-mental state examination (SMMSE), and assessment of urinary continence in order to classify interval-need dependency. To project future demand for care, the proportion needing 24-hour care was applied to the 2008 England and Wales population projections of those aged 80 years and over by gender. Results Of participants, 62% (522/841) were women, 77% (651/841) lived in standard housing, 13% (106/841) in sheltered housing and 10% (84/841) in a care home. Overall, 20% (165/841) reported no difficulty with any of the ADLs. Men were more capable in performing ADLs and more independent than women. TUG validated self-reported ADLs. When classified by 'interval of need', 41% (332/810) were independent, 39% (317/810) required help less often than daily, 12% (94/810) required help at regular times of the day and 8% (67/810) required 24-hour care. Of care-home residents, 94% (77/82) required daily help or 24-hour care. Future need for 24-hour care for people aged 80 years or over in England and Wales is projected to increase by 82% from 2010 to 2030, with a demand for 630,000 care-home places by 2030. Conclusions This analysis highlights the diversity of capability and levels of dependency in this cohort. A remarkably high proportion remain independent, particularly men. However, a significant proportion of this population require 24-hour care at home or in care homes. Projections for the next 20 years suggest substantial increases in the number requiring 24-hour care due to population ageing and a proportionate increase in demand for care-home places unless innovative health and social care interventions are found. PMID:21542901

  9. Context recognition and situation assessment in autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Yavnai, Arie

    1993-05-01

    The capability to recognize the operating context and to assess the situation in real time is needed if a high-functionality autonomous mobile robot is to react properly and effectively to continuously changing situations and events, either external or internal, while the robot is performing its assigned tasks. A new approach and architecture for a context recognition and situation assessment module (CORSA) is presented in this paper. CORSA is a multi-level information processing module which consists of adaptive decision and classification algorithms. It performs dynamic mapping from the data space to the context space, and dynamically decides on the context class. A learning mechanism is employed to update the decision variables so as to minimize the probability of misclassification. CORSA is embedded within the Mission Manager module of the intelligent autonomous hyper-controller (IAHC) of the mobile robot. The information regarding operating context, events and situation is then communicated to other modules of the IAHC, where it is used to: (a) select the appropriate action strategy; (b) support the processes of arbitration and conflict resolution between reflexive behaviors and reasoning-driven behaviors; (c) predict future events and situations; and (d) determine criteria and priorities for planning, replanning, and decision making.

  10. Phase-shifting coronagraph

    NASA Astrophysics Data System (ADS)

    Hénault, François; Carlotti, Alexis; Vérinaud, Christophe

    2017-09-01

    With the recent commissioning of ground instruments such as SPHERE or GPI and future space observatories like WFIRST-AFTA, coronagraphy should probably become the most efficient tool for identifying and characterizing extrasolar planets in the forthcoming years. Coronagraphic instruments such as phase mask coronagraphs (PMCs) are usually based on a phase mask or plate located at the telescope focal plane, spreading the starlight outside the diameter of a Lyot stop that blocks it. This communication investigates the capability of a PMC to act as a phase-shifting wavefront sensor for better control of the achieved star extinction ratio in the presence of the coronagraphic mask. We discuss the two main implementations of the phase-shifting process, either introducing phase shifts in a pupil plane and sensing intensity variations in an image plane, or reciprocally. Conceptual optical designs are described in both cases. Numerical simulations allow for a better understanding of the performance and limitations of both options, and for optimizing their fundamental parameters. In particular, they demonstrate that the phase-shifting process is a bit more efficient when implemented in an image plane, and is compatible with the most popular phase masks currently employed, i.e., four-quadrant and vortex phase masks.
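
    The phase-shifting idea referred to here is typically realized with the classic four-step scheme: record intensities at phase shifts of 0, π/2, π, 3π/2 and recover the wavefront phase from their differences. A minimal Python sketch of that standard formula (not the authors' specific optical design):

        import numpy as np

        def four_step_phase(i0, i1, i2, i3):
            # I_k = A + B cos(φ + kπ/2)  =>  φ = atan2(I3 − I1, I0 − I2)
            return np.arctan2(i3 - i1, i0 - i2)

        # toy single-pixel example with true phase 0.7 rad
        phi, a, b = 0.7, 2.0, 1.0
        frames = [a + b * np.cos(phi + k * np.pi / 2) for k in range(4)]
        print(four_step_phase(*frames))  # ≈ 0.7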

  11. Adjuvant Treatment of Melanoma

    PubMed Central

    Moreno Nogueira, J. A.; Valero Arbizu, M.; Pérez Temprano, R.

    2013-01-01

    Melanomas represent 4% of all malignant tumors of the skin, yet account for 80% of deaths from skin cancer. While in the early stages patients can be successfully treated with surgical resection, metastatic melanoma prognosis is dismal. Several oncogenes have been identified in melanoma, including BRAF, NRAS, c-Kit, GNAQ, and GNA11, each capable of activating the MAPK pathway, which increases cell proliferation and promotes angiogenesis; NRAS and c-Kit also activate the PI3-kinase pathway, and BRAF is the most commonly activated oncogene. The treatment of choice for localised primary cutaneous melanoma is surgery plus lymphadenectomy if regional lymph nodes are involved. The justification for treatment in addition to surgery is based on the poor prognosis for high-risk melanomas, with a relapse rate of 50–80%. Patients included in the high-risk group should be assessed for adjuvant treatment with high doses of Interferon-α2b, as it is the only treatment shown to significantly improve disease-free and possibly overall survival. In the future we will have to analyze all these therapeutic possibilities on specific targets, probably associated with chemotherapy and/or interferon in the adjuvant treatment, if we want to change the natural history of melanomas. PMID:23476798

  12. Rapidly locating and characterizing pollutant releases in buildings.

    PubMed

    Sohn, Michael D; Reynolds, Pamela; Singh, Navtej; Gadgil, Ashok J

    2002-12-01

    Releases of airborne contaminants in or near a building can lead to significant human exposures unless prompt response measures are taken. However, possible responses can include conflicting strategies, such as shutting the ventilation system off versus running it in a purge mode or having occupants evacuate versus sheltering in place. The proper choice depends in part on knowing the source locations, the amounts released, and the likely future dispersion routes of the pollutants. We present an approach that estimates this information in real time. It applies Bayesian statistics to interpret measurements of airborne pollutant concentrations from multiple sensors placed in the building and computes best estimates and uncertainties of the release conditions. The algorithm is fast, capable of continuously updating the estimates as measurements stream in from sensors. We demonstrate the approach using a hypothetical pollutant release in a five-room building. Unknowns to the interpretation algorithm include location, duration, and strength of the source, and some building and weather conditions. Two sensor sampling plans and three levels of data quality are examined. Data interpretation in all examples is rapid; however, locating and characterizing the source with high probability depends on the amount and quality of data and the sampling plan.
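
    The Bayesian update at the heart of the interpretation step can be sketched compactly (hypothetical source grid, transport model, and noise level; the paper couples this to a multizone airflow model of the building): the posterior over candidate release conditions is refreshed as each sensor reading arrives.

        import numpy as np

        # Candidate hypotheses: release in room 0..4 (uniform prior, assumed)
        n_rooms = 5
        posterior = np.full(n_rooms, 1.0 / n_rooms)

        def predicted_concentration(source_room, sensor_room):
            # Stand-in for the dispersion model: strongest signal in the source
            # room, decaying with "distance" between room indices.
            return np.exp(-abs(source_room - sensor_room))

        def update(posterior, sensor_room, measured, sigma=0.1):
            # Gaussian measurement likelihood for each candidate source room
            like = np.array([
                np.exp(-0.5 * ((measured - predicted_concentration(s, sensor_room))
                               / sigma) ** 2)
                for s in range(n_rooms)
            ])
            posterior = posterior * like
            return posterior / posterior.sum()

        posterior = update(posterior, sensor_room=1, measured=0.95)
        posterior = update(posterior, sensor_room=3, measured=0.15)
        print(posterior.round(3))  # mass concentrates on the likely source room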

  13. Reliability prediction of ontology-based service compositions using Petri net and time series models.

    PubMed

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets the quantitative quality requirement. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the Non-Markovian stochastic Petri net (NMSPN) and the time series model. The framework includes the following steps: obtaining the historical response time series of individual service components; fitting these series with an autoregressive moving-average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into a NMSPN model; employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy.
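
    The time-series step of the framework, fitting an ARMA model to observed response times and forecasting ahead, can be sketched with statsmodels (simulated response times and illustrative ARMA orders, not the paper's experimental data):

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(1)
        # Simulated history of a service component's response times (ms)
        response_times = 100 + np.cumsum(rng.normal(0, 0.5, size=200)) \
                         + rng.normal(0, 2, size=200)

        # ARMA(p, q) is ARIMA with differencing order d = 0
        model = ARIMA(response_times, order=(2, 0, 1)).fit()
        forecast = model.forecast(steps=10)  # predicted future response times
        print(forecast.round(1))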

  14. Establishment of a highly efficient virus-inducible CRISPR/Cas9 system in insect cells.

    PubMed

    Dong, Zhan-Qi; Chen, Ting-Ting; Zhang, Jun; Hu, Nan; Cao, Ming-Ya; Dong, Fei-Fan; Jiang, Ya-Ming; Chen, Peng; Lu, Cheng; Pan, Min-Hui

    2016-06-01

    Although current antiviral strategies can inhibit baculovirus infection and decrease viral DNA replication to a certain extent, novel tools are required for specific and accurate elimination of baculovirus genomes from infected insects. Using the newly developed clustered regularly interspaced short palindromic repeats/associated protein 9 nuclease (CRISPR/Cas9) technology, we disrupted a viral genome in infected insect cells in vitro as a defense against viral infection. We optimized the CRISPR/Cas9 system to edit foreign and viral genomes in insect cells. Using Bombyx mori nucleopolyhedrovirus (BmNPV) as a model, we found that the CRISPR/Cas9 system was capable of cleaving the replication key factor ie-1 in BmNPV, thus effectively inhibiting virus proliferation. Furthermore, we constructed a virus-inducible CRISPR/Cas9 editing system, which minimized the probability of off-target effects and was rapidly activated after viral infection. This is the first report describing the application of the CRISPR/Cas9 system in insect antiviral research. Establishment of a highly efficient virus-inducible CRISPR/Cas9 system in insect cells provides insights into producing virus-resistant transgenic strains in the future. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. On-orbit Metrology and Calibration Requirements for Space Station Activities Definition Study

    NASA Technical Reports Server (NTRS)

    Cotty, G. M.; Ranganathan, B. N.; Sorrell, A. L.

    1989-01-01

    The Space Station is the focal point for the commercial development of space. The long-term routine operation of the Space Station and the conduct of future commercial activities suggest the need for in-space metrology capabilities analogous, when possible, to those on Earth. The ability to perform periodic calibrations and measurements with proper traceability is imperative for the routine operation of the Space Station. An initial review, however, indicated a paucity of data related to metrology and calibration requirements for in-space operations. This condition probably exists because of the highly developmental aspect of space activities to date, their short duration, and their nonroutine nature. The on-orbit metrology and calibration needs of the Space Station were examined and assessed. In order to achieve this goal, the following tasks were performed: an up-to-date literature review; identification of on-orbit calibration techniques; identification of sensor calibration requirements; identification of calibration equipment requirements; definition of traceability requirements; preparation of technology development plans; and preparation of the final report. Significant information and major highlights pertaining to each task are presented. In addition, some general (generic) conclusions/observations and recommendations that are pertinent to the overall in-space metrology and calibration activities are presented.

  16. Achieving unequal error protection with convolutional codes

    NASA Technical Reports Server (NTRS)

    Mills, D. G.; Costello, D. J., Jr.; Palazzo, R., Jr.

    1994-01-01

    This paper examines the unequal error protection capabilities of convolutional codes. Both time-invariant and periodically time-varying convolutional encoders are examined. The effective free distance vector is defined and is shown to be useful in determining the unequal error protection (UEP) capabilities of convolutional codes. A modified transfer function is used to determine an upper bound on the bit error probabilities for individual input bit positions in a convolutional encoder. The bound is heavily dependent on the individual effective free distance of the input bit position. A bound relating two individual effective free distances is presented. The bound is a useful tool in determining the maximum possible disparity in individual effective free distances of encoders of specified rate and memory distribution. The unequal error protection capabilities of convolutional encoders of several rates and memory distributions are determined and discussed.
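
    For intuition on the distance quantities used here, the ordinary free distance of a small convolutional code can be computed by a shortest-path search over the encoder trellis; the effective free distance vector generalizes this to individual input positions. A Python sketch for the standard rate-1/2, memory-2 (7,5) code (this example code is ours, chosen for illustration, not taken from the paper):

        import heapq

        G = (0b111, 0b101)  # generators of the rate-1/2, memory-2 (7,5) code
        MEM = 2

        def step(state, bit):
            reg = (bit << MEM) | state  # shift-register contents (newest bit first)
            # output Hamming weight: sum of parities of reg masked by each generator
            out = sum(bin(reg & g).count("1") % 2 for g in G)
            return reg >> 1, out

        def free_distance():
            # Dijkstra over the trellis: force an initial '1' to diverge from the
            # all-zero path, then find the minimum-weight return to state 0.
            s0, w0 = step(0, 1)
            heap = [(w0, s0)]
            best = {s0: w0}
            while heap:
                w, s = heapq.heappop(heap)
                if s == 0:
                    return w
                for bit in (0, 1):
                    ns, dw = step(s, bit)
                    if w + dw < best.get(ns, float("inf")):
                        best[ns] = w + dw
                        heapq.heappush(heap, (w + dw, ns))

        print(free_distance())  # 5, the well-known free distance of the (7,5) code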

  17. A probability-based multi-cycle sorting method for 4D-MRI: A simulation study.

    PubMed

    Liang, Xiao; Yin, Fang-Fang; Liu, Yilin; Cai, Jing

    2016-12-01

    To develop a novel probability-based sorting method capable of generating multiple breathing cycles of 4D-MRI images and to evaluate the performance of this new method by comparing it with conventional phase-based methods in terms of image quality and tumor motion measurement. Based on previous findings that the breathing motion probability density function (PDF) of a single breathing cycle is dramatically different from the true stabilized PDF that results from many breathing cycles, it is expected that a probability-based sorting method capable of generating multiple breathing cycles of 4D images may capture breathing variation information missing from conventional single-cycle sorting methods. The overall idea is to identify a few main breathing cycles (and their corresponding weightings) that can best represent the main breathing patterns of the patient and then reconstruct a set of 4D images for each of the identified main breathing cycles. This method is implemented in three steps: (1) the breathing signal is decomposed into individual breathing cycles, characterized by amplitude and period; (2) individual breathing cycles are grouped based on amplitude and period to determine the main breathing cycles; if a group contains more than 10% of all breathing cycles in a breathing signal, it is determined to be a main breathing pattern group and is represented by the average of the individual breathing cycles in the group; (3) for each main breathing cycle, a set of 4D images is reconstructed using a result-driven sorting method adapted from our previous study. The probability-based sorting method was first tested on 26 patients' breathing signals to evaluate its feasibility for improving the target motion PDF. The new method was subsequently tested for a sequential image acquisition scheme on the 4D digital extended cardiac torso (XCAT) phantom. Performance of the probability-based and conventional sorting methods was evaluated in terms of target volume precision and accuracy as measured by the 4D images, and also the accuracy of the average intensity projection (AIP) of the 4D images. Probability-based sorting showed improved similarity of the breathing motion PDF from 4D images to the reference PDF compared to single-cycle sorting, indicated by the significant increase in Dice similarity coefficient (DSC) (probability-based sorting, DSC = 0.89 ± 0.03, and single-cycle sorting, DSC = 0.83 ± 0.05, p-value <0.001). Based on the simulation study on XCAT, the probability-based method outperforms the conventional phase-based methods in qualitative evaluation of motion artifacts and in quantitative evaluation of tumor volume precision and accuracy and of the accuracy of the AIP of the 4D images. In this paper the authors demonstrated the feasibility of a novel probability-based multicycle 4D image sorting method. The authors' preliminary results showed that the new method can improve the accuracy of the tumor motion PDF and the AIP of 4D images, presenting potential advantages over the conventional phase-based sorting method for radiation therapy motion management.
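
    Steps (1)-(2), grouping individual breathing cycles and keeping groups above the 10% threshold, can be sketched as follows (hypothetical amplitude/period bin widths; the study's actual grouping criteria may differ):

        from collections import defaultdict

        def main_breathing_cycles(cycles, amp_bin=2.0, period_bin=0.5):
            # `cycles` is a list of (amplitude_mm, period_s) tuples extracted from
            # the breathing signal; group them by quantized amplitude and period.
            groups = defaultdict(list)
            for amp, per in cycles:
                key = (round(amp / amp_bin), round(per / period_bin))
                groups[key].append((amp, per))
            mains = []
            for members in groups.values():
                if len(members) > 0.10 * len(cycles):  # main pattern: >10% of cycles
                    mean_amp = sum(a for a, _ in members) / len(members)
                    mean_per = sum(p for _, p in members) / len(members)
                    mains.append((mean_amp, mean_per, len(members) / len(cycles)))
            return mains  # (amplitude, period, weighting) per main breathing cycle

        cycles = [(10.1, 4.0), (10.3, 3.9), (9.8, 4.1), (10.0, 4.0), (10.2, 4.0),
                  (9.9, 3.8), (10.4, 4.2), (10.0, 4.1), (15.2, 5.0), (3.1, 2.0)]
        print(main_breathing_cycles(cycles))  # one dominant ~(10 mm, 4 s) pattern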

  18. Active Longitude and Coronal Mass Ejection Occurrences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gyenge, N.; Kiss, T. S.; Erdélyi, R.

    The spatial inhomogeneity of the distribution of coronal mass ejection (CME) occurrences in the solar atmosphere could provide a tool to estimate the longitudinal position of the most probable CME-capable active regions in the Sun. The anomaly in the longitudinal distribution of active regions themselves is often referred to as active longitude (AL). In order to reveal the connection between the AL and CME spatial occurrences, here we investigate the morphological properties of active regions. The first morphological property studied is the separateness parameter, which is able to characterize the probability of the occurrence of an energetic event, such as a solar flare or CME. The second morphological property is the sunspot tilt angle. The tilt angle of sunspot groups allows us to estimate the helicity of active regions. The increased helicity leads to a more complex buildup of the magnetic structure and also can cause CME eruption. We found that the most complex active regions appear near the AL and that the AL itself is associated with the most tilted active regions. Therefore, the number of CME occurrences is higher within the AL. The origin of the fast CMEs is also found to be associated with this region. We concluded that the source of the most probable CME-capable active regions is at the AL. By applying this method, we can potentially forecast a flare and/or CME source several Carrington rotations in advance. This finding also provides new information for solar dynamo modeling.

  19. Active Longitude and Coronal Mass Ejection Occurrences

    NASA Astrophysics Data System (ADS)

    Gyenge, N.; Singh, T.; Kiss, T. S.; Srivastava, A. K.; Erdélyi, R.

    2017-03-01

    The spatial inhomogeneity of the distribution of coronal mass ejection (CME) occurrences in the solar atmosphere could provide a tool to estimate the longitudinal position of the most probable CME-capable active regions in the Sun. The anomaly in the longitudinal distribution of active regions themselves is often referred to as active longitude (AL). In order to reveal the connection between the AL and CME spatial occurrences, here we investigate the morphological properties of active regions. The first morphological property studied is the separateness parameter, which is able to characterize the probability of the occurrence of an energetic event, such as a solar flare or CME. The second morphological property is the sunspot tilt angle. The tilt angle of sunspot groups allows us to estimate the helicity of active regions. The increased helicity leads to a more complex buildup of the magnetic structure and also can cause CME eruption. We found that the most complex active regions appear near the AL and that the AL itself is associated with the most tilted active regions. Therefore, the number of CME occurrences is higher within the AL. The origin of the fast CMEs is also found to be associated with this region. We concluded that the source of the most probable CME-capable active regions is at the AL. By applying this method, we can potentially forecast a flare and/or CME source several Carrington rotations in advance. This finding also provides new information for solar dynamo modeling.

  20. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    PubMed

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

    Comparative decision making is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, uncertainty complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with a reliability method called the FORM method. The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors that correspond to a sensitivity analysis in relation to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders while reducing the computational time.
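
    The decision confidence probability has a direct Monte Carlo estimator, which is the baseline the FORM method is compared against. A minimal Python sketch (illustrative lognormal impact distributions; a FORM implementation would instead approximate the limit state g = impact_B − impact_A analytically):

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000

        # Simulated life-cycle impact distributions of two options (assumed)
        impact_a = rng.lognormal(mean=1.00, sigma=0.3, size=n)
        impact_b = rng.lognormal(mean=1.15, sigma=0.3, size=n)

        # P(option A has the smaller impact), i.e. P(g > 0) with g = B − A
        confidence = np.mean(impact_a < impact_b)
        print(f"decision confidence probability ≈ {confidence:.3f}")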

  1. Managing fire and fuels in a warmer climate

    Treesearch

    David L. Peterson

    2010-01-01

    This historical perspective on fire provides a window into the future of fire in the Pacific Northwest. Although fire will always be more common in the interior portion of the region, a warmer climate could bring more fire to the west side of the Cascade Range, where summers are typically dry and will probably become drier. If future climate resembles the climate now...

  2. Probabilities of Possible Future Prices (Short-Term Energy Outlook Supplement April 2010)

    EIA Publications

    2010-01-01

    The Energy Information Administration introduced a monthly analysis of energy price volatility and forecast uncertainty in the October 2009 Short-Term Energy Outlook (STEO). Included in the analysis were charts portraying confidence intervals around the New York Mercantile Exchange (NYMEX) futures prices of West Texas Intermediate (equivalent to light sweet crude oil) and Henry Hub natural gas contracts.

  3. Using Information Technologies in Professional Training of Future Security Specialists in the USA, Great Britain, Poland and Israel

    ERIC Educational Resources Information Center

    Kyslenko, Dmytro

    2017-01-01

    The paper discusses the use of information technologies in professional training of future security specialists in the United States, Great Britain, Poland and Israel. The probable uses of computer-based techniques available within integrated websites have been systematized. It has been suggested that the presented scheme may be of great…

  4. Future fire probability modeling with climate change data and physical chemistry

    Treesearch

    Richard P. Guyette; Frank R. Thompson; Jodi Whittier; Michael C. Stambaugh; Daniel C. Dey

    2014-01-01

    Climate has a primary influence on the occurrence and rate of combustion in ecosystems with carbon-based fuels such as forests and grasslands. Society will be confronted with the effects of climate change on fire in future forests. There are, however, few quantitative appraisals of how climate will affect wildland fire in the United States. We demonstrated a method for...

  5. Scenario studies as a synthetic and integrative research activity for Long-Term Ecological Research

    Treesearch

    Jonathan R. Thompson; Arnim Wiek; Frederick J. Swanson; Stephen R. Carpenter; Nancy Fresco; Teresa Hollingsworth; Thomas A. Spies; David R. Foster

    2012-01-01

    Scenario studies have emerged as a powerful approach for synthesizing diverse forms of research and for articulating and evaluating alternative socioecological futures. Unlike predictive modeling, scenarios do not attempt to forecast the precise or probable state of any variable at a given point in the future. Instead, comparisons among a set of contrasting scenarios...

  6. Prediction of the future number of wells in production (in Spanish)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coca, B.P.

    1981-01-01

    A method to predict the number of wells that will continue producing at a certain date in the future is presented. The method is applicable to reservoirs of the depletion type and is based on the survival probability concept. This is useful when forecasting by empirical methods. An example of a field in primary production is presented.
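
    The survival-probability idea admits a very small numerical sketch (assumed exponential survival with an assumed mean well life; the paper's survival function would be fitted from field statistics for the depletion-type reservoir in question): the expected number of wells still producing at a future date is the current count times the probability that an individual well survives that long.

        import math

        def wells_producing(current_wells, years_ahead, mean_well_life_years=12.0):
            # S(t) = exp(−t / mean life) under the exponential assumption
            survival = math.exp(-years_ahead / mean_well_life_years)
            return current_wells * survival

        print(round(wells_producing(200, years_ahead=5)))  # ≈ 132 wells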

  7. Looking to the Future: Will Behavior Analysis Survive and Prosper?

    ERIC Educational Resources Information Center

    Poling, Alan

    2010-01-01

    Behavior analysis as a discipline currently is doing relatively well. How it will do in the future is unclear and depends on how the field, and the world at large, changes. Five current characteristics of the discipline that appear to reduce the probability that it will survive and prosper are discussed and suggestions for improvement are offered.

  8. We'll Meet Again: Revealing Distributional and Temporal Patterns of Social Contact

    PubMed Central

    Pachur, Thorsten; Schooler, Lael J.; Stevens, Jeffrey R.

    2014-01-01

    What are the dynamics and regularities underlying social contact, and how can contact with the people in one's social network be predicted? In order to characterize distributional and temporal patterns underlying contact probability, we asked 40 participants to keep a diary of their social contacts for 100 consecutive days. Using a memory framework previously used to study environmental regularities, we predicted that the probability of future contact would follow in systematic ways from the frequency, recency, and spacing of previous contact. The distribution of contact probability across the members of a person's social network was highly skewed, following an exponential function. As predicted, it emerged that future contact scaled linearly with frequency of past contact, proportionally to a power function with recency of past contact, and differentially according to the spacing of past contact. These relations emerged across different contact media and irrespective of whether the participant initiated or received contact. We discuss how the identification of these regularities might inspire more realistic analyses of behavior in social networks (e.g., attitude formation, cooperation). PMID:24475073
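
    The regularities reported above lend themselves to a simple scoring rule. Below is a minimal, hypothetical sketch (not the authors' fitted model): contact propensity grows linearly with past contact frequency and decays as a power function of recency; the weights and exponent are made-up placeholders.

      def contact_propensity(n_contacts, days_since_last, w_freq=1.0, w_rec=1.0, decay=0.5):
          """Illustrative score: linear in past frequency, power-function decay in recency."""
          recency_term = days_since_last ** -decay if days_since_last > 0 else 1.0
          return w_freq * n_contacts + w_rec * recency_term

      # Hypothetical diary data: name -> (contacts in 100 days, days since last contact)
      network = {"alice": (30, 1), "bob": (5, 2), "carol": (12, 40)}
      ranked = sorted(network, key=lambda k: contact_propensity(*network[k]), reverse=True)
      print(ranked)  # members ordered by predicted likelihood of future contact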

  9. Steve Ostro and the Near-Earth Asteroid Impact Hazard

    NASA Astrophysics Data System (ADS)

    Chapman, Clark R.

    2009-09-01

    The late Steve Ostro, whose scientific interests in Near-Earth Asteroids (NEAs) primarily related to his planetary radar research in the 1980s, soon became an expert on the impact hazard. He quickly realized that radar provided perspectives on close-approaching NEAs that were both very precise as well as complementary to traditional astrometry, enabling good predictions of future orbits and collision probabilities extending for centuries into the future. He also was among the few astronomers who considered the profound issues raised by this newly recognized hazard and by early suggestions of how to mitigate the hazard. With Carl Sagan, Ostro articulated the "deflection dilemma" and other potential low-probability but real dangers of mitigation technologies that might be more serious than the low-probability impact hazard itself. Yet Ostro maintained a deep interest in developing responsible mitigation technologies, in educating the public about the nature of the impact hazard, and in learning more about the population of threatening bodies, especially using the revealing techniques of delay-doppler radar mapping of NEAs and their satellites.

  10. Real-Time Safety Monitoring and Prediction for the National Airspace System

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil

    2016-01-01

    As new operational paradigms and additional aircraft are being introduced into the National Airspace System (NAS), maintaining safety in such a rapidly growing environment becomes more challenging. It is therefore desirable to have both an overview of the current safety of the airspace at different levels of granularity, as well as an understanding of how the state of the safety will evolve into the future given the anticipated flight plans, weather forecasts, predicted health of assets in the airspace, and so on. To this end, we have developed a Real-Time Safety Monitoring (RTSM) framework that first estimates the state of the NAS using dynamic models. Then, given the state estimate and a probability distribution of future inputs to the NAS, the framework predicts the evolution of the NAS, i.e., the future state, and analyzes these future states to predict the occurrence of unsafe events. The entire probability distribution of airspace safety metrics is computed, not just point estimates, without significant assumptions regarding the distribution type or parameters. We demonstrate our overall approach by predicting the occurrence of some unsafe events and show how these predictions evolve in time as flight operations progress.
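
    As a rough illustration of the prediction step, the sketch below propagates a distribution of future inputs through a stand-in dynamic model by Monte Carlo sampling and reads off the probability of an unsafe event; the toy dynamics, horizon, and threshold are assumptions for illustration, not the RTSM models.

      import numpy as np

      rng = np.random.default_rng(0)
      n_samples = 10_000      # Monte Carlo samples of future inputs
      horizon = 60            # prediction horizon (e.g., minutes)
      threshold = 5.0         # hypothetical unsafe-event threshold

      def propagate(state, inputs):
          """Toy stand-in for a dynamic airspace model: the state drifts with the sampled inputs."""
          return state + np.cumsum(inputs, axis=1)

      state0 = 1.0                                          # current state estimate
      inputs = rng.normal(0.02, 0.1, (n_samples, horizon))  # distribution of future inputs
      trajectories = propagate(state0, inputs)

      # Full predicted distribution of the metric at each future time, plus the
      # probability that the unsafe event occurs anywhere within the horizon.
      p_unsafe = np.mean((trajectories > threshold).any(axis=1))
      print(f"P(unsafe event within horizon) ~ {p_unsafe:.3f}")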

  11. The minimization of pylon-mounted store effects on air combat capability

    NASA Technical Reports Server (NTRS)

    Spearman, M. L.

    1983-01-01

    Some effects of pylon-mounted missiles on aft-tail delta wing supersonic fighter concepts have been investigated. Whereas minimum drag penalties do occur with the addition of missiles, the effects at higher lifts, corresponding to maneuvering flight, are less severe and often favorable. Lower speeds and altitudes enhance the maneuvering capability and one-on-one air combat would probably tend to degenerate to subsonic speeds even though the combatants may be flying supersonic fighters. Higher speed (supersonic) flight might best be reserved for interceptors with long-range missiles where the weapon carriage effects at low angles of attack are of prime importance.

  12. Chronology of Postglacial Eruptive Activity and Calculation of Eruption Probabilities for Medicine Lake Volcano, Northern California

    USGS Publications Warehouse

    Nathenson, Manuel; Donnelly-Nolan, Julie M.; Champion, Duane E.; Lowenstern, Jacob B.

    2007-01-01

    Medicine Lake volcano has had 4 eruptive episodes in its postglacial history (since 13,000 years ago) comprising 16 eruptions. Time intervals between events within the episodes are relatively short, whereas time intervals between the episodes are much longer. An updated radiocarbon chronology for these eruptions is presented that uses paleomagnetic data to constrain the choice of calibrated ages. This chronology is used with exponential, Weibull, and mixed-exponential probability distributions to model the data for time intervals between eruptions. The mixed exponential distribution is the best match to the data and provides estimates for the conditional probability of a future eruption given the time since the last eruption. The probability of an eruption at Medicine Lake volcano in the next year from today is 0.00028.
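
    To make the conditional-probability statement concrete, the sketch below evaluates a mixed-exponential repose-interval model; the mixture parameters are illustrative assumptions chosen to mimic short within-episode and long between-episode intervals, not the paper's fitted values.

      import math

      def survival(t, p, tau_short, tau_long):
          """Mixed-exponential survival function: a fraction p of repose
          intervals has mean tau_short, the remainder mean tau_long (years)."""
          return p * math.exp(-t / tau_short) + (1 - p) * math.exp(-t / tau_long)

      def p_eruption_next(dt, t_since, p, tau_short, tau_long):
          """Conditional probability of an eruption within dt years, given
          t_since years since the last eruption."""
          s = survival(t_since, p, tau_short, tau_long)
          return (s - survival(t_since + dt, p, tau_short, tau_long)) / s

      # Hypothetical parameters (NOT the paper's fit): 70% of intervals short
      # (~100 yr, within episodes), 30% long (~3000 yr, between episodes).
      print(p_eruption_next(dt=1, t_since=950, p=0.7, tau_short=100, tau_long=3000))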

  13. Curriculum Change and Climate Change: Inside outside Pressures in Higher Education

    ERIC Educational Resources Information Center

    Fahey, Shireen J.

    2012-01-01

    In higher education today, institutions are facing a number of challenges--including the challenge to create future-proof graduates. Higher education institutions have a particular mandate to develop future leaders and decision-makers capable of understanding and providing solutions to complex, global issues. Education programmes that focus on…

  14. Video Feedforward for Reading

    ERIC Educational Resources Information Center

    Dowrick, Peter W.; Kim-Rupnow, Weol Soon; Power, Thomas J.

    2006-01-01

    Video feedforward can create images of positive futures, as has been shown by researchers using self-modeling methods to teach new skills with carefully planned and edited videos that show the future capability of the individual. As a supplement to tutoring provided by community members, we extended these practices to young children struggling to…

  15. ISS Efforts to Fully Utilize its Target Acquisition Capability Serves as an Analog for Future Laser Pointing Communications Systems

    NASA Technical Reports Server (NTRS)

    Jackson, Dan

    2017-01-01

    The ISS is an outstanding platform for developing, testing and refining laser communications systems for future exploration. A recent ISS project which improved ISS communications satellite acquisition performance proves the platform’s utility as a laser communications systems testbed.

  16. State/federal interaction of LANDSAT system and related technical assistance

    NASA Technical Reports Server (NTRS)

    Tesser, P. A.

    1981-01-01

    The history of state involvement in LANDSAT systems planning and related efforts is described. Currently 16 states have visual LANDSAT capabilities and 10 others are planning on developing such capabilities. The federal government's future plans for the LANDSAT system, the impacts of recent budget decisions on the systems, and the FY 82 budget process are examined.

  17. Developing a Diagnosis System of Work-Related Capabilities for Students: A Computer-Assisted Assessment

    ERIC Educational Resources Information Center

    Liao, C. H.; Yang, M. H.; Yang, B. C.

    2013-01-01

    A gap exists between students' employment needs and higher education offerings. Thus, developing the capability to meet the learning needs of students in supporting their future aspirations should be facilitated. To bridge this gap in practice, this study uses multiple methods (i.e., nominal group technique and instructional systems development)…

  18. Evolution of Embedded Processing for Wide Area Surveillance

    DTIC Science & Technology

    2014-01-01

    future vision. ... Embedded processing; high-performance computing; general-purpose graphical processing units (GPGPUs) ... reconnaissance (ISR) mission capabilities. The capabilities these advancements are achieving include the ability to provide persistent all... fighters to support and positively affect their mission. Significant improvements in high-performance computing (HPC) technology make it possible to

  19. Facilitating Collaborative Capabilities for Future Work: What Can Be Learnt from Interprofessional Fieldwork in Health

    ERIC Educational Resources Information Center

    Brewer, Margo; Flavell, Helen

    2018-01-01

    There is growing pressure in higher education to develop graduates with the capabilities to work effectively in collaborative, interdisciplinary teams to solve the key issues facing humankind. For many years, health has been pioneering interprofessional education as the means to deliver professionals with capacity to work together to deliver high…

  20. An Analysis of Marine Corps Beyond Line of Sight Wideband Satellite Communications Requirements

    DTIC Science & Technology

    2010-09-01

    Tactical SHF Satellite Terminal UFO ... what beyond-LOS WB SATCOM capabilities the USMC requires in order to prepare for the future. A clear understanding of desired capabilities allows for

  1. Projecting Future Sea Level Rise for Water Resources Planning in California

    NASA Astrophysics Data System (ADS)

    Anderson, J.; Kao, K.; Chung, F.

    2008-12-01

    Sea level rise is one of the major concerns for the management of California's water resources. Higher water levels and salinity intrusion into the Sacramento-San Joaquin Delta could affect water supplies, water quality, levee stability, and aquatic and terrestrial flora and fauna species and their habitat. Over the 20th century, sea levels near San Francisco Bay increased by over 0.6ft. Some tidal gauge and satellite data indicate that rates of sea level rise are accelerating. Sea levels are expected to continue to rise due to increasing air temperatures causing thermal expansion of the ocean and melting of land-based ice such as ice on Greenland and in southeastern Alaska. For water planners, two related questions are raised on the uncertainty of future sea levels. First, what is the expected sea level at a specific point in time in the future, e.g., what is the expected sea level in 2050? Second, what is the expected point of time in the future when sea levels will exceed a certain height, e.g., what is the expected range of time when the sea level rises by one foot? To address these two types of questions, two factors are considered: (1) long term sea level rise trend, and (2) local extreme sea level fluctuations. A two-step approach will be used to develop sea level rise projection guidelines for decision making that takes both of these factors into account. The first step is developing global sea level rise probability distributions for the long term trends. The second step will extend the approach to take into account the effects of local astronomical tides, changes in atmospheric pressure, wind stress, floods, and the El Niño/Southern Oscillation. In this paper, the development of the first step approach is presented. To project the long term sea level rise trend, one option is to extend the current rate of sea level rise into the future. However, since recent data indicate rates of sea level rise are accelerating, methods for estimating sea level rise that account for this acceleration are needed. One such method is an empirical relationship between air temperatures and global sea levels. The air temperature-sea level rise relationship was applied to the 12 climate change projections selected by the California Climate Action Team to estimate future sea levels. The 95% confidence level developed from the historical data was extrapolated to estimate the uncertainties in the future projections. To create sea level rise trend probability distributions, a lognormal probability distribution and a generalized extreme value probability distribution are used. Parameter estimations for these distributions are subjective and inevitably involve uncertainties, which will be improved as more research is conducted in this area.
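
    The two planning questions map directly onto a trend distribution. A minimal sketch using a lognormal rise model with placeholder parameters (not the study's fit, which ties the trend to air temperature projections): the expected rise at a fixed date, and the window of time in which a one-foot rise is exceeded.

      import numpy as np
      from scipy import stats

      def rise_distribution(t, sigma=0.3):
          """Hypothetical lognormal distribution of sea level rise (ft) after t years;
          the quadratic median term is an assumed stand-in for acceleration."""
          median = 0.005 * t + 0.0001 * t**2
          return stats.lognorm(s=sigma, scale=median)

      # Q1: expected rise at a point in time, e.g., 42 years out (2050 from 2008)
      d2050 = rise_distribution(42)
      print("2050 median rise (ft):", d2050.median())
      print("2050 90% interval (ft):", d2050.interval(0.90))

      # Q2: range of time when a one-foot rise is exceeded
      years = np.arange(1, 151)
      p_exceed = np.array([rise_distribution(t).sf(1.0) for t in years])
      for level in (0.05, 0.50, 0.95):
          t_hit = years[np.searchsorted(p_exceed, level)]
          print(f"P(rise > 1 ft) reaches {level:.0%} at t = {t_hit} yr")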

  2. The Defense Industrial Base: Prescription for a Psychosomatic Ailment

    DTIC Science & Technology

    1983-08-01

    The Decision-Making Process ... the strategy and tactics process to make certain that we can attain our national security objectives. (IFP is also known as mobilization planning or ... decision-making model that could improve the capacity and capability of the military-industrial complex, thereby increasing the probability of success

  3. BEHAVE: fire behavior prediction and fuel modeling system - BURN subsystem, Part 2

    Treesearch

    Patricia L. Andrews; Carolyn H. Chase

    1989-01-01

    This is the third publication describing the BEHAVE system of computer programs for predicting behavior of wildland fires. This publication adds the following predictive capabilities: distance firebrands are lofted ahead of a wind-driven surface fire, probabilities of firebrands igniting spot fires, scorch height of trees, and percentage of tree mortality. The system...

  4. A probability model for enterotoxin production of Bacillus cereus as a function of pH and temperature

    USDA-ARS?s Scientific Manuscript database

    Bacillus cereus is frequently isolated from a variety of foods including vegetables, dairy products, meat, and other raw and processed foods. The bacterium is capable of producing enterotoxin and emetic toxin that can cause severe nausea, vomiting and diarrhea. The objectives of this study were to a...

  5. Quality-Based Analysis Capability for National Youth Surveys: Development, Application, and Implications for Policy.

    ERIC Educational Resources Information Center

    Orvis, Bruce R.; Gahart, Martin T.

    As part of the military recruiting effort, the Department of Defense sponsors surveys of the national youth population to help design recruiting and advertising strategies. This report develops and applies a method of using the information contained in national youth surveys to estimate the probability that respondents taking the Armed Forces…

  6. Some Probable Technological Trends and Their Impact on an Information Network System. LINCS Project Document Series.

    ERIC Educational Resources Information Center

    Ebersole, Joseph L.

    Improvements in the technology associated with the information sciences will have their primary potential impact on the distribution of costs, information flow level, information availability, and use among information channels. This improvement implies not only a capability to perform a given function, but also a lower cost. For example, the trend…

  7. 77 FR 55108 - Explosive Siting Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-07

    ... against hazardous fragments, which are defined as having a kinetic energy of 58 foot-pounds, a level of kinetic energy capable of causing a fatality. The probability of a person six feet tall and one... Explosions are due to the sudden release of energy over a short period of time and may or may not involve...

  8. Play-fairway analysis for geothermal exploration: Examples from the Great Basin, western USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siler, Drew L; Faulds, James E

    2013-10-27

    Elevated permeability within fault systems provides pathways for circulation of geothermal fluids. Future geothermal development depends on precise and accurate location of such fluid flow pathways in order to both accurately assess geothermal resource potential and increase drilling success rates. The collocation of geologic characteristics that promote permeability in a given geothermal system defines the geothermal ‘fairway’, the location(s) where upflow zones are probable and where exploration efforts including drilling should be focused. We define the geothermal fairway as the collocation of 1) fault zones that are ideally oriented for slip or dilation under ambient stress conditions, 2) areas with a high spatial density of fault intersections, and 3) lithologies capable of supporting dense interconnected fracture networks. Areas in which these characteristics are concomitant with both elevated temperature and fluids are probable upflow zones where economic-scale, sustainable temperatures and flow rates are most likely to occur. Employing a variety of surface and subsurface data sets, we test this ‘play-fairway’ exploration methodology on two Great Basin geothermal systems: the actively producing Brady’s geothermal system and a ‘greenfield’ geothermal prospect at Astor Pass, NV. These analyses, based on 3D structural and stratigraphic framework models, reveal subsurface characteristics of each system well beyond the scope of standard exploration methods. At Brady’s, the geothermal fairways we define correlate well with successful production wells and pinpoint several drilling targets for maintaining or expanding production in the field. In addition, hot-dry wells within the Brady’s geothermal field lie outside our defined geothermal fairways. At Astor Pass, our play-fairway analysis provides a data-based conceptual model of fluid flow within the geothermal system and indicates several targets for exploration drilling.

  9. Watershed erosion modeling using the probability of sediment connectivity in a gently rolling system

    NASA Astrophysics Data System (ADS)

    Mahoney, David Tyler; Fox, James Forrest; Al Aamery, Nabil

    2018-06-01

    Sediment connectivity has been shown in recent years to explain how watershed configuration controls sediment transport. However, we find no studies that develop a watershed erosion modeling framework based on sediment connectivity, and few, if any, studies have quantified sediment connectivity for gently rolling systems. We develop a new predictive sediment connectivity model that relies on the intersecting probabilities for sediment supply, detachment, transport, and buffers to sediment transport, integrated in a watershed erosion model framework. The model predicts sediment flux temporally and spatially across a watershed using field reconnaissance results, a high-resolution digital elevation model, a hydrologic model, and shear-based erosion formulae. Model results validate the capability of the model to predict erosion pathways causing sediment connectivity. More notably, disconnectivity dominates the gently rolling watershed across all morphologic levels of the uplands, including microtopography from low-energy undulating surfaces across the landscape, swales and gullies only active in the highest events, karst sinkholes that disconnect drainage areas, and floodplains that de-couple the hillslopes from the stream corridor. Results show that sediment connectivity is predicted for about 2% or more of the watershed's area on 37 days of the year, with the remaining days showing very little or no connectivity. Only 12.8 ± 0.7% of the gently rolling watershed shows sediment connectivity on the wettest day of the study year. Results also highlight the importance of urban/suburban sediment pathways in gently rolling watersheds; dynamic and longitudinal distributions of sediment connectivity might be further investigated in future work. We suggest the method herein provides the modeler with an added tool to account for sediment transport criteria and has the potential to reduce computational costs in watershed erosion modeling.
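
    A minimal sketch of the intersecting-probabilities idea: a grid cell is connected only when supply, detachment, and transport coincide and no buffer intervenes. The raster values, independence assumption, and cutoff are illustrative placeholders, not the paper's calibrated model.

      import numpy as np

      rng = np.random.default_rng(1)
      shape = (100, 100)  # toy watershed grid

      # Hypothetical per-cell probabilities for a single day (illustrative values)
      p_supply = rng.uniform(0.0, 1.0, shape)     # sediment available
      p_detach = rng.uniform(0.0, 0.5, shape)     # shear exceeds detachment threshold
      p_transport = rng.uniform(0.0, 0.5, shape)  # flow can convey the sediment
      p_buffer = rng.uniform(0.5, 1.0, shape)     # a buffer (sinkhole, floodplain) intercepts

      # Intersection of the component probabilities, assuming independence
      p_connected = p_supply * p_detach * p_transport * (1 - p_buffer)

      connected_fraction = np.mean(p_connected > 0.01)  # arbitrary connectivity cutoff
      print(f"Fraction of watershed connected today: {connected_fraction:.1%}")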

  10. Application of statistical distribution theory to launch-on-time for space construction logistic support

    NASA Technical Reports Server (NTRS)

    Morgenthaler, George W.

    1989-01-01

    The ability to launch-on-time and to send payloads into space has progressed dramatically since the days of the earliest missile and space programs. Causes for delay during launch, i.e., unplanned 'holds', are attributable to several sources: weather, range activities, vehicle conditions, human performance, etc. Recent developments in space programs, particularly the need for highly reliable logistic support of space construction and the subsequent planned operation of space stations, large unmanned space structures, lunar and Mars bases, and the necessity of providing 'guaranteed' commercial launches, have placed increased emphasis on understanding and mastering every aspect of launch vehicle operations. The Center for Space Construction has acquired historical launch vehicle data and is applying these data to the analysis of space launch vehicle logistic support of space construction. This analysis includes development of a better understanding of launch-on-time capability and simulation of required support systems for vehicle assembly and launch, which are necessary to support national space program construction schedules. In this paper, the author presents actual launch data on unscheduled 'hold' distributions of various launch vehicles. The data have been supplied by industrial associate companies of the Center for Space Construction. The paper seeks to determine suitable probability models that describe these historical data and that can be used for several purposes, such as providing inputs to broader simulations of launch vehicle logistic support processes and determining which launch operations sources cause the majority of the unscheduled 'holds', and hence suggesting changes which might improve launch-on-time. In particular, the paper investigates the ability of a compound distribution probability model to fit actual data, versus alternative models, and recommends the most productive avenues for future statistical work.
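
    To illustrate why a compound model can outperform a single distribution, the sketch below fits a single exponential and a two-component exponential mixture to synthetic hold durations and compares them by AIC; the data, starting values, and EM fit are illustrative, not the paper's analysis.

      import numpy as np

      rng = np.random.default_rng(2)
      # Synthetic unscheduled-hold durations (minutes); real inputs would be the
      # historical launch records described above.
      holds = np.concatenate([rng.exponential(5, 300), rng.exponential(60, 100)])

      # Single-exponential maximum likelihood fit
      lam = 1 / holds.mean()
      ll_exp = np.sum(np.log(lam) - lam * holds)

      # Two-component exponential mixture ("compound" model) fitted with EM
      w, m1, m2 = 0.5, 10.0, 50.0
      for _ in range(200):
          d1 = w / m1 * np.exp(-holds / m1)
          d2 = (1 - w) / m2 * np.exp(-holds / m2)
          r = d1 / (d1 + d2)  # responsibility of component 1
          w = r.mean()
          m1 = np.sum(r * holds) / r.sum()
          m2 = np.sum((1 - r) * holds) / (1 - r).sum()
      ll_mix = np.sum(np.log(w / m1 * np.exp(-holds / m1) + (1 - w) / m2 * np.exp(-holds / m2)))

      # Lower AIC favors the mixture when holds really come from mixed causes
      print("AIC exponential:", 2 * 1 - 2 * ll_exp)
      print("AIC mixture:    ", 2 * 3 - 2 * ll_mix)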

  11. Review of the use of pretest probability for molecular testing in non-small cell lung cancer and overview of new mutations that may affect clinical practice

    PubMed Central

    Martin, Petra; Leighl, Natasha B.

    2017-01-01

    This article considers the use of pretest probability in non-small cell lung cancer (NSCLC) and how its use in EGFR testing has helped establish clinical guidelines on selecting patients for EGFR testing. With an ever-increasing number of molecular abnormalities being identified and often limited tissue available for testing, the use of pretest probability will need to be increasingly considered in the future for selecting investigations and treatments in patients. In addition we review new mutations that have the potential to affect clinical practice. PMID:28607579
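
    The pretest-to-posttest step that such guidelines build on is the odds form of Bayes' theorem. A minimal sketch with hypothetical numbers (the 40% pretest probability and likelihood ratio of 15 are illustrative, not figures from the article):

      def posttest_probability(pretest, likelihood_ratio):
          """Convert a pretest probability to a posttest probability via odds."""
          odds = pretest / (1 - pretest)
          post_odds = odds * likelihood_ratio
          return post_odds / (1 + post_odds)

      # Hypothetical: 40% pretest probability of an EGFR mutation and a
      # positive test with likelihood ratio 15 gives ~91% posttest probability.
      print(posttest_probability(0.40, 15))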

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Antoinette J

    Los Alamos National Laboratory (LANL) uses Capability Reviews to assess the quality and institutional integration of science, technology and engineering (STE) and to advise Laboratory Management on the current and future health of LANL STE. The capabilities are deliberately chosen to be crosscutting over the Laboratory and therefore will include experimental, theoretical and simulation disciplines from multiple line organizations. Capability Reviews are designed to provide a more holistic view of STE quality, integration to achieve mission requirements, and mission relevance. The scope of these capabilities necessitates significant overlap in the technical areas covered by capability reviews (e.g., materials research and weapons science and engineering). In addition, LANL staff may be reviewed in different capability reviews because of their varied assignments and expertise. The principal product of the Capability Review is the report that includes the review committee's assessments and recommendations for STE.

  13. Tribology needs for future space and aeronautical systems

    NASA Technical Reports Server (NTRS)

    Fusaro, Robert L.

    1991-01-01

    Future aeronautical and space missions will push tribology technology beyond its current capability. The objective is to discuss the current state of the art of tribology as it is applied to advanced aircraft and spacecraft. Areas of discussion include materials, lubrication mechanisms, factors affecting lubrication, current and future tribological problem areas, potential new lubrication techniques, and perceived technology requirements that need to be met in order to solve these tribology problems.

  14. System-Level Experimentation: Executive Summary and Annotated Brief

    DTIC Science & Technology

    2006-07-01

    both friendlies and adversaries – to define their own capabilities, unfettered by doctrinal or cultural limitations and bounded only by the laws of... RCO* – rapid fielding for high-priority needs. SLE is focused on understanding the future – unknown needs. A8's Future Game – analysis of future

  15. Running on Empty: The Development of Helicopter Aerial Refueling and Implications for Future USAF Combat Rescue Capabilities

    DTIC Science & Technology

    1997-03-01

    places such as Haiti, Somalia, Liberia, and Bosnia. The future appears very busy for Air Force rescue units as well. According to “Strategic Assessment 1996: Instruments... CONTEMPORARY ROLE OF AIR REFUELING IN US AIR FORCE SEARCH AND RESCUE...

  16. Advanced Ceramic Materials for Future Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Misra, Ajay

    2015-01-01

    With the growing trend toward higher temperature capability, lighter weight, and multifunctionality, significant advances in ceramic matrix composites (CMCs) will be required for future aerospace applications. The presentation will provide an overview of material requirements for future aerospace missions, and the role of ceramics and CMCs in meeting those requirements. Aerospace applications will include gas turbine engines, aircraft structure, hypersonic and access to space vehicles, space power and propulsion, and space communication.

  17. Department of Defense Energy and Logistics: Implications of Historic and Future Cost, Risk, and Capability Analysis

    NASA Astrophysics Data System (ADS)

    Tisa, Paul C.

    Every year the DoD spends billions satisfying its large petroleum demand. This spending is highly sensitive to uncontrollable and poorly understood market forces. Additionally, while some stakeholders may not prioritize its monetary cost and risk, energy is fundamentally coupled to other critical factors. Energy, operational capability, and logistics are heavily intertwined and dependent on uncertain security environments and technology futures. These components and their relationships are less well understood. Without better characterization, future capabilities may be significantly limited by present-day acquisition decisions. One attempt to demonstrate these costs and risks to decision makers has been through a metric known as the Fully Burdened Cost of Energy (FBCE). FBCE is defined as the commodity price for fuel plus many of these hidden costs. The metric encouraged a valuable conversation and is still required by law. However, most FBCE development stopped before the lessons from that conversation were incorporated. Current implementation is easy to employ but creates little value. Properly characterizing the costs and risks of energy and putting them in a useful tradespace requires a new framework. This research aims to highlight energy's complex role in many aspects of military operations, the critical need to incorporate it in decisions, and a novel framework to do so. It is broken into five parts. The first describes the motivation behind FBCE and the limits of current implementation, and outlines a new framework that aids decisions. Respectively, the second, third, and fourth parts present a historic analysis of the connections between military capabilities and energy, analyze the recent evolution of this conversation within the DoD, and pull the historic analysis into a revised framework. The final part quantifies the potential impacts of deeply uncertain futures and technological development and introduces an expanded framework that brings capability, energy, and their uncertainty into the same tradespace. The work presented is intended to inform better policies and investment decisions for military acquisitions. The discussion highlights areas within the DoD's understanding of energy that could improve or whose development has faltered. The new metric discussed allows the DoD to better manage and plan for long-term energy-related costs and risk.

  18. Laser Time-of-Flight Mass Spectrometry for Future In Situ Planetary Missions

    NASA Technical Reports Server (NTRS)

    Getty, S. A.; Brinckerhoff, W. B.; Cornish, T.; Ecelberger, S. A.; Li, X.; Floyd, M. A. Merrill; Chanover, N.; Uckert, K.; Voelz, D.; Xiao, X.; hide

    2012-01-01

    Laser desorption/ionization time-of-flight mass spectrometry (LD-TOF-MS) is a versatile, low-complexity instrument class that holds significant promise for future landed in situ planetary missions that emphasize compositional analysis of surface materials. Here we describe a 5kg-class instrument that is capable of detecting and analyzing a variety of analytes directly from rock or ice samples. Through laboratory studies of a suite of representative samples, we show that detection and analysis of key mineral composition, small organics, and particularly, higher molecular weight organics are well suited to this instrument design. A mass range exceeding 100,000 Da has recently been demonstrated. We describe recent efforts in instrument prototype development and future directions that will enhance our analytical capabilities targeting organic mixtures on primitive and icy bodies. We present results on a series of standards, simulated mixtures, and meteoritic samples.

  19. Space Weather - Current Capabilities, Future Requirements, and the Path to Improved Forecasting

    NASA Astrophysics Data System (ADS)

    Mann, Ian

    2016-07-01

    We present an overview of Space Weather activities and future opportunities including assessments of current status and capabilities, knowledge gaps, and future directions in relation to both observations and modeling. The review includes input from the scientific community including from SCOSTEP scientific discipline representatives (SDRs), COSPAR Main Scientific Organizers (MSOs), and SCOSTEP/VarSITI leaders. The presentation also draws on results from the recent activities related to the production of the COSPAR-ILWS Space Weather Roadmap "Understanding Space Weather to Shield Society" [Schrijver et al., Advances in Space Research 55, 2745 (2015) http://dx.doi.org/10.1016/j.asr.2015.03.023], from the activities related to the United Nations (UN) Committee on the Peaceful Uses of Outer Space (COPUOS) actions in relation to the Long-term Sustainability of Outer Space (LTS), and most recently from the newly formed and ongoing efforts of the UN COPUOS Expert Group on Space Weather.

  20. The high order dispersion analysis based on first-passage-time probability in financial markets

    NASA Astrophysics Data System (ADS)

    Liu, Chenggong; Shang, Pengjian; Feng, Guochen

    2017-04-01

    The study of first-passage-time (FPT) events in financial time series has gained broad research attention recently, as it can provide reference for risk management and investment. In this paper, a new measure, high order dispersion (HOD), is developed based on FPT probability to explore financial time series. The tick-by-tick data of three Chinese stock markets and three American stock markets are investigated. We classify the financial markets successfully by analyzing the scaling properties of the FPT probabilities of the six stock markets and employing the HOD method to compare the differences of FPT decay curves. Applying the HOD method, it can be concluded that long-range correlation, a fat-tailed broad probability density function, and its coupling with nonlinearity mainly lead to the multifractality of financial time series. Furthermore, we use the fluctuation function of multifractal detrended fluctuation analysis (MF-DFA) to distinguish markets and get results consistent with the HOD method, whereas the HOD method is capable of effectively distinguishing stock markets within the same region. We believe such explorations are relevant for a better understanding of financial market mechanisms.
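
    A minimal sketch of extracting an empirical FPT distribution from a return series, the quantity on which measures like HOD are built; the random-walk data and threshold are placeholders, and the HOD statistic itself is not reproduced here.

      import numpy as np

      rng = np.random.default_rng(3)
      returns = rng.standard_normal(100_000) * 0.01  # stand-in for tick-by-tick log returns
      theta = 0.02                                   # passage threshold on cumulative return

      def first_passage_times(x, theta):
          """Steps until the cumulative series first moves by theta, restarting after each passage."""
          fpts, cum, t = [], 0.0, 0
          for r in x:
              cum += r
              t += 1
              if abs(cum) >= theta:
                  fpts.append(t)
                  cum, t = 0.0, 0
          return np.array(fpts)

      fpt = first_passage_times(returns, theta)
      prob = np.bincount(fpt) / fpt.size  # empirical FPT probability
      print("mean FPT:", fpt.mean(), " P(FPT <= 10):", prob[:11].sum())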

  1. Supernova Cosmology Inference with Probabilistic Photometric Redshifts (SCIPPR)

    NASA Astrophysics Data System (ADS)

    Peters, Christina; Malz, Alex; Hlozek, Renée

    2018-01-01

    The Bayesian Estimation Applied to Multiple Species (BEAMS) framework employs probabilistic supernova type classifications to do photometric SN cosmology. This work extends BEAMS to replace high-confidence spectroscopic redshifts with photometric redshift probability density functions, a capability that will be essential in the era of the Large Synoptic Survey Telescope and other next-generation photometric surveys, where it will not be possible to perform spectroscopic follow-up on every SN. We present the Supernova Cosmology Inference with Probabilistic Photometric Redshifts (SCIPPR) Bayesian hierarchical model for constraining the cosmological parameters from photometric lightcurves and host galaxy photometry, which includes selection effects and is extensible to uncertainty in the redshift-dependent supernova type proportions. We create a pair of realistic mock catalogs of joint posteriors over supernova type, redshift, and distance modulus informed by photometric supernova lightcurves, and over redshift from simulated host galaxy photometry. We perform inference under our model to obtain a joint posterior probability distribution over the cosmological parameters and compare our results with other methods, namely: a spectroscopic subset, a subset of high-probability photometrically classified supernovae, and reducing the photometric redshift probability to a single measurement and error bar.

  2. 42 CFR 438.700 - Basis for imposition of sanctions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... medical condition or history indicates probable need for substantial future medical services. (4... directly, or indirectly through any agent or independent contractor, marketing materials that have not been...

  3. Conceptualizing and assessing improvement capability: a review

    PubMed Central

    Boaden, Ruth; Walshe, Kieran

    2017-01-01

    Abstract Purpose The literature is reviewed to examine how ‘improvement capability’ is conceptualized and assessed and to identify future areas for research. Data sources An iterative and systematic search of the literature was carried out across all sectors including healthcare. The search was limited to literature written in English. Data extraction The study identifies and analyses 70 instruments and frameworks for assessing or measuring improvement capability. Information about the source of the instruments, the sectors in which they were developed or used, the measurement constructs or domains they employ, and how they were tested was extracted. Results of data synthesis The instruments and framework constructs are very heterogeneous, demonstrating the ambiguity of improvement capability as a concept, and the difficulties involved in its operationalisation. Two-thirds of the instruments and frameworks have been subject to tests of reliability and half to tests of validity. Many instruments have little apparent theoretical basis and do not seem to have been used widely. Conclusion The assessment and development of improvement capability needs clearer and more consistent conceptual and terminological definition, used consistently across disciplines and sectors. There is scope to learn from existing instruments and frameworks, and this study proposes a synthetic framework of eight dimensions of improvement capability. Future instruments need robust testing for reliability and validity. This study contributes to practice and research by presenting the first review of the literature on improvement capability across all sectors including healthcare. PMID:28992146

  4. AC HTS Transmission Cable for Integration into the Future EHV Grid of the Netherlands

    NASA Astrophysics Data System (ADS)

    Zuijderduin, R.; Chevtchenko, O.; Smit, J. J.; Aanhaanen, G.; Melnik, I.; Geschiere, A.

    Due to increasing power demand, the electricity grid of the Netherlands is changing. The future grid must be capable of transmitting all the connected power. Power generation will become more decentralized, with for instance wind parks connected to the grid. Furthermore, future large-scale production units are expected to be installed near coastal regions. This creates some potential grid issues, such as transmitting large amounts of power from west to east to consumers, and maintaining grid stability. High temperature superconductors (HTS) can help solve these grid problems. The advantages of integrating HTS components at Extra High Voltage (EHV) and High Voltage (HV) levels are numerous: more power with lower losses and fewer emissions, intrinsic fault current limiting capability, better control of power flow, reduced footprint, etc. Today's main obstacle is the relatively high price of HTS. Nevertheless, as the price goes down, initial market penetration for several HTS components is expected by the year 2015 (e.g., cables, fault current limiters). In this paper we present a design of an intrinsically compensated EHV HTS cable for future grid integration. Discussed are the parameters of such a cable providing optimal power transmission in the future network.

  5. Effects of climate change on water abstraction restrictions for irrigation during droughts - The UK case

    NASA Astrophysics Data System (ADS)

    Rey Vicario, D.; Holman, I.

    2016-12-01

    The use of water for irrigation and on-farm reservoir filling is globally important for agricultural production. In humid climates, like the UK, supplemental irrigation can be critical to buffer the effects of rainfall variability and to achieve high quality crops. Given regulatory efforts to secure sufficient environmental river flows and meet rising water demands due to population growth and climate change, increasing water scarcity is likely to compound the drought challenges faced by irrigated agriculture in this region. Currently, water abstraction from surface waters for agricultural irrigation can be restricted by the Environment Agency during droughts under Section 57 of the Water Resources Act (1991), based on abnormally low river flow levels and rainfall forecast, causing significant economic impacts on irrigated agricultural production. The aim of this study is to assess the impact that climate change may have on agricultural abstraction in the UK within the context of the abstraction restriction triggers currently in place. These triggers have been applied to the `Future Flows hydrology' database to assess the likelihood of increasing restrictions on agricultural abstraction in the future by comparing the probability of voluntary and compulsory restrictions in the baseline (1961-1990) and future period (2071-2098) for 282 catchments throughout the whole of the UK. The results of this study show a general increase in the probability of future agricultural irrigation abstraction restrictions in the UK in the summer, particularly in the South West, although there is significant variability between the 11 ensemble members. The results also indicate that UK winters are likely to become wetter in the future, although in some catchments the probability of abstraction restriction in the reservoir refilling winter months (November-February) could increase slightly. An increasing frequency of drought events due to climate change is therefore likely to lead to more water abstraction restrictions, increasing the need for irrigators to adapt their businesses to increase drought resilience and hence food security.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krier, D. J.; Perry, F. V.

    Location, timing, volume, and eruptive style of post-Miocene volcanoes have defined the volcanic hazard significant to a proposed high-level radioactive waste (HLW) and spent nuclear fuel (SNF) repository at Yucca Mountain, Nevada, as a low-probability, high-consequence event. Examination of eruptive centers in the region that may be analogues to possible future volcanic activity at Yucca Mountain has aided in defining and evaluating the consequence scenarios for intrusion into and eruption above a repository. The probability of a future event intersecting a repository at Yucca Mountain has a mean value of 1.7 x 10^-8 per year. This probability comes from the Probabilistic Volcanic Hazard Assessment (PVHA) completed in 1996 and updated to reflect changes in repository layout. Since that time, magnetic anomalies representing potential buried volcanic centers have been identified from magnetic surveys; however, these potential buried centers only slightly increase the probability of an event intersecting the repository. The proposed repository will be located in the central portion of Yucca Mountain at approximately 300 m depth. The process for assessing performance of a repository at Yucca Mountain has identified two scenarios for igneous activity that, although having a very low probability of occurrence, could have a significant consequence should an igneous event occur. Either a dike swarm intersecting repository drifts containing waste packages, or a volcanic eruption through the repository, could result in release of radioactive material to the accessible environment. Ongoing investigations are assessing the mechanisms and significance of the consequence scenarios. Lathrop Wells Cone (~80,000 yrs), a key analogue for estimating potential future volcanic activity, is the youngest surface expression of apparently waning basaltic volcanism in the region. Cone internal structure, lavas, and ash-fall tephra have been examined to estimate eruptive volume, eruption type, and subsurface disturbance accompanying conduit growth and eruption. The Lathrop Wells volcanic complex has a total volume estimate of approximately 0.1 km^3. The eruptive products indicate a sequence of initial magmatic fissure fountaining, early Strombolian activity, a brief hydrovolcanic phase, and violent Strombolian phase(s). Lava flows adjacent to the Lathrop Wells Cone probably were emplaced during the mid-eruptive sequence. Ongoing investigations continue to address the potential hazards of a volcanic event at Yucca Mountain.
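
    Under a simple Poisson reading of the quoted mean annual frequency (my framing for illustration, not the PVHA's full treatment), the probability of at least one intersecting igneous event over a period of interest follows directly:

      import math

      rate = 1.7e-8  # mean annual frequency of an intersecting event (PVHA mean)
      for period in (1e3, 1e4, 1e6):  # years
          p = 1 - math.exp(-rate * period)  # Poisson: P(at least one event)
          print(f"P(event within {period:,.0f} yr) = {p:.2e}")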

  7. RAVEN User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua Joseph

    2015-10-01

    RAVEN is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. The initial development was aimed at providing dynamic risk analysis capabilities to the Thermo-Hydraulic code RELAP-7, currently under development at the Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating agnostically with any system code. This agnosticism includes providing Application Programming Interfaces (APIs). These APIs are used to allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible via input files or python interfaces. RAVEN is capable of investigating the system response and exploring the input space using Monte Carlo, Grid, or Latin Hypercube sampling schemes, but its strength is focused toward system feature discovery, such as limit surfaces separating regions of the input space leading to system failure, using dynamic supervised learning techniques. The development of RAVEN started in 2012 when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need to provide a modern risk evaluation framework became stronger. RAVEN's principal assignment is to provide the necessary software and algorithms in order to employ the concepts developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not just the identification of the frequency of an event potentially leading to a system failure, but the closeness (or not) to key safety-related events. Hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g., for an important process such as peak pressure in a pipe) is exceeded under certain conditions. The initial development of RAVEN has been focused on providing dynamic risk assessment capability to RELAP-7, currently under development at the INL and likely future replacement of the RELAP5-3D code. Most of the capabilities that have been implemented with RELAP-7 as the principal focus are easily deployable for other system codes. For this reason, several side activities are currently ongoing for coupling RAVEN with software such as RELAP5-3D. The aim of this document is the explanation of the input requirements, focusing on the input structure.
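
    For reference, one of the sampling schemes named above, Latin Hypercube sampling, can be sketched generically with scipy; this illustrates the technique only and is not RAVEN's API, and the two-parameter bounds are made up.

      from scipy.stats import qmc

      # Latin Hypercube sample of a two-parameter input space, scaled to
      # hypothetical physical bounds (e.g., a pressure and a temperature).
      sampler = qmc.LatinHypercube(d=2, seed=0)
      unit_sample = sampler.random(n=10)  # points in [0, 1)^2
      sample = qmc.scale(unit_sample, [1.0, 300.0], [15.0, 600.0])
      print(sample)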

  8. RAVEN User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua Joseph

    2016-02-01

    RAVEN is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. The initial development was aimed at providing dynamic risk analysis capabilities to the Thermo-Hydraulic code RELAP-7, currently under development at the Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating agnostically with any system code. This agnosticism includes providing Application Programming Interfaces (APIs). These APIs are used to allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible via input files or via python interfaces. RAVEN is capable of investigating the system response and exploring the input space using Monte Carlo, Grid, or Latin Hypercube sampling schemes, but its strength is focused toward system feature discovery, such as limit surfaces separating regions of the input space leading to system failure, using dynamic supervised learning techniques. The development of RAVEN started in 2012 when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need to provide a modern risk evaluation framework became stronger. RAVEN's principal assignment is to provide the necessary software and algorithms in order to employ the concepts developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not just the identification of the frequency of an event potentially leading to a system failure, but the closeness (or not) to key safety-related events. Hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g., for an important process such as peak pressure in a pipe) is exceeded under certain conditions. The initial development of RAVEN has been focused on providing dynamic risk assessment capability to RELAP-7, currently under development at the INL and likely future replacement of the RELAP5-3D code. Most of the capabilities that have been implemented with RELAP-7 as the principal focus are easily deployable for other system codes. For this reason, several side activities are currently ongoing for coupling RAVEN with software such as RELAP5-3D. The aim of this document is the explanation of the input requirements, focusing on the input structure.

  9. RAVEN User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua Joseph

    2017-03-01

    RAVEN is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. The initial development was aimed at providing dynamic risk analysis capabilities to the Thermo-Hydraulic code RELAP-7, currently under development at the Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating agnostically with any system code. This agnosticism includes providing Application Programming Interfaces (APIs). These APIs are used to allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible via input files or via python interfaces. RAVEN is capable of investigating the system response and exploring the input space using Monte Carlo, Grid, or Latin Hypercube sampling schemes, but its strength is focused toward system feature discovery, such as limit surfaces separating regions of the input space leading to system failure, using dynamic supervised learning techniques. The development of RAVEN started in 2012 when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need to provide a modern risk evaluation framework became stronger. RAVEN's principal assignment is to provide the necessary software and algorithms in order to employ the concepts developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not just the identification of the frequency of an event potentially leading to a system failure, but the closeness (or not) to key safety-related events. Hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g., for an important process such as peak pressure in a pipe) is exceeded under certain conditions. The initial development of RAVEN has been focused on providing dynamic risk assessment capability to RELAP-7, currently under development at the INL and likely future replacement of the RELAP5-3D code. Most of the capabilities that have been implemented with RELAP-7 as the principal focus are easily deployable for other system codes. For this reason, several side activities are currently ongoing for coupling RAVEN with software such as RELAP5-3D. The aim of this document is the explanation of the input requirements, focusing on the input structure.

  10. Understanding Climate Trends Using IR Brightness Temperature Spectra from AIRS, IASI and CrIS

    NASA Astrophysics Data System (ADS)

    Deslover, D. H.; Nikolla, E.; Knuteson, R. O.; Revercomb, H. E.; Tobin, D. C.

    2016-12-01

    NASA's Atmospheric Infrared Sounder (AIRS) provides a data record that extends from its 2002 launch to the present. The Infrared Atmospheric Sounding Interferometer (IASI) onboard Metop (A launched in 2006, B in 2012), as well as the Joint Polar Satellite System (JPSS) Cross-track Infrared Sounder (CrIS) launched in 2011, complement this data record. Future infrared sounders with similar capabilities will augment these measurements into the near future. We have created a global data set from these infrared measurements, using the nadir-most observations for each of the aforementioned instruments. We can filter the data based upon spatial, diurnal, and seasonal properties to discern trends for a given spectral channel and, therefore, a specific atmospheric layer. Subtle differences in spectral sampling among the three instruments can lead to significant differences in the resultant probability distribution functions for similar spectral channels. We take advantage of the higher (0.25 cm-1) IASI spectral resolution to subsample the IASI spectra onto the AIRS and CrIS spectral grids to better compare AIRS/IASI and CrIS/IASI trends in the brightness temperature anomalies. To better understand the dependence of the measured brightness temperature spectral time series on trace gases, a companion study used coincident vertical profiles of stratospheric carbon dioxide, water vapor, and ozone concentration to infer a correlation with the CrIS brightness temperatures. The goal was to investigate the role of ozone heating and carbon dioxide cooling on the observed brightness temperature spectra. Results from that study will be presented alongside the climate trend analysis.

  11. Climate variability and change in the United States: potential impacts on water- and foodborne diseases caused by microbiologic agents.

    PubMed Central

    Rose, J B; Epstein, P R; Lipp, E K; Sherman, B H; Bernard, S M; Patz, J A

    2001-01-01

    Exposure to waterborne and foodborne pathogens can occur via drinking water (associated with fecal contamination), seafood (due to natural microbial hazards, toxins, or wastewater disposal) or fresh produce (irrigated or processed with contaminated water). Weather influences the transport and dissemination of these microbial agents via rainfall and runoff and the survival and/or growth through such factors as temperature. Federal and state laws and regulatory programs protect much of the U.S. population from waterborne disease; however, if climate variability increases, current and future deficiencies in areas such as watershed protection, infrastructure, and storm drainage systems will probably increase the risk of contamination events. Knowledge about transport processes and the fate of microbial pollutants associated with rainfall and snowmelt is key to predicting risks from a change in weather variability. Although recent studies identified links between climate variability and occurrence of microbial agents in water, the relationships need further quantification in the context of other stresses. In the marine environment as well, there are few studies that adequately address the potential health effects of climate variability in combination with other stresses such as overfishing, introduced species, and rise in sea level. Advances in monitoring are necessary to enhance early-warning and prevention capabilities. Application of existing technologies, such as molecular fingerprinting to track contaminant sources or satellite remote sensing to detect coastal algal blooms, could be expanded. This assessment recommends incorporating a range of future scenarios of improvement plans for current deficiencies in the public health infrastructure to achieve more realistic risk assessments. PMID:11359688

  12. Analyzing Future Flooding under Climate Change Scenario using CMIP5 Streamflow Data

    NASA Astrophysics Data System (ADS)

    Nyaupane, Narayan; Parajuli, Ranjan; Kalra, Ajay

    2017-12-01

    Flooding is among the most severe and costliest natural hazards in the US, and the effects of climate change have intensified flood scenarios in recent years. Flood prevention practice, along with a proper understanding of flooding events, can mitigate the risk of such hazards. Floodplain mapping is one technique for quantifying the severity of flooding. Carson City, an agricultural area in the Nevada desert, has experienced peak floods in recent years. To identify the underlying probability distribution for the area, the latest Coupled Model Intercomparison Project (CMIP5) streamflow data for the Carson River were analyzed against 27 different statistical distributions. The best-fitting distribution was used to forecast the 100-year flood (design flood). Data from 1950-2099, derived from 31 models and 97 projections in total, were used to predict future streamflow. The delta change method was adopted to quantify the future (2050-2099) flood. To determine the extent of flooding, three scenarios, (i) the historical design flood, (ii) the 500-year flood, and (iii) the future 100-year flood, were routed through a HEC-RAS model prepared using available terrain data. Some of the climate projections show an extreme increase in the future design flood, which could exceed the historical 500-year flood. At the same time, the extent of flooding could go beyond that of the historical flood with 0.2% annual exceedance probability. This study suggests an approach to quantifying future floods and floodplains using climate model projections and would provide helpful information to facility managers, design engineers, and stakeholders.
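
    The distribution-fitting and design-flood step lends itself to a short sketch. The code below uses synthetic annual peak flows and only a handful of the 27 candidate distributions, selects the best fit by the Kolmogorov-Smirnov statistic, and reads the 100-year flood off the fitted quantile function (the flow with 1% annual exceedance probability); it is illustrative only, not the study's actual workflow.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic annual peak flows (cfs) standing in for the CMIP5-derived series.
    rng = np.random.default_rng(1)
    peaks = stats.genextreme.rvs(c=-0.1, loc=500, scale=150, size=70, random_state=rng)

    # A small subset of the 27 candidate distributions.
    best = None
    for name in ["genextreme", "pearson3", "lognorm", "gumbel_r"]:
        dist = getattr(stats, name)
        params = dist.fit(peaks)
        ks = stats.kstest(peaks, name, args=params).statistic
        if best is None or ks < best[2]:
            best = (name, params, ks)

    name, params, ks = best
    # 100-year (design) flood: the flow exceeded with 1% annual probability.
    q100 = getattr(stats, name).ppf(1 - 1 / 100, *params)
    print(f"best fit: {name} (KS = {ks:.3f}); 100-yr flood ~ {q100:.0f} cfs")
    ```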

  13. Advanced EVA Capabilities: A Study for NASA's Revolutionary Aerospace Systems Concept Program

    NASA Technical Reports Server (NTRS)

    Hoffman, Stephen J.

    2004-01-01

    This report documents the results of a study carried out as part of NASA's Revolutionary Aerospace Systems Concepts Program examining the future technology needs of extravehicular activities (EVAs). The intent of this study is to produce a comprehensive report that identifies various design concepts for human-related advanced EVA systems necessary to achieve the goals of supporting future space exploration and development customers in free space and on planetary surfaces for space missions in the post-2020 timeframe. The design concepts studied and evaluated are not limited to anthropomorphic space suits, but include a wide range of human-enhancing EVA technologies as well as consideration of coordination and integration with advanced robotics. The goal of the study effort is to establish a baseline technology "road map" that identifies and describes an investment and technical development strategy, including recommendations that will lead to future enhanced synergistic human/robot EVA operations. The eventual use of this study effort is to focus evolving performance capabilities of various EVA system elements toward the goal of providing high performance human operational capabilities for a multitude of future space applications and destinations. The data collected for this study indicate a rich and diverse history of systems that have been developed to perform a variety of EVA tasks, indicating what is possible. However, the data gathered for this study also indicate a paucity of new concepts and technologies for advanced EVA missions - at least any that researchers are willing to discuss in this type of forum.

  14. Forecasting the duration of volcanic eruptions: an empirical probabilistic model

    NASA Astrophysics Data System (ADS)

    Gunn, L. S.; Blake, S.; Jones, M. C.; Rymer, H.

    2014-01-01

    The ability to forecast future volcanic eruption durations would greatly benefit emergency response planning prior to and during a volcanic crisis. This paper introduces a probabilistic model to forecast the duration of future and ongoing eruptions. The model fits theoretical distributions to observed duration data and relies on past eruptions being a good indicator of future activity. A dataset of historical Mt. Etna flank eruptions is presented and used to demonstrate the model. The data have been compiled through critical examination of existing literature along with careful consideration of uncertainties on reported eruption start and end dates between the years 1300 AD and 2010. Data following 1600 are considered to be reliable and free of reporting biases. The distribution of eruption durations between the years 1600 and 1669 is found to be statistically different from that following it, and the forecasting model is run on two datasets of Mt. Etna flank eruption durations: 1600-2010 and 1670-2010. Each dataset is modelled using a log-logistic distribution with parameter values found by maximum likelihood estimation. Survivor function statistics are applied to the model distributions to forecast (a) the probability of an eruption exceeding a given duration, (b) the probability of an eruption that has already lasted a particular number of days exceeding a given total duration and (c) the duration with a given probability of being exceeded. Results show that excluding the 1600-1670 data has little effect on the forecasting model result, especially where short durations are involved. By assigning the terms 'likely' and 'unlikely' to probabilities of 66 % or more and 33 % or less, respectively, the forecasting model based on the 1600-2010 dataset indicates that a future flank eruption on Mt. Etna would be likely to exceed 20 days (± 7 days) but unlikely to exceed 86 days (± 29 days). This approach can easily be adapted for use on other highly active, well-documented volcanoes or for different duration data such as the duration of explosive episodes or the duration of repose periods between eruptions.
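
    The survivor-function forecasts in this record map directly onto the log-logistic (Fisk) distribution in scipy. The sketch below, run on synthetic durations rather than the Etna catalogue, fits the distribution by maximum likelihood and evaluates the three forecast quantities (a)-(c); it is a minimal illustration of the method, not the authors' code.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic eruption durations (days) standing in for the Etna catalogue.
    rng = np.random.default_rng(2)
    durations = stats.fisk.rvs(c=1.6, scale=35, size=60, random_state=rng)

    # Maximum likelihood fit of the log-logistic (Fisk) distribution, location fixed at 0.
    c, loc, scale = stats.fisk.fit(durations, floc=0)
    sf = lambda t: stats.fisk.sf(t, c, loc=loc, scale=scale)

    print(f"(a) P(duration > 20 d) = {sf(20):.2f}")
    print(f"(b) P(duration > 60 d | already 30 d) = {sf(60) / sf(30):.2f}")
    print(f"(c) duration exceeded with P = 0.33: "
          f"{stats.fisk.isf(0.33, c, loc=loc, scale=scale):.0f} d")
    ```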

  15. Exploration Medical Capability (ExMC) Projects

    NASA Technical Reports Server (NTRS)

    Wu, Jimmy; Watkins, Sharmila; Baumann, David

    2010-01-01

    During missions to the Moon or Mars, the crew will need medical capabilities to diagnose and treat disease as well as to maintain their health. The Exploration Medical Capability Element develops medical technologies, medical informatics, and clinical capabilities for different levels of care during space missions. The work done by team members in this Element is leading-edge technology, procedure, and pharmacological development. They develop data systems that protect patients' private medical information, aid in the diagnosis of medical conditions, and act as a repository of relevant NASA life sciences experimental studies. To minimize the medical risks to crew health, the physicians and scientists in this Element develop models to quantify the probability of medical events occurring during a mission. They define procedures to treat an ill or injured crew member who does not have access to an emergency room and who must be cared for in a microgravity environment where both liquids and solids behave differently than on Earth. To support the development of these medical capabilities, the Element manages the development of medical technologies that prevent, monitor, diagnose, and treat an ill or injured crewmember. The Exploration Medical Capability Element collaborates with the National Space Biomedical Research Institute (NSBRI), the Department of Defense, other Government-funded agencies, academic institutions, and industry.
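
    As a toy version of the kind of medical-event model mentioned here, the probability of at least one event under a constant-incidence Poisson assumption is 1 - exp(-rate x crew x duration). The rate and mission figures below are invented for illustration; the Element's actual models are far more detailed.

    ```python
    import math

    def p_at_least_one(rate_per_person_year: float, crew_size: int,
                       mission_years: float) -> float:
        """P(at least one medical event) under a Poisson model with a
        constant, independent per-person incidence rate (a simplification)."""
        lam = rate_per_person_year * crew_size * mission_years
        return 1.0 - math.exp(-lam)

    # Hypothetical inputs: 0.05 events/person-year, 4 crew, 2.5-year mission.
    print(f"{p_at_least_one(0.05, 4, 2.5):.1%}")  # about 39.3%
    ```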

  16. Capability Investment Strategy to Enable JPL Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lincoln, William; Merida, Sofia; Adumitroaie, Virgil; Weisbin, Charles R.

    2006-01-01

    The Jet Propulsion Laboratory (JPL) formulates and conducts deep space missions for NASA (the National Aeronautics and Space Administration). The Chief Technologist of JPL has responsibility for strategic planning of the laboratory's advanced technology program to assure that the required technological capabilities to enable future missions are ready as needed. The responsibilities include development of a Strategic Plan (Antonsson, E., 2005). As part of the planning effort, a structured approach to technology prioritization, based upon the work of the START (Strategic Assessment of Risk and Technology) (Weisbin, C.R., 2004) team, was developed. The purpose of this paper is to describe this approach and present its current status relative to the JPL technology investment.

  17. No food for thought: moderating effects of delay discounting and future time perspective on the relation between income and food insecurity

    PubMed Central

    Epstein, Leonard H; Jankowiak, Noelle; Lin, Henry; Paluch, Rocco; Koffarnus, Mikhail N; Bickel, Warren K

    2014-01-01

    Background: Low income is related to food insecurity, and research has suggested that a scarcity of resources associated with low income can shift attention to the present, thereby discounting the future. Objective: We tested whether attending to the present and discounting the future may moderate the influence of income on food insecurity. Design: Delay discounting and measures of future time perspective (Zimbardo Time Perspective Inventory, Consideration of Future Consequences Scale, time period of financial planning, and subjective probability of living to age 75 y) were studied as moderators of the relation between income and food insecurity in a diverse sample of 975 adults, 31.8% of whom experienced some degree of food insecurity. Results: Income, financial planning, subjective probability of living to age 75 y, and delay discounting predicted both food insecurity and high food insecurity. Three-way interactions showed that delay discounting interacted with financial planning and income to predict food insecurity (P = 0.003). At lower levels of income, food insecurity was lowest for subjects who had good financial planning skills and did not discount the future, whereas having good financial skills while discounting the future had minimal influence on food insecurity. The same 3-way interaction was observed when high food insecurity was predicted (P = 0.008). Conclusion: Because of the role of scarce resources in narrowing attention and reducing prospective thinking, research should address whether modifying future orientation may reduce food insecurity even in the face of diminishing financial resources. This trial was registered at clinicaltrials.gov as NCT02099812. PMID:25008855
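
    The reported three-way interaction can be reproduced in form (though not in substance) with a logistic regression on simulated data. Variable names, scales, and effect sizes below are invented stand-ins for the study's income, delay-discounting, and financial-planning measures.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated stand-ins for the survey measures (standardized, names hypothetical).
    rng = np.random.default_rng(3)
    n = 975
    df = pd.DataFrame({
        "income": rng.normal(size=n),
        "discount": rng.normal(size=n),   # delay-discounting rate
        "planning": rng.normal(size=n),   # financial-planning horizon
    })
    lin = (-0.8 - 0.9 * df.income + 0.4 * df.discount - 0.3 * df.planning
           + 0.3 * df.income * df.discount * df.planning)
    df["insecure"] = (lin + rng.logistic(size=n) > 0).astype(int)

    # Logistic regression; 'a * b * c' expands to all main effects and interactions.
    fit = smf.logit("insecure ~ income * discount * planning", data=df).fit(disp=0)
    print(fit.summary().tables[1])
    ```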

  18. Continuous-time random-walk model for financial distributions

    NASA Astrophysics Data System (ADS)

    Masoliver, Jaume; Montero, Miquel; Weiss, George H.

    2003-02-01

    We apply the formalism of the continuous-time random walk to the study of financial data. The entire distribution of prices can be obtained once two auxiliary densities are known: the probability density for the pausing time between successive jumps and the corresponding probability density for the magnitude of a jump. We have applied the formalism to data on U.S. dollar-deutsche mark futures, finding good agreement between theory and the observed data.
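
    The two densities named in the abstract are all a continuous-time random walk needs, so a Monte Carlo sketch is straightforward. The exponential pausing-time and Laplace jump-size densities below are our own illustrative choices, not the ones estimated in the paper.

    ```python
    import numpy as np

    def simulate_ctrw(n_paths: int, horizon: float, rng) -> np.ndarray:
        """CTRW log-price at a fixed horizon: draw waiting times from the
        pausing-time density and increments from the jump-magnitude density."""
        out = np.empty(n_paths)
        for i in range(n_paths):
            t, x = 0.0, 0.0
            while True:
                t += rng.exponential(1.0)      # pausing-time density psi(t)
                if t > horizon:
                    break
                x += rng.laplace(0.0, 0.05)    # jump-magnitude density h(x)
            out[i] = x
        return out

    rng = np.random.default_rng(4)
    r = simulate_ctrw(5_000, horizon=50.0, rng=rng)
    kurt = ((r - r.mean()) ** 4).mean() / r.var() ** 2 - 3
    print(f"std = {r.std():.3f}, excess kurtosis = {kurt:.2f}")
    ```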

  19. Energy efficient engine: Propulsion system-aircraft integration evaluation

    NASA Technical Reports Server (NTRS)

    Owens, R. E.

    1979-01-01

    Flight performance and operating economics of future commercial transports utilizing the energy efficient engine were assessed, as well as the probability of meeting NASA's goals for TSFC, DOC, noise, and emissions. Results of the initial propulsion system-aircraft integration evaluation presented here include estimates of engine performance, predictions of fuel burn, operating costs of the flight propulsion system installed in seven selected advanced study commercial transports, estimates of noise and emissions, consideration of thrust growth, and the achievement-probability analysis.
